Sample records for combustion database edition

  1. Third millennium ideal gas and condensed phase thermochemical database for combustion (with updates from active thermochemical tables).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burcat, A.; Ruscic, B.; Chemistry

    2005-07-29

    The thermochemical database of species involved in combustion processes is and has been available for free use for over 25 years. It was first published in print in 1984, approximately 8 years after it was first assembled, and contained 215 species at the time. This is the 7th printed edition and most likely will be the last one in print in the present format, which involves substantial manual labor. The database currently contains more than 1300 species, specifically organic molecules and radicals, but also inorganic species connected to combustion and air pollution. Since 1991 this database has been freely available on the internet, at the Technion-IIT ftp server, and it is continuously expanded and corrected. The database is mirrored daily at an official mirror site, and at random at about a dozen unofficial mirror and 'finger' sites. The present edition contains numerous corrections and many recalculations of provisional data by the G3//B3LYP method, a high-accuracy composite ab initio calculation. About 300 species are newly calculated and are not yet published elsewhere. In anticipation of the full coupling, which is under development, the database has started incorporating the available (as yet unpublished) values from Active Thermochemical Tables. The electronic version now also contains an XML file of the main database to allow transfer to other formats and ease finding specific information of interest. The database is used by scientists, educators, engineers and students at all levels, dealing primarily with combustion and air pollution, jet engines, rocket propulsion, and fireworks, but also by researchers involved in upper atmosphere kinetics, astrophysics, abrasion metallurgy, etc. This introductory article contains explanations of the database and the means to use it, its sources, ways of calculation, and assessments of the accuracy of the data.
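
    Thermochemical compilations of this kind are commonly distributed as polynomial fits from which heat capacity, enthalpy, and entropy are reconstructed at a given temperature. As a minimal, hedged illustration, assuming the familiar 7-coefficient NASA polynomial form (the exact layout of any particular edition should be checked against its own documentation), the functions could be evaluated as follows; the coefficient values shown are placeholders, not data from the database:

    ```python
    # Minimal sketch: evaluate thermodynamic functions from a 7-coefficient
    # NASA polynomial (a1..a7), a form commonly used by combustion
    # thermochemical databases. The coefficients below are placeholders,
    # not values taken from the database described above.

    import math

    def nasa7_properties(a, T):
        """Return (Cp/R, H/(R*T), S/R) for one temperature range."""
        a1, a2, a3, a4, a5, a6, a7 = a
        cp_R = a1 + a2*T + a3*T**2 + a4*T**3 + a5*T**4
        h_RT = a1 + a2*T/2 + a3*T**2/3 + a4*T**3/4 + a5*T**4/5 + a6/T
        s_R  = a1*math.log(T) + a2*T + a3*T**2/2 + a4*T**3/3 + a5*T**4/4 + a7
        return cp_R, h_RT, s_R

    # Hypothetical coefficient set for illustration only.
    coeffs = (3.5, 1.0e-3, -2.0e-7, 1.0e-11, -1.0e-15, -1.0e3, 4.0)
    print(nasa7_properties(coeffs, 1000.0))
    ```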

  2. REDIdb: the RNA editing database.

    PubMed

    Picardi, Ernesto; Regina, Teresa Maria Rosaria; Brennicke, Axel; Quagliariello, Carla

    2007-01-01

    The RNA Editing Database (REDIdb) is an interactive, web-based database created and designed with the aim to allocate RNA editing events such as substitutions, insertions and deletions occurring in a wide range of organisms. The database contains both fully and partially sequenced DNA molecules for which editing information is available either by experimental inspection (in vitro) or by computational detection (in silico). Each record of REDIdb is organized in a specific flat-file containing a description of the main characteristics of the entry, a feature table with the editing events and related details and a sequence zone with both the genomic sequence and the corresponding edited transcript. REDIdb is a relational database in which the browsing and identification of editing sites has been simplified by means of two facilities to either graphically display genomic or cDNA sequences or to show the corresponding alignment. In both cases, all editing sites are highlighted in colour and their relative positions are detailed by mousing over. New editing positions can be directly submitted to REDIdb after a user-specific registration to obtain authorized secure access. This first version of REDIdb database stores 9964 editing events and can be freely queried at http://biologia.unical.it/py_script/search.html.

  3. REDIdb: an upgraded bioinformatics resource for organellar RNA editing sites.

    PubMed

    Picardi, Ernesto; Regina, Teresa M R; Verbitskiy, Daniil; Brennicke, Axel; Quagliariello, Carla

    2011-03-01

    RNA editing is a post-transcriptional molecular process whereby the information in a genetic message is modified from that in the corresponding DNA template by means of nucleotide substitutions, insertions and/or deletions. It occurs mostly in organelles by clade-specific, diverse and unrelated biochemical mechanisms. RNA editing events have been annotated in primary databases such as GenBank and, at a more sophisticated level, in the specialized databases REDIdb, dbRES and EdRNA. At present, REDIdb is the only freely available database that focuses on the organellar RNA editing process and annotates each editing modification in its biological context. Here we present an updated and upgraded release of REDIdb with a web interface refurbished with graphical and computational facilities that improve RNA editing investigations. Details of the REDIdb features and novelties are illustrated and compared to other RNA editing databases. REDIdb is freely queried at http://biologia.unical.it/py_script/REDIdb/. Copyright © 2010 Elsevier B.V. and Mitochondria Research Society. All rights reserved.

  4. Reference Guide to Non-combustion Technologies for Remediation of Persistent Organic Pollutants in Soil, Second Edition - 2010

    EPA Pesticide Factsheets

    This report is the second edition of the U.S. Environmental Protection Agency's (US EPA's) 2005 report and provides a high level summary of information on the applicability of existing and emerging noncombustion technologies for the remediation of...

  5. 49 CFR 630.3 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... TRANSPORTATION NATIONAL TRANSIT DATABASE § 630.3 Definitions. (a) Except as otherwise provided, terms defined in... current editions of the National Transit Database Reporting Manuals and the NTD Uniform System of Accounts... benefits from assistance under 49 U.S.C. 5307 or 5311. Current edition of the National Transit Database...

  6. 49 CFR 630.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... TRANSPORTATION NATIONAL TRANSIT DATABASE § 630.3 Definitions. (a) Except as otherwise provided, terms defined in... current editions of the National Transit Database Reporting Manuals and the NTD Uniform System of Accounts... benefits from assistance under 49 U.S.C. 5307 or 5311. Current edition of the National Transit Database...

  7. 49 CFR 630.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... TRANSPORTATION NATIONAL TRANSIT DATABASE § 630.3 Definitions. (a) Except as otherwise provided, terms defined in... current editions of the National Transit Database Reporting Manuals and the NTD Uniform System of Accounts... benefits from assistance under 49 U.S.C. 5307 or 5311. Current edition of the National Transit Database...

  8. 49 CFR 630.3 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... TRANSPORTATION NATIONAL TRANSIT DATABASE § 630.3 Definitions. (a) Except as otherwise provided, terms defined in... current editions of the National Transit Database Reporting Manuals and the NTD Uniform System of Accounts... benefits from assistance under 49 U.S.C. 5307 or 5311. Current edition of the National Transit Database...

  9. 49 CFR 630.3 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... TRANSPORTATION NATIONAL TRANSIT DATABASE § 630.3 Definitions. (a) Except as otherwise provided, terms defined in... current editions of the National Transit Database Reporting Manuals and the NTD Uniform System of Accounts... benefits from assistance under 49 U.S.C. 5307 or 5311. Current edition of the National Transit Database...

  10. An automated system for terrain database construction

    NASA Technical Reports Server (NTRS)

    Johnson, L. F.; Fretz, R. K.; Logan, T. L.; Bryant, N. A.

    1987-01-01

    An automated Terrain Database Preparation System (TDPS) for the construction and editing of terrain databases used in computerized wargaming simulation exercises has been developed. The TDPS system operates under the TAE executive, and it integrates VICAR/IBIS image processing and Geographic Information System software with CAD/CAM data capture and editing capabilities. The terrain database includes such features as roads, rivers, vegetation, and terrain roughness.

  11. A binary linear programming formulation of the graph edit distance.

    PubMed

    Justice, Derek; Hero, Alfred

    2006-08-01

    A binary linear programming formulation of the graph edit distance for unweighted, undirected graphs with vertex attributes is derived and applied to a graph recognition problem. A general formulation for editing graphs is used to derive a graph edit distance that is proven to be a metric, provided the cost function for individual edit operations is a metric. Then, a binary linear program is developed for computing this graph edit distance, and polynomial time methods for determining upper and lower bounds on the solution of the binary program are derived by applying solution methods for standard linear programming and the assignment problem. A recognition problem of comparing a sample input graph to a database of known prototype graphs in the context of a chemical information system is presented as an application of the new method. The costs associated with various edit operations are chosen by using a minimum normalized variance criterion applied to pairwise distances between nearest neighbors in the database of prototypes. The new metric is shown to perform quite well in comparison to existing metrics when applied to a database of chemical graphs.
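
    The abstract notes that polynomial-time upper and lower bounds on the binary program come from linear programming and the assignment problem. The sketch below is not the paper's formulation; it only illustrates the assignment-style idea of matching vertices, with dummy rows and columns standing in for insertions and deletions, under assumed unit insertion/deletion costs and a 0/1 substitution cost on vertex attributes:

    ```python
    # Assignment-based approximation of a graph edit cost between two
    # attributed vertex sets, padded with dummy rows/columns so that
    # insertions and deletions become assignments. Illustrative only;
    # the paper's exact binary linear program is not reproduced here.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def vertex_assignment_cost(attrs_g, attrs_h, c_sub=1.0, c_indel=1.0):
        n, m = len(attrs_g), len(attrs_h)
        size = n + m
        big = 1e6                      # discourages meaningless pairings
        C = np.zeros((size, size))
        for i in range(size):
            for j in range(size):
                if i < n and j < m:    # substitute g[i] by h[j]
                    C[i, j] = 0.0 if attrs_g[i] == attrs_h[j] else c_sub
                elif i < n:            # delete g[i] (only in its own column)
                    C[i, j] = c_indel if j - m == i else big
                elif j < m:            # insert h[j] (only in its own row)
                    C[i, j] = c_indel if i - n == j else big
                else:                  # dummy-to-dummy, no cost
                    C[i, j] = 0.0
        rows, cols = linear_sum_assignment(C)
        return C[rows, cols].sum()

    # Toy vertex attribute lists, e.g. atom labels in a chemical graph.
    print(vertex_assignment_cost(["C", "O", "H"], ["C", "N"]))
    ```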

  12. Supplement a to compilation of air pollutant emission factors. Volume 1. Stationary point and area sources. Fifth edition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-02-01

    This Supplement to AP-42 addresses pollutant-generating activity from Bituminous and Subbituminous Coal Combustion; Anthracite Coal Combustion; Fuel Oil Combustion; Natural Gas Combustion; Wood Waste Combustion in Boilers; Lignite Combustion; Waste Oil Combustion; Stationary Gas Turbines for Electricity Generation; Heavy-duty Natural Gas-fired Pipeline Compressor Engines; Large Stationary Diesel and all Stationary Dual-fuel Engines; Natural Gas Processing; Organic Liquid Storage Tanks; Meat Smokehouses; Meat Rendering Plants; Canned Fruits and Vegetables; Dehydrated Fruits and Vegetables; Pickles, Sauces and Salad Dressing; Grain Elevators and Processes; Cereal Breakfast Foods; Pasta Manufacturing; Vegetable Oil Processing; Wines and Brandy; Coffee Roasting; Charcoal; Coal Cleaning; Frit Manufacturing; Sand and Gravel Processing; Diatomite Processing; Talc Processing; Vermiculite Processing; Paved Roads; and Unpaved Roads. Also included is information on Generalized Particle Size Distributions.

  13. The HITRAN 2008 Molecular Spectroscopic Database

    NASA Technical Reports Server (NTRS)

    Rothman, Laurence S.; Gordon, Iouli E.; Barbe, Alain; Benner, D. Chris; Bernath, Peter F.; Birk, Manfred; Boudon, V.; Brown, Linda R.; Campargue, Alain; Champion, J.-P.; et al.

    2009-01-01

    This paper describes the status of the 2008 edition of the HITRAN molecular spectroscopic database. The new edition is the first official public release since the 2004 edition, although a number of crucial updates had been made available online since 2004. The HITRAN compilation consists of several components that serve as input for radiative-transfer calculation codes: individual line parameters for the microwave through visible spectra of molecules in the gas phase; absorption cross-sections for molecules having dense spectral features, i.e., spectra in which the individual lines are not resolved; individual line parameters and absorption cross-sections for bands in the ultraviolet; refractive indices of aerosols; tables and files of general properties associated with the database; and database management software. The line-by-line portion of the database contains spectroscopic parameters for forty-two molecules including many of their isotopologues.

  14. Selective access and editing in a database

    NASA Technical Reports Server (NTRS)

    Maluf, David A. (Inventor); Gawdiak, Yuri O. (Inventor)

    2010-01-01

    Method and system for providing selective access to different portions of a database by different subgroups of database users. Where N users are involved, up to 2^N - 1 distinguishable access subgroups in a group space can be formed, where no two access subgroups have the same members. Two or more members of a given access subgroup can edit, substantially simultaneously, a document accessible to each member.
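
    As a small illustration of the combinatorics stated above (not code from the patent), the non-empty subsets of N users can be enumerated directly, giving the 2^N - 1 distinguishable access subgroups:

    ```python
    # Illustrative only: enumerate the 2**N - 1 possible non-empty access
    # subgroups of N users. User names are hypothetical.

    from itertools import combinations

    users = ["alice", "bob", "carol"]          # N = 3
    subgroups = []
    for r in range(1, len(users) + 1):
        subgroups.extend(combinations(users, r))

    print(len(subgroups))                      # 7 == 2**3 - 1
    for members in subgroups:
        print(sorted(members))
    ```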

  15. e23D: database and visualization of A-to-I RNA editing sites mapped to 3D protein structures.

    PubMed

    Solomon, Oz; Eyal, Eran; Amariglio, Ninette; Unger, Ron; Rechavi, Gidi

    2016-07-15

    e23D, a database of A-to-I RNA editing sites from human, mouse and fly mapped to evolutionary related protein 3D structures, is presented. Genomic coordinates of A-to-I RNA editing sites are converted to protein coordinates and mapped onto 3D structures from PDB or theoretical models from ModBase. e23D allows visualization of the protein structure, modeling of recoding events and orientation of the editing with respect to nearby genomic functional sites from databases of disease causing mutations and genomic polymorphism. http://www.sheba-cancer.org.il/e23D CONTACT: oz.solomon@live.biu.ac.il or Eran.Eyal@sheba.health.gov.il. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. ELSI Bibliography: Ethical legal and social implications of the Human Genome Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yesley, M.S.

    This second edition of the ELSI Bibliography provides a current and comprehensive resource for identifying publications on the major topics related to the ethical, legal and social issues (ELSI) of the Human Genome Project. Since the first edition of the ELSI Bibliography was printed last year, new publications and earlier ones identified by additional searching have doubled our computer database of ELSI publications to over 5600 entries. The second edition of the ELSI Bibliography reflects this growth of the underlying computer database. Researchers should note that an extensive collection of publications in the database is available for public use at the General Law Library of Los Alamos National Laboratory (LANL).

  17. REDIdb 3.0: A Comprehensive Collection of RNA Editing Events in Plant Organellar Genomes.

    PubMed

    Lo Giudice, Claudio; Pesole, Graziano; Picardi, Ernesto

    2018-01-01

    RNA editing is an important epigenetic mechanism by which genome-encoded transcripts are modified by substitutions, insertions and/or deletions. It was first discovered in kinetoplastid protozoa and subsequently reported in a wide range of organisms. In plants, RNA editing occurs mostly by cytidine (C) to uridine (U) conversion in translated regions of organelle mRNAs and tends to modify affected codons, restoring evolutionarily conserved amino acid residues. RNA editing has also been described in non-protein-coding regions such as group II introns and structural RNAs. Despite its impact on organellar transcriptome and proteome complexity, current primary databases still do not provide a specific field for RNA editing events. To overcome these limitations, we developed REDIdb, a specialized database for RNA editing modifications in plant organelles. Hereafter we describe its third release, containing more than 26,000 events in a completely novel web interface that accommodates RNA editing in its genomic, biological and evolutionary context through whole genome maps and multiple sequence alignments. REDIdb is freely available at http://srv00.recas.ba.infn.it/redidb/index.html.
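
    To make the codon-level effect concrete, the toy example below (not drawn from REDIdb) shows how a single C-to-U conversion changes the encoded amino acid; Biopython is assumed for translation, and the codons are invented:

    ```python
    # Toy illustration of plant organellar C-to-U RNA editing: one cytidine
    # converted to uridine changes the encoded amino acid. Codons are
    # invented for illustration; Biopython is assumed to be installed.

    from Bio.Seq import Seq

    genomic_codon = Seq("CCA")        # as encoded in the organellar genome
    edited_codon  = Seq("CTA")        # after a C-to-U edit (U written as T)

    print(genomic_codon.translate())  # P (proline)
    print(edited_codon.translate())   # L (leucine)
    ```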

  18. Transportation-markings database : international marine aids to navigation. Volume 1, parts C and D

    DOT National Transportation Integrated Search

    1988-01-01

    This monograph is the second edition of Volume I, Parts C and D of what was formerly termed Transportation Markings: A Study in Communication. The first edition of Volume I also included Parts A and B. The original edition was published by University...

  19. [Presence of the biomedical periodicals of Hungarian editions in international databases].

    PubMed

    Vasas, Lívia; Hercsel, Imréné

    2006-01-15

    The majority of Hungarian scientific results in medicine and related sciences are published in scientific periodicals of foreign edition with high impact factor (IF) values, and they appear in the international scientific literature in foreign languages. In this study the authors considered only the presence and registered citation in international databases of those periodicals published in Hungary and/or in cooperation with foreign publishing companies. The examination went back to 1980 and covered a 25-year period. 110 periodicals were selected for more detailed examination. The authors analyzed the standing of the current periodicals in the three most frequently consulted databases (MEDLINE, EMBASE, Web of Science) and found that the biomedical scientific periodicals of Hungarian interest were not represented with reasonable emphasis in the relevant international bibliographic databases. Because of the great volume of data, the scientific literature of medicine and related sciences could not be covered in its entirety; this publication may nevertheless give useful information to interested readers and call the attention of those concerned.

  20. ExpEdit: a webserver to explore human RNA editing in RNA-Seq experiments.

    PubMed

    Picardi, Ernesto; D'Antonio, Mattia; Carrabino, Danilo; Castrignanò, Tiziana; Pesole, Graziano

    2011-05-01

    ExpEdit is a web application for assessing RNA editing in human at known or user-specified sites supported by transcript data obtained by RNA-Seq experiments. Mapping data (in SAM/BAM format) or directly sequence reads [in FASTQ/short read archive (SRA) format] can be provided as input to carry out a comparative analysis against a large collection of known editing sites collected in DARNED database as well as other user-provided potentially edited positions. Results are shown as dynamic tables containing University of California, Santa Cruz (UCSC) links for a quick examination of the genomic context. ExpEdit is freely available on the web at http://www.caspur.it/ExpEdit/.
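
    The core comparison ExpEdit performs, checking positions observed in the user's data against a catalogue of known editing sites, can be sketched as a plain position-set lookup; the coordinates and the tab-separated output below are hypothetical and do not reflect ExpEdit's actual formats:

    ```python
    # Minimal sketch of comparing candidate edited positions against a set
    # of known A-to-I editing sites (e.g., as collected in DARNED).
    # Coordinates and the tab-separated layout are hypothetical.

    known_sites = {("chr1", 1023456), ("chr2", 874521), ("chr7", 5512003)}

    candidates = [("chr1", 1023456), ("chr3", 99001), ("chr7", 5512003)]

    for chrom, pos in candidates:
        status = "known" if (chrom, pos) in known_sites else "novel"
        print(f"{chrom}\t{pos}\t{status}")
    ```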

  1. Internal combustion engine fuel controls. December 1970-December 1989 (Citations from the US Patent data base). Report for December 1970-December 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1990-01-01

    This bibliography contains citations of selected patents concerning fuel control devices, and methods used to regulate speed and load in internal combustion engines. Techniques utilized to control air-fuel ratios by sensing pressure, temperature, and exhaust composition, and the employment of electronic and feedback devices are discussed. Methods used for engine protection and optimum fuel conservation are considered. (This updated bibliography contains 327 citations, 160 of which are new entries to the previous edition.)

  2. Database Search Strategies & Tips. Reprints from the Best of "ONLINE" [and]"DATABASE."

    ERIC Educational Resources Information Center

    Online, Inc., Weston, CT.

    Reprints of 17 articles presenting strategies and tips for searching databases online appear in this collection, which is one in a series of volumes of reprints from "ONLINE" and "DATABASE" magazines. Edited for information professionals who use electronically distributed databases, these articles address such topics as: (1)…

  3. Identification of genomic sites for CRISPR/Cas9-based genome editing in the Vitis vinifera genome.

    PubMed

    Wang, Yi; Liu, Xianju; Ren, Chong; Zhong, Gan-Yuan; Yang, Long; Li, Shaohua; Liang, Zhenchang

    2016-04-21

    CRISPR/Cas9 has been recently demonstrated as an effective and popular genome editing tool for modifying genomes of humans, animals, microorganisms, and plants. Success of such genome editing is highly dependent on the availability of suitable target sites in the genomes to be edited. Many specific target sites for CRISPR/Cas9 have been computationally identified for several annual model and crop species, but such sites have not been reported for perennial, woody fruit species. In this study, we identified and characterized five types of CRISPR/Cas9 target sites in the widely cultivated grape species Vitis vinifera and developed a user-friendly database for editing grape genomes in the future. A total of 35,767,960 potential CRISPR/Cas9 target sites were identified from grape genomes in this study. Among them, 22,597,817 target sites were mapped to specific genomic locations and 7,269,788 were found to be highly specific. Protospacers and PAMs were found to distribute uniformly and abundantly in the grape genomes. They were present in all the structural elements of genes with the coding region having the highest abundance. Five PAM types, TGG, AGG, GGG, CGG and NGG, were observed. With the exception of the NGG type, they were abundantly present in the grape genomes. Synteny analysis of similar genes revealed that the synteny of protospacers matched the synteny of homologous genes. A user-friendly database containing protospacers and detailed information of the sites was developed and is available for public use at the Grape-CRISPR website ( http://biodb.sdau.edu.cn/gc/index.html ). Grape genomes harbour millions of potential CRISPR/Cas9 target sites. These sites are widely distributed among and within chromosomes with predominant abundance in the coding regions of genes. We developed a publicly-accessible Grape-CRISPR database for facilitating the use of the CRISPR/Cas9 system as a genome editing tool for functional studies and molecular breeding of grapes. Among other functions, the database allows users to identify and select multi-protospacers for editing similar sequences in grape genomes simultaneously.
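
    The basic operation described, locating a 20-nt protospacer immediately 5' of an NGG PAM, can be sketched with a short regular-expression scan over the forward strand; the toy sequence is invented and this is not the pipeline used to build Grape-CRISPR:

    ```python
    # Illustrative scan for SpCas9-style target sites: a 20-nt protospacer
    # followed by an NGG PAM, on the forward strand of a toy sequence.

    import re

    def find_targets(seq):
        """Yield (start, protospacer, pam) for 20-nt protospacers with an NGG PAM."""
        pattern = re.compile(r"(?=([ACGT]{20})([ACGT]GG))")  # lookahead keeps overlapping hits
        for m in pattern.finditer(seq.upper()):
            yield m.start(), m.group(1), m.group(2)

    toy_seq = "ATGCGTACCGGTTAACGGATCCTTGACCATGGCAAATTTGGCCGG"
    for start, protospacer, pam in find_targets(toy_seq):
        print(start, protospacer, pam)
    ```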

  4. [International bibliographic databases--Current Contents on disk and in FTP format (Internet): presentation and guide].

    PubMed

    Bloch-Mouillet, E

    1999-01-01

    This paper aims to provide technical and practical advice about finding references using Current Contents on disk (Macintosh or PC) or via the Internet (FTP). Seven editions are published each week. They are all organized in the same way and have the same search engine. The Life Sciences edition, extensively used in medical research, is presented here in detail, as an example. This methodological note explains, in French, how to use this reference database. It is designed to be a practical guide for browsing and searching the database, and particularly for creating search profiles adapted to the needs of researchers.

  5. Analytical Studies of Three-Dimensional Combustion Processes

    DTIC Science & Technology

    1989-05-01

    ...enthalpy, and momentum are calculated for each finite volume by summing the contributions from all groups of droplets.

  6. [Relevance of the hemovigilance regional database for the shared medical file identity server].

    PubMed

    Doly, A; Fressy, P; Garraud, O

    2008-11-01

    The French Health Products Safety Agency coordinates the national initiative for computerization of blood-product traceability within regional blood banks and public and private hospitals. The Auvergne-Loire Regional French Blood Service, based in Saint-Etienne, together with a number of public hospitals, set up a transfusion data network named EDITAL. After four years of progressive implementation and experimentation, software enabling standardized data exchange has built up a regional nominative database, endorsed by the Traceability Computerization National Committee in 2004. This database now provides secured web access to a regional transfusion history, enabling biologists and all hospital and family practitioners to manage patient follow-up. By running independently of its partners' software, the EDITAL database provides a reference for the regional identity server.

  7. Experimental study of H2O spectroscopic parameters in the near-IR (6940-7440 cm-1) for gas sensing applications at elevated temperature

    NASA Astrophysics Data System (ADS)

    Liu, Xiang; Zhou, Xin; Jeffries, Jay B.; Hanson, Ronald K.

    2007-02-01

    Tunable diode laser (TDL) absorption sensors of water vapor are attractive for temperature, gas composition, velocity, pressure, and mass flux measurements in a variety of practical applications including hydrocarbon-fueled combustion systems. Optimized design of these sensors requires a complete catalog of the assigned transitions with accurate spectroscopic data; our particular interest has been in the 2ν1, 2ν3, and ν1+ν3 bands in the near-IR where telecommunications diode lasers are available. In support of this need, fully resolved absorption spectra of H2O vapor in the spectral range of 6940-7440 cm-1 (1344-1441 nm) have been measured as a function of temperature (296-1000 K) and pressure (1-800 Torr), and quantitative spectroscopic parameters inferred from these spectra were compared to published data from Toth, HITRAN 2000 and HITRAN 2004. The peak absorbances were measured for more than 100 strong transitions at 296 and 828 K, and linestrengths were determined for 47 strong lines in this region. In addition to reference linestrengths S(296 K), the air-broadening coefficients γair(296 K) and temperature exponents n were inferred for strong transitions in five narrow regions, near 7185.60, 7203.89, 7405.11, 7426.14 and 7435.62 cm-1, that had been targeted as attractive for future diagnostics applications. Most of the measured results, determined within an accuracy of 5%, are found to be in better agreement with HITRAN 2004 than with earlier editions of this database. Large discrepancies (>10%) between the measurements and the HITRAN 2004 database are identified for some of the probed transitions. These new spectroscopic data for H2O provide a useful test of the sensor design capabilities of HITRAN 2004 for combustion and other applications at elevated temperatures.
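
    The temperature dependence reported here follows the usual HITRAN-style conventions; as a rough illustration (not the authors' fitting code), the air-broadened halfwidth is commonly scaled from its 296 K reference value with the temperature exponent n, for example:

    ```python
    # Illustrative scaling of an air-broadened halfwidth using the common
    # HITRAN-style power law gamma(T) = gamma_air(296 K) * (296/T)**n.
    # The numerical values below are placeholders, not results from this paper.

    def gamma_air(gamma_ref, n, T, T_ref=296.0):
        """Air-broadened halfwidth (per unit pressure) at temperature T."""
        return gamma_ref * (T_ref / T) ** n

    print(gamma_air(gamma_ref=0.045, n=0.65, T=828.0))   # cm-1/atm at 828 K
    ```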

  8. Psychology's struggle for existence: Second edition, 1913.

    PubMed

    Wundt, Wilhelm; Lamiell, James T

    2013-08-01

    Presents an English translation of Wilhelm Wundt's Psychology's struggle for existence: Second edition, 1913, by James T. Lamiell in August, 2012. In his essay, Wundt advised against the impending divorce of psychology from philosophy. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  9. Applications of Technology to CAS Data-Base Production.

    ERIC Educational Resources Information Center

    Weisgerber, David W.

    1984-01-01

    Reviews the economic importance of applying computer technology to Chemical Abstracts Service database production from 1973 to 1983. Database building, technological applications for editorial processing (online editing, Author Index Manufacturing System), and benefits (increased staff productivity, reduced rate of increase of cost of services,…

  10. Military Curriculum Materials for Vocational and Technical Education. Engine Principles, 8-3. Edition 5.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This individualized, self-paced course for independent study in engine principles has been adapted from military curriculum materials for vocational education use. The course provides the student with basic information on engine principles including different kinds of combustion engines, lubrication systems, and cooling systems. It is organized…

  11. The GLAS editing procedures for the FGGE level II-B data collected during SOP-1 and 2

    NASA Technical Reports Server (NTRS)

    Baker, W.; Edelmann, D.; Carus, H.

    1981-01-01

    The modifications made to the FGGE Level II-b data are discussed and the FORTRAN program developed to perform the modifications is described. It is suggested that the edited database is the most accurate one available for FGGE SOP-1 and 2.

  12. The International Handbook of Universities. Twenty-Second Edition

    ERIC Educational Resources Information Center

    Palgrave Macmillan, 2010

    2010-01-01

    The new "International Handbook of Universities" is now 2-volumes and includes single-user access to World Higher Education Database Online. This "Twenty-second Edition" is the most comprehensive guide to university-level education worldwide, providing detailed information on higher education institutions that offer at least a post-graduate degree…

  13. Interoperability, Data Control and Battlespace Visualization using XML, XSLT and X3D

    DTIC Science & Technology

    2003-09-01

    ...Rosenthal, Arnon; Seligman, Len; Costello, Roger. "XML, Databases, and Interoperability," Federal Database Colloquium, AFCEA, San Diego, 1999. ...Linda, Mastering XML, Premium Edition, SYBEX, 2001. Wooldridge, Michael, An Introduction to MultiAgent Systems, Wiley, 2002. PAPERS: Abernathy, M...

  14. Artemis and ACT: viewing, annotating and comparing sequences stored in a relational database.

    PubMed

    Carver, Tim; Berriman, Matthew; Tivey, Adrian; Patel, Chinmay; Böhme, Ulrike; Barrell, Barclay G; Parkhill, Julian; Rajandream, Marie-Adèle

    2008-12-01

    Artemis and Artemis Comparison Tool (ACT) have become mainstream tools for viewing and annotating sequence data, particularly for microbial genomes. Since its first release, Artemis has been continuously developed and supported with additional functionality for editing and analysing sequences based on feedback from an active user community of laboratory biologists and professional annotators. Nevertheless, its utility has been somewhat restricted by its limitation to reading and writing from flat files. Therefore, a new version of Artemis has been developed, which reads from and writes to a relational database schema, and allows users to annotate more complex, often large and fragmented, genome sequences. Artemis and ACT have now been extended to read and write directly to the Generic Model Organism Database (GMOD, http://www.gmod.org) Chado relational database schema. In addition, a Gene Builder tool has been developed to provide structured forms and tables to edit coordinates of gene models and edit functional annotation, based on standard ontologies, controlled vocabularies and free text. Artemis and ACT are freely available (under a GPL licence) for download (for MacOSX, UNIX and Windows) at the Wellcome Trust Sanger Institute web sites: http://www.sanger.ac.uk/Software/Artemis/ http://www.sanger.ac.uk/Software/ACT/

  15. Library Micro-Computing, Vol. 2. Reprints from the Best of "ONLINE" [and]"DATABASE."

    ERIC Educational Resources Information Center

    Online, Inc., Weston, CT.

    Reprints of 19 articles pertaining to library microcomputing appear in this collection, the second of two volumes on this topic in a series of volumes of reprints from "ONLINE" and "DATABASE" magazines. Edited for information professionals who use electronically distributed databases, these articles address such topics as: (1)…

  16. Major research topics in combustion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hussaini, M.Y.; Kumar, A.; Voigt, R.G.

    1992-01-01

    The Institute for Computer Applications in Science and Engineering (ICASE) and NASA Langley Research Center (LaRC) hosted a workshop on October 2-4, 1989 to discuss some combustion problems of technological interest to LaRC and to foster interaction with the academic community in these research areas. The topics chosen for this purpose were flame structure, flame holding/extinction, chemical kinetics, turbulence-kinetics interaction, transition to detonation, and reacting free shear layers. This document contains the papers and edited versions of general discussions on these topics. The lead paper set the stage for the meeting by discussing the status and issues of supersonic combustion relevant to the scramjet engine. Experts were then called upon to review the current knowledge in the aforementioned areas, to focus on how this knowledge can be extended and applied to high-speed combustion, and to suggest future directions of research in these areas.

  17. Data-Base Software For Tracking Technological Developments

    NASA Technical Reports Server (NTRS)

    Aliberti, James A.; Wright, Simon; Monteith, Steve K.

    1996-01-01

    Technology Tracking System (TechTracS) computer program developed for use in storing and retrieving information on technology and related patent information developed under auspices of NASA Headquarters and NASA's field centers. Contents of data base include multiple scanned still images and quick-time movies as well as text. TechTracS includes word-processing, report-editing, chart-and-graph-editing, and search-editing subprograms. Extensive keyword searching capabilities enable rapid location of technologies, innovators, and companies. System performs routine functions automatically and serves multiple users.

  18. The HITRAN molecular data base - Editions of 1991 and 1992

    NASA Technical Reports Server (NTRS)

    Rothman, Laurence S.; Gamache, R. R.; Tipping, R. H.; Rinsland, C. P.; Smith, M. A. H.; Benner, D. C.; Devi, V. M.; Flaud, J.-M.; Camy-Peyret, C.; Perrin, A.

    1992-01-01

    We describe in this paper the modifications, improvements, and enhancements to the HITRAN molecular absorption database that have occurred in the two editions of 1991 and 1992. The current database includes line parameters for 31 species and their isotopomers that are significant for terrestrial atmospheric studies. This line-by-line portion of HITRAN presently contains about 709,000 transitions between 0 and 23,000 cm-1 and contains three molecules not present in earlier versions: COF2, SF6, and H2S. The HITRAN compilation has substantially more information on chlorofluorocarbons and other molecular species that exhibit dense spectra which are not amenable to line-by-line representation. User access to the database has been improved, and new media forms are now available for use on personal computers.

  19. Emissions & Generation Resource Integrated Database (eGRID), eGRID2012

    EPA Pesticide Factsheets

    The Emissions & Generation Resource Integrated Database (eGRID) is a comprehensive source of data on the environmental characteristics of almost all electric power generated in the United States. These environmental characteristics include air emissions for nitrogen oxides, sulfur dioxide, carbon dioxide, methane, and nitrous oxide; emissions rates; net generation; resource mix; and many other attributes. eGRID2012 Version 1.0 is the eighth edition of eGRID, which contains the complete release of year 2009 data, as well as year 2007, 2005, and 2004 data. For year 2009 data, all the data are contained in a single Microsoft Excel workbook, which contains boiler, generator, plant, state, power control area, eGRID subregion, NERC region, U.S. total and grid gross loss factor tabs. Full documentation, summary data, eGRID subregion and NERC region representational maps, and GHG emission factors are also released in this edition. The fourth edition of eGRID, eGRID2002 Version 2.01, containing year 1996 through 2000 data is located on the eGRID Archive page (http://www.epa.gov/cleanenergy/energy-resources/egrid/archive.html). The current edition of eGRID and the archived edition of eGRID contain the following years of data: 1996 - 2000, 2004, 2005, and 2007. eGRID has no other years of data.

  20. U.S. Army Natick Soldier Research, Development & Engineering Center Testing Facilities And Equipment. Second Edition

    DTIC Science & Technology

    2011-04-01

    Equipment-list fragments: Freeze Dryer; High-Pressure Processing; Microwave Digestive...; PP1 Power Platform Energy Analyzer; Quintox Gas Combustion Analyzer; FLIR Systems SC2000 Thermacam Handheld IR. ...electronically directly to the contractor or printed on plotter paper, oak tag, or on CD; alloy steel, stainless steel, aluminum, copper and copper alloys.

  1. The HITRAN2016 molecular spectroscopic database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon, I. E.; Rothman, L. S.; Hill, C.

    This paper describes the contents of the 2016 edition of the HITRAN molecular spectroscopic compilation. The new edition replaces the previous HITRAN edition of 2012 and its updates during the intervening years. The HITRAN molecular absorption compilation is comprised of five major components: the traditional line-by-line spectroscopic parameters required for high-resolution radiative-transfer codes, infrared absorption cross-sections for molecules not yet amenable to representation in a line-by-line form, collision-induced absorption data, aerosol indices of refraction, and general tables such as partition sums that apply globally to the data. The new HITRAN is greatly extended in terms of accuracy, spectral coverage, additional absorption phenomena, added line-shape formalisms, and validity. Moreover, molecules, isotopologues, and perturbing gases have been added that address the issues of atmospheres beyond the Earth. Of considerable note, experimental IR cross-sections for almost 200 additional significant molecules have been added to the database.

  2. Library Micro-Computing, Vol. 1. Reprints from the Best of "ONLINE" [and]"DATABASE."

    ERIC Educational Resources Information Center

    Online, Inc., Weston, CT.

    Reprints of 18 articles pertaining to library microcomputing appear in this collection, the first of two volumes on this topic in a series of volumes of reprints from "ONLINE" and "DATABASE" magazines. Edited for information professionals who use electronically distributed databases, these articles address such topics as: (1) an integrated library…

  3. The Research Potential of the Electronic OED Database at the University of Waterloo: A Case Study.

    ERIC Educational Resources Information Center

    Berg, Donna Lee

    1991-01-01

    Discusses the history and structure of the online database of the second edition of the Oxford English Dictionary (OED) and the software tools developed at the University of Waterloo to manipulate the unusually complex database. Four sample searches that indicate some types of problems that might be encountered are appended. (DB)

  4. DNAAlignEditor: DNA alignment editor tool

    PubMed Central

    Sanchez-Villeda, Hector; Schroeder, Steven; Flint-Garcia, Sherry; Guill, Katherine E; Yamasaki, Masanori; McMullen, Michael D

    2008-01-01

    Background: With advances in DNA re-sequencing methods and next-generation parallel sequencing approaches, there has been a large increase in genomic efforts to define and analyze the sequence variability present among individuals within a species. For very polymorphic species such as maize, this has led to a need for intuitive, user-friendly software that aids the biologist, often with naïve programming capability, in tracking, editing, displaying, and exporting multiple individual sequence alignments. To fill this need we have developed a novel DNA alignment editor. Results: We have generated a nucleotide sequence alignment editor (DNAAlignEditor) that provides an intuitive, user-friendly interface for manual editing of multiple sequence alignments with functions for input, editing, and output of sequence alignments. The color-coding of nucleotide identity and the display of associated quality scores aid in the manual alignment editing process. DNAAlignEditor works as a client/server tool having two main components: a relational database that collects the processed alignments and a user interface connected to the database through universal data access connectivity drivers. DNAAlignEditor can be used either as a stand-alone application or as a network application with multiple users concurrently connected. Conclusion: We anticipate that this software will be of general interest to biologists and population geneticists for editing DNA sequence alignments and analyzing natural sequence variation regardless of species, and will be particularly useful for manual alignment editing of sequences in species with high levels of polymorphism. PMID:18366684

  5. The AJCC 8th Edition Staging System for Soft Tissue Sarcoma of the Extremities or Trunk: A Cohort Study of the SEER Database.

    PubMed

    Cates, Justin M M

    2018-02-01

    Background: The AJCC recently published the 8th edition of its cancer staging system. Significant changes were made to the staging algorithm for soft tissue sarcoma (STS) of the extremities or trunk, including the addition of 2 additional T (size) classifications in lieu of tumor depth and grouping lymph node metastasis (LNM) with distant metastasis as stage IV disease. Whether these changes improve staging system performance is questionable. Patients and Methods: This retrospective cohort analysis of 21,396 adult patients with STS of the extremity or trunk in the SEER database compares the AJCC 8th edition staging system with the 7th edition and a newly proposed staging algorithm using a variety of statistical techniques. The effect of tumor size on disease-specific survival was assessed by flexible, nonlinear Cox proportional hazard regression using restricted cubic splines and fractional polynomials. Results: The slope of covariate-adjusted log hazards for sarcoma-specific survival decreases for tumors >8 cm in greatest dimension, limiting prognostic information contributed by the new T4 classification in the AJCC 8th edition. Anatomic depth independently provides significant prognostic information. LNM is not equivalent to distant, non-nodal metastasis. Based on these findings, an alternative staging system is proposed and demonstrated to outperform both AJCC staging schemes. The analyses presented also disclose no evidence of improved clinical performance of the 8th edition compared with the previous edition. Conclusions: The AJCC 8th edition staging system for STS is no better than the previous 7th edition. Instead, a proposed staging system based on histologic grade, tumor size, and anatomic depth shows significantly higher predictive accuracy, with higher model concordance than either AJCC staging system. Changes to existing staging systems should improve the performance of prognostic models. Until such improvements are documented, AJCC committees should refrain from modifying established staging schemes. Copyright © 2018 by the National Comprehensive Cancer Network.
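
    Staging systems are typically compared on discrimination, for example via a concordance index computed against observed survival; the snippet below is only a schematic of such a comparison using the lifelines package on invented data, not the SEER analysis performed in the paper:

    ```python
    # Schematic comparison of two staging schemes by concordance index.
    # Data are invented; the lifelines package is assumed to be installed.

    from lifelines.utils import concordance_index

    months   = [12, 30, 7, 55, 20, 41]     # follow-up time
    observed = [1, 0, 1, 0, 1, 1]          # 1 = died of disease, 0 = censored
    stage_a  = [3, 1, 4, 1, 2, 3]          # stage under scheme A (higher = worse)
    stage_b  = [2, 1, 4, 1, 3, 3]          # stage under scheme B

    # Negate stage so that a higher score corresponds to longer expected survival.
    print(concordance_index(months, [-s for s in stage_a], observed))
    print(concordance_index(months, [-s for s in stage_b], observed))
    ```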

  6. Scale-Independent Relational Query Processing

    DTIC Science & Technology

    2013-10-04

    ...source options are also available, including PostgreSQL, MySQL, and SQLite. These modern relational databases are generally very complex software systems. ...and Their Application to Data Stream Management. IGI Global, 2010. [68] George Reese. Database Programming with JDBC and Java, Second Edition. ...

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yesley, M.S.; Ossorio, P.N.

    This report updates and expands the second edition of the ELSI Bibliography, published in 1993. The Bibliography and Supplement provide a comprehensive resource for identifying publications on the major topics related to the ethical, legal and social issues (ELSI) of the Human Genome Project. The Bibliography and Supplement are extracted from a database compiled at Los Alamos National Laboratory with the support of the Office of Energy Research, US Department of Energy. The second edition of the ELSI Bibliography was dated May 1993 but included publications added to the database until fall 1993. This Supplement reflects approximately 1,000 entries added to the database during the past year, bringing the total to approximately 7,000 entries. More than half of the new entries were published in the last year, and the remainder are earlier publications not previously included in the database. Most of the new entries were published in the academic and professional literature. The remainder are press reports from newspapers of record and scientific journals. The topical listing of the second edition has been followed in the Supplement, with a few changes. The topics of Cystic Fibrosis, Huntington's Disease, and Sickle Cell Anemia have been combined in a single topic, Disorders. Also, all the entries published in the past year are included in a new topic, Publications: September 1993-September 1994, which provides a comprehensive view of recent reporting and commentary on the science and ELSI of genetics.

  8. A new edition of the Mars 1:5,000,000 map series

    NASA Technical Reports Server (NTRS)

    Batson, R. M.; Mcewen, Alfred S.; Wu, Sherman S. C.

    1991-01-01

    A new edition of the Mars 1:5,000,000 scale map series is in preparation. Two sheets will be made for each quadrangle. Sheet one will show shaded relief, contours, and nomenclature. Sheet 2 will be a full-color photomosaic prepared on the Mars digital image model (MDIM) base co-registered with the Mars low-resolution color database. The latter will have an abbreviated graticule (latitude/longitude ticks only) and no other line overprint. The four major databases used to assemble this series are now virtually complete. These are: (1) Viking-revised shaded relief maps at 1:5,000,000 scale; (2) contour maps at 1:2,000,000 scale; (3) the Mars digital image model; and (4) a color image mosaic of Mars. Together, these databases form the most complete planetwide cartographic definition of Mars that can be compiled with existing data. The new edition will supersede the published Mars 1:5,000,000 scale maps, including the original shaded relief and topographic maps made primarily with Mariner 9 data and the Viking-revised shaded relief and controlled photomosaic series. Publication of the new series will begin in late 1991 or early 1992, and it should be completed in two years.

  9. Wolf Testing: Open Source Testing Software

    NASA Astrophysics Data System (ADS)

    Braasch, P.; Gay, P. L.

    2004-12-01

    Wolf Testing is software for easily creating and editing exams. Wolf Testing allows the user to create an exam from a database of questions, view it on screen, and easily print it along with the corresponding answer guide. The questions can be multiple choice, short answer, long answer, or true and false varieties. This software can be accessed securely from any location, allowing the user to easily create exams from home. New questions, which can include associated pictures, can be added through a web-interface. After adding in questions, they can be edited, deleted, or duplicated into multiple versions. Long-term test creation is simplified, as you are able to quickly see what questions you have asked in the past and insert them, with or without editing, into future tests. All tests are archived in the database. Written in PHP and MySQL, this software can be installed on any UNIX / Linux platform, including Macintosh OS X. The secure interface keeps students out, and allows you to decide who can create tests and who can edit information already in the database. Tests can be output as either html with pictures or rich text without pictures, and there are plans to add PDF and MS Word formats as well. We would like to thank Dr. Wolfgang Rueckner and the Harvard University Science Center for providing incentive to start this project, computers and resources to complete this project, and inspiration for the project's name. We would also like to thank Dr. Ronald Newburgh for his assistance in beta testing.

  10. Profiling RNA editing in human tissues: towards the inosinome Atlas

    PubMed Central

    Picardi, Ernesto; Manzari, Caterina; Mastropasqua, Francesca; Aiello, Italia; D’Erchia, Anna Maria; Pesole, Graziano

    2015-01-01

    Adenine to Inosine RNA editing is a widespread co- and post-transcriptional mechanism mediated by ADAR enzymes acting on double stranded RNA. It has a plethora of biological effects, appears to be particularly pervasive in humans with respect to other mammals, and is implicated in a number of diverse human pathologies. Here we present the first human inosinome atlas comprising 3,041,422 A-to-I events identified in six tissues from three healthy individuals. Matched directional total-RNA-Seq and whole genome sequence datasets were generated and analysed within a dedicated computational framework, also capable of detecting hyper-edited reads. Inosinome profiles are tissue specific and edited gene sets consistently show enrichment of genes involved in neurological disorders and cancer. Overall frequency of editing also varies, but is strongly correlated with ADAR expression levels. The inosinome database is available at: http://srv00.ibbe.cnr.it/editing/. PMID:26449202
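
    Per-site editing frequency in RNA-Seq data is usually summarized as the fraction of G-supporting reads among A plus G reads, since inosine is read as guanosine during reverse transcription; the minimal sketch below computes that level from hypothetical base counts and is not the pipeline used to build the atlas:

    ```python
    # Minimal sketch: per-site editing level as G / (A + G) from RNA-Seq
    # base counts at candidate A-to-I positions. Counts below are invented.

    site_counts = {
        ("chr1", 154567890): {"A": 37, "G": 13},
        ("chr6", 102337512): {"A": 80, "G": 4},
    }

    for (chrom, pos), counts in site_counts.items():
        a, g = counts["A"], counts["G"]
        level = g / (a + g) if (a + g) else float("nan")
        print(f"{chrom}:{pos}\tediting_level={level:.3f}")
    ```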

  11. Updates to Simulation of a Single-Element Lean-Direct Injection Combustor Using a Polyhedral Mesh Derived From Hanging-Node Elements

    NASA Technical Reports Server (NTRS)

    Wey, Changju Thomas; Liu, Nan-Suey

    2014-01-01

    This paper summarizes the procedures of inserting a thin-layer mesh to existing inviscid polyhedral mesh either with or without hanging-node elements as well as presents sample results from its applications to the numerical solution of a single-element LDI combustor using a releasable edition of the National Combustion Code (NCC).

  12. Updates to Simulation of a Single-Element Lean-Direct Injection Combustor Using a Polyhedral Mesh Derived from Hanging-Node Elements

    NASA Technical Reports Server (NTRS)

    Wey, Thomas; Liu, Nan-Suey

    2014-01-01

    This paper summarizes the procedures of inserting a thin-layer mesh to existing inviscid polyhedral mesh either with or without hanging-node elements as well as presents sample results from its applications to the numerical solution of a single-element LDI combustor using a releasable edition of the National Combustion Code (NCC).

  13. Permitting Considerations for Installation of Inlet Air Foggers on Simple Cycle Combustion Turbines at the Duke Power Lincoln Combustion Turbine Facility

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  14. A Bibliography of Publications about the Educational Resources Information Center (Covering the Period 1985-1988).

    ERIC Educational Resources Information Center

    Brandhorst, Ted, Ed.

    The result of a comprehensive search for writings about the Educational Resources Information Center (ERIC) published between 1985 and 1988, this annotated bibliography lists 107 documents and journal articles about ERIC that were entered in the ERIC database during that period. The 1964-1978 edition cited 269 items. The 1979-1984 edition cited…

  15. Combustion of interacting droplet arrays in a microgravity environment

    NASA Technical Reports Server (NTRS)

    Dietrich, Daniel L.

    1995-01-01

    This research program involves the study of one- and two-dimensional arrays of droplets in a buoyancy-free environment. The purpose of the work is to extend the database and theories that exist for single droplets into the regime where droplet interactions are important. The eventual goal is to use the results of this work as inputs to models of spray combustion, where droplets seldom burn individually; instead, the combustion history of a droplet is strongly influenced by the presence of the neighboring droplets. Throughout the course of the work, a number of related aspects of isolated droplet combustion have also been investigated. This paper will review our progress in microgravity droplet array combustion, advanced diagnostics (specifically L2) applied to isolated droplet combustion, and radiative extinction of large droplet flames. A small-scale droplet combustion experiment being developed for the Space Shuttle will also be described.

  16. Competitive Intelligence: Finding the Clues Online.

    ERIC Educational Resources Information Center

    Combs, Richard; Moorhead, John

    1990-01-01

    Defines and discusses competitive intelligence for business decision making and suggests the use of online databases to start looking for relevant information. The best databases to use are described, designing the search strategy is explained, reviewing and editing results are discussed, and the presentation of results is considered. (LRW)

  17. Choosing the Right Database Management Program.

    ERIC Educational Resources Information Center

    Vockell, Edward L.; Kopenec, Donald

    1989-01-01

    Provides a comparison of four database management programs commonly used in schools: AppleWorks, the DOS 3.3 and ProDOS versions of PFS, and MECC's Data Handler. Topics discussed include information storage, spelling checkers, editing functions, search strategies, graphs, printout formats, library applications, and HyperCard. (LRW)

  18. A Parent's Guide to the ERIC Database. Where To Turn with Your Questions about Schooling. Revised Edition.

    ERIC Educational Resources Information Center

    Howley, Craig B.; And Others

    This guide explains what the Educational Resources Information Center (ERIC) database is and how it can be used by parents to learn more about schooling and parenting. The guide also presents sample records of 55 documents in the ERIC database. The cited resources are particularly relevant to parents' concerns about meeting children's basic needs,…

  19. Updates to Simulation of a Single-Element Lean-Direct Injection Combustor Using Arbitary Polyhedral Meshes

    NASA Technical Reports Server (NTRS)

    Wey, Thomas; Liu, Nan-Suey

    2015-01-01

    This paper summarizes the procedures of (1) generating control volumes anchored at the nodes of a mesh; and (2) generating staggered control volumes via mesh reconstructions, in terms of either mesh realignment or mesh refinement, as well as presents sample results from their applications to the numerical solution of a single-element LDI combustor using a releasable edition of the National Combustion Code (NCC).

  20. Assessment of the Flammability of Aircraft Hydraulic Fluids

    DTIC Science & Technology

    1979-07-01

    ...and C. Y. Ito, Editors, "Thermophysical Properties of Selected Aerospace Materials," Part 1, Thermal Radiation Properties, Purdue University, 1976. 9. J. M. Kuchta, "Summary of..." ...propagation properties, and heats of combustion of a number of aircraft fluids. These included currently used (cont'd)...

  1. Combustion Science to Reduce PM Emissions for Military Platforms

    DTIC Science & Technology

    2012-01-01

    Front-matter fragments: 7.0 References; Appendix: List of Archival Publications and Conference Papers. Abbreviations: HITRAN, database of infra-red spectra; HP, high pressure; HW, Harris and Weiner; ICCD, intensified charge-coupled device; ID, internal diameter; IR, ... An archival publication based on this work received a distinguished outstanding paper award at the 32nd International Combustion Symposium.

  2. Performance Evaluation of a Database System in a Multiple Backend Configurations,

    DTIC Science & Technology

    1984-10-01

    ...leaving a system process, the internal performance measurements of MMSD have been carried out. Methodologies for constructing test databases...access directory data via the AT, EDIT, and CDT. In designing the test database, one of the key concepts is the choice of the directory attributes in...internal timing. These requests are selected since they retrieve the smallest portion of the test database and the processing time for each request is

  3. The Status of Cognitive Psychology Journals: An Impact Factor Approach

    ERIC Educational Resources Information Center

    Togia, Aspasia

    2013-01-01

    The purpose of this study was to examine the impact factor of cognitive psychology journals indexed in the Science and Social Sciences edition of "Journal Citation Reports" ("JCR") database over a period of 10 consecutive years. Cognitive psychology journals were indexed in 11 different subject categories of the database. Their mean impact factor…

  4. Extending the Online Public Access Catalog into the Microcomputer Environment.

    ERIC Educational Resources Information Center

    Sutton, Brett

    1990-01-01

    Describes PCBIS, a database program for MS-DOS microcomputers that features a utility for automatically converting online public access catalog search results stored as text files into structured database files that can be searched, sorted, edited, and printed. Topics covered include the general features of the program, record structure, record…

  5. Using the Internet, Online Services, and CD-ROMs for Writing Research and Term Papers, Second Edition. Neal-Schuman NetGuide Series.

    ERIC Educational Resources Information Center

    Harmon, Charles, Ed.

    Like its predecessor, this second edition combines the best of two traditional generic texts: "how to write a term paper" and "how to use the library." Particularly helpful for high school and first year college students, this guide explains how to utilize online library catalogs, the most commonly available indexes and databases (in print,…

  6. Pulse combustion engineering research laboratory for indirect heating applications (PERL-IH). Final report, October 1, 1989-June 30, 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belles, F.E.

    1993-01-01

    Uncontrolled NOx emissions from a variety of pulse combustors were measured. The implementation of flue-gas recirculation to reduce NOx was studied. A flexible workstation for parametric testing was built and used to study the phasing between pressure and heat release, and effects of fuel/air mixing on performance. Exhaust-pipe heat transfer was analyzed. An acoustic model of pulse combustion was developed. Technical support was provided to manufacturers on noise, ignition and condensation. A computerized bibliographic database on pulse combustion was created.

  7. High-Resolution Photoionization, Photoelectron and Photodissociation Studies. Determination of Accurate Energetic and Spectroscopic Database for Combustion Radicals and Molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, Cheuk-Yiu

    2016-04-25

    The main goal of this research program was to obtain accurate thermochemical and spectroscopic data, such as ionization energies (IEs), 0 K bond dissociation energies, 0 K heats of formation, and spectroscopic constants for radicals and molecules and their ions of relevance to combustion chemistry. Two unique, generally applicable vacuum ultraviolet (VUV) laser photoion-photoelectron apparatuses have been developed in our group, which have been used for high-resolution photoionization, photoelectron, and photodissociation studies for many small molecules of combustion relevance.

  8. Emissions of trace gases and aerosols during the open combustion of biomass in the laboratory

    Treesearch

    Gavin R. McMeeking; Sonia M. Kreidenweis; Stephen Baker; Christian M. Carrico; Judith C. Chow; Jeffrey L. Collett; Wei Min Hao; Amanda S. Holden; Thomas W. Kirchstetter; William C. Malm; Hans Moosmuller; Amy P. Sullivan; Cyle E. Wold

    2009-01-01

    We characterized the gas- and speciated aerosol-phase emissions from the open combustion of 33 different plant species during a series of 255 controlled laboratory burns during the Fire Laboratory at Missoula Experiments (FLAME). The plant species we tested were chosen to improve the existing database for U.S. domestic fuels: laboratory-based emission...

  9. Extended Edited Synoptic Cloud Reports from Ships and Land Stations Over the Globe, 1952-2009 (NDP-026C)

    DOE Data Explorer

    Hahn, C. J. [University of Arizona; Warren, S. G. [University of Washington; Eastman, R.

    1999-08-01

    This database contains surface synoptic weather reports for the entire globe, gathered from various available data sets. The reports were processed, edited, and rewritten to provide a single dataset of individual observations of clouds, spanning the 57 years 1952-2008 for ship data and the 39 years 1971-2009 for land station data. In addition to the cloud portion of the synoptic report, each edited report also includes the associated pressure, present weather, wind, air temperature, and dew point (and sea surface temperature over oceans). This data set is called the "Extended Edited Cloud Report Archive" (EECRA). The EECRA is based solely on visual cloud observations from weather stations, reported in the WMO synoptic code (WMO, 1974). Reports must contain cloud-type information to be included in the archive. Past data sources include those from the Fleet Numerical Oceanographic Center (FNOC, 1971-1976) and the National Centers for Environmental Prediction (NCEP, 1977-1996). This update uses data from a new source, the 'Integrated Surface Database' (ISD, 1997-2009; Smith et al., 2011). Our past analyses of the EECRA identified a subset of 5388 weather stations that were determined to produce reliable day and night observations of cloud amount and type. The update contains observations only from this subset of stations. Details concerning processing, previous problems, contents, and comments are available in the archive's original documentation. The EECRA contains about 81 million cloud observations from ships and 380 million from land stations. The data files are compressed; Unix/Linux users can "uncompress" or "gunzip" them after downloading. If you're interested in the NDP-026C database, then you'll also want to explore its related data products, NDP-026D and NDP-026E.

  10. Narrowing the Gender Gap:Empowering Women through Literacy Programmes: Case Studies from the UNESCO Effective Literacy and Numeracy Practices Database (LitBase) http://www.unesco.org/uil/litbase/. 2nd Edition

    ERIC Educational Resources Information Center

    Hanemann, Ulrike, Ed.

    2015-01-01

    UIL has published a second edition of a collection of case studies of promising literacy programmes that seek to empower women. "Narrowing the Gender Gap: Empowering Women through Literacy Programmes" (originally published in 2013 as "Literacy Programmes with a Focus on Women to Reduce Gender Disparities") responds to the…

  11. Directory of On-Line Networks, Databases and Bulletin Boards on Assistive Technology. Second Edition. RESNA Technical Assistance Project.

    ERIC Educational Resources Information Center

    RESNA: Association for the Advancement of Rehabilitation Technology, Washington, DC.

    This resource directory provides a selective listing of electronic networks, online databases, and bulletin boards that highlight technology-related services and products. For each resource, the following information is provided: name, address, and telephone number; description; target audience; hardware/software needs to access the system;…

  12. CRISPR-Mediated Base Editing Enables Efficient Disruption of Eukaryotic Genes through Induction of STOP Codons.

    PubMed

    Billon, Pierre; Bryant, Eric E; Joseph, Sarah A; Nambiar, Tarun S; Hayward, Samuel B; Rothstein, Rodney; Ciccia, Alberto

    2017-09-21

    Standard CRISPR-mediated gene disruption strategies rely on Cas9-induced DNA double-strand breaks (DSBs). Here, we show that CRISPR-dependent base editing efficiently inactivates genes by precisely converting four codons (CAA, CAG, CGA, and TGG) into STOP codons without DSB formation. To facilitate gene inactivation by induction of STOP codons (iSTOP), we provide access to a database of over 3.4 million single guide RNAs (sgRNAs) for iSTOP (sgSTOPs) targeting 97%-99% of genes in eight eukaryotic species, and we describe a restriction fragment length polymorphism (RFLP) assay that allows the rapid detection of iSTOP-mediated editing in cell populations and clones. To simplify the selection of sgSTOPs, our resource includes annotations for off-target propensity, percentage of isoforms targeted, prediction of nonsense-mediated decay, and restriction enzymes for RFLP analysis. Additionally, our database includes sgSTOPs that could be employed to precisely model over 32,000 cancer-associated nonsense mutations. Altogether, this work provides a comprehensive resource for DSB-free gene disruption by iSTOP. Copyright © 2017 Elsevier Inc. All rights reserved.
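
    The codon logic described above is compact enough to spell out: a cytidine base editor converts C to T on the targeted strand (equivalently, G to A on the opposite strand), so each of the four codons named in the abstract is one edit away from a stop codon. The minimal Python sketch below illustrates that mapping only; it is not code from the paper or its sgRNA database.

        # Minimal sketch of the iSTOP codon logic described in the abstract.
        # A cytidine base editor converts C->T on the edited strand; editing the
        # antisense strand is equivalent to a G->A change on the sense strand.
        STOP_CODONS = {"TAA", "TAG", "TGA"}

        def istop_products(codon):
            """Stop codons reachable from `codon` by a single base edit
            (C->T on the sense strand, or G->A, i.e. C->T on the antisense strand)."""
            products = set()
            for i, base in enumerate(codon):
                if base == "C":
                    edited = codon[:i] + "T" + codon[i + 1:]
                elif base == "G":
                    edited = codon[:i] + "A" + codon[i + 1:]
                else:
                    continue
                if edited in STOP_CODONS:
                    products.add(edited)
            return products

        for codon in ("CAA", "CAG", "CGA", "TGG"):
            print(codon, "->", sorted(istop_products(codon)))
        # CAA -> ['TAA'], CAG -> ['TAG'], CGA -> ['TGA'], TGG -> ['TAG', 'TGA']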

  13. HITRAN2016: Part I. Line lists for H_2O, CO_2, O_3, N_2O, CO, CH_4, and O_2

    NASA Astrophysics Data System (ADS)

    Gordon, Iouli E.; Rothman, Laurence S.; Tan, Yan; Kochanov, Roman V.; Hill, Christian

    2017-06-01

    The HITRAN2016 database is now officially released. A plethora of experimental and theoretical molecular spectroscopic data were collected, evaluated and vetted before compiling the new edition of the database. The database is now distributed through the dynamic user interface HITRANonline (available at www.hitran.org), which offers many flexible options for browsing and downloading the data. In addition, the HITRAN Application Programming Interface (HAPI) offers modern ways to download the HITRAN data and use it to carry out sophisticated calculations. The line-by-line lists for almost all of the 47 HITRAN molecules were updated in comparison with the previous compilation (HITRAN2012). Some of the most important updates for major atmospheric absorbers, such as H_2O, CO_2, O_3, N_2O, CO, CH_4, and O_2, will be presented in this talk, while the trace gases will be presented in the next talk by Y. Tan. The HITRAN2016 database now provides alternative line-shape representations for a number of molecules, as well as broadening by gases dominant in planetary atmospheres. In addition, substantial extension and improvement of the cross-section data is featured, which will be described in a dedicated talk by R. V. Kochanov. The new edition of the database is a substantial step forward to improve retrievals of the planetary atmospheric constituents in comparison with previous editions, while offering new ways of working with the data. The HITRAN database is supported by the NASA AURA and PDART program grants NNX14AI55G and NNX16AG51G. Many spectroscopists and atmospheric scientists worldwide have contributed data to the database or provided invaluable validations. I. E. Gordon, L. S. Rothman, C. Hill, R. V. Kochanov, Y. Tan, et al., The HITRAN2016 Molecular Spectroscopic Database, JQSRT (2017), submitted. C. Hill, I. E. Gordon, R. V. Kochanov, L. Barrett, J. S. Wilzewski, L. S. Rothman, JQSRT 177 (2016) 4-14. R. V. Kochanov, I. E. Gordon, L. S. Rothman, P. Wcislo, C. Hill, J. S. Wilzewski, JQSRT 177 (2016) 15-30. L. S. Rothman, I. E. Gordon, et al., The HITRAN2012 Molecular Spectroscopic Database, JQSRT 113 (2013) 4-50.
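
    HAPI, mentioned above, is distributed as a Python module (hapi.py) from hitran.org. As a hedged sketch of the usual access pattern, the fragment below downloads a water line list and reads two columns; the molecule/isotopologue numbers and the wavenumber window are illustrative choices, not values taken from the talk.

        # Sketch of fetching HITRAN line data with the HAPI Python module (hapi.py).
        # Requires network access; the spectral window below is an arbitrary example.
        from hapi import db_begin, fetch, getColumn

        db_begin("hitran_data")              # local folder caching downloaded tables
        fetch("H2O", 1, 1, 3400.0, 4100.0)   # molecule 1 (H2O), isotopologue 1, cm-1 range

        nu = getColumn("H2O", "nu")          # line positions (cm-1)
        sw = getColumn("H2O", "sw")          # line intensities
        print(len(nu), "lines downloaded; strongest line intensity:", max(sw))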

  14. Cavity Coupled Aeroramp Injector Combustion Study

    DTIC Science & Technology

    2009-06-01

    …Fluorescence; RC-18 (Propulsion Research Cell 18 at Wright-Patterson Air Force Base); Scramjet (Supersonic Combustion Ramjet); TDLAS (Tunable Diode Laser…) …aerothrottle starting device. There were also mass flow meters on the vitiator oxygen and natural gas supplies. The ethylene fuel used in the DMSJ… PSI System 10 Hz; Health Monitoring System ~0.75 Hz; Sensors, A/D Converters, Buffer, Database/Console 5 Hz (Figure 16: Nominal…)

  15. FIREDOC users manual, 3rd edition

    NASA Astrophysics Data System (ADS)

    Jason, Nora H.

    1993-12-01

    FIREDOC is the on-line bibliographic database which reflects the holdings (published reports, journal articles, conference proceedings, books, and audiovisual items) of the Fire Research Information Services (FRIS) at the Building and Fire Research Laboratory (BFRL), National Institute of Standards and Technology (NIST). This manual provides step-by-step procedures for entering and exiting the database via telecommunication lines, as well as a number of techniques for searching the database and processing the results of the searches. This Third Edition is necessitated by the change to a UNIX platform. The new computer allows for faster response time when searching via a modem and, in addition, offers Internet accessibility. FIREDOC may be used with personal computers, using DOS or Windows, or with Macintosh computers and workstations. A new section on how to access the Internet is included, as is one on how to obtain the references of interest. Appendix F: Quick Guide to Getting Started will be useful to both modem and Internet users.

  16. Presence and Accuracy of Drug Dosage Recommendations for Continuous Renal Replacement Therapy in Tertiary Drug Information References

    PubMed Central

    Gorman, Sean K; Slavik, Richard S; Lam, Stefanie

    2012-01-01

    Background: Clinicians commonly rely on tertiary drug information references to guide drug dosages for patients who are receiving continuous renal replacement therapy (CRRT). It is unknown whether the dosage recommendations in these frequently used references reflect the most current evidence. Objective: To determine the presence and accuracy of drug dosage recommendations for patients undergoing CRRT in 4 drug information references. Methods: Medications commonly prescribed during CRRT were identified from an institutional medication inventory database, and evidence-based dosage recommendations for this setting were developed from the primary and secondary literature. The American Hospital Formulary System—Drug Information (AHFS–DI), Micromedex 2.0 (specifically the DRUGDEX and Martindale databases), and the 5th edition of Drug Prescribing in Renal Failure (DPRF5) were assessed for the presence of drug dosage recommendations in the CRRT setting. The dosage recommendations in these tertiary references were compared with the recommendations derived from the primary and secondary literature to determine concordance. Results: Evidence-based drug dosage recommendations were developed for 33 medications administered in patients undergoing CRRT. The AHFS–DI provided no dosage recommendations specific to CRRT, whereas the DPRF5 provided recommendations for 27 (82%) of the medications and the Micromedex 2.0 application for 20 (61%) (13 [39%] in the DRUGDEX database and 16 [48%] in the Martindale database, with 9 medications covered by both). The dosage recommendations were in concordance with evidence-based recommendations for 12 (92%) of the 13 medications in the DRUGDEX database, 26 (96%) of the 27 in the DPRF5, and all 16 (100%) of those in the Martindale database. Conclusions: One prominent tertiary drug information resource provided no drug dosage recommendations for patients undergoing CRRT. However, 2 of the databases in an Internet-based medical information application and the latest edition of a renal specialty drug information resource provided recommendations for a majority of the medications investigated. Most dosage recommendations were similar to those derived from the primary and secondary literature. The most recent edition of the DPRF is the preferred source of information when prescribing dosage regimens for patients receiving CRRT. PMID:22783029

  17. A Novel Computational Strategy to Identify A-to-I RNA Editing Sites by RNA-Seq Data: De Novo Detection in Human Spinal Cord Tissue

    PubMed Central

    Picardi, Ernesto; Gallo, Angela; Galeano, Federica; Tomaselli, Sara; Pesole, Graziano

    2012-01-01

    RNA editing is a post-transcriptional process occurring in a wide range of organisms. In human brain, the A-to-I RNA editing, in which individual adenosine (A) bases in pre-mRNA are modified to yield inosine (I), is the most frequent event. Modulating gene expression, RNA editing is essential for cellular homeostasis. Indeed, its deregulation has been linked to several neurological and neurodegenerative diseases. To date, many RNA editing sites have been identified by next generation sequencing technologies employing massive transcriptome sequencing together with whole genome or exome sequencing. While genome and transcriptome reads are not always available for single individuals, RNA-Seq data are widespread through public databases and represent a relevant source of yet unexplored RNA editing sites. In this context, we propose a simple computational strategy to identify genomic positions enriched in novel hypothetical RNA editing events by means of a new two-steps mapping procedure requiring only RNA-Seq data and no a priori knowledge of RNA editing characteristics and genomic reads. We assessed the suitability of our procedure by confirming A-to-I candidates using conventional Sanger sequencing and performing RNA-Seq as well as whole exome sequencing of human spinal cord tissue from a single individual. PMID:22957051
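
    The signature the strategy above looks for is a position where the reference base is A but aligned RNA-Seq reads repeatedly show G (inosine is reverse-transcribed and sequenced as G). Purely as a conceptual sketch, assuming a pre-computed per-position pileup rather than the authors' two-step mapping pipeline, candidate sites could be flagged as follows; the thresholds and pileup format are assumptions for illustration only.

        # Conceptual sketch only: flag positions where the reference base is A but a
        # substantial fraction of RNA-Seq reads show G (the A-to-I editing signature).
        # Thresholds and the pileup layout are illustrative assumptions.
        def candidate_a_to_i_sites(pileup, min_coverage=10, min_edit_freq=0.1):
            """pileup: dict mapping (chrom, pos) -> (reference_base, base_counts),
            where base_counts is a dict such as {"A": 30, "G": 5}."""
            candidates = []
            for (chrom, pos), (ref, counts) in pileup.items():
                coverage = sum(counts.values())
                if ref != "A" or coverage < min_coverage:
                    continue
                g_freq = counts.get("G", 0) / coverage
                if g_freq >= min_edit_freq:
                    candidates.append((chrom, pos, coverage, round(g_freq, 3)))
            return candidates

        example = {("chr2", 12345): ("A", {"A": 42, "G": 8}),
                   ("chr2", 12390): ("A", {"A": 50})}
        print(candidate_a_to_i_sites(example))   # only the first position is flagged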

  18. Application of real-time cooperative editing in urban planning management system

    NASA Astrophysics Data System (ADS)

    Jing, Changfeng; Liu, Renyi; Liu, Nan; Bao, Weizheng

    2007-06-01

    With the growing business requirements of urban planning bureaus, a cooperative editing (co-edit) function is urgently needed, yet conventional GIS do not support it. To overcome this limitation, a new kind of urban planning management system with co-edit capability is required. Such a system, called PM2006, has been deployed at the Suzhou Urban Planning Bureau and is introduced in this paper. Four main issues of a co-edit system--consistency, response time, data recoverability, and unconstrained operation--are discussed, and solutions to each are proposed. To address these problems in a co-edit GIS system, a data model called FGDB (File and ESRI Geodatabase), a mixed architecture combining file storage with the ESRI Geodatabase, is introduced. The main components of the FGDB data model are a versioned ESRI Geodatabase and a replication architecture; with FGDB, the issues of client response time, spatial data recoverability, and unconstrained operation are resolved. Finally, MapServer, the co-edit map server module, is presented; its main functions are operation serialization and spatial data replication between file-based and versioned data.

  19. Model Validation for Propulsion - On the TFNS and LES Subgrid Models for a Bluff Body Stabilized Flame

    NASA Technical Reports Server (NTRS)

    Wey, Thomas

    2017-01-01

    This paper summarizes the reacting results of simulating a bluff body stabilized flame experiment of Volvo Validation Rig using a releasable edition of the National Combustion Code (NCC). The turbulence models selected to investigate the configuration are the sub-grid scaled kinetic energy coupled large eddy simulation (K-LES) and the time-filtered Navier-Stokes (TFNS) simulation. The turbulence chemistry interaction used is linear eddy mixing (LEM).

  20. Ames Hybrid Combustion Facility

    NASA Technical Reports Server (NTRS)

    Zilliac, Greg; Karabeyoglu, Mustafa A.; Cantwell, Brian; Hunt, Rusty; DeZilwa, Shane; Shoffstall, Mike; Soderman, Paul T.; Bencze, Daniel P. (Technical Monitor)

    2003-01-01

    The report summarizes the design, fabrication, safety features, environmental impact, and operation of the Ames Hybrid-Fuel Combustion Facility (HCF). The facility is used in conducting research into the scalability and combustion processes of advanced paraffin-based hybrid fuels for the purpose of assessing their applicability to practical rocket systems. The facility was designed to deliver gaseous oxygen at rates between 0.5 and 16.0 kg/sec to a combustion chamber operating at pressures ranging from 300 to 900. The required run times were of the order of 10 to 20 sec. The facility proved to be robust and reliable and has been used to generate a database of regression-rate measurements of paraffin at oxygen mass flux levels comparable to those of moderate-sized hybrid rocket motors.

  1. Solar Market Research and Analysis Publications | Solar Research | NREL

    Science.gov Websites

    …lifespan, and saving costs. The report is an expanded edition of an interim report published in 2015. …achieving the SETO 2030 residential PV cost target of $0.05/kWh by identifying and quantifying cost reduction opportunities. Distribution Grid Integration Unit Cost Database: this database contains unit cost…

  2. WGE: a CRISPR database for genome engineering.

    PubMed

    Hodgkins, Alex; Farne, Anna; Perera, Sajith; Grego, Tiago; Parry-Smith, David J; Skarnes, William C; Iyer, Vivek

    2015-09-15

    The rapid development of CRISPR-Cas9 mediated genome editing techniques has given rise to a number of online and stand-alone tools to find and score CRISPR sites for whole genomes. Here we describe the Wellcome Trust Sanger Institute Genome Editing database (WGE), which uses novel methods to compute, visualize and select optimal CRISPR sites in a genome browser environment. The WGE database currently stores single and paired CRISPR sites and pre-calculated off-target information for CRISPRs located in the mouse and human exomes. Scoring and display of off-target sites is simple and intuitive, and filters can be applied to identify high-quality CRISPR sites rapidly. WGE also provides a tool for the design and display of gene targeting vectors in the same genome browser, along with gene models, protein translation and variation tracks. WGE is open, extensible and can be set up to compute and present CRISPR sites for any genome. The WGE database is freely available at www.sanger.ac.uk/htgt/wge. Contact: vvi@sanger.ac.uk or skarnes@sanger.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  3. Internal combustion engine fuel controls. (Latest citations from the US Patent database). Published Search

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-12-01

    The bibliography contains citations of selected patents concerning fuel control devices and methods for use in internal combustion engines. Patents describe air-fuel ratio control, fuel injection systems, evaporative fuel control, and surge-corrected fuel control. Citations also discuss electronic and feedback control, methods for engine protection, and fuel conservation. (Contains a minimum of 232 citations and includes a subject term index and title list.)

  4. Starbase Data Tables: An ASCII Relational Database for Unix

    NASA Astrophysics Data System (ADS)

    Roll, John

    2011-11-01

    Database management is an increasingly important part of astronomical data analysis. Astronomers need easy and convenient ways of storing, editing, filtering, and retrieving data about data. Commercial databases do not provide good solutions for many of the everyday and informal types of database access astronomers need. The Starbase database system with simple data file formatting rules and command line data operators has been created to answer this need. The system includes a complete set of relational and set operators, fast search/index and sorting operators, and many formatting and I/O operators. Special features are included to enhance the usefulness of the database when manipulating astronomical data. The software runs under UNIX, MSDOS and IRAF.
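
    The idea behind Starbase, plain tab-delimited ASCII tables transformed by small composable operators, is easy to imitate. The sketch below filters and sorts a toy catalog in Python as a stand-in for the actual Starbase command-line operators, whose names are not given in the abstract; the column names and the magnitude cut are invented for the example.

        # Stand-in for the Starbase idea: a tab-delimited ASCII table that is
        # searched, sorted, and written back out. Columns and the cut are invented.
        import csv, io

        table = ("name\tra\tdec\tmag\n"
                 "ngc1275\t49.95\t41.51\t11.9\n"
                 "m31\t10.68\t41.27\t3.4\n"
                 "m87\t187.71\t12.39\t8.6\n")

        rows = list(csv.DictReader(io.StringIO(table), delimiter="\t"))
        # "search" step: keep objects brighter than magnitude 10;
        # "sort" step: order the survivors by magnitude.
        bright = sorted((r for r in rows if float(r["mag"]) < 10.0),
                        key=lambda r: float(r["mag"]))
        for r in bright:
            print("\t".join(r[c] for c in ("name", "ra", "dec", "mag")))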

  5. Automated search of control points in surface-based morphometry.

    PubMed

    Canna, Antonietta; Russo, Andrea G; Ponticorvo, Sara; Manara, Renzo; Pepino, Alessandro; Sansone, Mario; Di Salle, Francesco; Esposito, Fabrizio

    2018-04-16

    Cortical surface-based morphometry is based on a semi-automated analysis of structural MRI images. In FreeSurfer, a widespread tool for surface-based analyses, a visual check of gray-white matter borders is followed by the manual placement of control points to drive the topological correction (editing) of segmented data. A novel algorithm combining radial sampling and machine learning is presented for the automated control point search (ACPS). Four data sets with 3 T MRI structural images were used for ACPS validation, including raw data acquired twice in 36 healthy subjects and both raw and FreeSurfer preprocessed data of 125 healthy subjects from public databases. The unedited data from a subgroup of subjects were submitted to manual control point search and editing. The ACPS algorithm was trained on manual control points and tested on new (unseen) unedited data. Cortical thickness (CT) and fractal dimensionality (FD) were estimated in three data sets by reconstructing surfaces from both unedited and edited data, and the effects of editing were compared between manual and automated editing and versus no editing. The ACPS-based editing improved the surface reconstructions similarly to manual editing. Compared to no editing, ACPS-based and manual editing significantly reduced CT and FD in consistent regions across different data sets. Despite the extra processing of control point driven reconstructions, CT and FD estimates were highly reproducible in almost all cortical regions, albeit some problematic regions (e.g. entorhinal cortex) may benefit from different editing. The use of control points improves the surface reconstruction and the ACPS algorithm can automate their search reducing the burden of manual editing. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Market Assessment of Biomass Gasification and Combustion Technology for Small- and Medium-Scale Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, D.; Haase, S.

    2009-07-01

    This report provides a market assessment of gasification and direct combustion technologies that use wood and agricultural resources to generate heat, power, or combined heat and power (CHP) for small- to medium-scale applications. It contains a brief overview of wood and agricultural resources in the U.S.; a description and discussion of gasification and combustion conversion technologies that utilize solid biomass to generate heat, power, and CHP; an assessment of the commercial status of gasification and combustion technologies; a summary of gasification and combustion system economics; a discussion of the market potential for small- to medium-scale gasification and combustion systems; and an inventory of direct combustion system suppliers and gasification technology companies. The report indicates that while direct combustion and close-coupled gasification boiler systems used to generate heat, power, or CHP are commercially available from a number of manufacturers, two-stage gasification systems are largely in development, with a number of technologies currently in demonstration. The report also cites the need for a searchable, comprehensive database of operating combustion and gasification systems that generate heat, power, or CHP built in the U.S., as well as a national assessment of the market potential for the systems.

  7. Environmental Assessment for Tower Construction at the Brandywine Communication Receiver Site, Prince George’s County, Maryland

    DTIC Science & Technology

    2005-05-01

    …mobilization. • Placement of tower guy wires will be adjusted to avoid construction and disturbance to any wetlands or small tributaries through on…include combustion emissions (VOC, NOx, CO, SO2) and fugitive dust (PM10) from mobile heavy-duty diesel- and gasoline-powered equipment and soil…Pollutant Factors, Mobile Sources (AP 42), 4th Edition, U.S. Environmental Protection Agency, Ann Arbor, Michigan. Total estimated emissions for VOC and…

  8. POLLUX: a program for simulated cloning, mutagenesis and database searching of DNA constructs.

    PubMed

    Dayringer, H E; Sammons, S A

    1991-04-01

    Computer support for research in biotechnology has developed rapidly and has provided several tools to aid the researcher. This report describes the capabilities of new computer software developed in this laboratory to aid in the documentation and planning of experiments in molecular biology. The program, POLLUX, provides a graphical medium for the entry, edit and manipulation of DNA constructs and a textual format for display and edit of construct descriptive data. Program operation and procedures are designed to mimic the actual laboratory experiments with respect to capability and the order in which they are performed. Flexible control over the content of the computer-generated displays and program facilities is provided by a mouse-driven menu interface. Programmed facilities for mutagenesis, simulated cloning and searching of the database from networked workstations are described.

  9. The HITRAN2016 molecular spectroscopic database

    NASA Astrophysics Data System (ADS)

    Gordon, I. E.; Rothman, L. S.; Hill, C.; Kochanov, R. V.; Tan, Y.; Bernath, P. F.; Birk, M.; Boudon, V.; Campargue, A.; Chance, K. V.; Drouin, B. J.; Flaud, J.-M.; Gamache, R. R.; Hodges, J. T.; Jacquemart, D.; Perevalov, V. I.; Perrin, A.; Shine, K. P.; Smith, M.-A. H.; Tennyson, J.; Toon, G. C.; Tran, H.; Tyuterev, V. G.; Barbe, A.; Császár, A. G.; Devi, V. M.; Furtenbacher, T.; Harrison, J. J.; Hartmann, J.-M.; Jolly, A.; Johnson, T. J.; Karman, T.; Kleiner, I.; Kyuberis, A. A.; Loos, J.; Lyulin, O. M.; Massie, S. T.; Mikhailenko, S. N.; Moazzen-Ahmadi, N.; Müller, H. S. P.; Naumenko, O. V.; Nikitin, A. V.; Polyansky, O. L.; Rey, M.; Rotger, M.; Sharpe, S. W.; Sung, K.; Starikova, E.; Tashkun, S. A.; Auwera, J. Vander; Wagner, G.; Wilzewski, J.; Wcisło, P.; Yu, S.; Zak, E. J.

    2017-12-01

    This paper describes the contents of the 2016 edition of the HITRAN molecular spectroscopic compilation. The new edition replaces the previous HITRAN edition of 2012 and its updates during the intervening years. The HITRAN molecular absorption compilation is composed of five major components: the traditional line-by-line spectroscopic parameters required for high-resolution radiative-transfer codes, infrared absorption cross-sections for molecules not yet amenable to representation in a line-by-line form, collision-induced absorption data, aerosol indices of refraction, and general tables such as partition sums that apply globally to the data. The new HITRAN is greatly extended in terms of accuracy, spectral coverage, additional absorption phenomena, added line-shape formalisms, and validity. Moreover, molecules, isotopologues, and perturbing gases have been added that address the issues of atmospheres beyond the Earth. Of considerable note, experimental IR cross-sections for almost 300 additional molecules important in different areas of atmospheric science have been added to the database. The compilation can be accessed through www.hitran.org. Most of the HITRAN data have now been cast into an underlying relational database structure that offers many advantages over the long-standing sequential text-based structure. The new structure empowers the user in many ways. It enables the incorporation of an extended set of fundamental parameters per transition, sophisticated line-shape formalisms, easy user-defined output formats, and very convenient searching, filtering, and plotting of data. A powerful application programming interface making use of structured query language (SQL) features for higher-level applications of HITRAN is also provided.
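
    The relational structure and SQL-style filtering described above can be pictured with a small self-contained example. The three-column schema below is a deliberately simplified stand-in, not the actual HITRANonline schema, and the inserted values are illustrative only.

        # Simplified illustration of SQL-style filtering of line-by-line parameters.
        # The schema and values are invented; the real HITRAN relational structure
        # carries many more fields per transition.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE transitions (molecule TEXT, nu REAL, sw REAL)")
        con.executemany("INSERT INTO transitions VALUES (?, ?, ?)",
                        [("H2O", 3651.2, 2.3e-20), ("H2O", 3920.1, 4.0e-23),
                         ("CO2", 2349.1, 3.5e-18)])

        # Strong water lines in a chosen spectral window, ordered by position.
        rows = con.execute(
            "SELECT nu, sw FROM transitions "
            "WHERE molecule = 'H2O' AND nu BETWEEN 3400 AND 4100 AND sw > 1e-22 "
            "ORDER BY nu").fetchall()
        print(rows)   # [(3651.2, 2.3e-20)]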

  10. Getting Started with AppleWorks Data Base. First Edition.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    This manual is a hands-on teaching tool for beginning users of the AppleWorks database software. It was developed to allow Apple IIGS users who are generally familiar with their machine and its peripherals to build a simple AppleWorks database file using version 2.0 or 2.1 of the program, and to store, print, and manipulate the file. The materials…

  11. Dynamic Terrain

    DTIC Science & Technology

    1991-12-30

    …York, 1985. [Serway 86]: Raymond Serway, Physics for Scientists and Engineers, 2nd Edition, Saunders College Publishing, Philadelphia, 1986, pp. 200… Physical Modeling System; 3.4 Realtime Hydrology; 3.5 Soil Dynamics and Kinematics; 4. Database Issues; 4.1 Goals; 4.2 Object Oriented Databases; 4.3 Distributed… Animation System; F. Constraints and Physical Modeling; G. The PM Physical Modeling System; H. Realtime Hydrology; I. A Simplified Model of Soil Slumping

  12. Generation of Comprehensive Surrogate Kinetic Models and Validation Databases for Simulating Large Molecular Weight Hydrocarbon Fuels

    DTIC Science & Technology

    2012-10-25

    …of hydrogen/carbon molar ratio (H/C), derived cetane number (DCN), threshold sooting index (TSI), and average mean molecular weight (MWave) of…diffusive soot extinction configurations. Matching the "real fuel combustion property targets" of hydrogen/carbon molar ratio (H/C), derived cetane number…combustion property targets - hydrogen/carbon molar ratio (H/C), derived cetane number (DCN), threshold sooting index (TSI), and average mean…

  13. Improvements to the Magnetics Information Consortium (MagIC) Paleo and Rock Magnetic Database

    NASA Astrophysics Data System (ADS)

    Jarboe, N.; Minnett, R.; Tauxe, L.; Koppers, A. A. P.; Constable, C.; Jonestrask, L.

    2015-12-01

    The Magnetic Information Consortium (MagIC) database (http://earthref.org/MagIC/) continues to improve the ease of data uploading and editing, the creation of complex searches, data visualization, and data downloads for the paleomagnetic, geomagnetic, and rock magnetic communities. Online data editing is now available and the need for proprietary spreadsheet software is therefore entirely negated. The data owner can change values in the database or delete entries through an HTML 5 web interface that resembles typical spreadsheets in behavior and use. Additive uploading now allows for additions to data sets to be uploaded with a simple drag and drop interface. Searching the database has improved with the addition of more sophisticated search parameters and with the facility to use them in complex combinations. A comprehensive summary view of a search result has been added for increased quick data comprehension while a raw data view is available if one desires to see all data columns as stored in the database. Data visualization plots (ARAI, equal area, demagnetization, Zijderveld, etc.) are presented with the data when appropriate to aid the user in understanding the dataset. MagIC data associated with individual contributions or from online searches may be downloaded in the tab delimited MagIC text file format for subsequent offline use and analysis. With input from the paleomagnetic, geomagnetic, and rock magnetic communities, the MagIC database will continue to improve as a data warehouse and resource.

  14. Challenging a dogma; AJCC 8th staging system is not sufficient to predict outcomes of patients with malignant pleural mesothelioma.

    PubMed

    Abdel-Rahman, Omar

    2017-11-01

    The 8th edition of the malignant pleural mesothelioma (MPM) American Joint Committee on Cancer (AJCC) staging system has been published. The current analysis aims to evaluate its performance in a population-based setting among patients recorded within the surveillance, epidemiology and end results (SEER) database. The SEER database (2004-2013) was accessed through the SEER*Stat program and AJCC 8th edition stage groups were reconstructed. Survival analyses (overall and cancer-specific) were conducted according to the 6th and 8th editions through Kaplan-Meier analysis. A multivariate Cox regression model was also utilized for pairwise comparisons between different prognostic groups for overall and cancer-specific survival. A total of 5382 patients with MPM were identified in the period from 2004 to 2013. According to the 6th edition, significant pairwise P values for overall survival included: IA vs. III (P=0.027); IA vs. IV (P<0.0001); IB vs. IV (P<0.0001); II vs. III (P<0.0001); II vs. IV (P<0.0001); III vs. IV (P<0.0001). According to the 8th edition, significant pairwise P values for overall survival included: all stages vs. IV (P<0.0001); IA vs. II (P=0.046); IA vs. IIIA (P=0.022); IA vs. IIIB (P<0.0001); IB vs. II (P<0.0001); IB vs. IIIB (P<0.0001); II vs. IIIA (P<0.0001); IIIA vs. IIIB (P<0.0001). The C-index for the 6th edition was 0.539 (SE: 0.008; 95% CI: 0.524-0.555), while the C-index for the 8th edition was 0.540 (SE: 0.008; 95% CI: 0.525-0.556). Based on the above findings, a simplified staging system was proposed and overall and cancer-specific survivals were evaluated according to the simplified system. For overall and cancer-specific survival assessment, P values for all pairwise comparisons among different stages were significant (<0.01). The prognostic performance of both the 6th and 8th AJCC editions is unsatisfactory; there is a need for a more practical and prognostically relevant staging system for MPM. Copyright © 2017 Elsevier B.V. All rights reserved.
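
    The analysis above combines Kaplan-Meier curves, Cox regression, and Harrell's C-index. As a hedged sketch of that workflow, assuming the third-party lifelines package (pip install lifelines) and invented toy data rather than SEER records or real AJCC stage definitions, one might write:

        # Toy illustration of the survival workflow named above (Kaplan-Meier curves
        # and Harrell's C-index) using the `lifelines` package. The follow-up times
        # and stage codes are invented and are NOT SEER data or AJCC definitions.
        import pandas as pd
        from lifelines import KaplanMeierFitter
        from lifelines.utils import concordance_index

        df = pd.DataFrame({
            "months": [4, 6, 9, 11, 14, 5, 18, 36, 22, 30],
            "died":   [1, 1, 1, 1, 1, 1, 1, 0, 0, 0],
            "stage":  [4, 4, 3, 3, 2, 2, 2, 4, 1, 1],   # ordinal stand-in for stage group
        })

        # One Kaplan-Meier curve per stage group.
        for stage, grp in df.groupby("stage"):
            km = KaplanMeierFitter().fit(grp["months"], grp["died"], label=f"stage {stage}")
            print(f"stage {stage}: median survival =", km.median_survival_time_)

        # Harrell's C-index: how well does stage alone rank the observed survival times?
        # Higher stage should predict shorter survival, hence the minus sign.
        print("C-index:", concordance_index(df["months"], -df["stage"], df["died"]))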

  15. RED: A Java-MySQL Software for Identifying and Visualizing RNA Editing Sites Using Rule-Based and Statistical Filters.

    PubMed

    Sun, Yongmei; Li, Xing; Wu, Di; Pan, Qi; Ji, Yuefeng; Ren, Hong; Ding, Keyue

    2016-01-01

    RNA editing is one of the post- or co-transcriptional processes that can lead to amino acid substitutions in protein sequences, alternative pre-mRNA splicing, and changes in gene expression levels. Although several methods have been suggested to identify RNA editing sites, there remain challenges in distinguishing true RNA editing sites from their genomic counterparts and from technical artifacts. In addition, a software framework to identify and visualize potential RNA editing sites has been lacking. Here, we present a software tool - 'RED' (RNA Editing sites Detector) - for the identification of RNA editing sites by integrating multiple rule-based and statistical filters. The potential RNA editing sites can be visualized at the genome and the site levels by a graphical user interface (GUI). To improve performance, we used the MySQL database management system (DBMS) for high-throughput data storage and query. We demonstrated the validity and utility of RED by identifying the presence and absence of experimentally validated C→U RNA-editing sites, in comparison with REDItools, a command line tool to perform high-throughput investigation of RNA editing. In an analysis of a sample data set with 28 experimentally validated C→U RNA editing sites, RED had sensitivity and specificity of 0.64 and 0.5. In comparison, REDItools had better sensitivity (0.75) but similar specificity (0.5). RED is an easy-to-use, platform-independent Java-based software, and can be applied to RNA-seq data without or with DNA sequencing data. The package is freely available under the GPLv3 license at http://github.com/REDetector/RED or https://sourceforge.net/projects/redetector.

  16. RED: A Java-MySQL Software for Identifying and Visualizing RNA Editing Sites Using Rule-Based and Statistical Filters

    PubMed Central

    Sun, Yongmei; Li, Xing; Wu, Di; Pan, Qi; Ji, Yuefeng; Ren, Hong; Ding, Keyue

    2016-01-01

    RNA editing is one of the post- or co-transcriptional processes that can lead to amino acid substitutions in protein sequences, alternative pre-mRNA splicing, and changes in gene expression levels. Although several methods have been suggested to identify RNA editing sites, there remain challenges in distinguishing true RNA editing sites from their genomic counterparts and from technical artifacts. In addition, a software framework to identify and visualize potential RNA editing sites has been lacking. Here, we present a software tool - 'RED' (RNA Editing sites Detector) - for the identification of RNA editing sites by integrating multiple rule-based and statistical filters. The potential RNA editing sites can be visualized at the genome and the site levels by a graphical user interface (GUI). To improve performance, we used the MySQL database management system (DBMS) for high-throughput data storage and query. We demonstrated the validity and utility of RED by identifying the presence and absence of experimentally validated C→U RNA-editing sites, in comparison with REDItools, a command line tool to perform high-throughput investigation of RNA editing. In an analysis of a sample data set with 28 experimentally validated C→U RNA editing sites, RED had sensitivity and specificity of 0.64 and 0.5. In comparison, REDItools had better sensitivity (0.75) but similar specificity (0.5). RED is an easy-to-use, platform-independent Java-based software, and can be applied to RNA-seq data without or with DNA sequencing data. The package is freely available under the GPLv3 license at http://github.com/REDetector/RED or https://sourceforge.net/projects/redetector. PMID:26930599

  17. [Design and establishment of modern literature database about acupuncture Deqi].

    PubMed

    Guo, Zheng-rong; Qian, Gui-feng; Pan, Qiu-yin; Wang, Yang; Xin, Si-yuan; Li, Jing; Hao, Jie; Hu, Ni-juan; Zhu, Jiang; Ma, Liang-xiao

    2015-02-01

    A search on acupuncture Deqi was conducted in four Chinese-language biomedical databases (CNKI, Wan-Fang, VIP and CBM) and the PubMed database, using keywords such as "Deqi", "needle sensation", "needling feeling", "needle feel" and "obtaining qi". A "Modern Literature Database for Acupuncture Deqi" was then established using Microsoft SQL Server 2005 Express Edition; the contents, data types, information structure and logic constraints of the system table fields are introduced. From this database, detailed inquiries can be made about general information on clinical trials, acupuncturists' experience, ancient medical works, comprehensive literature, etc. The present databank lays a foundation for subsequent evaluation of literature quality about Deqi and for data mining of as-yet-undetected Deqi knowledge.

  18. Description of 'REQUEST-KYUSHYU' for KYUKEICHO regional data base

    NASA Astrophysics Data System (ADS)

    Takimoto, Shin'ichi

    The Kyushu Economic Research Association (an incorporated foundation) recently initiated a regional database service, 'REQUEST-Kyushu'. It is a full-scale database compiled from the information and know-how the Association has accumulated over forty years. It covers a regional information database of journal and newspaper articles and a statistical information database of economic statistics. The former is searched on a personal computer, and the search result (original text) is then delivered by facsimile; the latter is also searched on a personal computer, where the data can be processed, edited, or downloaded. This paper describes the characteristics, content, and system outline of 'REQUEST-Kyushu'.

  19. Health indicators 1991.

    PubMed

    Dawson, N

    1991-01-01

    This is the second edition of a database developed by the Canadian Centre for Health Information (CCHI). It features 49 health indicators, under one cover containing the most recent data available from a variety of national surveys. This information may be used to establish health goals for the population and to offer objective measures of their success. The database can be accessed through CANSIM, Statistics Canada's socio-economic electronic database and retrieval system, or through a personal computer package which enables the user to retrieve and analyze the 1.2 million data points in the system.

  20. JNDMS Task Authorization 2 Report

    DTIC Science & Technology

    2013-10-01

    …uses Barnyard to store alarms from all DREnet Snort sensors in a MySQL database. Barnyard is an open source tool designed to work with Snort to take… Technology; ITI (Information Technology Infrastructure); J2EE (Java 2 Enterprise Edition); JAR (Java Archive, an archive file format defined by Java… standards); JDBC (Java Database Connectivity); JDW (JNDMS Data Warehouse); JNDMS (Joint Network and Defence Management System)…

  1. 15 CFR 995.4 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE QUALITY ASSURANCE AND CERTIFICATION... of the 1974 SOLAS Convention. Electronic Navigational Chart (ENC) means a database, standardized as... National Oceanic and Atmospheric Administration. NOAA ENC files comply with the IHO S-57 standard, Edition...

  2. 15 CFR 995.4 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE QUALITY ASSURANCE AND CERTIFICATION... of the 1974 SOLAS Convention. Electronic Navigational Chart (ENC) means a database, standardized as... National Oceanic and Atmospheric Administration. NOAA ENC files comply with the IHO S-57 standard, Edition...

  3. 15 CFR 995.4 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE QUALITY ASSURANCE AND CERTIFICATION... of the 1974 SOLAS Convention. Electronic Navigational Chart (ENC) means a database, standardized as... National Oceanic and Atmospheric Administration. NOAA ENC files comply with the IHO S-57 standard, Edition...

  4. 15 CFR 995.4 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE QUALITY ASSURANCE AND CERTIFICATION... of the 1974 SOLAS Convention. Electronic Navigational Chart (ENC) means a database, standardized as... National Oceanic and Atmospheric Administration. NOAA ENC files comply with the IHO S-57 standard, Edition...

  5. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    PubMed

    Periwal, Vinita

    2017-07-01

    Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases, and clustered regularly inter-spaced short palindromic repeats (CRISPR)/CRISPR-associated nucleases) has recently been shown to have great promise in a variety of therapeutic and biotechnological applications. However, the exploitation of these nucleases in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limits the specificity of genome editing tools and could result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools could be efficiently used to guide the design of constructs for engineered nucleases and evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it also describes tools that have been developed to analyse post-genome editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. The HITRAN2016 Molecular Spectroscopic Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon, I. E.; Rothman, L. S.; Hill, C.

    This article describes the contents of the 2016 edition of the HITRAN molecular spectroscopic compilation. The new edition replaces the previous HITRAN edition of 2012 and its updates during the intervening years. The HITRAN molecular absorption compilation is composed of five major components: the traditional line-by-line spectroscopic parameters required for high-resolution radiative-transfer codes, infrared absorption cross-sections for molecules not yet amenable to representation in a line-by-line form, collision-induced absorption data, aerosol indices of refraction, and general tables such as partition sums that apply globally to the data. The new HITRAN is greatly extended in terms of accuracy, spectral coverage, additional absorption phenomena, added line-shape formalisms, and validity. Moreover, molecules, isotopologues, and perturbing gases have been added that address the issues of atmospheres beyond the Earth. Of considerable note, experimental IR cross-sections for almost 300 additional molecules important in different areas of atmospheric science have been added to the database. The compilation can be accessed through www.hitran.org. Most of the HITRAN data have now been cast into an underlying relational database structure that offers many advantages over the long-standing sequential text-based structure. The new structure empowers the user in many ways. It enables the incorporation of an extended set of fundamental parameters per transition, sophisticated line-shape formalisms, easy user-defined output formats, and very convenient searching, filtering, and plotting of data. Finally, a powerful application programming interface making use of structured query language (SQL) features for higher-level applications of HITRAN is also provided.

  7. The HITRAN2016 Molecular Spectroscopic Database

    DOE PAGES

    Gordon, I. E.; Rothman, L. S.; Hill, C.; ...

    2017-07-05

    This article describes the contents of the 2016 edition of the HITRAN molecular spectroscopic compilation. The new edition replaces the previous HITRAN edition of 2012 and its updates during the intervening years. The HITRAN molecular absorption compilation is composed of five major components: the traditional line-by-line spectroscopic parameters required for high-resolution radiative-transfer codes, infrared absorption cross-sections for molecules not yet amenable to representation in a line-by-line form, collision-induced absorption data, aerosol indices of refraction, and general tables such as partition sums that apply globally to the data. The new HITRAN is greatly extended in terms of accuracy, spectral coverage, additional absorption phenomena, added line-shape formalisms, and validity. Moreover, molecules, isotopologues, and perturbing gases have been added that address the issues of atmospheres beyond the Earth. Of considerable note, experimental IR cross-sections for almost 300 additional molecules important in different areas of atmospheric science have been added to the database. The compilation can be accessed through www.hitran.org. Most of the HITRAN data have now been cast into an underlying relational database structure that offers many advantages over the long-standing sequential text-based structure. The new structure empowers the user in many ways. It enables the incorporation of an extended set of fundamental parameters per transition, sophisticated line-shape formalisms, easy user-defined output formats, and very convenient searching, filtering, and plotting of data. Finally, a powerful application programming interface making use of structured query language (SQL) features for higher-level applications of HITRAN is also provided.

  8. Pulsed-Laser, High Speed Photography of Rocket Propellant Surface Deflagration.

    DTIC Science & Technology

    1986-05-01

    Investigator was Dr Roger J. Becker. AFRPL Project Manager was Mr Gary L. Vogt. This technical report has been reviewed and is approved for publication… 84-1236. 4. G. A. Flandro, "A Simple Conceptual Model for the Nonlinear Transient Combustion of a Solid Rocket Propellant," AIAA Paper No. 82-1222

  9. Design and Implementation of a Three-Tiered Web-Based Inventory Ordering and Tracking System Prototype Using CORBA and Java

    DTIC Science & Technology

    2000-03-01

    …languages yet still be able to access the legacy relational databases that businesses have huge investments in. JDBC is a low-level API designed for…consider the return on investment. The system requirements, discussed in Chapter II, are the main source of input to developing the relational…1996. Inprise, Gatekeeper Guide, Inprise Corporation, 1999. Kroenke, D., Database Processing: Fundamentals, Design, and Implementation, Sixth Edition

  10. [DNAStat, version 1.2 -- a software package for processing genetic profile databases and biostatistical calculations].

    PubMed

    Berent, Jarosław

    2007-01-01

    This paper presents the new DNAStat version 1.2 for processing genetic profile databases and biostatistical calculations. This new version contains, besides all the options of its predecessor 1.0, a calculation-results file export option in .xls format for Microsoft Office Excel, as well as the option of importing/exporting the population base of systems as .txt files for processing in Microsoft Notepad or EditPad

  11. Dual-Pump CARS Development and Application to Supersonic Combustion

    NASA Technical Reports Server (NTRS)

    Magnotti, Gaetano; Cutler, Andrew D.

    2012-01-01

    A dual-pump Coherent Anti-Stokes Raman Spectroscopy (CARS) instrument has been developed to obtain simultaneous measurements of temperature and absolute mole fractions of N2, O2 and H2 in supersonic combustion and generate databases for validation and development of CFD codes. Issues that compromised previous attempts, such as beam steering and high irradiance perturbation effects, have been alleviated or avoided. Improvements in instrument precision and accuracy have been achieved. An axis-symmetric supersonic combusting coaxial jet facility has been developed to provide a simple, yet suitable flow to CFD modelers. Approximately one million dual-pump CARS single shots have been collected in the supersonic jet for varying values of flight and exit Mach numbers at several locations. Data have been acquired with a H2 co-flow (combustion case) or a N2 co-flow (mixing case). Results are presented and the effects of the compressibility and of the heat release are discussed.

  12. The coupling between flame surface dynamics and species mass conservation in premixed turbulent combustion

    NASA Technical Reports Server (NTRS)

    Trouve, A.; Veynante, D.; Bray, K. N. C.; Mantel, T.

    1994-01-01

    Current flamelet models based on a description of the flame surface dynamics require the closure of two inter-related equations: a transport equation for the mean (Favre-averaged) reaction progress variable, c̃, and a transport equation for the flame surface density, Σ. The coupling between these two equations is investigated using direct numerical simulations (DNS), with emphasis on the correlation between the turbulent flux of c̃ (the mean of ρu″c″) and the surface-weighted fluctuating-velocity flux of Σ, ⟨u″⟩_s Σ. Two different DNS databases are used in the present work: a database developed at CTR by A. Trouve and a database developed by C. J. Rutland using a different code. Both databases correspond to statistically one-dimensional premixed flames in isotropic turbulent flow. The run parameters, however, are significantly different, and the two databases correspond to different combustion regimes. It is found that in all simulated flames the correlation between the two fluxes is always strong. The sign, however, of the turbulent flux of c̃ or Σ with respect to the mean gradients, ∂c̃/∂x or ∂Σ/∂x, is case-dependent. The CTR database is found to exhibit gradient turbulent transport of c̃ and Σ, whereas the Rutland DNS features counter-gradient diffusion. The two databases are analyzed and compared using various tools (a local analysis of the flow field near the flame, a classical analysis of the conservation equation for the Favre-averaged flux u″c″, and a thin flame theoretical analysis). A mechanism is then proposed to explain the discrepancies between the two databases and a preliminary simple criterion is derived to predict the occurrence of gradient/counter-gradient turbulent diffusion.
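
    For orientation, the two coupled equations referred to above take, in a standard flamelet formulation (a schematic form, not necessarily the exact one used with either DNS database), the shape

        \frac{\partial \bar{\rho}\tilde{c}}{\partial t}
          + \nabla\cdot\!\left(\bar{\rho}\,\tilde{\mathbf{u}}\,\tilde{c}\right)
          = -\,\nabla\cdot\overline{\rho\,\mathbf{u}''c''} \;+\; \rho_u\, s_L\, \Sigma ,

        \frac{\partial \Sigma}{\partial t}
          + \nabla\cdot\!\left(\tilde{\mathbf{u}}\,\Sigma\right)
          = -\,\nabla\cdot\!\left(\langle\mathbf{u}''\rangle_s\, \Sigma\right)
          \;+\; \langle a_T\rangle_s\, \Sigma \;+\; \dots ,

    where the first right-hand-side term of each equation is the turbulent flux whose gradient or counter-gradient behaviour the abstract examines, ρ_u s_L Σ is the flame-surface-density closure of the mean reaction rate, and ⟨a_T⟩_s is the surface-averaged tangential strain rate; propagation and curvature terms are abbreviated by the dots.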

  13. Transferable Calibration Standard Developed for Quantitative Raman Scattering Diagnostics in High-Pressure Flames

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet; Kojima, Jun

    2005-01-01

    Researchers from NASA Glenn Research Center's Combustion Branch and the Ohio Aerospace Institute (OAI) have developed a transferable calibration standard for an optical technique called spontaneous Raman scattering (SRS) in high-pressure flames. SRS is perhaps the only technique that provides spatially and temporally resolved, simultaneous multiscalar measurements in turbulent flames. Such measurements are critical for the validation of numerical models of combustion. This study has been a combined experimental and theoretical effort to develop a spectral calibration database for multiscalar diagnostics using SRS in high-pressure flames. However, in the past such measurements have used a one-of-a-kind experimental setup and a setup-dependent calibration procedure to empirically account for spectral interferences, or crosstalk, among the major species of interest. Such calibration procedures, being non-transferable, are prohibitively expensive to duplicate. A goal of this effort is to provide an SRS calibration database using transferable standards that can be implemented widely by other researchers for both atmospheric-pressure and high-pressure (less than 30 atm) SRS studies. A secondary goal of this effort is to provide quantitative multiscalar diagnostics in high pressure environments to validate computational combustion codes.

  14. MetPetDB: A database for metamorphic geochemistry

    NASA Astrophysics Data System (ADS)

    Spear, Frank S.; Hallett, Benjamin; Pyle, Joseph M.; Adalı, Sibel; Szymanski, Boleslaw K.; Waters, Anthony; Linder, Zak; Pearce, Shawn O.; Fyffe, Matthew; Goldfarb, Dennis; Glickenhouse, Nickolas; Buletti, Heather

    2009-12-01

    We present a data model for the initial implementation of MetPetDB, a geochemical database specific to metamorphic rock samples. The database is designed around the concept of preservation of spatial relationships, at all scales, of chemical analyses and their textural setting. Objects in the database (samples) represent physical rock samples; each sample may contain one or more subsamples with associated geochemical and image data. Samples, subsamples, geochemical data, and images are described with attributes (some required, some optional); these attributes also serve as search delimiters. All data in the database are classified as published (i.e., archived or published data), public or private. Public and published data may be freely searched and downloaded. All private data is owned; permission to view, edit, download and otherwise manipulate private data may be granted only by the data owner; all such editing operations are recorded by the database to create a data version log. The sharing of data permissions among a group of collaborators researching a common sample is done by the sample owner through the project manager. User interaction with MetPetDB is hosted by a web-based platform based upon the Java servlet application programming interface, with the PostgreSQL relational database. The database web portal includes modules that allow the user to interact with the database: registered users may save and download public and published data, upload private data, create projects, and assign permission levels to project collaborators. An Image Viewer module provides for spatial integration of image and geochemical data. A toolkit consisting of plotting and geochemical calculation software for data analysis and a mobile application for viewing the public and published data is being developed. Future issues to address include population of the database, integration with other geochemical databases, development of the analysis toolkit, creation of data models for derivative data, and building a community-wide user base. It is believed that this and other geochemical databases will enable more productive collaborations, generate more efficient research efforts, and foster new developments in basic research in the field of solid earth geochemistry.

  15. 5S ribosomal RNA database Y2K

    PubMed Central

    Szymanski, Maciej; Barciszewska, Miroslawa Z.; Barciszewski, Jan; Erdmann, Volker A.

    2000-01-01

    This paper presents the updated version (Y2K) of the database of ribosomal 5S ribonucleic acids (5S rRNA) and their genes (5S rDNA), http://rose.man.poznan.pl/5SData/index.html. This edition of the database contains 1985 primary structures of 5S rRNA and 5S rDNA. They include 60 archaebacterial, 470 eubacterial, 63 plastid, nine mitochondrial and 1383 eukaryotic sequences. The nucleotide sequences of the 5S rRNAs or 5S rDNAs are divided according to the taxonomic position of the source organisms. PMID:10592212

  16. 5S ribosomal RNA database Y2K.

    PubMed

    Szymanski, M; Barciszewska, M Z; Barciszewski, J; Erdmann, V A

    2000-01-01

    This paper presents the updated version (Y2K) of the database of ribosomal 5S ribonucleic acids (5S rRNA) and their genes (5S rDNA), http://rose.man.poznan.pl/5SData/index.html. This edition of the database contains 1985 primary structures of 5S rRNA and 5S rDNA. They include 60 archaebacterial, 470 eubacterial, 63 plastid, nine mitochondrial and 1383 eukaryotic sequences. The nucleotide sequences of the 5S rRNAs or 5S rDNAs are divided according to the taxonomic position of the source organisms.

  17. HITRAN2016 Database Part II: Overview of the Spectroscopic Parameters of the Trace Gases

    NASA Astrophysics Data System (ADS)

    Tan, Yan; Gordon, Iouli E.; Rothman, Laurence S.; Kochanov, Roman V.; Hill, Christian

    2017-06-01

    The 2016 edition of the HITRAN database is now available. This new edition takes advantage of the new database structure and can be accessed through HITRANonline (www.hitran.org). The line-by-line lists for almost all of the trace atmospheric species were updated with respect to the previous edition, HITRAN2012. This extensive update covers not only revisions of selected transitions for certain molecules, but also complete replacements of entire line lists, as well as the introduction of new spectroscopic parameters for non-Voigt line shapes. The new line lists for NH_3, HNO_3, OCS, HCN, CH_3Cl, C_2H_2, C_2H_6, PH_3, C_2H_4, CH_3CN, CF_4, C_4H_2, and SO_3 feature substantial expansion of the spectral and dynamic ranges in addition to improved accuracy of the parameters for already existing lines. A semi-empirical procedure was developed to update the air-broadening and self-broadening coefficients of N_2O, SO_2, NH_3, CH_3Cl, H_2S, and HO_2. We draw particular attention to flaws in the commonly used expression n_{air}=0.79n_{N_2}+0.21n_{O_2} for determining the air-broadening temperature-dependence exponent in the power law from those for nitrogen and oxygen broadening; a more meaningful approach will be presented. Semi-empirical line widths, pressure shifts and temperature-dependence exponents of CO, NH_3, HF, HCl, OCS, C_2H_2 and SO_2 perturbed by H_2, He, and CO_2 have been added to the database based on the algorithm described in Wilzewski et al. New spectroscopic parameters for the Hartmann-Tran (HT) line-shape profile were implemented in the database for the hydrogen molecule. The HITRAN database is supported by the NASA AURA program grant NNX14AI55G and NASA PDART grant NNX16AG51G. I. E. Gordon, L. S. Rothman, et al., J Quant Spectrosc Radiat Transf 2017; submitted. Hill C, et al., J Quant Spectrosc Radiat Transf 2013;130:51-61. Wilzewski JS, et al., J Quant Spectrosc Radiat Transf 2016;168:193-206. Wcislo P, et al., J Quant Spectrosc Radiat Transf 2016;177:75-91.
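
    For context, the power law referred to above is the standard HITRAN temperature scaling of the pressure-broadened half-width (a widely used convention, restated here rather than taken from this abstract):

        \gamma(p, T) = \gamma(p_{ref}, T_{ref})\,\frac{p}{p_{ref}}\left(\frac{T_{ref}}{T}\right)^{n_{air}}

    The shortcut being criticized estimates the exponent as n_{air} \approx 0.79\,n_{N_2} + 0.21\,n_{O_2}, i.e. a mole-fraction-weighted average of the nitrogen- and oxygen-broadening exponents, which in general does not reproduce the temperature dependence of the air-broadened width.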

  18. Nonlinear dimensionality reduction methods for synthetic biology biobricks' visualization.

    PubMed

    Yang, Jiaoyun; Wang, Haipeng; Ding, Huitong; An, Ning; Alterovitz, Gil

    2017-01-19

    Visualizing data by dimensionality reduction is an important strategy in bioinformatics, which can help to discover hidden data properties and detect data quality issues, e.g. data noise, inappropriately labeled data, etc. As crowdsourcing-based synthetic biology databases face similar data quality issues, we propose to visualize biobricks to tackle them. However, existing dimensionality reduction methods cannot be applied directly to biobrick datasets. We therefore use normalized edit distance to enhance dimensionality reduction methods, including Isomap and Laplacian Eigenmaps. By extracting biobricks from the synthetic biology database Registry of Standard Biological Parts, six combinations of various types of biobricks are tested. The visualization graphs illustrate discriminated biobricks and inappropriately labeled biobricks. The clustering algorithm K-means is adopted to quantify the reduction results. The average clustering accuracies for Isomap and Laplacian Eigenmaps are 0.857 and 0.844, respectively. In addition, Laplacian Eigenmaps is five times faster than Isomap, and its visualization graph is more concentrated, making it easier to discriminate biobricks. By combining normalized edit distance with Isomap and Laplacian Eigenmaps, synthetic biology biobricks are successfully visualized in two-dimensional space. Various types of biobricks can be discriminated and inappropriately labeled biobricks can be identified, which helps to assess the quality of crowdsourcing-based synthetic biology databases and to guide biobrick selection.
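
    A minimal sketch of the pipeline described above (normalized edit distance feeding Isomap and Laplacian Eigenmaps, quantified with K-means), using scikit-learn; the example sequences, neighbor counts and distance-to-affinity conversion are illustrative assumptions, not the authors' settings:

```python
# Hedged sketch: embed sequences via normalized edit (Levenshtein) distance,
# then apply Isomap / Laplacian Eigenmaps and K-means, as outlined above.
import numpy as np
from sklearn.manifold import Isomap, SpectralEmbedding
from sklearn.cluster import KMeans

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def normalized_edit_distance(a: str, b: str) -> float:
    return levenshtein(a, b) / max(len(a), len(b), 1)

# Hypothetical biobrick sequences (placeholders, not from the Registry).
seqs = ["ATGGCTAGC", "ATGGCTAGT", "TTACGGCAT", "TTACGGCAA"]
D = np.array([[normalized_edit_distance(s, t) for t in seqs] for s in seqs])

# Isomap accepts a precomputed distance matrix directly.
iso_coords = Isomap(n_components=2, n_neighbors=2,
                    metric="precomputed").fit_transform(D)

# Laplacian Eigenmaps (SpectralEmbedding) expects an affinity matrix,
# so convert distances with a Gaussian kernel.
affinity = np.exp(-D**2 / (2 * D.std()**2 + 1e-12))
lap_coords = SpectralEmbedding(n_components=2,
                               affinity="precomputed").fit_transform(affinity)

# Quantify the embedding with K-means, as in the abstract.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(iso_coords)
print(labels)
```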

  19. VisANT 3.0: new modules for pathway visualization, editing, prediction and construction.

    PubMed

    Hu, Zhenjun; Ng, David M; Yamada, Takuji; Chen, Chunnuan; Kawashima, Shuichi; Mellor, Joe; Linghu, Bolan; Kanehisa, Minoru; Stuart, Joshua M; DeLisi, Charles

    2007-07-01

    With the integration of the KEGG and Predictome databases as well as two search engines for coexpressed genes/proteins using data sets obtained from the Stanford Microarray Database (SMD) and Gene Expression Omnibus (GEO) database, VisANT 3.0 supports exploratory pathway analysis, which includes multi-scale visualization of multiple pathways, editing and annotating pathways using a KEGG compatible visual notation and visualization of expression data in the context of pathways. Expression levels are represented either by color intensity or by nodes with an embedded expression profile. Multiple experiments can be navigated or animated. Known KEGG pathways can be enriched by querying either coexpressed components of known pathway members or proteins with known physical interactions. Predicted pathways for genes/proteins with unknown functions can be inferred from coexpression or physical interaction data. Pathways produced in VisANT can be saved as computer-readable XML format (VisML), graphic images or high-resolution Scalable Vector Graphics (SVG). Pathways in the format of VisML can be securely shared within an interested group or published online using a simple Web link. VisANT is freely available at http://visant.bu.edu.

  20. The research and implementation of coalfield spontaneous combustion of carbon emission WebGIS based on Silverlight and ArcGIS server

    NASA Astrophysics Data System (ADS)

    Zhu, Z.; Bi, J.; Wang, X.; Zhu, W.

    2014-02-01

    As an important sub-topic of the construction of a public information platform for carbon emission data from natural processes, a WebGIS system for carbon emissions from coalfield spontaneous combustion has become an important object of study. In view of the characteristics of coalfield spontaneous combustion carbon emission data (a wide range of rich and complex data) and their geospatial nature, the data are divided into attribute data and spatial data. Based on a full analysis of the data, a detailed design of the Oracle database was completed and the data were stored in Oracle. Through Silverlight rich-client technology and extended WCF services, dynamic web query, retrieval, statistics, analysis and other functions were implemented for the attribute data. For spatial data, ArcGIS Server and the Silverlight-based ArcGIS API were used to invoke map services, geoprocessing (GP) services, image services and other services published by the GIS server, implementing display, analysis and thematic map production for remote sensing imagery and web map data related to coalfield spontaneous combustion. The study found that Silverlight rich-client technology, together with an object-oriented WCF service framework, can be used to construct a WebGIS system efficiently. Combining this with the ArcGIS Silverlight API to achieve interactive query of attribute and spatial data on coalfield spontaneous combustion emissions can greatly improve the performance of the WebGIS system. At the same time, it provides a strong guarantee for the construction of a public information platform for China's carbon emission data.

  1. Process evaluation distributed system

    NASA Technical Reports Server (NTRS)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module in communication with the database server, including a website for viewing collected process data in a desired metrics form, the data display module also for providing desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module, minimizes the requirement for manual input of the collected process data.

  2. HOWDY: an integrated database system for human genome research

    PubMed Central

    Hirakawa, Mika

    2002-01-01

    HOWDY is an integrated database system for accessing and analyzing human genomic information (http://www-alis.tokyo.jst.go.jp/HOWDY/). HOWDY stores information about relationships between genetic objects and the data extracted from a number of databases. HOWDY consists of an Internet accessible user interface that allows thorough searching of the human genomic databases using the gene symbols and their aliases. It also permits flexible editing of the sequence data. The database can be searched using simple words and the search can be restricted to a specific cytogenetic location. Linear maps displaying markers and genes on contig sequences are available, from which an object can be chosen. Any search starting point identifies all the information matching the query. HOWDY provides a convenient search environment of human genomic data for scientists unsure which database is most appropriate for their search. PMID:11752279

  3. Hepatic Transcriptome Responses in Mice (Mus musculus) Exposed to the Nafion Membrane and Its Combustion Products

    PubMed Central

    Feng, Mingbao; Qu, Ruijuan; Habteselassie, Mussie; Wu, Jun; Yang, Shaogui; Sun, Ping; Huang, Qingguo; Wang, Zunyao

    2015-01-01

    Nafion 117 membrane (N117), an important polymer electrolyte membrane (PEM), has been widely used in numerous chemical technologies. Despite its increasing production and use, toxicity data for N117 and its combustion products remain lacking. Toxicity studies are necessary to avoid problems related to waste disposal in landfills and incineration that may arise. In this study, we investigated the histopathological alterations, oxidative stress biomarker responses, and transcriptome profiles in the liver of male mice exposed to N117 and its combustion products for 24 days. An ion-chromatography system and a liquid chromatography system coupled to a hybrid quadrupole time-of-flight mass spectrometer were used to analyze the chemical compositions of these combustion products. The transcriptomics analysis identified several significantly altered molecular pathways, including the metabolism of xenobiotics, carbohydrates and lipids; signal transduction; cellular processes; the immune system; and signaling molecules and interaction. These studies provide preliminary data on the potential toxicity of N117 and its combustion products to living organisms and may fill information gaps in the toxicity databases for currently used PEMs. PMID:26057616

  4. KEGGParser: parsing and editing KEGG pathway maps in Matlab.

    PubMed

    Arakelyan, Arsen; Nersisyan, Lilit

    2013-02-15

    The KEGG pathway database is a collection of manually drawn pathway maps accompanied by KGML-format files intended for use in automatic analysis. KGML files, however, do not contain the information required for complete reproduction of all the events indicated in the static image of a pathway map. Several parsers and editors of KEGG pathways exist for processing KGML files. We introduce KEGGParser, a MATLAB-based tool for KEGG pathway parsing, semi-automatic fixing, editing, visualization and analysis in the MATLAB environment. It also works with Scilab. The source code is available at http://www.mathworks.com/matlabcentral/fileexchange/37561.

  5. Whither the White Knight: CDROM in Technical Services.

    ERIC Educational Resources Information Center

    Campbell, Brian

    1987-01-01

    Outlines evaluative criteria and compares optical data disk products used in library technical processes, including bibliographic records for cataloging, acquisition databases, and local public access catalogs. An extensive table provides information on specific products, including updates, interfaces, edit screens, installation help, manuals,…

  6. Selected Reference Books of 1993-1994.

    ERIC Educational Resources Information Center

    McIlvaine, Eileen

    1994-01-01

    Offers brief, critical reviews of recent scholarly and general works of interest to reference workers in university libraries. Titles covered include dictionaries, databases, religion, literature, music, dance, art and architecture, business, political science, social issues, and history. Brief descriptions of new editions and supplements for…

  7. The GEISA Spectroscopic Database System in its latest Edition

    NASA Astrophysics Data System (ADS)

    Jacquinet-Husson, N.; Crépeau, L.; Capelle, V.; Scott, N. A.; Armante, R.; Chédin, A.

    2009-04-01

    GEISA (Gestion et Etude des Informations Spectroscopiques Atmosphériques: Management and Study of Spectroscopic Information)[1] is a computer-accessible spectroscopic database system designed to facilitate accurate forward planetary radiative transfer calculations using a line-by-line and layer-by-layer approach. It was initiated in 1976. Currently, GEISA is involved in activities related to the assessment of the capabilities of IASI (Infrared Atmospheric Sounding Interferometer, on board the METOP European satellite; http://earth-sciences.cnes.fr/IASI/) through the GEISA/IASI database[2] derived from GEISA. Since the Metop (http://www.eumetsat.int) launch (October 19th, 2006), GEISA/IASI has been the reference spectroscopic database for the validation of level-1 IASI data, using the 4A radiative transfer model[3] (4A/LMD, http://ara.lmd.polytechnique.fr; 4A/OP co-developed by LMD and Noveltis with the support of CNES). GEISA is also involved in planetary research, e.g. modelling of Titan's atmosphere, in comparison with observations performed by Voyager (http://voyager.jpl.nasa.gov/), by ground-based telescopes, and by the instruments on board the Cassini-Huygens mission (http://www.esa.int/SPECIALS/Cassini-Huygens/index.html). The updated 2008 edition of GEISA (GEISA-08), a system comprising three independent sub-databases devoted, respectively, to line transition parameters, infrared and ultraviolet/visible absorption cross-sections, and microphysical and optical properties of atmospheric aerosols, will be described. Spectroscopic parameter quality requirements will be discussed in the context of comparisons between observed or simulated spectra of the Earth's and other planetary atmospheres. GEISA is implemented on the CNES/CNRS Ether Products and Services Centre web site (http://ether.ipsl.jussieu.fr), where all archived spectroscopic data can be handled through general and user-friendly associated management software facilities. More than 350 researchers are registered for online use of GEISA. Refs: 1. Jacquinet-Husson N., N.A. Scott, A. Chédin, L. Crépeau, R. Armante, V. Capelle, J. Orphal, A. Coustenis, C. Boonne, N. Poulet-Crovisier, et al. The GEISA spectroscopic database: current and future archive for Earth and planetary atmosphere studies. JQSRT, 109, 1043-1059, 2008. 2. Jacquinet-Husson N., N.A. Scott, A. Chédin, K. Garceran, R. Armante, et al. The 2003 edition of the GEISA/IASI spectroscopic database. JQSRT, 95, 429-467, 2005. 3. Scott, N.A. and A. Chédin, 1981: A fast line-by-line method for atmospheric absorption computations: The Automatized Atmospheric Absorption Atlas. J. Appl. Meteor., 20, 556-564.

  8. California Fault Parameters for the National Seismic Hazard Maps and Working Group on California Earthquake Probabilities 2007

    USGS Publications Warehouse

    Wills, Chris J.; Weldon, Ray J.; Bryant, W.A.

    2008-01-01

    This report describes development of fault parameters for the 2007 update of the National Seismic Hazard Maps and the Working Group on California Earthquake Probabilities (WGCEP, 2007). These reference parameters are contained within a database intended to be a source of values for use by scientists interested in producing either seismic hazard or deformation models to better understand the current seismic hazards in California. These parameters include descriptions of the geometry and rates of movements of faults throughout the state. These values are intended to provide a starting point for development of more sophisticated deformation models which include known rates of movement on faults as well as geodetic measurements of crustal movement and the rates of movements of the tectonic plates. The values will be used in developing the next generation of the time-independent National Seismic Hazard Maps, and the time-dependent seismic hazard calculations being developed for the WGCEP. Due to the multiple uses of this information, development of these parameters has been coordinated between USGS, CGS and SCEC. SCEC provided the database development and editing tools, in consultation with USGS, Golden. This database has been implemented in Oracle and supports electronic access (e.g., for on-the-fly access). A GUI-based application has also been developed to aid in populating the database. Both the continually updated 'living' version of this database, as well as any locked-down official releases (e.g., used in a published model for calculating earthquake probabilities or seismic shaking hazards), are part of the USGS Quaternary Fault and Fold Database (http://earthquake.usgs.gov/regional/qfaults/). CGS has been primarily responsible for updating and editing of the fault parameters, with extensive input from USGS and SCEC scientists.

  9. Sub-Saharan Africa Report

    DTIC Science & Technology

    1985-11-14

    official foreign reserves, and the general recognition in the market that there has been a continuous shortage of dollars. Whatever entered the forex ...employer by misusing his privileged access to the payroll system and editing the personnel database on payday to increase his monthly salary by

  10. 78 FR 1562 - Improving Government Regulations; Unified Agenda of Federal Regulatory and Deregulatory Actions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-08

    ... statutory administration requirements as required. Starting with the fall 2007 edition, the Internet became... Agenda database. Because publication in the Federal Register is mandated for the regulatory flexibility.... Michael L. Rhodes, Director, Administration and Management. Defense Acquisition Regulations Council...

  11. Efficiently Distributing Component-based Applications Across Wide-Area Environments

    DTIC Science & Technology

    2002-01-01

    a variety of sophisticated network-accessible services such as e-mail, banking, on-line shopping, entertainment, and serving as a data exchange...product database Customer Serves as a façade to Order and Account Stateful Session Beans ShoppingCart Maintains list of items to be bought by customer...Pet Store tests; and JBoss 3.0.3 with Jetty 4.1.0, for the RUBiS tests) and a single database server (Oracle 8.1.7 Enterprise Edition), each running

  12. [Revision of the TNM Stage Grouping in the Forthcoming Eighth Edition of the TNM Classification for Lung Cancer].

    PubMed

    Ye, Bo; Zhao, Heng

    2016-06-20

    The currently adopted staging system for lung cancer is the seventh edition of the TNM staging edited by the Union for International Cancer Control (UICC) in January 2009. In recent years, with advances in lung cancer diagnostic techniques and the trend towards precision treatment modalities such as individualized therapy and molecular targeted therapy, the survival and prognosis of lung cancer have significantly improved. The old staging standard can hardly satisfy the current, rapidly developing clinical needs. Therefore, the International Association for the Study of Lung Cancer (IASLC) updated the staging of lung cancer in 2015, and the forthcoming eighth edition of the TNM Classification for Lung Cancer, which will be formally adopted in January 2017, has been published in the Journal of Thoracic Oncology. The new staging system draws on 35 databases from 16 countries, comprising 94,708 cases treated between 1999 and 2010. The advantage of the new staging system lies in its greater value for prognosis prediction and clinical guidance.

  13. Forest Fire Smoldering Emissions from Ponderosa Pine Duff in Central Washington

    NASA Astrophysics Data System (ADS)

    Baker, S. P.; Lincoln, E.; Page, W.; Richardson, M.

    2017-12-01

    Smoldering combustion in forest fires is a significant contributor to pollution and carbon emissions. When it occurs, smoldering combustion produces the majority of the carbon monoxide (CO), methane (CH4), volatile organic compounds (VOC), and fine particulate matter (PM2.5) emitted by forest fires. The emission factors for PM2.5 and many VOCs are correlated with the modified combustion efficiency (MCE), which is the ratio of emitted CO2 to the sum of emitted CO2 and CO. MCE is a measure of the relative ratio of flaming and smoldering combustion, but its relationship to the physical fire process is poorly studied. We measured carbon emission rates and individual emission factors for CO, CO2, CH4, and VOCs from smoldering combustion on ponderosa pine/Douglas-fir forest sites in central Washington. The emission factor results are linked with concurrent thermal measurements made at various depths in the duff and with surface IR camera imagery. MCE values ranged from 0.80 to 0.91 and are correlated with emission factors for 24 carbon compounds. Other data collected were fuel moistures and duff temperatures at depth increments. The goal of this research is the creation of a database to better predict the impacts of air pollution resulting from burns that lead to smoldering combustion.
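
    Stated as an equation, the definition used above is the standard one; the excess-mixing-ratio form given here is the common field convention and is background rather than a detail of this study:

        MCE = \frac{\Delta CO_2}{\Delta CO_2 + \Delta CO}

    where \Delta denotes the excess mixing ratio above background. Values near 0.99 are typically associated with nearly pure flaming combustion and values around 0.8 with smoldering, which places the reported 0.80-0.91 range at the smoldering-to-mixed end of the scale.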

  14. Advancing predictive models for particulate formation in turbulent flames via massively parallel direct numerical simulations

    PubMed Central

    Bisetti, Fabrizio; Attili, Antonio; Pitsch, Heinz

    2014-01-01

    Combustion of fossil fuels is likely to continue for the near future due to the growing trends in energy consumption worldwide. The increase in efficiency and the reduction of pollutant emissions from combustion devices are pivotal to achieving meaningful levels of carbon abatement as part of the ongoing climate change efforts. Computational fluid dynamics featuring adequate combustion models will play an increasingly important role in the design of more efficient and cleaner industrial burners, internal combustion engines, and combustors for stationary power generation and aircraft propulsion. Today, turbulent combustion modelling is hindered severely by the lack of data that are accurate and sufficiently complete to assess and remedy model deficiencies effectively. In particular, the formation of pollutants is a complex, nonlinear and multi-scale process characterized by the interaction of molecular and turbulent mixing with a multitude of chemical reactions with disparate time scales. The use of direct numerical simulation (DNS) featuring a state of the art description of the underlying chemistry and physical processes has contributed greatly to combustion model development in recent years. In this paper, the analysis of the intricate evolution of soot formation in turbulent flames demonstrates how DNS databases are used to illuminate relevant physico-chemical mechanisms and to identify modelling needs. PMID:25024412

  15. Data Input for Libraries: State-of-the-Art Report.

    ERIC Educational Resources Information Center

    Buckland, Lawrence F.

    This brief overview of new manuscript preparation methods which allow authors and editors to set their own type discusses the advantages and disadvantages of optical character recognition (OCR), microcomputers and personal computers, minicomputers, and word processors for editing and database entry. Potential library applications are also…

  16. University of Iowa at TREC 2008 Legal and Relevance Feedback Tracks

    DTIC Science & Technology

    2008-11-01

    Fellbaum, C., [ed.]. WordNet: An Electronic Lexical Database. Cambridge: MIT Press, 1998. [3] Salton, G. (ed) (1971), The SMART Retrieval System...learning tools and techniques. 2nd Edition. San Francisco: Morgan Kaufmann, 2005. [5] Platt, J. Machines using Sequential Minimal Optimization. [ed.] B

  17. Reference Manual for Machine-Readable Bibliographic Descriptions. Second Revised Edition.

    ERIC Educational Resources Information Center

    Dierickx, H., Ed.; Hopkinson, A., Ed.

    A product of the UNISIST International Centre for Bibliographic Descriptions (UNIBIB), this reference manual presents a standardized communication format for the exchange of machine-readable bibliographic information between bibliographic databases or other types of bibliographic information services, including libraries. The manual is produced in…

  18. Bending the rules: when deaf writers leave college.

    PubMed

    Biser, Eileen; Rubel, Linda; Toscano, Rose Marie

    2007-01-01

    On-the-job writing of deaf college graduates at all degree levels was investigated. Institutional databases and questionnaires to alumni and employers were the sources for information. Respondents were asked about editing assistance, sources and types of assistance, and perceptions of such assistance by employers and employees. Results of the study confirmed that deaf employees did considerable writing regardless of degree or type of job. Their self-reports indicated grammar as the major weakness. Additionally, employers stated that clarity, organization, and spelling were serious writing problems. The study also showed that deaf employees asked for and received editing assistance and that employers were willing to support the improvement of writing skills. Because error-free texts are expected in the workplace and editing assistance is sought and received, postsecondary institutions should mimic these practices by providing copyediting services and instruction in the ethics and practices of working with editors.

  19. Far infrared supplement. Third edition: Catalog of infrared observations (lambda greater than or equal to 4.6 micrometers)

    NASA Technical Reports Server (NTRS)

    Gezari, Daniel Y.; Schmitz, Marion; Pitts, Patricia S.; Mead, Jaylee M.

    1993-01-01

    The Far Infrared Supplement contains a subset of the data in the full Catalog of Infrared Observations (all observations at wavelengths greater than 4.6 microns). The Catalog of Infrared Observations (CIO), NASA RP-1294, is a compilation of infrared astronomical observational data obtained from an extensive literature search of scientific journals and major astronomical catalogs and surveys. The literature search is complete for years 1965 through 1990 in this third edition. The catalog contains about 210,000 observations of roughly 20,000 individual sources, and supporting appendices. The expanded third edition contains coded IRAS 4-band data for all CIO sources detected by IRAS. The appendices include an atlas of infrared source positions (also included in this volume), two bibliographies of catalog listings, and an atlas of infrared spectral ranges. The complete CIO database is available to qualified users in printed, microfiche, and magnetic tape formats.

  20. Catalog of Infrared Observations, Third Edition

    NASA Technical Reports Server (NTRS)

    Gezari, Daniel Y.; Schmitz, Marion; Pitts, Patricia S.; Mead, Jaylee M.

    1993-01-01

    The Far Infrared Supplement contains a subset of the data in the full Catalog of Infrared Observations (all observations at wavelengths greater than 4.6 microns). The Catalog of Infrared Observations (CIO), NASA RP-1294, is a compilation of infrared astronomical observational data obtained from an extensive literature search of scientific journals and major astronomical catalogs and surveys. The literature search is complete for years 1965 through 1990 in this Third Edition. The Catalog contains about 210,000 observations of roughly 20,000 individual sources and supporting appendices. The expanded Third Edition contains coded IRAS 4-band data for all CIO sources detected by IRAS. The appendices include an atlas of infrared source positions (also included in this volume), two bibliographies of Catalog listings, and an atlas of infrared spectral ranges. The complete CIO database is available to qualified users in printed, microfiche, and magnetic-tape formats.

  1. Development of a forestry government agency enterprise GIS system: a disconnected editing approach

    NASA Astrophysics Data System (ADS)

    Zhu, Jin; Barber, Brad L.

    2008-10-01

    The Texas Forest Service (TFS) has developed a geographic information system (GIS) for use by agency personnel in central Texas for managing oak wilt suppression and other landowner assistance programs. This Enterprise GIS system was designed to support multiple concurrent users accessing shared information resources. The disconnected editing approach was adopted in this system to avoid the overhead of maintaining an active connection between TFS central Texas field offices and headquarters since most field offices are operating with commercially provided Internet service. The GIS system entails maintaining a personal geodatabase on each local field office computer. Spatial data from the field is periodically up-loaded into a central master geodatabase stored in a Microsoft SQL Server at the TFS headquarters in College Station through the ESRI Spatial Database Engine (SDE). This GIS allows users to work off-line when editing data and requires connecting to the central geodatabase only when needed.

  2. Apollo: a community resource for genome annotation editing

    PubMed Central

    Lee, Ed; Harris, Nomi; Gibson, Mark; Chetty, Raymond; Lewis, Suzanna

    2009-01-01

    Summary: Apollo is a genome annotation-editing tool with an easy to use graphical interface. It is a component of the GMOD project, with ongoing development driven by the community. Recent additions to the software include support for the generic feature format version 3 (GFF3), continuous transcriptome data, a full Chado database interface, integration with remote services for on-the-fly BLAST and Primer BLAST analyses, graphical interfaces for configuring user preferences and full undo of all edit operations. Apollo's user community continues to grow, including its use as an educational tool for college and high-school students. Availability: Apollo is a Java application distributed under a free and open source license. Installers for Windows, Linux, Unix, Solaris and Mac OS X are available at http://apollo.berkeleybop.org, and the source code is available from the SourceForge CVS repository at http://gmod.cvs.sourceforge.net/gmod/apollo. Contact: elee@berkeleybop.org PMID:19439563

  3. Apollo: a community resource for genome annotation editing.

    PubMed

    Lee, Ed; Harris, Nomi; Gibson, Mark; Chetty, Raymond; Lewis, Suzanna

    2009-07-15

    Apollo is a genome annotation-editing tool with an easy to use graphical interface. It is a component of the GMOD project, with ongoing development driven by the community. Recent additions to the software include support for the generic feature format version 3 (GFF3), continuous transcriptome data, a full Chado database interface, integration with remote services for on-the-fly BLAST and Primer BLAST analyses, graphical interfaces for configuring user preferences and full undo of all edit operations. Apollo's user community continues to grow, including its use as an educational tool for college and high-school students. Apollo is a Java application distributed under a free and open source license. Installers for Windows, Linux, Unix, Solaris and Mac OS X are available at http://apollo.berkeleybop.org, and the source code is available from the SourceForge CVS repository at http://gmod.cvs.sourceforge.net/gmod/apollo.

  4. Chemical analyses of coal, coal-associated rocks and coal combustion products collected for the National Coal Quality Inventory

    USGS Publications Warehouse

    Hatch, Joseph R.; Bullock, John H.; Finkelman, Robert B.

    2006-01-01

    In 1999, the USGS initiated the National Coal Quality Inventory (NaCQI) project to address a need for quality information on coals that will be mined during the next 20-30 years. At the time this project was initiated, the publicly available USGS coal quality data was based on samples primarily collected and analyzed between 1973 and 1985. The primary objective of NaCQI was to create a database containing comprehensive, accurate and accessible chemical information on the quality of mined and prepared United States coals and their combustion byproducts. This objective was to be accomplished through maintaining the existing publicly available coal quality database, expanding the database through the acquisition of new samples from priority areas, and analysis of the samples using updated coal analytical chemistry procedures. Priorities for sampling include those areas where future sources of compliance coal are federally owned. This project was a cooperative effort between the U.S. Geological Survey (USGS), State geological surveys, universities, coal burning utilities, and the coal mining industry. Funding support came from the Electric Power Research Institute (EPRI) and the U.S. Department of Energy (DOE).

  5. Advancements in web-database applications for rabies surveillance.

    PubMed

    Rees, Erin E; Gendron, Bruno; Lelièvre, Frédérick; Coté, Nathalie; Bélanger, Denise

    2011-08-02

    Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among participating agencies. In Québec, RageDB was developed to house surveillance data for the raccoon rabies variant, representing the next generation in web-based database applications that provide a key resource for the protection of public health. RageDB incorporates data from, and grants access to, all agencies responsible for the surveillance of raccoon rabies in Québec. Technological advancements that RageDB brings to rabies surveillance databases include (1) automatic integration of multi-agency data and diagnostic results on a daily basis; (2) a web-based data editing interface that enables authorized users to add, edit and extract data; and (3) an interactive dashboard to help visualize data simply and efficiently, in table, chart, and cartographic formats. Furthermore, RageDB stores data from citizens who voluntarily report sightings of rabies suspect animals. We also discuss how sightings data can indicate public perception of the risk of raccoon rabies and thus aid in directing the allocation of disease control resources for protecting public health. RageDB provides an example in the evolution of spatio-temporal database applications for the storage, analysis and communication of disease surveillance data. The database was fast and inexpensive to develop by using open-source technologies, simple and efficient design strategies, and shared web hosting. The database increases communication among agencies collaborating to protect human health from raccoon rabies. Furthermore, health agencies have real-time access to a wide assortment of data documenting new developments in the raccoon rabies epidemic, and this enables a more timely and appropriate response.

  6. Advancements in web-database applications for rabies surveillance

    PubMed Central

    2011-01-01

    Background Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among participating agencies. In Québec, RageDB was developed to house surveillance data for the raccoon rabies variant, representing the next generation in web-based database applications that provide a key resource for the protection of public health. Results RageDB incorporates data from, and grants access to, all agencies responsible for the surveillance of raccoon rabies in Québec. Technological advancements that RageDB brings to rabies surveillance databases include 1) automatic integration of multi-agency data and diagnostic results on a daily basis; 2) a web-based data editing interface that enables authorized users to add, edit and extract data; and 3) an interactive dashboard to help visualize data simply and efficiently, in table, chart, and cartographic formats. Furthermore, RageDB stores data from citizens who voluntarily report sightings of rabies suspect animals. We also discuss how sightings data can indicate public perception of the risk of raccoon rabies and thus aid in directing the allocation of disease control resources for protecting public health. Conclusions RageDB provides an example in the evolution of spatio-temporal database applications for the storage, analysis and communication of disease surveillance data. The database was fast and inexpensive to develop by using open-source technologies, simple and efficient design strategies, and shared web hosting. The database increases communication among agencies collaborating to protect human health from raccoon rabies. Furthermore, health agencies have real-time access to a wide assortment of data documenting new developments in the raccoon rabies epidemic, and this enables a more timely and appropriate response. PMID:21810215

  7. Motion Pattern Encapsulation for Data-Driven Constraint-Based Motion Editing

    NASA Astrophysics Data System (ADS)

    Carvalho, Schubert R.; Boulic, Ronan; Thalmann, Daniel

    The growth of motion capture systems has contributed to the proliferation of human motion databases, mainly because human motion is important in many applications, ranging from games, entertainment and films to sports and medicine. However, captured motions normally address specific needs. In an effort to adapt and reuse captured human motions in new tasks and environments and to improve the animator's work, we present and discuss a new data-driven constraint-based animation system for interactive human motion editing. This method offers the compelling advantage of providing faster deformations and more natural-looking motion results compared to goal-directed constraint-based methods found in the literature.

  8. Computer Courseware Evaluations. January 1988 to December 1988. Volume VIII.

    ERIC Educational Resources Information Center

    Riome, Carol-Anne, Comp.

    The eighth in a series, this report reviews microcomputer software authorized by the Alberta (Canada) Department of Education from January 1988 through December 1988. This edition provides detailed evaluations of 40 authorized programs for teaching business education, computer literacy, databases, file management, French, information retrieval,…

  9. NICEM Thesaurus. First Edition.

    ERIC Educational Resources Information Center

    National Information Center for Educational Media, Albuquerque, NM.

    This thesaurus, developed by the National Information Center for Educational Media (NICEM), represents an expansion of the NICEM subject headings list, which is designed to provide access to a database of bibliographical records of nonprint, educational media. A preface discusses the issues that led to a revamping of the subject headings,…

  10. My Favorite Things Electronically Speaking, 1997 Edition.

    ERIC Educational Resources Information Center

    Glantz, Shelley

    1997-01-01

    Responding to an informal survey, 96 media specialists named favorite software, CD-ROMs, and online sites. This article lists automation packages, electronic encyclopedias, CD-ROMs, electronic magazine indexes, CD-ROM and online database services, electronic sources of current events, laser disks for grades 6-12, word processing programs for…

  11. Apples in the Apple Library--How One Library Took a Byte.

    ERIC Educational Resources Information Center

    Ertel, Monica

    1983-01-01

    Summarizes automation of a specialized library at Apple Computer, Inc., describing software packages chosen for the following functions: word processing/text editing; cataloging and circulation; reference; and in-house databases. Examples of each function and additional sources of information on software and equipment mentioned in the article are…

  12. Bending the Rules: When Deaf Writers Leave College

    ERIC Educational Resources Information Center

    Biser, Eileen; Rubel, Linda; Toscano, Rose Marie

    2007-01-01

    On-the-job writing of deaf college graduates at all degree levels was investigated. Institutional databases and questionnaires to alumni and employers were the sources for information. Respondents were asked about editing assistance, sources and types of assistance, and perceptions of such assistance by employers and employees. Results of the…

  13. Techniques for Generating Objects in a Three-Dimensional CAD System.

    ERIC Educational Resources Information Center

    Goss, Larry D.

    1987-01-01

    Discusses coordinate systems, units of measure, scaling and levels as they relate to a database generated by a computer in a spatial rather than planer location. Describes geometric-oriented input, direct coordinates, transformations, annotation, editing and patterns. Stresses that hand drafting emulation is a short-sighted approach to…

  14. Re-examination of service-sire conception rates in the United States

    USDA-ARS?s Scientific Manuscript database

    Until recently sire conception rates (SCRs) in the United States had been published only for bulls from artificial-insemination (AI) organizations that paid dairy records processing centers a fee for editing the data and forwarding it to the national dairy database of the Council on Dairy Cattle Bre...

  15. Evaluation of the 8th AJCC staging system for pathologically versus clinically staged pancreatic adenocarcinoma: A time to revisit a dogma?

    PubMed

    Abdel-Rahman, Omar

    2018-02-01

    The 8th edition of the American Joint Committee on Cancer (AJCC) staging system for pancreatic exocrine adenocarcinoma has been released. The current study seeks to assess the 7th and 8th editions among patients registered within the Surveillance, Epidemiology, and End Results (SEER) database. The SEER database (2010-2013) was accessed through the SEER*Stat program, and AJCC 8th edition stages were reconstructed utilizing the collaborative stage descriptions. Kaplan-Meier analyses of overall survival and pancreatic cancer-specific survival (according to both the 7th and 8th editions, and according to whether pathological or clinical staging was conducted) were performed. Multivariate analysis of factors affecting pancreatic cancer-specific survival was also conducted through a Cox proportional hazards model. A total of 18,948 patients with pancreatic adenocarcinoma were identified in the period from 2010 to 2013. Pancreatic cancer-specific survival among pathologically staged patients, according to the 8th edition, showed significant differences for all pairwise comparisons among different stages (P < 0.0001) except for the comparison between stage IA and stage IB (P = 0.307) and the comparison between stage IB and stage IIA (P = 0.116); the P value for stage IA vs IIA was 0.014. Pancreatic cancer-specific survival according to the 7th edition among pathologically staged patients showed significant differences for all pairwise comparisons among different stages (P < 0.0001) except for the comparisons between IA and IB (P = 0.072), between stage IIA and stage IIB (P = 0.065), between stage IIA and stage III (P = 0.059), and between IIB and III (P = 0.595). Among clinically staged patients (i.e. those who did not undergo initial radical surgery), the prognostic performance of both the 7th and 8th edition stages for both overall survival and pancreatic cancer-specific survival was limited. There is clearly a need for two staging systems for pancreatic adenocarcinoma: a pathological and a clinical staging system. Copyright © 2018 First Affiliated Hospital, Zhejiang University School of Medicine in China. Published by Elsevier B.V. All rights reserved.
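
    A minimal sketch of the kind of analysis described (Kaplan-Meier estimates per stage, a pairwise log-rank comparison, and a Cox model), using the lifelines package; the input file and column names ("months", "event", "stage8", "age", "grade") are hypothetical placeholders, not the SEER*Stat extract used in the study:

```python
# Hedged sketch of stage-wise survival comparison with lifelines.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("pancreatic_cases.csv")  # hypothetical extract

# Kaplan-Meier estimate per (reconstructed) 8th-edition stage.
km_by_stage = {
    stage: KaplanMeierFitter().fit(grp["months"], grp["event"], label=str(stage))
    for stage, grp in df.groupby("stage8")
}

# Pairwise log-rank comparison, e.g. stage IA vs stage IB.
ia, ib = df[df.stage8 == "IA"], df[df.stage8 == "IB"]
res = logrank_test(ia["months"], ib["months"],
                   event_observed_A=ia["event"], event_observed_B=ib["event"])
print(res.p_value)

# Multivariable Cox proportional hazards model for cancer-specific survival.
cph = CoxPHFitter().fit(df[["months", "event", "age", "grade"]],
                        duration_col="months", event_col="event")
cph.print_summary()
```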

  16. Energy technologies and the environment: Environmental information handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-10-01

    This revision of Energy Technologies and the Environment reflects the changes in energy supply and demand, focus of environmental concern, and emphasis of energy research and development that have occurred since publication of the earlier edition in 1980. The increase in availability of oil and natural gas, at least for the near term, is responsible in part for a reduced emphasis on development of replacement fuels and technologies. Trends in energy development also have been influenced by an increased reliance on private industry initiatives, and a correspondingly reduced government involvement, in demonstrating more developed technologies. Environmental concerns related to acid rain and waste management continue to increase the demand for development of innovative energy systems. The basic criteria for including a technology in this report are that (1) the technology is a major current or potential future energy supply and (2) significant changes in employing or understanding the technology have occurred since publication of the 1980 edition. Coal is seen to be a continuing major source of energy supply, and thus chapters pertaining to the principal coal technologies have been revised from the 1980 edition (those on coal mining and preparation, conventional coal-fired power plants, fluidized-bed combustion, coal gasification, and coal liquefaction) or added as necessary to include emerging technologies (those on oil shale, combined-cycle power plants, coal-liquid mixtures, and fuel cells).

  17. Combustion of interacting droplet arrays in a microgravity environment

    NASA Technical Reports Server (NTRS)

    Dietrich, Daniel L.; Haggard, John B.

    1993-01-01

    This research program involves the study of one- and two-dimensional arrays of droplets in a buoyancy-free environment. The purpose of the work is to extend the database and theories that exist for single droplets into the regime where droplet interactions are important. The eventual goal is to use the results of this work as inputs to models of spray combustion, where droplets seldom burn individually; instead, the combustion history of a droplet is strongly influenced by the presence of neighboring droplets. The emphasis of the present investigation is experimental, although comparison will be made to existing theoretical and numerical treatments when appropriate. Both normal-gravity and low-gravity testing will be employed, and the results compared. The work to date will be summarized in the next section, followed by a section detailing the future plans.

  18. The development of a dynamic software for the user interaction from the geographic information system environment with the database of the calibration site of the satellite remote electro-optic sensors

    NASA Astrophysics Data System (ADS)

    Zyelyk, Ya. I.; Semeniv, O. V.

    2015-12-01

    The state of the problem of post-launch calibration of satellite electro-optic remote sensors, and its solutions in Ukraine, is analyzed. The database has been improved, and dynamic services have been created for user interaction with the database from the environment of the open geographic information system Quantum GIS, in order to provide information support for calibration activities. A dynamic application under QGIS has been developed that implements these services, allowing data to be entered, edited and extracted from the database, using object-oriented programming technology and modern complex program design patterns. The functional and algorithmic support of this dynamic software and its interface are also developed.
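
    A minimal sketch of the sort of data-access layer such a QGIS plugin might use for entering, editing and extracting records, assuming a PostgreSQL backend accessed with psycopg2; the table and column names and the connection string are hypothetical, since the abstract does not describe the actual schema:

```python
# Hedged sketch of a plugin's data-access helpers for a calibration-site DB.
# "cal_measurements" and its columns are illustrative placeholders.
import psycopg2

conn = psycopg2.connect("dbname=calibration_site user=qgis_user")

def add_measurement(site_id: int, band: str, reflectance: float) -> None:
    # The connection context manager commits on success, rolls back on error.
    with conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO cal_measurements (site_id, band, reflectance) "
            "VALUES (%s, %s, %s)",
            (site_id, band, reflectance),
        )

def update_measurement(meas_id: int, reflectance: float) -> None:
    with conn, conn.cursor() as cur:
        cur.execute(
            "UPDATE cal_measurements SET reflectance = %s WHERE id = %s",
            (reflectance, meas_id),
        )

def measurements_for_site(site_id: int):
    with conn, conn.cursor() as cur:
        cur.execute(
            "SELECT id, band, reflectance FROM cal_measurements "
            "WHERE site_id = %s",
            (site_id,),
        )
        return cur.fetchall()
```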

  19. Internet Databases of the Properties, Enzymatic Reactions, and Metabolism of Small Molecules—Search Options and Applications in Food Science

    PubMed Central

    Minkiewicz, Piotr; Darewicz, Małgorzata; Iwaniak, Anna; Bucholska, Justyna; Starowicz, Piotr; Czyrko, Emilia

    2016-01-01

    Internet databases of small molecules, their enzymatic reactions, and metabolism have emerged as useful tools in food science. Database searching is also introduced as part of chemistry or enzymology courses for food technology students. Such resources support the search for information about single compounds and facilitate the introduction of secondary analyses of large datasets. Information can be retrieved from databases by searching for the compound name or structure, annotating with the help of chemical codes or drawn using molecule editing software. Data mining options may be enhanced by navigating through a network of links and cross-links between databases. Exemplary databases reviewed in this article belong to two classes: tools concerning small molecules (including general and specialized databases annotating food components) and tools annotating enzymes and metabolism. Some problems associated with database application are also discussed. Data summarized in computer databases may be used for calculation of daily intake of bioactive compounds, prediction of metabolism of food components, and their biological activity as well as for prediction of interactions between food component and drugs. PMID:27929431

  20. Internet Databases of the Properties, Enzymatic Reactions, and Metabolism of Small Molecules-Search Options and Applications in Food Science.

    PubMed

    Minkiewicz, Piotr; Darewicz, Małgorzata; Iwaniak, Anna; Bucholska, Justyna; Starowicz, Piotr; Czyrko, Emilia

    2016-12-06

    Internet databases of small molecules, their enzymatic reactions, and metabolism have emerged as useful tools in food science. Database searching is also introduced as part of chemistry or enzymology courses for food technology students. Such resources support the search for information about single compounds and facilitate the introduction of secondary analyses of large datasets. Information can be retrieved from databases by searching for the compound name or structure, annotating with the help of chemical codes or drawn using molecule editing software. Data mining options may be enhanced by navigating through a network of links and cross-links between databases. Exemplary databases reviewed in this article belong to two classes: tools concerning small molecules (including general and specialized databases annotating food components) and tools annotating enzymes and metabolism. Some problems associated with database application are also discussed. Data summarized in computer databases may be used for calculation of daily intake of bioactive compounds, prediction of metabolism of food components, and their biological activity as well as for prediction of interactions between food component and drugs.

  1. A National-Level Validation of the New American Joint Committee on Cancer 8th Edition Subclassification of Stage IIA and B Anal Squamous Cell Cancer.

    PubMed

    Goffredo, Paolo; Garancini, Mattia; Robinson, Timothy J; Frakes, Jessica; Hoshi, Hisakazu; Hassan, Imran

    2018-06-01

    The 8th edition of the American Joint Committee on Cancer (AJCC) updated the staging system of anal squamous cell cancer (ASCC) by subdividing stage II into A (T2N0M0) and B (T3N0M0) based on a secondary analysis of the RTOG 98-11 trial. We aimed to validate this new subclassification utilizing two nationally representative databases. The National Cancer Database (NCDB) [2004-2014] and the Surveillance, Epidemiology, and End Results (SEER) database [1988-2013] were queried to identify patients with stage II ASCC. A total of 6651 and 2579 stage IIA (2-5 cm) and 1777 and 641 stage IIB (> 5 cm) patients were identified in the NCDB and SEER databases, respectively. Compared with stage IIB patients, stage IIA patients within the NCDB were more often females with fewer comorbidities. No significant differences were observed between age, race, receipt of chemotherapy and radiation, and mean radiation dose. Demographic, clinical, and pathologic characteristics were comparable between patients in both datasets. The 5-year OS was 72% and 69% for stage IIA versus 57% and 50% for stage IIB in the NCDB and SEER databases, respectively (p < 0.001). After adjustment for available demographic and clinical confounders, stage IIB was significantly associated with worse survival in both cohorts (hazard ratio 1.58 and 2.01, both p < 0.001). This study validates the new AJCC subclassification of stage II anal cancer into A and B based on size (2-5 cm vs. > 5 cm) in the general ASCC population. AJCC stage IIB patients represent a higher risk category that should be targeted with more aggressive/novel therapies.

  2. Carbon deposition model for oxygen-hydrocarbon combustion

    NASA Technical Reports Server (NTRS)

    Bossard, John A.

    1988-01-01

    The objectives are to use existing hardware to verify and extend the database generated on the original test programs. The data to be obtained are the carbon deposition characteristics when methane is used at injection densities comparable to full scale values. The database will be extended to include liquid natural gas (LNG) testing at low injection densities for gas generator/preburner conditions. The testing will be performed at mixture ratios between 0.25 and 0.60, and at chamber pressures between 750 and 1500 psi.

  3. Calvert City Power Combustion Turbine Facility

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  4. ZINC: A Free Tool to Discover Chemistry for Biology

    PubMed Central

    2012-01-01

    ZINC is a free public resource for ligand discovery. The database contains over twenty million commercially available molecules in biologically relevant representations that may be downloaded in popular ready-to-dock formats and subsets. The Web site also enables searches by structure, biological activity, physical property, vendor, catalog number, name, and CAS number. Small custom subsets may be created, edited, shared, docked, downloaded, and conveyed to a vendor for purchase. The database is maintained and curated for a high purchasing success rate and is freely available at zinc.docking.org. PMID:22587354
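
    As a loose illustration of the property-based subsetting that ZINC supports (this is not ZINC's own code or API), the following sketch filters a few example molecules by molecular weight and logP using the open-source RDKit package; the thresholds and SMILES strings are arbitrary.

      # Filter a few molecules by simple physical properties with RDKit.
      # Requires the open-source RDKit package; thresholds are arbitrary.
      from rdkit import Chem
      from rdkit.Chem import Descriptors

      smiles = {
          "aspirin": "CC(=O)OC1=CC=CC=C1C(=O)O",
          "ibuprofen": "CC(C)CC1=CC=C(C=C1)C(C)C(=O)O",
          "caffeine": "CN1C=NC2=C1C(=O)N(C(=O)N2C)C",
      }

      for name, smi in smiles.items():
          mol = Chem.MolFromSmiles(smi)
          mw = Descriptors.MolWt(mol)
          logp = Descriptors.MolLogP(mol)
          verdict = "keep" if (mw <= 200.0 and logp <= 3.0) else "reject"
          print(f"{name}: MW={mw:.1f}, logP={logp:.2f} -> {verdict}")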

  5. Management system for the SND experiments

    NASA Astrophysics Data System (ADS)

    Pugachev, K.; Korol, A.

    2017-09-01

    A new management system for the SND detector experiments (at the VEPP-2000 collider in Novosibirsk) has been developed. We describe here the interaction between a user and the SND databases, which contain experiment configuration, conditions, and metadata. The new system is designed in a client-server architecture and has several logical layers corresponding to the users' roles. A new template engine was created, and a web application was implemented using the Node.js framework. At present the application provides viewing and editing of the configuration, viewing of experiment metadata and of the experiment conditions data index, and viewing of the SND log (prototype).

  6. A priori and a posteriori analyses of the flamelet/progress variable approach for supersonic combustion

    NASA Astrophysics Data System (ADS)

    Saghafian, Amirreza; Pitsch, Heinz

    2012-11-01

    A compressible flamelet/progress variable approach (CFPV) has been devised for high-speed flows. Temperature is computed from the transported total energy and tabulated species mass fractions, and the source term of the progress variable is rescaled with pressure and temperature. Combustion is thus modeled by three additional scalar equations and a chemistry table that is computed in a pre-processing step. Three-dimensional direct numerical simulation (DNS) databases of a reacting supersonic turbulent mixing layer with detailed chemistry are analyzed to assess the underlying assumptions of CFPV. Large eddy simulations (LES) of the same configuration using the CFPV method have been performed and compared with the DNS results. The LES computations are based on presumed subgrid PDFs of mixture fraction and progress variable, a beta function and a delta function respectively, which are assessed using the DNS databases. The flamelet equation budget is also computed to verify the validity of the CFPV method for high-speed flows.
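
    A minimal sketch of the presumed-PDF step described above, assuming a beta distribution for the subgrid mixture fraction: the resolved mean and variance fix the beta shape parameters, and a tabulated flamelet quantity is integrated against that PDF. The profile and parameter values below are arbitrary placeholders, not taken from the study.

      # Favre-average a flamelet quantity phi(Z) over mixture fraction Z with a
      # beta PDF whose shape follows from the resolved mean and variance.
      import numpy as np
      from scipy.stats import beta as beta_dist

      def beta_pdf_average(phi, z_mean, z_var, n=2001):
          gamma = z_mean * (1.0 - z_mean) / z_var - 1.0   # must be positive
          a, b = z_mean * gamma, (1.0 - z_mean) * gamma
          z = np.linspace(1e-6, 1.0 - 1e-6, n)
          pdf = beta_dist.pdf(z, a, b)
          pdf /= np.trapz(pdf, z)                         # renormalize numerically
          return np.trapz(phi(z) * pdf, z)

      # Example profile: a quantity peaking near a stoichiometric Z of about 0.2
      phi = lambda z: np.exp(-((z - 0.2) / 0.05) ** 2)
      print(beta_pdf_average(phi, z_mean=0.2, z_var=5e-3))   # broad PDF
      print(beta_pdf_average(phi, z_mean=0.2, z_var=5e-4))   # narrow PDF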

  7. PylotDB - A Database Management, Graphing, and Analysis Tool Written in Python

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnette, Daniel W.

    2012-01-04

    PylotDB, written completely in Python, provides a user interface (UI) with which to interact with, analyze, graph data from, and manage open source databases such as MySQL. The UI spares the user from needing in-depth knowledge of the database application programming interface (API). PylotDB allows the user to generate various kinds of plots from user-selected data; generate statistical information on text as well as numerical fields; back up and restore databases; compare database tables across different databases as well as across different servers; extract information from any field to create new fields; generate, edit, and delete databases, tables, and fields; read CSV data into a table or generate CSV data from one; and perform similar operations. Since much of the database information is brought under control of the Python computer language, PylotDB is not intended for huge databases, for which MySQL and Oracle, for example, are better suited. PylotDB is better suited for the smaller databases typically needed in a small research group. PylotDB can also be used as a learning tool for database applications in general.
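
    A hedged sketch of the query-and-summarize workflow that a tool like PylotDB automates, written against the standard-library sqlite3 module so it is self-contained; this is not PylotDB's own API, and the table contents are placeholders.

      # Load a small table, then compute simple statistics on a selected field,
      # the kind of operation a database UI like PylotDB wraps for the user.
      import sqlite3
      import statistics

      conn = sqlite3.connect(":memory:")
      cur = conn.cursor()
      cur.execute("CREATE TABLE runs (id INTEGER PRIMARY KEY, pressure REAL, temp REAL)")
      cur.executemany(
          "INSERT INTO runs (pressure, temp) VALUES (?, ?)",
          [(750.0, 1850.0), (1000.0, 1920.0), (1250.0, 1975.0), (1500.0, 2010.0)],
      )
      conn.commit()

      temps = [row[0] for row in cur.execute("SELECT temp FROM runs")]
      print("n =", len(temps))
      print("mean =", statistics.mean(temps))
      print("stdev =", statistics.stdev(temps))
      conn.close()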

  8. Effects of technical editing in biomedical journals: a systematic review.

    PubMed

    Wager, Elizabeth; Middleton, Philippa

    2002-06-05

    Technical editing supposedly improves the accuracy and clarity of journal articles. We examined evidence of its effects on research reports in biomedical journals. Subset of a systematic review using Cochrane methods, searching MEDLINE, EMBASE, and other databases from earliest entries to February 2000 by using inclusive search terms; hand searching relevant journals. We selected comparative studies of the effects of editorial processes on original research articles between acceptance and publication in biomedical journals. Two reviewers assessed each study and performed independent data extraction. The 11 studies on technical editing indicate that it improves the readability of articles slightly (as measured by Gunning Fog and Flesch reading ease scores), may improve other aspects of their quality, can increase the accuracy of references and quotations, and raises the quality of abstracts. Supplying authors with abstract preparation instructions had no discernible effect. Considering the time and resources devoted to technical editing, remarkably little is known about its effects or the effects of imposing different house styles. Studies performed at 3 journals employing relatively large numbers of professional technical editors suggest that their editorial processes are associated with increases in readability and quality of articles, but these findings may not be generalizable to other journals.
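
    For readers unfamiliar with the readability measures cited above, the Flesch Reading Ease score can be computed from sentence, word, and syllable counts. The sketch below uses the standard published formula with a crude vowel-group syllable heuristic, so its output is only approximate.

      # Flesch Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
      # Syllables are approximated by counting vowel groups, so scores are rough.
      import re

      def count_syllables(word):
          return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

      def flesch_reading_ease(text):
          sentences = max(1, len(re.findall(r"[.!?]+", text)))
          words = re.findall(r"[A-Za-z']+", text)
          syllables = sum(count_syllables(w) for w in words)
          return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

      sample = ("Technical editing supposedly improves the accuracy and clarity "
                "of journal articles. We examined evidence of its effects.")
      print(round(flesch_reading_ease(sample), 1))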

  9. CrisprGE: a central hub of CRISPR/Cas-based genome editing.

    PubMed

    Kaur, Karambir; Tandon, Himani; Gupta, Amit Kumar; Kumar, Manoj

    2015-01-01

    The CRISPR system is a powerful defense mechanism in bacteria and archaea that provides immunity against viruses. Recently, this process found a new application in the intentional targeting of genomes. CRISPR-mediated genome editing is performed by two main components, namely a single guide RNA and the Cas9 protein. Despite the enormous amount of data generated in this area, there is a dearth of high-throughput resources. Therefore, we have developed CrisprGE, a central hub of CRISPR/Cas-based genome editing. Presently, this database holds a total of 4680 entries of 223 unique genes from 32 model and other organisms. It encompasses information about the organism, gene, target gene sequences, genetic modification, modification length, genome editing efficiency, cell line, assay, etc. This repository is developed using the open source LAMP (Linux Apache MYSQL PHP) server. User-friendly browsing and searching facilities are integrated for easy data retrieval. It also includes useful tools like BLAST CrisprGE, BLAST NTdb and CRISPR Mapper. Considering the potential utilities of CRISPR in the vast area of biology and therapeutics, we foresee this platform as an aid to accelerate research in the burgeoning field of genome engineering. © The Author(s) 2015. Published by Oxford University Press.

  10. Agricultural Safety and Health: A Resource Guide. Rural Information Center Publication Series, No. 40. Revised Edition.

    ERIC Educational Resources Information Center

    Zimmerman, Joy, Comp.

    This guide lists resource materials that address agricultural occupational injuries and diseases and their prevention. Many of the entries were derived from the AGRICOLA database produced by the National Agricultural Library and include journal articles, books, government reports, training materials, and audiovisual materials. The first section…

  11. Criminal Justice Research in Libraries and on the Internet.

    ERIC Educational Resources Information Center

    Nelson, Bonnie R.

    In addition to covering the enduring elements of traditional research on criminal justice, this new edition provides full coverage on research using the World Wide Web, hypertext documents, computer indexes, and other online resources. It gives an in-depth explanation of such concepts as databases, networks, and full text, and covers the Internet…

  12. Helping Students Succeed. Annual Report, 2010

    ERIC Educational Resources Information Center

    New Mexico Higher Education Department, 2010

    2010-01-01

    This annual report contains postsecondary data that have been collected and analyzed using the New Mexico Higher Education Department's Data Editing and Reporting (DEAR) database, unless otherwise noted. The purpose of the DEAR system is to increase the reliability of the data and to make more efficient the efforts by institutions and the New Mexico…

  13. Working with Computers: Computer Orientation for Foreign Students.

    ERIC Educational Resources Information Center

    Barlow, Michael

    Designed as a resource for foreign students, this book includes instructions not only on how to use computers, but also on how to use them to complete academic work more efficiently. Part I introduces the basic operations of mainframes and microcomputers and the major areas of computing, i.e., file management, editing, communications, databases,…

  14. Microgravity science and applications bibliography, 1989 revision

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This edition of the Microgravity Science and Applications (MSA) Bibliography is a compilation of government reports, contractor reports, conference proceedings, and journal articles dealing with flight experiments utilizing a low gravity environment to elucidate and control various processes, or with ground-based activities that provide supporting research. It encompasses literature published but not cited in the 1988 Revision and that literature which has been published in the past year. Subdivisions of the Bibliography include: electronic materials; metals, alloys, and composites; fluids, interfaces, and transport; glasses and ceramics; biotechnology; combustion science; and experimental technology, facilities, and instrumentation. Also included are publications from the European, Soviet, and Japanese programs.

  15. Microgravity science and applications bibliography, 1990 revision

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This edition of the Microgravity Science and Applications (MSA) Bibliography is a compilation of government reports, contractor reports, conference proceedings, and journal articles dealing with flight experiments utilizing a low gravity environment to elucidate and control various processes, or with ground based activities that provide supporting research. It encompasses literature published but not cited in the 1989 Revision and that literature which has been published in the past year. Subdivisions of the bibliography include: electronic materials; metals, alloys, and composites; fluids, interfaces, and transport; glasses and ceramics; biotechnology; combustion science; and experimental technology, facilities, and instrumentation. Also included are publications from the European, Soviet, and Japanese programs.

  16. Microgravity science and applications bibliography, 1991 revision

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This edition of the Microgravity Science and Applications (MSA) Bibliography is a compilation of government reports, contractor reports, conference proceedings, and journal articles dealing with flight experiments using a low gravity environment to elucidate and control various processes, or with ground based activities that provide supporting research. It encompasses literature published but not cited in the 1990 Revision and that literature which has been published in the past year. Subdivisions of the bibliography include: Electronic materials; Metals, alloys, and composites; Fluids, interfaces and transport; Glasses and ceramics; Biotechnology; Combustion science; and Experimental technology, instrumentation, and facilities. Also included are a limited number of publications from the European, Soviet, and Japanese programs.

  17. The table of isotopes-8th edition and beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Firestone, R.B.

    A new edition of the Table of Isotopes has been published this year by John Wiley and Sons, Inc. This edition is the eighth in a series started by Glenn T. Seaborg in 1940. The two-volume, 3168-page, cloth-bound edition is twice the size of the previous edition published in 1978. It contains nuclear structure and decay data, based mainly on the Evaluated Nuclear Structure Data File (ENSDF), for >3100 isotopes and isomers. Approximately 24000 references are cited, and the appendices have been updated and extended. The book is packaged with an interactive CD-ROM that contains the Table of Isotopes in Adobe Acrobat Portable Document Format for convenient viewing on personal computer (PC) and UNIX workstations. The CD-ROM version contains a chart of the nuclides graphical index and separate indices organized for radioisotope users and nuclear structure physicists. More than 100000 hypertext links are provided to move the user quickly through related information free from the limitations of page size. Complete references with keyword abstracts are provided. The CD-ROM also contains the Table of Super-deformed Nuclear Bands and Fission Isomers; Tables of Atoms, Atomic Nuclei, and Subatomic Particles by Ivan P. Selinov; the ENSDF and nuclear structure reference (NSR) databases; the ENSDF manual by Jagdish K. Tuli; and Adobe Acrobat Reader software.

  18. OH PLIF Visualization of the UVa Supersonic Combustion Experiment: Configuration C

    NASA Technical Reports Server (NTRS)

    McRae, Colin D.; Johansen, Craig T.; Danehy, Paul M.; Gallo, Emanuela C. A.; Cantu, Luca M. L.; Magnotti, Gaetano; Cutler, Andrew D.; Rockwell, Robert D., Jr.; Goyne, Christopher P.; McDaniel, James C.

    2013-01-01

    Non-intrusive hydroxyl radical (OH) planar laser-induced fluorescence (PLIF) measurements were obtained in configuration C of the University of Virginia supersonic combustion experiment. The combustion of hydrogen fuel injected through an unswept compression ramp into a supersonic cross-flow was imaged over a range of streamwise positions. Images were corrected for optical distortion, variations in the laser sheet profile, and different camera views. Results indicate an effect of fuel equivalence ratio on combustion zone shape and local turbulence length scale. The streamwise location of the reaction zone relative to the fuel injector was also found to be sensitive to the fuel equivalence ratio. The flow boundary conditions in the combustor section, which are sensitive to the fuel flow rate, are believed to have caused this effect. A combination of laser absorption and radiative trapping effects is proposed to have caused the asymmetry observed in the images. The results complement previously published OH PLIF data obtained for configuration A along with other non-intrusive measurements to form a database for computational fluid dynamics (CFD) model validation.

  19. Optical Measurements in a Combustor Using a 9-Point Swirl-Venturi Fuel Injector

    NASA Technical Reports Server (NTRS)

    Hicks, Yolanda R.; Anderson, Robert C.; Locke, Randy J.

    2007-01-01

    This paper highlights the use of two-dimensional data to characterize a multipoint swirl-venturi injector. The injector is based on a NASA-conceived lean direct injection concept. Using a variety of advanced optical diagnostic techniques, we examine the flows resultant from multipoint, lean-direct injectors that have nine injection sites arranged in a 3 x 3 grid. The measurements are made within an optically-accessible, jet-A-fueled, 76-mm by 76-mm flame tube combustor. Combustion species mapping and velocity measurements are obtained using planar laser-induced fluorescence of OH and fuel, planar laser scatter of liquid fuel, chemiluminescence from CH*, NO*, and OH*, and particle image velocimetry of seeded air (non-fueled). These measurements are used to study fuel injection, mixedness, and combustion processes and are part of a database of measurements that will be used for validating computational combustion models.

  20. Genome editing of Ralstonia eutropha using an electroporation-based CRISPR-Cas9 technique.

    PubMed

    Xiong, Bin; Li, Zhongkang; Liu, Li; Zhao, Dongdong; Zhang, Xueli; Bi, Changhao

    2018-01-01

    Ralstonia eutropha is an important bacterium for the study of polyhydroxyalkanoate (PHA) synthesis and CO2 fixation, which makes it a potential strain for industrial PHA production and an attractive host for CO2 conversion. Although the bacterium is not recalcitrant to genetic manipulation, current methods for genome editing based on group II introns or single crossover integration of a suicide plasmid are inefficient and time-consuming, which limits the genetic engineering of this organism. Thus, developing an efficient and convenient method for R. eutropha genome editing is imperative. An efficient genome editing method for R. eutropha was developed using an electroporation-based CRISPR-Cas9 technique. In our study, the electroporation efficiency of R. eutropha was found to be limited by its restriction-modification (RM) systems. By searching for the putative RM systems in R. eutropha H16 using the REBASE database and comparing with those in E. coli MG1655, five putative restriction endonuclease genes related to the RM systems in R. eutropha were predicted and disrupted. It was found that deletion of H16_A0006 and H16_A0008-9 increased the electroporation efficiency 1658 and 4 times, respectively. Fructose was found to reduce the leaky expression of the arabinose-inducible pBAD promoter, which was used to optimize the expression of cas9, enabling genome editing via homologous recombination based on CRISPR-Cas9 in R. eutropha. A total of five genes were edited with efficiencies ranging from 78.3 to 100%. The CRISPR-Cpf1 system and the non-homologous end joining mechanism were also investigated, but failed to yield edited strains. We present the first genome editing method for R. eutropha using an electroporation-based CRISPR-Cas9 approach, which significantly increases the efficiency and decreases the time needed to manipulate this facultative chemolithoautotrophic microbe. The novel technique will facilitate more advanced research and applications of R. eutropha for PHA production and CO2 conversion.

  1. Assessment of the American Joint Commission on Cancer 8th Edition Staging System for Patients with Pancreatic Neuroendocrine Tumors: A Surveillance, Epidemiology, and End Results analysis.

    PubMed

    Li, Xiaogang; Gou, Shanmiao; Liu, Zhiqiang; Ye, Zeng; Wang, Chunyou

    2018-03-01

    Although several staging systems have been proposed for pancreatic neuroendocrine tumors (pNETs), the optimal staging system remains unclear. Here, we aimed to assess the application of the newly revised 8th edition American Joint Committee on Cancer (AJCC) staging system for exocrine pancreatic carcinoma (EPC) to pNETs, in comparison with that of other staging systems. We identified pNETs patients from the Surveillance, Epidemiology, and End Results (SEER) database (2004-2014). Overall survival was analyzed using Kaplan-Meier curves with the log-rank test. The predictive accuracy of each staging system was assessed by the concordance index (c-index). Cox proportional hazards regression was conducted to calculate the impact of different stages. In total, 2424 patients with pNETs, including 2350 who underwent resection, were identified using SEER data. Patients with different stages were evenly stratified based on the 8th edition AJCC staging system for EPC. Kaplan-Meier curves were well separated in all patients and patients with resection using the 8th edition AJCC staging system for EPC. Moreover, the hazard ratio increased with worsening disease stage. The c-index of the 8th edition AJCC staging system for EPC was similar to that of the other systems. For pNETs patients, the 8th edition AJCC staging system for EPC exhibits good prognostic discrimination among different stages in both all patients and those with resection. © 2018 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.

  2. The Giardia genome project database.

    PubMed

    McArthur, A G; Morrison, H G; Nixon, J E; Passamaneck, N Q; Kim, U; Hinkle, G; Crocker, M K; Holder, M E; Farr, R; Reich, C I; Olsen, G E; Aley, S B; Adam, R D; Gillin, F D; Sogin, M L

    2000-08-15

    The Giardia genome project database provides an online resource for Giardia lamblia (WB strain, clone C6) genome sequence information. The database includes edited single-pass reads, the results of BLASTX searches, and details of progress towards sequencing the entire 12 million-bp Giardia genome. Pre-sorted BLASTX results can be retrieved based on keyword searches and BLAST searches of the high throughput Giardia data can be initiated from the web site or through NCBI. Descriptions of the genomic DNA libraries, project protocols and summary statistics are also available. Although the Giardia genome project is ongoing, new sequences are made available on a bi-monthly basis to ensure that researchers have access to information that may assist them in the search for genes and their biological function. The current URL of the Giardia genome project database is www.mbl.edu/Giardia.

  3. Effect of warning statements in e-cigarette advertisements: an experiment with young adults in the US

    PubMed Central

    Sanders-Jackson, Ashley; Schleicher, Nina C.; Fortmann, Stephen P.; Henriksen, Lisa

    2016-01-01

    Background and Aims This on-line experiment examined whether the addition of ingredient- or industry-themed warning statements in television advertisements for e-cigarettes would affect young adults’ craving for and risk perceptions of e-cigarettes and combustible cigarettes, as well as intent to purchase e-cigarettes. Design Advertisements for two leading e-cigarette brands were edited to contain a warning statement about product ingredients or about the tobacco industry. Participants were assigned randomly to one of eight treatments or one of two brand-specific control conditions without any warning statement. Participants Young adults (n=900, ages 18–34 years) in a web panel were recruited from three groups: recent e-cigarette users, current smokers who used combustible cigarettes exclusively and non-users of either product. Measurements Craving and risk perceptions (addictiveness, harmful to health in general, harmful to others) were measured separately for e-cigarettes and combustible cigarettes. The Juster scale measured intention to purchase e-cigarettes. Findings Exposure to both types of warnings was associated with lower craving for e-cigarettes among e-cigarette users and smokers who experienced any craving (P <0.01) and lower intention to purchase among all participants (P <0.001). Only exposure to ingredient-themed warnings was associated with lower craving for combustible cigarettes (P<0.05). Participants who saw industry-themed warnings reported greater perceptions of general harm (P<0.001), but also rated e-cigarettes as less addictive than the control conditions (P<0.05). Conclusion The addition of ingredient- or industry-themed warning statements to e-cigarette television advertising similarly reduces craving and purchase intent for e-cigarettes, but has inconsistent effects on perceived risks. PMID:25557128

  4. Effect of warning statements in e-cigarette advertisements: an experiment with young adults in the United States.

    PubMed

    Sanders-Jackson, Ashley; Schleicher, Nina C; Fortmann, Stephen P; Henriksen, Lisa

    2015-12-01

    This on-line experiment examined whether the addition of ingredient- or industry-themed warning statements in television advertisements for e-cigarettes would affect young adults' craving for and risk perceptions of e-cigarettes and combustible cigarettes, as well as intent to purchase e-cigarettes. Advertisements for two leading e-cigarette brands were edited to contain a warning statement about product ingredients or about the tobacco industry. Participants were assigned randomly to one of eight treatments or one of two brand-specific control conditions without any warning statement. Young adults (n=900, aged 18-34 years) in a web panel were recruited from three groups: recent e-cigarette users, current smokers who used combustible cigarettes exclusively and non-users of either product. Craving and risk perceptions (addictiveness, harmful to health in general, harmful to others) were measured separately for e-cigarettes and combustible cigarettes. The Juster scale measured intention to purchase e-cigarettes. Exposure to both types of warnings was associated with lower craving for e-cigarettes among e-cigarette users and smokers who experienced any craving (P<0.01) and lower intention to purchase among all participants (P<0.001). Only exposure to ingredient-themed warnings was associated with lower craving for combustible cigarettes (P<0.05). Participants who saw industry-themed warnings reported greater perceptions of general harm (P<0.001), but also rated e-cigarettes as less addictive than the control conditions (P<0.05). The addition of ingredient- or industry-themed warning statements to e-cigarette television advertising similarly reduces craving and purchase intent for e-cigarettes, but has inconsistent effects on perceived risks. © 2015 Society for the Study of Addiction.

  5. [The biomedical periodicals of Hungarian editions--historical overview].

    PubMed

    Berhidi, Anna; Geges, József; Vasas, Lívia

    2006-03-12

    The majority of Hungarian scientific results are published in international periodicals in foreign languages, yet publications in Hungarian scientific periodicals should not be ignored. This study analyses biomedical periodicals of Hungarian edition from several points of view. Based on different databases, a list of 119 titles was compiled, containing both the core and the peripheral journals of the biomedical field. These periodicals were analysed empirically, title by title. Thirteen of the titles have ceased publication; among the remaining 106 Hungarian scientific journals, 10 are published in English. Of the remaining majority, which are published in Hungarian, only a few appear in international databases. Although a quarter of the Hungarian biomedical journals meet the requirements for representation in international databases, these periodicals are not indexed. Forty-two biomedical periodicals are available online, although a quarter of these journals have restricted access. Two-thirds of the Hungarian biomedical journals have detailed instructions for authors, which inform publishing doctors and researchers of the requirements of a biomedical periodical. The increasing number of Hungarian biomedical journals published is welcome news, but it would be important for quality publications that are cited frequently to appear in Hungarian journals: the more their publications are cited, the more journals and authors gain in prestige at the national and international level.

  6. Construction of combustion models for rapeseed methyl ester bio-diesel fuel for internal combustion engine applications.

    PubMed

    Golovitchev, Valeri I; Yang, Junfeng

    2009-01-01

    Bio-diesel fuels are non-petroleum-based diesel fuels consisting of long chain alkyl esters produced by the transesterification of vegetable oils, intended for use (neat or blended with conventional fuels) in unmodified diesel engines. There have been few reports of studies proposing theoretical models for bio-diesel combustion simulations. In this study, we developed combustion models based on ones developed previously. We compiled the liquid fuel properties, and the existing detailed mechanism of methyl butanoate ester (MB, C5H10O2) oxidation was supplemented by sub-mechanisms for two proposed fuel constituent components, C7H16 and C7H8O (and then, by mp2d, C4H6O2 and propyne, C3H4), to represent the combustion model for rapeseed methyl ester described by the chemical formula C19H34O2 (or C19H36O2). The main fuel vapor thermal properties were taken as those of methyl palmitate, C19H36O2, in the NASA polynomial form of the Burcat database. A special global reaction was introduced to "crack" the main fuel into its constituent components. The resulting mechanism included 309 species and 1472 reactions, including soot and NOx formation processes. The detailed combustion mechanism was validated using shock-tube ignition-delay data under diesel engine conditions. For constant volume and diesel engine (Volvo D12C) combustion modeling, this mechanism could be reduced to 88 species participating in 363 reactions.
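
    The NASA polynomial form mentioned above expresses cp, H, and S as polynomials in temperature with seven coefficients per temperature range. The following sketch evaluates that standard form; the coefficients are placeholders for illustration, not the actual methyl palmitate values from the Burcat database.

      # Evaluate cp, H and S from a 7-coefficient NASA polynomial (single range).
      # The coefficients below are placeholders, not real species data.
      import math

      R = 8.314462618  # J/(mol K)

      def nasa7(T, a):
          cp_R = a[0] + a[1]*T + a[2]*T**2 + a[3]*T**3 + a[4]*T**4
          h_RT = a[0] + a[1]*T/2 + a[2]*T**2/3 + a[3]*T**3/4 + a[4]*T**4/5 + a[5]/T
          s_R = a[0]*math.log(T) + a[1]*T + a[2]*T**2/2 + a[3]*T**3/3 + a[4]*T**4/4 + a[6]
          return cp_R * R, h_RT * R * T, s_R * R

      a_placeholder = [3.5, 1.0e-3, -2.0e-7, 1.0e-11, 0.0, -1.0e3, 4.0]
      cp, h, s = nasa7(1000.0, a_placeholder)
      print(f"cp = {cp:.1f} J/(mol K), H = {h/1000:.1f} kJ/mol, S = {s:.1f} J/(mol K)")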

  7. Energy and Environment Division, annual report FY 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osowitt, M.

    1981-07-01

    This report covers research in: energy analysis; energy efficiency studies; solar energy; chemical process; energy-efficient buildings; environmental pollutant studies; combustion research; laser spectroscopy and trace elements; and oil shale and coal research. An energy and environment personnel listing is appended. Separate projects are indexed individually for the database. (PSB)

  8. 3D visualization of molecular structures in the MOGADOC database

    NASA Astrophysics Data System (ADS)

    Vogt, Natalja; Popov, Evgeny; Rudert, Rainer; Kramer, Rüdiger; Vogt, Jürgen

    2010-08-01

    The MOGADOC database (Molecular Gas-Phase Documentation) is a powerful tool to retrieve information about compounds which have been studied in the gas-phase by electron diffraction, microwave spectroscopy and molecular radio astronomy. Presently the database contains over 34,500 bibliographic references (from the beginning of each method) for about 10,000 inorganic, organic and organometallic compounds and structural data (bond lengths, bond angles, dihedral angles, etc.) for about 7800 compounds. Most of the implemented molecular structures are given in a three-dimensional (3D) presentation. To create or edit and visualize the 3D images of molecules, new tools (special editor and Java-based 3D applet) were developed. Molecular structures in internal coordinates were converted to those in Cartesian coordinates.
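
    Rendering structures from tabulated bond lengths and angles requires converting internal coordinates to Cartesian ones. The sketch below does this for a simple bent three-atom case; it illustrates the general conversion rather than the MOGADOC editor's own code, and the geometry values are generic placeholders.

      # Place three atoms in Cartesian space from two bond lengths and one angle,
      # a miniature version of the internal-to-Cartesian conversion used for 3D display.
      import math

      r12, r13, angle_deg = 0.96, 0.96, 104.5   # generic bent-molecule values (Angstrom, degrees)
      a = math.radians(angle_deg)

      atom1 = (0.0, 0.0, 0.0)                              # central atom at the origin
      atom2 = (r12, 0.0, 0.0)                              # second atom along +x
      atom3 = (r13 * math.cos(a), r13 * math.sin(a), 0.0)  # third atom in the xy-plane

      for label, (x, y, z) in (("O", atom1), ("H", atom2), ("H", atom3)):
          print(f"{label} {x:8.4f} {y:8.4f} {z:8.4f}")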

  9. Marfan Database (second edition): software and database for the analysis of mutations in the human FBN1 gene.

    PubMed Central

    Collod-Béroud, G; Béroud, C; Adès, L; Black, C; Boxer, M; Brock, D J; Godfrey, M; Hayward, C; Karttunen, L; Milewicz, D; Peltonen, L; Richards, R I; Wang, M; Junien, C; Boileau, C

    1997-01-01

    Fibrillin is the major component of extracellular microfibrils. Mutations in the fibrillin gene on chromosome 15 (FBN1) were first described in the heritable connective tissue disorder Marfan syndrome (MFS). More recently, FBN1 has also been shown to harbor mutations related to a spectrum of conditions phenotypically related to MFS. These mutations are private, essentially missense, generally non-recurrent and widely distributed throughout the gene. To date no clear genotype/phenotype relationship has been observed except for the localization of neonatal mutations in a cluster between exons 24 and 32. The second version of the computerized Marfan database contains 89 entries. The software has been modified to accommodate new functions and routines. PMID:9016526

  10. Structures data collection for The National Map using volunteered geographic information

    USGS Publications Warehouse

    Poore, Barbara S.; Wolf, Eric B.; Korris, Erin M.; Walter, Jennifer L.; Matthews, Greg D.

    2012-01-01

    The U.S. Geological Survey (USGS) has historically sponsored volunteered data collection projects to enhance its topographic paper and digital map products. This report describes one phase of an ongoing project to encourage volunteers to contribute data to The National Map using online editing tools. The USGS recruited students studying geographic information systems (GIS) at the University of Colorado Denver and the University of Denver in the spring of 2011 to add data on structures - manmade features such as schools, hospitals, and libraries - to four quadrangles covering metropolitan Denver. The USGS customized a version of the online Potlatch editor created by the OpenStreetMap project and populated it with 30 structure types drawn from the Geographic Names Information System (GNIS), a USGS database of geographic features. The students corrected the location and attributes of these points and added information on structures that were missing. There were two rounds of quality control. Student volunteers reviewed each point, and an in-house review of each point by the USGS followed. Nine-hundred and thirty-eight structure points were initially downloaded from the USGS database. Editing and quality control resulted in 1,214 structure points that were subsequently added to The National Map. A post-project analysis of the data shows that after student edit and peer review, 92 percent of the points contributed by volunteers met National Map Accuracy Standards for horizontal accuracy. Lessons from this project will be applied to later phases. These include: simplifying editing tasks and the user interfaces, stressing to volunteers the importance of adding structures that are missing, and emphasizing the importance of conforming to editorial guidelines for formatting names and addresses of structures. The next phase of the project will encompass the entire State of Colorado and will allow any citizen to contribute structures data. Volunteers will benefit from this project by engaging with their local geography and contributing to a national resource of topographic information that remains in the public domain for anyone to download.

  11. Entergy Arkansas Independence and White Bluff Stations, Request for PSD Determiniation for Lignite Combustion

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  12. Applicability of PSD-WEPCO Rule for Existing Five Combined-Cycle Combustion Turbines at Cogen Technologies, Union County

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  13. Science and Technology Text Mining: Origins of Database Tomography and Multi-Word Phrase Clustering

    DTIC Science & Technology

    2003-08-15

    The origins of Database Tomography and multi-word phrase clustering are traced back six decades to pioneering work in: 1) the lexicography of Hornby [1942], which accounted for co-occurrence knowledge, and 2) the linguistics of De Saussure (Cours de Linguistique Generale).

  14. Rules for Merging MELVYL Records. Technical Report No. 6. Revised.

    ERIC Educational Resources Information Center

    Coyle, Karen

    The University of California Catalog and Periodicals databases each have over 20 separately contributing libraries, and records for the same work can enter the MELVYL system from different campus libraries. MELVYL's goal is to have one union record for each distinct edition of a work. To promote this goal, the University's Division of Library…

  15. Geologic map of the eastern part of the Challis National Forest and vicinity, Idaho

    USGS Publications Warehouse

    Wilson, A.B.; Skipp, B.A.

    1994-01-01

    The paper version of the Geologic Map of the eastern part of the Challis National Forest and vicinity, Idaho was compiled by Anna Wilson and Betty Skipp in 1994. The geology was compiled on a 1:250,000 scale topographic base map. TechniGraphic System, Inc., of Fort Collins, Colorado, digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a GIS database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.

  16. Development and Validation of a 3-Dimensional CFB Furnace Model

    NASA Astrophysics Data System (ADS)

    Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti

    At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process with regard to solid mixing, combustion, emission formation and heat transfer. Results of laboratory- and pilot-scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out in parallel with the development of three-dimensional process modeling, providing a chain of knowledge that feeds back into phenomenon research. Knowledge gathered from model validation studies and up-to-date parameter databases is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to the modeling of combustion and of char and volatile formation for various fuel types under CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat and bio-fuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window into fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace; together with lateral temperature profiles at the bed and in the upper furnace, they are used to determine solid mixing and combustion model parameters. Modeling of char- and volatile-based NO formation is followed by analysis of the oxidizing and reducing regions formed by the lower-furnace design and the mixing characteristics of fuel and combustion air, which shape the NO furnace profile through reduction and volatile-nitrogen reactions. This paper presents a CFB process analysis focused on combustion and NO profiles in pilot- and industrial-scale bituminous coal combustion.

  17. GROWTH OF THE INTERNATIONAL CRITICALITY SAFETY AND REACTOR PHYSICS EXPERIMENT EVALUATION PROJECTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. Blair Briggs; John D. Bess; Jim Gulliford

    2011-09-01

    Since the International Conference on Nuclear Criticality Safety (ICNC) 2007, the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) have continued to expand their efforts and broaden their scope. Eighteen countries participated on the ICSBEP in 2007. Now, there are 20, with recent contributions from Sweden and Argentina. The IRPhEP has also expanded from eight contributing countries in 2007 to 16 in 2011. Since ICNC 2007, the contents of the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments' [1] have increased from 442 evaluations (38000 pages), containing benchmark specifications for 3955 critical or subcritical configurations, to 516 evaluations (nearly 55000 pages), containing benchmark specifications for 4405 critical or subcritical configurations in the 2010 Edition of the ICSBEP Handbook. The contents of the Handbook have also increased from 21 to 24 criticality-alarm-placement/shielding configurations with multiple dose points for each, and from 20 to 200 configurations categorized as fundamental physics measurements relevant to criticality safety applications. Approximately 25 new evaluations and 150 additional configurations are expected to be added to the 2011 edition of the Handbook. Since ICNC 2007, the contents of the 'International Handbook of Evaluated Reactor Physics Benchmark Experiments' [2] have increased from 16 different experimental series that were performed at 12 different reactor facilities to 53 experimental series that were performed at 30 different reactor facilities in the 2011 edition of the Handbook. Considerable effort has also been made to improve the functionality of the searchable database, DICE (Database for the International Criticality Benchmark Evaluation Project), and verify the accuracy of the data contained therein. DICE will be discussed in separate papers at ICNC 2011. The status of the ICSBEP and the IRPhEP will be discussed in the full paper, selected benchmarks that have been added to the ICSBEP Handbook will be highlighted, and a preview of the new benchmarks that will appear in the September 2011 edition of the Handbook will be provided. Accomplishments of the IRPhEP will also be highlighted and the future of both projects will be discussed. REFERENCES: [1] International Handbook of Evaluated Criticality Safety Benchmark Experiments, NEA/NSC/DOC(95)03/I-IX, Organisation for Economic Co-operation and Development-Nuclear Energy Agency (OECD-NEA), September 2010 Edition, ISBN 978-92-64-99140-8. [2] International Handbook of Evaluated Reactor Physics Benchmark Experiments, NEA/NSC/DOC(2006)1, Organisation for Economic Co-operation and Development-Nuclear Energy Agency (OECD-NEA), March 2011 Edition, ISBN 978-92-64-99141-5.

  18. STE thrust chamber technology: Main injector technology program and nozzle Advanced Development Program (ADP)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The purpose of the STME Main Injector Program was to enhance the technology base for the large-scale main injector-combustor system of oxygen-hydrogen booster engines in the areas of combustion efficiency, chamber heating rates, and combustion stability. The initial task of the Main Injector Program, focused on analysis and theoretical predictions using existing models, was complemented by the design, fabrication, and test at MSFC of a subscale calorimetric, 40,000-pound thrust class, axisymmetric thrust chamber operating at approximately 2,250 psi and a 7:1 expansion ratio. Test results were used to further define combustion stability bounds, combustion efficiency, and heating rates using a large injector scale similar to the Pratt & Whitney (P&W) STME main injector design configuration including the tangential entry swirl coaxial injection elements. The subscale combustion data was used to verify and refine analytical modeling simulation and extend the database range to guide the design of the large-scale system main injector. The subscale injector design incorporated fuel and oxidizer flow area control features which could be varied; this allowed testing of several design points so that the STME conditions could be bracketed. The subscale injector design also incorporated high-reliability and low-cost fabrication techniques such as a one-piece electrical discharged machined (EDMed) interpropellant plate. Both subscale and large-scale injectors incorporated outer row injector elements with scarfed tip features to allow evaluation of reduced heating rates to the combustion chamber.

  19. Rating locomotive crew diesel emission exposure profiles using statistics and Bayesian Decision Analysis.

    PubMed

    Hewett, Paul; Bullock, William H

    2014-01-01

    For more than 20 years CSX Transportation (CSXT) has collected exposure measurements from locomotive engineers and conductors who are potentially exposed to diesel emissions. The database included measurements for elemental and total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, carbon monoxide, and nitrogen dioxide. This database was statistically analyzed and summarized, and the resulting statistics and exposure profiles were compared to relevant occupational exposure limits (OELs) using both parametric and non-parametric descriptive and compliance statistics. Exposure ratings, using the American Industrial Health Association (AIHA) exposure categorization scheme, were determined using both the compliance statistics and Bayesian Decision Analysis (BDA). The statistical analysis of the elemental carbon data (a marker for diesel particulate) strongly suggests that the majority of levels in the cabs of the lead locomotives (n = 156) were less than the California guideline of 0.020 mg/m(3). The sample 95th percentile was roughly half the guideline; resulting in an AIHA exposure rating of category 2/3 (determined using BDA). The elemental carbon (EC) levels in the trailing locomotives tended to be greater than those in the lead locomotive; however, locomotive crews rarely ride in the trailing locomotive. Lead locomotive EC levels were similar to those reported by other investigators studying locomotive crew exposures and to levels measured in urban areas. Lastly, both the EC sample mean and 95%UCL were less than the Environmental Protection Agency (EPA) reference concentration of 0.005 mg/m(3). With the exception of nitrogen dioxide, the overwhelming majority of the measurements for total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, and combustion gases in the cabs of CSXT locomotives were either non-detects or considerably less than the working OELs for the years represented in the database. When compared to the previous American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit value (TLV) of 3 ppm the nitrogen dioxide exposure profile merits an exposure rating of AIHA exposure category 1. However, using the newly adopted TLV of 0.2 ppm the exposure profile receives an exposure rating of category 4. Further evaluation is recommended to determine the current status of nitrogen dioxide exposures. [Supplementary materials are available for this article. Go to the publisher's online edition of Journal of Occupational and Environmental Hygiene for the following free supplemental resource: additional text on OELs, methods, results, and additional figures and tables.].
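
    As a hedged illustration of the lognormal compliance statistics such exposure ratings rely on (geometric mean, geometric standard deviation, 95th percentile, and exceedance fraction against a limit), the sketch below uses invented placeholder measurements rather than the CSXT data.

      # Lognormal exposure statistics: geometric mean and standard deviation,
      # 95th percentile, and exceedance fraction against a limit.
      import math
      import statistics

      measurements = [0.004, 0.007, 0.010, 0.012, 0.006, 0.009, 0.015, 0.008]  # mg/m3
      limit = 0.020                                                            # mg/m3

      logs = [math.log(x) for x in measurements]
      gm = math.exp(statistics.mean(logs))
      gsd = math.exp(statistics.stdev(logs))

      p95 = gm * gsd ** 1.645                                  # lognormal 95th percentile
      z = (math.log(limit) - math.log(gm)) / math.log(gsd)
      exceedance = 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))  # P(exposure > limit)

      print(f"GM = {gm:.4f} mg/m3, GSD = {gsd:.2f}, 95th percentile = {p95:.4f} mg/m3")
      print(f"estimated exceedance fraction = {exceedance:.1%}")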

  20. Intelligent Access to Sequence and Structure Databases (IASSD) - an interface for accessing information from major web databases.

    PubMed

    Ganguli, Sayak; Gupta, Manoj Kumar; Basu, Protip; Banik, Rahul; Singh, Pankaj Kumar; Vishal, Vineet; Bera, Abhisek Ranjan; Chakraborty, Hirak Jyoti; Das, Sasti Gopal

    2014-01-01

    With the advent of the age of big data and advances in high-throughput technology, accessing data has become one of the most important steps in the entire knowledge discovery process. Most users are not able to decipher the query results obtained when nonspecific keywords or combinations of keywords are used. Intelligent Access to Sequence and Structure Databases (IASSD) is a desktop application for the Windows operating system. It is written in Java and utilizes the Web Services Description Language (wsdl) files and Jar files of the E-utilities of various databases such as the National Centre for Biotechnology Information (NCBI) and the Protein Data Bank (PDB). In addition, IASSD allows the user to view protein structures using a JMOL application which supports conditional editing. The Jar file is freely available through e-mail from the corresponding author.
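
    IASSD itself is a Java desktop application; purely to illustrate the kind of E-utilities request such an interface wraps, the following Python sketch sends an ESearch query to NCBI. The endpoint and parameters follow the public E-utilities interface but should be verified against NCBI's current documentation.

      # Send an ESearch query to NCBI E-utilities and print the matching IDs.
      import json
      import urllib.parse
      import urllib.request

      params = urllib.parse.urlencode({
          "db": "protein",
          "term": "lipase AND Homo sapiens[Organism]",
          "retmode": "json",
          "retmax": 5,
      })
      url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?" + params

      with urllib.request.urlopen(url) as response:
          result = json.load(response)["esearchresult"]

      print("hits:", result["count"])
      print("first ids:", result["idlist"])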

  1. Global Data Toolset (GDT)

    USGS Publications Warehouse

    Cress, Jill J.; Riegle, Jodi L.

    2007-01-01

    According to the United Nations Environment Programme World Conservation Monitoring Centre (UNEP-WCMC) approximately 60 percent of the data contained in the World Database on Protected Areas (WDPA) has missing or incomplete boundary information. As a result, global analyses based on the WDPA can be inaccurate, and professionals responsible for natural resource planning and priority setting must rely on incomplete geospatial data sets. To begin to address this problem the World Data Center for Biodiversity and Ecology, in cooperation with the U. S. Geological Survey (USGS) Rocky Mountain Geographic Science Center (RMGSC), the National Biological Information Infrastructure (NBII), the Global Earth Observation System, and the Inter-American Biodiversity Information Network (IABIN) sponsored a Protected Area (PA) workshop in Asuncion, Paraguay, in November 2007. The primary goal of this workshop was to train representatives from eight South American countries on the use of the Global Data Toolset (GDT) for reviewing and editing PA data. Use of the GDT will allow PA experts to compare their national data to other data sets, including non-governmental organization (NGO) and WCMC data, in order to highlight inaccuracies or gaps in the data, and then to apply any needed edits, especially in the delineation of the PA boundaries. In addition, familiarizing the participants with the web-enabled GDT will allow them to maintain and improve their data after the workshop. Once data edits have been completed the GDT will also allow the country authorities to perform any required review and validation processing. Once validated, the data can be used to update the global WDPA and IABIN databases, which will enhance analysis on global and regional levels.

  2. Randomized Approaches for Nearest Neighbor Search in Metric Space When Computing the Pairwise Distance Is Extremely Expensive

    NASA Astrophysics Data System (ADS)

    Wang, Lusheng; Yang, Yong; Lin, Guohui

    Finding the closest object for a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects might be very time consuming. For example, it takes a long time to compute the edit distance between two whole chromosomes and the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pair-wise distance between two objects in the database is known and we want to minimize the number of distances computed on-line between the query and objects in the database in order to find the closest object. We have designed two randomized approaches for indexing metric space databases, where objects are purely described by their distances with each other. Analysis and experiments show that our approaches only need to compute O(logn) objects in order to find the closest object, where n is the total number of objects in the database.
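
    The sketch below is not the randomized scheme proposed in the paper; it only illustrates the underlying idea of using known pivot-to-object distances and the triangle inequality to skip expensive on-line distance computations during a nearest neighbor query.

      # Pivot-based filtering for nearest neighbor search in a metric space:
      # precomputed pivot-to-object distances plus the triangle inequality give the
      # lower bound |d(q,p) - d(p,o)| <= d(q,o), letting many candidates be skipped.
      def nearest_neighbor(query, objects, pivots, dist, pivot_table):
          computed = len(pivots)                      # query-to-pivot distances
          d_qp = {p: dist(query, p) for p in pivots}
          best, best_d = None, float("inf")
          for o in objects:
              lower = max(abs(d_qp[p] - pivot_table[p][o]) for p in pivots)
              if lower >= best_d:
                  continue                            # cannot beat the current best
              d = dist(query, o)
              computed += 1
              if d < best_d:
                  best, best_d = o, d
          return best, best_d, computed

      # Toy usage: 1-D points with absolute difference as the metric
      objects = [1.0, 4.0, 9.0, 16.0, 25.0]
      pivots = [0.0, 20.0]
      dist = lambda a, b: abs(a - b)
      pivot_table = {p: {o: dist(p, o) for o in objects} for p in pivots}
      print(nearest_neighbor(10.0, objects, pivots, dist, pivot_table))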

  3. Indexing of Iranian Publications in Well-known Endodontic Textbooks: A Scientometric Analysis.

    PubMed

    Kakooei, Sina; Mostafavi, Mahshid; Parirokh, Masoud; Asgary, Saeed

    2016-01-01

    Citation of an article in well-known textbooks is regarded as a credit to that paper. The number of Iranian publications mentioned in endodontic textbooks has increased during recent years. The aim of this investigation was to evaluate the number of Iranian articles quoted in eminent endodontic textbooks. Three well-known textbooks (Ingle's Endodontics, Seltzer and Bender's Dental Pulp and Cohen's Pathways of the Pulp) were chosen, and all editions of these textbooks since 2000 were investigated for quoted Iranian publications. Only Iranian authors with affiliations from a domestic university were included. All references at the end of each chapter were read by hand searching, and the results were noted. The trend and percentage of Iranian publications in different editions of the textbooks were also calculated. The number of citations of these publications in the Google Scholar and Scopus databases was also obtained. The number of Iranian publications in all the well-known textbooks has notably increased since 2000. The number and percentage of Iranian publications in the latest edition of Cohen's Pathways of the Pulp were higher than in other textbooks as well as in the previous edition of the same text. The number and percentage of Iranian publications in the field of endodontics in all three textbooks have increased remarkably since 2000.

  4. Microgravity science and applications bibliography, 1986 revision

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This edition of the Microgravity Science and Applications (MSA) Bibliography is a compilation of Government reports, contractor reports, conference proceedings, and journal articles dealing with flight experiments utilizing a low-gravity environment to elucidate and control various processes or ground-based activities providing supporting research. It encompasses literature published in FY-86 and part of FY-87 but not cited in the 1985 Revision, pending publications, and those submitted for publication during this time period. Subdivisions of the bibliography include six major categories: electronic materials; metals, alloys, and composites; fluids, interfaces, and transport; glasses and ceramics; biotechnology; and combustion science. Other categories include experimental technology and general studies. Included are publications from the European and Soviet programs. In addition, there is a list of patents and a cross reference index.

  5. Measurements of gas parameters in plasma-assisted supersonic combustion processes using diode laser spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolshov, Mikhail A; Kuritsyn, Yu A; Liger, V V

    2009-09-30

    We report a procedure for temperature and water vapour concentration measurements in an unsteady-state combustion zone using diode laser absorption spectroscopy. The procedure involves measurements of the absorption spectrum of water molecules around 1.39 μm. It has been used to determine hydrogen combustion parameters in M = 2 gas flows in the test section of a supersonic wind tunnel. The relatively high intensities of the absorption lines used have enabled direct absorption measurements. We describe a differential technique for measurements of transient absorption spectra, the procedure we used for primary data processing and approaches for determining the gas temperature and H2O concentration in the probed zone. The measured absorption spectra are fitted with spectra simulated using parameters from spectroscopic databases. The combustion-time-averaged (~50 ms) gas temperature and water vapour partial pressure in the hot wake region are determined to be 1050 K and 21 Torr, respectively. The large signal-to-noise ratio in our measurements allowed us to assess the temporal behaviour of these parameters. The accuracy in our temperature measurements in the probed zone is ~40 K.

  6. LASER APPLICATIONS AND OTHER TOPICS IN QUANTUM ELECTRONICS: Measurements of gas parameters in plasma-assisted supersonic combustion processes using diode laser spectroscopy

    NASA Astrophysics Data System (ADS)

    Bolshov, Mikhail A.; Kuritsyn, Yu A.; Liger, V. V.; Mironenko, V. R.; Leonov, S. B.; Yarantsev, D. A.

    2009-09-01

    We report a procedure for temperature and water vapour concentration measurements in an unsteady-state combustion zone using diode laser absorption spectroscopy. The procedure involves measurements of the absorption spectrum of water molecules around 1.39 μm. It has been used to determine hydrogen combustion parameters in M = 2 gas flows in the test section of a supersonic wind tunnel. The relatively high intensities of the absorption lines used have enabled direct absorption measurements. We describe a differential technique for measurements of transient absorption spectra, the procedure we used for primary data processing and approaches for determining the gas temperature and H2O concentration in the probed zone. The measured absorption spectra are fitted with spectra simulated using parameters from spectroscopic databases. The combustion-time-averaged (~50 ms) gas temperature and water vapour partial pressure in the hot wake region are determined to be 1050 K and 21 Torr, respectively. The large signal-to-noise ratio in our measurements allowed us to assess the temporal behaviour of these parameters. The accuracy in our temperature measurements in the probed zone is ~40 K.

  7. The new open Flexible Emission Inventory for Greece and the Greater Athens Area (FEI-GREGAA): Account of pollutant sources and their importance from 2006 to 2012

    NASA Astrophysics Data System (ADS)

    Fameli, Kyriaki-Maria; Assimakopoulos, Vasiliki D.

    2016-07-01

    Photochemical and particulate pollution problems persist in Athens, as they do in various European cities, despite the measures taken. Although organized and regularly updated pollutant emission databases exist for many cities, together with infrastructure to support policy implementation, this is not the case for Greece and Athens. So far, efforts to create inventories from low-resolution annual temporal and spatial data have not led to a useful database. The objective of this study was to construct an emission inventory in order to examine emission trends in Greece and the Greater Athens Area for the period 2006-2012, on spatial scales of 6 × 6 km2 and 2 × 2 km2, respectively, and on a temporal scale of 1 h. Emissions were calculated for stationary combustion sources, transportation (road, navigation and aviation), agriculture and industry, using data obtained from official national and European sources. Moreover, new emission factors were calculated for road transport and aviation. The final database, named F.E.I. - GREGAA (Flexible Emission Inventory for GREece and the GAA), is open-structured so that it can receive data updates, new pollutants, various emission scenarios and/or different emission factors, and can be transformed to any grid spacing. Its main purpose is to be used with photochemical models to help investigate the types of sources and activities that shape air quality. Results showed a decreasing trend in CO, NOx and VOCs-NMVOCs emissions and an increasing trend in PM10 emissions from 2011 onwards. Road transport and small combustion contribute most to CO emissions, road transport and navigation to NOx, and small combustion and industry to PM10. The onset of the economic crisis can be seen in the reduction of emissions from industry and the increase of biomass burning for heating purposes.
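
    Inventories of this kind are built, at bottom, by multiplying sector activity data by pollutant-specific emission factors and then disaggregating the totals onto the grid and the hourly profile. The sketch below illustrates that calculation only; the sector names, activity values, emission factors and the flat hourly profile are placeholders, not values from the FEI-GREGAA database.

```python
# Minimal sketch of the activity-data x emission-factor calculation that underlies
# gridded inventories such as FEI-GREGAA. All numbers below are hypothetical.

# annual activity data (e.g. fuel burned, vehicle-km) per sector
activity = {"road_transport": 1.2e9, "small_combustion": 4.5e8, "industry": 2.0e8}

# emission factors for one pollutant (e.g. g CO per unit activity)
emission_factor_co = {"road_transport": 8.0, "small_combustion": 5.5, "industry": 1.2}

# hourly profile: fraction of the daily emission released in each hour
hourly_profile = [1.0 / 24] * 24  # flat profile as a placeholder

annual_co = {s: activity[s] * emission_factor_co[s] for s in activity}
total_co = sum(annual_co.values())

# temporal disaggregation to a 1 h resolution (per average day)
hourly_co = [total_co / 365.0 * f for f in hourly_profile]

print(f"annual CO emission: {total_co:.3e} g")
print(f"CO emitted in hour 0 of an average day: {hourly_co[0]:.3e} g")
```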

  8. Schools and Data: The Educator's Guide for Using Data to Improve Decision Making

    ERIC Educational Resources Information Center

    Creighton, Theodore B.

    2006-01-01

    Since the first edition of "Schools and Data", the No Child Left Behind Act has swept the country, and data-based decision making is no longer an option for educators. Today's educational climate makes it imperative for all schools to collect data and use statistical analysis to help create clear goals and recognize strategies for…

  9. The Wiki as a Virtual Space for Qualitative Data Collection

    ERIC Educational Resources Information Center

    Castanos, Carolina; Piercy, Fred P.

    2010-01-01

    The authors make a case for using wiki technology in qualitative research. A wiki is an online database that allows users to create, edit, and/or reflect on the content of a web page. Thus, wiki technology can support qualitative research that attempts to understand the shared thinking of participants. To illustrate the use of the wiki for this…

  10. [Application characteristics and situation analysis of volatile oils in database of Chinese patent medicine].

    PubMed

    Wang, Sai-Jun; Wu, Zhen-Feng; Yang, Ming; Wang, Ya-Qi; Hu, Peng-Yi; Jie, Xiao-Lu; Han, Fei; Wang, Fang

    2014-09-01

    Aromatic traditional Chinese medicines have a long history in China, with a wide variety of types. Volatile oils are active ingredients extracted from aromatic herbal medicines; they usually contain tens or hundreds of constituents and have many biological activities. Therefore, volatile oils are often used in combined prescriptions and made into various efficient preparations for oral administration or external use. Based on the database of Newly Edited National Chinese Traditional Patent Medicines (second edition), the authors selected 266 Chinese patent medicines containing volatile oils and established an information sheet covering items such as name, dosage, dosage form, specification, usage and main functions. Subsequently, drawing on multidisciplinary knowledge of pharmaceutics, traditional Chinese pharmacology and the basic theory of traditional Chinese medicine, statistics were compiled on dosage forms and usage, the varieties of volatile oils and their main functions, and the status of volatile oils was analysed with respect to dosage form development, prescription development, drug instructions and quality control, in order to lay a foundation for further exploration of the market development of volatile oils and their future direction.

  11. The New Zealand Tsunami Database: historical and modern records

    NASA Astrophysics Data System (ADS)

    Barberopoulou, A.; Downes, G. L.; Cochran, U. A.; Clark, K.; Scheele, F.

    2016-12-01

    A database of historical (pre-instrumental) and modern (instrumentally recorded) tsunamis that have impacted or been observed in New Zealand has been compiled and published online. New Zealand's tectonic setting, astride an obliquely convergent tectonic boundary on the Pacific Rim, means that it is vulnerable to local, regional and circum-Pacific tsunamis. Despite New Zealand's comparatively short written historical record of c. 200 years, there is a wealth of information about the impact of past tsunamis. The New Zealand Tsunami Database currently has 800+ entries that describe >50 high-validity tsunamis. Sources of historical information include witness reports recorded in diaries, notes, newspapers, books, and photographs. Information on recent events comes from tide gauges and other instrumental recordings such as DART® buoys, and media of greater variety, for example, video and online surveys. The New Zealand Tsunami Database is an ongoing project with information added as further historical records come to light. Modern tsunamis are also added to the database once the relevant data for an event has been collated and edited. This paper briefly overviews the procedures and tools used in the recording and analysis of New Zealand's historical tsunamis, with emphasis on database content.

  12. The Design of Lexical Database for Indonesian Language

    NASA Astrophysics Data System (ADS)

    Gunawan, D.; Amalia, A.

    2017-03-01

    Kamus Besar Bahasa Indonesia (KBBI), the official dictionary of the Indonesian language, provides lists of words with their meanings. The online version can be accessed over the Internet. Another online dictionary is Kateglo. KBBI online and Kateglo only provide an interface for humans; a machine cannot easily retrieve data from the dictionary without advanced techniques. However, lexical information about words is required in research and application development related to natural language processing, text mining, information retrieval and sentiment analysis. To address this requirement, we need to build a lexical database that provides well-defined, structured information about words. A well-known lexical database is WordNet, which provides the relations among words in English. This paper proposes the design of a lexical database for the Indonesian language based on a combination of the KBBI 4th edition, Kateglo and the WordNet structure. Knowledge representation using semantic networks depicts the relations among words and provides a new structure of lexical database for the Indonesian language. The result of this design can be used as the foundation for building the lexical database for the Indonesian language.
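
    A WordNet-style lexical database is essentially a graph of lemmas (or synsets) connected by typed semantic relations such as synonymy and hypernymy. The sketch below shows one minimal way to store and query such relations; the two-table schema, the relation types and the example Indonesian entries are illustrative assumptions, not the schema proposed in the paper.

```python
import sqlite3

# Minimal sketch of a WordNet-style lexical store. Table and relation names
# (word, relation, synonym/hypernym) are illustrative, not the paper's design.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE word (id INTEGER PRIMARY KEY, lemma TEXT, gloss TEXT)")
cur.execute("CREATE TABLE relation (source INTEGER, target INTEGER, type TEXT)")

words = [(1, "kucing", "cat; small domesticated feline"),
         (2, "hewan", "animal; living organism that is not a plant"),
         (3, "meong", "informal word for cat")]
cur.executemany("INSERT INTO word VALUES (?, ?, ?)", words)

relations = [(1, 2, "hypernym"),   # kucing IS-A hewan
             (1, 3, "synonym")]    # kucing ~ meong
cur.executemany("INSERT INTO relation VALUES (?, ?, ?)", relations)

# Query: all words related to "kucing", with the relation type
cur.execute("""
    SELECT r.type, w2.lemma
    FROM word w1
    JOIN relation r ON r.source = w1.id
    JOIN word w2 ON w2.id = r.target
    WHERE w1.lemma = ?""", ("kucing",))
print(cur.fetchall())  # [('hypernym', 'hewan'), ('synonym', 'meong')]
```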

  13. [Establishment of the database of the 3D facial models for the plastic surgery based on network].

    PubMed

    Liu, Zhe; Zhang, Hai-Lin; Zhang, Zheng-Guo; Qiao, Qun

    2008-07-01

    To collect three-dimensional (3D) facial data from 30 patients with facial deformities using a 3D scanner and to establish a professional Internet-based database, which can be helpful for clinical intervention. The primitive point data of the facial topography were collected with the 3D scanner. The 3D point cloud was then edited with reverse-engineering software to reconstruct the 3D model of the face. The database system was divided into three parts: basic information, disease information and surgery information. The programming language of the web system is Java. The linkages between the tables of the database are reliable, and query operations and data mining are convenient. Users can visit the database via the Internet and use the image analysis system to observe the 3D facial models interactively. In this paper we present a database and a web system adapted to plastic surgery of the human face. It can be used both in the clinic and in basic research.

  14. Do-It-Yourself: A Special Library's Approach to Creating Dynamic Web Pages Using Commercial Off-The-Shelf Applications

    NASA Technical Reports Server (NTRS)

    Steeman, Gerald; Connell, Christopher

    2000-01-01

    Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and committing resources, from managing the time needed to populate the database to training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals and lessons learned in the Web-to-database process, including setting up Data Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Structured Query Language (SQL). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.
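
    The web-to-database setup described here hinges on ODBC Data Source Names: the ASP pages open a connection by DSN and run queries against the Access tables. The sketch below shows the same idea from Python via pyodbc rather than ASP/VBScript; the DSN, table and column names are hypothetical, not those used at the IDA library.

```python
import pyodbc

# Illustrative only: connect to an Access database through a system DSN,
# the same mechanism the paper's ASP pages rely on. "LibraryCatalog" is a
# hypothetical DSN; real table and column names are not given in the paper.
conn = pyodbc.connect("DSN=LibraryCatalog")
cur = conn.cursor()

# Parameterised query rather than string concatenation, in the spirit of the
# paper's lessons learned about redesigning queries for the Web interface.
cur.execute("SELECT Title, URL FROM Resources WHERE Subject = ?", ("aerospace",))
for title, url in cur.fetchall():
    print(title, url)
conn.close()
```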

  15. WikiPEATia - a web based platform for assembling peatland data through ‘crowd sourcing’

    NASA Astrophysics Data System (ADS)

    Wisser, D.; Glidden, S.; Fieseher, C.; Treat, C. C.; Routhier, M.; Frolking, S. E.

    2009-12-01

    The Earth System Science community is realizing that peatlands are an important and unique terrestrial ecosystem that has not yet been well integrated into large-scale earth system analyses. A major hurdle is the lack of accessible geospatial data on peatland distribution, coupled with data on peatland properties (e.g., vegetation composition, peat depth, basal dates, soil chemistry, peatland class) at the global scale. These data are, however, available at the local scale. Although a comprehensive global database on peatlands probably lags behind similar data on more economically important ecosystems such as forests, grasslands and croplands, a large amount of field data has been collected over the past several decades. A few efforts have been made to map peatlands at large scales, but the existing data have not been assembled into a single, publicly accessible geospatial database, or do not depict the level of detail needed by the Earth System Science community. A global peatland database would contribute to advances in a number of research fields such as hydrology, vegetation and ecosystem modeling, permafrost modeling, and earth system modeling. We present a Web 2.0 approach that uses state-of-the-art web server and innovative online mapping technologies and is designed to create such a global database through ‘crowd-sourcing’. Primary functions of the online system include form-driven textual user input of peatland research metadata, spatial data input of peatland areas via a mapping interface, database editing and querying capabilities, as well as advanced visualization and data analysis tools. WikiPEATia provides an integrated information technology platform for assembling, integrating, and posting peatland-related geospatial datasets, and it facilitates and encourages research community involvement. A successful effort will make existing peatland data much more useful to the research community, and will help to identify significant data gaps.

  16. Issue Concerning EPA's Position on what Fuel Combustion Equipment should be Counted Toward the 250 Million BTU/hr under PSD Requirements

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  17. MitBASE : a comprehensive and integrated mitochondrial DNA database. The present status

    PubMed Central

    Attimonelli, M.; Altamura, N.; Benne, R.; Brennicke, A.; Cooper, J. M.; D’Elia, D.; Montalvo, A. de; Pinto, B. de; De Robertis, M.; Golik, P.; Knoop, V.; Lanave, C.; Lazowska, J.; Licciulli, F.; Malladi, B. S.; Memeo, F.; Monnerot, M.; Pasimeni, R.; Pilbout, S.; Schapira, A. H. V.; Sloof, P.; Saccone, C.

    2000-01-01

    MitBASE is an integrated and comprehensive database of mitochondrial DNA data which collects, under a single interface, databases for Plant, Vertebrate, Invertebrate, Human, Protist and Fungal mtDNA and a Pilot database on nuclear genes involved in mitochondrial biogenesis in Saccharomyces cerevisiae. MitBASE reports all available information from different organisms and from intraspecies variants and mutants. Data have been drawn from the primary databases and from the literature; value-adding information has been structured, e.g., editing information on protist mtDNA genomes, pathological information for human mtDNA variants, etc. The different databases, some of which are structured using commercial packages (Microsoft Access, File Maker Pro) while others use a flat-file format, have been integrated under ORACLE. Ad hoc retrieval systems have been devised for some of the above listed databases, taking into account their peculiarities. The database is resident at the EBI and is available at the following site: http://www3.ebi.ac.uk/Research/Mitbase/mitbase.pl . The impact of this project is intended for both basic and applied research. The study of mitochondrial genetic diseases and mitochondrial DNA intraspecies diversity are key topics in several biotechnological fields. The database has been funded within the EU Biotechnology programme. PMID:10592207

  18. Linking NCBI to Wikipedia: a wiki-based approach.

    PubMed

    Page, Roderic D M

    2011-03-31

    The NCBI Taxonomy underpins many bioinformatics and phyloinformatics databases, but by itself provides limited information on the taxa it contains. One readily available source of information on many taxa is Wikipedia. This paper describes iPhylo Linkout, a Semantic wiki that maps taxa in NCBI's taxonomy database onto corresponding pages in Wikipedia. Storing the mapping in a wiki makes it easy to edit, correct, or otherwise annotate the links between NCBI and Wikipedia. The mapping currently comprises some 53,000 taxa, and is available at http://iphylo.org/linkout. The links between NCBI and Wikipedia are also made available to NCBI users through the NCBI LinkOut service.

  19. Blended Synchronous Delivery Mode in Graduate Programs: A Literature Review and Its Implementation in the Master Teacher Program

    ERIC Educational Resources Information Center

    Lakhal, Sawsen; Bateman, Dianne; Bédard, Janie

    2017-01-01

    The aim of this study is to present a narrative literature review of advantages, challenges, and conditions for the success of blended synchronous course delivery mode. For this purpose, we searched the database EditLib and analyzed 16 existing papers from 2001 to 2016. The conditions for success were operationalized in the Master Teacher Program…

  20. IUCN/SSC Invasive Species Specialist Group (ISSG)

    Science.gov Websites

    [Website excerpt; fragments of publication citations.] Barrios, V., Darwall, W.R.T. and Numa, C. (Editors). 2014. … in the Eastern Mediterranean. Cambridge, UK. … the International Conference on Island Invasives, edited by C. R. Veitch, M. N. Clout, and D. R. Towns. … combining and harmonizing data on IAS from a wide range of different databases and networks. The aim is to …

  1. A Graphics Facility for Integration, Editing, and Display of Slope, Curvature, and Contours from a Digital Terrain Elevation Database

    DTIC Science & Technology

    1988-06-01

    DETAILED PROBLEM STATEMENT / A. INTRODUCTION … assorted information about the world land masses. When this is done, the problem of storage, manipulation, and display of realistic, dense, and accurate … elevation data becomes a problem of paramount importance. If the data which is stored can be utilized to recreate specific information about certain …

  2. Melanoma staging: Evidence-based changes in the American Joint Committee on Cancer eighth edition cancer staging manual.

    PubMed

    Gershenwald, Jeffrey E; Scolyer, Richard A; Hess, Kenneth R; Sondak, Vernon K; Long, Georgina V; Ross, Merrick I; Lazar, Alexander J; Faries, Mark B; Kirkwood, John M; McArthur, Grant A; Haydu, Lauren E; Eggermont, Alexander M M; Flaherty, Keith T; Balch, Charles M; Thompson, John F

    2017-11-01

    To update the melanoma staging system of the American Joint Committee on Cancer (AJCC), a large database was assembled comprising >46,000 patients from 10 centers worldwide with stages I, II, and III melanoma diagnosed since 1998. Based on analyses of this new database, the existing seventh edition AJCC stage IV database, and contemporary clinical trial data, the AJCC Melanoma Expert Panel introduced several important changes to the Tumor, Nodes, Metastasis (TNM) classification and stage grouping criteria. Key changes in the eighth edition AJCC Cancer Staging Manual include: 1) tumor thickness measurements to be recorded to the nearest 0.1 mm, not 0.01 mm; 2) definitions of T1a and T1b are revised (T1a, <0.8 mm without ulceration; T1b, 0.8-1.0 mm with or without ulceration or <0.8 mm with ulceration), with mitotic rate no longer a T category criterion; 3) pathological (but not clinical) stage IA is revised to include T1b N0 M0 (formerly pathologic stage IB); 4) the N category descriptors "microscopic" and "macroscopic" for regional node metastasis are redefined as "clinically occult" and "clinically apparent"; 5) prognostic stage III groupings are based on N category criteria and T category criteria (ie, primary tumor thickness and ulceration) and increased from 3 to 4 subgroups (stages IIIA-IIID); 6) definitions of N subcategories are revised, with the presence of microsatellites, satellites, or in-transit metastases now categorized as N1c, N2c, or N3c based on the number of tumor-involved regional lymph nodes, if any; 7) descriptors are added to each M1 subcategory designation for lactate dehydrogenase (LDH) level (LDH elevation no longer upstages to M1c); and 8) a new M1d designation is added for central nervous system metastases. This evidence-based revision of the AJCC melanoma staging system will guide patient treatment, provide better prognostic estimates, and refine stratification of patients entering clinical trials. CA Cancer J Clin 2017;67:472-492. © 2017 American Cancer Society.

  3. Melanoma Staging: Evidence-Based Changes in the American Joint Committee on Cancer Eighth Edition Cancer Staging Manual

    PubMed Central

    Gershenwald, Jeffrey E.; Scolyer, Richard A.; Hess, Kenneth R.; Sondak, Vernon K.; Long, Georgina V.; Ross, Merrick I.; Lazar, Alexander J.; Faries, Mark B.; Kirkwood, John M.; McArthur, Grant A.; Haydu, Lauren E.; Eggermont, Alexander M. M.; Flaherty, Keith T.; Balch, Charles M.; Thompson, John F.

    2018-01-01

    To update the melanoma staging system of the American Joint Committee on Cancer (AJCC) a large database was assembled comprising >46,000 patients from 10 centers worldwide with stages I, II, and III melanoma diagnosed since 1998. Based on analyses of this new database, the existing seventh edition AJCC stage IV database, and contemporary clinical trial data, the AJCC Melanoma Expert Panel introduced several important changes to the Tumor, Nodes, Metastasis (TNM) classification and stage grouping criteria. Key changes in the eighth edition AJCC Cancer Staging Manual include: 1) tumor thickness measurements to be recorded to the nearest 0.1 mm, not 0.01 mm; 2) definitions of T1a and T1b are revised (T1a, <0.8 mm without ulceration; T1b, 0.8–1.0 mm with or without ulceration or <0.8 mm with ulceration), with mitotic rate no longer a T category criterion; 3) pathological (but not clinical) stage IA is revised to include T1b N0 M0 (formerly pathologic stage IB); 4) the N category descriptors “microscopic” and “macroscopic” for regional node metastasis are redefined as “clinically occult” and “clinically apparent”; 5) prognostic stage III groupings are based on N category criteria and T category criteria (ie, primary tumor thickness and ulceration) and increased from 3 to 4 subgroups (stages IIIA–IIID); 6) definitions of N subcategories are revised, with the presence of microsatellites, satellites, or in-transit metastases now categorized as N1c, N2c, or N3c based on the number of tumor-involved regional lymph nodes, if any; 7) descriptors are added to each M1 subcategory designation for lactate dehydrogenase (LDH) level (LDH elevation no longer upstages to M1c); and 8) a new M1d designation is added for central nervous system metastases. This evidence-based revision of the AJCC melanoma staging system will guide patient treatment, provide better prognostic estimates, and refine stratification of patients entering clinical trials. PMID:29028110
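
    The revised T1a/T1b split quoted above is simple enough to express as a rule. The sketch below encodes only that split, exactly as stated in the abstract; it is an illustration, not a full AJCC staging implementation, and the function name and interface are our own.

```python
def t1_subcategory(thickness_mm: float, ulceration: bool) -> str:
    """Eighth-edition T1 split as described in the abstract:
    T1a: < 0.8 mm without ulceration;
    T1b: 0.8-1.0 mm (with or without ulceration) or < 0.8 mm with ulceration.
    Thickness is recorded to the nearest 0.1 mm. Illustrative only; not a
    substitute for the AJCC Cancer Staging Manual."""
    thickness = round(thickness_mm, 1)
    if thickness > 1.0:
        raise ValueError("thicker than 1.0 mm: not a T1 tumour")
    if thickness < 0.8 and not ulceration:
        return "T1a"
    return "T1b"

print(t1_subcategory(0.7, ulceration=False))  # T1a
print(t1_subcategory(0.7, ulceration=True))   # T1b
print(t1_subcategory(0.9, ulceration=False))  # T1b
```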

  4. OH PLIF Visualization of the UVa Supersonic Combustion Experiment: Configuration A

    NASA Technical Reports Server (NTRS)

    Johansen, Craig T.; McRae, Colin D.; Danehy, Paul M.; Gallo, Emanuela C. A.; Cantu, Luca M. L.; Magnotti, Gaetano; Cutler, Andrew D.; Rockwell, Robert D., Jr.; Goyne, Chris P.; McDaniel, James C.

    2013-01-01

    Hydroxyl radical (OH) planar laser-induced fluorescence (PLIF) measurements were performed in the University of Virginia supersonic combustion experiment. The test section was set up in configuration A, which includes a Mach 2 nozzle, combustor, and extender section. Hydrogen fuel was injected through an unswept compression ramp at two different equivalence ratios. Through the translation of the optical system and the use of two separate camera views, the entire optically accessible range of the combustor was imaged. Single-shot, average, and standard deviation images of the OH PLIF signal are presented at several streamwise locations. The results show the development of a highly turbulent flame structure and provide an experimental database to be used for numerical model assessment.

  5. A Step Towards CO2-Neutral Aviation

    NASA Technical Reports Server (NTRS)

    Brankovic, Andreja; Ryder, Robert C.; Hendricks, Robert C.; Huber, Marcia L.

    2008-01-01

    An approximation method for evaluation of the caloric equations used in combustion chemistry simulations is described. The method is applied to generate the equations of specific heat, static enthalpy, and Gibbs free energy for fuel mixtures of interest to gas turbine engine manufacturers. Liquid-phase fuel properties are also derived. The fuels investigated include JP-8, synthetic fuel, and two blends of JP-8 and synthetic fuel. The complete set of fuel property equations for both phases is implemented into a computational fluid dynamics (CFD) flow solver database, and multiphase, reacting flow simulations of a well-tested liquid-fueled combustor are performed. The simulations are a first step in understanding combustion system performance and operational issues when using alternate fuels at practical engine operating conditions.

  6. RISSC: a novel database for ribosomal 16S–23S RNA genes spacer regions

    PubMed Central

    García-Martínez, Jesús; Bescós, Ignacio; Rodríguez-Sala, Jesús Javier; Rodríguez-Valera, Francisco

    2001-01-01

    A novel database, under the acronym RISSC (Ribosomal Intergenic Spacer Sequence Collection), has been created. It compiles more than 1600 entries of edited DNA sequence data from the 16S–23S ribosomal spacers present in most prokaryotes and organelles (e.g. mitochondria and chloroplasts) and is accessible through the Internet (http://ulises.umh.es/RISSC), where systematic searches for specific words can be conducted, as well as BLAST-type sequence searches. Additionally, a characteristic feature of this region, the presence/absence and nature of tRNA genes within the spacer, is included in all the entries, even when not previously indicated in the original database. All these combined features could provide a useful documentation tool for studies on evolution, identification, typing and strain characterization, among others. PMID:11125084

  7. The GEISA Spectroscopic Database as a Tool for Hyperspectral Earth's Tropospheric Remote Sensing Applications

    NASA Astrophysics Data System (ADS)

    Jacquinet-Husson, Nicole; Crépeau, Laurent; Capelle, Virginie; Scott, Noëlle; Armante, Raymond; Chédin, Alain

    2010-05-01

    Remote sensing of the terrestrial atmosphere has advanced significantly in recent years, and this has placed greater demands on the compilations in terms of accuracy, additional species, and spectral coverage. The successful performances of the new generation of hyperspectral Earth's atmospheric sounders like AIRS (Atmospheric Infrared Sounder - http://www-airs.jpl.nasa.gov/), in the USA, and IASI (Infrared Atmospheric Sounding Interferometer - http://earth-sciences.cnes.fr/IASI/), in Europe, which have a better vertical resolution and accuracy compared to the previous satellite infrared vertical sounders, depend ultimately on the accuracy to which the spectroscopic parameters of the optically active gases are known, since they constitute an essential input to the forward radiative transfer models that are used to interpret their observations. In this context, the GEISA (1) (Gestion et Etude des Informations Spectroscopiques Atmosphériques: Management and Study of Atmospheric Spectroscopic Information) computer-accessible database, initiated in 1976, is continuously developed and maintained at LMD (Laboratoire de Météorologie Dynamique, France). The updated 2009 edition of GEISA (GEISA-09) is a system comprising three independent sub-databases devoted respectively to: line transition parameters, infrared and ultraviolet/visible absorption cross-sections, and microphysical and optical properties of atmospheric aerosols. In this edition, the contents of which will be summarized, 50 molecules are involved in the line transition parameters sub-database, including 111 isotopes, for a total of 3,807,997 entries, in the spectral range from 10^-6 to 35,877.031 cm-1. Currently, GEISA is involved in activities related to the assessment of the capabilities of IASI through the GEISA/IASI database derived from GEISA (2). Since the Metop (http://www.eumetsat.int) launch (October 19th 2006), GEISA/IASI is the reference spectroscopic database for the validation of the level-1 IASI data, using the 4A radiative transfer model (3) (4A/LMD http://ara.lmd.polytechnique.fr; 4A/OP co-developed by LMD and NOVELTIS - http://www.noveltis.fr/) with the support of CNES (2006). Special emphasis will be given to the description of GEISA/IASI. Spectroscopic parameter quality requirements will be discussed in the context of comparisons between observed or simulated Earth's atmosphere spectra. GEISA and GEISA/IASI are implemented on the CNES/CNRS Ether Products and Services Centre Web site (http://ether.ipsl.jussieu.fr), where all archived spectroscopic data can be handled through general and user-friendly associated management software facilities. More than 350 researchers are registered for online use of GEISA. Refs: (1) Jacquinet-Husson N., N.A. Scott, A. Chédin, L. Crépeau, R. Armante, V. Capelle, J. Orphal, A. Coustenis, C. Boonne, N. Poulet-Crovisier, et al. THE GEISA SPECTROSCOPIC DATABASE: Current and future archive for Earth and planetary atmosphere studies. JQSRT 109 (2008) 1043-1059. (2) Jacquinet-Husson N., N.A. Scott, A. Chédin, K. Garceran, R. Armante, et al. The 2003 edition of the GEISA/IASI spectroscopic database. JQSRT 95 (2005) 429-467. (3) Scott, N.A. and A. Chedin. A fast line-by-line method for atmospheric absorption computations: The Automatized Atmospheric Absorption Atlas. J. Appl. Meteor. 20 (1981) 556-564.

  8. Cross-cultural validity of standardized motor development screening and assessment tools: a systematic review.

    PubMed

    Mendonça, Bianca; Sargent, Barbara; Fetters, Linda

    2016-12-01

    To investigate whether standardized motor development screening and assessment tools that are used to evaluate motor abilities of children aged 0 to 2 years are valid in cultures other than those in which the normative sample was established. This was a systematic review in which six databases were searched. Studies were selected based on inclusion/exclusion criteria and appraised for evidence level and quality. Study variables were extracted. Twenty-three studies representing six motor development screening and assessment tools in 16 cultural contexts met the inclusion criteria: Alberta Infant Motor Scale (n=7), Ages and Stages Questionnaire, 3rd edition (n=2), Bayley Scales of Infant and Toddler Development, 3rd edition (n=8), Denver Developmental Screening Test, 2nd edition (n=4), Harris Infant Neuromotor Test (n=1), and Peabody Developmental Motor Scales, 2nd edition (n=1). Thirteen studies found significant differences between the cultural context and normative sample. Two studies established reliability and/or validity of standardized motor development assessments in high-risk infants from different cultural contexts. Five studies established new population norms. Eight studies described the cross-cultural adaptation of a standardized motor development assessment. Standardized motor development assessments have limited validity in cultures other than that in which the normative sample was established. Their use can result in under- or over-referral for services. © 2016 Mac Keith Press.

  9. Examining database persistence of ISO/EN 13606 standardized electronic health record extracts: relational vs. NoSQL approaches.

    PubMed

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Lozano-Rubí, Raimundo; Serrano-Balazote, Pablo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario

    2017-08-18

    The objective of this research is to compare relational and non-relational (NoSQL) database systems for storing, recovering, querying and persisting standardized medical information in the form of ISO/EN 13606 normalized Electronic Health Record XML extracts, both in isolation and concurrently. NoSQL database systems have recently attracted much attention, but few studies in the literature address their direct comparison with relational databases when applied to build the persistence layer of a standardized medical information system. One relational and two NoSQL databases (one document-based and one native XML database) of three different sizes were created in order to evaluate and compare the response times (algorithmic complexity) of six queries of growing complexity, which were performed on them. Similar appropriate results available in the literature have also been considered. Relational and non-relational NoSQL database systems show almost linear algorithmic complexity of query execution. However, they show very different linear slopes, the former being much steeper than the two latter. Document-based NoSQL databases perform better in concurrency than in isolation, and also better than relational databases in concurrency. Non-relational NoSQL databases seem to be more appropriate than standard relational SQL databases when the database size is extremely high (secondary use, research applications). Document-based NoSQL databases perform in general better than native XML NoSQL databases. Visualization and editing of EHR extracts are also document-based tasks better suited to NoSQL database systems. However, the appropriate database solution depends greatly on each particular situation and specific problem.
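
    The contrast being evaluated is between normalising EHR extracts into relational rows queried with SQL and storing each extract as a single document queried by its attributes. The sketch below illustrates the two styles side by side; the table/collection layout and field names are illustrative and are not the ISO/EN 13606 archetype structure used in the study.

```python
import sqlite3
# from pymongo import MongoClient  # document-store side; requires a running MongoDB

# Relational style: extracts stored as rows and retrieved with SQL.
rdb = sqlite3.connect(":memory:")
rdb.execute("CREATE TABLE extract (patient_id TEXT, archetype TEXT, xml TEXT)")
rdb.execute("INSERT INTO extract VALUES ('p001', 'blood_pressure', '<extract>...</extract>')")
rows = rdb.execute(
    "SELECT xml FROM extract WHERE patient_id = ? AND archetype = ?",
    ("p001", "blood_pressure")).fetchall()
print(rows)

# Document-based NoSQL style: the whole XML extract stored as one document and
# queried by its attributes, which suits visualisation and editing tasks.
# client = MongoClient()
# col = client.ehr.extracts
# col.insert_one({"patient_id": "p001", "archetype": "blood_pressure",
#                 "xml": "<extract>...</extract>"})
# doc = col.find_one({"patient_id": "p001", "archetype": "blood_pressure"})
```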

  10. New spectroscopy in the HITRAN2016 database and its impact on atmospheric retrievals

    NASA Astrophysics Data System (ADS)

    Gordon, I.; Rothman, L. S.; Kochanov, R. V.; Tan, Y.; Toon, G. C.

    2017-12-01

    The HITRAN spectroscopic database is a backbone of the interpretation of atmospheric spectral retrievals and is an important input to radiative transfer codes. The database has served the atmospheric community for nearly half a century, with a new edition released every four years. The most recent release of the database is HITRAN2016 [1]. It consists of line-by-line lists, experimental absorption cross-sections, collision-induced absorption data and aerosol indices of refraction. This presentation will stress the importance of using the most recent edition of the database in radiative transfer codes. The line-by-line lists for most of the HITRAN molecules were updated (and two new molecules added) in comparison with the previous compilation, HITRAN2012 [2], which has been in use, along with some intermediate updates, since 2012. The extent of the updates ranges from updating a few lines of certain molecules to complete replacements of the lists and the introduction of additional isotopologues. In addition, the number of molecules in the cross-section part of the database has increased dramatically from nearly 50 to over 300. The molecules covered by the HITRAN database are important in planetary remote sensing, environmental monitoring (in particular, biomass burning detection), climate applications, industrial pollution tracking, astrophysics, and more. Taking advantage of the new structure and interface available at www.hitran.org [3] and the HITRAN Application Programming Interface [4], the number of parameters has also been significantly increased, now incorporating, for instance, non-Voigt line profiles [5]; broadening by gases other than air and "self" [6]; and other phenomena, including line mixing. This is a very important novelty that needs to be properly introduced into radiative transfer codes in order to advance accurate interpretation of remote sensing retrievals. This work is supported by the NASA PDART (NNX16AG51G) and AURA (NNX17AI78G) programs. References: [1] I.E. Gordon et al, JQSRT in press (2017) http://doi.org/10.1016/j.jqsrt.2017.06.038. [2] L.S. Rothman et al, JQSRT 130, 4 (2013). [3] C. Hill et al, JQSRT 177, 4 (2016). [4] R.V. Kochanov et al, JQSRT 177, 15 (2016). [5] P. Wcisło et al., JQSRT 177, 75 (2016). [6] J. S. Wilzewski et al., JQSRT 168, 193 (2016).
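
    The HITRAN Application Programming Interface cited as [4] lets radiative transfer and retrieval codes pull line-by-line data directly from www.hitran.org and compute absorption coefficients with the newer line profiles. The sketch below follows the usage pattern in the HAPI documentation; the molecule, spectral range and environment values are arbitrary examples, and exact call signatures should be checked against the installed HAPI release.

```python
# Minimal sketch of pulling line-by-line data and computing an absorption
# coefficient with HAPI (ref. [4]). Function names follow the HAPI documentation
# at hitran.org; verify signatures against the release you have installed.
from hapi import db_begin, fetch, absorptionCoefficient_Voigt

db_begin("hitran_data")              # local folder for downloaded line lists
fetch("CO", 5, 1, 2000.0, 2250.0)    # CO (molecule 5, isotopologue 1), 2000-2250 cm-1

# Voigt absorption coefficient at roughly atmospheric conditions
nu, coef = absorptionCoefficient_Voigt(
    SourceTables="CO",
    Environment={"T": 296.0, "p": 1.0})   # temperature in K, pressure in atm
print(nu[:5], coef[:5])
```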

  11. The GermOnline cross-species systems browser provides comprehensive information on genes and gene products relevant for sexual reproduction.

    PubMed

    Gattiker, Alexandre; Niederhauser-Wiederkehr, Christa; Moore, James; Hermida, Leandro; Primig, Michael

    2007-01-01

    We report a novel release of the GermOnline knowledgebase covering genes relevant for the cell cycle, gametogenesis and fertility. GermOnline was extended into a cross-species systems browser including information on DNA sequence annotation, gene expression and the function of gene products. The database covers eight model organisms and Homo sapiens, for which complete genome annotation data are available. The database is now built around a sophisticated genome browser (Ensembl), our own microarray information management and annotation system (MIMAS) used to extensively describe experimental data obtained with high-density oligonucleotide microarrays (GeneChips) and a comprehensive system for online editing of database entries (MediaWiki). The RNA data include results from classical microarrays as well as tiling arrays that yield information on RNA expression levels, transcript start sites and lengths as well as exon composition. Members of the research community are solicited to help GermOnline curators keep database entries on genes and gene products complete and accurate. The database is accessible at http://www.germonline.org/.

  12. A user friendly database for use in ALARA job dose assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zodiates, A.M.; Willcock, A.

    1995-03-01

    The pressurized water reactor (PWR) design chosen for adoption by Nuclear Electric plc was based on the Westinghouse Standard Nuclear Unit Power Plant (SNUPPS). This design was developed to meet the United Kingdom requirements and these improvements are embodied in the Sizewell B plant which will start commercial operation in 1994. A user-friendly database was developed to assist the station in the dose and ALARP assessments of the work expected to be carried out during station operation and outage. The database stores the information in an easily accessible form and enables updating, editing, retrieval, and searches of the information. The database contains job-related information such as job locations, number of workers required, job times, and the expected plant doserates. It also contains the means to flag job requirements such as requirements for temporary shielding, flushing, scaffolding, etc. Typical uses of the database are envisaged to be in the prediction of occupational doses, the identification of high collective and individual dose jobs, use in ALARP assessments, setting of dose targets, monitoring of dose control performance, and others.

  13. DEVELOPMENT AND APPLICATION OF A MASS SPECTRA-VOLATILITY DATABASE OF COMBUSTION AND SECONDARY ORGANIC AEROSOL SOURCES FOR THE AERODYNE AEROSOL MASS SPECTROMETER

    EPA Science Inventory

    1. Thermodenuder Development:

    Two TD systems were designed, constructed, and tested at Aerodyne. In this design, the vaporizer consists of a 50 cm long, 1 inch OD stainless steel tube wrapped with three heating tapes and fiberglass insulation and then mounted in a sta...

  14. Coal Gasification - section in Kirk-Othmer Concise Encyclopedia of Chemical Technology, 5th Edition, 2-vol. set, July 2007, ISBN 978-0-470-04748-4, pp. 580-587

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadle, L.J.; Berry, D.A.; Syamlal, Madhava

    2007-07-01

    Coal gasification is the process of reacting coal with oxygen, steam, and carbon dioxide to form a product gas containing hydrogen and carbon monoxide. Gasification is essentially incomplete combustion. The chemical and physical processes are quite similar, the main difference being the nature of the final products. From a processing point of view the main operating difference is that gasification consumes heat evolved during combustion. Under the reducing environment of gasification the sulfur in the coal is released as hydrogen sulfide rather than sulfur dioxide and the coal's nitrogen is converted mostly to ammonia rather than nitrogen oxides. These reduced forms of sulfur and nitrogen are easily isolated, captured, and utilized, and thus gasification is a clean coal technology with better environmental performance than coal combustion. Depending on the type of gasifier and the operating conditions, gasification can be used to produce a fuel gas suitable for any number of applications. A low heating value fuel gas is produced from an air blown gasifier for use as an industrial fuel and for power production. A medium heating value fuel gas is produced from enriched oxygen blown gasification for use as a synthesis gas in the production of chemicals such as ammonia, methanol, and transportation fuels. A high heating value gas can be produced from shifting the medium heating value product gas over catalysts to produce a substitute or synthetic natural gas (SNG).
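
    The chemistry summarised in this entry is usually written as a handful of overall reactions; the textbook forms below are given for orientation and are not quoted from the encyclopedia section itself.

```latex
% Standard overall gasification reactions (textbook forms):
\begin{align*}
  \mathrm{C} + \tfrac{1}{2}\,\mathrm{O_2} &\rightarrow \mathrm{CO} && \text{partial oxidation (exothermic)}\\
  \mathrm{C} + \mathrm{H_2O} &\rightarrow \mathrm{CO} + \mathrm{H_2} && \text{steam gasification (endothermic)}\\
  \mathrm{C} + \mathrm{CO_2} &\rightarrow 2\,\mathrm{CO} && \text{Boudouard reaction (endothermic)}\\
  \mathrm{CO} + \mathrm{H_2O} &\rightarrow \mathrm{CO_2} + \mathrm{H_2} && \text{water-gas shift}\\
  \mathrm{CO} + 3\,\mathrm{H_2} &\rightarrow \mathrm{CH_4} + \mathrm{H_2O} && \text{methanation (route to SNG)}
\end{align*}
```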

  15. TRedD—A database for tandem repeats over the edit distance

    PubMed Central

    Sokol, Dina; Atagun, Firat

    2010-01-01

    A ‘tandem repeat’ in DNA is a sequence of two or more contiguous, approximate copies of a pattern of nucleotides. Tandem repeats are common in the genomes of both eukaryotic and prokaryotic organisms. They are significant markers for human identity testing, disease diagnosis, sequence homology and population studies. In this article, we describe a new database, TRedD, which contains the tandem repeats found in the human genome. The database is publicly available online, and the software for locating the repeats is also freely available. The definition of tandem repeats used by TRedD is a new and innovative definition based upon the concept of ‘evolutive tandem repeats’. In addition, we have developed a tool, called TandemGraph, to graphically depict the repeats occurring in a sequence. This tool can be coupled with any repeat finding software, and it should greatly facilitate analysis of results. Database URL: http://tandem.sci.brooklyn.cuny.edu/ PMID:20624712
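
    "Over the edit distance" means that adjacent copies of a repeat are compared allowing substitutions, insertions and deletions rather than requiring exact matches. The standard dynamic-programming edit distance that underlies such definitions is sketched below as a generic textbook routine; it is not the TRedD or TandemGraph code.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein edit distance (substitutions, insertions, deletions),
    the measure used to decide whether adjacent copies count as approximate
    repeats. Textbook dynamic programming, not the TRedD implementation."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # match / substitution
    return dp[m][n]

# Adjacent copies of an approximate tandem repeat differ by a small edit distance:
print(edit_distance("ACGTACG", "ACGAACG"))  # 1
```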

  16. Biological knowledge bases using Wikis: combining the flexibility of Wikis with the structure of databases.

    PubMed

    Brohée, Sylvain; Barriot, Roland; Moreau, Yves

    2010-09-01

    In recent years, the number of knowledge bases developed using Wiki technology has exploded. Unfortunately, next to their numerous advantages, classical Wikis present a critical limitation: the invaluable knowledge they gather is represented as free text, which hinders their computational exploitation. This is in sharp contrast with the current practice for biological databases, where the data are made available in a structured way. Here, we present WikiOpener, an extension for the classical MediaWiki engine that augments Wiki pages by allowing on-the-fly querying and formatting of resources external to the Wiki. Those resources may provide data extracted from databases or DAS tracks, or even results returned by local or remote bioinformatics analysis tools. This also implies that structured data can be edited via dedicated forms. Hence, this generic resource combines the structure of biological databases with the flexibility of collaborative Wikis. The source code and its documentation are freely available on the MediaWiki website: http://www.mediawiki.org/wiki/Extension:WikiOpener.

  17. Cpf1-Database: web-based genome-wide guide RNA library design for gene knockout screens using CRISPR-Cpf1.

    PubMed

    Park, Jeongbin; Bae, Sangsu

    2018-03-15

    Following the type II CRISPR-Cas9 system, type V CRISPR-Cpf1 endonucleases have been found to be applicable for genome editing in various organisms in vivo. However, there are as yet no web-based tools capable of optimally selecting guide RNAs (gRNAs) among all possible genome-wide target sites. Here, we present Cpf1-Database, a genome-wide gRNA library design tool for LbCpf1 and AsCpf1, which have DNA recognition sequences of 5'-TTTN-3' at the 5' ends of target sites. Cpf1-Database provides a sophisticated but simple way to design gRNAs for AsCpf1 nucleases on the genome scale. One can easily access the data using a straightforward web interface, and using the powerful collections feature one can easily design gRNAs for thousands of genes in a short time. Free access at http://www.rgenome.net/cpf1-database/. Contact: sangsubae@hanyang.ac.kr.
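
    Guide design for Cpf1 starts from the 5'-TTTN-3' PAM: candidate protospacers lie immediately downstream of each PAM site. The sketch below only scans the plus strand for such sites; the 23 nt spacer length is an assumption for illustration, and Cpf1-Database applies genome-wide filtering (e.g. off-target checks) that is not reproduced here.

```python
import re

def find_cpf1_targets(seq: str, spacer_len: int = 23):
    """Scan the plus strand for 5'-TTTN-3' PAM sites and return the candidate
    protospacer immediately downstream of each PAM. The 23 nt spacer length is
    an illustrative assumption, not Cpf1-Database's exact criterion."""
    seq = seq.upper()
    targets = []
    for m in re.finditer(r"(?=TTT[ACGT])", seq):   # overlapping PAM matches
        start = m.start() + 4                      # protospacer begins after the PAM
        spacer = seq[start:start + spacer_len]
        if len(spacer) == spacer_len:
            targets.append((m.start(), spacer))
    return targets

demo = "GGTTTACCGTACCGTTAGCATGCATGCATGCCGGTA"
for pos, spacer in find_cpf1_targets(demo):
    print(pos, spacer)
```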

  18. Suspect screening of OH-PAHs and non-target screening of other organic compounds in wood smoke particles using HR-Orbitrap-MS.

    PubMed

    Avagyan, Rozanna; Åberg, Magnus; Westerholm, Roger

    2016-11-01

    Wood combustion has been shown to contribute significantly to emissions of polycyclic aromatic hydrocarbons and hydroxylated polycyclic aromatic hydrocarbons, compounds with toxic and carcinogenic properties. However, only a small number of hydroxylated polycyclic aromatic hydrocarbons have been determined in particles from wood combustion, usually compounds with available reference standards. In the present study, suspect and non-target screening strategies were applied to characterize wood smoke particles from four different wood types and two combustion conditions with respect to hydroxylated polycyclic aromatic hydrocarbons and other organic compounds. In the suspect screening, 32 peaks corresponding to 12 monohydroxylated masses were tentatively identified by elemental composition assignments and matching of isotopic pattern and fragments. More than one structure was suggested for most of the measured masses. Statistical analysis was performed on the non-target screening data in order to single out significant peaks having intensities that depend on the wood type and/or combustion condition. Significant peaks were found in both negative and positive ionization modes, with unique peaks for each wood type and combustion condition, as well as a combination of both factors. Furthermore, structural elucidation of some peaks was done by comparing the spectra in the samples with spectra found in the spectral databases. Six compounds were tentatively identified in positive ionization mode, and 19 in negative ionization mode. The results of the present study demonstrate that there are significant overall differences in the chemistry of wood smoke particles that depend on both the wood type and the combustion condition used. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Global mercury emissions from combustion in light of international fuel trading.

    PubMed

    Chen, Yilin; Wang, Rong; Shen, Huizhong; Li, Wei; Chen, Han; Huang, Ye; Zhang, Yanyan; Chen, Yuanchen; Su, Shu; Lin, Nan; Liu, Junfeng; Li, Bengang; Wang, Xilong; Liu, Wenxin; Coveney, Raymond M; Tao, Shu

    2014-01-01

    The spatially resolved emission inventory is essential for understanding the fate of mercury. Previous global mercury emission inventories for fuel combustion sources overlooked the influence of fuel trading on local emission estimates for many countries, mostly developing countries, for which national emission data are not available. This study demonstrates that in many countries the mercury contents of locally consumed coal and petroleum differ significantly from those locally produced. If the mercury content in locally produced fuels were used to estimate emissions, then the resulting global mercury emissions from coal and petroleum would be overestimated by 4.7 and 72%, respectively. Even larger misestimations would exist in individual countries, leading to strong spatial bias. On the basis of the available data on fuel trading and an updated global fuel consumption database, a new mercury emission inventory for 64 combustion sources has been developed. The emissions were mapped at 0.1° × 0.1° resolution for 2007 and at country resolution for the period from 1960 to 2006. The estimated global total mercury emission from all combustion sources (fossil fuel, biomass fuel, solid waste, and wildfires) in 2007 was 1454 Mg (1232-1691 Mg as the interquartile range from Monte Carlo simulation), of which elemental mercury (Hg(0)), divalent gaseous mercury (Hg(2+)), and particulate mercury (Hg(p)) were 725, 548, and 181 Mg, respectively. The total emission from anthropogenic sources, excluding wildfires, was 1040 Mg (886-1248 Mg), with coal combustion contributing more than half. Globally, total annual anthropogenic mercury emission from combustion sources increased from 285 Mg (263-358 Mg) in 1960 to 1040 Mg (886-1248 Mg) in 2007, owing to increased fuel consumption in developing countries. However, mercury emissions from developed countries have decreased since 2000.
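
    The interquartile ranges quoted in parentheses come from Monte Carlo propagation of uncertain inputs such as fuel consumption, fuel mercury content and release fractions. The toy sketch below shows how such a range is obtained; the distributions and numbers are placeholders, not the study's data.

```python
import random
import statistics

# Toy Monte Carlo propagation of the kind used to derive interquartile ranges.
# Central values and spreads below are placeholders, not the study's inputs.
random.seed(0)
N = 10_000
totals = []
for _ in range(N):
    coal_use = random.lognormvariate(0, 0.2) * 3.0e9      # t of coal burned (placeholder)
    hg_content = random.lognormvariate(0, 0.3) * 0.2e-6   # t Hg per t coal (placeholder)
    release_fraction = random.uniform(0.7, 0.95)          # share emitted to air (placeholder)
    totals.append(coal_use * hg_content * release_fraction)

totals.sort()
q25, q75 = totals[N // 4], totals[3 * N // 4]
print(f"median: {statistics.median(totals):.0f} t Hg/yr, IQR: {q25:.0f}-{q75:.0f} t Hg/yr")
```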

  20. Absorption sensor for CO in combustion gases using 2.3 µm tunable diode lasers

    NASA Astrophysics Data System (ADS)

    Chao, X.; Jeffries, J. B.; Hanson, R. K.

    2009-11-01

    Tunable diode laser absorption spectroscopy of CO was studied in the controlled laboratory environments of a heated cell and a combustion exhaust rig. Two absorption lines, R(10) and R(11) in the first overtone band of CO near 2.3 µm, were selected from a HITRAN simulation to minimize interference from water vapor at a representative combustion exhaust temperature (~1200 K). The linestrengths and collision broadening coefficients for these lines were measured in a heated static cell. This database was then used in a comparative study of direct absorption and wavelength-modulation absorption. CO concentration measurements using scanned-wavelength direct absorption (DA) and wavelength modulation with the second-harmonic signal normalized by the first-harmonic signal (WMS-2f/1f) all agreed with those measured by a conventional gas sampling analyzer over the range from <10 ppm to 2.3%. As expected, water vapor was found to be the dominant source of background interference for CO detection in combustion flows at high temperatures. Water absorption was measured to a high spectral resolution within the wavelength region 4295-4301 cm-1 at 1100 K, and shown to produce <10 ppm level interference for CO detection in combustion exhausts at temperatures up to 1200 K. We found that the WMS-2f/1f strategy avoids the need for WMS calibration measurements but requires characterization of the wavelength and injection-current intensity modulation of the specific diode laser. We conclude that WMS-2f/1f using the selected R(10) or R(11) transitions in the CO overtone band holds good promise for sensitive in situ detection of ppm-level CO in combustion flows, with high resistance to interference absorption from H2O.
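
    Both the direct-absorption and the WMS-2f/1f retrievals rest on the Beer-Lambert law: for an isolated line, the integrated absorbance equals the linestrength times the absorber partial pressure times the path length. The sketch below inverts that relation with placeholder numbers; the actual R(10)/R(11) linestrengths were measured in the heated cell and are not reproduced here.

```python
# Beer-Lambert relation behind scanned-wavelength direct absorption:
# integrated absorbance A [cm^-1] = S(T) * P_CO * L, with S in cm^-2 atm^-1,
# partial pressure in atm and path length L in cm. All numbers are placeholders.
S_T = 0.02             # linestrength at the gas temperature, cm^-2 atm^-1 (placeholder)
L = 50.0               # optical path length, cm (placeholder)
P_total = 1.0          # total pressure, atm
A_integrated = 1.0e-3  # measured integrated absorbance, cm^-1 (placeholder)

P_CO = A_integrated / (S_T * L)   # CO partial pressure, atm
x_CO = P_CO / P_total             # CO mole fraction
print(f"CO mole fraction: {x_CO * 1e6:.0f} ppm")  # 1000 ppm with these placeholders
```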

  1. Patient-oriented cancer information on the internet: a comparison of wikipedia and a professionally maintained database.

    PubMed

    Rajagopalan, Malolan S; Khanna, Vineet K; Leiter, Yaacov; Stott, Meghan; Showalter, Timothy N; Dicker, Adam P; Lawrence, Yaacov R

    2011-09-01

    A wiki is a collaborative Web site, such as Wikipedia, that can be freely edited. Because of a wiki's lack of formal editorial control, we hypothesized that the content would be less complete and accurate than that of a professional peer-reviewed Web site. In this study, the coverage, accuracy, and readability of cancer information on Wikipedia were compared with those of the patient-orientated National Cancer Institute's Physician Data Query (PDQ) comprehensive cancer database. For each of 10 cancer types, medically trained personnel scored PDQ and Wikipedia articles for accuracy and presentation of controversies by using an appraisal form. Reliability was assessed by using interobserver variability and test-retest reproducibility. Readability was calculated from word and sentence length. Evaluators were able to rapidly assess articles (18 minutes/article), with a test-retest reliability of 0.71 and interobserver variability of 0.53. For both Web sites, inaccuracies were rare, less than 2% of information examined. PDQ was significantly more readable than Wikipedia: Flesch-Kincaid grade level 9.6 versus 14.1. There was no difference in depth of coverage between PDQ and Wikipedia (29.9, 34.2, respectively; maximum possible score 72). Controversial aspects of cancer care were relatively poorly discussed in both resources (2.9 and 6.1 for PDQ and Wikipedia, respectively, NS; maximum possible score 18). A planned subanalysis comparing common and uncommon cancers demonstrated no difference. Although the wiki resource had similar accuracy and depth as the professionally edited database, it was significantly less readable. Further research is required to assess how this influences patients' understanding and retention.
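
    The readability comparison uses the Flesch-Kincaid grade level, computed from the average sentence length and the average number of syllables per word. A minimal sketch follows; the vowel-group syllable counter is a crude approximation of what readability software actually does.

```python
import re

def count_syllables(word: str) -> int:
    """Crude vowel-group heuristic; real readability tools use better syllable counts."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text: str) -> float:
    """FK grade = 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

sample = "The tumour was removed surgically. Most patients recovered quickly."
print(round(flesch_kincaid_grade(sample), 1))
```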

  2. Response to Question on whether the Combustion of Municipal Sewage Sludge would Qualify as Municipal Solid Waste under the Exemption Provided in Section IV.B. of the Draft Changes to EPA's Emission Offset Interpretive Ruling

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  3. An evaluation of information retrieval accuracy with simulated OCR output

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croft, W.B.; Harding, S.M.; Taghva, K.

    Optical Character Recognition (OCR) is a critical part of many text-based applications. Although some commercial systems use the output from OCR devices to index documents without editing, there is very little quantitative data on the impact of OCR errors on the accuracy of a text retrieval system. Because of the difficulty of constructing test collections to obtain this data, we have carried out evaluation using simulated OCR output on a variety of databases. The results show that high quality OCR devices have little effect on the accuracy of retrieval, but low quality devices used with databases of short documents can result in significant degradation.

  4. Multi-National Information Sharing -- Cross Domain Collaborative Information Environment (CDCIE) Solution. Revision 4

    DTIC Science & Technology

    2005-04-12

    • Hardware, Database, and Operating System independence using Java • Enterprise-class Architecture using Java 2 Enterprise Edition 1.4 • Standards-based … portal applications. Compliance with the Java Specification Request for Portlet APIs (JSR-168) (Portlet API) and Web Services for Remote Portals … authentication and authorization • Portal Standards using the Java Specification Request for Portlet APIs (JSR-168) (Portlet API) and Web Services for Remote …

  5. Enhancement of Spatial Ability in Girls in a Single-Sex Environment through Spatial Experience and the Impact on Information Seeking

    ERIC Educational Resources Information Center

    Swarlis, Linda L.

    2008-01-01

    The test scores of spatial ability for women lag behind those of men in many spatial tests. On the Mental Rotations Test (MRT), a significant gender gap has existed for over 20 years and continues to exist. High spatial ability has been linked to efficiencies in typical computing tasks including Web and database searching, text editing, and…

  6. GarlicESTdb: an online database and mining tool for garlic EST sequences.

    PubMed

    Kim, Dae-Won; Jung, Tae-Sung; Nam, Seong-Hyeuk; Kwon, Hyuk-Ryul; Kim, Aeri; Chae, Sung-Hwa; Choi, Sang-Haeng; Kim, Dong-Wook; Kim, Ryong Nam; Park, Hong-Seog

    2009-05-18

    Allium sativum, commonly known as garlic, is a species in the onion genus (Allium), a large and diverse genus containing over 1,250 species. Its close relatives include chives, onion, leek and shallot. Garlic has been used throughout recorded history for culinary purposes, medicinal use and health benefits. Interest in garlic is currently growing because of its nutritional and pharmaceutical value, including effects on high blood pressure and cholesterol, atherosclerosis and cancer. Despite this, no comprehensive database of garlic Expressed Sequence Tags (EST) has been available for gene discovery and future genome annotation efforts, which is why we developed a new garlic database and applications to enable comprehensive analysis of garlic gene expression. GarlicESTdb is an integrated database and mining tool for large-scale garlic (Allium sativum) EST sequencing. A total of 21,595 ESTs collected from an in-house cDNA library were used to construct the database. The analysis pipeline is an automated system written in Java and consists of the following components: automatic preprocessing of EST reads, assembly of raw sequences, annotation of the assembled sequences, storage of the analyzed information in MySQL databases, and graphic display of all processed data. A web application was implemented with the latest J2EE (Java 2 Platform, Enterprise Edition) software technology (JSP/EJB/Java Servlet) for browsing and querying the database, for creating dynamic web pages on the client side, and for mapping annotated enzymes to KEGG pathways; the AJAX framework was also used in part. The online resources, such as putative annotation, single nucleotide polymorphism (SNP) and tandem repeat data sets, can be searched by text, explored on the website, searched using BLAST, and downloaded. To archive more significant BLAST results, a curation system was introduced with which biologists can easily edit best-hit annotation information for others to view. The GarlicESTdb web application is freely available at http://garlicdb.kribb.re.kr. GarlicESTdb is the first integrated online database of EST sequences isolated from garlic that can be freely accessed and downloaded. It has many useful features for interactive mining of EST contigs and datasets from each library, including curation of annotated information, expression profiling, information retrieval, and summary statistics of functional annotation. Consequently, GarlicESTdb will provide a crucial resource for biologists for data mining and more efficient experimental studies.

  7. Evaluation of the AJCC 8th Edition Staging System for Pathologically Versus Clinically Staged Intrahepatic Cholangiocarcinoma (iCCA): a Time to Revisit a Dogma? A Surveillance, Epidemiology, and End Results (SEER) Analysis.

    PubMed

    Kamarajah, Sivesh K

    2018-03-07

    Recently, the AJCC has released its 8th edition changes to the staging system for intrahepatic cholangiocarcinoma (iCCA). This study sought to validate the proposed changes to the 8th edition of the AJCC system for T and N classification of iCCA using a population-based data set. Using the Surveillance, Epidemiology, and End Results (SEER) database (1998-2013), patients undergoing resection or non-surgical management for non-metastatic iCCA were identified. Overall survival was estimated using the Kaplan-Meier method and compared using log-rank tests. Concordance indices (c-indices) from Cox proportional hazards models were calculated to evaluate discriminatory power. The study included 2630 patients who underwent resection (37%) or non-surgical management (63%) for iCCA. Nodal staging was performed in 56%, of whom 31% had positive nodes. For all patients with iCCA, the median survival by AJCC T classification for T1a, T1b, T2, T3, and T4 was 32, 21, 14, 10, and 10 months, respectively (p < 0.001). The concordance index for the staging system was 0.57 for all patients, 0.62 for those who underwent resection, and 0.54 for patients who did not undergo resection. In summary, the new AJCC 8th edition staging system is comparable to the 7th edition and valid in stratifying patients with iCCA. However, the performance of the staging system is better in patients undergoing surgical resection than in those undergoing non-surgical management. These findings further highlight the need for improved accuracy of radiological imaging in clinically staging patients to guide prognosis.
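    For readers unfamiliar with the concordance index reported above: it is the fraction of comparable patient pairs in which the patient predicted to be at higher risk actually dies sooner. A minimal sketch of a pairwise c-index for right-censored survival data (the toy numbers below are made up, not the SEER cohort):

      def concordance_index(times, events, risk_scores):
          # times: observed times; events: 1 if death observed, 0 if censored;
          # risk_scores: higher value = predicted worse prognosis.
          concordant, comparable = 0.0, 0
          n = len(times)
          for i in range(n):
              for j in range(n):
                  # A pair is comparable if patient i died before time j.
                  if events[i] == 1 and times[i] < times[j]:
                      comparable += 1
                      if risk_scores[i] > risk_scores[j]:
                          concordant += 1
                      elif risk_scores[i] == risk_scores[j]:
                          concordant += 0.5
          return concordant / comparable

      # Toy example using a hypothetical numeric encoding of T stage as the risk score.
      times = [32, 21, 14, 10, 40, 8]
      events = [1, 1, 1, 1, 0, 1]
      stage = [1, 2, 3, 4, 1, 4]
      print(round(concordance_index(times, events, stage), 2))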

  8. Analysis of turbojet combustion chamber performances based on flow field simplified mathematical model

    NASA Astrophysics Data System (ADS)

    Rotaru, Constantin

    2017-06-01

    This paper presents results from a study of combustion chamber geometrical configurations found in aircraft gas turbine engines. The main focus is a new configuration of the aircraft engine combustion chamber with an optimal distribution of gas velocity in front of the turbine. This constructive solution could allow a lower engine rotational speed, a lower temperature in front of the first stage of the turbine, and the possibility to increase the turbine pressure ratio. The Arrhenius relationship, which describes the basic dependencies of the reaction rate on pressure, temperature and concentration, has been used, and the CFD simulations were made with Jet A fuel (which is present in the Fluent software database) for an annular flame tube with 24 injectors. The temperature profile at the turbine inlet exhibits nonuniformity due to the number of fuel injectors used in the circumferential direction, the spatial nonuniformity in dilution air cooling and mixing characteristics, as well as other secondary flow patterns and instabilities that are set up in the flame tube.
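    For context, the modified Arrhenius form referred to above gives the rate constant as k = A * T^beta * exp(-Ea/(R*T)). A minimal sketch with placeholder coefficients (the values below are illustrative assumptions, not the Jet A mechanism from the Fluent database):

      import math

      R = 8.314  # universal gas constant, J/(mol*K)

      def arrhenius_rate(A, beta, Ea, T):
          # Modified Arrhenius rate constant: k = A * T**beta * exp(-Ea / (R*T))
          return A * T**beta * math.exp(-Ea / (R * T))

      # Illustrative (hypothetical) coefficients for a single global reaction step.
      A, beta, Ea = 2.0e11, 0.0, 1.25e5  # pre-exponential, temperature exponent, J/mol
      for T in (1200.0, 1600.0, 2000.0):
          print(f"T = {T:6.0f} K   k = {arrhenius_rate(A, beta, Ea, T):.3e}")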

  9. Cyclone: java-based querying and computing with Pathway/Genome databases.

    PubMed

    Le Fèvre, François; Smidtas, Serge; Schächter, Vincent

    2007-05-15

    Cyclone aims at facilitating the use of BioCyc, a collection of Pathway/Genome Databases (PGDBs). Cyclone provides a fully extensible Java Object API to analyze and visualize these data. Cyclone can read and write PGDBs, and can write its own data in the CycloneML format. This format is automatically generated from the BioCyc ontology by Cyclone itself, ensuring continued compatibility. Cyclone objects can also be stored in a relational database, CycloneDB. Queries can be written in SQL, and in an intuitive and concise object-oriented query language, Hibernate Query Language (HQL). In addition, Cyclone interfaces easily with Java software, including the Eclipse IDE for HQL editing, the Jung API for graph algorithms, or Cytoscape for graph visualization. Cyclone is freely available under an open source license at: http://sourceforge.net/projects/nemo-cyclone. For download and installation instructions, tutorials, use cases and examples, see http://nemo-cyclone.sourceforge.net.

  10. Ionospheric characteristics for archiving at the World Data Centers. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamache, R.R.; Reinisch, B.W.

    1990-12-01

    A database structure for archiving ionospheric characteristics at uneven data rates was developed at the July 1989 Ionospheric Informatics Working Group (IIWG) Lowell Workshop in Digital Ionogram Data Formats for World Data Center Archiving. This structure is proposed as a new URSI standard and is being employed by the World Data Center A for solar terrestrial physics for archiving characteristics. Here the database has been slightly refined for the application and programs written to generate these database files using as input Digisonde 256 ARTIST data, post processed by the ULCAR ADEP (ARTIST Data Editing Program) system. The characteristics program as well as supplemental programs developed for this task are described here. The new software will make it possible to archive the ionospheric characteristics from the Geophysics Laboratory high latitude Digisonde network, the AWS DISS and the international Digisonde networks, and other ionospheric sounding networks.

  11. A Step Towards CO2-Neutral Aviation

    NASA Technical Reports Server (NTRS)

    Brankovic, Andreja; Ryder, Robert C.; Hendricks, Robert C.; Huber, Marcia L.

    2007-01-01

    An approximation method for evaluation of the caloric equations used in combustion chemistry simulations is described. The method is applied to generate the equations of specific heat, static enthalpy, and Gibbs free energy for fuel mixtures of interest to gas turbine engine manufacturers. Liquid-phase fuel properties are also derived. The fuels include JP-8, synthetic fuel, and two fuel blends consisting of a mixture of JP-8 and synthetic fuel. The complete set of fuel property equations for both phases is implemented into a computational fluid dynamics (CFD) flow solver database, and multi-phase, reacting flow simulations of a well-tested liquid-fueled combustor are performed. The simulations are a first step in understanding combustion system performance and operational issues when using alternate fuels at practical engine operating conditions.
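    Caloric property fits of this kind are commonly stored as NASA-style 7-coefficient polynomials in CFD solver databases. A minimal sketch of evaluating such a fit (the coefficients below are placeholders, not the JP-8 or synthetic-fuel fits from this work); the Gibbs free energy then follows as g = h - T*s:

      import math

      R = 8.314  # J/(mol*K)

      def nasa7_properties(a, T):
          # Evaluate the standard NASA 7-coefficient polynomial fits.
          # Returns (cp, h, s) in J/(mol*K), J/mol, J/(mol*K).
          a1, a2, a3, a4, a5, a6, a7 = a
          cp_R = a1 + a2*T + a3*T**2 + a4*T**3 + a5*T**4
          h_RT = a1 + a2*T/2 + a3*T**2/3 + a4*T**3/4 + a5*T**4/5 + a6/T
          s_R = a1*math.log(T) + a2*T + a3*T**2/2 + a4*T**3/3 + a5*T**4/4 + a7
          return cp_R*R, h_RT*R*T, s_R*R

      # Placeholder coefficients (hypothetical, for illustration only).
      coeffs = (3.0, 1.0e-3, -2.0e-7, 1.0e-11, 0.0, -1.5e4, 5.0)
      cp, h, s = nasa7_properties(coeffs, 1500.0)
      print(f"cp = {cp:.1f} J/(mol K), h = {h:.3e} J/mol, s = {s:.1f} J/(mol K)")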

  12. Review of Integrated behavioral health in primary care: Step-by-step guidance for assessment and intervention (Second edition).

    PubMed

    Ogbeide, Stacy A

    2017-09-01

    Reviews the book, Integrated Behavioral Health in Primary Care: Step-By-Step Guidance for Assessment and Intervention (Second Edition) by Anne C. Dobmeyer, Mark S. Oordt, Jeffrey L. Goodie, and Christopher L. Hunter (see record 2016-59132-000). This comprehensive book is well organized and covers many of the complex issues faced within the Primary Care Behavioral Health (PCBH) model and primary care setting: from uncontrolled type II diabetes to posttraumatic stress disorder. Primary care has changed since the initial release of this book, and the second edition covers many of these changes with up-to-date literature on topics such as population health and the patient-centered medical home. The book is organized into three parts. The first three chapters describe the foundation of integrated behavioral consultation services. The next 12 chapters address common behavioral health issues that present in primary care. Last, the final two chapters focus on special topics such as suicidal behavior and designing clinical pathways. This was an enjoyable read and worth the investment, especially if you are a trainee or a seasoned professional new to the practice of integrated behavioral health in primary care. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Studying Turbulence Using Numerical Simulation Databases - X Proceedings of the 2004 Summer Program

    NASA Technical Reports Server (NTRS)

    Moin, Parviz; Mansour, Nagi N.

    2004-01-01

    This Proceedings volume contains 32 papers that span a wide range of topics that reflect the ubiquity of turbulence. The papers have been divided into six groups: 1) Solar Simulations; 2) Magnetohydrodynamics (MHD); 3) Large Eddy Simulation (LES) and Numerical Simulations; 4) Reynolds Averaged Navier Stokes (RANS) Modeling and Simulations; 5) Stability and Acoustics; 6) Combustion and Multi-Phase Flow.

  14. Effects of Energetic Additives on Combustion Dynamics

    DTIC Science & Technology

    2010-04-19

    ... ethanol drops loaded with nano-Al additives burned differently. An exploratory computational study using Large Eddy Simulation indicated that ...

  15. Genome databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Courteau, J.

    1991-10-11

    Since the Genome Project began several years ago, a plethora of databases have been developed or are in the works. They range from the massive Genome Data Base at Johns Hopkins University, the central repository of all gene mapping information, to small databases focusing on single chromosomes or organisms. Some are publicly available, others are essentially private electronic lab notebooks. Still others limit access to a consortium of researchers working on, say, a single human chromosome. An increasing number incorporate sophisticated search and analytical software, while others operate as little more than data lists. In consultation with numerous experts in the field, a list has been compiled of some key genome-related databases. The list was not limited to map and sequence databases but also included the tools investigators use to interpret and elucidate genetic data, such as protein sequence and protein structure databases. Because a major goal of the Genome Project is to map and sequence the genomes of several experimental animals, including E. coli, yeast, fruit fly, nematode, and mouse, the available databases for those organisms are listed as well. The author also includes several databases that are still under development - including some ambitious efforts that go beyond data compilation to create what are being called electronic research communities, enabling many users, rather than just one or a few curators, to add or edit the data and tag it as raw or confirmed.

  16. Ranking Highlights in Personal Videos by Analyzing Edited Videos.

    PubMed

    Sun, Min; Farhadi, Ali; Chen, Tseng-Hung; Seitz, Steve

    2016-11-01

    We present a fully automatic system for ranking domain-specific highlights in unconstrained personal videos by analyzing online edited videos. A novel latent linear ranking model is proposed to handle noisy training data harvested online. Specifically, given a targeted domain such as "surfing," our system mines the YouTube database to find pairs of raw and their corresponding edited videos. Leveraging the assumption that an edited video is more likely to contain highlights than the trimmed parts of the raw video, we obtain pair-wise ranking constraints to train our model. The learning task is challenging due to the amount of noise and variation in the mined data. Hence, a latent loss function is incorporated to mitigate the issues caused by the noise. We efficiently learn the latent model on a large number of videos (about 870 min in total) using a novel EM-like procedure. Our latent ranking model outperforms its classification counterpart and is fairly competitive compared with a fully supervised ranking system that requires labels from Amazon Mechanical Turk. We further show that a state-of-the-art audio feature (mel-frequency cepstral coefficients) is inferior to a state-of-the-art visual feature. By combining both audio-visual features, we obtain the best performance in dog activity, surfing, skating, and viral video domains. Finally, we show that impressive highlights can be detected without additional human supervision for seven domains (i.e., skating, surfing, skiing, gymnastics, parkour, dog activity, and viral video) in unconstrained personal videos.
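    A minimal sketch of the pairwise ranking idea described above (a plain linear ranker trained with a hinge loss on synthetic features; the paper's latent-variable handling of noisy pairs is omitted):

      import numpy as np

      def train_pairwise_ranker(pos_feats, neg_feats, epochs=100, lr=0.01, margin=1.0, seed=0):
          # Train a linear model so that w.x_pos >= w.x_neg + margin for each
          # (edited, trimmed) segment pair, using a hinge loss and SGD.
          rng = np.random.default_rng(seed)
          w = np.zeros(pos_feats.shape[1])
          for _ in range(epochs):
              for i in rng.permutation(len(pos_feats)):
                  violation = margin - (w @ pos_feats[i] - w @ neg_feats[i])
                  if violation > 0:  # hinge loss active: push this pair apart
                      w += lr * (pos_feats[i] - neg_feats[i])
          return w

      # Synthetic segment features: "highlight" segments have slightly larger values.
      rng = np.random.default_rng(1)
      pos = rng.normal(0.5, 1.0, size=(200, 16))
      neg = rng.normal(0.0, 1.0, size=(200, 16))
      w = train_pairwise_ranker(pos, neg)
      print("mean score gap:", float(np.mean(pos @ w - neg @ w)))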

  17. Animation control of surface motion capture.

    PubMed

    Tejera, Margara; Casas, Dan; Hilton, Adrian

    2013-12-01

    Surface motion capture (SurfCap) of actor performance from multiple view video provides reconstruction of the natural nonrigid deformation of skin and clothing. This paper introduces techniques for interactive animation control of SurfCap sequences which allow the flexibility in editing and interactive manipulation associated with existing tools for animation from skeletal motion capture (MoCap). Laplacian mesh editing is extended using a basis model learned from SurfCap sequences to constrain the surface shape to reproduce natural deformation. Three novel approaches for animation control of SurfCap sequences, which exploit the constrained Laplacian mesh editing, are introduced: 1) space–time editing for interactive sequence manipulation; 2) skeleton-driven animation to achieve natural nonrigid surface deformation; and 3) hybrid combination of skeletal MoCap driven and SurfCap sequence to extend the range of movement. These approaches are combined with high-level parametric control of SurfCap sequences in a hybrid surface and skeleton-driven animation control framework to achieve natural surface deformation with an extended range of movement by exploiting existing MoCap archives. Evaluation of each approach and the integrated animation framework are presented on real SurfCap sequences for actors performing multiple motions with a variety of clothing styles. Results demonstrate that these techniques enable flexible control for interactive animation with the natural nonrigid surface dynamics of the captured performance and provide a powerful tool to extend current SurfCap databases by incorporating new motions from MoCap sequences.
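    As background on the Laplacian mesh editing step mentioned above, a minimal sketch on a toy mesh (uniform graph Laplacian and a least-squares solve with soft positional constraints; the SurfCap-learned basis from the paper is not included):

      import numpy as np

      def laplacian_edit(verts, edges, handles):
          # Deform a mesh by preserving uniform Laplacian (differential) coordinates
          # in a least-squares sense, subject to weighted handle constraints.
          # verts: (n,3) array; edges: list of (i, j); handles: {vertex index: target}.
          n = len(verts)
          L = np.zeros((n, n))
          for i, j in edges:
              L[i, j] -= 1.0
              L[j, i] -= 1.0
              L[i, i] += 1.0
              L[j, j] += 1.0
          delta = L @ verts                     # differential coordinates to preserve
          w = 10.0                              # constraint weight
          A = np.vstack([L] + [w * np.eye(n)[k:k + 1] for k in sorted(handles)])
          b = np.vstack([delta] + [w * np.asarray(handles[k])[None, :] for k in sorted(handles)])
          new_verts, *_ = np.linalg.lstsq(A, b, rcond=None)
          return new_verts

      # Tiny example: a 4-vertex strip, dragging the last vertex upward.
      verts = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0], [3, 0, 0]], dtype=float)
      edges = [(0, 1), (1, 2), (2, 3)]
      print(laplacian_edit(verts, edges, {0: [0, 0, 0], 3: [3, 1, 0]}))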

  18. Fiber Supported Droplet Combustion-2 (FSDC-2)

    NASA Technical Reports Server (NTRS)

    Colantonio, Renato; Dietrich, Daniel; Haggard, John B., Jr.; Nayagan, Vedha; Dryer, Frederick L.; Shaw, Benjamin D.; Williams, Forman A.

    1998-01-01

    Experimental results for the burning characteristics of fiber supported, liquid droplets in ambient Shuttle cabin air (21% oxygen, 1 bar pressure) were obtained from the Glove Box Facility aboard the STS-94/MSL-1 mission using the Fiber Supported Droplet Combustion - 2 (FSDC-2) apparatus. The combustion of individual droplets of methanol/water mixtures, ethanol, ethanol/water azeotrope, n-heptane, n-decane, and n-heptane/n-hexadecane mixtures were studied in quiescent air. The effects of low velocity, laminar gas phase forced convection on the combustion of individual droplets of n-heptane and n-decane were investigated and interactions of two droplet-arrays of n-heptane and n-decane droplets were also studied with and without gas phase convective flow. Initial diameters ranging from about 2mm to over 6mm were burned on 80-100 micron silicon fibers. In addition to phenomenological observations, quantitative data were obtained in the form of backlit images of the burning droplets, overall flame images, and radiometric combustion emission measurements as a function of the burning time in each experiment. In all, 124 of the 129 attempted experiments (or about twice the number of experiments originally planned for the STS-94/MSL-1 mission) were conducted successfully. The experimental results contribute new observations on the combustion properties of pure alkanes, binary alkane mixtures, and simple alcohols for droplet sizes not studied previously, including measurements on individual droplets and two-droplet arrays, inclusive of the effects of forced gas phase convection. New phenomena characterized experimentally for the first time include radiative extinction of droplet burning for alkanes and the "twin effect" which occurs as a result of interactions during the combustion of two-droplet arrays. Numerical modeling of isolated droplet combustion phenomenon has been conducted for methanol/water mixtures, n-heptane, and n-heptane/n-hexadecane mixtures, and results compare quantitatively with those found experimentally for methanol/water mixtures. Initial computational results qualitatively predict experimental results obtained for isolated n-heptane and n-heptane/n-hexadecane droplet combustion, although the effects of sooting are not yet included in the modeling work. Numerical modeling of ethanol and ethanol/water droplet burning is under development. Considerable data remain to be fully analyzed and will provide a large database for comparisons with further numerical and analytical modeling and development of future free droplet experiments aboard space platforms.

  19. Automated RTOP Management System

    NASA Technical Reports Server (NTRS)

    Hayes, P.

    1984-01-01

    The structure of NASA's Office of Aeronautics and Space Technology electronic information system network from 1983 to 1985 is illustrated. The RTOP automated system takes advantage of existing hardware, software, and expertise, and provides: (1) computerized cover sheet and resources forms; (2) electronic signature and transmission; (3) a data-based information system; (4) graphics; (5) intercenter communications; (6) management information; and (7) text editing. The system is coordinated with Headquarters efforts in codes R,E, and T.

  20. Microgravity science and applications bibliography, 1987 revision

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This edition of the Microgravity Science and Applications (MSA) Bibliography is a compilation of Government reports, contractor reports, conference proceedings, and journal articles dealing with flight experiments utilizing a low gravity environment to elucidate and control various processes or with ground based activities that provide supporting research. It encompasses literature published but not cited in the 1984 Revision and literature which has been published in the past year. Subdivisions of the bibliography include six major categories: Electronic Materials; Metals, Alloys, and Composites; Fluid Dynamics and Transport; Biotechnology; Glass and Ceramics; and Combustion. Also included are publications from the European, Soviet, and Japanese MSA programs. In addition, there is a list of patents and appendices providing a compilation of an anonymously authored collection of reports and a cross reference index.

  1. Microgravity science and applications bibliography, 1985 revision

    NASA Technical Reports Server (NTRS)

    Pentecost, E. (Compiler)

    1985-01-01

    This edition of the Microgravity Science and Applications (MSA) Bibliography is a compilation of Government reports, contractor reports, conference proceedings, and journal articles dealing with flight experiments utilizing a low-gravity environment to elucidate and control various processes or with ground-based activities that provide supporting research. It encompasses literature published but not cited in the 1984 Revision and that literature which has been published in the past year. Subdivisions of the bibliography include six major categories: Electronic Materials; Metal, Alloys, and Composites; Fluid Dynamics and Transports; Biotechnology; Glass and Ceramics; and Combustion. Also included are publications from the European, Soviet, and Japanese MSA programs. In addition, there is a list of patents and appendices providing a compilation of anonymously authored collection of reports and a cross reference index.

  2. Spontaneous Raman Scattering (SRS) System for Calibrating High-Pressure Flames Became Operational

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet

    2003-01-01

    A high-performance spontaneous Raman scattering (SRS) system for measuring quantitative species concentration and temperature in high-pressure flames is now operational. The system is located in Glenn's Engine Research Building. Raman scattering is perhaps the only optical diagnostic technique that permits the simultaneous (single-shot) measurement of all major species (N2, O2, CO2, H2O, CO, H2, and CH4) as well as temperature in combustion systems. The preliminary data acquired with this new system in a 20-atm hydrogen-air (H2-air) flame show excellent spectral coverage, good resolution, and a signal-to-noise ratio high enough for the data to serve as a calibration standard. This new SRS diagnostic system is used in conjunction with the newly developed High-Pressure Gaseous Burner facility (ref. 1). The main purpose of this diagnostic system and the High-Pressure Gaseous Burner facility is to acquire and establish a comprehensive Raman-scattering spectral database calibration standard for the combustion diagnostic community. A secondary purpose of the system is to provide actual measurements in standardized flames to validate computational combustion models. The High-Pressure Gaseous Burner facility and its associated SRS system will provide researchers throughout the world with new insights into flame conditions that simulate the environment inside the ultra-high-pressure-ratio combustion chambers of tomorrow's advanced aircraft engines.

  3. CRISPR-Cas in Medicinal Chemistry: Applications and Regulatory Concerns.

    PubMed

    Duardo-Sanchez, Aliuska

    2017-01-01

    A rapid search in scientific publication databases shows how considerably the use of the CRISPR-Cas genome editing technique has expanded, and its growing importance in modern molecular biology. In the PubMed platform alone, a search of the term gives more than 3000 results. Specifically, in Drug Discovery, Medicinal Chemistry and Chemical Biology in general, the CRISPR method may have multiple applications. Some of these applications are: resistance-selection studies of antimalarial lead organic compounds; investigation of druggability; development of animal models for chemical compound testing, etc. In this paper, we offer a review of the most relevant scientific literature, illustrated with specific examples of the application of the CRISPR technique to medicinal chemistry and chemical biology. We also present a general overview of the main legal and ethical trends regarding this method of genome editing. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  4. Optimized gene editing technology for Drosophila melanogaster using germ line-specific Cas9.

    PubMed

    Ren, Xingjie; Sun, Jin; Housden, Benjamin E; Hu, Yanhui; Roesel, Charles; Lin, Shuailiang; Liu, Lu-Ping; Yang, Zhihao; Mao, Decai; Sun, Lingzhu; Wu, Qujie; Ji, Jun-Yuan; Xi, Jianzhong; Mohr, Stephanie E; Xu, Jiang; Perrimon, Norbert; Ni, Jian-Quan

    2013-11-19

    The ability to engineer genomes in a specific, systematic, and cost-effective way is critical for functional genomic studies. Recent advances using the CRISPR-associated single-guide RNA system (Cas9/sgRNA) illustrate the potential of this simple system for genome engineering in a number of organisms. Here we report an effective and inexpensive method for genome DNA editing in Drosophila melanogaster whereby plasmid DNAs encoding short sgRNAs under the control of the U6b promoter are injected into transgenic flies in which Cas9 is specifically expressed in the germ line via the nanos promoter. We evaluate the off-targets associated with the method and establish a Web-based resource, along with a searchable, genome-wide database of predicted sgRNAs appropriate for genome engineering in flies. Finally, we discuss the advantages of our method in comparison with other recently published approaches.
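    As a simplified illustration of the sgRNA prediction step (a forward-strand scan for 20-nt protospacers followed by an NGG PAM; off-target scoring, which the resource also provides, is omitted, and the sequence below is hypothetical):

      import re

      def find_sgrna_sites(seq):
          # Return (start, protospacer, PAM) for every 20-nt site followed by an
          # NGG PAM on the forward strand; overlapping sites are all reported
          # because the pattern is wrapped in a zero-width lookahead.
          seq = seq.upper()
          sites = []
          for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
              sites.append((m.start(), m.group(1), m.group(2)))
          return sites

      example = "ATGCGTACGTTAGCATGCATGCCGGATCGATCGATCGTACGTAGCTAGCAGGTT"
      for start, protospacer, pam in find_sgrna_sites(example):
          print(start, protospacer, pam)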

  5. Daniel Landis: Award for Distinguished Contributions to the International Advancement of Psychology.

    PubMed

    2012-11-01

    Presents a short biography of one of the co-recipients of the American Psychological Association's Award for Distinguished Contributions to the International Advancement of Psychology. One of the 2012 winners is Daniel Landis for his unparalleled contribution to the field of intercultural research in a distinguished academic career spanning almost half a century. Landis has shaped the field of intercultural research through scholarship of the highest order, reflected in his publications on cross-cultural training and research, the measurement of equal opportunity climate, individual-differences research and methodology, evaluation of social programs, development of theory in social psychology, and cross-cultural aspects of human sexuality. He is the founding editor-in-chief of the International Journal of Intercultural Relations and has edited three editions of the Handbook of Intercultural Training (1983, 1996, 2004). Landis' Award citation and a selected bibliography are also presented. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  6. A Process and Programming Design to Develop Virtual Patients for Medical Education

    PubMed Central

    McGee, James B.; Wu, Martha

    1999-01-01

    Changes in the financing and delivery of healthcare in our nation's teaching hospitals have diminished the variety and quality of a medical student's clinical training. The Virtual Patient Project is a series of computer-based, multimedia, clinical simulations, designed to fill this gap. After the development of a successful prototype and obtaining funding for a series of 16 cases, a method to write and produce many virtual patients was created. Case authors now meet with our production team to write and edit a movie-like script. This script is converted into a design document which specifies the clinical aspects, teaching points, media production, and interactivity of each case. The program's code was modularized, using object-oriented techniques, to allow for the variations in cases and for team programming. All of the clinical and teaching content is stored in a database, that allows for faster and easier editing by many persons simultaneously.

  7. Image-Based Reverse Engineering and Visual Prototyping of Woven Cloth.

    PubMed

    Schroder, Kai; Zinke, Arno; Klein, Reinhard

    2015-02-01

    Realistic visualization of cloth has many applications in computer graphics. An ongoing research problem is how to best represent and capture cloth models, specifically when considering computer aided design of cloth. Previous methods produce highly realistic images, however, they are either difficult to edit or require the measurement of large databases to capture all variations of a cloth sample. We propose a pipeline to reverse engineer cloth and estimate a parametrized cloth model from a single image. We introduce a geometric yarn model, integrating state-of-the-art textile research. We present an automatic analysis approach to estimate yarn paths, yarn widths, their variation and a weave pattern. Several examples demonstrate that we are able to model the appearance of the original cloth sample. Properties derived from the input image give a physically plausible basis that is fully editable using a few intuitive parameters.

  8. Structural and incremental validity of the Wechsler Adult Intelligence Scale-Fourth Edition with a clinical sample.

    PubMed

    Nelson, Jason M; Canivez, Gary L; Watkins, Marley W

    2013-06-01

    Structural and incremental validity of the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV; Wechsler, 2008a) was examined with a sample of 300 individuals referred for evaluation at a university-based clinic. Confirmatory factor analysis indicated that the WAIS-IV structure was best represented by 4 first-order factors as well as a general intelligence factor in a direct hierarchical model. The general intelligence factor accounted for the most common and total variance among the subtests. Incremental validity analyses indicated that the Full Scale IQ (FSIQ) generally accounted for medium to large portions of academic achievement variance. For all measures of academic achievement, the first-order factors combined accounted for significant achievement variance beyond that accounted for by the FSIQ, but individual factor index scores contributed trivial amounts of achievement variance. Implications for interpreting WAIS-IV results are discussed. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
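    A minimal sketch of the incremental validity comparison described above, using synthetic data: compute R^2 for achievement regressed on FSIQ alone, then the change in R^2 after adding the four factor index scores:

      import numpy as np

      def r_squared(X, y):
          # R^2 of an ordinary least-squares fit with an intercept term.
          X1 = np.column_stack([np.ones(len(y)), X])
          beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
          resid = y - X1 @ beta
          return 1.0 - resid.var() / y.var()

      # Synthetic data standing in for FSIQ, four factor indexes, and achievement.
      rng = np.random.default_rng(0)
      n = 300
      factors = rng.normal(100, 15, size=(n, 4))
      fsiq = factors.mean(axis=1) + rng.normal(0, 3, n)
      achievement = 0.6 * fsiq + rng.normal(0, 10, n)

      r2_fsiq = r_squared(fsiq[:, None], achievement)
      r2_full = r_squared(np.column_stack([fsiq, factors]), achievement)
      print(f"R^2 (FSIQ only) = {r2_fsiq:.3f}, delta R^2 adding factors = {r2_full - r2_fsiq:.3f}")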

  9. Analog-to-digital clinical data collection on networked workstations with graphic user interface.

    PubMed

    Lunt, D

    1991-02-01

    An innovative respiratory examination system has been developed that combines physiological response measurement, real-time graphic displays, user-driven operating sequences, and networked file archiving and review into a scientific research and clinical diagnosis tool. This newly constructed computer network is being used to enhance the research center's ability to perform patient pulmonary function examinations. Respiratory data are simultaneously acquired and graphically presented during patient breathing maneuvers and rapidly transformed into graphic and numeric reports, suitable for statistical analysis or database access. The environment consists of the hardware (Macintosh computer, MacADIOS converters, analog amplifiers), the software (HyperCard v2.0, HyperTalk, XCMDs), and the network (AppleTalk, fileservers, printers) as building blocks for data acquisition, analysis, editing, and storage. System operation modules include: Calibration, Examination, Reports, On-line Help Library, Graphic/Data Editing, and Network Storage.

  10. An integrated software suite for surface-based analyses of cerebral cortex.

    PubMed

    Van Essen, D C; Drury, H A; Dickson, J; Harwell, J; Hanlon, D; Anderson, C H

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  11. Korean association of medical journal editors at the forefront of improving the quality and indexing chances of its member journals.

    PubMed

    Suh, Chang-Ok; Oh, Se Jeong; Hong, Sung-Tae

    2013-05-01

    The article overviews some achievements and problems of Korean medical journals published in the highly competitive journal environment. Activities of Korean Association of Medical Journal Editors (KAMJE) are viewed as instrumental for improving the quality of Korean articles, indexing large number of local journals in prestigious bibliographic databases and launching new abstract and citation tracking databases or platforms (eg KoreaMed, KoreaMed Synapse, the Western Pacific Regional Index Medicus [WPRIM]). KAMJE encourages its member journals to upgrade science editing standards and to legitimately increase citation rates, primarily by publishing more great articles with global influence. Experience gained by KAMJE and problems faced by Korean editors may have global implications.

  12. Contaminated sediments database for the Gulf of Maine

    USGS Publications Warehouse

    Buchholtz ten Brink, Marilyn R.; Manheim, F.T.; Mecray, E.L.; Hastings, M.E.; Currence, J.M.; Farrington, J.W.; Jones, S.H.; Larsen, P.F.; Tripp, B.W.; Wallace, G.T.; Ward, L.G.; Fredette, T.J.; Liebman, M.L.; Smith Leo, W.

    2002-01-01

    Bottom sediments in the Gulf of Maine and its estuaries have accumulated pollutants of many types, including metals and organic compounds of agricultural, industrial, and household derivation. Much analytical and descriptive data has been obtained on these sediments over the past decades, but only a small effort had been made, prior to this project, to compile and edit the published and unpublished data in forms suitable for a variety of users. The Contaminated Sediments Database for the Gulf of Maine provides a compilation and synthesis of existing data to help establish the environmental status of our coastal sediments and the transport paths and fate of contaminants in this region. This information, in turn, forms one of the essential bases for developing successful remediation and resource management policies.

  13. Map showing geologic terranes of the Hailey 1 degree x 2 degrees quadrangle and the western part of the Idaho Falls 1 degree x 2 degrees quadrangle, south-central Idaho

    USGS Publications Warehouse

    Worl, R.G.; Johnson, K.M.

    1995-01-01

    The paper version of Map Showing Geologic Terranes of the Hailey 1x2 Quadrangle and the western part of the Idaho Falls 1x2 Quadrangle, south-central Idaho was compiled by Ron Worl and Kate Johnson in 1995. The plate was compiled on a 1:250,000-scale topographic base map. TechniGraphic System, Inc. of Fort Collins, Colorado digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a geographic information system database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.

  14. An integrated software suite for surface-based analyses of cerebral cortex

    NASA Technical Reports Server (NTRS)

    Van Essen, D. C.; Drury, H. A.; Dickson, J.; Harwell, J.; Hanlon, D.; Anderson, C. H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  15. Patient-Oriented Cancer Information on the Internet: A Comparison of Wikipedia and a Professionally Maintained Database

    PubMed Central

    Rajagopalan, Malolan S.; Khanna, Vineet K.; Leiter, Yaacov; Stott, Meghan; Showalter, Timothy N.; Dicker, Adam P.; Lawrence, Yaacov R.

    2011-01-01

    Purpose: A wiki is a collaborative Web site, such as Wikipedia, that can be freely edited. Because of a wiki's lack of formal editorial control, we hypothesized that the content would be less complete and accurate than that of a professional peer-reviewed Web site. In this study, the coverage, accuracy, and readability of cancer information on Wikipedia were compared with those of the patient-orientated National Cancer Institute's Physician Data Query (PDQ) comprehensive cancer database. Methods: For each of 10 cancer types, medically trained personnel scored PDQ and Wikipedia articles for accuracy and presentation of controversies by using an appraisal form. Reliability was assessed by using interobserver variability and test-retest reproducibility. Readability was calculated from word and sentence length. Results: Evaluators were able to rapidly assess articles (18 minutes/article), with a test-retest reliability of 0.71 and interobserver variability of 0.53. For both Web sites, inaccuracies were rare, less than 2% of information examined. PDQ was significantly more readable than Wikipedia: Flesch-Kincaid grade level 9.6 versus 14.1. There was no difference in depth of coverage between PDQ and Wikipedia (29.9, 34.2, respectively; maximum possible score 72). Controversial aspects of cancer care were relatively poorly discussed in both resources (2.9 and 6.1 for PDQ and Wikipedia, respectively, NS; maximum possible score 18). A planned subanalysis comparing common and uncommon cancers demonstrated no difference. Conclusion: Although the wiki resource had similar accuracy and depth as the professionally edited database, it was significantly less readable. Further research is required to assess how this influences patients' understanding and retention. PMID:22211130

  16. Corruption of genomic databases with anomalous sequence.

    PubMed

    Lamperti, E D; Kittelberger, J M; Smith, T F; Villa-Komaroff, L

    1992-06-11

    We describe evidence that DNA sequences from vectors used for cloning and sequencing have been incorporated accidentally into eukaryotic entries in the GenBank database. These incorporations were not restricted to one type of vector or to a single mechanism. Many minor instances may have been the result of simple editing errors, but some entries contained large blocks of vector sequence that had been incorporated by contamination or other accidents during cloning. Some cases involved unusual rearrangements and areas of vector distant from the normal insertion sites. Matches to vector were found in 0.23% of 20,000 sequences analyzed in GenBank Release 63. Although the possibility of anomalous sequence incorporation has been recognized since the inception of GenBank and should be easy to avoid, recent evidence suggests that this problem is increasing more quickly than the database itself. The presence of anomalous sequence may have serious consequences for the interpretation and use of database entries, and will have an impact on issues of database management. The incorporated vector fragments described here may also be useful for a crude estimate of the fidelity of sequence information in the database. In alignments with well-defined ends, the matching sequences showed 96.8% identity to vector; when poorer matches with arbitrary limits were included, the aggregate identity to vector sequence was 94.8%.
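    A minimal sketch of the kind of screen that can flag such anomalies: collect exact k-mers from known vector sequences and flag database entries that share any of them (the toy sequences below are hypothetical; production screens align entries against full vector databases):

      def vector_kmers(vector_seqs, k=25):
          # Collect every k-mer occurring in the known vector sequences.
          kmers = set()
          for seq in vector_seqs:
              seq = seq.upper()
              for i in range(len(seq) - k + 1):
                  kmers.add(seq[i:i + k])
          return kmers

      def flag_contaminated(entries, kmers, k=25):
          # Return names of entries sharing at least one exact k-mer with a vector.
          flagged = []
          for name, seq in entries.items():
              seq = seq.upper()
              if any(seq[i:i + k] in kmers for i in range(len(seq) - k + 1)):
                  flagged.append(name)
          return flagged

      # Hypothetical toy sequences for illustration only.
      vectors = ["GAATTCGAGCTCGGTACCCGGGGATCCTCTAGAGTCGACCTGCAGGCATGC"]
      entries = {
          "clean_entry": "ATGGCTAGCTAGCTACGATCGATCGTAGCTAGCTAGCATCGATCGATCGAT",
          "suspect_entry": "ATGGAATTCGAGCTCGGTACCCGGGGATCCTCTAGAGTCGACCTGCAGGCA",
      }
      print(flag_contaminated(entries, vector_kmers(vectors)))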

  17. A Flexible Online Metadata Editing and Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilar, Raul; Pan, Jerry Yun; Gries, Corinna

    2010-01-01

    A metadata editing and management system is being developed employing state-of-the-art XML technologies. A modular and distributed design was chosen for scalability, flexibility, options for customizations, and the possibility to add more functionality at a later stage. The system consists of a desktop design tool or schema walker used to generate code for the actual online editor, a native XML database, and an online user access management application. The design tool is a Java Swing application that reads an XML schema, provides the designer with options to combine input fields into online forms, and gives the fields user-friendly tags. Based on design decisions, the tool generates code for the online metadata editor. The code generated is an implementation of the XForms standard using the Orbeon Framework. The design tool fulfills two requirements: first, data entry forms based on one schema may be customized at design time; and second, data entry applications may be generated for any valid XML schema without relying on custom information in the schema. However, the customized information generated at design time is saved in a configuration file which may be re-used and changed again in the design tool. Future developments will add functionality to the design tool to integrate help text, tool tips, project-specific keyword lists, and thesaurus services. Additional styling of the finished editor is accomplished via cascading style sheets, which may be further customized, and different look-and-feels may be accumulated through the community process. The customized editor produces XML files in compliance with the original schema; however, data from the current page is saved into a native XML database whenever the user moves to the next screen or pushes the save button, independently of validity. Currently the system uses the open source XML database eXist for storage and management, which comes with third-party online and desktop management tools. However, access to metadata files in the application introduced here is managed in a custom online module, using a MySQL backend accessed by a simple Java Server Faces front end. A flexible system with three grouping options (organization, group, and single editing access) is provided. Three levels were chosen to distribute administrative responsibilities and handle the common situation of an information manager entering the bulk of the metadata but leaving specifics to the actual data provider.

  18. RTECS database (on the internet). Online data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The Registry of Toxic Effects of Chemical Substances (RTECS (trademark)) is a database of toxicological information compiled, maintained, and updated by the National Institute for Occupational Safety and Health. The program is mandated by the Occupational Safety and Health Act of 1970. The original edition, known as the "Toxic Substances List," was published on June 28, 1971, and included toxicologic data for approximately 5,000 chemicals. Since that time, the list has continuously grown and been updated, and its name changed to the current title, "Registry of Toxic Effects of Chemical Substances." RTECS (trademark) now contains over 133,000 chemicals as NIOSH strives to fulfill the mandate to list "all known toxic substances...and the concentrations at which...toxicity is known to occur." This database is now available for searching through the Gov. Research-Center (GRC) service. GRC is a single online web-based search service to well-known Government databases. Featuring powerful search and retrieval software, GRC is an important research tool. The GRC web site is at http://grc.ntis.gov.

  19. Identifying the effective evidence sources to use in developing Clinical Guidelines for Acute Stroke Management: lived experiences of the search specialist and project manager.

    PubMed

    Parkhill, Anne; Hill, Kelvin

    2009-03-01

    The Australian National Stroke Foundation appointed a search specialist to find the best available evidence for the second edition of its Clinical Guidelines for Acute Stroke Management. To identify the relative effectiveness of differing evidence sources for the guideline update. We searched and reviewed references from five valid evidence sources for clinical and economic questions: (i) electronic databases; (ii) reference lists of relevant systematic reviews, guidelines, and/or primary studies; (iii) table of contents of a number of key journals for the last 6 months; (iv) internet/grey literature; and (v) experts. Reference sources were recorded, quantified, and analysed. In the clinical portion of the guidelines document, there was a greater use of previous knowledge and sources other than electronic databases for evidence, while there was a greater use of electronic databases for the economic section. The results confirmed that searchers need to be aware of the context and range of sources for evidence searches. For best available evidence, searchers cannot rely solely on electronic databases and need to encompass many different media and sources.

  20. A comparative cellular and molecular biology of longevity database.

    PubMed

    Stuart, Jeffrey A; Liang, Ping; Luo, Xuemei; Page, Melissa M; Gallagher, Emily J; Christoff, Casey A; Robb, Ellen L

    2013-10-01

    Discovering key cellular and molecular traits that promote longevity is a major goal of aging and longevity research. One experimental strategy is to determine which traits have been selected during the evolution of longevity in naturally long-lived animal species. This comparative approach has been applied to lifespan research for nearly four decades, yielding hundreds of datasets describing aspects of cell and molecular biology hypothesized to relate to animal longevity. Here, we introduce a Comparative Cellular and Molecular Biology of Longevity Database, available at ( http://genomics.brocku.ca/ccmbl/ ), as a compendium of comparative cell and molecular data presented in the context of longevity. This open access database will facilitate the meta-analysis of amalgamated datasets using standardized maximum lifespan (MLSP) data (from AnAge). The first edition contains over 800 data records describing experimental measurements of cellular stress resistance, reactive oxygen species metabolism, membrane composition, protein homeostasis, and genome homeostasis as they relate to vertebrate species MLSP. The purpose of this review is to introduce the database and briefly demonstrate its use in the meta-analysis of combined datasets.
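    A minimal sketch of the kind of meta-analysis the database is meant to support: regress a log-transformed cellular trait against log maximum lifespan across species (the numbers below are made up for illustration, not records from the database):

      import numpy as np

      # Hypothetical species records: maximum lifespan in years and a trait value,
      # e.g., a cellular stress-resistance measure pulled from the database.
      mlsp = np.array([3.5, 8.0, 20.0, 30.0, 60.0, 122.0])
      trait = np.array([1.1, 1.8, 3.0, 4.2, 6.5, 11.0])

      # Ordinary least squares on log-log axes gives the allometric exponent.
      x = np.log10(mlsp)
      y = np.log10(trait)
      slope, intercept = np.polyfit(x, y, 1)
      r = np.corrcoef(x, y)[0, 1]
      print(f"trait ~ MLSP^{slope:.2f}  (r = {r:.2f})")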

  1. Task 1.13 -- Data collection and database development for clean coal technology by-product characteristics and management practices. Semi-annual report, July 1--December 31, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pflughoeft-Hassett, D.F.

    1997-08-01

    Information from DOE projects and commercial endeavors in fluidized-bed combustion and coal gasification is the focus of this task by the Energy and Environmental Research Center. The primary goal of this task is to provide an easily accessible compilation of characterization information on CCT (Clean Coal Technology) by-products to government agencies and industry to facilitate sound regulatory and management decisions. Supporting objectives are (1) to fully utilize information from previous DOE projects, (2) to coordinate with industry and other research groups, (3) to focus on by-products from pressurized fluidized-bed combustion (PFBC) and gasification, and (4) to provide information relevant to the EPA evaluation criteria for the Phase 2 decision.

  2. Network Application Server Using Extensible Mark-Up Language (XML) to Support Distributed Databases and 3D Environments

    DTIC Science & Technology

    2001-12-01

    [diides.ncr.disa.mil/xmlreg/user/index.cfm] [Deitel] Deitel, H., Deitel, P., Java How to Program, 3rd Edition, Prentice Hall, 1999. [DL99 ... presentation, and data) of information and the programming functionality. The Web framework addressed the ability to provide a framework for the distribution ... Advances in computer communication technology and an increased awareness of how enhanced information access can lead to improved ...

  3. Ionospheric Characteristics for Archiving at the World Data Centers

    DTIC Science & Technology

    1990-12-01

    for the application and programs written to generate these database files using as input Digisonde 256 ARTIST [Reinisch and Huang, 1983; Reinisch et al. ... 1983; Tang et al., 1989] data, post processed by the ULCAR ADEP [Zhang, 1989; Zhang et al., 1988] (ARTIST Data Editing Program) system. The ... Digisonde ARTIST [Tang et al. 1989]. FORTRAN records i+1 to k give the dimensions corresponding to the characteristics list (see Table 2); these are in ...

  4. References that anyone can edit: review of Wikipedia citations in peer reviewed health science literature.

    PubMed

    Bould, M Dylan; Hladkowicz, Emily S; Pigford, Ashlee-Ann E; Ufholz, Lee-Anne; Postonogova, Tatyana; Shin, Eunkyung; Boet, Sylvain

    2014-03-06

    To examine indexed health science journals to evaluate the prevalence of Wikipedia citations, identify the journals that publish articles with Wikipedia citations, and determine how Wikipedia is being cited. Bibliometric analysis. Publications in the English language that included citations to Wikipedia were retrieved using the online databases Scopus and Web of Science. To identify health science journals, results were refined using Ulrich's database, selecting for citations from journals indexed in Medline, PubMed, or Embase. Using Thomson Reuters Journal Citation Reports, 2011 impact factors were collected for all journals included in the search. Resulting citations were thematically coded, and descriptive statistics were calculated. 1433 full text articles from 1008 journals indexed in Medline, PubMed, or Embase with 2049 Wikipedia citations were accessed. The frequency of Wikipedia citations has increased over time; most citations occurred after December 2010. More than half of the citations were coded as definitions (n = 648; 31.6%) or descriptions (n=482; 23.5%). Citations were not limited to journals with a low or no impact factor; the search found Wikipedia citations in many journals with high impact factors. Many publications are citing information from a tertiary source that can be edited by anyone, although permanent, evidence based sources are available. We encourage journal editors and reviewers to use caution when publishing articles that cite Wikipedia.

  5. A best-case probe, light source, and database for H2O absorption thermometry to 2100 K and 50 bar

    NASA Astrophysics Data System (ADS)

    Brittelle, Mack S.

    This work aspired to improve the ability of forthcoming researchers to utilize near-IR H2O absorption spectroscopy for thermometry through the development of three best-case techniques: the design of novel high-temperature sapphire optical access probes, the construction of a fixed-wavelength H2O absorption spectroscopy system enhanced by an on-board external-cavity diode laser (ECDL), and the creation of an architecture for a high-temperature and -pressure H2O absorption cross-section database. Each area's main goal was to realize the best case for direct-absorption H2O vapor thermometry at combustion conditions. Optical access to combustion devices is explored through the design and implementation of two versions of novel high-temperature (2000 K) sapphire immersion probes (HTSIPs) for use in ambient flames and gas turbine combustors. The development and evaluation of a fixed-wavelength H2O absorption spectroscopy (FWAS) system demonstrates how the ECDL allows the system to operate in multiple modes that enhance FWAS measurement accuracy by improving wavelength position monitoring and reducing non-absorption-based contamination in spectral scans. The architecture of a high-temperature (2100 K) and high-pressure (50 bar) database (HTPD) is developed that can enhance absorption-spectroscopy-based thermometry. The HTPD is developed by evaluating two approaches: a line-by-line (LBL) approach, where transition lineshape parameters are extracted from spectra and used along with a physics-based model to allow the simulation of spectra over a wide range of temperatures and pressures, or an absorption cross-section (sigma_abs) approach, where spectra generated in a high-temperature and -pressure furnace are cataloged at various conditions, forming a database of absorption cross-sections that is then interpolated to provide a simulated absorbance spectrum based on measured reference-grade spectra. Utilizing near-future reference-grade H2O absorption spectra, generated by the Sanders Group by means of an ECDL and a high-temperature and -pressure furnace, a unique opportunity is taken to provide the research community with a database that can be utilized for optical thermometry.
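    A minimal sketch of how an absorption cross-section database of this kind could be used for thermometry (the sigma values and temperature grid below are illustrative assumptions): with absorbances measured at two fixed wavelengths, the ratio A1/A2 equals sigma_1(T)/sigma_2(T) by the Beer-Lambert law, since path length and number density cancel, so temperature is recovered by inverting the tabulated ratio:

      import numpy as np

      # Illustrative cross-section database: sigma(T) at two fixed wavelengths,
      # tabulated on a coarse temperature grid (values are made up).
      T_grid = np.array([1000.0, 1300.0, 1600.0, 1900.0, 2100.0])   # K
      sigma_1 = np.array([4.0, 3.2, 2.5, 2.0, 1.7]) * 1e-21         # cm^2/molecule
      sigma_2 = np.array([1.0, 1.4, 1.9, 2.4, 2.8]) * 1e-21

      def temperature_from_ratio(A1, A2):
          # Two-line thermometry: invert the tabulated cross-section ratio by
          # interpolation (the ratio grid is monotonically decreasing here, so
          # both arrays are reversed to satisfy np.interp's ordering).
          ratio_grid = sigma_1 / sigma_2
          measured = A1 / A2
          return float(np.interp(measured, ratio_grid[::-1], T_grid[::-1]))

      print(f"inferred T = {temperature_from_ratio(0.021, 0.024):.0f} K")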

  6. HIGH PRESSURE COAL COMBUSTION KINETICS PROJECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stefano Orsino

    As part of the U.S. Department of Energy (DOE) initiative to improve the efficiency of coal-fired power plants and reduce the pollution generated by these facilities, DOE has funded the High-Pressure Coal Combustion Kinetics (HPCCK) Projects. A series of laboratory experiments was conducted on selected pulverized coals at elevated pressures with the specific goals to provide new data for pressurized coal combustion that will help extend to high pressure and validate models for burnout and pollutant formation, and to generate samples of solid combustion products for analyses to fill crucial gaps in knowledge of char morphology and fly ash formation. Two series of high-pressure coal combustion experiments were performed using SRI's pressurized radiant coal flow reactor. The first series of tests characterized the near burner flame zone (NBFZ). Three coals were tested, two high volatile bituminous (Pittsburgh No.8 and Illinois No.6), and one sub-bituminous (Powder River Basin), at pressures of 1, 2, and 3 MPa (10, 20, and 30 atm). The second series of experiments, which covered high-pressure burnout (HPBO) conditions, utilized a range of substantially longer combustion residence times to produce char burnout levels from 50% to 100%. The same three coals were tested at 1, 2, and 3 MPa, as well as at 0.2 MPa. Tests were also conducted on Pittsburgh No.8 coal in CO2 entrainment gas at 0.2, 1, and 2 MPa to begin establishing a database of experiments relevant to carbon sequestration techniques. The HPBO test series included use of an impactor-type particle sampler to measure the particle size distribution of fly ash produced under complete burnout conditions. The collected data have been interpreted with the help of CFD and detailed kinetics simulation to extend and validate devolatilization, char combustion, and pollutant models at elevated pressure. A global NOX production sub-model has been proposed. The sub-model reproduces the performance of the detailed chemical reaction mechanism for the NBFZ tests.

  7. How do consumers perceive differences in risk across nicotine products? A review of relative risk perceptions across smokeless tobacco, e-cigarettes, nicotine replacement therapy and combustible cigarettes.

    PubMed

    Czoli, Christine D; Fong, Geoffrey T; Mays, Darren; Hammond, David

    2017-03-01

    To systematically review the literature regarding relative risk perceptions (RRPs) across non-combustible nicotine products. MEDLINE and PsycINFO databases were searched for articles published up to October 2014. Of the 5266 records identified, articles were excluded if they were not published in English or did not quantitatively assess RRPs across categories of non-combustible nicotine products, yielding 55 records. One reviewer extracted measures and findings of RRPs for product comparisons of smokeless tobacco (SLT), e-cigarettes (ECs) and nicotine replacement therapy (NRT) to one another, and to combustible cigarettes (CCs). A total of 157 samples from 54 studies were included in the analyses. The accuracy of RRPs differed based on the products being compared: although the accuracy of RRPs was variable across studies, substantial proportions of respondents reported inaccurate beliefs about the relative harmfulness of SLT versus CCs, as well as of ECs versus NRT. In addition, in most studies, respondents did not know the relative harmfulness of SLT versus NRT. In contrast, respondents in many studies correctly perceived NRT and ECs as less harmful than CCs. Cigarette smokers and users of non-combustible nicotine products tended to correctly perceive the relative harmfulness of products more often than non-users. Measures used to assess RRPs varied across studies, with different approaches characterised by certain strengths and limitations. The highly variable and context-specific nature of non-combustible nicotine product RRPs has direct implications for researchers and presents several challenges for policymakers working with modified-risk products, including issues of measurement, health risk communication and behaviour change. Published by the BMJ Publishing Group Limited.

  8. Dual-Pump CARS Development and Application to Supersonic Combustion

    NASA Astrophysics Data System (ADS)

    Magnotti, Gaetano

    Successful design of hypersonic air-breathing engines requires new computational fluid dynamics (CFD) models for turbulence and turbulence-chemistry interaction in supersonic combustion. Unfortunately, not enough data are available to the modelers to develop and validate their codes, due to difficulties in taking measurements in such a harsh environment. Dual-pump coherent anti-Stokes Raman spectroscopy (CARS) is a non-intrusive, non-linear, laser-based technique that provides temporally and spatially resolved measurements of temperature and absolute mole fractions of N2, O2 and H2 in H2-air flames. A dual-pump CARS instrument has been developed to obtain measurements in supersonic combustion and generate databases for the CFD community. Issues that compromised previous attempts, such as beam steering and high-irradiance perturbation effects, have been alleviated or avoided, and improvements in instrument precision and accuracy have been achieved. An axisymmetric supersonic combusting coaxial jet facility has been developed to provide a simple, yet suitable flow to CFD modelers. The facility provides a central jet of hot "vitiated air" simulating the hot air entering the engine of a hypersonic vehicle flying at Mach numbers between 5 and 7. Three different silicon carbide nozzles, with exit Mach numbers of 1, 1.6 and 2, are used to provide flows with the effects of varying compressibility. H2 co-flow is available in order to generate a supersonic combusting free jet. Approximately one million dual-pump CARS single shots have been collected in the supersonic jet for varying values of flight and exit Mach numbers at several locations, with either an H2 co-flow (combustion case) or an N2 co-flow (mixing case). Results are presented and the effects of compressibility and of heat release are discussed.

  9. Geochemical database of feed coal and coal combustion products (CCPs) from five power plants in the United States

    USGS Publications Warehouse

    Affolter, Ronald H.; Groves, Steve; Betterton, William J.; William, Benzel; Conrad, Kelly L.; Swanson, Sharon M.; Ruppert, Leslie F.; Clough, James G.; Belkin, Harvey E.; Kolker, Allan; Hower, James C.

    2011-01-01

    The principal mission of the U.S. Geological Survey (USGS) Energy Resources Program (ERP) is to (1) understand the processes critical to the formation, accumulation, occurrence, and alteration of geologically based energy resources; (2) conduct scientifically robust assessments of those resources; and (3) study the impacts of energy resource occurrence and (or) their production and use on both the environment and human health. The ERP promotes and supports research resulting in original, geology-based, non-biased energy information products for policy and decision makers, land and resource managers, other Federal and State agencies, the domestic energy industry, foreign governments, non-governmental groups, and academia. Investigations include research on the geology of oil, gas, and coal, and the impacts associated with energy resource occurrence, production, quality, and utilization. The ERP's focus on coal is to support investigations into current issues pertaining to coal production, beneficiation and (or) conversion, and the environmental impact of the coal combustion process and coal combustion products (CCPs). To accomplish these studies, the USGS combines its activities with other organizations to address domestic and international issues that relate to the development and use of energy resources.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory Corman; Krishan Luthra

    This report covers work performed under the Continuous Fiber Ceramic Composites (CFCC) program by GE Global Research and its partners from 1994 through 2005. The processing of prepreg-derived, melt infiltrated (MI) composite systems based on monofilament and multifilament tow SiC fibers is described. Extensive mechanical and environmental exposure characterizations were performed on these systems, as well as on competing Ceramic Matrix Composite (CMC) systems. Although current monofilament SiC fibers have inherent oxidative stability limitations due to their carbon surface coatings, the MI CMC system based on multifilament tow (Hi-Nicalon) proved to have excellent mechanical, thermal and time-dependent properties. The materials database generated from the material testing was used to design turbine hot gas path components, namely the shroud and combustor liner, utilizing the CMC materials. The feasibility of using such MI CMC materials in gas turbine engines was demonstrated via combustion rig testing of turbine shrouds and combustor liners, and through field engine tests of shrouds in a 2 MW engine for >1000 hours. A unique combustion test facility was also developed that allowed coupons of the CMC materials to be exposed to high-pressure, high-velocity combustion gas environments for times up to approximately 4000 hours.

  11. A flamelet model for supersonic non-premixed combustion with pressure variation

    NASA Astrophysics Data System (ADS)

    Zhao, Guo-Yan; Sun, Ming-Bo; Wu, Jin-Shui; Wang, Hong-Bo

    2015-08-01

    A modified flamelet model is proposed for studying supersonic combustion with pressure variation, considering that pressure is far from homogeneous in a supersonic combustor. In this model, the flamelet database is tabulated at a reference pressure, while quantities at other pressures are obtained using a sixth-order polynomial expansion in pressure. Because the modified model only stores the coefficients of this expansion, it requires less memory and table look-up time, avoiding the expense of tabulating the full flamelet database at many pressure values. Two types of hydrogen-fueled scramjet combustors were used to validate the modified flamelet model. It was observed that the temperature is sensitive to the choice of model in the combustion region, which in turn significantly affects the pressure. Compared with the isobaric flamelet model, the results of the modified model were in better agreement with the experimental data, especially for temperature, whose value is more accurately predicted. It is concluded that the modified flamelet model is more effective for cases with a wide range of pressure variation.
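
    The idea of tabulating at a reference pressure and recovering other pressures through a polynomial expansion can be sketched as follows. This is only a schematic Python illustration of the description in the abstract: the multiplicative correction form, the coefficient values, and the reference pressure are assumptions, not the authors' actual tabulation.

        import numpy as np

        p_ref = 1.0e5   # Pa, assumed reference pressure of the tabulation

        def eval_tabulated_quantity(value_at_pref, poly_coeffs, p):
            """Recover a tabulated flamelet quantity (e.g. temperature or a
            species mass fraction) at pressure p from its value at the
            reference pressure and a sixth-order polynomial correction."""
            x = p / p_ref
            correction = sum(c * x**k for k, c in enumerate(poly_coeffs))
            return value_at_pref * correction

        # Example node: temperature stored at p_ref with illustrative coefficients
        T_ref = 1800.0                                            # K at the reference pressure
        coeffs = [1.0, 0.02, -3e-3, 2e-4, -8e-6, 1.5e-7, -1e-9]   # k = 0..6
        T_at_3bar = eval_tabulated_quantity(T_ref, coeffs, 3.0e5)

    Because only the seven coefficients per stored quantity are kept, the table stays compact compared with tabulating full flamelet solutions at many discrete pressures, which is the memory and look-up saving the abstract refers to.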

  12. Operative record using intraoperative digital data in neurosurgery.

    PubMed

    Houkin, K; Kuroda, S; Abe, H

    2000-01-01

    The purpose of this study was to develop a new method for more efficient and accurate operative records using intra-operative digital data in neurosurgery, including macroscopic procedures and microscopic procedures under an operating microscope. Macroscopic procedures were recorded using a digital camera and microscopic procedures were also recorded using a microdigital camera attached to an operating microscope. Operative records were then recorded digitally and filed in a computer using image-retouching software and database software. The time necessary for editing of the digital data and completing the record was less than 30 minutes. Once these operative records are digitally filed, they are easily transferred and used as a database. Using digital operative records along with digital photography, neurosurgeons can document their procedures more accurately and efficiently than by the conventional method (handwriting). A complete digital operative record is not only accurate but also time-saving. Construction of a database, data transfer and desktop publishing can be achieved using the intra-operative data, including intra-operative photographs.

  13. Online Mendelian Inheritance in Man (OMIM), a knowledgebase of human genes and genetic disorders.

    PubMed

    Hamosh, Ada; Scott, Alan F; Amberger, Joanna S; Bocchini, Carol A; McKusick, Victor A

    2005-01-01

    Online Mendelian Inheritance in Man (OMIM) is a comprehensive, authoritative and timely knowledgebase of human genes and genetic disorders compiled to support human genetics research and education and the practice of clinical genetics. Started by Dr Victor A. McKusick as the definitive reference Mendelian Inheritance in Man, OMIM (http://www.ncbi.nlm.nih.gov/omim/) is now distributed electronically by the National Center for Biotechnology Information, where it is integrated with the Entrez suite of databases. Derived from the biomedical literature, OMIM is written and edited at Johns Hopkins University with input from scientists and physicians around the world. Each OMIM entry has a full-text summary of a genetically determined phenotype and/or gene and has numerous links to other genetic databases such as DNA and protein sequence, PubMed references, general and locus-specific mutation databases, HUGO nomenclature, MapViewer, GeneTests, patient support groups and many others. OMIM is an easy and straightforward portal to the burgeoning information in human genetics.

  14. Compressing DNA sequence databases with coil.

    PubMed

    White, W Timothy J; Hendy, Michael D

    2008-05-20

    Publicly available DNA sequence databases such as GenBank are large, and are growing at an exponential rate. The sheer volume of data being dealt with presents serious storage and data communications problems. Currently, sequence data is usually kept in large "flat files," which are then compressed using standard Lempel-Ziv (gzip) compression - an approach which rarely achieves good compression ratios. While much research has been done on compressing individual DNA sequences, surprisingly little has focused on the compression of entire databases of such sequences. In this study we introduce the sequence database compression software coil. We have designed and implemented a portable software package, coil, for compressing and decompressing DNA sequence databases based on the idea of edit-tree coding. coil is geared towards achieving high compression ratios at the expense of execution time and memory usage during compression - the compression time represents a "one-off investment" whose cost is quickly amortised if the resulting compressed file is transmitted many times. Decompression requires little memory and is extremely fast. We demonstrate a 5% improvement in compression ratio over state-of-the-art general-purpose compression tools for a large GenBank database file containing Expressed Sequence Tag (EST) data. Finally, coil can efficiently encode incremental additions to a sequence database. coil presents a compelling alternative to conventional compression of flat files for the storage and distribution of DNA sequence databases having a narrow distribution of sequence lengths, such as EST data. Increasing compression levels for databases having a wide distribution of sequence lengths is a direction for future work.
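
    As a rough illustration of the reference-plus-edits idea behind this kind of tool (this is not coil's actual edit-tree coder, whose internals are not detailed here), the Python sketch below stores each sequence as a small list of edit operations against a chosen reference sequence and reconstructs it by replaying those edits.

        import difflib

        def encode_against_reference(reference: str, sequence: str):
            """Represent a sequence as edit operations relative to a reference.
            Each op is (tag, ref_start, ref_end, replacement_text)."""
            ops = []
            matcher = difflib.SequenceMatcher(a=reference, b=sequence, autojunk=False)
            for tag, i1, i2, j1, j2 in matcher.get_opcodes():
                if tag != "equal":
                    ops.append((tag, i1, i2, sequence[j1:j2]))
            return ops

        def decode_from_reference(reference: str, ops):
            """Rebuild the original sequence by replaying the stored edits."""
            out, cursor = [], 0
            for tag, i1, i2, repl in ops:
                out.append(reference[cursor:i1])   # copy the unchanged stretch
                out.append(repl)                   # apply the edit
                cursor = i2
            out.append(reference[cursor:])
            return "".join(out)

        ref = "ACGTACGTACGTTTGACA"
        seq = "ACGTACCTACGTTTGACAAA"   # one substitution plus a short insertion
        edits = encode_against_reference(ref, seq)
        assert decode_from_reference(ref, edits) == seq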

  15. Compressing DNA sequence databases with coil

    PubMed Central

    White, W Timothy J; Hendy, Michael D

    2008-01-01

    Background Publicly available DNA sequence databases such as GenBank are large, and are growing at an exponential rate. The sheer volume of data being dealt with presents serious storage and data communications problems. Currently, sequence data is usually kept in large "flat files," which are then compressed using standard Lempel-Ziv (gzip) compression – an approach which rarely achieves good compression ratios. While much research has been done on compressing individual DNA sequences, surprisingly little has focused on the compression of entire databases of such sequences. In this study we introduce the sequence database compression software coil. Results We have designed and implemented a portable software package, coil, for compressing and decompressing DNA sequence databases based on the idea of edit-tree coding. coil is geared towards achieving high compression ratios at the expense of execution time and memory usage during compression – the compression time represents a "one-off investment" whose cost is quickly amortised if the resulting compressed file is transmitted many times. Decompression requires little memory and is extremely fast. We demonstrate a 5% improvement in compression ratio over state-of-the-art general-purpose compression tools for a large GenBank database file containing Expressed Sequence Tag (EST) data. Finally, coil can efficiently encode incremental additions to a sequence database. Conclusion coil presents a compelling alternative to conventional compression of flat files for the storage and distribution of DNA sequence databases having a narrow distribution of sequence lengths, such as EST data. Increasing compression levels for databases having a wide distribution of sequence lengths is a direction for future work. PMID:18489794

  16. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    PubMed Central

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  17. Efficiently Distributing Component-Based Applications Across Wide-Area Environments

    DTIC Science & Technology

    2002-01-01

    Oracle 8.1.7 Enterprise Edition), each running on a dedicated 1 GHz dual-processor Pentium III workstation. For the RUBiS tests, we used a MySQL 4.0.12...a variety of sophisticated network-accessible services such as e-mail, banking, on-line shopping, entertainment, and serving as a data exchange...Beans Catalog Handles read-only queries to product database Customer Serves as a façade to Order and Account Stateful Session Beans ShoppingCart

  18. Using tablet technology in operational radiation safety applications.

    PubMed

    Phillips, Andrew; Linsley, Mark; Houser, Mike

    2013-11-01

    Tablet computers have become a mainstream product in today's personal, educational, and business worlds. These tablets offer computing power, storage, and a wide range of available products to meet nearly every user need. To take advantage of this new computing technology, a system was developed for the Apple iPad (Apple Inc. 1 Infinite Loop Cupertino, CA 95014) to perform health and safety inspections in the field using editable PDFs and saving them to a database while keeping the process easy and paperless.

  19. An integrated set of UNIX based system tools at control room level

    NASA Astrophysics Data System (ADS)

    Potepan, F.; Scafuri, C.; Bortolotto, C.; Surace, G.

    1994-12-01

    The design effort of providing a simple point-and-click approach to the equipment access has led to the definition and realization of a modular set of software tools to be used at the ELETTRA control room level. Point-to-point equipment access requires neither programming nor specific knowledge of the control system architecture. The development and integration of communication, graphic, editing and global database modules are described in depth, followed by a report of their use in the first commissioning period.

  20. Current Experiments in Particle Physics. 1996 Edition.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galic, Hrvoje

    2003-06-27

    This report contains summaries of current and recent experiments in Particle Physics. Included are experiments at BEPC (Beijing), BNL, CEBAF, CERN, CESR, DESY, FNAL, Frascati, ITEP (Moscow), JINR (Dubna), KEK, LAMPF, Novosibirsk, PNPI (St. Petersburg), PSI, Saclay, Serpukhov, SLAC, and TRIUMF, and also several proton decay and solar neutrino experiments. Excluded are experiments that finished taking data before 1991. Instructions are given for the World Wide Web (WWW) searching of the computer database (maintained under the SLAC-SPIRES system) that contains the summaries.

  1. Geologic map of the west-central Buffalo National River region, northern Arkansas

    USGS Publications Warehouse

    Hudson, Mark R.; Turner, Kenzie J.

    2014-01-01

    This report provides a geologic map database of the map area that improves understanding of the regional geologic framework and its influence on the regional groundwater flow system. Furthermore, additional edits were made to the Ponca and Jasper quadrangles in the following ways: new control points on important contacts were obtained using modern GPS; recent higher resolution elevation data allowed further control on placement of contacts; some new contacts were added, in particular the contact separating the upper and lower Everton Formation.

  2. Project Manager’s Guide to the Scientific and Technical Information (STINFO) Program and Technical Publications Process

    DTIC Science & Technology

    1993-12-01

    Reports may be definitive for the subject presented, exploratory in nature, or an evaluation of critical subsystems or of technical problems...International Security 9 Social and Natural Science Studies Field 41 Edit: (Type 3) - Entry of an invalid code when Performance Type is "C" or "M" will...analysis SF Foreign area social science research SP Foreign area policy planning research BF Identifies databases with data on foreign forces or

  3. The Development of PIPA: An Integrated and Automated Pipeline for Genome-Wide Protein Function Annotation

    DTIC Science & Technology

    2008-01-25

    Limitations and plans for improvement: Perhaps one of PIPA's main limitations is that all of its currently integrated resources to predict protein function...are planning on expanding PIPA's function prediction capabilities by incorporating comparative analysis approaches, e.g., phylogenetic tree analysis...tools and services. Nucleic Acids Res 2005/12/31 edition. 2006, 34(Database issue):D247-51. 6. Bru C, Courcelle E, Carrere S, Beausse Y, Dalmar S

  4. One-Dimensional Spontaneous Raman Measurements of Temperature Made in a Gas Turbine Combustor

    NASA Technical Reports Server (NTRS)

    Hicks, Yolanda R.; Locke, Randy J.; DeGroot, Wilhelmus A.; Anderson, Robert C.

    2002-01-01

    The NASA Glenn Research Center is working with the aeronautics industry to develop highly fuel-efficient and environmentally friendly gas turbine combustor technology. This effort includes testing new hardware designs at conditions that simulate the high-temperature, high-pressure environment expected in the next-generation of high-performance engines. Glenn has the only facilities in which such tests can be performed. One aspect of these tests is the use of nonintrusive optical and laser diagnostics to measure combustion species concentration, fuel/air ratio, fuel drop size, and velocity, and to visualize the fuel injector spray pattern and some combustion species distributions. These data not only help designers to determine the efficacy of specific designs, but provide a database for computer modelers and enhance our understanding of the many processes that take place within a combustor. Until recently, we lacked one critical capability, the ability to measure temperature. This article summarizes our latest developments in that area. Recently, we demonstrated the first-ever use of spontaneous Raman scattering to measure combustion temperatures within the Advanced Subsonics Combustion Rig (ASCR) sector rig. We also established the highest rig pressure ever achieved for a continuous-flow combustor facility, 54.4 bar. The ASCR facility can provide operating pressures from 1 to 60 bar (60 atm). This photograph shows the Raman system setup next to the ASCR rig. The test was performed using a NASA-concept fuel injector and Jet-A fuel over a range of air inlet temperatures, pressures, and fuel/air ratios.

  5. Dynamic Temperature and Pressure Measurements in the Core of a Propulsion Engine

    NASA Technical Reports Server (NTRS)

    Schuster, Bill; Gordon, Grant; Hultgren, Lennart S.

    2015-01-01

    Dynamic temperature and pressure measurements were made in the core of a TECH977 propulsion engine as part of a NASA funded investigation into indirect combustion noise. Dynamic temperature measurements were made in the combustor, the inter-turbine duct, and the mixer using ten two-wire thermocouple probes. Internal dynamic pressure measurements were made at the same locations using piezoresistive transducers installed in semi-infinite coils. Measurements were acquired at four steady state operating conditions covering the range of aircraft approach power settings. Fluctuating gas temperature spectra were computed from the thermocouple probe voltage measurements using a compensation procedure that was developed under previous NASA test programs. A database of simultaneously acquired dynamic temperature and dynamic pressure measurements was produced. Spectral and cross-spectral analyses were conducted to explore the characteristics of the temperature and pressure fluctuations inside the engine, with a particular focus on attempting to identify the presence of indirect combustion noise.
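
    A minimal sketch of the kind of spectral and cross-spectral analysis mentioned above is shown here in Python, using Welch estimates of the auto- and cross-spectra and the coherence between a fluctuating-temperature signal and a dynamic-pressure signal. The synthetic signals and sampling rate are placeholders, not engine data.

        import numpy as np
        from scipy.signal import welch, csd, coherence

        fs = 50_000.0                    # Hz, illustrative sampling rate
        t = np.arange(0, 2.0, 1.0 / fs)
        rng = np.random.default_rng(1)

        # Synthetic stand-ins for a compensated thermocouple signal and a
        # dynamic-pressure signal sharing a common 300 Hz component.
        temperature = rng.normal(size=t.size) + 0.5 * np.sin(2 * np.pi * 300 * t)
        pressure = rng.normal(size=t.size) + 0.3 * np.sin(2 * np.pi * 300 * t)

        nperseg = 4096
        f, S_tt = welch(temperature, fs=fs, nperseg=nperseg)            # temperature auto-spectrum
        _, S_pp = welch(pressure, fs=fs, nperseg=nperseg)               # pressure auto-spectrum
        _, S_tp = csd(temperature, pressure, fs=fs, nperseg=nperseg)    # cross-spectrum
        _, gamma2 = coherence(temperature, pressure, fs=fs, nperseg=nperseg)

        # Frequencies where the two signals are strongly correlated are
        # candidates for a common (e.g. combustion-related) source.
        strong = f[gamma2 > 0.5]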

  6. Virus taxonomy: the database of the International Committee on Taxonomy of Viruses (ICTV)

    PubMed Central

    Dempsey, Donald M; Hendrickson, Robert Curtis; Orton, Richard J; Siddell, Stuart G; Smith, Donald B

    2018-01-01

    Abstract The International Committee on Taxonomy of Viruses (ICTV) is charged with the task of developing, refining, and maintaining a universal virus taxonomy. This task encompasses the classification of virus species and higher-level taxa according to the genetic and biological properties of their members; naming virus taxa; maintaining a database detailing the currently approved taxonomy; and providing the database, supporting proposals, and other virus-related information from an open-access, public web site. The ICTV web site (http://ictv.global) provides access to the current taxonomy database in online and downloadable formats, and maintains a complete history of virus taxa back to the first release in 1971. The ICTV has also published the ICTV Report on Virus Taxonomy starting in 1971. This Report provides a comprehensive description of all virus taxa covering virus structure, genome structure, biology and phylogenetics. The ninth ICTV report, published in 2012, is available as an open-access online publication from the ICTV web site. The current, 10th report (http://ictv.global/report/), is being published online, and is replacing the previous hard-copy edition with a completely open access, continuously updated publication. No other database or resource exists that provides such a comprehensive, fully annotated compendium of information on virus taxa and taxonomy. PMID:29040670

  7. The GEISA 2009 Spectroscopic Database System and its CNES/CNRS Ether Products and Services Center Interactive Distribution

    NASA Astrophysics Data System (ADS)

    Jacquinet-Husson, Nicole; Crépeau, Laurent; Capelle, Virginie; Scott, Noëlle; Armante, Raymond; Chédin, Alain; Boonne, Cathy; Poulet-Crovisier, Nathalie

    2010-05-01

    The GEISA (1) (Gestion et Etude des Informations Spectroscopiques Atmosphériques: Management and Study of Atmospheric Spectroscopic Information) computer-accessible database, initiated in 1976, is developed and maintained at LMD (Laboratoire de Météorologie Dynamique, France). It is a system comprising three independent sub-databases devoted respectively to line transition parameters; infrared and ultraviolet/visible absorption cross-sections; and microphysical and optical properties of atmospheric aerosols. The updated 2009 edition (GEISA-09) archives, in its line transition parameters sub-section, 50 molecules, corresponding to 111 isotopes, for a total of 3,807,997 entries, in the spectral range from 10^-6 to 35,877.031 cm^-1. A detailed description of the whole database contents will be given. GEISA and GEISA/IASI are implemented on the CNES/CNRS Ether Products and Services Centre web site (http://ether.ipsl.jussieu.fr), where all archived spectroscopic data can be handled through general and user-friendly management software facilities. These facilities will also be described and widely illustrated, and interactive demonstrations will be given if technically feasible at the time of the Poster Display Session. More than 350 researchers are registered for on-line use of GEISA on Ether. Currently, GEISA is involved in activities (2) related to the remote sensing of the terrestrial atmosphere, thanks to the sounding performance of the new generation of hyperspectral Earth atmospheric sounders such as AIRS (Atmospheric Infrared Sounder, http://www-airs.jpl.nasa.gov/) in the USA and IASI (Infrared Atmospheric Sounding Interferometer, http://earth-sciences.cnes.fr/IASI/) in Europe, using the 4A radiative transfer model (3) (4A/LMD, http://ara.lmd.polytechnique.fr; 4A/OP co-developed by LMD and NOVELTIS, http://www.noveltis.fr/) with the support of CNES (2006). Refs: (1) Jacquinet-Husson N., N.A. Scott, A. Chédin, L. Crépeau, R. Armante, V. Capelle, J. Orphal, A. Coustenis, C. Boonne, N. Poulet-Crovisier, et al.: The GEISA spectroscopic database: current and future archive for Earth and planetary atmosphere studies. JQSRT 109 (2008) 1043-1059. (2) Jacquinet-Husson N., N.A. Scott, A. Chédin, K. Garceran, R. Armante, et al.: The 2003 edition of the GEISA/IASI spectroscopic database. JQSRT 95 (2005) 429-467. (3) Scott, N.A. and A. Chedin: A fast line-by-line method for atmospheric absorption computations: the Automatized Atmospheric Absorption Atlas. J. Appl. Meteor. 20 (1981) 556-564.

  8. The Iranian National Geodata Revision Strategy and Realization Based on Geodatabase

    NASA Astrophysics Data System (ADS)

    Haeri, M.; Fasihi, A.; Ayazi, S. M.

    2012-07-01

    In recent years, the use of spatial databases for storing and managing spatial data has become a hot topic in the field of GIS. Accordingly, the National Cartographic Center of Iran (NCC) produces, from time to time, spatial data that are usually stored in databases. One of NCC's major projects was the design of the National Topographic Database (NTDB): NCC decided to create a national topographic database of the entire country based on 1:25,000 coverage maps. The NTDB standard was published in 1994 and its database was created at the same time. In the NTDB, geometric data were stored in MicroStation design format (DGN), with each feature holding a link to its attribute data (stored in a Microsoft Access file). NTDB files were also produced sheet-wise and stored in a file-based style. Besides map compilation, revision of existing maps has already been started. The key problems for NCC are the revision strategy, the NTDB file-based storage style, and operator challenges (NCC operators mostly prefer to edit and revise geometric data in CAD environments). A geodatabase solution for national geodata, based on the NTDB map files and the operators' revision preferences, is introduced and released herein. The proposed solution extends the traditional methods to a seamless spatial database that can be revised in CAD and GIS environments simultaneously. The proposed system is the common data framework for creating a central data repository for spatial data storage and management.

  9. Catalog of infrared observations. Part 1: Data

    NASA Technical Reports Server (NTRS)

    Gezari, Daniel Y.; Schmitz, Marion; Mead, Jaylee M.

    1987-01-01

    The Catalog of Infrared Observations (CIO) is a compilation of infrared astronomical observational data obtained from an extensive literature search of astronomical journals and major astronomical catalogs and surveys. The literature searches are complete for 1965 through 1986 in this Second Edition. The Catalog is published in two parts, with the observational data (roughly 200,000 observations of 20,000 individual sources) listed in Part I, and supporting appendices in Part II. The expanded Second Edition contains a new feature: complete IRAS 4-band data for all CIO sources detected, listed with the main Catalog observations, as well as in complete detail in the Appendix. The appendices include an atlas of infrared source positions, two bibliographies of infrared literature upon which the search was based, and, keyed to the main Catalog listings (organized alphabetically by author and then chronologically), an atlas of infrared spectral ranges, and IRAS data from the CIO sources. The complete CIO database is available to qualified users in printed microfiche and magnetic tape formats.

  10. Photogrammetric mapping for cadastral land information systems

    NASA Astrophysics Data System (ADS)

    Muzakidis, Panagiotis D.

    The creation of a "clean" digital database is a most important and complex task, upon which the usefulness of a Parcel-Based Land Information System depends. Capturing data by photogrammetric methods for cadastral purposes necessitates the transformation of data into a computer-compatible form. Such input requires the encoding, editing and structuring of data. The research is carried out in two phases: the first is concerned with defining the data modelling schemes and the classification of basic data for a parcel-based land information system, together with the photogrammetric methods to be adopted to collect these data. The second deals with data editing and data structuring processes in order to produce "clean" information relevant to such a system. Implementation of the proposed system at both the data collection stage and within the data processing stage itself demands a number of flexible criteria to be defined within the methodology. Development of these criteria will include consideration of the cadastral characteristics peculiar to Greece.

  11. Predatory Publishing Is a Threat to Non-Mainstream Science

    PubMed Central

    Nurmashev, Bekaidar

    2017-01-01

    This article highlights the issue of wasteful publishing practices that primarily affect non-mainstream science countries and rapidly growing academic disciplines. Numerous start-up open access publishers with soft or nonexistent quality checks and huge commercial interests have created a global crisis in the publishing market. Their publishing practices have been thoroughly examined, leading to the blacklisting of many journals by Jeffrey Beall. However, it appears that some subscription journals are also falling short of adhering to the international recommendations of global editorial associations. Unethical editing agencies that promote their services in non-mainstream science countries create more problems for inexperienced authors. It is suggested to regularly monitor the quality of already indexed journals and upgrade criteria of covering new sources by the Emerging Sources Citation Index (Web of Science), Scopus, and specialist bibliographic databases. Regional awareness campaigns to inform stakeholders of science communication about the importance of ethical writing, transparency of editing services, and permanent archiving can be also helpful for eradicating unethical publishing practices. PMID:28378542

  12. Predatory Publishing Is a Threat to Non-Mainstream Science.

    PubMed

    Gasparyan, Armen Yuri; Nurmashev, Bekaidar; Udovik, Elena E; Koroleva, Anna M; Kitas, George D

    2017-05-01

    This article highlights the issue of wasteful publishing practices that primarily affect non-mainstream science countries and rapidly growing academic disciplines. Numerous start-up open access publishers with soft or nonexistent quality checks and huge commercial interests have created a global crisis in the publishing market. Their publishing practices have been thoroughly examined, leading to the blacklisting of many journals by Jeffrey Beall. However, it appears that some subscription journals are also falling short of adhering to the international recommendations of global editorial associations. Unethical editing agencies that promote their services in non-mainstream science countries create more problems for inexperienced authors. It is suggested to regularly monitor the quality of already indexed journals and upgrade criteria of covering new sources by the Emerging Sources Citation Index (Web of Science), Scopus, and specialist bibliographic databases. Regional awareness campaigns to inform stakeholders of science communication about the importance of ethical writing, transparency of editing services, and permanent archiving can be also helpful for eradicating unethical publishing practices. © 2017 The Korean Academy of Medical Sciences.

  13. References that anyone can edit: review of Wikipedia citations in peer reviewed health science literature

    PubMed Central

    Hladkowicz, Emily S; Pigford, Ashlee-Ann E; Ufholz, Lee-Anne; Postonogova, Tatyana; Shin, Eunkyung; Boet, Sylvain

    2014-01-01

    Objectives To examine indexed health science journals to evaluate the prevalence of Wikipedia citations, identify the journals that publish articles with Wikipedia citations, and determine how Wikipedia is being cited. Design Bibliometric analysis. Study selection Publications in the English language that included citations to Wikipedia were retrieved using the online databases Scopus and Web of Science. Data sources To identify health science journals, results were refined using Ulrich’s database, selecting for citations from journals indexed in Medline, PubMed, or Embase. Using Thomson Reuters Journal Citation Reports, 2011 impact factors were collected for all journals included in the search. Data extraction Resulting citations were thematically coded, and descriptive statistics were calculated. Results 1433 full text articles from 1008 journals indexed in Medline, PubMed, or Embase with 2049 Wikipedia citations were accessed. The frequency of Wikipedia citations has increased over time; most citations occurred after December 2010. More than half of the citations were coded as definitions (n=648; 31.6%) or descriptions (n=482; 23.5%). Citations were not limited to journals with a low or no impact factor; the search found Wikipedia citations in many journals with high impact factors. Conclusions Many publications are citing information from a tertiary source that can be edited by anyone, although permanent, evidence based sources are available. We encourage journal editors and reviewers to use caution when publishing articles that cite Wikipedia. PMID:24603564

  14. Seventh edition (2010) of the AJCC/UICC staging system for gastric adenocarcinoma: is there room for improvement?

    PubMed

    Patel, Manali I; Rhoads, Kim F; Ma, Yifei; Ford, James M; Visser, Brendan C; Kunz, Pamela L; Fisher, George A; Chang, Daniel T; Koong, Albert; Norton, Jeffrey A; Poultsides, George A

    2013-05-01

    The gastric cancer AJCC/UICC staging system recently underwent significant revisions, but studies on Asian patients have reported a lack of adequate discrimination between various consecutive stages. We sought to validate the new system on a U.S. population database. California Cancer Registry data linked to the Office of Statewide Health Planning and Development discharge abstracts were used to identify patients with gastric adenocarcinoma (esophagogastric junction and gastric cardia tumors excluded) who underwent curative-intent surgical resection in California from 2002 to 2006. AJCC/UICC stage was recalculated based on the latest seventh edition. Overall survival probabilities were calculated using the Kaplan-Meier method. Of 1905 patients analyzed, 54% were males with a median age of 70 years. Median number of pathologically examined lymph nodes was 12 (range, 1-90); 40% of patients received adjuvant chemotherapy, and 31% received adjuvant radiotherapy. The seventh edition AJCC/UICC system did not distinguish outcome adequately between stages IB and IIA (P = 0.40), or IIB and IIIA (P = 0.34). By merging stage II into one category and moving T2N1 to stage IB and T2N2, T1N3 to stage IIIA, we propose a new grouping system with improved discriminatory ability. In this first study validating the new seventh edition AJCC/UICC staging system for gastric cancer on a U.S. population with a relatively limited number of lymph nodes examined, we found stages IB and IIA, as well as IIB and IIIA, to perform similarly. We propose a revised stage grouping for the AJCC/UICC staging system that better discriminates between outcomes.
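
    The survival comparisons above rest on Kaplan-Meier estimates; a minimal numpy sketch of the estimator follows, with made-up follow-up times and event flags standing in for registry data.

        import numpy as np

        def kaplan_meier(time, event):
            """Kaplan-Meier survival estimate.
            time  : follow-up time for each patient
            event : 1 if the event (death) was observed, 0 if censored
            Returns the distinct event times and S(t) at those times."""
            time = np.asarray(time, dtype=float)
            event = np.asarray(event, dtype=int)
            event_times = np.unique(time[event == 1])
            survival, s = [], 1.0
            for t in event_times:
                n_at_risk = np.sum(time >= t)            # still under observation at t
                d = np.sum((time == t) & (event == 1))   # deaths at t
                s *= 1.0 - d / n_at_risk
                survival.append(s)
            return event_times, np.array(survival)

        # Illustrative follow-up times (months) and event indicators, not registry data
        t_obs = [6, 12, 12, 20, 25, 31, 40, 44, 60, 60]
        d_obs = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
        times, surv = kaplan_meier(t_obs, d_obs)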

  15. Estimates of global, regional, and national annual CO{sub 2} emissions from fossil-fuel burning, hydraulic cement production, and gas flaring: 1950--1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boden, T.A.; Marland, G.; Andres, R.J.

    1995-12-01

    This document describes the compilation, content, and format of the most comprehensive CO2-emissions database currently available. The database includes global, regional, and national annual estimates of CO2 emissions resulting from fossil-fuel burning, cement manufacturing, and gas flaring in oil fields for 1950-92, as well as the energy production, consumption, and trade data used for these estimates. The methods of Marland and Rotty (1983) are used to calculate these emission estimates. For the first time, the methods and data used to calculate CO2 emissions from gas flaring are presented. This CO2-emissions database is useful for carbon-cycle research, provides estimates of the rate at which fossil-fuel combustion has released CO2 to the atmosphere, and offers baseline estimates for those countries compiling 1990 CO2-emissions inventories.
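
    The Marland and Rotty approach estimates emissions from each fuel as apparent consumption multiplied by an effective carbon content and by the fraction of that carbon oxidized. The Python sketch below uses that general form; the fuel categories and numeric factors are illustrative placeholders, not the actual coefficients of the 1983 method or of this database.

        # Hypothetical fuel data: apparent consumption in exajoules (EJ),
        # carbon emission factor in MtC per EJ, and fraction of carbon oxidized.
        fuels = {
            #          consumption_EJ  MtC_per_EJ  fraction_oxidized
            "solid":  (95.0,           25.0,       0.98),
            "liquid": (130.0,          20.0,       0.99),
            "gas":    (75.0,           15.0,       0.98),
        }

        def annual_emissions_MtC(fuel_table):
            """Sum over fuels: emissions = consumption * carbon content * fraction oxidized."""
            return sum(c * fc * fox for c, fc, fox in fuel_table.values())

        total_MtC = annual_emissions_MtC(fuels)
        total_MtCO2 = total_MtC * 44.0 / 12.0   # convert mass of carbon to mass of CO2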

  16. Geologic map of outcrop areas of sedimentary units in the eastern part of the Hailey 1 degree x 2 degrees quadrangle and part of the southern part of the Challis 1 degree x 2 degrees quadrangle, south-central Idaho

    USGS Publications Warehouse

    Link, P.K.; Mahoney, J.B.; Bruner, D.J.; Batatian, L.D.; Wilson, Eric; Williams, F.J.C.

    1995-01-01

    The paper version of the Geologic map of outcrop areas of sedimentary units in the eastern part of the Hailey 1° x 2° Quadrangle and part of the southern part of the Challis 1° x 2° Quadrangle, south-central Idaho was compiled by Paul Link and others in 1995. The plate was compiled on a 1:100,000-scale topographic base map. TechniGraphic System, Inc. of Fort Collins, Colorado, digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a GIS database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.

  17. Evaluation of scientific periodicals and the Brazilian production of nursing articles.

    PubMed

    Erdmann, Alacoque Lorenzini; Marziale, Maria Helena Palucci; Pedreira, Mavilde da Luz Gonçalves; Lana, Francisco Carlos Félix; Pagliuca, Lorita Marlena Freitag; Padilha, Maria Itayra; Fernandes, Josicelia Dumêt

    2009-01-01

    This study aimed to identify nursing journals edited in Brazil indexed in the main bibliographic databases in the areas of health and nursing. It also aimed to classify the production of nursing graduate programs in 2007 according to the QUALIS/CAPES criteria used to classify scientific periodicals that disseminate the intellectual production of graduate programs in Brazil. This exploratory study used data from reports and documents available from CAPES to map scientific production and from searching the main international and national indexing databases. The findings from this research can help students, professors and coordinators of graduate programs in several ways: to understand the criteria of classifying periodicals; to be aware of the current production of graduate programs in the area of nursing; and to provide information that authors can use to select periodicals in which to publish their articles.

  18. [The Chilean Association of Biomedical Journal Editors].

    PubMed

    Reyes, H

    2001-01-01

    On September 29th, 2000, The Chilean Association of Biomedical Journal Editors was founded, sponsored by the "Comisión Nacional de Investigación Científica y Tecnológica (CONICYT)" (the Governmental Agency promoting and funding scientific research and technological development in Chile) and the "Sociedad Médica de Santiago" (Chilean Society of Internal Medicine). The Association adopted the goals of the World Association of Medical Editors (WAME) and therefore it will foster "cooperation and communication among Editors of Chilean biomedical journals; to improve editorial standards, to promote professionalism in medical editing through education, self-criticism and self-regulation; and to encourage research on the principles and practice of medical editing". Twenty nine journals covering a closely similar number of different biomedical sciences, medical specialties, veterinary, dentistry and nursing, became Founding Members of the Association. A Governing Board was elected: President: Humberto Reyes, M.D. (Editor, Revista Médica de Chile); Vice-President: Mariano del Sol, M.D. (Editor, Revista Chilena de Anatomía); Secretary: Anna María Prat (CONICYT); Councilors: Manuel Krauskopff, Ph.D. (Editor, Biological Research) and Maritza Rahal, M.D. (Editor, Revista de Otorrinolaringología y Cirugía de Cabeza y Cuello). The Association will organize a Symposium on Biomedical Journal Editing and will spread information stimulating Chilean biomedical journals to become indexed in international databases and in SciELO-Chile, the main Chilean scientific website (www.scielo.cl).

  19. A study of ignition phenomena of bulk metals by radiant heating

    NASA Technical Reports Server (NTRS)

    Branch, Melvin C.; Abbud-Madrid, A.; Feiereisen, T. J.; Daily, J. W.

    1993-01-01

    Early research on combustion of metals was motivated by the knowledge of the large heat release and corresponding high temperatures associated with metal-oxygen reactions. The advent of space flight brought about an increased interest in the ignition and combustion of metallic particles as additives in solid rocket propellants. More recently, attention has been given to the flammability properties of bulk, structural metals due to the number of accidental explosions of metal components in high-pressure oxygen systems. The following work represents a preliminary study that is part of a broader research effort aimed at providing further insight into the phenomena of bulk metal combustion by looking at the effects of gravity on the ignition behavior of metals. The scope of this preliminary experimental study includes the use of a non-coherent, continuous radiation ignition source, the measurement of temperature profiles of a variety of metals and a qualitative observation of the ignition phenomena at normal gravity. The specific objectives of the investigation include: (1) a feasibility study of the use of a continuous radiation source for metal ignition; (2) testing and characterization of the ignition behavior of a variety of metals; and (3) building a preliminary experimental database on ignition of metals under normal gravity conditions.

  20. Application of Boiler Op for combustion optimization at PEPCO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maines, P.; Williams, S.; Levy, E.

    1997-09-01

    Title IV requires the reduction of NOx at all stations within the PEPCO system. To assist PEPCO plant personnel in achieving low heat rates while meeting NOx targets, Lehigh University's Energy Research Center and PEPCO developed a new combustion optimization software package called Boiler Op. The Boiler Op code contains an expert system, neural networks and an optimization algorithm. The expert system guides the plant engineer through a series of parametric boiler tests, required for the development of a comprehensive boiler database. The data are then analyzed by the neural networks and optimization algorithm to provide results on the boiler control settings which result in the best possible heat rate at a target NOx level or produce minimum NOx. Boiler Op has been used at both Potomac River and Morgantown Stations to help PEPCO engineers optimize combustion. With the use of Boiler Op, Morgantown Station operates under low NOx restrictions and continues to achieve record heat rate values, similar to pre-retrofit conditions. Potomac River Station achieves the regulatory NOx limit through the use of Boiler Op recommended control settings and without NOx burners. Importantly, any software like Boiler Op cannot be used alone. Its application must be in concert with human intelligence to ensure unit safety, reliability and accurate data collection.
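
    Boiler Op's internals are described only as an expert system, neural networks, and an optimization algorithm, so the sketch below is a generic Python illustration of that pattern rather than the actual code: fit (or assume) surrogate models of heat rate and NOx from parametric-test data, then search the control settings for the lowest heat rate subject to a NOx target. The control variables, bounds, target, and surrogate shapes are all hypothetical.

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical surrogates standing in for neural networks trained on
        # parametric boiler-test data. x = [excess O2 (%), burner tilt (deg)].
        def heat_rate(x):                 # Btu/kWh, lower is better
            o2, tilt = x
            return 9800 + 40 * (o2 - 3.2) ** 2 + 6 * (tilt + 5) ** 2

        def nox(x):                       # lb/MBtu
            o2, tilt = x
            return 0.45 + 0.06 * (o2 - 2.0) - 0.004 * tilt

        nox_target = 0.50                 # hypothetical regulatory target

        result = minimize(
            heat_rate,
            x0=np.array([3.5, 0.0]),
            bounds=[(2.0, 5.0), (-15.0, 15.0)],
            constraints=[{"type": "ineq", "fun": lambda x: nox_target - nox(x)}],
            method="SLSQP",
        )
        best_o2, best_tilt = result.x     # recommended settings under the surrogate models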

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, J.; Larson, E.M.; Holt, J.B.

    Real-time synchrotron diffraction has been used to monitor the phase transformations of highly exothermic, fast self-propagating solid combustion reactions on a subsecond time scale down to 100 milliseconds and in some instances to 10 milliseconds. Three systems were investigated: Ti + C → TiC; Ti + C + xNi → TiC + Ni-Ti alloy; and Al + Ni → AlNi. In all three reactions, the first step was the melting of the metal reactants. Formation of TiC in the first two reactions was completed within 400 milliseconds of the melting of the Ti metal, indicating that the formation of TiC took place during the passage of the combustion wave front. In the Al + Ni reaction, however, passage of the wave front was followed by the appearance and disappearance of at least one intermediate in the afterburn region. The final AlNi was formed some 5 seconds later and exhibited a delayed appearance of the (210) reflection, which tends to support a phase transformation from a disordered AlNi phase at high temperature to an ordered CsCl structure some 20 seconds later. This new experimental approach can be used to study the chemical dynamics of high-temperature solid-state phenomena and to provide the needed database to test various models for solid combustion. 28 refs., 4 figs.

  2. Exergie /4th revised and enlarged edition/

    NASA Astrophysics Data System (ADS)

    Baloh, T.; Wittwer, E.

    The theoretical concept of exergy is explained and its practical applications are discussed. Equilibrium and thermal equilibrium are reviewed as background, and exergy is considered as a reference point for solid-liquid, liquid-liquid, and liquid-gas systems. Exergetic calculations and their graphic depictions are covered. The concepts of enthalpy and entropy are reviewed in detail, including their applications to gas mixtures, solutions, and isolated substances. The exergy of gas mixtures, solutions, and isolated substances is discussed, including moist air, liquid water in water vapor, dry air, and saturation-limited solutions. Mollier exergy-enthalpy-entropy diagrams are presented for two-component systems, and exergy losses for throttling, isobaric mixing, and heat transfer are addressed. The relationship of exergy to various processes is covered, including chemical processes, combustion, and nuclear reactions. The optimization of evaporation plants through exergy is discussed. Calculative examples are presented for energy production and heating, industrial chemical processes, separation of liquid air, nuclear reactors, and others.
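
    As a compact reminder of the central quantity, the specific flow exergy of a stream relative to a dead state at T_0, p_0 is commonly written as follows (standard textbook form, not a formula quoted from this edition):

        e = (h - h_0) - T_0 (s - s_0)

    and the exergy destroyed in a process follows from the Gouy-Stodola relation E_destroyed = T_0 S_gen, which underlies the loss analyses for throttling, isobaric mixing, and heat transfer mentioned above.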

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMeeking, Gavin R.; Kreidenweis, Sonia M.; Baker, Stephen

    We characterized the gas- and speciated aerosol-phase emissions from the open combustion of 33 different plant species during a series of 255 controlled laboratory burns during the Fire Laboratory at Missoula Experiments (FLAME). The plant species we tested were chosen to improve the existing database for U.S. domestic fuels: laboratory-based emission factors have not previously been reported for many commonly-burned species that are frequently consumed by fires near populated regions and protected scenic areas. The plants we tested included the chaparral species chamise, manzanita, and ceanothus, and species common to the southeastern US (common reed, hickory, kudzu, needlegrass rush, rhododendron, cord grass, sawgrass, titi, and wax myrtle). Fire-integrated emission factors for gas-phase CO2, CO, CH4, C2-C4 hydrocarbons, NH3, SO2, NO, NO2, HNO3 and particle-phase organic carbon (OC), elemental carbon (EC), SO4^2-, NO3^-, Cl^-, Na^+, K^+, and NH4^+ generally varied with both fuel type and with the fire-integrated modified combustion efficiency (MCE), a measure of the relative importance of flaming- and smoldering-phase combustion to the total emissions during the burn. Chaparral fuels tended to emit less particulate OC per unit mass of dry fuel than did other fuel types, whereas southeastern species had some of the largest observed EF for total fine particulate matter. Our measurements often spanned a larger range of MCE than prior studies, and thus help to improve estimates for individual fuels of the variation of emissions with combustion conditions.
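
    MCE as used above is the fire-integrated ratio of excess CO2 to excess CO2 plus excess CO. The short Python sketch below computes it, together with a simple fuel-mass-based emission factor, from hypothetical background-corrected values, not FLAME data.

        # Hypothetical fire-integrated excess mixing ratios (background-subtracted).
        delta_co2 = 850.0    # micromol/mol above background
        delta_co = 55.0

        # Modified combustion efficiency: values near 1 indicate mostly flaming combustion.
        mce = delta_co2 / (delta_co2 + delta_co)

        # A simple emission factor: grams of species emitted per kg of dry fuel burned,
        # from the plume-integrated species mass and the dry fuel mass consumed (both hypothetical).
        species_mass_g = 3.2
        dry_fuel_kg = 0.25
        ef_g_per_kg = species_mass_g / dry_fuel_kg

        print(f"MCE = {mce:.3f}, EF = {ef_g_per_kg:.1f} g per kg dry fuel")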

  4. Theoretical Innovations in Combustion Stability Research: Integrated Analysis and Computation

    DTIC Science & Technology

    2011-04-14

    Quirk JJ, Shepherd JE (1997) An analytical model for direct initiation of gaseous detonation waves, in 21st International Symposium on Shock Waves...the initial vorticity thickness, hi is here performed over (x1, x3) planes and ∆U0 is the initial velocity difference across the layer. In all cases...Reynolds numbers were 1452, 1507 and 2004. Selle et al. [9] showed that this database is relevant for fully-turbulent flow modeling . VI. RESULTS In all

  5. Evaluation of consumer drug information databases.

    PubMed

    Choi, J A; Sullivan, J; Pankaskie, M; Brufsky, J

    1999-01-01

    To evaluate prescription drug information contained in six consumer drug information databases available on CD-ROM, and to make health care professionals aware of the information provided, so that they may appropriately recommend these databases for use by their patients. Observational study of six consumer drug information databases: The Corner Drug Store, Home Medical Advisor, Mayo Clinic Family Pharmacist, Medical Drug Reference, Mosby's Medical Encyclopedia, and PharmAssist. Not applicable. Not applicable. Information on 20 frequently prescribed drugs was evaluated in each database. The databases were ranked using a point-scale system based on primary and secondary assessment criteria. For the primary assessment, 20 categories of information based on those included in the 1998 edition of the USP DI Volume II, Advice for the Patient: Drug Information in Lay Language were evaluated for each of the 20 drugs, and each database could earn up to 400 points (for example, 1 point was awarded if the database mentioned a drug's mechanism of action). For the secondary assessment, the inclusion of 8 additional features that could enhance the utility of the databases was evaluated (for example, 1 point was awarded if the database contained a picture of the drug), and each database could earn up to 8 points. The results of the primary and secondary assessments, listed in order of highest to lowest number of points earned, are as follows: Primary assessment--Mayo Clinic Family Pharmacist (379), Medical Drug Reference (251), PharmAssist (176), Home Medical Advisor (113.5), The Corner Drug Store (98), and Mosby's Medical Encyclopedia (18.5); secondary assessment--The Mayo Clinic Family Pharmacist (8), The Corner Drug Store (5), Mosby's Medical Encyclopedia (5), Home Medical Advisor (4), Medical Drug Reference (4), and PharmAssist (3). The Mayo Clinic Family Pharmacist was the most accurate and complete source of prescription drug information based on the USP DI Volume II and would be an appropriate database for health care professionals to recommend to patients.

  6. Tabulated Combustion Model Development For Non-Premixed Flames

    NASA Astrophysics Data System (ADS)

    Kundu, Prithwish

    Turbulent non-premixed flames play a very important role in the field of engineering, ranging from power generation to propulsion. The coupling of fluid mechanics and the complicated combustion chemistry of fuels poses a challenge for the numerical modeling of this type of problem. Combustion modeling in Computational Fluid Dynamics (CFD) is one of the most important tools used for predictive modeling of complex systems and for understanding the fundamentals of combustion. Traditional combustion models solve a transport equation for each species with a source term. In order to resolve the complex chemistry accurately it is important to include a large number of species. However, the computational cost is generally proportional to the cube of the number of species. The presence of a large number of species in a flame makes the use of CFD computationally expensive and beyond reach for some applications, or inaccurate when solved with simplified chemistry. For highly turbulent flows, it also becomes important to incorporate the effects of turbulence-chemistry interaction (TCI). The aim of this work is to develop high-fidelity combustion models based on the flamelet concept and to significantly advance the existing capabilities. A thorough investigation of existing models (finite-rate chemistry and the Representative Interactive Flamelet (RIF) model) and a comparative study of combustion models were done initially on a constant-volume combustion chamber with diesel fuel injection. The CFD modeling was validated with experimental results and was also successfully applied to a single-cylinder diesel engine. The effect of the number of flamelets on the RIF model and flamelet initialization strategies were studied. The RIF model with multiple flamelets is computationally expensive, so a new model was proposed within the framework of RIF. The new model was based on tabulated chemistry and incorporated TCI effects. A multidimensional tabulated chemistry database generation code was developed based on a 1D diffusion flame solver. The proposed model did not use progress variables like the traditional chemistry tabulation methods. The resulting model demonstrated an order-of-magnitude computational speed-up over the RIF model. The results were validated across a wide range of operating conditions for diesel injections and were in close agreement with the experimental data. The history of scalar dissipation rates plays a very important role in non-premixed flames. However, tabulated methods have not been able to incorporate this physics in their models. A comparative approach is developed that can quantify these effects and find correlations with flow variables. A new model is proposed to include these effects in tabulated combustion models. The model is initially validated for 1D counterflow diffusion flame problems at engine conditions. The model is further implemented and validated in a 3D RANS code across a range of operating conditions for spray flames.
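
    One common way tabulated flamelet models fold in turbulence-chemistry interaction is to convolve flamelet profiles with a presumed beta PDF of mixture fraction constructed from its local mean and variance. The Python sketch below illustrates that single step with a made-up temperature profile; it is a generic presumed-PDF example, not the specific model developed in this work.

        import numpy as np
        from scipy.stats import beta
        from scipy.integrate import trapezoid

        def presumed_beta_pdf_mean(phi_of_Z, Z_mean, Z_var, n=201):
            """Mean of a flamelet quantity phi(Z) under a beta PDF of mixture
            fraction Z with the given mean and variance
            (requires Z_var < Z_mean * (1 - Z_mean))."""
            shape = Z_mean * (1.0 - Z_mean) / Z_var - 1.0
            a, b = Z_mean * shape, (1.0 - Z_mean) * shape
            Z = np.linspace(1e-6, 1.0 - 1e-6, n)
            pdf = beta.pdf(Z, a, b)
            return trapezoid(phi_of_Z(Z) * pdf, Z) / trapezoid(pdf, Z)

        # Made-up flamelet temperature profile peaking near a stoichiometric Z of 0.06
        def T_flamelet(Z):
            return 300.0 + 1900.0 * np.exp(-((Z - 0.06) / 0.05) ** 2)

        T_mean = presumed_beta_pdf_mean(T_flamelet, Z_mean=0.06, Z_var=0.002)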

  7. Digital Archiving: Where the Past Lives Again

    NASA Astrophysics Data System (ADS)

    Paxson, K. B.

    2012-06-01

    The process of digital archiving of variable star data by manual entry into an Excel spreadsheet is described. Excel-based tools, including a Step Magnitude Calculator and a Julian Date Calculator, are presented for variable star observations whose magnitudes and Julian dates have not been reduced. Variable star data in the literature and in the AAVSO International Database prior to 1911 are presented and reviewed, with recent archiving work highlighted. Digitization using optical character recognition (OCR) software is also demonstrated, with editing and formatting suggestions for the OCR-converted text.
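
    For readers reducing historical observations, the short sketch below shows the standard Fliegel and Van Flandern (1968) conversion from a Gregorian calendar date and UT time to a Julian Date; it mirrors what a spreadsheet Julian Date Calculator does, but it is a generic illustration rather than the Excel tool described above.

      # Convert a Gregorian calendar date + UT time to a Julian Date.
      # Uses the Fliegel & Van Flandern (1968) integer algorithm for the day number.
      def julian_date(year, month, day, hour=0, minute=0, second=0.0):
          a = -1 if month <= 2 else 0     # shift Jan/Feb to the end of the previous year
          jdn = (day - 32075
                 + (1461 * (year + 4800 + a)) // 4
                 + (367 * (month - 2 - 12 * a)) // 12
                 - (3 * ((year + 4900 + a) // 100)) // 4)
          # jdn is the Julian Day Number at 12:00 UT; add the fractional day
          return jdn + (hour - 12) / 24 + minute / 1440 + second / 86400

      # Sanity checks against well-known epochs
      print(julian_date(2000, 1, 1, 12))   # 2451545.0 (J2000.0)
      print(julian_date(1910, 4, 20))      # 2418781.5 (midnight UT)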

  8. Simulations to Evaluate Accuracy and Patient Dose in Neutron-Stimulated, Emission-Computed Tomography (NSECT) for Diagnosis of Breast Cancer

    DTIC Science & Technology

    2009-04-01

    table 1 and includes all of the excited state transitions that are included in the BNL database. In this study we considered only states that could be... [16] Cross Section Evaluation Working Group, ENDF/B-VI Summary Documentation, Report BNL-NCS-17541 (ENDF-201), edited by P. F. Rose, National Nuclear Data... Med. Phys., vol. 34, pp. 3866–3871, 2007. [37] National Nuclear Data Center, BNL, NuDat 2.3, 2007. [38] A. J. Kapadia and C. E. Floyd, "An attenuation

  9. Geologic Map of the Mount Trumbull 30' X 60' Quadrangle, Mohave and Coconino Counties, Northwestern Arizona

    USGS Publications Warehouse

    Billingsley, George H.; Wellmeyer, Jessica L.

    2003-01-01

    The geologic map of the Mount Trumbull 30' x 60' quadrangle is a cooperative product of the U.S. Geological Survey, the National Park Service, and the Bureau of Land Management that provides geologic map coverage and regional geologic information for visitor services and resource management of Grand Canyon National Park, Lake Mead Recreational Area, and Grand Canyon Parashant National Monument, Arizona. This digital database, a compilation of previous and new geologic mapping, contains the geologic data used to produce the 1:100,000-scale Geologic Map of the Mount Trumbull 30' x 60' Quadrangle, Mohave and Coconino Counties, Northwestern Arizona. The geologic features that were mapped as part of this project include geologic contacts and faults, bedrock and surficial geologic units, structural data, fold axes, karst features, mines, and volcanic features. The map was produced using 1:24,000-scale 1976 infrared aerial photographs followed by extensive field checking. Volcanic rocks were mapped as separate units when identified on aerial photographs as mappable and distinctly separate units associated with one or more pyroclastic cones and flows. Many of the Quaternary alluvial deposits that have similar lithology but different geomorphic characteristics were mapped almost entirely by photogeologic methods. Stratigraphic position and amount of erosional degradation were used to determine the relative ages of alluvial deposits having similar lithologies. Each map unit and structure was investigated in detail in the field to ensure accuracy of description. Punch-registered mylar sheets were scanned at the Flagstaff Field Center using an Optronics 5040 raster scanner at a resolution of 50 microns (508 dpi). The scans were output in .rle format, converted to .rlc, and then converted to ARC/INFO grids. A tic file was created in geographic coordinates and projected into the base map projection (Polyconic) using a central meridian of -113.500. The tic file was used to transform the grid into the Universal Transverse Mercator projection. The linework was vectorized using gridline. Scanned lines were edited interactively in ArcEdit. Polygons were attributed in ArcEdit, and all artifacts and scanning errors visible at 1:100,000 were removed. Point data were digitized onscreen. Due to the discovery of digital and geologic errors in the original files, the ARC/INFO coverages were converted to a personal geodatabase and corrected in ArcMap. The feature classes that define the geologic units, lines, and polygons are topologically related and maintained in the geodatabase by a set of validation rules. The internal database structure and feature attributes were then modified to match other geologic map databases being created for the Grand Canyon region. Faults were edited with the downthrown block, if known, on the 'right side' of the line. The 'right' and 'left' sides of a line are determined by 'starting' at the line's 'from node' and moving to the line's end or 'to node'.

  10. The 2015 edition of the GEISA spectroscopic database

    NASA Astrophysics Data System (ADS)

    Jacquinet-Husson, N.; Armante, R.; Scott, N. A.; Chédin, A.; Crépeau, L.; Boutammine, C.; Bouhdaoui, A.; Crevoisier, C.; Capelle, V.; Boonne, C.; Poulet-Crovisier, N.; Barbe, A.; Chris Benner, D.; Boudon, V.; Brown, L. R.; Buldyreva, J.; Campargue, A.; Coudert, L. H.; Devi, V. M.; Down, M. J.; Drouin, B. J.; Fayt, A.; Fittschen, C.; Flaud, J.-M.; Gamache, R. R.; Harrison, J. J.; Hill, C.; Hodnebrog, Ø.; Hu, S.-M.; Jacquemart, D.; Jolly, A.; Jiménez, E.; Lavrentieva, N. N.; Liu, A.-W.; Lodi, L.; Lyulin, O. M.; Massie, S. T.; Mikhailenko, S.; Müller, H. S. P.; Naumenko, O. V.; Nikitin, A.; Nielsen, C. J.; Orphal, J.; Perevalov, V. I.; Perrin, A.; Polovtseva, E.; Predoi-Cross, A.; Rotger, M.; Ruth, A. A.; Yu, S. S.; Sung, K.; Tashkun, S. A.; Tennyson, J.; Tyuterev, Vl. G.; Vander Auwera, J.; Voronin, B. A.; Makie, A.

    2016-09-01

    The GEISA database (Gestion et Etude des Informations Spectroscopiques Atmosphériques: Management and Study of Atmospheric Spectroscopic Information) has been developed and is maintained at LMD (http://ara.abct.lmd.polytechnique.fr). The "line parameters database" contains 52 molecular species (118 isotopologues) and transitions in the spectral range from 10^-6 to 35,877.031 cm^-1, representing 5,067,351 entries, against 3,794,297 in GEISA-2011. Among the previously existing molecules, 20 molecular species have been updated. A new molecule (SO3) has been added, and HDO, an isotopologue of H2O, is now identified as an independent molecular species. Seven new isotopologues have been added to the GEISA-2015 database. The "cross-section sub-database" has been enriched by the addition of 43 new molecular species in its infrared part; 4 molecules (ethane, propane, acetone, acetonitrile) have also been updated, representing 3% of the update. A new section has been added in the near-infrared spectral region, involving 7 molecular species: CH3CN, CH3I, CH3O2, H2CO, HO2, HONO, NH3. The "microphysical and optical properties of atmospheric aerosols sub-database" has been updated for the first time since 2003. It contains more than 40 species originating from NCAR and 20 from the Oxford ARIA archive (http://eodg.atm.ox.ac.uk/ARIA/introduction_nocol.html). As for the previous versions, this new release of GEISA and the associated management software facilities are implemented and freely accessible at http://cds-espri.ipsl.fr/etherTypo/?id=950.

  11. Spectroscopy for Industrial Applications: High-Temperature Processes

    NASA Astrophysics Data System (ADS)

    Fateev, Alexander; Grosch, Helge; Clausen, Sonnik; Barton, Emma J.; Yurchenko, Sergei N.; Tennyson, Jonathan

    2014-06-01

    The continuous development of spectroscopic databases opens new perspectives in environmental and industrial on-line process control and monitoring, and stimulates further optical sensor development. This is because no calibration gases are needed and, in general, the temperature-dependent spectral absorption features of the gases of interest for a specific instrument can in principle be calculated knowing only the gas temperature and pressure in the process under investigation or monitoring. The latest HITRAN-2012 database contains IR/UV spectral data for 47 molecules and is still growing. However, HITRAN is limited to low-temperature processes (< 400 K) and can therefore be used for absorption spectra calculations over only limited temperature/pressure ranges. For higher temperatures the HITEMP-2010 database is available, but it includes only a few molecules (CO2, H2O, CO and NO) that are of interest for, e.g., various combustion and astronomical applications. In recent years, several efforts towards the development of hot line lists have been made; some of these have been implemented in the latest HITRAN-2012 database. High-resolution absorption measurements of NH3 (IR, 0.1 cm-1) and phenol (UV, 0.019 nm) in a flow gas cell at up to 800 K are presented. Both molecules are of great interest in various high-temperature environments, including exoplanets, combustion and gasification. Measured NH3 hot lines have been assigned and the spectra have been compared with calculations based on the BYTe hot line list. High-temperature NH3 absorption spectra have been used in the analysis of in situ high-resolution IR absorption measurements on producer gas from a large-scale low-temperature gasification process. High-resolution UV temperature-dependent absorption cross-sections of phenol are reported for the first time. All UV data have been calibrated against relevant GC/MS measurements. Use of the data is demonstrated by the analysis of in situ UV absorption measurements on a small-scale low-temperature gasifier. A comparison between in situ, gas-extraction and conventional gas-sampling measurements is presented. Overall the presentation shows an example of successful industrial and academic partnerships within the framework of ongoing national and international projects.
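
    As a generic illustration of how such line-parameter databases are used (not tied to the measurements described here), the sketch below sums a few Lorentzian lines from an invented line list into an absorption coefficient and applies the Beer-Lambert law to obtain the transmission along a path; a real calculation would take line positions, strengths, broadening parameters and their temperature dependence from HITRAN, HITEMP or a hot line list.

      # Toy line-by-line absorption calculation (illustrative values, not HITRAN data).
      import numpy as np

      nu = np.linspace(2280.0, 2320.0, 4000)    # wavenumber axis, cm^-1

      # Invented line list: (centre [cm^-1], strength [cm^-2/atm], HWHM [cm^-1])
      lines = [(2290.0, 0.08, 0.07), (2301.5, 0.15, 0.07), (2310.2, 0.04, 0.07)]

      def absorption_coefficient(nu, lines, pressure_atm, mole_fraction):
          """Sum of pressure-broadened Lorentzian lines, returning k(nu) in cm^-1."""
          k = np.zeros_like(nu)
          for nu0, S, gamma in lines:
              k += S * mole_fraction * pressure_atm * (gamma / np.pi) / ((nu - nu0) ** 2 + gamma ** 2)
          return k

      k = absorption_coefficient(nu, lines, pressure_atm=1.0, mole_fraction=0.1)
      path_cm = 20.0
      transmission = np.exp(-k * path_cm)       # Beer-Lambert law
      absorbance = k * path_cm
      print(f"peak absorbance: {absorbance.max():.2f}, minimum transmission: {transmission.min():.2f}")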

  12. SPLICE: A program to assemble partial query solutions from three-dimensional database searches into novel ligands

    NASA Astrophysics Data System (ADS)

    Ho, Chris M. W.; Marshall, Garland R.

    1993-12-01

    SPLICE is a program that processes partial query solutions retrieved from 3D structural databases to generate novel, aggregate ligands. It is designed to interface with the database-searching program FOUNDATION, which retrieves fragments containing any combination of a user-specified minimum number of matching query elements. SPLICE eliminates aspects of structures that are physically incapable of binding within the active site. Then, a systematic rule-based procedure is performed upon the remaining fragments to ensure receptor complementarity. All modifications are automated and remain transparent to the user. Ligands are then assembled by linking components into composite structures through overlapping bonds. As a control experiment, FOUNDATION and SPLICE were used to reconstruct a known HIV-1 protease inhibitor after it had been fragmented, reoriented, and added to a sham database of fifty different small molecules. To illustrate the capabilities of this program, a 3D search query containing the pharmacophoric elements of an aspartic proteinase-inhibitor crystal complex was searched using FOUNDATION against a subset of the Cambridge Structural Database. One hundred thirty-one compounds were retrieved, each containing any combination of at least four query elements. Compounds were automatically screened and edited for receptor complementarity. Numerous combinations of fragments were discovered that could be linked to form novel structures containing a greater number of pharmacophoric elements than any single retrieved fragment.

  13. Virus taxonomy: the database of the International Committee on Taxonomy of Viruses (ICTV).

    PubMed

    Lefkowitz, Elliot J; Dempsey, Donald M; Hendrickson, Robert Curtis; Orton, Richard J; Siddell, Stuart G; Smith, Donald B

    2018-01-04

    The International Committee on Taxonomy of Viruses (ICTV) is charged with the task of developing, refining, and maintaining a universal virus taxonomy. This task encompasses the classification of virus species and higher-level taxa according to the genetic and biological properties of their members; naming virus taxa; maintaining a database detailing the currently approved taxonomy; and providing the database, supporting proposals, and other virus-related information from an open-access, public web site. The ICTV web site (http://ictv.global) provides access to the current taxonomy database in online and downloadable formats, and maintains a complete history of virus taxa back to the first release in 1971. The ICTV has also published the ICTV Report on Virus Taxonomy starting in 1971. This Report provides a comprehensive description of all virus taxa covering virus structure, genome structure, biology and phylogenetics. The ninth ICTV report, published in 2012, is available as an open-access online publication from the ICTV web site. The current, 10th report (http://ictv.global/report/), is being published online, and is replacing the previous hard-copy edition with a completely open access, continuously updated publication. No other database or resource exists that provides such a comprehensive, fully annotated compendium of information on virus taxa and taxonomy. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. Online Mendelian Inheritance in Man (OMIM), a knowledgebase of human genes and genetic disorders.

    PubMed

    Hamosh, Ada; Scott, Alan F; Amberger, Joanna; Bocchini, Carol; Valle, David; McKusick, Victor A

    2002-01-01

    Online Mendelian Inheritance in Man (OMIM) is a comprehensive, authoritative and timely knowledgebase of human genes and genetic disorders compiled to support research and education in human genomics and the practice of clinical genetics. Started by Dr Victor A. McKusick as the definitive reference Mendelian Inheritance in Man, OMIM (www.ncbi.nlm.nih.gov/omim) is now distributed electronically by the National Center for Biotechnology Information (NCBI), where it is integrated with the Entrez suite of databases. Derived from the biomedical literature, OMIM is written and edited at Johns Hopkins University with input from scientists and physicians around the world. Each OMIM entry has a full-text summary of a genetically determined phenotype and/or gene and has numerous links to other genetic databases such as DNA and protein sequence, PubMed references, general and locus-specific mutation databases, approved gene nomenclature, and the highly detailed mapviewer, as well as patient support groups and many others. OMIM is an easy and straightforward portal to the burgeoning information in human genetics.

  15. A Dual-Plane PIV Study of Turbulent Heat Transfer Flows

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.; Wroblewski, Adam C.; Locke, Randy J.

    2016-01-01

    Thin film cooling is a widely used technique in turbomachinery and rocket propulsion applications, where cool injection air protects a surface from hot combustion gases. The injected air typically has a different velocity and temperature from the free stream combustion flow, yielding a flow field with high turbulence and large temperature differences. These thin film cooling flows provide a good test case for evaluating computational model prediction capabilities. The goal of this work is to provide a database of flow field measurements for validating computational flow prediction models applied to turbulent heat transfer flows. In this work we describe the application of a Dual-Plane Particle Image Velocimetry (PIV) technique in a thin film cooling wind tunnel facility where the injection air stream velocity and temperatures are varied in order to provide benchmark turbulent heat transfer flow field measurements. The Dual-Plane PIV data collected include all three components of velocity and all three components of vorticity, spanning the width of the tunnel at multiple axial measurement planes.

  16. Characterisation of two-stage ignition in diesel engine-relevant thermochemical conditions using direct numerical simulation

    DOE PAGES

    Krisman, Alex; Hawkes, Evatt R.; Talei, Mohsen; ...

    2016-08-30

    With the goal of providing a more detailed fundamental understanding of ignition processes in diesel engines, this study reports analysis of a direct numerical simulation (DNS) database. In the DNS, a pseudo-turbulent mixing layer of dimethyl ether (DME) at 400 K and air at 900 K is simulated at a pressure of 40 atmospheres. At these conditions, DME exhibits a two-stage ignition and resides within the negative temperature coefficient (NTC) regime of ignition delay times, similar to diesel fuel. The analysis reveals a complex ignition process with several novel features. Autoignition occurs as a distributed, two-stage event. The high-temperature stage of ignition establishes edge flames that have a hybrid premixed/autoignition flame structure similar to that previously observed for lifted laminar flames at similar thermochemical conditions. In conclusion, a combustion mode analysis based on key radical species illustrates the multi-stage and multi-mode nature of the ignition process and highlights the substantial modelling challenge presented by diesel combustion.

  17. Measurement of transient gas flow parameters by diode laser absorption spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolshov, M A; Kuritsyn, Yu A; Liger, V V

    2015-04-30

    An absorption spectrometer based on diode lasers has been developed for measuring two-dimensional maps of the temperature and water vapour concentration distributions in the combustion zones of two mixing supersonic flows of fuel and oxidiser in a single run. The method for measuring the parameters of hot combustion zones is based on detection of transient spectra of water vapour absorption. The design of the spectrometer considerably reduces the influence of water vapour absorption along the path of the sensing laser beam outside the burning chamber. An optical scheme is developed that is capable of matching measurement results across different runs of mixture burning. A new algorithm is suggested for obtaining information about the mixture temperature by constructing correlation functions of the experimental spectrum with spectra simulated from databases. A two-dimensional map of the temperature distribution in a test chamber is obtained for the first time under conditions of plasma-induced combustion of an ethylene-air mixture. (laser applications and other topics in quantum electronics)
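
    The correlation-based temperature retrieval can be sketched as follows: simulate the absorption spectrum for a set of candidate temperatures, correlate each simulated spectrum with the measured one, and take the temperature that maximizes the correlation. The spectral model below is a stand-in (two Gaussian water-like lines whose relative strength varies with temperature), not the database-driven simulation used by the authors; all line positions and energies are invented.

      # Sketch of temperature retrieval by correlating a measured spectrum with
      # simulated spectra (toy two-line model; real work would use database line lists).
      import numpy as np

      nu = np.linspace(7180.0, 7186.0, 600)     # wavenumber axis, cm^-1

      def simulated_spectrum(T):
          """Toy absorbance: two lines whose strength ratio depends on temperature."""
          s1 = np.exp(-1500.0 / T)              # 'hot' line grows with temperature
          s2 = np.exp(-300.0 / T)               # low-energy line, weak T dependence
          line1 = s1 * np.exp(-((nu - 7181.2) / 0.08) ** 2)
          line2 = s2 * np.exp(-((nu - 7184.6) / 0.08) ** 2)
          return line1 + line2

      # Pretend measurement: spectrum at 1750 K plus noise
      rng = np.random.default_rng(0)
      measured = simulated_spectrum(1750.0) + 0.01 * rng.standard_normal(nu.size)

      candidates = np.arange(800.0, 2600.0, 10.0)
      corr = [np.corrcoef(measured, simulated_spectrum(T))[0, 1] for T in candidates]
      best_T = candidates[int(np.argmax(corr))]
      print(f"retrieved temperature: {best_T:.0f} K")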

  18. Experiment Management System for the SND Detector

    NASA Astrophysics Data System (ADS)

    Pugachev, K.

    2017-10-01

    We present a new experiment management system for the SND detector at the VEPP-2000 collider (Novosibirsk). An important part of the system is access to the experimental databases (configuration, conditions and metadata). The system has a client-server architecture, and user interaction takes place through a web interface. The server side includes several logical layers: user interface templates; template variable description and initialization; and implementation details. The templates are designed to require as little IT knowledge as possible. Experiment configuration, conditions and metadata are stored in a database. Node.js, a modern server-side JavaScript platform, was chosen to implement the server side, and a new template engine was designed. Part of the system has been put into production; it includes templates for showing and editing the first-level trigger configuration and the equipment configuration, as well as for showing experiment metadata and the experiment conditions data index.

  19. Description of the process used to create the 1992 Hanford Mortality Study database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, E.S.; Buchanan, J.A.; Holter, N.A.

    1992-12-01

    An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files maintained by the Hanford Environmental Health Foundation (HEHF), which include demographic data and job histories; the Hanford Mortality (HMO) files, also maintained by HEHF, which include information on deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files maintained by PNL's Health Physics Department, containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides, also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.

  20. Description of the process used to create the 1992 Hanford Mortality Study database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, E. S.; Buchanan, J. A.; Holter, N. A.

    1992-12-01

    An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files maintained by the Hanford Environmental Health Foundation (HEHF), which include demographic data and job histories; the Hanford Mortality (HMO) files, also maintained by HEHF, which include information on deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files maintained by PNL's Health Physics Department, containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides, also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.

  1. National launch strategy vehicle data management system

    NASA Technical Reports Server (NTRS)

    Cordes, David

    1990-01-01

    The national launch strategy vehicle data management system (NLS/VDMS) was developed as part of the 1990 NASA Summer Faculty Fellowship Program. The system was developed under the guidance of the Engineering Systems Branch of the Information Systems Office, and is intended for use within the Program Development Branch PD34. The NLS/VDMS is an on-line database system that permits the tracking of various launch vehicle configurations within the program development office. The system is designed to permit the definition of new launch vehicles, as well as the ability to display and edit existing launch vehicles. Vehicles can be grouped into logical architectures within the system. Reports generated from this package include vehicle data sheets, architecture data sheets, and vehicle flight rate reports. The topics covered include: (1) system overview; (2) initial system development; (3) the SuperCard hypermedia authoring system; (4) the Oracle database; and (5) system evaluation.

  2. [Over- or underestimated? Bibliographic survey of the biomedical periodicals published in Hungary].

    PubMed

    Berhidi, Anna; Horváth, Katalin; Horváth, Gabriella; Vasas, Lívia

    2013-06-30

    This publication, based on an article published in 2006, examines the quality of current Hungarian biomedical periodicals. The aim of this study was to analyse how well Hungarian journals meet requirements of scientific quality and international visibility. The authors evaluated 93 Hungarian biomedical periodicals against 4 viewpoints for each of the two criteria mentioned above. 35% of the analysed journals meet the attributes of scientific quality, 5% those of international visibility, 6% fulfill all examined criteria, and 25% are indexed in international databases. The 6 Hungarian biomedical periodicals covered by each of the three main bibliographic databases (Medline, Scopus, Web of Science) have the best quality. The authors recommend improvements on both the scientific quality and international visibility viewpoints. The basis of qualitative adequacy is accurate instructions for authors, English-language titles, abstracts and keywords for the articles, and the ability to publish on time.

  3. Kinetic Modeling using BioPAX ontology

    PubMed Central

    Ruebenacker, Oliver; Moraru, Ion. I.; Schaff, James C.; Blinov, Michael L.

    2010-01-01

    Thousands of biochemical interactions are available for download from curated databases such as Reactome, the Pathway Interaction Database, and other sources in the Biological Pathways Exchange (BioPAX) format. However, the BioPAX ontology does not encode the information necessary for kinetic modeling and simulation. The current standard for kinetic modeling is the Systems Biology Markup Language (SBML), but only a small number of models are available in SBML format in public repositories. Additionally, reusing and merging SBML models presents a significant challenge, because often each element has a value only in the context of the given model, and information encoding biological meaning is absent. We describe a software system that enables a variety of operations facilitating the use of BioPAX data to create kinetic models that can be visualized, edited, and simulated using the Virtual Cell (VCell), including improved conversion to SBML (for use with other simulation tools that support this format). PMID:20862270

  4. Subscale Test Methods for Combustion Devices

    NASA Technical Reports Server (NTRS)

    Anderson, W. E.; Sisco, J. C.; Long, M. R.; Sung, I.-K.

    2005-01-01

    Stated goals for long-life liquid rocket engines (LREs) have been between 100 and 500 cycles: 1) Inherent technical difficulty of accurately defining the transient and steady-state thermochemical environments and structural response (strain); 2) Limited statistical basis on failure mechanisms and the effects of design and operational variability; and 3) Very high test costs and a budget-driven need to protect test hardware (aversion to test-to-failure). Ambitious goals will require the development of new databases: a) Advanced materials, e.g., tailored composites with virtually unlimited property variations; b) Innovative functional designs to exploit the full capabilities of advanced materials; and c) Different cycles/operations. Subscale testing is one way to address technical and budget challenges: 1) Prototype subscale combustors exposed to controlled simulated conditions; 2) Complementary to conventional laboratory specimen database development; 3) Instrumented with sensors to measure thermostructural response; and 4) Coupled with analysis.

  5. Development of the updated system of city underground pipelines based on Visual Studio

    NASA Astrophysics Data System (ADS)

    Zhang, Jianxiong; Zhu, Yun; Li, Xiangdong

    2009-10-01

    Our city operates an integrated pipeline network management system built on ArcGIS Engine 9.1 as the underlying development platform, with Oracle9i as the basic database for storing data. In this system, ArcGIS SDE 9.1 is used as the spatial data engine, and the management software was developed with the Visual Studio visual development tools. Because the pipeline update function of the original system was slow and occasionally lost data, and to ensure that the underground pipeline data can be updated conveniently and frequently in real time while preserving the currency and integrity of the data, we added a new update module that we developed and researched ourselves. The module has a powerful data update function and supports data input and output as well as rapid updating of large volumes of data. The new module was developed with the Visual Studio visual development tools and uses Access as the underlying database for storing data. Graphics can be edited in AutoCAD, and the database is updated through a link between the graphics and the system. Practice shows that the update module has good compatibility with the original system and that database updates are reliable and efficient.

  6. Development of a web-based video management and application processing system

    NASA Astrophysics Data System (ADS)

    Chan, Shermann S.; Wu, Yi; Li, Qing; Zhuang, Yueting

    2001-07-01

    How to facilitate efficient video manipulation and access in a web-based environment is becoming a popular trend for video applications. In this paper, we present a web-oriented video management and application processing system, based on our previous work on multimedia database and content-based retrieval. In particular, we extend the VideoMAP architecture with specific web-oriented mechanisms, which include: (1) Concurrency control facilities for the editing of video data among different types of users, such as Video Administrator, Video Producer, Video Editor, and Video Query Client; different users are assigned various priority levels for different operations on the database. (2) Versatile video retrieval mechanism which employs a hybrid approach by integrating a query-based (database) mechanism with content- based retrieval (CBR) functions; its specific language (CAROL/ST with CBR) supports spatio-temporal semantics of video objects, and also offers an improved mechanism to describe visual content of videos by content-based analysis method. (3) Query profiling database which records the `histories' of various clients' query activities; such profiles can be used to provide the default query template when a similar query is encountered by the same kind of users. An experimental prototype system is being developed based on the existing VideoMAP prototype system, using Java and VC++ on the PC platform.

  7. Effect of multiphase radiation on coal combustion in a pulverized coal jet flame

    NASA Astrophysics Data System (ADS)

    Wu, Bifen; Roy, Somesh P.; Zhao, Xinyu; Modest, Michael F.

    2017-08-01

    The accurate modeling of coal combustion requires detailed radiative heat transfer models for both gaseous combustion products and solid coal particles. A multiphase Monte Carlo ray tracing (MCRT) radiation solver is developed in this work to simulate a laboratory-scale pulverized coal flame. The MCRT solver considers radiative interactions between coal particles and three major combustion products (CO2, H2O, and CO). A line-by-line spectral database for the gas phase and a size-dependent nongray correlation for the solid phase are employed to account for the nongray effects. The flame structure is significantly altered by considering nongray radiation and the lift-off height of the flame increases by approximately 35%, compared to the simulation without radiation. Radiation is also found to affect the evolution of coal particles considerably as it takes over as the dominant mode of heat transfer for medium-to-large coal particles downstream of the flame. To investigate the respective effects of spectral models for the gas and solid phases, a Planck-mean-based gray gas model and a size-independent gray particle model are applied in a frozen-field analysis of a steady-state snapshot of the flame. The gray gas approximation considerably underestimates the radiative source terms for both the gas phase and the solid phase. The gray coal approximation also leads to under-prediction of the particle emission and absorption. However, the level of under-prediction is not as significant as that resulting from the employment of the gray gas model. Finally, the effect of the spectral property of ash on radiation is also investigated and found to be insignificant for the present target flame.

  8. Predicting the formation and the dispersion of toxic combustion products from the fires of dangerous substances

    NASA Astrophysics Data System (ADS)

    Nevrlý, V.; Bitala, P.; Danihelka, P.; Dobeš, P.; Dlabka, J.; Hejzlar, T.; Baudišová, B.; Míček, D.; Zelinger, Z.

    2012-04-01

    Natural events such as wildfires, lightning or earthquakes are a frequent trigger of industrial fires involving dangerous substances. Dispersion of the smoke plume from such fires and the effects of toxic combustion products are among the reference scenarios considered in the framework of major accident prevention. Tools for assessing the impact of these events are, however, largely lacking. Detailed knowledge of the burning material composition, atmospheric conditions, and other factors is required in order to describe quantitatively the source term of toxic fire products and to evaluate the parameters of the smoke plume. Nevertheless, an assessment of toxic emissions from large-scale fires involves a high degree of uncertainty because of the complex character of the physical and chemical processes in the harsh environment of an uncontrolled flame. Among others, soot particle formation remains one of the unresolved problems in combustion chemistry, as do the decomposition pathways of chemical substances. Therefore, a simplified approach for estimating emission factors from outdoor fires of dangerous chemicals, usable for major accident prevention and preparedness, was developed, and a case study illustrating the application of the proposed method was performed. The ALOFT-FT software tool, based on large eddy simulation of buoyant fire plumes, was employed for predicting local toxic contamination in the downwind vicinity of the fire. The database of model input parameters can be modified readily, enabling simulation of the smoke plume from pool fires or jet fires of an arbitrary flammable (or combustible) gas, liquid or solid. This work was supported by the Ministry of Education, Youth and Sports of the Czech Republic via the project LD11012 (in the frame of the COST CM0901 Action) and the Ministry of Environment of the Czech Republic (project no. SPII 1a10 45/70).

  9. The Fleet Application for Scheduling and Tracking (FAST) Management Website

    NASA Technical Reports Server (NTRS)

    Marrero-Perez, Radames J.

    2014-01-01

    The FAST application was designed to replace the paper-and-pen method of checking out and checking in GSA vehicles at KSC. By moving from a paper-based checkout system to a fully digital one, the resources wasted on printing checkout forms have been reduced, the time that users and fleet managers need to interact with the system has been cut significantly, and the record accuracy for each vehicle has improved. The vehicle information is pulled from a centralized database server in the SPSDL. To add a new capability to the FAST application, the author of this report (alongside the FAST developers) has been designing and developing the FAST Management Website. Previously, the GSA fleet managers had to rely on the FAST developers to add new vehicles, edit vehicles and previous transactions, or generate vehicle reports. With an easy-to-use FAST Management Website portal, the GSA fleet managers are now able to easily move vehicles, edit records, and print reports.

  10. Independent examination of the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV): what does the WAIS-IV measure?

    PubMed

    Benson, Nicholas; Hulac, David M; Kranzler, John H

    2010-03-01

    Published empirical evidence for the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) does not address some essential questions pertaining to the applied practice of intellectual assessment. In this study, the structure and cross-age invariance of the latest WAIS-IV revision were examined to (a) elucidate the nature of the constructs measured and (b) determine whether the same constructs are measured across ages. Results suggest that a Cattell-Horn-Carroll (CHC)-inspired structure provides a better description of test performance than the published scoring structure does. Broad CHC abilities measured by the WAIS-IV include crystallized ability (Gc), fluid reasoning (Gf), visual processing (Gv), short-term memory (Gsm), and processing speed (Gs), although some of these abilities are measured more comprehensively than are others. Additionally, the WAIS-IV provides a measure of quantitative reasoning (QR). Results also suggest a lack of cross-age invariance resulting from age-related differences in factor loadings. Formulas for calculating CHC indexes and suggestions for interpretation are provided. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  11. Factor structure of the Wechsler Intelligence Scale for Children-Fifth Edition: Exploratory factor analyses with the 16 primary and secondary subtests.

    PubMed

    Canivez, Gary L; Watkins, Marley W; Dombrowski, Stefan C

    2016-08-01

    The factor structure of the 16 Primary and Secondary subtests of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V; Wechsler, 2014a) standardization sample was examined with exploratory factor analytic methods (EFA) not included in the WISC-V Technical and Interpretive Manual (Wechsler, 2014b). Factor extraction criteria suggested 1 to 4 factors and results favored 4 first-order factors. When this structure was transformed with the Schmid and Leiman (1957) orthogonalization procedure, the hierarchical g-factor accounted for large portions of total and common variance while the 4 first-order factors accounted for small portions of total and common variance; rendering interpretation at the factor index level less appropriate. Although the publisher favored a 5-factor model where the Perceptual Reasoning factor was split into separate Visual Spatial and Fluid Reasoning dimensions, no evidence for 5 factors was found. It was concluded that the WISC-V provides strong measurement of general intelligence and clinical interpretation should be primarily, if not exclusively, at that level. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. [Investigation on events of bus on fire in 6 years in the mainland of China].

    PubMed

    Wang, X G; Liu, Y; Cen, Y; Wu, P; Zhou, H L; Han, C M

    2016-12-20

    Objective: To retrospectively analyze the characteristics of events of buses on fire over 6 years in the mainland of China. Methods: Events of buses on fire that happened between January 2009 and December 2014 were retrieved through the Baidu search engine, the Chinese Journals Full-text Database, and the PubMed database, using a search strategy with "bus" and "fire" or "arson" as keywords combined with the names of the provinces, autonomous regions, and municipalities of the mainland of China. The occurrence time, region, cause of fire, and casualties of each event were recorded, and a correlation analysis was conducted. Data were processed with Microsoft Excel software. Results: Totally 287 events of buses on fire were retrieved, among which 49 events happened in 2009, 36 in 2010, 35 in 2011, 37 in 2012, and 65 each in 2013 and 2014. The events most frequently happened in June and July, with 49 and 39 events respectively. Regarding the regional distribution, there were 78 events (27.18%) in east China, 52 events (18.12%) in northeast China, and 41 events (14.29%) each in north China and south China. Among the causes, spontaneous combustion of the bus ranked first (267 events, accounting for 93.03%), followed by arson (13 events, accounting for 4.53%). Among the 13 events of buses on fire caused by arson, 7 happened between 16:00 and 20:00, and 3 happened between 8:00 and 10:00. Totally 27 events (9.41%) involved casualties, among which 13 events (48.15%) were caused by spontaneous combustion of the bus, 10 events (37.04%) were caused by arson, and 4 events (14.81%) were caused by traffic accidents. Arson caused the most severe casualties (at least 88 deaths and 287 injuries), followed by spontaneous combustion of the bus (at least 35 deaths and 140 injuries) and traffic accidents (at least 9 deaths and 20 injuries). Conclusions: Events of buses on fire happened more frequently in recent years in the mainland of China, and the frequencies were much higher in June and July. Most events were caused by spontaneous combustion of the bus, followed by arson. Most of the events caused by arson happened in the morning and evening rush hours of urban traffic, and although the occurrence rate was not high, the casualties were the most severe.

  13. Fiber-coupled 2.7 µm laser absorption sensor for CO2 in harsh combustion environments

    NASA Astrophysics Data System (ADS)

    Spearrin, R. M.; Goldenstein, C. S.; Jeffries, J. B.; Hanson, R. K.

    2013-05-01

    A tunable diode laser absorption sensor near 2.7 µm, based on 1f-normalized wavelength-modulation spectroscopy with second-harmonic detection (WMS-2f), was developed to measure CO2 concentration in harsh combustion flows. Wavelength selection at 3733.48 cm-1 exploited the overlap of two CO2 transitions in the ν1 + ν3 vibrational band at 3733.468 cm-1 and 3733.498 cm-1. Primary factors influencing wavelength selection were isolation and strength of the CO2 absorption lines relative to infrared water absorption at elevated pressures and temperatures. The HITEMP 2010 database was used to model the combined CO2 and H2O absorption spectra, and key line-strength and line-broadening spectroscopic parameters were verified by high-temperature static cell measurements. To validate the accuracy and precision of the WMS-based sensor, measurements of CO2 concentration were carried out in non-reactive shock-tube experiments (P ˜ 3-12 atm, T ˜ 1000-2600 K). The laser was then free-space fiber-coupled with a zirconium fluoride single-mode fiber for remote light delivery to harsh combustion environments, and demonstrated on an ethylene/air pulse detonation combustor at pressures up to 10 atm and temperatures up to 2500 K. To our knowledge, this work represents the first time-resolved in-stream measurements of CO2 concentration in a detonation-based engine.

  14. Simultaneous species concentration and temperature measurements using laser induced breakdown spectroscopy with direct spectrum matching

    NASA Astrophysics Data System (ADS)

    McGann, Brendan J.

    Laser induced breakdown spectroscopy (LIBS) is used to simultaneously measure hydrocarbon fuel concentration and temperature in high-temperature, high-speed, compressible, and reacting flows, a regime in which LIBS had not previously been applied. Emission spectra from the plasma produced by a focused laser pulse are collected in the combustion region of a model scramjet operating in a supersonic wind tunnel. A 532 nm Nd:YAG laser operating at 10 Hz is used to induce breakdown. The emissions are captured during a 10 ns gate time approximately 75 ns after the first arrival of photons at the measurement location in order to minimize the measurement uncertainty in the turbulent, compressible, high-speed, and reacting environment. Three methods of emission detection are used, and a new backward-scattering detection method is developed that reduces the amount of optical access needed to perform LIBS measurements. Measurements are taken in the model supersonic combustor, and the ignition process is shown to be highly dependent on fuel concentration and gas density as well as on combustion surface temperature, concentration gradient, and flow field. A direct spectrum matching method is developed and used for quantitative measurements. In addition, a comprehensive database of spectra covering the fuel concentrations and gas densities found in the wind tunnel of Research Cell 19 at Wright-Patterson Air Force Base is created, which can be used for further work.

  15. Fast online and index-based algorithms for approximate search of RNA sequence-structure patterns

    PubMed Central

    2013-01-01

    Background It is well known that the search for homologous RNAs is more effective if both sequence and structure information is incorporated into the search. However, current tools for searching with RNA sequence-structure patterns cannot fully handle mutations occurring on both these levels or are simply not fast enough for searching large sequence databases because of the high computational costs of the underlying sequence-structure alignment problem. Results We present new fast index-based and online algorithms for approximate matching of RNA sequence-structure patterns supporting a full set of edit operations on single bases and base pairs. Our methods efficiently compute semi-global alignments of structural RNA patterns and substrings of the target sequence whose costs satisfy a user-defined sequence-structure edit distance threshold. For this purpose, we introduce a new computing scheme to optimally reuse the entries of the required dynamic programming matrices for all substrings and combine it with a technique for avoiding the alignment computation of non-matching substrings. Our new index-based methods exploit suffix arrays preprocessed from the target database and achieve running times that are sublinear in the size of the searched sequences. To support the description of RNA molecules that fold into complex secondary structures with multiple ordered sequence-structure patterns, we use fast algorithms for the local or global chaining of approximate sequence-structure pattern matches. The chaining step removes spurious matches from the set of intermediate results, in particular of patterns with little specificity. In benchmark experiments on the Rfam database, our improved online algorithm is faster than the best previous method by up to factor 45. Our best new index-based algorithm achieves a speedup of factor 560. Conclusions The presented methods achieve considerable speedups compared to the best previous method. This, together with the expected sublinear running time of the presented index-based algorithms, allows for the first time approximate matching of RNA sequence-structure patterns in large sequence databases. Beyond the algorithmic contributions, we provide with RaligNAtor a robust and well documented open-source software package implementing the algorithms presented in this manuscript. The RaligNAtor software is available at http://www.zbh.uni-hamburg.de/ralignator. PMID:23865810
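
    As a much simplified illustration of the underlying alignment machinery (sequence-level edit distance only, ignoring the base-pair/structure operations and the index-based speedups that are this paper's actual contribution), a semi-global dynamic program that finds how well a short pattern matches anywhere inside a longer sequence can be sketched as:

      # Semi-global edit distance: the pattern must align end-to-end, but may start and
      # end anywhere in the text (free leading/trailing gaps in the text). Sequence-only
      # simplification; the published method also scores structural (base-pair) edits.
      def semi_global_edit_distance(pattern, text):
          m, n = len(pattern), len(text)
          prev = [0] * (n + 1)                    # row 0: empty pattern prefix, free start
          for i in range(1, m + 1):
              curr = [i] + [0] * n                # deleting i pattern characters
              for j in range(1, n + 1):
                  cost = 0 if pattern[i - 1] == text[j - 1] else 1
                  curr[j] = min(prev[j] + 1,      # deletion from the pattern
                                curr[j - 1] + 1,  # insertion into the pattern
                                prev[j - 1] + cost)  # match / mismatch
              prev = curr
          return min(prev)                        # best end position anywhere in the text

      print(semi_global_edit_distance("ACGUAC", "GGGACGAACUUU"))  # 1 (one substitution needed)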

  16. Radiative transfer and spectroscopic databases: A line-sampling Monte Carlo approach

    NASA Astrophysics Data System (ADS)

    Galtier, Mathieu; Blanco, Stéphane; Dauchet, Jérémi; El Hafi, Mouna; Eymet, Vincent; Fournier, Richard; Roger, Maxime; Spiesser, Christophe; Terrée, Guillaume

    2016-03-01

    Dealing with molecular-state transitions for radiative transfer purposes involves two successive steps that both reach the complexity level at which physicists start thinking about statistical approaches: (1) constructing line-shaped absorption spectra as the result of very numerous state-transitions, (2) integrating over optical-path domains. For the first time, we show here how these steps can be addressed simultaneously using the null-collision concept. This opens the door to the design of Monte Carlo codes directly estimating radiative transfer observables from spectroscopic databases. The intermediate step of producing accurate high-resolution absorption spectra is no longer required. A Monte Carlo algorithm is proposed and applied to six one-dimensional test cases. It allows the computation of spectrally integrated intensities (over 25 cm-1 bands or the full IR range) in a few seconds, regardless of the retained database and line model. But free parameters need to be selected and they impact the convergence. A first possible selection is provided in full detail. We observe that this selection is highly satisfactory for quite distinct atmospheric and combustion configurations, but a more systematic exploration is still in progress.
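
    A bare-bones sketch of the null-collision idea, for a single line of sight and a spatially varying absorption coefficient and without the spectral line-sampling layer described in the paper: free paths are sampled against a constant majorant, and each tentative collision is accepted as a true absorption event with probability k(x)/k_hat, otherwise treated as a null collision. The absorption profile used below is invented; a full line-sampling code would evaluate k from a spectroscopic database at each tentative collision instead of from an analytic formula.

      # Null-collision (Woodcock) tracking estimate of transmissivity along a 1D path.
      import math
      import random

      L = 1.0          # path length (m)
      k_hat = 5.0      # majorant absorption coefficient (1/m), must bound k(x) everywhere

      def k(x):
          """Spatially varying 'true' absorption coefficient, 0 <= k(x) <= k_hat."""
          return 2.0 + 2.5 * math.sin(math.pi * x / L) ** 2

      def transmitted(rng):
          """One random walk; returns True if the photon crosses the slab unabsorbed."""
          x = 0.0
          while True:
              x += -math.log(rng.random()) / k_hat       # free path against the majorant
              if x >= L:
                  return True                            # escaped the slab
              if rng.random() < k(x) / k_hat:
                  return False                           # real absorption event
              # otherwise: null collision, keep going

      rng = random.Random(42)
      n = 100_000
      T_mc = sum(transmitted(rng) for _ in range(n)) / n
      T_exact = math.exp(-(2.0 * L + 2.5 * L / 2))       # exp of the integral of k over [0, L]
      print(f"Monte Carlo: {T_mc:.4f}   exact: {T_exact:.4f}")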

  17. MimoSA: a system for minimotif annotation

    PubMed Central

    2010-01-01

    Background Minimotifs are short peptide sequences within one protein, which are recognized by other proteins or molecules. While there are now several minimotif databases, they are incomplete. There are reports of many minimotifs in the primary literature, which have yet to be annotated, while entirely novel minimotifs continue to be published on a weekly basis. Our recently proposed function and sequence syntax for minimotifs enables us to build a general tool that will facilitate structured annotation and management of minimotif data from the biomedical literature. Results We have built the MimoSA application for minimotif annotation. The application supports management of the Minimotif Miner database, literature tracking, and annotation of new minimotifs. MimoSA enables the visualization, organization, selection and editing functions of minimotifs and their attributes in the MnM database. For the literature components, Mimosa provides paper status tracking and scoring of papers for annotation through a freely available machine learning approach, which is based on word correlation. The paper scoring algorithm is also available as a separate program, TextMine. Form-driven annotation of minimotif attributes enables entry of new minimotifs into the MnM database. Several supporting features increase the efficiency of annotation. The layered architecture of MimoSA allows for extensibility by separating the functions of paper scoring, minimotif visualization, and database management. MimoSA is readily adaptable to other annotation efforts that manually curate literature into a MySQL database. Conclusions MimoSA is an extensible application that facilitates minimotif annotation and integrates with the Minimotif Miner database. We have built MimoSA as an application that integrates dynamic abstract scoring with a high performance relational model of minimotif syntax. MimoSA's TextMine, an efficient paper-scoring algorithm, can be used to dynamically rank papers with respect to context. PMID:20565705

  18. Use of a secure Internet Web site for collaborative medical research.

    PubMed

    Marshall, W W; Haley, R W

    2000-10-11

    Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.
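
    Two of the listed measures, password authentication and audit trails, can be sketched in a few lines of standard-library code; this is a generic illustration under assumed table and field names, not the system described in the article, and a production research data site would rely on a vetted web framework.

      # Minimal sketch: salted password hashing and an append-only audit trail.
      import hashlib
      import hmac
      import os
      import sqlite3
      import time

      def hash_password(password, salt=None):
          salt = salt or os.urandom(16)
          digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
          return salt, digest

      def verify_password(password, salt, stored_digest):
          _, digest = hash_password(password, salt)
          return hmac.compare_digest(digest, stored_digest)

      conn = sqlite3.connect(":memory:")                 # stands in for the central server DB
      conn.execute("CREATE TABLE audit (ts REAL, user TEXT, action TEXT)")

      def audit(user, action):
          conn.execute("INSERT INTO audit VALUES (?, ?, ?)", (time.time(), user, action))
          conn.commit()

      salt, digest = hash_password("correct horse battery staple")
      ok = verify_password("correct horse battery staple", salt, digest)
      audit("jsmith", "login ok" if ok else "login failed")
      print(conn.execute("SELECT user, action FROM audit").fetchall())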

  19. Executing Complexity-Increasing Queries in Relational (MySQL) and NoSQL (MongoDB and EXist) Size-Growing ISO/EN 13606 Standardized EHR Databases

    PubMed Central

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario

    2018-01-01

    This research shows a protocol to assess the computational complexity of querying relational and non-relational (NoSQL (not only Structured Query Language)) standardized electronic health record (EHR) medical information database systems (DBMS). It uses a set of three doubling-sized databases, i.e. databases storing 5000, 10,000 and 20,000 realistic standardized EHR extracts, in three different database management systems (DBMS): relational MySQL object-relational mapping (ORM), document-based NoSQL MongoDB, and native extensible markup language (XML) NoSQL eXist. The average response times to six complexity-increasing queries were computed, and the results showed a linear behavior in the NoSQL cases. In the NoSQL field, MongoDB presents a much flatter linear slope than eXist. NoSQL systems may also be more appropriate to maintain standardized medical information systems due to the special nature of the updating policies of medical information, which should not affect the consistency and efficiency of the data stored in NoSQL databases. One limitation of this protocol is the lack of direct results of improved relational systems such as archetype relational mapping (ARM) with the same data. However, the interpolation of doubling-size database results to those presented in the literature and other published results suggests that NoSQL systems might be more appropriate in many specific scenarios and problems to be solved. For example, NoSQL may be appropriate for document-based tasks such as EHR extracts used in clinical practice, or edition and visualization, or situations where the aim is not only to query medical information, but also to restore the EHR in exactly its original form. PMID:29608174
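
    A generic version of the timing protocol can be sketched as below: run each query repeatedly against each database size and record the mean response time. The workload here is a placeholder; in practice the callable returned by make_query would issue the query through the MySQL, MongoDB or eXist driver being benchmarked.

      # Generic query-timing harness (sketch only).
      import statistics
      import time

      def time_query(run_query, repetitions=10):
          """Mean wall-clock time (seconds) of run_query() over several repetitions."""
          samples = []
          for _ in range(repetitions):
              start = time.perf_counter()
              run_query()
              samples.append(time.perf_counter() - start)
          return statistics.mean(samples)

      # Doubling-sized data sets and complexity-increasing queries, as in the protocol.
      database_sizes = [5000, 10000, 20000]

      def make_query(complexity, size):
          # Placeholder workload standing in for a driver call (MySQL/MongoDB/eXist).
          return lambda: sum(range(complexity * size))

      for size in database_sizes:
          for complexity in range(1, 7):
              mean_s = time_query(make_query(complexity, size))
              print(f"{size:>6} extracts  Q{complexity}: {mean_s * 1000:.3f} ms")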

  20. Executing Complexity-Increasing Queries in Relational (MySQL) and NoSQL (MongoDB and EXist) Size-Growing ISO/EN 13606 Standardized EHR Databases.

    PubMed

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario

    2018-03-19

    This research shows a protocol to assess the computational complexity of querying relational and non-relational (NoSQL (not only Structured Query Language)) standardized electronic health record (EHR) medical information database systems (DBMS). It uses a set of three doubling-sized databases, i.e. databases storing 5000, 10,000 and 20,000 realistic standardized EHR extracts, in three different database management systems (DBMS): relational MySQL object-relational mapping (ORM), document-based NoSQL MongoDB, and native extensible markup language (XML) NoSQL eXist. The average response times to six complexity-increasing queries were computed, and the results showed a linear behavior in the NoSQL cases. In the NoSQL field, MongoDB presents a much flatter linear slope than eXist. NoSQL systems may also be more appropriate to maintain standardized medical information systems due to the special nature of the updating policies of medical information, which should not affect the consistency and efficiency of the data stored in NoSQL databases. One limitation of this protocol is the lack of direct results of improved relational systems such as archetype relational mapping (ARM) with the same data. However, the interpolation of doubling-size database results to those presented in the literature and other published results suggests that NoSQL systems might be more appropriate in many specific scenarios and problems to be solved. For example, NoSQL may be appropriate for document-based tasks such as EHR extracts used in clinical practice, or edition and visualization, or situations where the aim is not only to query medical information, but also to restore the EHR in exactly its original form.

  1. Interactive design of generic chemical patterns.

    PubMed

    Schomburg, Karen T; Wetzer, Lars; Rarey, Matthias

    2013-07-01

    Every medicinal chemist has to create chemical patterns occasionally for querying databases, applying filters or describing functional groups. However, the representations of chemical patterns have been so far limited to languages with highly complex syntax, handicapping the application of patterns. Graphic pattern editors similar to chemical editors can facilitate the work with patterns. In this article, we review the interfaces of frequently used web search engines for chemical patterns. We take a look at pattern editing concepts of standalone chemical editors and finally present a completely new, unpublished graphical approach to pattern design, the SMARTSeditor. Copyright © 2013 Elsevier Ltd. All rights reserved.
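
    For context, the sketch below shows the kind of textual pattern (a SMARTS expression) that such graphical editors generate, matched here with the open-source RDKit toolkit; the pattern and molecules are arbitrary examples, and RDKit is not part of the SMARTSeditor described in the article.

      # Matching a SMARTS chemical pattern against small molecules with RDKit.
      from rdkit import Chem

      # SMARTS pattern for a carboxylic acid group (illustrative choice)
      pattern = Chem.MolFromSmarts("[CX3](=O)[OX2H1]")

      molecules = {
          "acetic acid": "CC(=O)O",
          "ethanol": "CCO",
          "benzoic acid": "c1ccccc1C(=O)O",
      }

      for name, smiles in molecules.items():
          mol = Chem.MolFromSmiles(smiles)
          hit = mol.HasSubstructMatch(pattern)
          print(f"{name:12s} matches carboxylic-acid pattern: {hit}")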

  2. PDTRT special section: Methodological issues in personality disorder research.

    PubMed

    Widiger, Thomas A

    2017-10-01

    Personality Disorders: Theory, Research, and Treatment includes a rolling, ongoing Special Section concerned with methodological issues in personality disorder research. This third edition of this series includes two articles. The first is by Brian Hicks, Angus Clark, and Emily Durbin: "Person-Centered Approaches in the Study of Personality Disorders." The second article is by Steve Balsis: "Item Response Theory Applications in Personality Disorder Research." Both articles should be excellent resources for future research and certainly manuscripts submitted to this journal that use these analytic tools. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Content-based video retrieval by example video clip

    NASA Astrophysics Data System (ADS)

    Dimitrova, Nevenka; Abdel-Mottaleb, Mohamed

    1997-01-01

    This paper presents a novel approach for video retrieval from a large archive of MPEG or Motion JPEG compressed video clips. We introduce a retrieval algorithm that takes a video clip as a query and searches the database for clips with similar contents. Video clips are characterized by a sequence of representative frame signatures, which are constructed from DC coefficients and motion information ('DC+M' signatures). The similarity between two video clips is determined by using their respective signatures. This method facilitates retrieval of clips for the purpose of video editing, broadcast news retrieval, or copyright violation detection.
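
    A rough sketch of the signature-matching idea, assuming the DC coefficients of each frame's 8x8 blocks are already extracted as a 2D array; the motion component of the 'DC+M' signature is omitted and all names and data below are placeholders.

    import numpy as np

    def dc_signature(dc_blocks, bins=16):
        """Summarize one frame's DC coefficients as a normalized histogram."""
        hist, _ = np.histogram(dc_blocks, bins=bins, range=(0, 255))
        return hist / max(hist.sum(), 1)

    def clip_distance(sig_a, sig_b):
        """Average L1 distance between two clips' frame-signature sequences."""
        n = min(len(sig_a), len(sig_b))
        return float(np.mean([np.abs(a - b).sum() for a, b in zip(sig_a[:n], sig_b[:n])]))

    # Usage with random stand-in data for a query clip and a database clip:
    rng = np.random.default_rng(0)
    clip_q = [dc_signature(rng.integers(0, 256, (45, 80))) for _ in range(10)]
    clip_db = [dc_signature(rng.integers(0, 256, (45, 80))) for _ in range(10)]
    print(clip_distance(clip_q, clip_db))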

  4. Adverse drug reactions and adverse events of 33 varieties of traditional Chinese medicine injections on National Essential medicines List (2004 edition) of China: an overview on published literatures.

    PubMed

    Wang, Li; Yuan, Qiang; Marshall, Gareth; Cui, Xiaohua; Cheng, Lan; Li, Yuanyuan; Shang, Hongcai; Zhang, Boli; Li, Youping

    2010-05-01

    We conducted a literature review on adverse drug reactions (ADRs) related to 33 kinds of traditional Chinese medicine injections (CMIs) on China's National Essential Medicines List (2004 edition). We aimed to retrieve basic ADR information, identify trends related to CMIs, and provide evidence for the research, development, and application of CMIs. We electronically searched the Chinese Biomedical Literature Database (CBM, January 1978-April 2009), the China National Knowledge Infrastructure Database (CNKI, January 1979-April 2009), the Chinese Science and Technology Periodical Database (January 1989-April 2009) and the Traditional Chinese Medicine Database (January 1984-April 2009). We used the search terms 'adverse drug reaction', 'adverse event', 'side effects', 'side reaction', 'toxicity', and 'Chinese medicine injections', as well as the names of the 33 CMIs. We also collected CMI-related ADR reports and regulations from the Chinese Food and Drug Administration's 'Newsletter of Adverse Drug Reactions' (Issues 1 to 22). We then descriptively analyzed all the articles by year published, periodical, and study design, and analyzed regulations relevant to ADRs. (1) We found 5405 relevant citations, of which 1010 studies met the eligibility criteria. (2) The rate of publication of research articles on CMI-linked ADRs has risen over time. (3) The included 1010 articles were scattered among 297 periodicals. Of these, 55 journals on pharmaceutical medicine accounted for 39.5% of the total (399/1010); the 64 journals on traditional Chinese medicine accounted for only 19.5% (197/1010). Only 22 periodicals with relevant articles were included among the core journals of the Beijing University List (2008 edition); these published 129 articles (12.8% of the included articles). (4) The relevant articles consisted of 348 case reports (34.5%), 254 case series (25.2%), 119 reviews (11.8%), 116 randomized controlled trials (11.5%), 78 cross-sectional studies (7.7%), 61 literature analyses of ADRs (6.0%), and 28 non-randomized controlled clinical studies (2.8%). (5) Three journals, Adverse Drug Reactions Journal, China Medical Herald, and Chinese Pharmaceuticals, together published 12.3% of the included literature. (6) The most commonly reported CMI-related ADRs were to Shuanghuanglian, Qingkailing, and Yuxingcao injections, each of which had ADRs mentioned in more than 200 articles. Four of the five CMIs with the most ADR reports (Shuanghuanglian, Ciwujia, Yuxingcao, and Yinzhihuang injections) had been suspended from use or sale on the market. In conclusion: (1) articles published on CMI-related ADRs increased over time, but overall the research is of low quality and scattered across a large number of sources; (2) four CMIs (Shuanghuanglian, Ciwujia, Yuxingcao, and Yinzhihuang injections) had been suspended from clinical use or sale; (3) there is an urgent need for a clear standard for grading ADRs of CMIs to support better risk management; and (4) it is necessary to continually re-evaluate the safety of CMIs and to promote their rational use. © 2010 Blackwell Publishing Asia Pty Ltd and Chinese Cochrane Center, West China Hospital of Sichuan University.

  5. Studying Turbulence Using Numerical Simulation Databases. 5: Proceedings of the 1994 Summer Program

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Direct numerical simulation databases were used to study turbulence physics and modeling issues at the fifth Summer Program of the Center for Turbulence Research. The largest group, comprising more than half of the participants, was the Turbulent Reacting Flows and Combustion group. The remaining participants were in three groups: Fundamentals, Modeling & LES, and Rotating Turbulence. For the first time in the CTR Summer Programs, participants included engineers from the U.S. aerospace industry. They were exposed to a variety of problems involving turbulence, were able to incorporate the models developed at CTR into their company codes, and were introduced to new ideas on turbulence prediction that already appear to have had an impact on capabilities at their laboratories. Such interactions among practitioners in government, academia, and industry are the most meaningful way of transferring technology.

  6. PRIDE: new developments and new datasets.

    PubMed

    Jones, Philip; Côté, Richard G; Cho, Sang Yun; Klie, Sebastian; Martens, Lennart; Quinn, Antony F; Thorneycroft, David; Hermjakob, Henning

    2008-01-01

    The PRIDE (http://www.ebi.ac.uk/pride) database of protein and peptide identifications was previously described in the NAR Database Special Edition in 2006. Since this publication, the volume of public data in the PRIDE relational database has increased by more than an order of magnitude. Several significant public datasets have been added, including identifications and processed mass spectra generated by the HUPO Brain Proteome Project and the HUPO Liver Proteome Project. The PRIDE software development team has made several significant changes and additions to the user interface and tool set associated with PRIDE. The focus of these changes has been to facilitate the submission process and to improve the mechanisms by which PRIDE can be queried. The PRIDE team has developed a Microsoft Excel workbook that allows the required data to be collated in a series of relatively simple spreadsheets, with automatic generation of PRIDE XML at the end of the process. The ability to query PRIDE has been augmented by the addition of a BioMart interface allowing complex queries to be constructed. Collaboration with groups outside the EBI has been fruitful in extending PRIDE, including an approach to encode iTRAQ quantitative data in PRIDE XML.

  7. YTPdb: a wiki database of yeast membrane transporters.

    PubMed

    Brohée, Sylvain; Barriot, Roland; Moreau, Yves; André, Bruno

    2010-10-01

    Membrane transporters constitute one of the largest functional categories of proteins in all organisms. In the yeast Saccharomyces cerevisiae, this represents about 300 proteins (approximately 5% of the proteome). We here present the Yeast Transport Protein database (YTPdb), a user-friendly collaborative resource dedicated to the precise classification and annotation of yeast transporters. YTPdb exploits an evolution of the MediaWiki web engine used for popular collaborative databases like Wikipedia, allowing every registered user to edit the data in a user-friendly manner. Proteins in YTPdb are classified on the basis of functional criteria such as subcellular location or their substrate compounds. These classifications are hierarchical, allowing queries to be performed at various levels, from highly specific (e.g. ammonium as a substrate or the vacuole as a location) to broader (e.g. cation as a substrate or inner membranes as location). Other resources accessible for each transporter via YTPdb include post-translational modifications, K(m) values, a permanently updated bibliography, and a hierarchical classification into families. The YTPdb concept can be extrapolated to other organisms and could even be applied to other functional categories of proteins. YTPdb is accessible at http://homes.esat.kuleuven.be/ytpdb/. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. The Emotional Movie Database (EMDB): a self-report and psychophysiological study.

    PubMed

    Carvalho, Sandra; Leite, Jorge; Galdo-Álvarez, Santiago; Gonçalves, Oscar F

    2012-12-01

    Film clips are an important tool for evoking emotional responses in the laboratory. When compared with other emotionally potent visual stimuli (e.g., pictures), film clips seem to be more effective in eliciting emotions for longer periods of time at both the subjective and physiological levels. The main objective of the present study was to develop a new database of affective film clips without auditory content, based on a dimensional approach to emotional stimuli (valence, arousal and dominance). The study had three different phases: (1) the pre-selection and editing of 52 film clips, (2) the self-report rating of these film clips by a sample of 113 participants, and (3) psychophysiological assessment [skin conductance level (SCL) and heart rate (HR)] in 32 volunteers. Film clips from different categories were selected to elicit emotional states from different quadrants of affective space. The results also showed that sustained exposure to the affective film clips resulted in a pattern of SCL increase and HR deceleration in high-arousal conditions (i.e., horror and erotic conditions). The resulting emotional movie database can reliably be used in research requiring the presentation of non-auditory film clips with different ratings of valence, arousal and dominance.

  9. Measuring and predicting sooting tendencies of oxygenates, alkanes, alkenes, cycloalkanes, and aromatics on a unified scale

    DOE PAGES

    Das, Dhrubajyoti D.; St. John, Peter C.; McEnally, Charles S.; ...

    2017-12-27

    Databases of sooting indices, based on measuring some aspect of sooting behavior in a standardized combustion environment, are useful in providing information on the comparative sooting tendencies of different fuels or pure compounds. However, newer biofuels have varied chemical structures including both aromatic and oxygenated functional groups, which expands the chemical space of relevant compounds. In this work, we propose a unified sooting tendency database for pure compounds, including both regular and oxygenated hydrocarbons, which is based on combining two disparate databases of yield-based sooting tendency measurements in the literature. Unification of the different databases was made possible by leveraging the greater dynamic range of the color ratio pyrometry soot diagnostic. This unified database contains a substantial number of pure compounds (≥ 400 total) from multiple categories of hydrocarbons important in modern fuels and establishes the sooting tendencies of aromatic and oxygenated hydrocarbons on the same numeric scale for the first time. Then, using this unified sooting tendency database, we have developed a predictive model for sooting behavior applicable to a broad range of hydrocarbons and oxygenated hydrocarbons. The model decomposes each compound into single-carbon fragments and assigns a sooting tendency contribution to each fragment based on regression against the unified database. The model's predictive accuracy (as demonstrated by leave-one-out cross-validation) is comparable to a previously developed, more detailed predictive model. The fitted model provides insight into the effects of chemical structure on soot formation, and cases where its predictions fail reveal the presence of more complicated kinetic sooting mechanisms. Our work will therefore enable the rational design of low-sooting fuel blends from a wide range of feedstocks and chemical functionalities.
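
    A minimal sketch of the group-contribution regression described above: each compound is reduced to counts of carbon-centered fragment types, and one sooting-tendency contribution per fragment is fit by least squares. The fragment labels, counts and index values are invented placeholders, not data from the unified database.

    import numpy as np

    fragments = ["aromatic_CH", "aliphatic_CH2", "C=O"]
    counts = np.array([        # rows: compounds, columns: fragment counts
        [6, 0, 0],
        [0, 5, 0],
        [0, 2, 1],
        [4, 2, 0],
    ], dtype=float)
    ysi = np.array([100.0, 20.0, 8.0, 75.0])   # made-up sooting indices

    # One contribution per fragment type, no intercept (purely additive model)
    contrib, *_ = np.linalg.lstsq(counts, ysi, rcond=None)
    print(dict(zip(fragments, contrib.round(2))))

    # Predict the sooting tendency of a new compound from its fragment counts
    new_compound = np.array([2.0, 3.0, 1.0])
    print(float(new_compound @ contrib))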

  10. Measuring and predicting sooting tendencies of oxygenates, alkanes, alkenes, cycloalkanes, and aromatics on a unified scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Das, Dhrubajyoti D.; St. John, Peter C.; McEnally, Charles S.

    Databases of sooting indices, based on measuring some aspect of sooting behavior in a standardized combustion environment, are useful in providing information on the comparative sooting tendencies of different fuels or pure compounds. However, newer biofuels have varied chemical structures including both aromatic and oxygenated functional groups, which expands the chemical space of relevant compounds. In this work, we propose a unified sooting tendency database for pure compounds, including both regular and oxygenated hydrocarbons, which is based on combining two disparate databases of yield-based sooting tendency measurements in the literature. Unification of the different databases was made possible by leveraging the greater dynamic range of the color ratio pyrometry soot diagnostic. This unified database contains a substantial number of pure compounds (≥ 400 total) from multiple categories of hydrocarbons important in modern fuels and establishes the sooting tendencies of aromatic and oxygenated hydrocarbons on the same numeric scale for the first time. Then, using this unified sooting tendency database, we have developed a predictive model for sooting behavior applicable to a broad range of hydrocarbons and oxygenated hydrocarbons. The model decomposes each compound into single-carbon fragments and assigns a sooting tendency contribution to each fragment based on regression against the unified database. The model's predictive accuracy (as demonstrated by leave-one-out cross-validation) is comparable to a previously developed, more detailed predictive model. The fitted model provides insight into the effects of chemical structure on soot formation, and cases where its predictions fail reveal the presence of more complicated kinetic sooting mechanisms. Our work will therefore enable the rational design of low-sooting fuel blends from a wide range of feedstocks and chemical functionalities.

  11. Using Crowdsourced Trajectories for Automated OSM Data Entry Approach

    PubMed Central

    Basiri, Anahid; Amirian, Pouria; Mooney, Peter

    2016-01-01

    The concept of crowdsourcing is nowadays extensively used to refer to the collection of data and the generation of information by large groups of users/contributors. OpenStreetMap (OSM) is a very successful example of a crowd-sourced geospatial data project. Unfortunately, it is often the case that OSM contributor inputs (including geometry and attribute data inserts, deletions and updates) have been found to be inaccurate, incomplete, inconsistent or vague. This is due to several reasons which include: (1) many contributors with little experience or training in mapping and Geographic Information Systems (GIS); (2) not enough contributors familiar with the areas being mapped; (3) contributors having different interpretations of the attributes (tags) for specific features; (4) different levels of enthusiasm between mappers resulting in different number of tags for similar features and (5) the user-friendliness of the online user-interface where the underlying map can be viewed and edited. This paper suggests an automatic mechanism, which uses raw spatial data (trajectories of movements contributed by contributors to OSM) to minimise the uncertainty and impact of the above-mentioned issues. This approach takes the raw trajectory datasets as input and analyses them using data mining techniques. In addition, we extract some patterns and rules about the geometry and attributes of the recognised features for the purpose of insertion or editing of features in the OSM database. The underlying idea is that certain characteristics of user trajectories are directly linked to the geometry and the attributes of geographic features. Using these rules successfully results in the generation of new features with higher spatial quality which are subsequently automatically inserted into the OSM database. PMID:27649192
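
    The sketch below illustrates, with invented thresholds and tag values, the paper's underlying idea that trajectory characteristics (here, speed alone) can suggest attributes of the feature to be inserted or edited in the OSM database; it is not the rule set derived in the study.

    from statistics import median

    def suggest_highway_tag(speeds_kmh):
        """Map the median speed of trajectories along a segment to a candidate tag."""
        m = median(speeds_kmh)
        if m < 7:
            return {"highway": "footway"}
        if m < 30:
            return {"highway": "residential"}
        if m < 80:
            return {"highway": "secondary"}
        return {"highway": "motorway"}

    print(suggest_highway_tag([4.8, 5.2, 6.1]))   # {'highway': 'footway'}
    print(suggest_highway_tag([62, 70, 85, 90]))  # {'highway': 'secondary'}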

  12. [Status of ethical awareness based on 88 medical journals in China and combined evaluation].

    PubMed

    Chen, Liwen; Wang, Yiren; Li, Lingjiang

    2015-09-01

    To evaluate the status of ethical awareness of medical journals in China, we surveyed editorial awareness in 88 medical journals using a self-made questionnaire. Five aspects were selected by literature and systematic analysis: instructions for authors, the first review stage, the peer-review stage, the editing stage, and education and training, covering 11 indexes in the system. Weight values of the indexes were obtained from scoring by senior editors, and the analytic hierarchy process, the TOPSIS method, and the weighted rank-sum ratio were used to evaluate the status of editorial awareness. Of the 88 biomedical journals, 56 (63.6%) had no ethical requirement in their instructions for authors in 2010; 14 (15.9%) were at a high level of ethical awareness, 45 (51.1%) at a medium level, and 29 (33.0%) at a low level. There were significant differences in the scores for instructions for authors and the peer-review stage among journals administrated by different authorities (H(C)=10.175, H=7.305, P<0.05). There were significant differences in the scores for instructions for authors, the first review stage, the peer-review stage, and the editing stage among journals covered by different databases (H(C)=11.951, 7.661, 6.146, and 8.085, P<0.05); there was also a significant difference in the multi-level results of the comprehensive evaluation for journals covered by different databases (H(C)=6.109, P<0.05). The results from the three comprehensive approaches were positively correlated. Ethical awareness of medical journals in China should be improved, and a comprehensive approach is more reliable and practical than a single approach.
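
    A compact sketch of the TOPSIS step used in the combined evaluation; the journal scores and criterion weights below are invented placeholders, not data from the survey.

    import numpy as np

    def topsis(scores, weights, benefit=None):
        """Rank alternatives (rows) on criteria (columns) by closeness to the ideal."""
        X = np.asarray(scores, dtype=float)
        w = np.asarray(weights, dtype=float)
        if benefit is None:
            benefit = [True] * X.shape[1]       # all criteria: larger is better
        V = w * X / np.linalg.norm(X, axis=0)   # vector-normalize, then weight
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_best = np.linalg.norm(V - ideal, axis=1)
        d_worst = np.linalg.norm(V - worst, axis=1)
        return d_worst / (d_best + d_worst)     # higher = closer to the ideal

    # Three hypothetical journals scored on five ethics-awareness criteria:
    scores = [[3, 2, 4, 3, 1],
              [1, 1, 2, 2, 0],
              [4, 3, 4, 4, 2]]
    weights = [0.3, 0.2, 0.2, 0.2, 0.1]
    print(topsis(scores, weights).round(3))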

  13. Beyond PM2.5: The role of ultrafine particles on adverse health effects of air pollution.

    PubMed

    Chen, Rui; Hu, Bin; Liu, Ying; Xu, Jianxun; Yang, Guosheng; Xu, Diandou; Chen, Chunying

    2016-12-01

    Air pollution constitutes a major threat to human health, but the adverse impacts and underlying mechanisms of different classes of particulate matter are not clearly defined. Ultrafine particles (UFPs) are closely related to anthropogenic emission sources, e.g. combustion engines and power plants. Their composition, sources, typical characteristics, oxidative effects, potential exposure routes and health risks were thoroughly reviewed. UFPs play a major role in the adverse impacts of air pollution on human health and require further investigation in future toxicological research. Unlike PM2.5, UFPs may have a much greater impact on human health, considering the body of evidence emerging from the particulate matter and nanotoxicology research fields. Knowledge from nanotoxicology contributes to the understanding of the toxicity mechanisms of airborne UFPs in air pollution. This article is part of a Special Issue entitled Air Pollution, edited by Wenjun Ding, Andrew J. Ghio and Weidong Wu. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Increasing insect reactions in Alaska: is this related to changing climate?

    PubMed

    Demain, Jeffrey G; Gessner, Bradford D; McLaughlin, Joseph B; Sikes, Derek S; Foote, J Timothy

    2009-01-01

    In 2006, Fairbanks, AK, reported its first cases of fatal anaphylaxis as a result of Hymenoptera stings, concurrent with an increase in insect reactions observed throughout the state. This study was designed to determine whether Alaska medical visits for insect reactions have increased. We conducted a retrospective review of three independent patient databases in Alaska to identify trends among patients seeking medical care for adverse reactions after insect-related events. For each database, an insect reaction was defined as a claim for International Classification of Diseases, Ninth Edition (ICD-9), codes E905.3, E906.4, and 989.5. Increases in insect reactions in each region were compared with temperature changes in the same region. Each database revealed a statistically significant trend in patients seeking care for insect reactions. Fairbanks Memorial Hospital Emergency Department reported a fourfold increase in patients in 2006 compared with previous years (1992-2005). The Allergy, Asthma, and Immunology Center of Alaska reported a threefold increase in patients from 1999-2002 to 2003-2007. A retrospective review of the Alaska Medicaid database from 1999 to 2006 showed increases in medical claims for insect reactions among all regions, with the largest percentage increases occurring in the most northern areas. Increases in insect reactions in Alaska have occurred after increases in annual and winter temperatures, and these findings may be causally related.

  15. Systematic review of the effectiveness of training programs in writing for scholarly publication, journal editing, and manuscript peer review (protocol).

    PubMed

    Galipeau, James; Moher, David; Skidmore, Becky; Campbell, Craig; Hendry, Paul; Cameron, D William; Hébert, Paul C; Palepu, Anita

    2013-06-17

    An estimated $100 billion is lost to 'waste' in biomedical research globally, annually, much of which comes from the poor quality of published research. One area of waste involves bias in reporting research, which compromises the usability of published reports. In response, there has been an upsurge in interest and research in the scientific process of writing, editing, peer reviewing, and publishing (that is, journalology) of biomedical research. One reason for bias in reporting and the problem of unusable reports could be due to authors lacking knowledge or engaging in questionable practices while designing, conducting, or reporting their research. Another might be that the peer review process for journal publication has serious flaws, including possibly being ineffective, and having poorly trained and poorly motivated reviewers. Similarly, many journal editors have limited knowledge related to publication ethics. This can ultimately have a negative impact on the healthcare system. There have been repeated calls for better, more numerous training opportunities in writing for publication, peer review, and publishing. However, little research has taken stock of journalology training opportunities or evaluations of their effectiveness. We will conduct a systematic review to synthesize studies that evaluate the effectiveness of training programs in journalology. A comprehensive three-phase search approach will be employed to identify evaluations of training opportunities, involving: 1) forward-searching using the Scopus citation database, 2) a search of the MEDLINE In-Process and Non-Indexed Citations, MEDLINE, Embase, ERIC, and PsycINFO databases, as well as the databases of the Cochrane Library, and 3) a grey literature search. This project aims to provide evidence to help guide the journalological training of authors, peer reviewers, and editors. While there is ample evidence that many members of these groups are not getting the necessary training needed to excel at their respective journalology-related tasks, little is known about the characteristics of existing training opportunities, including their effectiveness. The proposed systematic review will provide evidence regarding the effectiveness of training, therefore giving potential trainees, course designers, and decision-makers evidence to help inform their choices and policies regarding the merits of specific training opportunities or types of training.

  16. Genetic Architectures of Quantitative Variation in RNA Editing Pathways

    PubMed Central

    Gu, Tongjun; Gatti, Daniel M.; Srivastava, Anuj; Snyder, Elizabeth M.; Raghupathy, Narayanan; Simecek, Petr; Svenson, Karen L.; Dotu, Ivan; Chuang, Jeffrey H.; Keller, Mark P.; Attie, Alan D.; Braun, Robert E.; Churchill, Gary A.

    2016-01-01

    RNA editing refers to post-transcriptional processes that alter the base sequence of RNA. Recently, hundreds of new RNA editing targets have been reported. However, the mechanisms that determine the specificity and degree of editing are not well understood. We examined quantitative variation of site-specific editing in a genetically diverse multiparent population, Diversity Outbred mice, and mapped polymorphic loci that alter editing ratios globally for C-to-U editing and at specific sites for A-to-I editing. An allelic series in the C-to-U editing enzyme Apobec1 influences the editing efficiency of Apob and 58 additional C-to-U editing targets. We identified 49 A-to-I editing sites with polymorphisms in the edited transcript that alter editing efficiency. In contrast to the shared genetic control of C-to-U editing, most of the variable A-to-I editing sites were determined by local nucleotide polymorphisms in proximity to the editing site in the RNA secondary structure. Our results indicate that RNA editing is a quantitative trait subject to genetic variation and that evolutionary constraints have given rise to distinct genetic architectures in the two canonical types of RNA editing. PMID:26614740

  17. Psychometric Properties of Language Assessments for Children Aged 4–12 Years: A Systematic Review

    PubMed Central

    Denman, Deborah; Speyer, Renée; Munro, Natalie; Pearce, Wendy M.; Chen, Yu-Wei; Cordier, Reinie

    2017-01-01

    Introduction: Standardized assessments are widely used by speech pathologists in clinical and research settings to evaluate the language abilities of school-aged children and inform decisions about diagnosis, eligibility for services and intervention. Given the significance of these decisions, it is important that assessments have sound psychometric properties. Objective: The aim of this systematic review was to examine the psychometric quality of currently available comprehensive language assessments for school-aged children and identify assessments with the best evidence for use. Methods: Using the PRISMA framework as a guideline, a search of five databases and a review of websites and textbooks was undertaken to identify language assessments and published material on the reliability and validity of these assessments. The methodological quality of selected studies was evaluated using the COSMIN taxonomy and checklist. Results: Fifteen assessments were evaluated. For most assessments evidence of hypothesis testing (convergent and discriminant validity) was identified; with a smaller number of assessments having some evidence of reliability and content validity. No assessments presented with evidence of structural validity, internal consistency or error measurement. Overall, all assessments were identified as having limitations with regards to evidence of psychometric quality. Conclusions: Further research is required to provide good evidence of psychometric quality for currently available language assessments. Of the assessments evaluated, the Assessment of Literacy and Language, the Clinical Evaluation of Language Fundamentals-5th Edition, the Clinical Evaluation of Language Fundamentals-Preschool: 2nd Edition and the Preschool Language Scales-5th Edition presented with most evidence and are thus recommended for use. PMID:28936189

  18. Manual editing of automatically recorded data in an anesthesia information management system.

    PubMed

    Wax, David B; Beilin, Yaakov; Hossain, Sabera; Lin, Hung-Mo; Reich, David L

    2008-11-01

    Anesthesia information management systems allow automatic recording of physiologic and anesthetic data. The authors investigated the prevalence of such data modification in an academic medical center. The authors queried their anesthesia information management system database of anesthetics performed in 2006 and tabulated the counts of data points for automatically recorded physiologic and anesthetic parameters as well as the subset of those data that were manually invalidated by clinicians (both with and without alternate values manually appended). Patient, practitioner, data source, and timing characteristics of recorded values were also extracted to determine their associations with editing of various parameters in the anesthesia information management system record. A total of 29,491 cases were analyzed, 19% of which had one or more data points manually invalidated. Among 58 attending anesthesiologists, each invalidated data in a median of 7% of their cases when working as a sole practitioner. A minority of invalidated values were manually appended with alternate values. Pulse rate, blood pressure, and pulse oximetry were the most commonly invalidated parameters. Data invalidation usually resulted in a decrease in parameter variance. Factors independently associated with invalidation included extreme physiologic values, American Society of Anesthesiologists physical status classification, emergency status, timing (phase of the procedure/anesthetic), presence of an intraarterial catheter, resident or certified registered nurse anesthetist involvement, and procedure duration. Editing of physiologic data automatically recorded in an anesthesia information management system is a common practice and results in decreased variability of intraoperative data. Further investigation may clarify the reasons for and consequences of this behavior.

  19. Development and application of a database of food ingredient fraud and economically motivated adulteration from 1980 to 2010.

    PubMed

    Moore, Jeffrey C; Spink, John; Lipp, Markus

    2012-04-01

    Food ingredient fraud and economically motivated adulteration are emerging risks, but a comprehensive compilation of information about known problematic ingredients and detection methods does not currently exist. The objectives of this research were to collect such information from publicly available articles in scholarly journals and general media, organize it into a database, and review and analyze the data to identify trends. The result is a database that will be published in the US Pharmacopeial Convention's Food Chemicals Codex, 8th edition, and includes 1305 records, among them 1000 records with analytical methods, collected from 677 references. Olive oil, milk, honey, and saffron were the most common targets for adulteration reported in scholarly journals, and potentially harmful issues identified include spices diluted with lead chromate and lead tetraoxide, substitution of Chinese star anise with toxic Japanese star anise, and melamine adulteration of high-protein-content foods. High-performance liquid chromatography and infrared spectroscopy were the most common analytical detection procedures, and chemometric data analysis was used in a large number of reports. Future expansion of this database will include additional publicly available articles published before 1980 and in other languages, as well as data outside the public domain. The authors recommend in-depth analyses of individual incidents. This report describes the development and application of a database of food ingredient fraud issues from publicly available references. The database provides baseline information and data useful to governments, agencies, and individual companies assessing the risks of specific products produced in specific regions as well as products distributed and sold in other regions. In addition, the report describes current analytical technologies for detecting food fraud and identifies trends and developments. © 2012 US Pharmacopeia. Journal of Food Science © 2012 Institute of Food Technologists®

  20. Impact of nongray multiphase radiation in pulverized coal combustion

    NASA Astrophysics Data System (ADS)

    Roy, Somesh; Wu, Bifen; Modest, Michael; Zhao, Xinyu

    2016-11-01

    Detailed modeling of radiation is important for accurate modeling of pulverized coal combustion. Because of the high temperatures and optical properties involved, radiative heat transfer from coal particles often dominates over convective heat transfer. In this work a multiphase photon Monte Carlo radiation solver is used to investigate and quantify the effect of nongray radiation in a laboratory-scale pulverized coal flame. The nongray radiative properties of the carrier phase (gas) are modeled using the HITEMP database. Three major species - CO, CO2, and H2O - are treated as participating gases. Two optical models are used to evaluate the radiative properties of coal particles: a formulation based on the large-particle limit and a size-dependent correlation. The effect of scattering by coal particles is also investigated using both isotropic scattering and anisotropic scattering with a Henyey-Greenstein phase function. Lastly, since the optical properties of ash are very different from those of coal, the effect of ash content on the radiative properties of coal particles is examined. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation Grant Number ACI-1053575.
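
    A small sketch of the anisotropic-scattering ingredient mentioned above: sampling a scattering direction cosine from the Henyey-Greenstein phase function, as is commonly done in photon Monte Carlo solvers; g is the asymmetry parameter and the values used here are arbitrary.

    import random

    def sample_hg_cos_theta(g, rng=random):
        """Inverse-CDF sample of cos(theta) from the Henyey-Greenstein function."""
        xi = rng.random()
        if abs(g) < 1e-6:                  # isotropic limit
            return 1.0 - 2.0 * xi
        s = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
        return (1.0 + g * g - s * s) / (2.0 * g)

    # Forward-scattering particles (g > 0) yield cosines clustered near +1:
    print([round(sample_hg_cos_theta(0.8), 3) for _ in range(5)])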

  1. Obtaining NASA Approval for use of Non-Metallic Materials in Manned Space Flight

    NASA Technical Reports Server (NTRS)

    Davis, Samuel E.; Wise, Harry L.

    2003-01-01

    Material manufacturers and suppliers are often surprised when a material commonly provided to industry is not approved for use on manned spacecraft. Often the reason is a lack of test data in environments that simulate those encountered in space applications, especially oxygen-enriched conditions, which significantly increase both the likelihood of material combustion and the propagation of a fire. This paper introduces the requirements for flight approval of non-metallic materials, focusing on material testing for human-rated space flight programs; it reviews the history of flight materials requirements, provides the rationale for them, and introduces specific requirements related to testing and to good material engineering and design practices. After describing the procedure for submitting materials to be tested, the paper outlines options available if a material fails testing. In addition, this treatise introduces the National Aeronautics and Space Administration's (NASA's) Materials and Processes Technical Information System (MAPTIS), a database housing all test data produced in accordance with NASA-STD-6001, Flammability, Odor, Offgassing, and Compatibility Requirements and Test Procedures for Materials in Environments that Support Combustion.

  2. Evaluation of the flame propagation within an SI engine using flame imaging and LES

    NASA Astrophysics Data System (ADS)

    He, Chao; Kuenne, Guido; Yildar, Esra; van Oijen, Jeroen; di Mare, Francesca; Sadiki, Amsini; Ding, Carl-Philipp; Baum, Elias; Peterson, Brian; Böhm, Benjamin; Janicka, Johannes

    2017-11-01

    This work shows experiments and simulations of the fired operation of a spark ignition engine with port-fuelled injection. The test rig considered is an optically accessible single cylinder engine specifically designed at TU Darmstadt for the detailed investigation of in-cylinder processes and model validation. The engine was operated under lean conditions using iso-octane as a substitute for gasoline. Experiments have been conducted to provide a sound database of the combustion process. A planar flame imaging technique has been applied within the swirl- and tumble-planes to provide statistical information on the combustion process to complement a pressure-based comparison between simulation and experiments. This data is then analysed and used to assess the large eddy simulation performed within this work. For the simulation, the engine code KIVA has been extended by the dynamically thickened flame model combined with chemistry reduction by means of pressure dependent tabulation. Sixty cycles have been simulated to perform a statistical evaluation. Based on a detailed comparison with the experimental data, a systematic study has been conducted to obtain insight into the most crucial modelling uncertainties.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wesnor, J.D.

    Since passage of the Clean Air Act, Asea Brown Boveri (ABB) has been actively developing a knowledge base on the Title 3 hazardous air pollutants, more commonly called air toxics. As ABB is a multinational company, US operating companies are able to call upon work performed by European counterparts, who faced similar legislation several years earlier. In addition to the design experience and database acquired in Europe, ABB Inc. has been pursuing several other avenues to expand its air toxics knowledge. ABB Combustion Engineering (ABB CE) is presently studying the formation of organic pollutants within the combustion furnace and the partitioning of trace metals among the furnace outlet streams. ABB Environmental Systems (ABBES) has reviewed available and near-term control technologies and methods. Also, both ABB CE and ABBES have conducted source sampling and analysis at commercial installations for hazardous air pollutants to determine the emission rates and removal performance of various types of equipment. Several different plants hosted these activities, allowing for variation in fuel type and composition, boiler configuration, and air pollution control equipment. This paper discusses the results of these investigations.

  4. Speciation and Attenuation of Arsenic and Selenium at Coal Combustion By-Product Management Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    K. Ladwig

    2005-12-31

    The overall objective of this project was to evaluate the impact of key constituents captured from power plant air streams (principally arsenic and selenium) on the disposal and utilization of coal combustion products (CCPs). Specific objectives of the project were: (1) to develop a comprehensive database of field leachate concentrations at a wide range of CCP management sites, including speciation of arsenic and selenium, and low-detection limit analyses for mercury; (2) to perform detailed evaluations of the release and attenuation of arsenic species at three CCP sites; and (3) to perform detailed evaluations of the release and attenuation of selenium species at three CCP sites. Each of these objectives was accomplished using a combination of field sampling and laboratory analysis and experimentation. All of the methods used and results obtained are contained in this report. For ease of use, the report is subdivided into three parts. Volume 1 contains methods and results for the field leachate characterization. Volume 2 contains methods and results for arsenic adsorption. Volume 3 contains methods and results for selenium adsorption.

  5. Cost effective nuclear commercial grade dedication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maletz, J.J.; Marston, M.J.

    1991-01-01

    This paper describes a new computerized database method to create/edit/view specification technical data sheets (mini-specifications) for procurement of spare parts for nuclear facility maintenance and to develop information that could support possible future facility life extension efforts. This method may reduce cost when compared with current manual methods. The use of standardized technical data sheets (mini-specifications) for items of the same category improves efficiency. This method can be used for a variety of tasks, including: nuclear safety-related procurement; non-safety-related procurement; commercial grade item procurement/dedication; and evaluation of replacement items. This program will assist the nuclear facility in upgrading its procurement activities consistent with the recent NUMARC Procurement Initiative. Proper utilization of the program will assist the user in assuring that the procured items are correct for the applications, provide data to assist in detecting fraudulent materials, minimize human error in withdrawing database information, improve data retrievability, improve traceability, and reduce long-term procurement costs.

  6. The development of the Project NetWork administrative records database for policy evaluation.

    PubMed

    Rupp, K; Driessen, D; Kornfeld, R; Wood, M

    1999-01-01

    This article describes the development of SSA's administrative records database for the Project NetWork return-to-work experiment targeting persons with disabilities. The article is part of a series of papers on the evaluation of the Project NetWork demonstration. In addition to 8,248 Project NetWork participants randomly assigned to receive case management services and a control group, the simulation identified 138,613 eligible nonparticipants in the demonstration areas. The output data files contain detailed monthly information on Supplemental Security Income (SSI) and Disability Insurance (DI) benefits, annual earnings, and a set of demographic and diagnostic variables. The data allow for the measurement of net outcomes and the analysis of factors affecting participation. The results suggest that it is feasible to simulate complex eligibility rules using administrative records, and create a clean and edited data file for a comprehensive and credible evaluation. The study shows that it is feasible to use administrative records data for selecting control or comparison groups in future demonstration evaluations.

  7. FragariaCyc: A Metabolic Pathway Database for Woodland Strawberry Fragaria vesca

    PubMed Central

    Naithani, Sushma; Partipilo, Christina M.; Raja, Rajani; Elser, Justin L.; Jaiswal, Pankaj

    2016-01-01

    FragariaCyc is a strawberry-specific cellular metabolic network based on the annotated genome sequence of Fragaria vesca L. ssp. vesca, accession Hawaii 4. It was built on the Pathway Tools platform using MetaCyc as the reference. Experimental evidence from the published literature was used for supporting/editing existing entities and for the addition of new pathways, enzymes, reactions, compounds, and small molecules in the database. To date, FragariaCyc comprises 66 super-pathways, 488 unique pathways, 2348 metabolic reactions, 3507 enzymes, and 2134 compounds. In addition to searching and browsing FragariaCyc, researchers can compare pathways across various plant metabolic networks and analyze their data using the Omics Viewer tool. We view FragariaCyc as a resource for the community of researchers working with strawberry and related fruit crops. It can help in understanding the regulation of the overall metabolism of the strawberry plant during development and in response to diseases and abiotic stresses. FragariaCyc is available online at http://pathways.cgrb.oregonstate.edu. PMID:26973684

  8. Controlled flexibility in technical editing - The levels-of-edit concept at JPL

    NASA Technical Reports Server (NTRS)

    Buehler, M. F.

    1977-01-01

    The levels-of-edit concept, which can be used to specify the amount of editorial effort involved in the preparation of a manuscript for publication, is discussed. Nine types of editing are identified and described. These include coordination edit (preparing estimates, gathering cost data, monitoring production processes), policy edit, integrity edit (making sure that parts of a publication match in a physical or numerical sense), screening edit (ensuring that the quality of camera-ready copy is sufficient for external publication), copy clarification edit, format edit, mechanical style edit, language edit, and substantive edit (reviewing the manuscript for content coherence, emphasis, subordination and parallelism). These functions are grouped into five levels of edit. An edit-level number is assigned to each manuscript, providing a quantitative and qualitative indicator of the editing to be done which is clearly understood by authors, managers, and editors alike. In addition, clear boundaries are drawn between normal and extraordinary editing tasks. Individual organizations will group various edits in different ways to reflect their needs and priorities; the essential element of the system is unambiguous definition and coding of the types and amount of work to be done.

  9. A paler shade of green? The toxicology of biodiesel emissions: Recent findings from studies with this alternative fuel.

    PubMed

    Madden, Michael C

    2016-12-01

    Biodiesel, produced primarily from plant and algal feedstocks, is believed to have advantages in production and use compared to petroleum and to some other fuel sources. There is some speculation that exposure to biodiesel combustion emissions may not induce biological responses or health effects, or may at a minimum reduce the effects relative to other fuels. In evaluating the overall environmental and health effects of the biodiesel production-to-end-use scenario, empirical data, or modeling based on such data, are needed. This manuscript examines the available toxicology reports on combustion-derived biodiesel emissions since approximately 2007, when our last review of the topic occurred. Toxicity arising from other end uses of biodiesel - e.g., spills, dermal absorption, etc. - is not examined. Findings on biodiesel emissions fall roughly into three areas: whole non-human animal model exposures; in vitro exposures of mammalian and bacterial cells (used primarily for mutation studies); and human exposures in controlled or other settings. Overall, these more recent studies clearly demonstrate that exposure to biodiesel combustion emissions - either 100% biodiesel or a blend with petroleum diesel - can induce biological effects. Some reports show that biodiesel exposure generally induces more effects, or a greater magnitude of effect, than petroleum diesel; however, a similar number of reports show the opposite trend. It is unclear whether effects induced by exposure to a blend are greater than those induced by exposure to 100% biodiesel. Taken together, the evidence suggests that biodiesel emissions can have effects similar to those of diesel emissions on inflammatory, vascular, mutagenic, and other responses. While acute biodiesel exposures can show toxicity across a variety of endpoints, the potential effects on human health need further validation. Additionally, there are few or no findings to date on whether biodiesel emissions induce effects, or even a weaker response than petroleum diesel, under repeated exposure scenarios such as occupational settings. This article is part of a Special Issue entitled Air Pollution, edited by Wenjun Ding, Andrew J. Ghio and Weidong Wu. Copyright © 2016. Published by Elsevier B.V.

  10. Aided generation of search interfaces to astronomical archives

    NASA Astrophysics Data System (ADS)

    Zorba, Sonia; Bignamini, Andrea; Cepparo, Francesco; Knapic, Cristina; Molinaro, Marco; Smareglia, Riccardo

    2016-07-01

    Astrophysical data provider organizations that host web-based interfaces for access to data resources have to cope with possible changes in data management that imply partial rewrites of web applications. To avoid doing this manually, it was decided to develop a dynamically configurable Java EE web application that can set itself up by reading the needed information from configuration files. The specification of what information the astronomical archive database has to expose is managed using the TAP_SCHEMA defined by the IVOA TAP recommendation, which can be edited through a graphical interface. Once the configuration steps are done, the tool builds a WAR file to allow easy deployment of the application.
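
    A minimal sketch of how a client might discover what such a service exposes, by querying the standard TAP_SCHEMA tables through the TAP synchronous endpoint; the base URL is a placeholder, not the archive interface described above.

    import requests

    TAP_BASE = "https://example.org/tap"   # hypothetical TAP service

    resp = requests.get(
        f"{TAP_BASE}/sync",
        params={
            "REQUEST": "doQuery",
            "LANG": "ADQL",
            "FORMAT": "csv",
            "QUERY": "SELECT table_name, description FROM TAP_SCHEMA.tables",
        },
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.text)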

  11. 2012 Annual report of the American Psychological Association.

    PubMed

    2013-01-01

    Provides the 2012 Annual Report of the American Psychological Association. In 2012, APA celebrated its 120th anniversary. It has grown from its original 31 members to the largest association of psychologists in the United States and a worldwide leader within the discipline. This edition of the report introduces each directorate and office within APA and describes their goals and objectives. The president of APA, Dr. Norman Anderson, also gives a brief report that updates readers on the activities of the association during its 120th anniversary as the professional home for psychologists and an advocate for the discipline. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  12. Towards a comprehensive picture of C-to-U RNA editing sites in angiosperm mitochondria.

    PubMed

    Edera, Alejandro A; Gandini, Carolina L; Sanchez-Puerta, M Virginia

    2018-05-14

    Our understanding of the dynamics and evolution of RNA editing in angiosperms is in part limited by the few editing sites identified to date. This study identified 10,217 editing sites from 17 diverse angiosperms. Our analyses confirmed the universality of certain features of RNA editing and offer new evidence regarding the loss of editing sites in angiosperms. RNA editing is a post-transcriptional process that converts cytidines (C) into uridines (U) in organellar transcripts of angiosperms. These substitutions mostly take place in mitochondrial messenger RNAs at specific positions called editing sites. By means of publicly available RNA-seq data, this study identified 10,217 editing sites in mitochondrial protein-coding genes of 17 diverse angiosperms. Even though other types of mismatches were also identified, we did not find evidence of non-canonical editing processes. The results showed an uneven distribution of editing sites among species, genes, and codon positions. The analyses revealed that editing sites were conserved across angiosperms, but there were some species-specific sites. Non-synonymous editing sites were particularly highly conserved (~ 80%) across the plant species and were efficiently edited (80% editing extent). In contrast, editing sites at third codon positions were poorly conserved (~ 30%) and only partially edited (~ 40% editing extent). We found that the loss of editing sites along angiosperm evolution occurs mainly by replacing editing sites with thymidines, rather than by degradation of the editing recognition motif around editing sites. Consecutive and highly conserved editing sites have been replaced by thymidines as a result of retroprocessing, by which edited transcripts are reverse transcribed to cDNA and then integrated into the genome by homologous recombination. This phenomenon was more pronounced in eudicots and in the gene cox1. These results suggest that retroprocessing is a widespread driving force underlying the loss of editing sites in angiosperm mitochondria.

  13. Development and implementation of a custom integrated database with dashboards to assist with hematopathology specimen triage and traffic

    PubMed Central

    Azzato, Elizabeth M.; Morrissette, Jennifer J. D.; Halbiger, Regina D.; Bagg, Adam; Daber, Robert D.

    2014-01-01

    Background: At some institutions, including ours, bone marrow aspirate specimen triage is complex, with hematopathology triage decisions that need to be communicated to downstream ancillary testing laboratories and many specimen aliquot transfers that are handled outside of the laboratory information system (LIS). We developed a custom integrated database with dashboards to facilitate and streamline this workflow. Methods: We developed user-specific dashboards that allow entry of specimen information by technologists in the hematology laboratory, have custom scripting to present relevant information for the hematopathology service and ancillary laboratories and allow communication of triage decisions from the hematopathology service to other laboratories. These dashboards are web-accessible on the local intranet and accessible from behind the hospital firewall on a computer or tablet. Secure user access and group rights ensure that relevant users can edit or access appropriate records. Results: After database and dashboard design, two-stage beta-testing and user education was performed, with the first focusing on technologist specimen entry and the second on downstream users. Commonly encountered issues and user functionality requests were resolved with database and dashboard redesign. Final implementation occurred within 6 months of initial design; users report improved triage efficiency and reduced need for interlaboratory communications. Conclusions: We successfully developed and implemented a custom database with dashboards that facilitates and streamlines our hematopathology bone marrow aspirate triage. This provides an example of a possible solution to specimen communications and traffic that are outside the purview of a standard LIS. PMID:25250187

  14. LISTA, LISTA-HOP and LISTA-HON: a comprehensive compilation of protein encoding sequences and its associated homology databases from the yeast Saccharomyces.

    PubMed Central

    Dölz, R; Mossé, M O; Slonimski, P P; Bairoch, A; Linder, P

    1996-01-01

    We continued our effort to make a comprehensive database (LISTA) for the yeast Saccharomyces cerevisiae. As in previous editions, the genetic names are consistently associated with each sequence with a known and confirmed ORF. If necessary, synonyms are given in the case of allelic duplicated sequences. Although the first publication of a sequence gives - according to our rules - the genetic name of a gene, in some instances more commonly used names are given to avoid nomenclature problems and the use of ancient designations which are no longer used. In these cases the old designation is given as a synonym. Thus sequences can be found either by the name or by synonyms given in LISTA. Each entry contains the genetic name, the mnemonic from the EMBL data bank, the codon bias, the reference of the publication of the sequence, the chromosomal location as far as known, and SWISSPROT and EMBL accession numbers. New entries will also contain the name from the systematic sequencing efforts. Since the release of LISTA 4.1 we update the database continuously. To obtain more information on the included sequences, each entry has been screened against non-redundant nucleotide and protein data bank collections, resulting in LISTA-HON and LISTA-HOP. This release includes reports from full Smith-Waterman peptide-level searches against a non-redundant protein sequence database. The LISTA database can be linked to the associated data sets or to nucleotide and protein banks by the Sequence Retrieval System (SRS). The database is available by FTP and on the World Wide Web. PMID:8594599
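
    A bare-bones Smith-Waterman local alignment score is sketched below, only to illustrate the kind of peptide-level search mentioned above; real LISTA-HOP searches use substitution matrices and affine gap penalties rather than these toy parameters.

    def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
        """Best local alignment score between sequences a and b."""
        rows, cols = len(a) + 1, len(b) + 1
        H = [[0] * cols for _ in range(rows)]
        best = 0
        for i in range(1, rows):
            for j in range(1, cols):
                s = match if a[i - 1] == b[j - 1] else mismatch
                H[i][j] = max(0,
                              H[i - 1][j - 1] + s,   # (mis)match
                              H[i - 1][j] + gap,     # gap in b
                              H[i][j - 1] + gap)     # gap in a
                best = max(best, H[i][j])
        return best

    print(smith_waterman_score("HEAGAWGHEE", "PAWHEAE"))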

  15. Secure, web-accessible call rosters for academic radiology departments.

    PubMed

    Nguyen, A V; Tellis, W M; Avrin, D E

    2000-05-01

    Traditionally, radiology department call rosters have been posted via paper and bulletin boards. Frequently, changes to these lists are made by multiple people independently, but often not synchronized, resulting in confusion among the house staff and technical staff as to who is on call and when. In addition, multiple and disparate copies exist in different sections of the department, and changes made would not be propagated to all the schedules. To eliminate such difficulties, a paperless call scheduling application was developed. Our call scheduling program allowed Java-enabled web access to a database by designated personnel from each radiology section who have privileges to make the necessary changes. Once a person made a change, everyone accessing the database would see the modification. This eliminates the chaos resulting from people swapping shifts at the last minute and not having the time to record or broadcast the change. Furthermore, all changes to the database were logged. Users are given a log-in name and password and can only edit their section; however, all personnel have access to all sections' schedules. Our applet was written in Java 2 using the latest technology in database access. We access our Interbase database through the DataExpress and DB Swing (Borland, Scotts Valley, CA) components. The result is secure access to the call rosters via the web. There are many advantages to the web-enabled access, mainly the ability for people to make changes and have the changes recorded and propagated in a single virtual location and available to all who need to know.

  16. Rat Genome and Model Resources.

    PubMed

    Shimoyama, Mary; Smith, Jennifer R; Bryda, Elizabeth; Kuramoto, Takashi; Saba, Laura; Dwinell, Melinda

    2017-07-01

    Rats remain a major model for studying disease mechanisms and discovery, validation, and testing of new compounds to improve human health. The rat's value continues to grow as indicated by the more than 1.4 million publications (second to human) at PubMed documenting important discoveries using this model. Advanced sequencing technologies, genome modification techniques, and the development of embryonic stem cell protocols ensure the rat remains an important mammalian model for disease studies. The 2004 release of the reference genome has been followed by the production of complete genomes for more than two dozen individual strains utilizing NextGen sequencing technologies; their analyses have identified over 80 million variants. This explosion in genomic data has been accompanied by the ability to selectively edit the rat genome, leading to hundreds of new strains through multiple technologies. A number of resources have been developed to provide investigators with access to precision rat models, comprehensive datasets, and sophisticated software tools necessary for their research. Those profiled here include the Rat Genome Database, PhenoGen, Gene Editing Rat Resource Center, Rat Resource and Research Center, and the National BioResource Project for the Rat in Japan. © The Author 2017. Published by Oxford University Press.

  17. Long-term stability of the Wechsler Intelligence Scale for Children--Fourth Edition.

    PubMed

    Watkins, Marley W; Smith, Lourdes G

    2013-06-01

    Long-term stability of the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV; Wechsler, 2003) was investigated with a sample of 344 students from 2 school districts twice evaluated for special education eligibility at an average interval of 2.84 years. Test-retest reliability coefficients for the Verbal Comprehension Index (VCI), Perceptual Reasoning Index (PRI), Working Memory Index (WMI), Processing Speed Index (PSI), and the Full Scale IQ (FSIQ) were .72, .76, .66, .65, and .82, respectively. As predicted, the test-retest reliability coefficients for the subtests (Mdn = .56) were generally lower than the index scores (Mdn = .69) and the FSIQ (.82). On average, subtest scores did not differ by more than 1 point, and index scores did not differ by more than 2 points across the test-retest interval. However, 25% of the students earned FSIQ scores that differed by 10 or more points, and 29%, 39%, 37%, and 44% of the students earned VCI, PRI, WMI, and PSI scores, respectively, that varied by 10 or more points. Given this variability, it cannot be assumed that WISC-IV scores will be consistent across long test-retest intervals for individual students. PsycINFO Database Record (c) 2013 APA, all rights reserved.
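
    As a rough sanity check on the variability reported above, the stability coefficients can be converted into an expected spread of retest differences with the standard test-theory relation sd_diff = SD x sqrt(2 x (1 - r)). The short Python sketch below assumes an IQ scale standard deviation of 15 and normally distributed retest differences; it is an illustrative back-of-envelope calculation, not the authors' analysis.

        # Back-of-envelope check on the reported FSIQ retest variability.
        # Assumes an IQ scale SD of 15 and normally distributed differences;
        # the paper's own modelling may differ in detail.
        import math

        SD, r = 15.0, 0.82                     # scale SD and reported FSIQ stability
        sd_diff = SD * math.sqrt(2 * (1 - r))  # expected SD of retest differences
        z = 10 / sd_diff                       # a 10-point change in SD-of-difference units
        p_ge_10 = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))  # two-tailed tail area

        print(round(sd_diff, 1), round(p_ge_10, 2))   # ~9.0 points, ~0.27

    With r = .82 this gives a difference standard deviation of about 9 points and roughly a quarter of examinees expected to shift by 10 or more points, consistent with the 25% figure reported for the FSIQ.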

  18. Movement concepts approach in studies on flamenco dancing: A systematic review.

    PubMed

    Forczek, Wanda; Baena-Chicón, Irene; Vargas-Macías, Alfonso

    2017-10-01

    Flamenco is a highly emotional and demanding dance form. It is important to understand how the dancer's body works in order to improve fitness levels and reduce injuries. Thus, our investigation reviewed studies on kinesiological aspects of flamenco over recent years. The review was restricted to experimental studies. Literature searches were conducted using the following databases: PubMed, Scopus, and Ebsco: SPORTDiscus with Full Text, Medline, Health Source: Nursing/Academic Edition, Health Source - Consumer Edition. After limiting the search, 180 potential articles remained for analysis. A total of 27 papers on different aspects of flamenco dance were finally selected: biomechanics (14), podiatry (6), injury incidence (3), anthropometry (2), and physiology (2). These studies have applied well-established methods from sports studies. However, we noted a number of potential limitations when applied to flamenco. The evidence from this review shows that flamenco dancing demands high levels of effort. Further research is required to understand how the dancer's body works in order to improve fitness levels and reduce injuries. Most of the results presented here are consistent among studies. However, there is a great scarcity of research addressing flamenco movement in a more comprehensive perspective.

  19. Evolutionary analysis reveals regulatory and functional landscape of coding and non-coding RNA editing.

    PubMed

    Zhang, Rui; Deng, Patricia; Jacobson, Dionna; Li, Jin Billy

    2017-02-01

    Adenosine-to-inosine RNA editing diversifies the transcriptome and promotes functional diversity, particularly in the brain. A plethora of editing sites has been recently identified; however, how they are selected and regulated and which are functionally important are largely unknown. Here we show the cis-regulation and stepwise selection of RNA editing during Drosophila evolution and pinpoint a large number of functional editing sites. We found that the establishment of editing and variation in editing levels across Drosophila species are largely explained and predicted by cis-regulatory elements. Furthermore, editing events that arose early in the species tree tend to be more highly edited in clusters and enriched in slowly-evolved neuronal genes, thus suggesting that the main role of RNA editing is for fine-tuning neurological functions. While nonsynonymous editing events have been long recognized as playing a functional role, in addition to nonsynonymous editing sites, a large fraction of 3'UTR editing sites is evolutionarily constrained, highly edited, and thus likely functional. We find that these 3'UTR editing events can alter mRNA stability and affect miRNA binding and thus highlight the functional roles of noncoding RNA editing. Our work, through evolutionary analyses of RNA editing in Drosophila, uncovers novel insights of RNA editing regulation as well as its functions in both coding and non-coding regions.

  20. Evolutionary analysis reveals regulatory and functional landscape of coding and non-coding RNA editing

    PubMed Central

    Jacobson, Dionna

    2017-01-01

    Adenosine-to-inosine RNA editing diversifies the transcriptome and promotes functional diversity, particularly in the brain. A plethora of editing sites has been recently identified; however, how they are selected and regulated and which are functionally important are largely unknown. Here we show the cis-regulation and stepwise selection of RNA editing during Drosophila evolution and pinpoint a large number of functional editing sites. We found that the establishment of editing and variation in editing levels across Drosophila species are largely explained and predicted by cis-regulatory elements. Furthermore, editing events that arose early in the species tree tend to be more highly edited in clusters and enriched in slowly-evolved neuronal genes, thus suggesting that the main role of RNA editing is for fine-tuning neurological functions. While nonsynonymous editing events have been long recognized as playing a functional role, in addition to nonsynonymous editing sites, a large fraction of 3’UTR editing sites is evolutionarily constrained, highly edited, and thus likely functional. We find that these 3’UTR editing events can alter mRNA stability and affect miRNA binding and thus highlight the functional roles of noncoding RNA editing. Our work, through evolutionary analyses of RNA editing in Drosophila, uncovers novel insights of RNA editing regulation as well as its functions in both coding and non-coding regions. PMID:28166241

  1. Occurrence of plastid RNA editing in all major lineages of land plants

    PubMed Central

    Freyer, Regina; Kiefer-Meyer, Marie-Christine; Kössel, Hans

    1997-01-01

    RNA editing changes posttranscriptionally single nucleotides in chloroplast-encoded transcripts. Although much work has been done on mechanistic and functional aspects of plastid editing, little is known about evolutionary aspects of this RNA processing step. To gain a better understanding of the evolution of RNA editing in plastids, we have investigated the editing patterns in ndhB and rbcL transcripts from various species comprising all major groups of land plants. Our results indicate that RNA editing occurs in plastids of bryophytes, fern allies, true ferns, gymnosperms, and angiosperms. Both editing frequencies and editing patterns show a remarkable degree of interspecies variation. Furthermore, we have found that neither plastid editing frequencies nor the editing pattern of a specific transcript correlate with the phylogenetic tree of the plant kingdom. The poor evolutionary conservation of editing sites among closely related species as well as the occurrence of single species-specific editing sites suggest that the differences in the editing patterns and editing frequencies are probably due both to independent loss and to gain of editing sites. In addition, our results indicate that RNA editing is a relatively ancient process that probably predates the evolution of land plants. This supposition is in good agreement with the phylogenetic data obtained for plant mitochondrial RNA editing, thus providing additional evidence for common evolutionary roots of the two plant organellar editing systems. PMID:9177209

  2. [The research on the edition of Daquanbencao (Complete Collection of Materia Medica)].

    PubMed

    Li, Jian; Zhang, Wei; Zhang, Rui-xian

    2009-07-01

    Zhengleibencao (Classified Materia Medica) had been formed into several kinds of edition systems during its dissemination, among which there was the edition system of Daquanbencao (Complete Collection of Materia Medica). Daquanbencao was originally carved in the Jin dynasty; thereafter it was re-carved in the Yuan, Ming and Qing dynasties so as to form a series of editions, such as the edition of Zhenyou in the second year of the Jin dynasty; the edition of the Zongwenshuyuan college in the Dade renyan year of the Yuan dynasty; WANG Qiu's carved edition of the Shangyitang hall in the Ming dynasty; the carved edition of Jishanshuyuan, the Jishan mountain college, in the Ming dynasty; the reprinted edition of PENG Duan-wu in the Ming dynasty; the supplementary edition of YANG Bi-da in the Qing dynasty; and the carved edition of KE Feng-shi in the Qing dynasty. Among all the editions, Chongkanjingshizhengleidaquanbencao (Reprinted Classified Daquan Materia Medica from Historical Classics) was the representative one. As a representative of the above editions, the carved edition of WANG took the edition of the Zongwenshuyuan college of the Yuan dynasty as the original edition, but the images of materia medica were adopted from the edition of Zhenghebencao (Materia Medica of the Zhenghe era).

  3. Human coding RNA editing is generally nonadaptive

    PubMed Central

    Xu, Guixia; Zhang, Jianzhi

    2014-01-01

    Impairment of RNA editing at a handful of coding sites causes severe disorders, prompting the view that coding RNA editing is highly advantageous. Recent genomic studies have expanded the list of human coding RNA editing sites by more than 100 times, raising the question of how common advantageous RNA editing is. Analyzing 1,783 human coding A-to-G editing sites, we show that both the frequency and level of RNA editing decrease as the importance of a site or gene increases; that during evolution, edited As are more likely than unedited As to be replaced with Gs but not with Ts or Cs; and that among nonsynonymously edited As, those that are evolutionarily least conserved exhibit the highest editing levels. These and other observations reveal the overall nonadaptive nature of coding RNA editing, despite the presence of a few sites in which editing is clearly beneficial. We propose that most observed coding RNA editing results from tolerable promiscuous targeting by RNA editing enzymes, the original physiological functions of which remain elusive. PMID:24567376

  4. Measurements of hygroscopicity and volatility of atmospheric ultrafine particles during ultrafine particle formation events at urban, industrial, and coastal sites.

    PubMed

    Park, Kihong; Kim, Jae-Seok; Park, Seung Ho

    2009-09-01

    The tandem differential mobility analyzer (TDMA) technique was applied to determine the hygroscopicity and volatility of atmospheric ultrafine particles at three sites: urban Gwangju, industrial Yeosu, and coastal Taean in South Korea. A database for the hygroscopicity and volatility of the known compositions and sizes of the laboratory-generated particles was first constructed for comparison with the measured properties of atmospheric ultrafine particles. Distinct differences in hygroscopicity and volatility of atmospheric ultrafine particles were found between a "photochemical event" and a "combustion event" as well as among different sites. At the Gwangju site, ultrafine particles in the "photochemical event" were determined to be more hygroscopic (growth factor (GF) = 1.05-1.33) than those in the "combustion event" (GF = 1.02-1.12), but their hygroscopicity was not as high as pure ammonium sulfate or sulfuric acid particles in the laboratory-generated database, suggesting they were internally mixed with less soluble species. Ultrafine particles in the "photochemical event" at the Yeosu site, having a variety of SO2, CO, and VOC emission sources, were more hygroscopic (GF = 1.34-1.60) and had a higher amount of volatile species (47-75%) than those observed at the Gwangju site. Ultrafine particle concentration at the Taean site increased during daylight hours with low tide, having a higher GF (1.34-1.80) than the Gwangju site and a lower amount of volatile species (17-34%) than the Yeosu site. Occasionally ultrafine particles were externally mixed according to their hygroscopicity and volatility, and TEM/EDS data showed that each type of particle had a distinct morphology and elemental composition.

  5. A mobile trauma database with charge capture.

    PubMed

    Moulton, Steve; Myung, Dan; Chary, Aron; Chen, Joshua; Agarwal, Suresh; Emhoff, Tim; Burke, Peter; Hirsch, Erwin

    2005-11-01

    Charge capture plays an important role in every surgical practice. We have developed and merged a custom mobile database (DB) system with our trauma registry (TRACS), to better understand our billing methods, revenue generators, and areas for improved revenue capture. The mobile database runs on handheld devices using the Windows Compact Edition platform. The front end was written in C# and the back end is SQL. The mobile database operates as a thick client; it includes active and inactive patient lists, billing screens, hot pick lists, and Current Procedural Terminology and International Classification of Diseases, Ninth Revision code sets. Microsoft Internet Information Server provides secure data transaction services between the back ends stored on each device. Traditional, hand-written billing information for three of five adult trauma surgeons was averaged over a 5-month period. Electronic billing information was then collected over a 3-month period using handheld devices and the subject software application. One surgeon used the software for all 3 months, and two surgeons used it for the latter 2 months of the electronic data collection period. This electronic billing information was combined with TRACS data to determine the clinical characteristics of the trauma patients who were and were not captured using the mobile database. Total charges increased by 135%, 148%, and 228% for each of the three trauma surgeons who used the mobile DB application. The majority of additional charges were for evaluation and management services. Patients who were captured and billed at the point of care using the mobile DB had higher Injury Severity Scores, were more likely to undergo an operative procedure, and had longer lengths of stay compared with those who were not captured. Total charges more than doubled using a mobile database to bill at the point of care. A subsequent comparison of TRACS data with billing information revealed a large amount of uncaptured patient revenue. Greater familiarity and broader use of mobile database technology holds the potential for even greater revenue capture.

  6. PS1-41: Just Add Data: Implementing an Event-Based Data Model for Clinical Trial Tracking

    PubMed Central

    Fuller, Sharon; Carrell, David; Pardee, Roy

    2012-01-01

    Background/Aims: Clinical research trials often have similar fundamental tracking needs, despite being quite variable in their specific logic and activities. A model tracking database that can be quickly adapted by a variety of studies has the potential to achieve significant efficiencies in database development and maintenance. Methods: Over the course of several different clinical trials, we have developed a database model that is highly adaptable to a variety of projects. Rather than hard-coding each specific event that might occur in a trial, along with its logical consequences, this model considers each event and its parameters to be a data record in its own right. Each event may have related variables (metadata) describing its prerequisites, subsequent events due, associated mailings, or events that it overrides. The metadata for each event is stored in the same record with the event name. When changes are made to the study protocol, no structural changes to the database are needed. One has only to add or edit events and their metadata. Changes in the event metadata automatically determine any related logic changes. In addition to streamlining application code, this model simplifies communication between the programmer and other team members. Database requirements can be phrased as changes to the underlying data, rather than to the application code. The project team can review a single report of events and metadata and easily see where changes might be needed. In addition to benefitting from streamlined code, the front end database application can also implement useful standard features such as automated mail merges and to-do lists. Results: The event-based data model has proven itself to be robust, adaptable and user-friendly in a variety of study contexts. We have chosen to implement it as a SQL Server back end and distributed Access front end. Interested readers may request a copy of the Access front end and scripts for creating the back end database. Discussion: An event-based database with a consistent, robust set of features has the potential to significantly reduce development time and maintenance expense for clinical trial tracking databases.
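
    The event-as-data idea described above can be illustrated with a minimal, self-contained sketch. The study used a SQL Server back end with an Access front end; the sketch below uses Python's built-in sqlite3 module only so it runs anywhere, and the table names, columns, and sample events are illustrative assumptions rather than the authors' schema.

        # Minimal sketch of an event-based tracking model: each event type is a
        # data record carrying its own scheduling metadata, so protocol changes
        # become data edits rather than schema or code changes.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE event_types (
            name          TEXT PRIMARY KEY,
            prerequisite  TEXT,      -- event that must precede this one
            next_due      TEXT,      -- event triggered when this one completes
            due_in_days   INTEGER,   -- delay before the triggered event is due
            mailing       TEXT       -- associated mail-merge template, if any
        );
        CREATE TABLE participant_events (
            participant_id INTEGER,
            event_name     TEXT REFERENCES event_types(name),
            completed_on   TEXT
        );
        """)

        # Hypothetical study events; amending the protocol means editing these rows.
        conn.executemany(
            "INSERT INTO event_types VALUES (?, ?, ?, ?, ?)",
            [("consent",        None,             "baseline_visit",  14, "welcome_letter"),
             ("baseline_visit", "consent",        "followup_6mo",   180, None),
             ("followup_6mo",   "baseline_visit", None,            None, "reminder_letter")],
        )
        conn.execute("INSERT INTO participant_events VALUES (1, 'consent', '2024-01-10')")

        # "What is due next?" falls out of joining completed events to their metadata.
        due = conn.execute("""
            SELECT p.participant_id, e.next_due,
                   date(p.completed_on, '+' || e.due_in_days || ' days') AS due_date
            FROM participant_events p JOIN event_types e ON e.name = p.event_name
            WHERE e.next_due IS NOT NULL
        """).fetchall()
        print(due)   # [(1, 'baseline_visit', '2024-01-24')]

    Under such a layout, a protocol amendment is an update to the event_types rows rather than a change to the application code, which is the adaptability the abstract describes.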

  7. C-to-U editing and site-directed RNA editing for the correction of genetic mutations.

    PubMed

    Vu, Luyen Thi; Tsukahara, Toshifumi

    2017-07-24

    Cytidine to uridine (C-to-U) editing is one type of substitutional RNA editing. It occurs in both mammals and plants. The molecular mechanism of C-to-U editing involves the hydrolytic deamination of a cytosine to a uracil base. C-to-U editing is mediated by RNA-specific cytidine deaminases and several complementation factors, which have not been completely identified. Here, we review recent findings related to the regulation and enzymatic basis of C-to-U RNA editing. More importantly, when C-to-U editing occurs in coding regions, it has the power to reprogram genetic information on the RNA level, therefore it has great potential for applications in transcript repair (diseases related to thymidine to cytidine (T>C) or adenosine to guanosine (A>G) point mutations). If it is possible to manipulate or mimic C-to-U editing, T>C or A>G genetic mutation-related diseases could be treated. Enzymatic and non-enzymatic site-directed RNA editing are two different approaches for mimicking C-to-U editing. For enzymatic site-directed RNA editing, C-to-U editing has not yet been successfully performed, and in theory, adenosine to inosine (A-to-I) editing involves the same strategy as C-to-U editing. Therefore, in this review, for applications in transcript repair, we will provide a detailed overview of enzymatic site-directed RNA editing, with a focus on A-to-I editing and non-enzymatic site-directed C-to-U editing.

  8. Whistleblowing: An integrative literature review of data-based studies involving nurses.

    PubMed

    Jackson, Debra; Hickman, Louise D; Hutchinson, Marie; Andrew, Sharon; Smith, James; Potgieter, Ingrid; Cleary, Michelle; Peters, Kath

    2014-10-27

    Aim: To summarise and critique the research literature about whistleblowing and nurses. Background: Whistleblowing is identified as a crucial issue in the maintenance of healthcare standards, and nurses are frequently involved in whistleblowing events. Despite the importance of this issue, to our knowledge an evaluation of this body of the data-based literature has not been undertaken. Method: An integrative literature review approach was used to summarise and critique the research literature. A comprehensive search of five databases, including Medline, CINAHL, PubMed and Health Source: Nursing/Academic Edition, as well as Google, was conducted using the terms 'whistleblow*' and 'nurs*'. In addition, relevant journals were examined, as well as reference lists of retrieved papers. Papers published during the years 2007-2013 were selected for inclusion. Findings: Fifteen papers were identified, capturing data from nurses in seven countries. The findings in this review demonstrate a growing body of research for the nursing profession at large to engage with and respond appropriately to issues involving suboptimal patient care or organisational wrongdoing. Conclusions: Nursing plays a key role in maintaining practice standards and in reporting care that is unacceptable, although the repercussions to nurses who raise concerns are insupportable. Overall, whistleblowing and how it influences the individual, their family, work colleagues, nursing practice and policy requires further national and international research attention.

  9. Whistleblowing: An integrative literature review of data-based studies involving nurses.

    PubMed

    Jackson, Debra; Hickman, Louise D; Hutchinson, Marie; Andrew, Sharon; Smith, James; Potgieter, Ingrid; Cleary, Michelle; Peters, Kath

    2014-01-01

    Aim: To summarise and critique the research literature about whistleblowing and nurses. Whistleblowing is identified as a crucial issue in the maintenance of healthcare standards, and nurses are frequently involved in whistleblowing events. Despite the importance of this issue, to our knowledge an evaluation of this body of the data-based literature has not been undertaken. An integrative literature review approach was used to summarise and critique the research literature. Five databases, including Medline, CINAHL, PubMed and Health Source: Nursing/Academic Edition, as well as Google, were searched using the terms 'whistleblow*' and 'nurs*'. In addition, relevant journals were examined, as well as reference lists of retrieved papers. Papers published during the years 2007-2013 were selected for inclusion. Fifteen papers were identified, capturing data from nurses in seven countries. The findings in this review demonstrate a growing body of research for the nursing profession at large to engage with and respond appropriately to issues involving suboptimal patient care or organisational wrongdoing. Nursing plays a key role in maintaining practice standards and in reporting care that is unacceptable, although the repercussions to nurses who raise concerns are insupportable. Overall, whistleblowing and how it influences the individual, their family, work colleagues, nursing practice and policy requires further national and international research attention.

  10. CoCoMac 2.0 and the future of tract-tracing databases

    PubMed Central

    Bakker, Rembrandt; Wachtler, Thomas; Diesmann, Markus

    2012-01-01

    The CoCoMac database contains the results of several hundred published axonal tract-tracing studies in the macaque monkey brain. The combined results are used for constructing the macaque macro-connectome. Here we discuss the redevelopment of CoCoMac and compare it to six connectome-related projects: two online resources that provide full access to raw tracing data in rodents, a connectome viewer for advanced 3D graphics, a partial but highly detailed rat connectome, a brain data management system that generates custom connectivity matrices, and a software package that covers the complete pipeline from connectivity data to large-scale brain simulations. The second edition of CoCoMac features many enhancements over the original. For example, a search wizard is provided for full access to all tables and their nested dependencies. Connectivity matrices can be computed on demand in a user-selected nomenclature. A new data entry system is available as a preview, and is to become a generic solution for community-driven data entry in manually collated databases. We conclude with the question whether neuronal tracing will remain the gold standard to uncover the wiring of brains, thereby highlighting developments in human connectome construction, tracer substances, polarized light imaging, and serial block-face scanning electron microscopy. PMID:23293600

  11. CoCoMac 2.0 and the future of tract-tracing databases.

    PubMed

    Bakker, Rembrandt; Wachtler, Thomas; Diesmann, Markus

    2012-01-01

    The CoCoMac database contains the results of several hundred published axonal tract-tracing studies in the macaque monkey brain. The combined results are used for constructing the macaque macro-connectome. Here we discuss the redevelopment of CoCoMac and compare it to six connectome-related projects: two online resources that provide full access to raw tracing data in rodents, a connectome viewer for advanced 3D graphics, a partial but highly detailed rat connectome, a brain data management system that generates custom connectivity matrices, and a software package that covers the complete pipeline from connectivity data to large-scale brain simulations. The second edition of CoCoMac features many enhancements over the original. For example, a search wizard is provided for full access to all tables and their nested dependencies. Connectivity matrices can be computed on demand in a user-selected nomenclature. A new data entry system is available as a preview, and is to become a generic solution for community-driven data entry in manually collated databases. We conclude with the question whether neuronal tracing will remain the gold standard to uncover the wiring of brains, thereby highlighting developments in human connectome construction, tracer substances, polarized light imaging, and serial block-face scanning electron microscopy.

  12. Database-driven web interface automating gyrokinetic simulations for validation

    NASA Astrophysics Data System (ADS)

    Ernst, D. R.

    2010-11-01

    We are developing a web interface to connect plasma microturbulence simulation codes with experimental data. The website automates the preparation of gyrokinetic simulations utilizing plasma profile and magnetic equilibrium data from TRANSP analysis of experiments, read from MDSPLUS over the internet. This database-driven tool saves user sessions, allowing searches of previous simulations, which can be restored to repeat the same analysis for a new discharge. The website includes a multi-tab, multi-frame, publication-quality java plotter, Webgraph, developed as part of this project. Input files can be uploaded as templates and edited with context-sensitive help. The website creates inputs for GS2 and GYRO using a well-tested and verified back-end, in use for several years for the GS2 code [D. R. Ernst et al., Phys. Plasmas 11(5) 2637 (2004)]. A centralized web site has the advantage that users receive bug fixes instantaneously, while avoiding the duplicated effort of local compilations. Possible extensions to the database to manage run outputs, toward prototyping for the Fusion Simulation Project, are envisioned. Much of the web development utilized support from the DoE National Undergraduate Fellowship program [e.g., A. Suarez and D. R. Ernst, http://meetings.aps.org/link/BAPS.2005.DPP.GP1.57].

  13. Gene therapy clinical trials worldwide to 2017: An update.

    PubMed

    Ginn, Samantha L; Amaya, Anais K; Alexander, Ian E; Edelstein, Michael; Abedi, Mohammad R

    2018-03-25

    To date, almost 2600 gene therapy clinical trials have been completed, are ongoing or have been approved worldwide. Our database brings together global information on gene therapy clinical activity from trial databases, official agency sources, published literature, conference presentations and posters kindly provided to us by individual investigators or trial sponsors. This review presents our analysis of clinical trials that, to the best of our knowledge, have been or are being performed worldwide. As of our November 2017 update, we have entries on 2597 trials undertaken in 38 countries. We have analysed the geographical distribution of trials, the disease indications (or other reasons) for trials, the proportions to which different vector types are used, and the genes that have been transferred. Details of the analyses presented, and our searchable database are available via The Journal of Gene Medicine Gene Therapy Clinical Trials Worldwide website at: http://www.wiley.co.uk/genmed/clinical. We also provide an overview of the progress being made in gene therapy clinical trials around the world, and discuss key trends since the previous review, namely the use of chimeric antigen receptor T cells for the treatment of cancer and advancements in genome editing technologies, which have the potential to transform the field moving forward. Copyright © 2018 John Wiley & Sons, Ltd.

  14. Genome-wide identification of RNA editing in hepatocellular carcinoma.

    PubMed

    Kang, Lin; Liu, Xiaoqiao; Gong, Zhoulin; Zheng, Hancheng; Wang, Jun; Li, Yingrui; Yang, Huanming; Hardwick, James; Dai, Hongyue; Poon, Ronnie T P; Lee, Nikki P; Mao, Mao; Peng, Zhiyu; Chen, Ronghua

    2015-02-01

    We performed whole-transcriptome sequencing and whole-genome sequencing on nine pairs of hepatocellular carcinoma (HCC) tumors and matched adjacent tissues to identify RNA editing events. Using an improved bioinformatics pipeline, we identified a mean of 26,982 editing sites per sample, of which a mean of 89.5% were canonical A→G edits. The editing rate was significantly higher in tumors than in adjacent normal tissues. Comparing tumor and normal tissues within each patient, we found 7 non-synonymous tissue-specific editing events in coding regions, including 4 tumor-specific edits and 3 normal-specific edits, as well as 292 edits that varied in editing degree. Significant expression changes of 150 genes associated with RNA editing were found in tumors, with 3 of the 4 most significant genes being cancer related. Our results show that editing might be related to higher gene expression. These findings indicate that RNA editing modification may play an important role in the development of HCC. Copyright © 2014 Elsevier Inc. All rights reserved.
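
    The core comparison behind such a pipeline, genomic A versus transcriptomic G, can be illustrated with a toy filter. The positions, read counts, and thresholds below are invented for illustration; a real pipeline like the one in this study adds alignment-quality filters, strand handling, and removal of known SNPs.

        # Toy sketch of calling candidate A-to-I(G) editing sites: at positions
        # where the genome is homozygous A, call a candidate when enough RNA-seq
        # reads show G. All records and cutoffs here are invented placeholders.
        sites = [
            # (position, DNA genotype, RNA reads A, RNA reads G)
            ("chr1:1040", "AA", 12, 9),
            ("chr1:2220", "AA", 30, 1),
            ("chr2:8812", "AG", 10, 10),   # heterozygous SNP, not editing
            ("chr3:5120", "AA", 4, 4),     # too little coverage
        ]

        MIN_COVERAGE, MIN_EDIT_FRACTION = 10, 0.10

        candidates = []
        for pos, genotype, a_reads, g_reads in sites:
            depth = a_reads + g_reads
            if genotype != "AA" or depth < MIN_COVERAGE:
                continue
            frac = g_reads / depth
            if frac >= MIN_EDIT_FRACTION:
                candidates.append((pos, round(frac, 2)))

        print(candidates)   # [('chr1:1040', 0.43)], i.e. an editing level of ~43% at that site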

  15. Interdisciplinary Collaboration amongst Colleagues and between Initiatives with the Magnetics Information Consortium (MagIC) Database

    NASA Astrophysics Data System (ADS)

    Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Tauxe, L.; Constable, C.; Jonestrask, L.; Shaar, R.

    2014-12-01

    Earth science grand challenges often require interdisciplinary and geographically distributed scientific collaboration to make significant progress. However, this organic collaboration between researchers, educators, and students only flourishes with the reduction or elimination of technological barriers. The Magnetics Information Consortium (http://earthref.org/MagIC/) is a grass-roots cyberinfrastructure effort envisioned by the geo-, paleo-, and rock magnetic scientific community to archive their wealth of peer-reviewed raw data and interpretations from studies on natural and synthetic samples. MagIC is dedicated to facilitating scientific progress towards several highly multidisciplinary grand challenges and the MagIC Database team is currently beta testing a new MagIC Search Interface and API designed to be flexible enough for the incorporation of large heterogeneous datasets and for horizontal scalability to tens of millions of records and hundreds of requests per second. In an effort to reduce the barriers to effective collaboration, the search interface includes a simplified data model and upload procedure, support for online editing of datasets amongst team members, commenting by reviewers and colleagues, and automated contribution workflows and data retrieval through the API. This web application has been designed to generalize to other databases in MagIC's umbrella website (EarthRef.org) so the Geochemical Earth Reference Model (http://earthref.org/GERM/) portal, Seamount Biogeosciences Network (http://earthref.org/SBN/), EarthRef Digital Archive (http://earthref.org/ERDA/) and EarthRef Reference Database (http://earthref.org/ERR/) will benefit from its development.

  16. Pharmacology Portal: An Open Database for Clinical Pharmacologic Laboratory Services.

    PubMed

    Karlsen Bjånes, Tormod; Mjåset Hjertø, Espen; Lønne, Lars; Aronsen, Lena; Andsnes Berg, Jon; Bergan, Stein; Otto Berg-Hansen, Grim; Bernard, Jean-Paul; Larsen Burns, Margrete; Toralf Fosen, Jan; Frost, Joachim; Hilberg, Thor; Krabseth, Hege-Merete; Kvan, Elena; Narum, Sigrid; Austgulen Westin, Andreas

    2016-01-01

    More than 50 Norwegian public and private laboratories provide one or more analyses for therapeutic drug monitoring or testing for drugs of abuse. Practices differ among laboratories, and analytical repertoires can change rapidly as new substances become available for analysis. The Pharmacology Portal was developed to provide an overview of these activities and to standardize the practices and terminology among laboratories. The Pharmacology Portal is a modern dynamic web database comprising all available analyses within therapeutic drug monitoring and testing for drugs of abuse in Norway. Content can be retrieved by using the search engine or by scrolling through substance lists. The core content is a substance registry updated by a national editorial board of experts within the field of clinical pharmacology. This ensures quality and consistency regarding substance terminologies and classification. All laboratories publish their own repertoires in a user-friendly workflow, adding laboratory-specific details to the core information in the substance registry. The user management system ensures that laboratories are restricted from editing content in the database core or in repertoires within other laboratory subpages. The portal is for nonprofit use, and has been fully funded by the Norwegian Medical Association, the Norwegian Society of Clinical Pharmacology, and the 8 largest pharmacologic institutions in Norway. The database server runs an open-source content management system that ensures flexibility with respect to further development projects, including the potential expansion of the Pharmacology Portal to other countries. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.

  17. Plant rDNA database: update and new features.

    PubMed

    Garcia, Sònia; Gálvez, Francisco; Gras, Airy; Kovařík, Aleš; Garnatje, Teresa

    2014-01-01

    The Plant rDNA database (www.plantrdnadatabase.com) is an open access online resource providing detailed information on numbers, structures and positions of 5S and 18S-5.8S-26S (35S) ribosomal DNA loci. The data have been obtained from >600 publications on plant molecular cytogenetics, mostly based on fluorescent in situ hybridization (FISH). This edition of the database contains information on 1609 species derived from 2839 records, which means an expansion of 55.76 and 94.45%, respectively. It holds the data for angiosperms, gymnosperms, bryophytes and pteridophytes available as of June 2013. Information from publications reporting data for a single rDNA (either 5S or 35S alone) and annotation regarding transcriptional activity of 35S loci now appears in the database. Preliminary analyses suggest greater variability in the number of rDNA loci in gymnosperms than in angiosperms. New applications provide ideograms of the species showing the positions of rDNA loci as well as a visual representation of their genome sizes. We have also introduced other features to boost the usability of the Web interface, such as an application for convenient data export and a new section with rDNA-FISH-related information (mostly detailing protocols and reagents). In addition, we upgraded and/or proofread tabs and links and modified the website for a more dynamic appearance. This manuscript provides a synopsis of these changes and developments. http://www.plantrdnadatabase.com. © The Author(s) 2014. Published by Oxford University Press.

  18. Tripal: a construction toolkit for online genome databases.

    PubMed

    Ficklin, Stephen P; Sanderson, Lacey-Anne; Cheng, Chun-Huai; Staton, Margaret E; Lee, Taein; Cho, Il-Hyung; Jung, Sook; Bett, Kirstin E; Main, Doreen

    2011-01-01

    As the availability, affordability and magnitude of genomics and genetics research increases so does the need to provide online access to resulting data and analyses. Availability of a tailored online database is the desire for many investigators or research communities; however, managing the Information Technology infrastructure needed to create such a database can be an undesired distraction from primary research or potentially cost prohibitive. Tripal provides simplified site development by merging the power of Drupal, a popular web Content Management System with that of Chado, a community-derived database schema for storage of genomic, genetic and other related biological data. Tripal provides an interface that extends the content management features of Drupal to the data housed in Chado. Furthermore, Tripal provides a web-based Chado installer, genomic data loaders, web-based editing of data for organisms, genomic features, biological libraries, controlled vocabularies and stock collections. Also available are Tripal extensions that support loading and visualizations of NCBI BLAST, InterPro, Kyoto Encyclopedia of Genes and Genomes and Gene Ontology analyses, as well as an extension that provides integration of Tripal with GBrowse, a popular GMOD tool. An Application Programming Interface is available to allow creation of custom extensions by site developers, and the look-and-feel of the site is completely customizable through Drupal-based PHP template files. Addition of non-biological content and user-management is afforded through Drupal. Tripal is an open source and freely available software package found at http://tripal.sourceforge.net.

  19. Nursing leadership succession planning in Veterans Health Administration: creating a useful database.

    PubMed

    Weiss, Lizabeth M; Drake, Audrey

    2007-01-01

    An electronic database was developed for succession planning and placement of nursing leaders who are interested in and ready, willing, and able to accept an assignment in a nursing leadership position. The tool is a 1-page form used to identify candidates for nursing leadership assignments. This tool has been deployed nationally, with access to the database restricted to nurse executives at every Veterans Health Administration facility for the purpose of entering the names of developed nurse leaders ready for a leadership assignment. The tool is easily accessed through the Veterans Health Administration Office of Nursing Service, and limiting access to the nurse executive group ensures that the candidates identified are qualified. Information included on the survey tool covers the candidate's demographics and certifications/credentials. The completed form is entered into a database from which a report can be generated, resulting in a listing of potential candidates to contact to supplement a local or Veterans Integrated Service Network-wide position announcement. The data forms can be sorted by position, area of clinical or functional experience, training programs completed, and geographic preference. The forms can be edited, updated, added, or deleted in the system as the need is identified. This tool gives facilities with limited internal candidates a resource of Department of Veterans Affairs prepared staff from which to seek additional candidates. It also provides a way for interested candidates to be considered for positions outside of their local geographic area.

  20. Tripal: a construction toolkit for online genome databases

    PubMed Central

    Sanderson, Lacey-Anne; Cheng, Chun-Huai; Staton, Margaret E.; Lee, Taein; Cho, Il-Hyung; Jung, Sook; Bett, Kirstin E.; Main, Doreen

    2011-01-01

    As the availability, affordability and magnitude of genomics and genetics research increases so does the need to provide online access to resulting data and analyses. Availability of a tailored online database is the desire for many investigators or research communities; however, managing the Information Technology infrastructure needed to create such a database can be an undesired distraction from primary research or potentially cost prohibitive. Tripal provides simplified site development by merging the power of Drupal, a popular web Content Management System with that of Chado, a community-derived database schema for storage of genomic, genetic and other related biological data. Tripal provides an interface that extends the content management features of Drupal to the data housed in Chado. Furthermore, Tripal provides a web-based Chado installer, genomic data loaders, web-based editing of data for organisms, genomic features, biological libraries, controlled vocabularies and stock collections. Also available are Tripal extensions that support loading and visualizations of NCBI BLAST, InterPro, Kyoto Encyclopedia of Genes and Genomes and Gene Ontology analyses, as well as an extension that provides integration of Tripal with GBrowse, a popular GMOD tool. An Application Programming Interface is available to allow creation of custom extensions by site developers, and the look-and-feel of the site is completely customizable through Drupal-based PHP template files. Addition of non-biological content and user-management is afforded through Drupal. Tripal is an open source and freely available software package found at http://tripal.sourceforge.net PMID:21959868

  1. Alu elements shape the primate transcriptome by cis-regulation of RNA editing

    PubMed Central

    2014-01-01

    Background: RNA editing by adenosine to inosine deamination is a widespread phenomenon, particularly frequent in the human transcriptome, largely due to the presence of inverted Alu repeats and their ability to form double-stranded structures, a requisite for ADAR editing. While several hundred thousand editing sites have been identified within these primate-specific repeats, the function of Alu-editing has yet to be elucidated. Results: We show that inverted Alu repeats, expressed in the primate brain, can induce site-selective editing in cis on sites located several hundred nucleotides from the Alu elements. Furthermore, a computational analysis, based on available RNA-seq data, finds that site-selective editing occurs significantly closer to edited Alu elements than expected. These targets are poorly edited upon deletion of the editing inducers, as well as in homologous transcripts from organisms lacking Alus. Sequences surrounding sites near edited Alus in UTRs have been subjected to a lesser extent of evolutionary selection than those far from edited Alus, indicating that their editing generally depends on cis-acting Alus. Interestingly, we find an enrichment of primate-specific editing within encoded sequence or the UTRs of zinc finger-containing transcription factors. Conclusions: We propose a model whereby primate-specific editing is induced by adjacent Alu elements that function as recruitment elements for the ADAR editing enzymes. The enrichment of site-selective editing with potentially functional consequences on the expression of transcription factors indicates that editing contributes more profoundly to the transcriptomic regulation and repertoire in primates than previously thought. PMID:24485196

  2. Physical Science Informatics: Providing Open Science Access to Microheater Array Boiling Experiment Data

    NASA Technical Reports Server (NTRS)

    McQuillen, John; Green, Robert D.; Henrie, Ben; Miller, Teresa; Chiaramonte, Fran

    2014-01-01

    The Physical Science Informatics (PSI) system is the next step in an ongoing effort to make NASA-sponsored flight data available to the scientific and engineering community, along with the general public. The experimental data, drawn from six overall disciplines including Combustion Science, Fluid Physics, Complex Fluids, Fundamental Physics, and Materials Science, will present some unique challenges. Besides data in textual or numerical format, large portions of both the raw and analyzed data for many of these experiments are digital images and video, which impose large data storage requirements. In addition, the accessible data will include experiment design and engineering data (including applicable drawings), any analytical or numerical models, publications, reports, and patents, and any commercial products developed as a result of the research. The objectives of this paper include the following: present the preliminary layout (Figure 2) of MABE data within the PSI database; obtain feedback on the layout; and present the procedure for obtaining access to this database.

  3. Word aligned bitmap compression method, data structure, and apparatus

    DOEpatents

    Wu, Kesheng; Shoshani, Arie; Otoo, Ekow

    2004-12-14

    The Word-Aligned Hybrid (WAH) bitmap compression method and data structure is a relatively efficient method for searching and performing logical, counting, and pattern location operations upon large datasets. The technique is comprised of a data structure and methods that are optimized for computational efficiency by using the WAH compression method, which typically takes advantage of the target computing system's native word length. WAH is particularly apropos to infrequently varying databases, including those found in the on-line analytical processing (OLAP) industry, due to the increased computational efficiency of the WAH compressed bitmap index. Some commercial database products already include some version of a bitmap index, which could possibly be replaced by the WAH bitmap compression techniques for potentially increased operation speed, as well as increased efficiencies in constructing compressed bitmaps. Combined together, this technique may be particularly useful for real-time business intelligence. Additional WAH applications may include scientific modeling, such as climate and combustion simulations, to minimize search time for analysis and subsequent data visualization.
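
    The word-aligned idea can be sketched in a few lines: the bitmap is cut into chunks one bit narrower than the machine word, runs of identical all-zero or all-one chunks collapse into a single fill word, and everything else is kept as a literal word. The Python below is a simplified illustration of that general scheme under an assumed 32-bit word, not the patented encoding or its exact word layout.

        # Simplified word-aligned run-length compression of a bitmap, in the
        # spirit of WAH: 31-bit chunks, with runs of uniform chunks merged into
        # "fill" records and mixed chunks kept as "literal" records.
        WORD = 32                 # assumed machine word size
        CHUNK = WORD - 1          # payload bits per word; 1 bit flags literal vs fill

        def wah_encode(bits):
            """bits: string of '0'/'1'. Returns ('lit', payload) and
            ('fill', bit_value, run_length_in_chunks) records."""
            bits = bits + "0" * (-len(bits) % CHUNK)          # pad to whole chunks
            chunks = [bits[i:i + CHUNK] for i in range(0, len(bits), CHUNK)]
            out = []
            for c in chunks:
                if c == "0" * CHUNK or c == "1" * CHUNK:
                    fill_bit = c[0]
                    if out and out[-1][0] == "fill" and out[-1][1] == fill_bit:
                        out[-1] = ("fill", fill_bit, out[-1][2] + 1)   # extend the run
                    else:
                        out.append(("fill", fill_bit, 1))
                else:
                    out.append(("lit", c))
            return out

        # Long runs of zeros, common in bitmap indexes, compress to a single word:
        example = "1" * 10 + "0" * (31 * 5) + "0101" * 8
        print(wah_encode(example))

    Because fills and literals are whole machine words, logical AND/OR between two compressed bitmaps can proceed word by word without decompressing, which is the source of the query-speed advantage described above.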

  4. World commercial aircraft accidents: 1st edition, 1946--1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimura, C.Y.

    1992-02-01

    This report is a compilation of all accidents world-wide involving aircraft in commercial service which resulted in the loss of the airframe or one or more fatalities, or both. This information has been gathered in order to present a complete inventory of commercial aircraft accidents. Events involving military action, sabotage, terrorist bombings, hijackings, suicides, and industrial ground accidents are included within this list. This report is organized into six chapters. The first chapter is the introduction. The second chapter contains the compilation of accidents involving world commercial jet aircraft from 1952 to 1991. The third chapter presents a compilation of accidents involving world commercial turboprop aircraft from 1952 to 1991. The fourth chapter presents a compilation of accidents involving world commercial pistonprop aircraft with four or more engines from 1946 to 1991. Each accident compilation or database in chapters two, three and four is presented in chronological order. Each accident is presented with information in the following categories: date of accident, airline or operator and its flight number (if known), type of flight, type of aircraft and model, aircraft registration number, construction number/manufacturer's serial number, aircraft damage resulting from the accident, accident flight phase, accident location, number of fatalities, number of occupants, references used to compile the information, and finally the cause, remarks, or a brief description of the accident. The fifth chapter presents a list of all commercial aircraft accidents for all aircraft types with 100 or more fatalities in order of decreasing number of fatalities. Chapter six presents the commercial aircraft accidents for all aircraft types by flight phase. Future editions of this report will have additional follow-on chapters which will present other studies still in preparation at the time this edition was being prepared.

  5. Efficient algorithms for fast integration on large data sets from multiple sources.

    PubMed

    Mi, Tian; Rajasekaran, Sanguthevar; Aseltine, Robert

    2012-06-28

    Recent large scale deployments of health information technology have created opportunities for the integration of patient medical records with disparate public health, human service, and educational databases to provide comprehensive information related to health and development. Data integration techniques, which identify records belonging to the same individual that reside in multiple data sets, are essential to these efforts. Several algorithms have been proposed in the literature that are adept at integrating records from two different datasets. Our algorithms are aimed at integrating multiple (in particular, more than two) datasets efficiently. Hierarchical clustering-based solutions are used to integrate multiple datasets. Edit distance is used as the basic distance calculation, while distance calculation for common input errors is also studied. Several techniques have been applied to improve the algorithms in terms of both time and space: 1) Partial Construction of the Dendrogram (PCD), which ignores the levels above the threshold; 2) Ignoring the Dendrogram Structure (IDS); 3) Faster Computation of the Edit Distance (FCED), which predicts whether the distance falls within the threshold using upper bounds on edit distance; and 4) a pre-processing blocking phase that limits dynamic computation within each block. We have experimentally validated our algorithms on large simulated as well as real data. Accuracy and completeness are defined stringently to show the performance of our algorithms. In addition, we employ a four-category analysis. Comparison with FEBRL shows the robustness of our approach. In the experiments we conducted, the accuracy we observed exceeded 90% for the simulated data in most cases. Accuracies of 97.7% and 98.1% were achieved for the constant and proportional thresholds, respectively, in a real dataset of 1,083,878 records.
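
    Two of the techniques named above, blocking and threshold-aware edit distance, can be sketched briefly. The code below is an illustrative simplification: the blocking key (first character) and the sample records are assumptions, and the early-exit test is a generic row-minimum bound rather than the paper's FCED method.

        # Sketch of blocking plus thresholded edit distance for record linkage.
        # Only records sharing a cheap blocking key are compared, and the
        # dynamic program is abandoned as soon as the distance must exceed k.
        from collections import defaultdict

        def edit_distance_within(a, b, k):
            """Return True iff edit distance(a, b) <= k, abandoning early otherwise."""
            if abs(len(a) - len(b)) > k:          # cheap length-difference bound
                return False
            prev = list(range(len(b) + 1))
            for i, ca in enumerate(a, 1):
                cur = [i]
                for j, cb in enumerate(b, 1):
                    cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                                   prev[j - 1] + (ca != cb)))
                if min(cur) > k:                  # every cell already exceeds k
                    return False
                prev = cur
            return prev[-1] <= k

        records = ["johnson, mary", "jonson, mary", "smith, robert", "smyth, robert"]

        # Blocking: group by first letter so only same-block pairs are compared.
        blocks = defaultdict(list)
        for r in records:
            blocks[r[0]].append(r)

        matches = [(a, b) for group in blocks.values()
                   for i, a in enumerate(group) for b in group[i + 1:]
                   if edit_distance_within(a, b, k=2)]
        print(matches)   # [('johnson, mary', 'jonson, mary'), ('smith, robert', 'smyth, robert')]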

  6. Advanced Hepatocellular Carcinoma: Which Staging Systems Best Predict Prognosis?

    PubMed Central

    Huitzil-Melendez, Fidel-David; Capanu, Marinela; O'Reilly, Eileen M.; Duffy, Austin; Gansukh, Bolorsukh; Saltz, Leonard L.; Abou-Alfa, Ghassan K.

    2010-01-01

    Purpose: The purpose of cancer staging systems is to accurately predict patient prognosis. The outcome of advanced hepatocellular carcinoma (HCC) depends on both the cancer stage and the extent of liver dysfunction. Many staging systems that include both aspects have been developed. It remains unknown, however, which of these systems is optimal for predicting patient survival. Patients and Methods: Patients with advanced HCC treated over a 5-year period at Memorial Sloan-Kettering Cancer Center were identified from an electronic medical record database. Patients with sufficient data for utilization in all staging systems were included. TNM sixth edition, Okuda, Barcelona Clinic Liver Cancer (BCLC), Cancer of the Liver Italian Program (CLIP), Chinese University Prognostic Index (CUPI), Japan Integrated Staging (JIS), and Groupe d'Etude et de Traitement du Carcinome Hepatocellulaire (GETCH) systems were ranked on the basis of their accuracy at predicting survival by using the concordance index (c-index). Other independent prognostic variables were also identified. Results: Overall, 187 eligible patients were identified and were staged by using the seven staging systems. CLIP, CUPI, and GETCH were the three top-ranking staging systems. BCLC and TNM sixth edition lacked any meaningful prognostic discrimination. Performance status, AST, abdominal pain, and esophageal varices improved the discriminatory ability of CLIP. Conclusion: In our selected patient population, CLIP, CUPI, and GETCH were the most informative staging systems in predicting survival in patients with advanced HCC. Prospective validation is required to determine if they can be accurately used to stratify patients in clinical trials and to direct the appropriate need for systemic therapy versus best supportive care. BCLC and TNM sixth edition were not helpful in predicting survival outcome, and their use is not supported by our data. PMID:20458042
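
    The concordance index used to rank the staging systems can be illustrated with a toy calculation: among patient pairs with different survival times, count how often the system assigns the worse (higher) stage to the patient who died earlier, scoring stage ties as half. The sketch below ignores censoring for brevity and uses made-up stages and survival times, so it demonstrates the idea rather than the study's computation.

        # Toy concordance index (c-index) for a staging system, ignoring censoring.
        # Stages and survival times are invented placeholders.
        from itertools import combinations

        def c_index(stages, survival_months):
            concordant, usable = 0.0, 0
            for i, j in combinations(range(len(stages)), 2):
                if survival_months[i] == survival_months[j]:
                    continue                      # pair carries no ordering information
                usable += 1
                # shorter survival should correspond to a higher stage
                shorter, longer = (i, j) if survival_months[i] < survival_months[j] else (j, i)
                if stages[shorter] > stages[longer]:
                    concordant += 1
                elif stages[shorter] == stages[longer]:
                    concordant += 0.5             # ties in stage count half
            return concordant / usable

        stages   = [3, 1, 2, 4, 2]                # hypothetical stage assignments
        survival = [4, 30, 11, 2, 18]             # hypothetical survival in months
        print(round(c_index(stages, survival), 2))   # 0.95 for this toy data

    A value of 0.5 corresponds to random ordering and 1.0 to perfect ordering, which is why higher c-index systems such as CLIP, CUPI, and GETCH ranked above BCLC and TNM sixth edition in this comparison.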

  7. Stagnation point reverse flow combustor

    NASA Technical Reports Server (NTRS)

    Zinn, Ben T. (Inventor); Neumeier, Yedidia (Inventor); Seitzman, Jerry M. (Inventor); Jagoda, Jechiel (Inventor); Weksler, Yoav (Inventor)

    2008-01-01

    A method for combusting a combustible fuel includes providing a vessel having an opening near a proximate end and a closed distal end defining a combustion chamber. A combustible reactants mixture is presented into the combustion chamber. The combustible reactants mixture is ignited creating a flame and combustion products. The closed end of the combustion chamber is utilized for directing combustion products toward the opening of the combustion chamber creating a reverse flow of combustion products within the combustion chamber. The reverse flow of combustion products is intermixed with combustible reactants mixture to maintain the flame.

  8. RNA Editing and Its Molecular Mechanism in Plant Organelles

    PubMed Central

    Ichinose, Mizuho; Sugita, Mamoru

    2016-01-01

    RNA editing by cytidine (C) to uridine (U) conversions is widespread in plant mitochondria and chloroplasts. In some plant taxa, “reverse” U-to-C editing also occurs. However, to date, no instance of RNA editing has yet been reported in green algae and the complex thalloid liverworts. RNA editing may have evolved in early land plants 450 million years ago. However, in some plant species, including the liverwort, Marchantia polymorpha, editing may have been lost during evolution. Most RNA editing events can restore the evolutionarily conserved amino acid residues in mRNAs or create translation start and stop codons. Therefore, RNA editing is an essential process to maintain genetic information at the RNA level. Individual RNA editing sites are recognized by plant-specific pentatricopeptide repeat (PPR) proteins that are encoded in the nuclear genome. These PPR proteins are characterized by repeat elements that bind specifically to RNA sequences upstream of target editing sites. In flowering plants, non-PPR proteins also participate in multiple RNA editing events as auxiliary factors. C-to-U editing can be explained by cytidine deamination. The proteins discovered to date are important factors for RNA editing but a bona fide RNA editing enzyme has yet to be identified. PMID:28025543

  9. A comprehensive detailed chemical kinetic reaction mechanism for combustion of n-alkane hydrocarbons from n-octane to n-hexadecane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westbrook, Charles K.; Pitz, William J.; Herbinet, Olivier

    2009-01-15

    Detailed chemical kinetic reaction mechanisms have been developed to describe the pyrolysis and oxidation of nine n-alkanes larger than n-heptane, including n-octane (n-C8H18), n-nonane (n-C9H20), n-decane (n-C10H22), n-undecane (n-C11H24), n-dodecane (n-C12H26), n-tridecane (n-C13H28), n-tetradecane (n-C14H30), n-pentadecane (n-C15H32), and n-hexadecane (n-C16H34). These mechanisms include both high temperature and low temperature reaction pathways. The mechanisms are based on previous mechanisms for the primary reference fuels n-heptane and iso-octane, using the reaction classes first developed for n-heptane. Individual reaction class rules are as simple as possible in order to focus on the parallelism between all of the n-alkane fuels included in the mechanisms. These mechanisms are validated through extensive comparisons between computed and experimental data from a wide variety of different sources. In addition, numerical experiments are carried out to examine features of n-alkane combustion in which the detailed mechanisms can be used to compare reactivities of different n-alkane fuels. The mechanisms for these n-alkanes are presented as a single detailed mechanism, which can be edited to produce efficient mechanisms for any of the n-alkanes included, and the entire mechanism, with supporting thermochemical and transport data, together with an explanatory glossary explaining notations and structural details, is available for download from our web page. (author)

  10. Identification of chemical components of combustion emissions that affect pro-atherosclerotic vascular responses in mice

    PubMed Central

    Seilkop, Steven K.; Campen, Matthew J.; Lund, Amie K.; McDonald, Jacob D.; Mauderly, Joe L.

    2012-01-01

    Combustion emissions cause pro-atherosclerotic responses in apolipoprotein E-deficient (ApoE−/−) mice, but the causal components of these complex mixtures are unresolved. In studies previously reported, ApoE−/− mice were exposed by inhalation 6 h/day for 50 consecutive days to multiple dilutions of diesel or gasoline exhaust, wood smoke, or simulated “downwind” coal emissions. In this study, the combined four-study database was analyzed using the Multiple Additive Regression Trees (MART) data mining approach to determine putative causal exposure components regardless of combustion source. Over 700 physical–chemical components were grouped into 45 predictor variables. Response variables measured in aorta included endothelin-1, vascular endothelial growth factor, three matrix metalloproteinases (3, 7, 9), metalloproteinase inhibitor 2, heme oxygenase-1, and thiobarbituric acid reactive substances. Two or three predictors typically explained most of the variation in response among the experimental groups. Overall, sulfur dioxide, ammonia, nitrogen oxides, and carbon monoxide were most highly predictive of responses, although their rankings differed among the responses. Consistent with the earlier finding that filtration of particles had little effect on responses, particulate components ranked third to seventh in predictive importance for the eight response variables. MART proved useful for identifying putative causal components, although the small number of pollution mixtures (4) can provide only suggestive evidence of causality. The potential independent causal contributions of these gases to the vascular responses, as well as possible interactions among them and other components of complex pollutant mixtures, warrant further evaluation. PMID:22486345
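
    MART is Friedman's gradient-boosted regression tree method, so the predictor-ranking step described here can be approximated with any gradient-boosting implementation. The sketch below uses scikit-learn's GradientBoostingRegressor on a simulated exposure matrix X (45 grouped pollutant predictors) and a simulated response y; it illustrates the variable-ranking idea only and is not the authors' analysis pipeline.

      # Hedged sketch: rank pollutant predictors for one vascular response using
      # gradient-boosted regression trees (the method underlying MART).
      # X, y, and predictor_names are hypothetical stand-ins for the study data.
      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(0)
      n_groups, n_predictors = 40, 45
      predictor_names = [f'pollutant_{i}' for i in range(n_predictors)]
      X = rng.lognormal(size=(n_groups, n_predictors))     # exposure concentrations
      y = 2.0 * X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.1, size=n_groups)

      model = GradientBoostingRegressor(n_estimators=300, max_depth=2,
                                        learning_rate=0.05).fit(X, y)

      # Relative importance of each predictor, analogous to MART variable rankings.
      order = np.argsort(model.feature_importances_)[::-1]
      for idx in order[:5]:
          print(predictor_names[idx], round(model.feature_importances_[idx], 3))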

  11. Source apportionment of particulate pollutants in the atmosphere over the Northern Yellow Sea

    NASA Astrophysics Data System (ADS)

    Wang, L.; Qi, J. H.; Shi, J. H.; Chen, X. J.; Gao, H. W.

    2013-05-01

    Atmospheric aerosol samples were collected over the Northern Yellow Sea of China during 2006 and 2007, and total carbon (TC), Cu, Pb, Cd, V, Zn, Fe, Al, Na+, Ca2+, Mg2+, NH4+, NO3-, SO42-, Cl-, and K+ were measured. Principal component analysis (PCA) and positive matrix factorization (PMF) receptor models were used to identify the sources of particulate matter. The results indicated that seven factors contributed to the atmospheric particles over the Northern Yellow Sea: two secondary aerosols (sulfate and nitrate), soil dust, biomass burning, oil combustion, sea salt, and metal smelting. When the whole database was considered, secondary aerosol formation contributed the most to the atmospheric particle content, followed by soil dust; together, secondary aerosols and soil dust accounted for 65.65% of the total mass of particulate matter. The results also suggested that the aerosols over the Northern Yellow Sea were heavily influenced by ship emissions over the local sea area and by continental agricultural activities in northern China, as indicated by the high loading of V in the oil combustion factor and of K+ in the biomass burning factor. However, the contribution of each factor varied greatly across the seasons: in spring and autumn, soil dust and biomass burning were the dominant factors; in summer, heavy oil combustion contributed the most; and in winter, secondary aerosols were the major sources. Backward trajectory analysis indicated that 66% of the air masses in summer came from the ocean, while air masses in the other seasons came mainly from the continent.
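
    Positive matrix factorization approximates the sample-by-species concentration matrix as the product of non-negative source contributions and source profiles. The sketch below uses scikit-learn's NMF as a simplified, unweighted stand-in for the error-weighted PMF used in such studies; the species list, factor count, and simulated data are placeholders.

      # Hedged sketch: PMF-style source apportionment via non-negative matrix
      # factorization (unweighted NMF stands in for true error-weighted PMF).
      import numpy as np
      from sklearn.decomposition import NMF

      species = ['TC', 'Cu', 'Pb', 'V', 'Fe', 'Na+', 'NH4+', 'NO3-', 'SO42-', 'K+']
      rng = np.random.default_rng(1)
      X = rng.gamma(shape=2.0, scale=1.0, size=(120, len(species)))  # samples x species

      model = NMF(n_components=7, init='nndsvda', max_iter=1000, random_state=0)
      G = model.fit_transform(X)      # source contributions per sample (n x 7)
      F = model.components_           # source profiles per species (7 x species)

      # Average share of total contributions attributed to each factor.
      shares = G.sum(axis=0) / G.sum()
      for k, share in enumerate(shares):
          top = species[int(np.argmax(F[k]))]
          print(f'factor {k}: {share:.1%} of contributions, top species {top}')

    In a real apportionment, the factor profiles F would be inspected for marker species (e.g., V for oil combustion, K+ for biomass burning) to assign source identities.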

  12. Identification of chemical components of combustion emissions that affect pro-atherosclerotic vascular responses in mice.

    PubMed

    Seilkop, Steven K; Campen, Matthew J; Lund, Amie K; McDonald, Jacob D; Mauderly, Joe L

    2012-04-01

    Combustion emissions cause pro-atherosclerotic responses in apolipoprotein E-deficient (ApoE⁻/⁻) mice, but the causal components of these complex mixtures are unresolved. In studies previously reported, ApoE⁻/⁻ mice were exposed by inhalation 6 h/day for 50 consecutive days to multiple dilutions of diesel or gasoline exhaust, wood smoke, or simulated "downwind" coal emissions. In this study, the combined four-study database was analyzed using the Multiple Additive Regression Trees (MART) data mining approach to determine putative causal exposure components regardless of combustion source. Over 700 physical-chemical components were grouped into 45 predictor variables. Response variables measured in aorta included endothelin-1, vascular endothelial growth factor, three matrix metalloproteinases (3, 7, 9), metalloproteinase inhibitor 2, heme oxygenase-1, and thiobarbituric acid reactive substances. Two or three predictors typically explained most of the variation in response among the experimental groups. Overall, sulfur dioxide, ammonia, nitrogen oxides, and carbon monoxide were most highly predictive of responses, although their rankings differed among the responses. Consistent with the earlier finding that filtration of particles had little effect on responses, particulate components ranked third to seventh in predictive importance for the eight response variables. MART proved useful for identifying putative causal components, although the small number of pollution mixtures (4) can provide only suggestive evidence of causality. The potential independent causal contributions of these gases to the vascular responses, as well as possible interactions among them and other components of complex pollutant mixtures, warrant further evaluation.

  13. Understanding Editing Behaviors in Multilingual Wikipedia.

    PubMed

    Kim, Suin; Park, Sungjoon; Hale, Scott A; Kim, Sooyoung; Byun, Jeongmin; Oh, Alice H

    2016-01-01

    Multilingualism is common offline, but we have a more limited understanding of the ways multilingualism is displayed online and the roles that multilinguals play in the spread of content between speakers of different languages. We take a computational approach to studying multilingualism using one of the largest user-generated content platforms, Wikipedia. We study multilingualism by collecting and analyzing a large dataset of the content written by multilingual editors of the English, German, and Spanish editions of Wikipedia. This dataset contains over two million paragraphs edited by over 15,000 multilingual users from July 8 to August 9, 2013. We analyze these multilingual editors in terms of their engagement, interests, and language proficiency in their primary and non-primary (secondary) languages and find that the English edition of Wikipedia displays different dynamics from the Spanish and German editions. Users primarily editing the Spanish and German editions make more complex edits than users who edit these editions as a second language. In contrast, users editing the English edition as a second language make edits that are just as complex as the edits by users who primarily edit the English edition. In this way, English serves a special role bringing together content written by multilinguals from many language editions. Nonetheless, language remains a formidable hurdle to the spread of content: we find evidence for a complexity barrier whereby editors are less likely to edit complex content in a second language. In addition, we find that multilinguals are less engaged and show lower levels of language proficiency in their second languages. We also examine the topical interests of multilingual editors and find that there is no significant difference between primary and non-primary editors in each language.

  14. Understanding Editing Behaviors in Multilingual Wikipedia

    PubMed Central

    Hale, Scott A.; Kim, Sooyoung; Byun, Jeongmin; Oh, Alice H.

    2016-01-01

    Multilingualism is common offline, but we have a more limited understanding of the ways multilingualism is displayed online and the roles that multilinguals play in the spread of content between speakers of different languages. We take a computational approach to studying multilingualism using one of the largest user-generated content platforms, Wikipedia. We study multilingualism by collecting and analyzing a large dataset of the content written by multilingual editors of the English, German, and Spanish editions of Wikipedia. This dataset contains over two million paragraphs edited by over 15,000 multilingual users from July 8 to August 9, 2013. We analyze these multilingual editors in terms of their engagement, interests, and language proficiency in their primary and non-primary (secondary) languages and find that the English edition of Wikipedia displays different dynamics from the Spanish and German editions. Users primarily editing the Spanish and German editions make more complex edits than users who edit these editions as a second language. In contrast, users editing the English edition as a second language make edits that are just as complex as the edits by users who primarily edit the English edition. In this way, English serves a special role bringing together content written by multilinguals from many language editions. Nonetheless, language remains a formidable hurdle to the spread of content: we find evidence for a complexity barrier whereby editors are less likely to edit complex content in a second language. In addition, we find that multilinguals are less engaged and show lower levels of language proficiency in their second languages. We also examine the topical interests of multilingual editors and find that there is no significant difference between primary and non-primary editors in each language. PMID:27171158

  15. Pdf modeling for premixed turbulent combustion based on the properties of iso-concentration surfaces

    NASA Technical Reports Server (NTRS)

    Vervisch, L.; Kollmann, W.; Bray, K. N. C.; Mantel, T.

    1994-01-01

    In premixed turbulent flames the presence of intense mixing zones located in front of and behind the flame surface leads to a requirement to study the behavior of iso-concentration surfaces defined for all values of the progress variable (equal to unity in burnt gases and to zero in fresh mixtures). To support this study, some theoretical and mathematical tools devoted to level surfaces are first developed. Then a database of direct numerical simulations of turbulent premixed flames is generated and used to investigate the internal structure of the flame brush, and a new pdf model based on the properties of iso-surfaces is proposed.
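
    For reference, the progress variable mentioned above is commonly defined from a normalized scalar such as the fuel mass fraction; one conventional choice (not necessarily the exact definition used in this study) is, in LaTeX notation:

      c \;=\; \frac{Y_{F,u} - Y_{F}}{Y_{F,u} - Y_{F,b}},
      \qquad c = 0 \ \text{in the fresh mixture}, \qquad c = 1 \ \text{in fully burnt gases},

    so the iso-concentration surfaces studied are the level sets \{\mathbf{x} : c(\mathbf{x}, t) = c^{*}\} for 0 < c^{*} < 1.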

  16. Genetic mapping uncovers cis-regulatory landscape of RNA editing.

    PubMed

    Ramaswami, Gokul; Deng, Patricia; Zhang, Rui; Anna Carbone, Mary; Mackay, Trudy F C; Li, Jin Billy

    2015-09-16

    Adenosine-to-inosine (A-to-I) RNA editing, catalysed by ADAR enzymes conserved in metazoans, plays an important role in neurological functions. Although the fine-tuning mechanism provided by A-to-I RNA editing is important, the underlying rules governing ADAR substrate recognition are not well understood. We apply a quantitative trait loci (QTL) mapping approach to identify genetic variants associated with variability in RNA editing. Measuring RNA editing levels with high accuracy at 789 sites in 131 Drosophila melanogaster strains, we identify 545 editing QTLs (edQTLs) associated with differences in RNA editing. We demonstrate that many edQTLs can act through changes in the local secondary structure of edited dsRNAs. Furthermore, we find that edQTLs located outside of the edited dsRNA duplex are enriched in secondary structure, suggesting that distal dsRNA structure beyond the editing site duplex affects RNA editing efficiency. Our work will facilitate the understanding of the cis-regulatory code of RNA editing.
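
    An edQTL scan of this kind reduces to testing, site by site, whether editing level varies with genotype at nearby variants. The sketch below shows that per-site association test as an ordinary linear regression with scipy; the genotype and editing arrays are simulated placeholders, not the Drosophila data, and a real scan would repeat the test over all site-variant pairs with multiple-testing correction.

      # Hedged sketch: test association between a variant's genotype (0/1/2) and
      # the editing level at one site across inbred strains, as in an edQTL scan.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      n_strains = 131
      genotype = rng.integers(0, 3, size=n_strains)            # allele dosage per strain
      editing = 0.30 + 0.05 * genotype + rng.normal(scale=0.03, size=n_strains)

      slope, intercept, r, p_value, stderr = stats.linregress(genotype, editing)
      print(f'effect per allele = {slope:.3f}, p = {p_value:.2e}')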

  17. High-Content Analysis of CRISPR-Cas9 Gene-Edited Human Embryonic Stem Cells.

    PubMed

    Carlson-Stevermer, Jared; Goedland, Madelyn; Steyer, Benjamin; Movaghar, Arezoo; Lou, Meng; Kohlenberg, Lucille; Prestil, Ryan; Saha, Krishanu

    2016-01-12

    CRISPR-Cas9 gene editing of human cells and tissues holds much promise to advance medicine and biology, but standard editing methods require weeks to months of reagent preparation and selection where much or all of the initial edited samples are destroyed during analysis. ArrayEdit, a simple approach utilizing surface-modified multiwell plates containing one-pot transcribed single-guide RNAs, separates thousands of edited cell populations for automated, live, high-content imaging and analysis. The approach lowers the time and cost of gene editing and produces edited human embryonic stem cells at high efficiencies. Edited genes can be expressed in both pluripotent stem cells and differentiated cells. This preclinical platform adds important capabilities to observe editing and selection in situ within complex structures generated by human cells, ultimately enabling optical and other molecular perturbations in the editing workflow that could refine the specificity and versatility of gene editing.

  18. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_TRMM-PFM_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=1998-08-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  19. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_PFM+FM1_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2000-03-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].
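
    The nesting described in these ES-4 records is essentially area-weighted block averaging of 2.5-degree regional means up to coarser grids. The numpy sketch below illustrates that step; the flux array is a random placeholder, and the cosine-latitude weighting is an assumption about how the area weighting is done, not the exact ES-4 algorithm.

      # Hedged sketch: nest 2.5-degree regional monthly-mean fluxes up to a
      # 5-degree grid with area (cos-latitude) weighting.
      import numpy as np

      nlat, nlon = 72, 144                       # 2.5-degree global grid
      lats = -88.75 + 2.5 * np.arange(nlat)      # region-center latitudes
      flux = np.random.default_rng(3).normal(240.0, 30.0, size=(nlat, nlon))
      weights = np.cos(np.deg2rad(lats))[:, None] * np.ones((nlat, nlon))

      def nest(field, w, block):
          """Area-weighted average of block x block groups of regions."""
          f = field.reshape(field.shape[0] // block, block,
                            field.shape[1] // block, block)
          ww = w.reshape(w.shape[0] // block, block, w.shape[1] // block, block)
          return (f * ww).sum(axis=(1, 3)) / ww.sum(axis=(1, 3))

      flux_5deg = nest(flux, weights, 2)         # 36 x 72 grid of 5-degree means
      global_mean = (flux * weights).sum() / weights.sum()
      print(flux_5deg.shape, round(global_mean, 2))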

  20. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_FM1+FM4_Edition2)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  1. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Terra-FM2_Edition1-CV)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2006-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  2. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Aqua-FM3_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  3. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Aqua-FM3_Edition2)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-12-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  4. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Aqua-FM4_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-29] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  5. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_FM1+FM2_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2003-12-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  6. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Terra-FM1_Edition2)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-12-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  7. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Aqua-FM4_Edition2)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-29] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  8. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Aqua-FM4_Edition1-CV)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-29] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  9. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Terra-FM2_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  10. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Terra-FM1_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  11. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_PFM+FM2_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2000-03-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  12. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_FM1+FM2_Edition2)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2002-12-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  13. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_FM1+FM3_Edition2)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-12-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=1 month; Temporal_Resolution_Range=Monthly - < Annual].

  14. MIRO and IRbase: IT Tools for the Epidemiological Monitoring of Insecticide Resistance in Mosquito Disease Vectors

    PubMed Central

    Dialynas, Emmanuel; Topalis, Pantelis; Vontas, John; Louis, Christos

    2009-01-01

    Background Monitoring of insect vector populations with respect to their susceptibility to one or more insecticides is a crucial element of the strategies used for the control of arthropod-borne diseases. This management task can nowadays be achieved more efficiently when assisted by IT (Information Technology) tools, ranging from modern integrated databases to GIS (Geographic Information System). Here we describe an application ontology that we developed de novo, and a specially designed database that, based on this ontology, can be used for the purpose of controlling mosquitoes and, thus, the diseases that they transmit. Methodology/Principal Findings The ontology, named MIRO for Mosquito Insecticide Resistance Ontology, developed using the OBO-Edit software, describes all pertinent aspects of insecticide resistance, including specific methodology and mode of action. MIRO, then, forms the basis for the design and development of a dedicated database, IRbase, constructed using open source software, which can be used to retrieve data on mosquito populations in a temporally and spatially separate way, as well as to map the output using a Google Earth interface. The dependency of the database on the MIRO allows for a rational and efficient hierarchical search possibility. Conclusions/Significance The fact that the MIRO complies with the rules set forward by the OBO (Open Biomedical Ontologies) Foundry introduces cross-referencing with other biomedical ontologies and, thus, both MIRO and IRbase are suitable as parts of future comprehensive surveillance tools and decision support systems that will be used for the control of vector-borne diseases. MIRO is downloadable from and IRbase is accessible at VectorBase, the NIAID-sponsored open access database for arthropod vectors of disease. PMID:19547750

  15. Administrative Databases Can Yield False Conclusions-An Example of Obesity in Total Joint Arthroplasty.

    PubMed

    George, Jaiben; Newman, Jared M; Ramanathan, Deepak; Klika, Alison K; Higuera, Carlos A; Barsoum, Wael K

    2017-09-01

    Research using large administrative databases has substantially increased in recent years. The accuracy with which comorbidities are represented in these databases has been questioned. The purpose of this study was to evaluate the extent of errors in obesity coding and its impact on arthroplasty research. Eighteen thousand thirty primary total knee arthroplasties (TKAs) and 10,475 total hip arthroplasties (THAs) performed at a single healthcare system from 2004 to 2014 were included. Patients were classified as obese or nonobese using 2 methods: (1) body mass index (BMI) ≥30 kg/m² and (2) International Classification of Diseases, 9th Revision (ICD-9) codes. Length of stay, operative time, and 90-day complications were collected. The effect of obesity on various outcomes was analyzed separately for both BMI-based and coding-based obesity. From 2004 to 2014, the prevalence of BMI-based obesity increased from 54% to 63% and from 40% to 45% in TKA and THA, respectively. The prevalence of coding-based obesity increased from 15% to 28% and from 8% to 17% in TKA and THA, respectively. Coding overestimated the growth of obesity in TKA and THA by 5.6 and 8.4 times, respectively. When obesity was defined by coding, obesity was falsely shown to be a significant risk factor for deep vein thrombosis (TKA), pulmonary embolism (THA), and longer hospital stay (TKA and THA). The growth in obesity observed in administrative databases may be an artifact of improvements in coding over the years. Obesity defined by coding can overestimate the actual effect of obesity on complications after arthroplasty. Therefore, studies using large databases should be interpreted with caution, especially when variables prone to coding errors are involved.
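
    The discrepancy described here can be reproduced with a simple comparison: classify each patient as obese once by measured BMI and once by the presence of an obesity diagnosis code, then compare prevalences. The sketch below does exactly that on a fabricated record list; the records and the assumed ICD-9 obesity codes are placeholders, not the study data.

      # Hedged sketch: compare BMI-based and diagnosis-code-based obesity prevalence.
      # The records and OBESITY_CODES are fabricated placeholders for illustration.
      records = [
          {'bmi': 33.1, 'icd9_codes': ['278.00']},   # obese by both definitions
          {'bmi': 31.4, 'icd9_codes': []},           # obese by BMI, not coded
          {'bmi': 27.9, 'icd9_codes': []},           # not obese by either definition
      ]

      OBESITY_CODES = {'278.00', '278.01'}           # assumed obesity / morbid obesity codes

      by_bmi = sum(r['bmi'] >= 30.0 for r in records)
      by_code = sum(bool(OBESITY_CODES & set(r['icd9_codes'])) for r in records)

      print(f'BMI-based prevalence:  {by_bmi / len(records):.0%}')
      print(f'Code-based prevalence: {by_code / len(records):.0%}')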

  16. Water quality management library. 2. edition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eckenfelder, W.W.; Malina, J.F.; Patterson, J.W.

    1998-12-31

    A series of ten books offered in conjunction with Water Quality International, the Biennial Conference and Exposition of the International Association on Water Pollution Research and Control (IAWPRC). Volume 1, Activated Sludge Process, Design and Control, 2nd edition, 1998; Volume 2, Upgrading Wastewater Treatment Plants, 2nd edition, 1998; Volume 3, Toxicity Reduction, 2nd edition, 1998; Volume 4, Municipal Sewage Sludge Management, 2nd edition, 1998; Volume 5, Design and Retrofit of Wastewater Treatment Plants for Biological Nutrient Removal, 1st edition, 1992; Volume 6, Dynamics and Control of the Activated Sludge Process, 2nd edition, 1998; Volume 7, Design of Anaerobic Processes for the Treatment of Industrial and Municipal Wastes, 1st edition, 1992; Volume 8, Groundwater Remediation, 1st edition, 1992; Volume 9, Nonpoint Pollution and Urban Stormwater Management, 1st edition, 1995; Volume 10, Wastewater Reclamation and Reuse, 1st edition, 1998.

  17. Benford's Law and articles of scientific journals: comparison of JCR® and Scopus data.

    PubMed

    Alves, Alexandre Donizeti; Yanasse, Horacio Hideki; Soma, Nei Yoshihiro

    2014-01-01

    Benford's Law is a logarithmic probability distribution used to predict the distribution of first significant digits in numerical data. This paper presents the results of a study of the distribution of the first significant digits of the numbers of articles published by journals indexed in the JCR® Science and Social Sciences Editions from 2007 to 2011. The data were also analyzed by the journals' country of origin and subject category. Results based on the numbers of articles reported by Scopus are also presented. Comparing the results, we observe a significant difference between the data reported in the two databases.
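
    Benford's Law predicts that a first significant digit d occurs with probability log10(1 + 1/d). The sketch below computes the observed first-digit distribution of a list of article counts and compares it with that prediction; the counts themselves are made-up placeholders.

      # Hedged sketch: compare first-significant-digit frequencies of article
      # counts with the Benford prediction P(d) = log10(1 + 1/d).
      import math
      from collections import Counter

      article_counts = [12, 187, 93, 45, 1020, 8, 211, 34, 560, 77, 149, 23]  # placeholders

      first_digits = [int(str(n)[0]) for n in article_counts if n > 0]
      observed = Counter(first_digits)

      for d in range(1, 10):
          benford = math.log10(1 + 1 / d)
          freq = observed.get(d, 0) / len(first_digits)
          print(f'd={d}: observed {freq:.3f}  Benford {benford:.3f}')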

  18. Neutron Data Compilation Centre, European Nuclear Energy Agency, Newsletter No. 13

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1972-02-15

    This edition of the newsletter is intended to inform all users of neutron data about the content of the CCDN Experimental Neutron Data Library as of February 1972. It supersedes the last index issue, no. 11, published in October 1969. Since then, the database has been greatly enlarged thanks to the collaboration of neutron data users in the ENEA area (Western Europe plus Japan) and to the truly worldwide cooperation between the four existing data centers: the NNCSC at Brookhaven National Laboratory in Upton, NY, United States; the CCDN in Gif-sur-Yvette, France; the Centr po Jadernym Dannym in Obninsk, USSR; and the Nuclear Data Section, IAEA, Vienna, Austria.

  19. Abundant off-target edits from site-directed RNA editing can be reduced by nuclear localization of the editing enzyme.

    PubMed

    Vallecillo-Viejo, Isabel C; Liscovitch-Brauer, Noa; Montiel-Gonzalez, Maria Fernanda; Eisenberg, Eli; Rosenthal, Joshua J C

    2018-01-02

    Site-directed RNA editing (SDRE) is a general strategy for making targeted base changes in RNA molecules. Although the approach is relatively new, several groups, including our own, have been working on its development. The basic strategy has been to couple the catalytic domain of an adenosine (A) to inosine (I) RNA editing enzyme to a guide RNA that is used for targeting. Although highly efficient on-target editing has been reported, off-target events have not been rigorously quantified. In this report we target premature termination codons (PTCs) in messages encoding both a fluorescent reporter protein and the Cystic Fibrosis Transmembrane Conductance Regulator (CFTR) protein transiently transfected into human epithelial cells. We demonstrate that while on-target editing is efficient, off-target editing is extensive, both within the targeted message and across the entire transcriptome of the transfected cells. By redirecting the editing enzymes from the cytoplasm to the nucleus, off-target editing is reduced without compromising the on-target editing efficiency. The addition of the E488Q mutation to the editing enzymes, a common strategy for increasing on-target editing efficiency, causes a tremendous increase in off-target editing. These results underscore the need to reduce promiscuity in current approaches to SDRE.

  20. REDO: RNA Editing Detection in Plant Organelles Based on Variant Calling Results.

    PubMed

    Wu, Shuangyang; Liu, Wanfei; Aljohi, Hasan Awad; Alromaih, Sarah A; Alanazi, Ibrahim O; Lin, Qiang; Yu, Jun; Hu, Songnian

    2018-05-01

    RNA editing is a post-transcriptional or cotranscriptional process that changes the sequence of the precursor transcript by substitutions, insertions, or deletions. Almost all land plants undergo RNA editing in their organelles (plastids and mitochondria). Although several software tools have been developed to identify RNA editing events, distinguishing true RNA editing events from genome variation, sequencing errors, and other factors remains a major challenge. Here we introduce REDO, a comprehensive application tool for identifying RNA editing events in plant organelles based on variant call format (VCF) files from RNA-sequencing data. REDO is a suite of Perl scripts that report a range of attributes of RNA editing events in figures and tables. REDO can also detect RNA editing events in multiple samples simultaneously and identify loci whose RNA editing proportions differ significantly between samples. Compared with similar tools such as REDItools, REDO runs faster, with higher accuracy and specificity at the cost of slightly lower sensitivity. Moreover, REDO annotates each RNA editing site in the RNA, whereas REDItools reports only possible RNA editing sites in the genome, which requires additional steps to obtain RNA editing profiles for RNAs. Overall, REDO can easily identify potential RNA editing sites and provides several functions such as detailed annotations, statistics, figures, and tests for significantly differential proportions of RNA editing sites among samples.
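
    As a rough illustration of variant-call-based detection (this is not REDO itself, which is a Perl suite; the VCF handling and thresholds below are simplified assumptions), the Python sketch scans a VCF for the C-to-U substitutions (seen as C>T, or G>A on the reverse strand) that dominate plant organellar RNA editing and keeps sites with adequate read depth.

```python
def candidate_editing_sites(vcf_path, min_depth=10):
    """Yield (chrom, pos, ref, alt) for simple C>T / G>A substitutions in a VCF.

    Plant organellar editing is mostly C-to-U, which appears as C>T on the
    forward strand or G>A on the reverse strand in RNA-seq variant calls.
    Real tools additionally filter genomic SNPs, sequencing errors, and
    annotation context; this sketch only applies a depth cut-off.
    """
    with open(vcf_path) as fh:
        for line in fh:
            if line.startswith("#"):
                continue
            chrom, pos, _id, ref, alt, _qual, _filt, info = line.rstrip("\n").split("\t")[:8]
            if (ref, alt) not in {("C", "T"), ("G", "A")}:
                continue
            depth = 0
            for field in info.split(";"):
                if field.startswith("DP="):
                    depth = int(field[3:])
            if depth >= min_depth:
                yield chrom, int(pos), ref, alt

# Usage (hypothetical file name):
# for site in candidate_editing_sites("organelle_rnaseq.vcf"):
#     print(site)
```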

  1. Structure-energy relationship in barbituric acid: a calorimetric, computational, and crystallographic study.

    PubMed

    Roux, María Victoria; Temprado, Manuel; Notario, Rafael; Foces-Foces, Concepción; Emel'yanenko, Vladimir N; Verevkin, Sergey P

    2008-08-14

    This paper reports the value of the standard (p° = 0.1 MPa) molar enthalpy of formation in the gas phase at T = 298.15 K for barbituric acid. The enthalpies of combustion and sublimation were measured by static bomb combustion calorimetry and by the transference (transpiration) method in a saturated N2 stream, and a gas-phase enthalpy of formation of −(534.3 ± 1.7) kJ·mol⁻¹ was determined at T = 298.15 K. G3-calculated enthalpies of formation are in very good agreement with the experimental value. The behavior of the sample as a function of temperature was studied by differential scanning calorimetry, and a new high-temperature polymorph of barbituric acid was found. In the solid state, two anhydrous forms are known, displaying two of the six hydrogen-bonding patterns observed in the alkyl/alkenyl derivatives retrieved from the Cambridge Crystallographic Database. The stability of these motifs has been analyzed by theoretical calculations. X-ray powder diffraction was used to establish which polymorphic form corresponds to the commercial sample used in this study and to characterize the new high-temperature form.
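
    The gas-phase value combines the condensed-phase enthalpy of formation obtained from the combustion experiment with the enthalpy of sublimation obtained by transpiration; the relation, written here in LaTeX as a reminder of how the two measurements are combined (symbols only, no new numerical values), is:

```latex
% Hess-cycle relation between combustion-derived and gas-phase enthalpies of formation:
\Delta_{\mathrm{f}} H^{\circ}_{m}(\mathrm{g},\,298.15\,\mathrm{K})
  = \Delta_{\mathrm{f}} H^{\circ}_{m}(\mathrm{cr},\,298.15\,\mathrm{K})
  + \Delta_{\mathrm{sub}} H^{\circ}_{m}(298.15\,\mathrm{K})
```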

  2. Use of real-time sensors to characterise human exposures to combustion related pollutants.

    PubMed

    Delgado-Saborit, Juana Maria

    2012-07-01

    Concentrations of black carbon and nitrogen dioxide were collected concurrently using a MicroAeth AE-51 and an Aeroqual GSS NO(2) sensor. Forty-five sampling events, each lasting between 16 and 22 hours, yielded 10,800 five-minute data points in Birmingham (UK) from July to October 2011. The high temporal resolution of the database allowed identification of peak exposures and of the activities, such as cooking and commuting, that contributed most to these peaks. Personal exposure concentrations for non-occupationally exposed subjects ranged between 0.01 and 50 μg m(-3) for BC, with average values of 1.3 ± 2.2 μg m(-3) (AM ± SD). Nitrogen dioxide exposure concentrations were in the range
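
    A minimal sketch of the kind of summary such a high-resolution database supports (hypothetical Python with made-up 5-minute black carbon values; the peak threshold is an arbitrary assumption), computing the arithmetic mean ± SD and flagging peak intervals for matching against time-activity records:

```python
import statistics

# Hypothetical 5-minute black carbon concentrations in ug/m3 (placeholder values).
bc_5min = [0.8, 1.1, 0.9, 6.5, 7.2, 1.0, 0.7, 12.4, 1.3, 0.9]

mean_bc = statistics.mean(bc_5min)
sd_bc = statistics.stdev(bc_5min)
print(f"AM +/- SD: {mean_bc:.1f} +/- {sd_bc:.1f} ug/m3")

# Flag peak exposures, here defined (arbitrarily) as intervals above 3x the mean;
# these can then be matched against diary activities such as cooking or commuting.
peaks = [(i, v) for i, v in enumerate(bc_5min) if v > 3 * mean_bc]
print("peak 5-min intervals (index, value):", peaks)
```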

  3. Dynamic landscape and regulation of RNA editing in mammals

    PubMed Central

    Tan, Meng How; Li, Qin; Shanmugam, Raghuvaran; Piskol, Robert; Kohler, Jennefer; Young, Amy N.; Liu, Kaiwen Ivy; Zhang, Rui; Ramaswami, Gokul; Ariyoshi, Kentaro; Gupte, Ankita; Keegan, Liam P.; George, Cyril X.; Ramu, Avinash; Huang, Ni; Pollina, Elizabeth A.; Leeman, Dena S.; Rustighi, Alessandra; Sharon Goh, Y. P.; Chawla, Ajay; Del Sal, Giannino; Peltz, Gary; Brunet, Anne; Conrad, Donald F.; Samuel, Charles E.; O’Connell, Mary A.; Walkley, Carl R.; Nishikura, Kazuko; Li, Jin Billy

    2017-01-01

    Adenosine-to-inosine (A-to-I) RNA editing is a conserved post-transcriptional mechanism mediated by ADAR enzymes that diversifies the transcriptome by altering selected nucleotides in RNA molecules. Although many editing sites have recently been discovered, the extent to which most sites are edited and how the editing is regulated in different biological contexts are not fully understood. Here we report dynamic spatiotemporal patterns and new regulators of RNA editing, discovered through an extensive profiling of A-to-I RNA editing in 8,551 human samples (representing 53 body sites from 552 individuals) from the Genotype-Tissue Expression (GTEx) project and in hundreds of other primate and mouse samples. We show that editing levels in non-repetitive coding regions vary more between tissues than editing levels in repetitive regions. Globally, ADAR1 is the primary editor of repetitive sites and ADAR2 is the primary editor of non-repetitive coding sites, whereas the catalytically inactive ADAR3 predominantly acts as an inhibitor of editing. Cross-species analysis of RNA editing in several tissues revealed that species, rather than tissue type, is the primary determinant of editing levels, suggesting stronger cis-directed regulation of RNA editing for most sites, although the small set of conserved coding sites is under stronger trans-regulation. In addition, we curated an extensive set of ADAR1 and ADAR2 targets and showed that many editing sites display distinct tissue-specific regulation by the ADAR enzymes in vivo. Further analysis of the GTEx data revealed several potential regulators of editing, such as AIMP2, which reduces editing in muscles by enhancing the degradation of the ADAR proteins. Collectively, our work provides insights into the complex cis- and trans-regulation of A-to-I editing. PMID:29022589
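
    The editing level at an A-to-I site is conventionally the fraction of reads carrying the edited base (inosine is read as G by sequencers). The hypothetical Python sketch below (placeholder read counts; not the GTEx pipeline) shows that calculation and a simple measure of how much a site's editing level varies across tissues.

```python
import statistics

def editing_level(g_reads, a_reads):
    """Fraction of edited reads at an A-to-I site; inosine is sequenced as G."""
    total = g_reads + a_reads
    return g_reads / total if total else float("nan")

# Hypothetical (G reads, A reads) for one editing site in several tissues.
site_counts = {
    "brain":  (45, 55),
    "liver":  (10, 90),
    "muscle": (5, 95),
}

levels = {tissue: editing_level(g, a) for tissue, (g, a) in site_counts.items()}
print(levels)

# Across-tissue variability of this site, the kind of quantity compared between
# repetitive and non-repetitive coding regions in the study.
print("variance across tissues:", round(statistics.pvariance(levels.values()), 4))
```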

  4. Dynamic landscape and regulation of RNA editing in mammals.

    PubMed

    Tan, Meng How; Li, Qin; Shanmugam, Raghuvaran; Piskol, Robert; Kohler, Jennefer; Young, Amy N; Liu, Kaiwen Ivy; Zhang, Rui; Ramaswami, Gokul; Ariyoshi, Kentaro; Gupte, Ankita; Keegan, Liam P; George, Cyril X; Ramu, Avinash; Huang, Ni; Pollina, Elizabeth A; Leeman, Dena S; Rustighi, Alessandra; Goh, Y P Sharon; Chawla, Ajay; Del Sal, Giannino; Peltz, Gary; Brunet, Anne; Conrad, Donald F; Samuel, Charles E; O'Connell, Mary A; Walkley, Carl R; Nishikura, Kazuko; Li, Jin Billy

    2017-10-11

    Adenosine-to-inosine (A-to-I) RNA editing is a conserved post-transcriptional mechanism mediated by ADAR enzymes that diversifies the transcriptome by altering selected nucleotides in RNA molecules. Although many editing sites have recently been discovered, the extent to which most sites are edited and how the editing is regulated in different biological contexts are not fully understood. Here we report dynamic spatiotemporal patterns and new regulators of RNA editing, discovered through an extensive profiling of A-to-I RNA editing in 8,551 human samples (representing 53 body sites from 552 individuals) from the Genotype-Tissue Expression (GTEx) project and in hundreds of other primate and mouse samples. We show that editing levels in non-repetitive coding regions vary more between tissues than editing levels in repetitive regions. Globally, ADAR1 is the primary editor of repetitive sites and ADAR2 is the primary editor of non-repetitive coding sites, whereas the catalytically inactive ADAR3 predominantly acts as an inhibitor of editing. Cross-species analysis of RNA editing in several tissues revealed that species, rather than tissue type, is the primary determinant of editing levels, suggesting stronger cis-directed regulation of RNA editing for most sites, although the small set of conserved coding sites is under stronger trans-regulation. In addition, we curated an extensive set of ADAR1 and ADAR2 targets and showed that many editing sites display distinct tissue-specific regulation by the ADAR enzymes in vivo. Further analysis of the GTEx data revealed several potential regulators of editing, such as AIMP2, which reduces editing in muscles by enhancing the degradation of the ADAR proteins. Collectively, our work provides insights into the complex cis- and trans-regulation of A-to-I editing.

  5. Combustion of Interacting Droplet Arrays in a Microgravity Environment

    NASA Technical Reports Server (NTRS)

    Dietrich, Daniel L.; Struk, Peter M.; Kitano, Kunihiro; Ikeda, Koji; Honma, Senji

    1997-01-01

    This research program involves the study of single droplets and linear arrays of droplets in weakly buoyant and non-buoyant environments. The primary purposes of the single droplet work were to (1) provide a database against which to compare droplet array results and (2) correlate the effects of buoyancy on flame shape. Traditionally, convective effects in droplet combustion are represented in terms of the Reynolds number, Re, for forced convection and the Grashof number, Gr, for natural convection. Typically, corrections to the burning rate constant for convective effects are written in terms of Re or Gr. The Stefan velocity is not included in these correlations, even though on purely physical grounds one would expect it to be important, especially at higher burning rates. The flame distortion due to convective effects is less well documented quantitatively. Kumagai and Isoda do predict flame shape in natural and forced convective flow fields; their focus, however, was to predict the actual flame dimensions. Law and co-workers used reduced-pressure, high-oxidizer ambients to obtain spherical flames. This implies that buoyant flows were reduced at the low pressures, as indicated by a very small Grashof number. Ross et al., however, used scaling arguments to show that reducing the pressure does not have a large effect on the magnitude of the buoyant velocity. Struk et al. showed elongated flame shapes during simulated (porous sphere) droplet combustion. The elongation of the flames was due to residual gravity levels aboard the reduced-gravity aircraft on which the experiments were conducted. These flame shapes, as well as some data from the literature, were interpreted in terms of a dimensionless grouping called the sphericity parameter, Sp, the ratio of a characteristic computed buoyant velocity to the Stefan velocity at the flame front. One purpose of the droplet array work is to extend the database and theories that exist for single droplets into the regime where droplet interactions are important. The eventual goal is to use the results of this work as inputs to models of spray combustion, where droplets seldom burn individually; instead, the combustion history of a droplet is strongly influenced by the presence of neighboring droplets. Recently, Annamalai and Ryan summarized the current status of droplet array, cloud, and spray combustion. A number of simplified theories and numerical studies of droplet vaporization/combustion in which multiple-droplet effects are present are now available; these theories all neglect the effect of buoyancy. Experimentally, most studies to date suffer from the effects of buoyancy, which is the dominant transport mechanism in the problem. Only the work of Law and co-workers and, more recently, of Mikami et al. was performed in an environment where buoyancy effects were small, and Law and co-workers were limited to high-oxygen-index, low-pressure ambient environments since their studies were conducted in normal gravity.
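
    A back-of-the-envelope sketch of the dimensionless groups mentioned above (hypothetical Python; the property values and the particular choice of buoyant velocity scale are illustrative assumptions, not the authors' correlation):

```python
import math

def grashof(g, dT_over_T, length, nu):
    """Natural-convection Grashof number, Gr = g * (dT/T) * L^3 / nu^2."""
    return g * dT_over_T * length ** 3 / nu ** 2

def sphericity(u_buoyant, u_stefan):
    """Sphericity parameter Sp: characteristic buoyant velocity over the Stefan
    velocity at the flame front (small Sp suggests a nearly spherical flame)."""
    return u_buoyant / u_stefan

# Placeholder values purely for illustration.
g_residual = 0.01 * 9.81          # residual acceleration aboard a reduced-gravity aircraft, m/s^2
flame_radius = 5e-3               # m
nu = 1.5e-4                       # hot-gas kinematic viscosity, m^2/s
u_buoyant = math.sqrt(g_residual * flame_radius)   # one simple buoyant velocity scale (assumption)
u_stefan = 0.05                   # m/s, assumed Stefan velocity at the flame front

print("Gr =", grashof(g_residual, 1.0, flame_radius, nu))
print("Sp =", sphericity(u_buoyant, u_stefan))
```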

  6. Method and apparatus for active control of combustion rate through modulation of heat transfer from the combustion chamber wall

    DOEpatents

    Roberts, Jr., Charles E.; Chadwell, Christopher J.

    2004-09-21

    The flame propagation rate resulting from a combustion event in the combustion chamber of an internal combustion engine is controlled by modulation of the heat transfer from the combustion flame to the combustion chamber walls. In one embodiment, heat transfer from the combustion flame to the combustion chamber walls is mechanically modulated by a movable member that is inserted into, or withdrawn from, the combustion chamber thereby changing the shape of the combustion chamber and the combustion chamber wall surface area. In another embodiment, heat transfer from the combustion flame to the combustion chamber walls is modulated by cooling the surface of a portion of the combustion chamber wall that is in close proximity to the area of the combustion chamber where flame speed control is desired.

  7. The Thomas A. Edison Papers at Rutgers University

    Science.gov Websites


  8. Models to predict emissions of health-damaging pollutants and global warming contributions of residential fuel/stove combinations in China.

    PubMed

    Edwards, Rufus D; Smith, Kirk R; Zhang, Junfeng; Ma, Yuqing

    2003-01-01

    Residential energy use in developing countries has traditionally been associated with combustion devices of poor energy efficiency, which have been shown to produce substantial health-damaging pollution, contributing significantly to the global burden of disease, as well as greenhouse gas (GHG) emissions. The precision of these estimates in China has been hampered by limited data on stove use and fuel consumption in residences. In addition, limited information is available on the variability of pollutant emissions from different stove/fuel combinations in typical use, as measurement of emission factors requires measurement of multiple chemical species in complex burn-cycle tests. Such measurements are too costly and time consuming for application in conjunction with national surveys. Emissions of most of the major health-damaging pollutants (HDP) and many of the gases that contribute to GHG emissions from cooking stoves result from the significant portion of fuel carbon that is diverted to products of incomplete combustion (PIC) as a result of poor combustion efficiencies. The approximately linear increase in emissions of PIC with decreasing combustion efficiency allows the development of linear models to predict emissions of GHG and HDP intrinsically linked to CO2 and PIC production, and ultimately allows the prediction of global warming contributions from residential stove emissions. A comprehensive emissions database of three burn cycles of 23 typical fuel/stove combinations tested in a simulated village house in China has been used to develop models to predict emissions of HDP and global warming commitment (GWC) from cooking stoves in China that rely on simple survey information on stove and fuel use, which may be incorporated into national surveys. Stepwise regression models predicted 66% of the variance in global warming commitment (CO2, CO, CH4, NOx, TNMHC) per MJ of delivered energy due to emissions from these stoves if survey information on fuel type was available; if stove type was also known, the models predicted 73% of the variance. Integrated assessment of policies to change stove or fuel type requires that implications for environmental impacts, energy efficiency, global warming, and human exposures to HDP emissions can be evaluated. Frequently, this involves measurement of TSP or CO as the major HDPs. Incorporating this information into the models predicted 79% and 78% of the variance in GWC, respectively. Clearly, however, making multiple measurements in conjunction with a national survey would be both expensive and time consuming. Thus, models have been derived to predict HDP using simple survey information together with measurement of either the CO/CO2 or TSP/CO2 ratio to predict emission factors for the other HDPs. Stepwise regression models predicted 65% of the variance in emissions of total suspended particulates as grams of carbon (TSPC) per MJ delivered if survey information on fuel and stove type was available, and 74% if the CO/CO2 ratio was measured. Similarly, stepwise regression models predicted 76% of the variance in COC emissions per MJ delivered with survey information on stove and fuel type, and 85% if the TSPC/CO2 ratio was measured. Ultimately, with international agreements on emissions trading frameworks, similar models based on extensive databases of the fate of fuel carbon during combustion from representative household stoves would provide a mechanism for computing greenhouse credits in the residential sector as part of clean development mechanism frameworks and for monitoring compliance with control regimes.
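
    A minimal sketch of the modelling approach described above (hypothetical Python on synthetic data; the fuel/stove categories, coefficients, and use of ordinary least squares rather than a formal stepwise procedure are illustrative assumptions, not the published models):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic burn-cycle records: fuel type, stove type, and a measured CO/CO2 ratio.
n = 60
fuel = rng.integers(0, 3, n)        # 0, 1, 2 = illustrative fuel categories
stove = rng.integers(0, 2, n)       # 0 = traditional, 1 = improved (illustrative)
co_co2 = rng.uniform(0.01, 0.15, n)

# Synthetic response: global warming commitment per MJ delivered (arbitrary units).
gwc = (50 + 30 * (fuel == 0) + 10 * (fuel == 1) - 15 * stove
       + 400 * co_co2 + rng.normal(0, 5, n))

# Dummy-code the categorical predictors and fit a linear model, analogous in
# spirit to regressions on survey variables plus a CO/CO2 measurement.
X = np.column_stack([fuel == 0, fuel == 1, stove, co_co2]).astype(float)
model = LinearRegression().fit(X, gwc)
print("R^2 on synthetic data:", round(model.score(X, gwc), 2))
```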

  9. Ebola virus RNA editing depends on the primary editing site sequence and an upstream secondary structure.

    PubMed

    Mehedi, Masfique; Hoenen, Thomas; Robertson, Shelly; Ricklefs, Stacy; Dolan, Michael A; Taylor, Travis; Falzarano, Darryl; Ebihara, Hideki; Porcella, Stephen F; Feldmann, Heinz

    2013-01-01

    Ebolavirus (EBOV), the causative agent of a severe hemorrhagic fever and a biosafety level 4 pathogen, increases its genome coding capacity by producing multiple transcripts encoding for structural and nonstructural glycoproteins from a single gene. This is achieved through RNA editing, during which non-template adenosine residues are incorporated into the EBOV mRNAs at an editing site encoding for 7 adenosine residues. However, the mechanism of EBOV RNA editing is currently not understood. In this study, we report for the first time that minigenomes containing the glycoprotein gene editing site can undergo RNA editing, thereby eliminating the requirement for a biosafety level 4 laboratory to study EBOV RNA editing. Using a newly developed dual-reporter minigenome, we have characterized the mechanism of EBOV RNA editing, and have identified cis-acting sequences that are required for editing, located between 9 nt upstream and 9 nt downstream of the editing site. Moreover, we show that a secondary structure in the upstream cis-acting sequence plays an important role in RNA editing. EBOV RNA editing is glycoprotein gene-specific, as a stretch encoding for 7 adenosine residues located in the viral polymerase gene did not serve as an editing site, most likely due to an absence of the necessary cis-acting sequences. Finally, the EBOV protein VP30 was identified as a trans-acting factor for RNA editing, constituting a novel function for this protein. Overall, our results provide novel insights into the RNA editing mechanism of EBOV, further understanding of which might result in novel intervention strategies against this viral pathogen.

  10. Uploading, Searching and Visualizing of Paleomagnetic and Rock Magnetic Data in the Online MagIC Database

    NASA Astrophysics Data System (ADS)

    Minnett, R.; Koppers, A.; Tauxe, L.; Constable, C.; Donadini, F.

    2007-12-01

    The Magnetics Information Consortium (MagIC) is commissioned to implement and maintain an online portal to a relational database populated by both rock and paleomagnetic data. The goal of MagIC is to archive all available measurements and derived properties from paleomagnetic studies of directions and intensities, and for rock magnetic experiments (hysteresis, remanence, susceptibility, anisotropy). MagIC is hosted under EarthRef.org at http://earthref.org/MAGIC/ and will soon implement two search nodes, one for paleomagnetism and one for rock magnetism. Currently the PMAG node is operational. Both nodes provide query building based on location, reference, methods applied, material type and geological age, as well as a visual map interface to browse and select locations. Users can also browse the database by data type or by data compilation to view all contributions associated with well known earlier collections like PINT, GMPDB or PSVRL. The query result set is displayed in a digestible tabular format allowing the user to descend from locations to sites, samples, specimens and measurements. At each stage, the result set can be saved and, where appropriate, can be visualized by plotting global location maps, equal area, XY, age, and depth plots, or typical Zijderveld, hysteresis, magnetization and remanence diagrams. User contributions to the MagIC database are critical to achieving a useful research tool. We have developed a standard data and metadata template (version 2.3) that can be used to format and upload all data at the time of publication in Earth Science journals. Software tools are provided to facilitate population of these templates within Microsoft Excel. These tools allow for the import/export of text files and provide advanced functionality to manage and edit the data, and to perform various internal checks to maintain data integrity and prepare for uploading. The MagIC Contribution Wizard at http://earthref.org/MAGIC/upload.htm executes the upload and takes only a few minutes to process tens of thousands of data records. The standardized MagIC template files are stored in the digital archives of EarthRef.org where they remain available for download by the public (in both text and Excel format). Finally, the contents of these template files are automatically parsed into the online relational database, making the data available for online searches in the paleomagnetic and rock magnetic search nodes. During the upload process the owner has the option of keeping the contribution private so it can be viewed in the context of other data sets and visualized using the suite of MagIC plotting tools. Alternatively, the new data can be password protected and shared with a group of users at the contributor's discretion. Once they are published and the owner is comfortable making the upload publicly accessible, the MagIC Editing Committee reviews the contribution for adherence to the MagIC data model and conventions to ensure a high level of data integrity.
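
    A rough sketch of parsing the kind of tab-delimited, multi-table upload template described above (hypothetical Python; the 'tab<TAB>table_name' header convention, the '>>>' separator, and the file name are assumptions inferred from the description, not the official MagIC tooling):

```python
def parse_template(path):
    """Parse a tab-delimited multi-table upload file into {table_name: [row dicts]}.

    Assumed layout (an approximation, not a specification): each table begins
    with a line 'tab<TAB><table_name>', then a header row, then data rows;
    tables are separated by a line starting with '>>>'.
    """
    tables, name, header = {}, None, None
    with open(path) as fh:
        for raw in fh:
            line = raw.rstrip("\n")
            if line.startswith(">>>"):                      # table separator
                name, header = None, None
            elif name is None and line.lower().startswith("tab\t"):
                name = line.split("\t", 1)[1].strip()
                tables[name] = []
            elif name is not None and header is None:
                header = line.split("\t")
            elif name is not None and line:
                tables[name].append(dict(zip(header, line.split("\t"))))
    return tables

# Usage (hypothetical file name):
# data = parse_template("magic_contribution.txt")
# print({table: len(rows) for table, rows in data.items()})
```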

  11. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF (CER_ES9_Aqua-FM3_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].
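
    A toy sketch of the region-day-hour averaging step described above (hypothetical Python with made-up observations; the real ES-9 processing also applies scene identification and diurnal models that are not reproduced here):

```python
import math
from collections import defaultdict

def region_index(lat, lon, cell=2.5):
    """Index of the 2.5-degree equal-angle region containing (lat, lon)."""
    return (int(math.floor((lat + 90) / cell)), int(math.floor((lon + 180) / cell)))

# Hypothetical instantaneous TOA longwave fluxes: (lat, lon, day, local_hour, flux in W/m^2).
observations = [
    ( 10.2,  20.1, 1,  3, 255.0),
    ( 10.9,  21.4, 1,  3, 260.0),
    ( 10.5,  20.8, 1, 14, 280.0),
    (-45.0, 100.0, 2,  9, 230.0),
]

# Sort fluxes into region-day-hour bins and record the bin means, the quantity
# stored on ES-9 before daily and monthly averaging.
bins = defaultdict(list)
for lat, lon, day, hour, flux in observations:
    bins[(region_index(lat, lon), day, hour)].append(flux)

for key, fluxes in sorted(bins.items()):
    print(key, sum(fluxes) / len(fluxes))
```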

  12. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF (CER_ES9_FM1+FM4_Edition2)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  13. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF ( CER_ES9_Aqua-FM4_Edition1-CV)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-29] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  14. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF ( CER_ES9_Terra-FM1_Edition1-CV)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2006-09-30] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  15. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF (CER_ES9_TRMM-PFM_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=1998-08-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  16. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF (CER_ES9_Aqua-FM4_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-29] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  17. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF (CER_ES9_PFM+FM2_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2000-03-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  18. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF (CER_ES9_Terra-FM1_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  19. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF (CERES:CER_ES9_PFM+FM1_Edition2)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2000-03-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  20. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF (CER_ES9_Aqua-FM4_Edition2)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-29] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  1. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF (CER_ES9_PFM+FM1_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2000-03-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  2. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF ( CER_ES9_Aqua-FM3_Edition1-CV)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2006-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  3. CERES ERBE-like Monthly Regional Averages (ES-9) in HDF (CER_ES9_Terra-FM2_Edition1)

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Regional Averages (ES-9) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-9 is also produced for combinations of scanner instruments. All instantaneous shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-8 product for a month are sorted by 2.5-degree spatial regions, by day number, and by the local hour of observation. The mean of the instantaneous fluxes for a given region-day-hour bin is determined and recorded on the ES-9 along with other flux statistics and scene information. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The ES-9 also contains hourly average fluxes for the month and an overall monthly average for each region. These average fluxes are given for both clear-sky and total-sky scenes. The following CERES ES9 data sets are currently available: CER_ES9_FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition1 CER_ES9_PFM+FM1+FM2_Edition2 CER_ES9_PFM+FM1_Edition1 CER_ES9_PFM+FM2_Edition1 CER_ES9_PFM+FM1_Edition2 CER_ES9_PFM+FM2_Edition2 CER_ES9_TRMM-PFM_Edition1 CER_ES9_TRMM-PFM_Edition2 CER_ES9_Terra-FM1_Edition1 CER_ES9_Terra-FM2_Edition1 CER_ES9_FM1+FM2_Edition2 CER_ES9_Terra-FM1_Edition2 CER_ES9_Terra-FM2_Edition2 CER_ES9_Aqua-FM3_Edition1 CER_ES9_Aqua-FM4_Edition1 CER_ES9_FM1+FM2+FM3+FM4_Edition1 CER_ES9_Aqua-FM3_Edition2 CER_ES9_Aqua-FM4_Edition2 CER_ES9_FM1+FM3_Edition2 CER_ES9_FM1+FM4_Edition2 CER_ES9_Aqua-FM3_Edition1-CV CER_ES9_Aqua-FM4_Edition1-CV CER_ES9_Terra-FM1_Edition1-CV CER_ES9_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal_Resolution_Range=250 km - < 500 km or approximately 2.5 degrees - < 5.0 degrees; Temporal_Resolution=hourly, daily, monthly; Temporal_Resolution_Range=Hourly - < Daily, Daily - < Weekly, Monthly - < Annual].

  4. Regenerative combustion device

    DOEpatents

    West, Phillip B.

    2004-03-16

    A regenerative combustion device having a combustion zone, and chemicals contained within the combustion zone, such as water, having a first equilibrium state, and a second combustible state. Means for transforming the chemicals from the first equilibrium state to the second combustible state, such as electrodes, are disposed within the chemicals. An igniter, such as a spark plug or similar device, is disposed within the combustion zone for igniting combustion of the chemicals in the second combustible state. The combustion products are contained within the combustion zone, and the chemicals are selected such that the combustion products naturally chemically revert into the chemicals in the first equilibrium state following combustion. The combustion device may thus be repeatedly reused, requiring only a brief wait after each ignition to allow the regeneration of combustible gasses within the head space.

  5. Method and apparatus for advanced staged combustion utilizing forced internal recirculation

    DOEpatents

    Rabovitser, Iosif K.; Knight, Richard A.; Cygan, David F.; Nester, Serguei; Abbasi, Hamid A.

    2003-12-16

    A method and apparatus for combustion of a fuel in which a first-stage fuel and a first-stage oxidant are introduced into a combustion chamber and ignited, forming a primary combustion zone. At least about 5% of the total heat output produced by combustion of the first-stage fuel and the first-stage oxidant is removed from the primary combustion zone, forming cooled first-stage combustion products. A portion of the cooled first-stage combustion products from a downstream region of the primary combustion zone is recirculated to an upstream region of the primary combustion zone. A second-stage fuel is introduced into the combustion chamber downstream of the primary combustion zone and ignited, forming a secondary combustion zone. At least about 5% of the heat from the secondary combustion zone is removed. In accordance with one embodiment, a third-stage oxidant is introduced into the combustion chamber downstream of the secondary combustion zone, forming a tertiary combustion zone.

  6. Steric antisense inhibition of AMPA receptor Q/R editing reveals tight coupling to intronic editing sites and splicing

    PubMed Central

    Penn, Andrew C.; Balik, Ales; Greger, Ingo H.

    2013-01-01

    Adenosine-to-Inosine (A-to-I) RNA editing is a post-transcriptional mechanism that evolved to diversify the transcriptome in metazoa. In addition to widespread editing in non-coding regions, protein recoding by RNA editing allows for fine-tuning of protein function. Functional consequences are only known for some editing sites, and the combinatorial effect between multiple sites (functional epistasis) is currently unclear. Similarly, the interplay between RNA editing and splicing, which impacts post-transcriptional gene regulation, has not been resolved. Here, we describe a versatile antisense approach, which will aid in resolving these open questions. We have developed and characterized morpholino oligos targeting the most efficiently edited site, the AMPA receptor GluA2 Q/R site. We show that inhibition of editing closely correlates with intronic editing efficiency, which is linked to splicing efficiency. In addition to providing a versatile tool, our data underscore the unique efficiency of a physiologically pivotal editing site. PMID:23172291

  7. Ontology for Vector Surveillance and Management

    PubMed Central

    LOZANO-FUENTES, SAUL; BANDYOPADHYAY, ARITRA; COWELL, LINDSAY G.; GOLDFAIN, ALBERT; EISEN, LARS

    2013-01-01

    Ontologies, which are made up by standardized and defined controlled vocabulary terms and their interrelationships, are comprehensive and readily searchable repositories for knowledge in a given domain. The Open Biomedical Ontologies (OBO) Foundry was initiated in 2001 with the aims of becoming an “umbrella” for life-science ontologies and promoting the use of ontology development best practices. A software application (OBO-Edit; *.obo file format) was developed to facilitate ontology development and editing. The OBO Foundry now comprises over 100 ontologies and candidate ontologies, including the NCBI organismal classification ontology (NCBITaxon), the Mosquito Insecticide Resistance Ontology (MIRO), the Infectious Disease Ontology (IDO), the IDOMAL malaria ontology, and ontologies for mosquito gross anatomy and tick gross anatomy. We previously developed a disease data management system for dengue and malaria control programs, which incorporated a set of information trees built upon ontological principles, including a “term tree” to promote the use of standardized terms. In the course of doing so, we realized that there were substantial gaps in existing ontologies with regards to concepts, processes, and, especially, physical entities (e.g., vector species, pathogen species, and vector surveillance and management equipment) in the domain of surveillance and management of vectors and vector-borne pathogens. We therefore produced an ontology for vector surveillance and management, focusing on arthropod vectors and vector-borne pathogens with relevance to humans or domestic animals, and with special emphasis on content to support operational activities through inclusion in databases, data management systems, or decision support systems. The Vector Surveillance and Management Ontology (VSMO) includes >2,200 unique terms, of which the vast majority (>80%) were newly generated during the development of this ontology. One core feature of the VSMO is the linkage, through the has_vector relation, of arthropod species to the pathogenic microorganisms for which they serve as biological vectors. We also recognized and addressed a potential roadblock for use of the VSMO by the vector-borne disease community: the difficulty in extracting information from OBO-Edit ontology files (*.obo files) and exporting the information to other file formats. A novel ontology explorer tool was developed to facilitate extraction and export of information from the VSMO *.obo file into lists of terms and their associated unique IDs in *.txt or *.csv file formats. These lists can then be imported into a database or data management system for use as select lists with predefined terms. This is an important step to ensure that the knowledge contained in our ontology can be put into practical use. PMID:23427646
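
    The export step described above (terms and IDs pulled from an *.obo file into *.txt or *.csv select lists) can be approximated with a few lines of Python. The sketch below is not the authors' ontology explorer tool, and the file names are hypothetical; it simply reads [Term] stanzas and writes id/name pairs to CSV.

        # Minimal sketch: extract term IDs and names from an OBO file into a CSV
        # select list, in the spirit of the export step described above.
        # "vsmo.obo" and the output path are hypothetical.
        import csv

        def obo_terms_to_csv(obo_path, csv_path):
            terms, current = [], {}
            with open(obo_path) as fh:
                for line in fh:
                    line = line.strip()
                    if line == "[Term]":
                        current = {}                       # start of a new term stanza
                    elif line.startswith("id:"):
                        current["id"] = line[3:].strip()
                    elif line.startswith("name:"):
                        current["name"] = line[5:].strip()
                        if "id" in current:
                            terms.append((current["id"], current["name"]))
            with open(csv_path, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(["term_id", "term_name"])
                writer.writerows(terms)

        # obo_terms_to_csv("vsmo.obo", "vsmo_terms.csv")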

  8. User's manual for the National Water-Quality Assessment Program Invertebrate Data Analysis System (IDAS) software, version 5

    USGS Publications Warehouse

    Cuffney, Thomas F.; Brightbill, Robin A.

    2011-01-01

    The Invertebrate Data Analysis System (IDAS) software was developed to provide an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. The IDAS software is a stand-alone program for personal computers that run Microsoft Windows®. It allows users to read data downloaded from the NAWQA Program Biological Transactional Database (Bio-TDB) or to import data from other sources either as Microsoft Excel® or Microsoft Access® files. The program consists of five modules: Edit Data, Data Preparation, Calculate Community Metrics, Calculate Diversities and Similarities, and Data Export. The Edit Data module allows the user to subset data on the basis of taxonomy or sample type, extract a random subsample of data, combine or delete data, summarize distributions, resolve ambiguous taxa (see glossary) and conditional/provisional taxa, import non-NAWQA data, and maintain and create files of invertebrate attributes that are used in the calculation of invertebrate metrics. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa on the basis of laboratory processing notes, delete pupae or terrestrial adults, combine lifestages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa on the basis of the number of sites where a taxon occurs and (or) the abundance of a taxon in a sample, and resolve taxonomic ambiguities by one of four methods. The Calculate Community Metrics module allows the user to calculate 184 community metrics, including metrics based on organism tolerances, functional feeding groups, and behavior. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages (CANOCO, Primer, PC-ORD, MVSP) and produce tables of community data that can be imported into spreadsheet, database, graphics, statistics, and word-processing programs. The IDAS program facilitates the documentation of analyses by keeping a log of the data that are processed, the files that are generated, and the program settings used to process the data. Though the IDAS program was developed to process NAWQA Program invertebrate data downloaded from Bio-TDB, the Edit Data module includes tools that can be used to convert non-NAWQA data into Bio-TDB format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used to process data generated outside of the NAWQA Program.
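
    Two of the operations described above, deleting rare taxa by occurrence and relative-abundance thresholds and computing a diversity index, are illustrated by the Python sketch below. It is not the IDAS code; the thresholds, data layout, and choice of the Shannon index are assumptions made for illustration.

        # Illustrative sketch of two operations like those described above:
        # deleting rare taxa and computing a Shannon diversity index.
        # Not the IDAS code; thresholds and data layout are hypothetical.
        import math

        def drop_rare_taxa(samples, min_sites=2, min_rel_abundance=0.01):
            """samples: dict of sample_id -> dict of taxon -> count (one sample per site here)."""
            site_counts = {}
            for counts in samples.values():
                for taxon, n in counts.items():
                    if n > 0:
                        site_counts[taxon] = site_counts.get(taxon, 0) + 1
            kept = {}
            for sample_id, counts in samples.items():
                total = sum(counts.values())
                kept[sample_id] = {t: n for t, n in counts.items()
                                   if site_counts.get(t, 0) >= min_sites
                                   and total and n / total >= min_rel_abundance}
            return kept

        def shannon_diversity(counts):
            """Shannon index H' = -sum(p_i * ln(p_i)) over taxa with nonzero counts."""
            total = sum(counts.values())
            return -sum((n / total) * math.log(n / total) for n in counts.values() if n > 0)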

  9. Ontology for vector surveillance and management.

    PubMed

    Lozano-Fuentes, Saul; Bandyopadhyay, Aritra; Cowell, Lindsay G; Goldfain, Albert; Eisen, Lars

    2013-01-01

    Ontologies, which are made up by standardized and defined controlled vocabulary terms and their interrelationships, are comprehensive and readily searchable repositories for knowledge in a given domain. The Open Biomedical Ontologies (OBO) Foundry was initiated in 2001 with the aims of becoming an "umbrella" for life-science ontologies and promoting the use of ontology development best practices. A software application (OBO-Edit; *.obo file format) was developed to facilitate ontology development and editing. The OBO Foundry now comprises over 100 ontologies and candidate ontologies, including the NCBI organismal classification ontology (NCBITaxon), the Mosquito Insecticide Resistance Ontology (MIRO), the Infectious Disease Ontology (IDO), the IDOMAL malaria ontology, and ontologies for mosquito gross anatomy and tick gross anatomy. We previously developed a disease data management system for dengue and malaria control programs, which incorporated a set of information trees built upon ontological principles, including a "term tree" to promote the use of standardized terms. In the course of doing so, we realized that there were substantial gaps in existing ontologies with regards to concepts, processes, and, especially, physical entities (e.g., vector species, pathogen species, and vector surveillance and management equipment) in the domain of surveillance and management of vectors and vector-borne pathogens. We therefore produced an ontology for vector surveillance and management, focusing on arthropod vectors and vector-borne pathogens with relevance to humans or domestic animals, and with special emphasis on content to support operational activities through inclusion in databases, data management systems, or decision support systems. The Vector Surveillance and Management Ontology (VSMO) includes >2,200 unique terms, of which the vast majority (>80%) were newly generated during the development of this ontology. One core feature of the VSMO is the linkage, through the has_vector relation, of arthropod species to the pathogenic microorganisms for which they serve as biological vectors. We also recognized and addressed a potential roadblock for use of the VSMO by the vector-borne disease community: the difficulty in extracting information from OBO-Edit ontology files (*.obo files) and exporting the information to other file formats. A novel ontology explorer tool was developed to facilitate extraction and export of information from the VSMO *.obo file into lists of terms and their associated unique IDs in *.txt or *.csv file formats. These lists can then be imported into a database or data management system for use as select lists with predefined terms. This is an important step to ensure that the knowledge contained in our ontology can be put into practical use.

  10. Medical research in Israel and the Israel biomedical database.

    PubMed

    Berns, D S; Rager-Zisman, B

    2000-11-01

    The data collected for the second edition of the Directory of Medical Research in Israel and the Israel Biomedical Database have yielded very relevant information concerning the distribution of investigators, publication activities and funding sources. The aggregate data confirm the findings of the first edition published in 1996 [2]. Those facts endorse the highly concentrated and extensive nature of medical research in the Jerusalem area, which is conducted at the Hebrew University and its affiliated hospitals. In contrast, Tel Aviv University, whose basic research staff is about two-thirds the size of the Hebrew University staff, has a more diffuse relationship with its clinical staff who are located at more than half a dozen hospitals. Ben-Gurion University in Beer Sheva and the Technion in Haifa are smaller in size, but have closer geographic contact between their clinical and basic research staff. Nonetheless, all the medical schools and affiliated hospitals have good publication and funding records. It is important to note that while some aspects of the performance at basic research institutions seem to be somewhat better than at hospitals, the records are actually quite similar despite the greater burden of clinical services at the hospitals as compared to teaching responsibilities in the basic sciences. The survey also indicates the substantial number of young investigators in the latest survey who did not appear in the first survey. While this is certainly encouraging, it is also disturbing that the funding sources are apparently decreasing at a time when young investigators are attempting to become established and the increasing burden of health care costs precludes financial assistance from hospital sources. The intensity and undoubtedly the quality of medical research in Israel remains at a level consistent with many of the more advanced western countries. This conclusion is somewhat mitigated by the fact that there is a decrease in available funding and a measurable decrease in scholarly activity at a time when a new, younger generation of investigators is just beginning to become productive. In closing, we wish to stress that the collection of data for the Biomedical Database is a continuing project and we encourage all medical researchers who may not have contributed relevant information to write to the Office of the Chief Scientist or contact the office by email.

  11. Reengineering Workflow for Curation of DICOM Datasets.

    PubMed

    Bennett, William; Smith, Kirk; Jarosz, Quasar; Nolan, Tracy; Bosch, Walter

    2018-06-15

    Reusable, publicly available data is a pillar of open science and rapid advancement of cancer imaging research. Sharing data from completed research studies not only saves research dollars required to collect data, but also helps ensure that studies are both replicable and reproducible. The Cancer Imaging Archive (TCIA) is a global shared repository for imaging data related to cancer. Ensuring the consistency, scientific utility, and anonymity of data stored in TCIA is of utmost importance. As the rate of submission to TCIA has been increasing, both in volume and complexity of DICOM objects stored, the process of curation of collections has become a bottleneck in acquisition of data. In order to increase the rate of curation of image sets, improve the quality of the curation, and better track the provenance of changes made to submitted DICOM image sets, a custom set of tools was developed, using novel methods for the analysis of DICOM data sets. These tools are written in the programming language Perl, use the open-source database PostgreSQL, make use of the Perl DICOM routines in the open-source package Posda, and incorporate DICOM diagnostic tools from other open-source packages, such as dicom3tools. These tools are referred to as the "Posda Tools." The Posda Tools are open source and available via git at https://github.com/UAMS-DBMI/PosdaTools . In this paper, we briefly describe the Posda Tools and discuss the novel methods employed by these tools to facilitate rapid analysis of DICOM data, including the following: (1) use of a database schema which is more permissive, and differently normalized, than traditional DICOM databases; (2) automatic integrity checks performed on a bulk basis; (3) bulk application of revisions to DICOM datasets, either through a web-based interface or via command-line executable Perl scripts; (4) tracking of all such edits in a revision tracker, so that they may be rolled back; (5) a UI for inspecting the results of such edits, to verify that they are what was intended; (6) identification of DICOM Studies, Series, and SOP instances using "nicknames" which are persistent and have well-defined scope, to make expression of reported DICOM errors easier to manage; and (7) rapid identification of potential duplicate DICOM datasets by pixel data; this can be used, e.g., to identify submission subjects which may relate to the same individual, without identifying the individual.
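
    Item (7), flagging potential duplicates by pixel data, can be approximated outside the Posda Tools by hashing the raw PixelData element of each file. The Python sketch below uses the pydicom library and a SHA-256 digest; it is a standalone illustration, not the Perl/PostgreSQL implementation described above.

        # Hedged sketch of duplicate detection by pixel data, analogous to item (7)
        # above. Uses the pydicom library; not the Posda Tools implementation.
        import hashlib
        from collections import defaultdict
        import pydicom

        def group_by_pixel_hash(dicom_paths):
            groups = defaultdict(list)
            for path in dicom_paths:
                ds = pydicom.dcmread(path)
                if "PixelData" in ds:
                    # Digest of the raw pixel bytes, ignoring all other header elements.
                    digest = hashlib.sha256(ds.PixelData).hexdigest()
                    groups[digest].append(path)
            # Any digest shared by more than one file marks a potential duplicate set.
            return {h: paths for h, paths in groups.items() if len(paths) > 1}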

  12. Evaluating experimental molecular physics studies of radiation damage in DNA*

    NASA Astrophysics Data System (ADS)

    Śmiałek, Małgorzata A.

    2016-11-01

    Atomic and Molecular Physics (AMP) is a mature field exploring the spectroscopy, excitation and ionisation of atoms and molecules in all three phases. Understanding of the spectroscopy and collisional dynamics of AMP has been fundamental to the development and application of quantum mechanics and is applied across a broad range of disparate disciplines, including atmospheric sciences, astrochemistry, combustion and environmental science, and is central to core technologies such as semiconductor fabrication, nanotechnology and plasma processing. In recent years, molecular physics has also begun to contribute significantly to the study of radiation damage at the molecular level, and thus to the improvement of cancer therapy, through both experimental and theoretical advances and the development of new damage measurement and analysis techniques. It is therefore worthwhile to summarise and highlight the most prominent findings from the AMP community that contribute towards a better understanding of the fundamental processes in biologically relevant systems, and to comment on the experimental challenges that were met for more complex investigation targets. Contribution to the Topical Issue "Low-Energy Interactions related to Atmospheric and Extreme Conditions", edited by S. Ptasinska, M. Smialek-Telega, A. Milosavljevic, B. Sivaraman.

  13. Unique and Overlapping Symptoms in Schizophrenia Spectrum and Dissociative Disorders in Relation to Models of Psychopathology: A Systematic Review

    PubMed Central

    Renard, Selwyn B.; Huntjens, Rafaele J. C.; Lysaker, Paul H.; Moskowitz, Andrew; Aleman, André; Pijnenborg, Gerdina H. M.

    2017-01-01

    Schizophrenia spectrum disorders (SSDs) and dissociative disorders (DDs) are described in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) and the tenth edition of the International Statistical Classification of Diseases and Related Health Problems (ICD-10) as 2 categorically distinct diagnostic categories. However, several studies indicate high levels of co-occurrence between these diagnostic groups, which might be explained by overlapping symptoms. The aim of this systematic review is to provide a comprehensive overview of the research concerning overlap and differences in symptoms between schizophrenia spectrum and DDs. For this purpose, the PubMed, PsycINFO, and Web of Science databases were searched for relevant literature. The literature contained a large body of evidence showing the presence of symptoms of dissociation in SSDs. Although there are quantitative differences between diagnoses, overlapping symptoms are not limited to certain domains of dissociation, nor to nonpathological forms of dissociation. In addition, dissociation seems to be related to a history of trauma in SSDs, as is also seen in DDs. There is also evidence showing that positive and negative symptoms typically associated with schizophrenia may be present in DD. Implications of these results are discussed with regard to different models of psychopathology and clinical practice. PMID:27209638

  14. The IASLC Lung Cancer Staging Project: A Renewed Call to Participation.

    PubMed

    Giroux, Dorothy J; Van Schil, Paul; Asamura, Hisao; Rami-Porta, Ramón; Chansky, Kari; Crowley, John J; Rusch, Valerie W; Kernstine, Kemp

    2018-06-01

    Over the past two decades, the International Association for the Study of Lung Cancer (IASLC) Staging Project has been a steady source of evidence-based recommendations for the TNM classification for lung cancer published by the Union for International Cancer Control and the American Joint Committee on Cancer. The Staging and Prognostic Factors Committee of the IASLC is now issuing a call for participation in the next phase of the project, which is designed to inform the ninth edition of the TNM classification for lung cancer. Following the case recruitment model for the eighth edition database, volunteer site participants are asked to submit data on patients whose lung cancer was diagnosed between January 1, 2011, and December 31, 2019, to the project by means of a secure, electronic data capture system provided by Cancer Research And Biostatistics in Seattle, Washington. Alternatively, participants may transfer existing data sets. The continued success of the IASLC Staging Project in achieving its objectives will depend on the extent of international participation, the degree to which cases are entered directly into the electronic data capture system, and how closely externally submitted cases conform to the data elements for the project. Copyright © 2018 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.

  15. Abundant RNA editing sites of chloroplast protein-coding genes in Ginkgo biloba and an evolutionary pattern analysis.

    PubMed

    He, Peng; Huang, Sheng; Xiao, Guanghui; Zhang, Yuzhou; Yu, Jianing

    2016-12-01

    RNA editing is a posttranscriptional modification process that alters the RNA sequence so that it deviates from the genomic DNA sequence. RNA editing mainly occurs in chloroplast and mitochondrial genomes, and the number of editing sites varies in terrestrial plants. Why and how RNA editing systems evolved remains a mystery. Ginkgo biloba is one of the oldest seed plants and has an important evolutionary position. Determining the patterns and distribution of RNA editing in this ancient plant provides insights into the evolutionary trend of RNA editing and helps us to further understand its biological significance. In this paper, we investigated 82 protein-coding genes in the chloroplast genome of G. biloba and identified 255 editing sites, which is the highest number of RNA editing events reported in a gymnosperm. All of the editing sites were C-to-U conversions, which mainly occurred at the second codon position, were biased towards the U_A context, and caused an increase in hydrophobic amino acids. RNA editing could change the secondary structures of 82 proteins, and create or eliminate a transmembrane region in five proteins as determined in silico. Finally, the evolutionary tendencies of RNA editing in different gene groups were estimated using the nonsynonymous-synonymous substitution rate selection mode. The G. biloba chloroplast genome possesses the highest number of RNA editing events reported so far in a seed plant. Most of the RNA editing sites can restore amino acid conservation, increase hydrophobicity, and even influence protein structures. Similar purifying selection constitutes the dominant evolutionary force at the editing sites of essential genes, such as the psa, some psb, and pet groups, and positive selection occurred at the editing sites of nonessential genes, such as most ndh and a few psb genes.
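
    The core measurement in the study above, locating C-to-U edits by comparing genomic and cDNA sequences and noting their codon positions, can be sketched in a few lines of Python. The example sequences are hypothetical, and the sketch assumes pre-aligned, equal-length coding sequences.

        # Minimal sketch: compare a genomic coding sequence with its edited cDNA
        # (both 5'->3', same length, cDNA given in the DNA alphabet) and report
        # C->U editing sites with their codon positions. Sequences are hypothetical.
        def find_c_to_u_sites(genomic, cdna):
            sites = []
            for i, (g, c) in enumerate(zip(genomic.upper(), cdna.upper())):
                if g == "C" and c == "T":                 # C in DNA edited to U in the transcript
                    sites.append({"position": i + 1,      # 1-based nucleotide position
                                  "codon_position": i % 3 + 1})
            return sites

        # Example: the single difference below is a C->U edit at codon position 2.
        print(find_c_to_u_sites("ATGCCAGGT", "ATGCTAGGT"))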

  16. A single alteration 20 nt 5′ to an editing target inhibits chloroplast RNA editing in vivo

    PubMed Central

    Reed, Martha L.; Peeters, Nemo M.; Hanson, Maureen R.

    2001-01-01

    Transcripts of typical dicot plant plastid genes undergo C→U RNA editing at approximately 30 locations, but there is no consensus sequence surrounding the C targets of editing. The cis-acting elements required for editing of the C located at tobacco rpoB editing site II were investigated by introducing translatable chimeric minigenes containing the sequence from –20 to +6 surrounding the C target of editing. When the –20 to +6 sequence specified by the homologous region present in the black pine chloroplast genome was incorporated, virtually no editing of the transcripts occurred in transgenic tobacco plastids. Nucleotides that differ between the black pine and tobacco sequence were tested for their role in C→U editing by designing chimeric genes containing one or more of these divergent nucleotides. Surprisingly, the divergent nucleotide that had the strongest negative effect on editing of the minigene transcript was located –20 nt 5′ to the C target of editing. Expression of transgene transcripts carrying the 27 nt sequence did not affect the editing extent of the endogenous rpoB transcripts, even though the chimeric transcripts were much more abundant than those of the endogenous gene. In plants carrying a 93 nt rpoB editing site sequence, transgene transcripts accumulated to a level three times greater than transgene transcripts in the plants carrying the 27 nt rpoB editing sites and reduced editing of the endogenous transcripts from 100% to 50%. Both a lower affinity of the 27 nt site for a trans-acting factor and lower abundance of the transcript could explain why expression of minigene transcripts containing the 27 nt sequence did not affect endogenous editing. PMID:11266552
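
    A minimal Python sketch of the two quantities discussed above, the –20 to +6 cis-element window around an editing target and the editing extent of a site, is given below. Coordinates, read counts, and function names are hypothetical illustrations, not the authors' analysis code.

        # Hedged sketch: pull out the -20 to +6 window around an editing target and
        # estimate editing extent from counts of edited (U/T) vs unedited (C) reads.
        # Coordinates and counts are hypothetical.
        def cis_window(sequence, edit_pos_1based, upstream=20, downstream=6):
            i = edit_pos_1based - 1
            return sequence[max(0, i - upstream): i + downstream + 1]

        def editing_extent(t_reads, c_reads):
            """Fraction of transcripts edited at the site (U/T reads over all reads)."""
            total = t_reads + c_reads
            return t_reads / total if total else 0.0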

  17. WikiGenomes: an open web application for community consumption and curation of gene annotation data in Wikidata

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Putman, Tim E.; Lelong, Sebastien; Burgstaller-Muehlbacher, Sebastian

    With the advancement of genome-sequencing technologies, new genomes are being sequenced daily. Although these sequences are deposited in publicly available data warehouses, their functional and genomic annotations (beyond genes which are predicted automatically) mostly reside in the text of primary publications. Professional curators are hard at work extracting those annotations from the literature for the most studied organisms and depositing them in structured databases. However, the resources don’t exist to fund the comprehensive curation of the thousands of newly sequenced organisms in this manner. Here, we describe WikiGenomes (wikigenomes.org), a web application that facilitates the consumption and curation of genomic data by the entire scientific community. WikiGenomes is based on Wikidata, an openly editable knowledge graph with the goal of aggregating published knowledge into a free and open database. WikiGenomes empowers the individual genomic researcher to contribute their expertise to the curation effort and integrates the knowledge into Wikidata, enabling it to be accessed by anyone without restriction.
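
    Because WikiGenomes builds on Wikidata, the underlying annotations can also be retrieved directly from the public Wikidata SPARQL endpoint. The Python sketch below is one hedged example of such a query; the property and item identifiers (P31 "instance of", P703 "found in taxon", Q7187 "gene") and the example taxon QID are assumptions that should be checked against Wikidata, and the query is not part of the WikiGenomes application itself.

        # Hedged sketch: list gene items for an organism via Wikidata's public SPARQL
        # endpoint. Property/item IDs and the taxon QID are assumptions to verify.
        import requests

        QUERY = """
        SELECT ?gene ?geneLabel WHERE {
          ?gene wdt:P31 wd:Q7187 ;        # instance of: gene (assumed IDs)
                wdt:P703 wd:%s .          # found in taxon: the organism of interest
          SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
        }
        LIMIT 50
        """

        def genes_for_taxon(taxon_qid):
            resp = requests.get("https://query.wikidata.org/sparql",
                                params={"query": QUERY % taxon_qid, "format": "json"},
                                headers={"User-Agent": "example-script/0.1"})
            resp.raise_for_status()
            return [(b["gene"]["value"], b["geneLabel"]["value"])
                    for b in resp.json()["results"]["bindings"]]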

  18. WikiGenomes: an open web application for community consumption and curation of gene annotation data in Wikidata

    DOE PAGES

    Putman, Tim E.; Lelong, Sebastien; Burgstaller-Muehlbacher, Sebastian; ...

    2017-03-06

    With the advancement of genome-sequencing technologies, new genomes are being sequenced daily. Although these sequences are deposited in publicly available data warehouses, their functional and genomic annotations (beyond genes which are predicted automatically) mostly reside in the text of primary publications. Professional curators are hard at work extracting those annotations from the literature for the most studied organisms and depositing them in structured databases. However, the resources don’t exist to fund the comprehensive curation of the thousands of newly sequenced organisms in this manner. Here, we describe WikiGenomes (wikigenomes.org), a web application that facilitates the consumption and curation of genomic data by the entire scientific community. WikiGenomes is based on Wikidata, an openly editable knowledge graph with the goal of aggregating published knowledge into a free and open database. WikiGenomes empowers the individual genomic researcher to contribute their expertise to the curation effort and integrates the knowledge into Wikidata, enabling it to be accessed by anyone without restriction.

  19. Read Code Quality Assurance

    PubMed Central

    Schulz, Erich; Barrett, James W.; Price, Colin

    1998-01-01

    As controlled clinical vocabularies assume an increasing role in modern clinical information systems, so the issue of their quality demands greater attention. In order to meet the resulting stringent criteria for completeness and correctness, a quality assurance system comprising a database of more than 500 rules is being developed and applied to the Read Thesaurus. The authors discuss the requirement to apply quality assurance processes to their dynamic editing database in order to ensure the quality of exported products. Sources of errors include human, hardware, and software factors as well as new rules and transactions. The overall quality strategy includes prevention, detection, and correction of errors. The quality assurance process encompasses simple data specification, internal consistency, inspection procedures and, eventually, field testing. The quality assurance system is driven by a small number of tables and UNIX scripts, with “business rules” declared explicitly as Structured Query Language (SQL) statements. Concurrent authorship, client-server technology, and an initial failure to implement robust transaction control have all provided valuable lessons. The feedback loop for error management needs to be short. PMID:9670131
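
    The abstract above describes business rules declared explicitly as SQL statements. The sketch below shows what one such rule might look like when run from Python against a SQLite copy of an editing database; the schema (terms and term_parents tables) and the specific rule (flag active terms with no parent) are hypothetical illustrations, not the Read Thesaurus rule base.

        # Hedged sketch: a QA "business rule" declared as a SQL statement, in the
        # spirit described above. The schema and the rule itself are hypothetical.
        import sqlite3

        RULE_ORPHAN_TERMS = """
        SELECT t.term_id, t.term_name
        FROM terms AS t
        LEFT JOIN term_parents AS p ON p.child_id = t.term_id
        WHERE t.status = 'active' AND p.parent_id IS NULL;
        """

        def run_rule(db_path):
            with sqlite3.connect(db_path) as conn:
                # Each returned row is one rule violation to report for correction.
                return conn.execute(RULE_ORPHAN_TERMS).fetchall()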

  20. Pharmit: interactive exploration of chemical space.

    PubMed

    Sunseri, Jocelyn; Koes, David Ryan

    2016-07-08

    Pharmit (http://pharmit.csb.pitt.edu) provides an online, interactive environment for the virtual screening of large compound databases using pharmacophores, molecular shape and energy minimization. Users can import, create and edit virtual screening queries in an interactive browser-based interface. Queries are specified in terms of a pharmacophore, a spatial arrangement of the essential features of an interaction, and molecular shape. Search results can be further ranked and filtered using energy minimization. In addition to a number of pre-built databases of popular compound libraries, users may submit their own compound libraries for screening. Pharmit uses state-of-the-art sub-linear algorithms to provide interactive screening of millions of compounds. Queries typically take a few seconds to a few minutes depending on their complexity. This allows users to iteratively refine their search during a single session. The easy access to large chemical datasets provided by Pharmit simplifies and accelerates structure-based drug design. Pharmit is available under a dual BSD/GPL open-source license. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
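
    A pharmacophore query of the kind described above is, at its simplest, a set of typed interaction features at 3D coordinates with tolerance radii. The Python sketch below shows a generic, brute-force feature-matching check for a single conformer; it illustrates the concept only and is not Pharmit's query format or its sub-linear search algorithms.

        # Generic sketch of pharmacophore matching: a conformer matches if each query
        # feature has a same-typed conformer feature within its tolerance radius.
        # Feature names and coordinates are hypothetical; this is not Pharmit's code.
        import math

        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

        def matches(query_features, conformer_features):
            """query_features: (type, (x, y, z), radius); conformer_features: (type, (x, y, z))."""
            for ftype, center, radius in query_features:
                if not any(ct == ftype and dist(center, coord) <= radius
                           for ct, coord in conformer_features):
                    return False
            return True

        query = [("Aromatic", (0.0, 0.0, 0.0), 1.0),
                 ("HydrogenDonor", (3.5, 1.2, -0.8), 1.0)]
        conformer = [("Aromatic", (0.2, -0.1, 0.3)), ("HydrogenDonor", (3.3, 1.0, -0.5))]
        print(matches(query, conformer))   # True for this toy example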
