Sample records for internally consistent database

  1. TEDS-M 2008 User Guide for the International Database. Supplement 1: International Version of the TEDS-M Questionnaires

    ERIC Educational Resources Information Center

    Brese, Falk, Ed.

    2012-01-01

    The Teacher Education Study in Mathematics (TEDS-M) International Database includes data for all questionnaires administered as part of the TEDS-M study. These consisted of questionnaires administered to future teachers, educators, and institutions with teacher preparation programs. This supplement contains the international version of the TEDS-M…

  2. The International Space Station Comparative Maintenance Analysis (CMAM)

    DTIC Science & Technology

    2004-09-01

    The CMAM ORU database consists of three tables: an ORU master parts list, an ISS Flight table, and an ISS Subsystem table. The ORU master parts list and the ISS Flight table can be updated or modified from the CMAM user interface.

  3. Pathology Imagebase-a reference image database for standardization of pathology.

    PubMed

    Egevad, Lars; Cheville, John; Evans, Andrew J; Hörnblad, Jonas; Kench, James G; Kristiansen, Glen; Leite, Katia R M; Magi-Galluzzi, Cristina; Pan, Chin-Chen; Samaratunga, Hemamali; Srigley, John R; True, Lawrence; Zhou, Ming; Clements, Mark; Delahunt, Brett

    2017-11-01

    Despite efforts to standardize histopathology practice through the development of guidelines, the interpretation of morphology is still hampered by subjectivity. We here describe Pathology Imagebase, a novel mechanism for establishing an international standard for the interpretation of pathology specimens. The International Society of Urological Pathology (ISUP) established a reference image database through the input of experts in the field. Three panels were formed, one each for prostate, urinary bladder and renal pathology, consisting of 24 international experts. Each of the panel members uploaded microphotographs of cases into a non-public database. The remaining 23 experts were asked to vote from a multiple-choice menu. Prior to and while voting, panel members were unable to access the results of voting by the other experts. When a consensus level of at least two-thirds or 16 votes was reached, cases were automatically transferred to the main database. Consensus was reached in a total of 287 cases across five projects on the grading of prostate, bladder and renal cancer and the classification of renal tumours and flat lesions of the bladder. The full database is available to all ISUP members at www.isupweb.org. Non-members may access a selected number of cases. It is anticipated that the database will assist pathologists in calibrating their grading, and will also promote consistency in the diagnosis of difficult cases. © 2017 John Wiley & Sons Ltd.
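
    A minimal sketch of the promotion rule described above, assuming a simple majority-count check: a case is moved from the non-public upload area to the main database once at least two-thirds of the 24-member panel (16 votes) agree on one option. The function name and diagnosis labels are hypothetical; this is not the ISUP implementation.

      from collections import Counter

      PANEL_SIZE = 24        # experts per panel (prostate, bladder, renal)
      CONSENSUS_VOTES = 16   # two-thirds of the panel

      def reaches_consensus(votes):
          """Return the winning option if at least 16 votes agree, else None.

          `votes` holds the multiple-choice selections of the panel members
          other than the uploader (up to 23 voters per case).
          """
          if not votes:
              return None
          option, count = Counter(votes).most_common(1)[0]
          return option if count >= CONSENSUS_VOTES else None

      # Hypothetical example: 17 of 23 voters agree, so the case is promoted.
      votes = ["Gleason 3+4"] * 17 + ["Gleason 4+3"] * 6
      print(reaches_consensus(votes))  # Gleason 3+4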

  4. An international aerospace information system: A cooperative opportunity

    NASA Technical Reports Server (NTRS)

    Cotter, Gladys A.; Blados, Walter R.

    1992-01-01

    Scientific and technical information (STI) is a valuable resource which represents the results of large investments in research and development (R&D), and the expertise of a nation. NASA and its predecessor organizations have developed and managed the preeminent aerospace information system. We see information and information systems changing and becoming more international in scope. In Europe, consistent with joint R&D programs and a view toward a united Europe, we have seen the emergence of a European Aerospace Database concept. In addition, the development of aeronautics and astronautics in individual nations has also led to initiatives for national aerospace databases. Considering recent technological developments in information science and technology, as well as the reality of scarce resources in all nations, it is time to reconsider the mutually beneficial possibilities offered by cooperation and international resource sharing. The new possibilities offered through cooperation among the various aerospace database efforts toward an international aerospace database initiative which can optimize the cost/benefit equation for all participants are considered.

  5. SSME environment database development

    NASA Technical Reports Server (NTRS)

    Reardon, John

    1987-01-01

    The internal environment of the Space Shuttle Main Engine (SSME) is being determined from hot firings of the prototype engines and from model tests using either air or water as the test fluid. The objectives are to develop a database system to facilitate management and analysis of test measurements and results, to enter available data into the database, and to analyze available data to establish conventions and procedures to provide consistency in data normalization and configuration geometry references.

  6. The Danish Inguinal Hernia database.

    PubMed

    Friis-Andersen, Hans; Bisgaard, Thue

    2016-01-01

    To monitor and improve nation-wide surgical outcome after groin hernia repair based on scientific evidence-based surgical strategies for the national and international surgical community. Patients ≥18 years operated for groin hernia. Type and size of hernia, primary or recurrent, type of surgical repair procedure, mesh and mesh fixation methods. According to the Danish National Health Act, surgeons are obliged to register all hernia repairs immediately after surgery (3-minute registration time). All institutions have continuous access to their own data stratified on individual surgeons. Registrations are based on a closed, protected Internet system requiring personal codes also identifying the operating institution. A national steering committee consisting of 13 voluntary and dedicated surgeons, 11 of whom are unpaid, handles the medical management of the database. The Danish Inguinal Hernia Database comprises intraoperative data from >130,000 repairs (May 2015). A total of 49 peer-reviewed national and international publications have been published from the database (June 2015). The Danish Inguinal Hernia Database is fully active, monitoring surgical quality, and contributes to improving outcome after groin hernia repair for the national and international surgical community.

  7. Toward unification of taxonomy databases in a distributed computer environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitakami, Hajime; Tateno, Yoshio; Gojobori, Takashi

    1994-12-31

    All the taxonomy databases constructed with the DNA databases of the international DNA data banks are powerful electronic dictionaries which aid in biological research by computer. The taxonomy databases are, however, not consistently unified with a relational format. If we can achieve consistent unification of the taxonomy databases, it will be useful in comparing many research results, and investigating future research directions from existing research results. In particular, it will be useful in comparing relationships between phylogenetic trees inferred from molecular data and those constructed from morphological data. The goal of the present study is to unify the existing taxonomy databases and eliminate inconsistencies (errors) that are present in them. Inconsistencies occur particularly in the restructuring of the existing taxonomy databases, since classification rules for constructing the taxonomy have rapidly changed with biological advancements. A repair system is needed to remove inconsistencies in each data bank and mismatches among data banks. This paper describes a new methodology for removing both inconsistencies and mismatches from the databases on a distributed computer environment. The methodology is implemented in a relational database management system, SYBASE.
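
    The mismatch-removal step described above can be illustrated with a small sketch that compares the lineage recorded for the same name in different data banks; the bank names, species and lineages below are hypothetical examples, and the actual system is implemented in SYBASE.

      lineages = {
          "bank_A": {"Homo sapiens": ("Eukaryota", "Chordata", "Mammalia", "Primates")},
          "bank_B": {"Homo sapiens": ("Eukaryota", "Chordata", "Mammalia", "Primates")},
          "bank_C": {"Homo sapiens": ("Eukaryota", "Chordata", "Mammalia", "Primate")},
      }

      def find_mismatches(banks):
          """List names whose recorded lineages differ between data banks."""
          names = set().union(*(entries.keys() for entries in banks.values()))
          mismatches = []
          for name in names:
              found = {bank: entries[name] for bank, entries in banks.items() if name in entries}
              if len(set(found.values())) > 1:
                  mismatches.append((name, found))
          return mismatches

      for name, found in find_mismatches(lineages):
          print(name, found)   # flags bank_C's divergent lineage for repair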

  8. Development of an Integrated Biospecimen Database among the Regional Biobanks in Korea.

    PubMed

    Park, Hyun Sang; Cho, Hune; Kim, Hwa Sun

    2016-04-01

    This study developed an integrated database for 15 regional biobanks that provides large quantities of high-quality bio-data to researchers to be used for the prevention of disease, for the development of personalized medicines, and in genetics studies. We collected raw data, managed independently by 15 regional biobanks, for database modeling and analyzed and defined the metadata of the items. We also built a three-step (high, middle, and low) classification system for classifying the item concepts based on the metadata. To generate clear meanings of the items, clinical items were defined using the Systematized Nomenclature of Medicine Clinical Terms, and specimen items were defined using the Logical Observation Identifiers Names and Codes. To optimize database performance, we set up a multi-column index based on the classification system and the international standard code. As a result of subdividing 7,197,252 raw data items collected, we refined the metadata into 1,796 clinical items and 1,792 specimen items. The classification system consists of 15 high, 163 middle, and 3,588 low class items. International standard codes were linked to 69.9% of the clinical items and 71.7% of the specimen items. The database consists of 18 tables built on MySQL Server 5.6. As a result of the performance evaluation, the multi-column index shortened query time by as much as a factor of nine. The database developed was based on an international standard terminology system, providing an infrastructure that can integrate the 7,197,252 raw data items managed by the 15 regional biobanks. In particular, it resolved the inevitable interoperability issues in the exchange of information among the biobanks, and provided a solution to the synonym problem, which arises when the same concept is expressed in a variety of ways.
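
    The indexing idea described above can be sketched as follows; the table and column names are hypothetical, and SQLite stands in for the production MySQL Server 5.6 schema, which has 18 tables.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE clinical_item (
          item_id      INTEGER PRIMARY KEY,
          high_class   TEXT NOT NULL,   -- one of the 15 high-level classes
          middle_class TEXT NOT NULL,   -- one of the 163 middle-level classes
          low_class    TEXT NOT NULL,   -- one of the 3,588 low-level classes
          snomed_ct    TEXT,            -- SNOMED CT code where a mapping exists
          value        TEXT
      );
      CREATE INDEX idx_classification
          ON clinical_item (high_class, middle_class, low_class, snomed_ct);
      """)

      # Queries filtering on the leading index columns can use the index, which
      # is the kind of access pattern behind the reported query-time reduction.
      cur = conn.execute(
          "SELECT item_id, value FROM clinical_item "
          "WHERE high_class = ? AND middle_class = ?",
          ("laboratory", "hematology"),
      )
      print(cur.fetchall())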

  9. Psychometrics and the neuroscience of individual differences: Internal consistency limits between-subjects effects.

    PubMed

    Hajcak, Greg; Meyer, Alexandria; Kotov, Roman

    2017-08-01

    In the clinical neuroscience literature, between-subjects differences in neural activity are presumed to reflect reliable measures, even though the psychometric properties of neural measures are almost never reported. The current article focuses on the critical importance of assessing and reporting internal consistency reliability, that is, the homogeneity of "items" that comprise a neural "score." We demonstrate how variability in the internal consistency of neural measures limits between-subjects (i.e., individual differences) effects. To this end, we utilize error-related brain activity (i.e., the error-related negativity or ERN) in both healthy and generalized anxiety disorder (GAD) participants to demonstrate options for psychometric analyses of neural measures; we examine between-groups differences in internal consistency, between-groups effect sizes, and between-groups discriminability (i.e., ROC analyses), all as a function of increasing items (i.e., number of trials). Overall, internal consistency should be used to inform experimental design and the choice of neural measures in individual differences research. The internal consistency of neural measures is necessary for interpreting results and guiding progress in clinical neuroscience, and should be routinely reported in all individual differences studies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
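
    One common internal-consistency estimator for a subjects-by-trials matrix of single-trial amplitudes is Cronbach's alpha, sketched below with hypothetical data; the paper's own analyses (e.g., trial-count-dependent split-half estimates) may differ in detail.

      import numpy as np

      def cronbach_alpha(scores):
          """Cronbach's alpha for a subjects x items (here, trials) array."""
          scores = np.asarray(scores, dtype=float)
          k = scores.shape[1]                            # number of trials ("items")
          item_vars = scores.var(axis=0, ddof=1).sum()   # sum of per-trial variances
          total_var = scores.sum(axis=1).var(ddof=1)     # variance of summed scores
          return k / (k - 1) * (1 - item_vars / total_var)

      # Hypothetical example: 40 subjects, 8 single-trial ERN amplitudes each.
      rng = np.random.default_rng(0)
      subject_effect = rng.normal(size=(40, 1))
      data = subject_effect + rng.normal(scale=1.5, size=(40, 8))
      print(round(cronbach_alpha(data), 2))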

  10. Designing an international industrial hygiene database of exposures among workers in the asphalt industry.

    PubMed

    Burstyn, I; Kromhout, H; Cruise, P J; Brennan, P

    2000-01-01

    The objective of this project was to construct a database of exposure measurements which would be used to retrospectively assess the intensity of various exposures in an epidemiological study of cancer risk among asphalt workers. The database was developed as a stand-alone Microsoft Access 2.0 application, which could work in each of the national centres. Exposure data included in the database comprised measurements of exposure levels, plus supplementary information on production characteristics which was analogous to that used to describe companies enrolled in the study. The database has been successfully implemented in eight countries, demonstrating the flexibility and data security features adequate to the task. The database allowed retrieval and consistent coding of 38 data sets of which 34 have never been described in peer-reviewed scientific literature. We were able to collect most of the data intended. As of February 1999 the database consisted of 2007 sets of measurements from persons or locations. The measurements appeared to be free from any obvious bias. The methodology embodied in the creation of the database can be usefully employed to develop exposure assessment tools in epidemiological studies.

  11. Technical Work Plan for: Thermodynamic Database for Chemical Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. F. Jove Colon

    The objective of the work scope covered by this Technical Work Plan (TWP) is to correct and improve the Yucca Mountain Project (YMP) thermodynamic databases, to update their documentation, and to ensure reasonable consistency among them. In addition, the work scope will continue to generate database revisions, which are organized and named so as to be transparent to internal and external users and reviewers. Regarding consistency among databases, it is noted that aqueous speciation and mineral solubility data for a given system may differ according to how solubility was determined, and the method used for subsequent retrieval of thermodynamic parameter values from measured data. Of particular concern are the details of the determination of "infinite dilution" constants, which involve the use of specific methods for activity coefficient corrections. That is, equilibrium constants developed for a given system for one set of conditions may not be consistent with constants developed for other conditions, depending on the species considered in the chemical reactions and the methods used in the reported studies. Hence, there will be some differences (for example in log K values) between the Pitzer and "B-dot" database parameters for the same reactions or species.
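
    For orientation, the "B-dot" model referred to above is an extended Debye-Hückel expression for single-ion activity coefficients; a standard form (shown here as background, not as the exact YMP formulation) is

      \[
        \log \gamma_i \;=\; -\,\frac{A\, z_i^{2} \sqrt{I}}{1 + \mathring{a}_i B \sqrt{I}} \;+\; \dot{B}\, I,
        \qquad
        I \;=\; \tfrac{1}{2}\sum_j m_j z_j^{2},
      \]

    where gamma_i is the activity coefficient of ion i, z_i its charge, a-ring_i its ion-size parameter, I the ionic strength (molal), and A, B, B-dot are temperature-dependent constants. Equilibrium constants extrapolated to infinite dilution with this model need not match those retrieved with a Pitzer ion-interaction model, which is the consistency concern noted above.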

  12. ICAO CAEP modelling and databases task force, ICAO goals assessment results

    DOT National Transportation Integrated Search

    2010-05-12

    This presentation was given at the International Civil Aviation Organization, Session Two - Aviation Emissions Quantification and MRV. Results present the consensus view of MODTF, and are consistent across multiple models. Primary results based on IC...

  13. Image-guided decision support system for pulmonary nodule classification in 3D thoracic CT images

    NASA Astrophysics Data System (ADS)

    Kawata, Yoshiki; Niki, Noboru; Ohmatsu, Hironobu; Kusumoto, Masahiro; Kakinuma, Ryutaro; Mori, Kiyoshi; Yamada, Kozo; Nishiyama, Hiroyuki; Eguchi, Kenji; Kaneko, Masahiro; Moriyama, Noriyuki

    2004-05-01

    The purpose of this study is to develop an image-guided decision support system that assists decision-making in clinical differential diagnosis of pulmonary nodules. This approach retrieves and displays nodules that exhibit morphological and internal profiles consistent with the nodule in question. It uses a three-dimensional (3-D) CT image database of pulmonary nodules for which diagnosis is known. In order to build the system, the following issues must be solved: 1) to categorize the nodule database with respect to morphological and internal features, 2) to quickly search nodule images similar to an indeterminate nodule from a large database, and 3) to reveal malignancy likelihood computed by using similar nodule images. In particular, the first issue influences the design of the others. The successful categorization of nodule patterns might lead physicians to find important cues that characterize benign and malignant nodules. This paper focuses on an approach to categorize the nodule database with respect to nodule shape and CT density patterns inside the nodule.
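
    The retrieval step (issue 2 above) can be sketched as a nearest-neighbour search over numeric shape and internal-density features; the feature vectors, case names and labels below are hypothetical, and the paper's actual categorization and similarity measures may differ.

      import numpy as np

      database = {
          "case01": (np.array([0.82, 0.15, 0.40]), "benign"),
          "case02": (np.array([0.35, 0.70, 0.90]), "malignant"),
          "case03": (np.array([0.40, 0.65, 0.85]), "malignant"),
      }

      def retrieve_similar(query, db, k=2):
          """Return the k diagnosed nodules closest to the query feature vector."""
          ranked = sorted((float(np.linalg.norm(query - feats)), name, label)
                          for name, (feats, label) in db.items())
          return ranked[:k]

      query = np.array([0.38, 0.68, 0.88])          # indeterminate nodule features
      neighbours = retrieve_similar(query, database)
      for dist, name, label in neighbours:
          print(f"{name}: {label} (distance {dist:.3f})")

      # A simple malignancy likelihood is the fraction of malignant neighbours.
      print(sum(label == "malignant" for _, _, label in neighbours) / len(neighbours))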

  14. The ICARDA agro-climate tool

    USDA-ARS's Scientific Manuscript database

    A Visual Basic agro-climate application by climatologists at the International Center for Agricultural Research in the Dry Areas and the U.S. Department of Agriculture is described here. The database from which the application derives climate information consists of weather generator parameters der...

  15. The ICARDA Agro-Climate Tool

    USDA-ARS's Scientific Manuscript database

    A Visual Basic agro-climate application developed by climatologists at the International Center for Agricultural Research in the Dry Areas and the U.S. Department of Agriculture is described here. The database from which the application derives climate information consists of weather generator param...

  16. Fifteen years of occupational and environmental health projects support in Brazil, Chile, and Mexico: a report from Mount Sinai School of Medicine ITREOH program, 1995-2010.

    PubMed

    Peres, Frederico; Claudio, Luz

    2013-01-01

    The Fogarty International Center of the National Institutes of Health created the International Training and Research Program in Occupational and Environmental Health (ITREOH program) in 1995 with the aim of training environmental and occupational health scientists in developing countries. Mount Sinai School of Medicine has been a grantee of this program since its inception, partnering with research institutions in Brazil, Chile, and Mexico. This article evaluates Mount Sinai's program in order to determine whether it has contributed to the specific research capacity needs of the international partners. Information was obtained from: (a) international and regional scientific literature databases; (b) databases from the three participating countries; and (c) MSSM ITREOH Program Database. Most of the research projects supported by the program were consistent with the themes found to be top priorities for the partner countries based on mortality/morbidity and research themes in the literature. Indirect effects of the training and the subsequent research projects completed by the trained fellows in the program included health policy changes and development of collaborative international projects. International research training programs, such as the MSSM ITREOH, that strengthen scientific research capacity in occupational and environmental health in Latin America can make a significant impact on the most pressing health issues in the partner countries. Copyright © 2012 Wiley Periodicals, Inc.

  17. The Protein Information Resource: an integrated public resource of functional annotation of proteins

    PubMed Central

    Wu, Cathy H.; Huang, Hongzhan; Arminski, Leslie; Castro-Alvear, Jorge; Chen, Yongxing; Hu, Zhang-Zhi; Ledley, Robert S.; Lewis, Kali C.; Mewes, Hans-Werner; Orcutt, Bruce C.; Suzek, Baris E.; Tsugita, Akira; Vinayaka, C. R.; Yeh, Lai-Su L.; Zhang, Jian; Barker, Winona C.

    2002-01-01

    The Protein Information Resource (PIR) serves as an integrated public resource of functional annotation of protein data to support genomic/proteomic research and scientific discovery. The PIR, in collaboration with the Munich Information Center for Protein Sequences (MIPS) and the Japan International Protein Information Database (JIPID), produces the PIR-International Protein Sequence Database (PSD), the major annotated protein sequence database in the public domain, containing about 250 000 proteins. To improve protein annotation and the coverage of experimentally validated data, a bibliography submission system is developed for scientists to submit, categorize and retrieve literature information. Comprehensive protein information is available from iProClass, which includes family classification at the superfamily, domain and motif levels, structural and functional features of proteins, as well as cross-references to over 40 biological databases. To provide timely and comprehensive protein data with source attribution, we have introduced a non-redundant reference protein database, PIR-NREF. The database consists of about 800 000 proteins collected from PIR-PSD, SWISS-PROT, TrEMBL, GenPept, RefSeq and PDB, with composite protein names and literature data. To promote database interoperability, we provide XML data distribution and open database schema, and adopt common ontologies. The PIR web site (http://pir.georgetown.edu/) features data mining and sequence analysis tools for information retrieval and functional identification of proteins based on both sequence and annotation information. The PIR databases and other files are also available by FTP (ftp://nbrfa.georgetown.edu/pir_databases). PMID:11752247

  18. Atlantic Ocean CARINA data: overview and salinity adjustments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanhua, T.; Steinfeldt, R.; Key, Robert

    2010-01-01

    Water column data of carbon and carbon-relevant hydrographic and hydrochemical parameters from 188 previously non-publicly available cruise data sets in the Arctic Mediterranean Seas, Atlantic and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon dioxide IN the Atlantic Ocean). The data have gone through rigorous quality control procedures to assure the highest possible quality and consistency. The data for the pertinent parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the three data products: merged data files with measured, calculated and interpolated data for each of the three CARINA regions, i.e. the Arctic Mediterranean Seas, the Atlantic and the Southern Ocean. These products have been corrected to be internally consistent. Ninety-eight of the cruises in the CARINA database were conducted in the Atlantic Ocean, defined here as the region south of the Greenland-Iceland-Scotland Ridge and north of about 30° S. Here we present an overview of the Atlantic Ocean synthesis of the CARINA data and the adjustments that were applied to the data product. We also report the details of the secondary QC (Quality Control) for salinity for this data set. Procedures of quality control including crossover analysis between stations and inversion analysis of all crossover data are briefly described. Adjustments to salinity measurements were applied to the data from 10 cruises in the Atlantic Ocean region. Based on our analysis we estimate the internal consistency of the CARINA-ATL salinity data to be 4.1 ppm. With these adjustments the CARINA data products are consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s, and is now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.

  19. Atlantic Ocean CARINA data: overview and salinity adjustments

    NASA Astrophysics Data System (ADS)

    Tanhua, T.; Steinfeldt, R.; Key, R. M.; Brown, P.; Gruber, N.; Wanninkhof, R.; Perez, F.; Körtzinger, A.; Velo, A.; Schuster, U.; van Heuven, S.; Bullister, J. L.; Stendardo, I.; Hoppema, M.; Olsen, A.; Kozyr, A.; Pierrot, D.; Schirnick, C.; Wallace, D. W. R.

    2010-02-01

    Water column data of carbon and carbon-relevant hydrographic and hydrochemical parameters from 188 previously non-publicly available cruise data sets in the Arctic Mediterranean Seas, Atlantic and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon dioxide IN the Atlantic Ocean). The data have gone through rigorous quality control procedures to assure the highest possible quality and consistency. The data for the pertinent parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the three data products: merged data files with measured, calculated and interpolated data for each of the three CARINA regions, i.e. the Arctic Mediterranean Seas, the Atlantic and the Southern Ocean. These products have been corrected to be internally consistent. Ninety-eight of the cruises in the CARINA database were conducted in the Atlantic Ocean, defined here as the region south of the Greenland-Iceland-Scotland Ridge and north of about 30° S. Here we present an overview of the Atlantic Ocean synthesis of the CARINA data and the adjustments that were applied to the data product. We also report the details of the secondary QC (Quality Control) for salinity for this data set. Procedures of quality control - including crossover analysis between stations and inversion analysis of all crossover data - are briefly described. Adjustments to salinity measurements were applied to the data from 10 cruises in the Atlantic Ocean region. Based on our analysis we estimate the internal consistency of the CARINA-ATL salinity data to be 4.1 ppm. With these adjustments the CARINA data products are consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s, and is now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.
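
    A first step of the crossover analysis mentioned in both records above can be sketched as follows: interpolate two cruises' deep-water salinity profiles onto common levels and estimate a multiplicative offset. The profiles are hypothetical, and the real CARINA secondary QC combines many crossovers by weighted least squares and an inversion to derive the cruise adjustments.

      import numpy as np

      def crossover_offset(depth_a, sal_a, depth_b, sal_b, min_depth=1500.0):
          """Mean and spread of the salinity ratio of cruise A to cruise B below min_depth."""
          levels = np.arange(min_depth, 4000.0, 100.0)
          sa = np.interp(levels, depth_a, sal_a)
          sb = np.interp(levels, depth_b, sal_b)
          ratio = sa / sb
          return ratio.mean(), ratio.std(ddof=1)

      # Hypothetical station profiles (depth in m, practical salinity).
      depth_a = np.array([0, 500, 1500, 2500, 3500, 4000.0])
      sal_a   = np.array([35.20, 35.00, 34.950, 34.920, 34.900, 34.890])
      depth_b = np.array([0, 600, 1600, 2600, 3600, 4000.0])
      sal_b   = np.array([35.10, 34.990, 34.948, 34.919, 34.899, 34.889])

      mean_ratio, spread = crossover_offset(depth_a, sal_a, depth_b, sal_b)
      print(f"offset {mean_ratio:.6f} +/- {spread:.6f}")  # ~1 means the cruises agree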

  20. The reliability paradox of the Parent-Child Conflict Tactics Corporal Punishment Subscale.

    PubMed

    Lorber, Michael F; Slep, Amy M Smith

    2018-02-01

    In the present investigation we consider and explain an apparent paradox in the measurement of corporal punishment with the Parent-Child Conflict Tactics Scale (CTS-PC): How can it have poor internal consistency and still be reliable? The CTS-PC was administered to a community sample of 453 opposite sex couples who were parents of 3- to 7-year-old children. Internal consistency was marginal, yet item response theory analyses revealed that reliability rose sharply with increasing corporal punishment, exceeding .80 in the upper ranges of the construct. The results suggest that the CTS-PC Corporal Punishment subscale reliably discriminates among parents who report average to high corporal punishment (64% of mothers and 56% of fathers in the present sample), despite low overall internal consistency. These results have straightforward implications for the use and reporting of the scale. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  1. CARINA data synthesis project: pH data scale unification and cruise adjustments

    NASA Astrophysics Data System (ADS)

    Velo, A.; Pérez, F. F.; Lin, X.; Key, R. M.; Tanhua, T.; de La Paz, M.; Olsen, A.; van Heuven, S.; Jutterström, S.; Ríos, A. F.

    2010-05-01

    Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from 188 previously non-publicly available cruise data sets in the Arctic Mediterranean Seas (AMS), Atlantic Ocean and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon IN the Atlantic Ocean). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values. Systematic biases found in the data have been corrected in the data products, three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; AMS, Atlantic Ocean and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 59 reported pH measured values. All reported pH data have been unified to the Sea-Water Scale (SWS) at 25 °C. Here we present details of the secondary QC of pH in the CARINA database and the scale unification to SWS at 25 °C. The pH scale has been converted for 36 cruises. Procedures of quality control, including crossover analysis between cruises and inversion analysis are described. Adjustments were applied to the pH values for 21 of the cruises in the CARINA dataset. With these adjustments the CARINA database is consistent both internally as well as with the GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis we estimate the internal consistency of the CARINA pH data to be 0.005 pH units. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates, for ocean acidification assessment and for model validation.

  2. LHCb Conditions database operation assistance systems

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Shapoval, I.; Cattaneo, M.; Degaudenzi, H.; Santinelli, R.

    2012-12-01

    The Conditions Database (CondDB) of the LHCb experiment provides versioned, time dependent geometry and conditions data for all LHCb data processing applications (simulation, high level trigger (HLT), reconstruction, analysis) in a heterogeneous computing environment ranging from user laptops to the HLT farm and the Grid. These different use cases impose front-end support for multiple database technologies (Oracle and SQLite are used). Sophisticated distribution tools are required to ensure timely and robust delivery of updates to all environments. The content of the database has to be managed to ensure that updates are internally consistent and externally compatible with multiple versions of the physics application software. In this paper we describe three systems that we have developed to address these issues. The first system is a CondDB state tracking extension to the Oracle 3D Streams replication technology, to trap cases when the CondDB replication was corrupted. Second, an automated distribution system for the SQLite-based CondDB, also providing smart backup and checkout mechanisms for the CondDB managers and LHCb users respectively. And, finally, a system to verify and monitor the internal (CondDB self-consistency) and external (LHCb physics software vs. CondDB) compatibility. The former two systems are used in production in the LHCb experiment and have achieved the desired goal of higher flexibility and robustness for the management and operation of the CondDB. The latter has been fully designed and is currently moving to the implementation stage.

  3. Performance of Stratified and Subgrouped Disproportionality Analyses in Spontaneous Databases.

    PubMed

    Seabroke, Suzie; Candore, Gianmario; Juhlin, Kristina; Quarcoo, Naashika; Wisniewski, Antoni; Arani, Ramin; Painter, Jeffery; Tregunno, Philip; Norén, G Niklas; Slattery, Jim

    2016-04-01

    Disproportionality analyses are used in many organisations to identify adverse drug reactions (ADRs) from spontaneous report data. Reporting patterns vary over time, with patient demographics, and between different geographical regions, and therefore subgroup analyses or adjustment by stratification may be beneficial. The objective of this study was to evaluate the performance of subgroup and stratified disproportionality analyses for a number of key covariates within spontaneous report databases of differing sizes and characteristics. Using a reference set of established ADRs, signal detection performance (sensitivity and precision) was compared for stratified, subgroup and crude (unadjusted) analyses within five spontaneous report databases (two company, one national and two international databases). Analyses were repeated for a range of covariates: age, sex, country/region of origin, calendar time period, event seriousness, vaccine/non-vaccine, reporter qualification and report source. Subgroup analyses consistently performed better than stratified analyses in all databases. Subgroup analyses also showed benefits in both sensitivity and precision over crude analyses for the larger international databases, whilst for the smaller databases a gain in precision tended to result in some loss of sensitivity. Additionally, stratified analyses did not increase sensitivity or precision beyond that associated with analytical artefacts of the analysis. The most promising subgroup covariates were age and region/country of origin, although this varied between databases. Subgroup analyses perform better than stratified analyses and should be considered over the latter in routine first-pass signal detection. Subgroup analyses are also clearly beneficial over crude analyses for larger databases, but further validation is required for smaller databases.
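
    A minimal sketch of one disproportionality measure, the proportional reporting ratio (PRR), computed separately within subgroups as in the subgroup approach described above; the counts are hypothetical, and the study evaluated several measures and databases.

      def prr(a, b, c, d):
          """Proportional reporting ratio from a 2x2 table of spontaneous reports:
          a = drug of interest with the event, b = drug of interest with other events,
          c = other drugs with the event, d = other drugs with other events."""
          return (a / (a + b)) / (c / (c + d))

      # Subgroup analysis: the statistic is computed within each stratum and the
      # subgroup results are screened separately, rather than pooled as in a
      # stratified (adjusted) analysis.
      reports_by_age = {
          "age < 65": (30, 970, 300, 98700),
          "age >= 65": (45, 455, 250, 49250),
      }
      for subgroup, counts in reports_by_age.items():
          print(subgroup, round(prr(*counts), 2))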

  4. Proposal of Network-Based Multilingual Space Dictionary Database System

    NASA Astrophysics Data System (ADS)

    Yoshimitsu, T.; Hashimoto, T.; Ninomiya, K.

    2002-01-01

    The International Academy of Astronautics (IAA) is now constructing a multilingual dictionary database system of space-friendly terms. The database consists of a lexicon and dictionaries of multiple languages. The lexicon is a table which relates corresponding terminology in different languages. Each language has a dictionary which contains terms and their definitions. The database is intended for use on the internet. Updating and searching the terms and definitions are conducted via the network. Maintaining the database is conducted by international cooperation. A new word arises day by day, thus being able to easily input new words and their definitions into the database is required for the longstanding success of the system. The main key of the database is an English term which is approved at the table held once or twice with the working group members. Each language has at least one working group member who is responsible for assigning the corresponding term and the definition of the term in his/her native language. Inputting and updating terms and their definitions can be conducted via the internet from the office of each member, which may be located in his/her native country. The system is constructed from a freely distributed database server program running on the Linux operating system, which will be installed at the head office of IAA. Once it is installed, it will be open to all IAA members, who can search the terms via the internet. Currently the authors are constructing the prototype system which is described in this paper.
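
    The lexicon-plus-dictionaries structure described above can be sketched with two small mappings; the terms and definitions are hypothetical examples, not entries from the IAA database.

      # The lexicon relates the approved English key term to the corresponding
      # terms in other languages; each language has its own definition dictionary.
      lexicon = {
          "attitude control": {"fr": "contrôle d'attitude", "de": "Lageregelung"},
      }
      dictionaries = {
          "en": {"attitude control": "Maintaining a spacecraft's orientation in a required direction."},
          "fr": {"contrôle d'attitude": "Maintien de l'orientation d'un véhicule spatial dans une direction requise."},
      }

      def lookup(term_en, lang):
          """Return the term in `lang` for an approved English key term, plus its definition if present."""
          term = term_en if lang == "en" else lexicon[term_en][lang]
          return term, dictionaries.get(lang, {}).get(term)

      print(lookup("attitude control", "fr"))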

  5. CARINA data synthesis project: pH data scale unification and cruise adjustments

    NASA Astrophysics Data System (ADS)

    Velo, A.; Pérez, F. F.; Lin, X.; Key, R. M.; Tanhua, T.; de La Paz, M.; van Heuven, S.; Jutterström, S.; Ríos, A. F.

    2009-10-01

    Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic Mediterranean Seas (AMS), Atlantic and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; AMS, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 59 reported pH measured values. Here we present details of the secondary QC on pH for the CARINA database. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the pH values for 21 of the cruises in the CARINA dataset. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis we estimate the internal accuracy of the CARINA pH data to be 0.005 pH units. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.

  6. X-Ray Transition Energies Database

    National Institute of Standards and Technology Data Gateway

    SRD 128 NIST X-Ray Transition Energies Database (Web, free access). This X-ray transition table provides the energies and wavelengths for the K and L transitions connecting energy levels having principal quantum numbers n = 1, 2, 3, and 4. The elements covered range from Z = 10 (neon) to Z = 100 (fermium). There are two unique features of this database: (1) a serious attempt to have all experimental values on a scale consistent with the International System of measurement (the SI) and (2) inclusion of accurate theoretical estimates for all transitions.

  7. Validating hospital antibiotic purchasing data as a metric of inpatient antibiotic use.

    PubMed

    Tan, Charlie; Ritchie, Michael; Alldred, Jason; Daneman, Nick

    2016-02-01

    Antibiotic purchasing data are a widely used, but unsubstantiated, measure of antibiotic consumption. To validate this source, we compared purchasing data from hospitals and external medical databases with patient-level dispensing data. Antibiotic purchasing and dispensing data from internal hospital records and purchasing data from IMS Health were obtained for two hospitals between May 2013 and April 2015. Internal purchasing data were validated against dispensing data, and IMS data were compared with both internal metrics. Scatterplots of individual antimicrobial data points were generated; Pearson's correlation and linear regression coefficients were computed. A secondary analysis re-examined these correlations over shorter calendar periods. Internal purchasing data were strongly correlated with dispensing data, with correlation coefficients of 0.90 (95% CI = 0.83-0.95) and 0.98 (95% CI = 0.95-0.99) at hospitals A and B, respectively. Although dispensing data were consistently lower than purchasing data, this was attributed to a single antibiotic at both hospitals. IMS data were favourably correlated with, but underestimated, internal purchasing and dispensing data. This difference was accounted for by eight antibiotics for which direct sales from some manufacturers were not included in the IMS database. The correlation between purchasing and dispensing data was consistent across periods as short as 3 months, but not at monthly intervals. Both internal and external antibiotic purchasing data are strongly correlated with dispensing data. If outliers are accounted for appropriately, internal purchasing data could be used for cost-effective evaluation of antimicrobial stewardship programmes, and external data sets could be used for surveillance and research across geographical regions. © The Author 2015. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
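
    The core comparison described above reduces to a correlation and a regression between per-antibiotic totals from the two data sources; the sketch below uses hypothetical values, not the study's data.

      import numpy as np

      purchased = np.array([1200, 850, 430, 95, 3100, 660, 78, 540.0])     # per-antibiotic purchases
      dispensed = np.array([1150, 800, 420, 90, 2950, 640, 70, 520.0])     # per-antibiotic dispensing

      r = np.corrcoef(purchased, dispensed)[0, 1]              # Pearson's correlation
      slope, intercept = np.polyfit(purchased, dispensed, 1)   # least-squares fit
      print(f"Pearson r = {r:.2f}, slope = {slope:.2f}, intercept = {intercept:.1f}")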

  8. Validating hospital antibiotic purchasing data as a metric of inpatient antibiotic use

    PubMed Central

    Tan, Charlie; Ritchie, Michael; Alldred, Jason; Daneman, Nick

    2016-01-01

    Objectives Antibiotic purchasing data are a widely used, but unsubstantiated, measure of antibiotic consumption. To validate this source, we compared purchasing data from hospitals and external medical databases with patient-level dispensing data. Methods Antibiotic purchasing and dispensing data from internal hospital records and purchasing data from IMS Health were obtained for two hospitals between May 2013 and April 2015. Internal purchasing data were validated against dispensing data, and IMS data were compared with both internal metrics. Scatterplots of individual antimicrobial data points were generated; Pearson's correlation and linear regression coefficients were computed. A secondary analysis re-examined these correlations over shorter calendar periods. Results Internal purchasing data were strongly correlated with dispensing data, with correlation coefficients of 0.90 (95% CI = 0.83–0.95) and 0.98 (95% CI = 0.95–0.99) at hospitals A and B, respectively. Although dispensing data were consistently lower than purchasing data, this was attributed to a single antibiotic at both hospitals. IMS data were favourably correlated with, but underestimated, internal purchasing and dispensing data. This difference was accounted for by eight antibiotics for which direct sales from some manufacturers were not included in the IMS database. The correlation between purchasing and dispensing data was consistent across periods as short as 3 months, but not at monthly intervals. Conclusions Both internal and external antibiotic purchasing data are strongly correlated with dispensing data. If outliers are accounted for appropriately, internal purchasing data could be used for cost-effective evaluation of antimicrobial stewardship programmes, and external data sets could be used for surveillance and research across geographical regions. PMID:26546668

  9. VizieR Online Data Catalog: AAVSO International Variable Star Index VSX (Watson+, 2006-2014)

    NASA Astrophysics Data System (ADS)

    Watson, C.; Henden, A. A.; Price, A.

    2017-05-01

    This file contains Galactic stars known or suspected to be variable. It lists all stars that have an entry in the AAVSO International Variable Star Index (VSX; http://www.aavso.org/vsx). The database consisted initially of the General Catalogue of Variable Stars (GCVS) and the New Catalogue of Suspected Variables (NSV) and was then supplemented with a large number of variable star catalogues, as well as individual variable star discoveries or variables found in the literature. Effort has also been invested to update the entries with the latest information regarding position, type and period and to remove duplicates. The VSX database is being continually updated and maintained. For historical reasons some objects outside of the Galaxy have been included. (3 data files).

  10. VizieR Online Data Catalog: AAVSO International Variable Star Index VSX (Watson+, 2006-2014)

    NASA Astrophysics Data System (ADS)

    Watson, C.; Henden, A. A.; Price, A.

    2018-05-01

    This file contains Galactic stars known or suspected to be variable. It lists all stars that have an entry in the AAVSO International Variable Star Index (VSX; http://www.aavso.org/vsx). The database consisted initially of the General Catalogue of Variable Stars (GCVS) and the New Catalogue of Suspected Variables (NSV) and was then supplemented with a large number of variable star catalogues, as well as individual variable star discoveries or variables found in the literature. Effort has also been invested to update the entries with the latest information regarding position, type and period and to remove duplicates. The VSX database is being continually updated and maintained. For historical reasons some objects outside of the Galaxy have been included. (3 data files).

  11. MetaBar - a tool for consistent contextual data acquisition and standards compliant submission.

    PubMed

    Hankeln, Wolfgang; Buttigieg, Pier Luigi; Fink, Dennis; Kottmann, Renzo; Yilmaz, Pelin; Glöckner, Frank Oliver

    2010-06-30

    Environmental sequence datasets are increasing at an exponential rate; however, the vast majority of them lack appropriate descriptors like sampling location, time and depth/altitude: generally referred to as metadata or contextual data. The consistent capture and structured submission of these data is crucial for integrated data analysis and ecosystems modeling. The application MetaBar has been developed to support consistent contextual data acquisition. MetaBar is a spreadsheet and web-based software tool designed to assist users in the consistent acquisition, electronic storage, and submission of contextual data associated with their samples. A preconfigured Microsoft Excel spreadsheet is used to initiate structured contextual data storage in the field or laboratory. Each sample is given a unique identifier and at any stage the sheets can be uploaded to the MetaBar database server. To label samples, identifiers can be printed as barcodes. An intuitive web interface provides quick access to the contextual data in the MetaBar database as well as user and project management capabilities. Export functions facilitate contextual and sequence data submission to the International Nucleotide Sequence Database Collaboration (INSDC), comprising the DNA DataBase of Japan (DDBJ), the European Molecular Biology Laboratory database (EMBL) and GenBank. MetaBar requests and stores contextual data in compliance with the Genomic Standards Consortium specifications. The MetaBar open source code base for local installation is available under the GNU General Public License version 3 (GNU GPL3). The MetaBar software supports the typical workflow from data acquisition and field-sampling to contextual data enriched sequence submission to an INSDC database. The integration with the megx.net marine Ecological Genomics database and portal facilitates georeferenced data integration and metadata-based comparisons of sampling sites as well as interactive data visualization. The ample export functionalities and the INSDC submission support enable the exchange of data across disciplines and the safeguarding of contextual data.

  12. MetaBar - a tool for consistent contextual data acquisition and standards compliant submission

    PubMed Central

    2010-01-01

    Background Environmental sequence datasets are increasing at an exponential rate; however, the vast majority of them lack appropriate descriptors like sampling location, time and depth/altitude: generally referred to as metadata or contextual data. The consistent capture and structured submission of these data is crucial for integrated data analysis and ecosystems modeling. The application MetaBar has been developed to support consistent contextual data acquisition. Results MetaBar is a spreadsheet and web-based software tool designed to assist users in the consistent acquisition, electronic storage, and submission of contextual data associated with their samples. A preconfigured Microsoft® Excel® spreadsheet is used to initiate structured contextual data storage in the field or laboratory. Each sample is given a unique identifier and at any stage the sheets can be uploaded to the MetaBar database server. To label samples, identifiers can be printed as barcodes. An intuitive web interface provides quick access to the contextual data in the MetaBar database as well as user and project management capabilities. Export functions facilitate contextual and sequence data submission to the International Nucleotide Sequence Database Collaboration (INSDC), comprising the DNA DataBase of Japan (DDBJ), the European Molecular Biology Laboratory database (EMBL) and GenBank. MetaBar requests and stores contextual data in compliance with the Genomic Standards Consortium specifications. The MetaBar open source code base for local installation is available under the GNU General Public License version 3 (GNU GPL3). Conclusion The MetaBar software supports the typical workflow from data acquisition and field-sampling to contextual data enriched sequence submission to an INSDC database. The integration with the megx.net marine Ecological Genomics database and portal facilitates georeferenced data integration and metadata-based comparisons of sampling sites as well as interactive data visualization. The ample export functionalities and the INSDC submission support enable the exchange of data across disciplines and the safeguarding of contextual data. PMID:20591175

  13. The Biological Macromolecule Crystallization Database and NASA Protein Crystal Growth Archive

    PubMed Central

    Gilliland, Gary L.; Tung, Michael; Ladner, Jane

    1996-01-01

    The NIST/NASA/CARB Biological Macromolecule Crystallization Database (BMCD), NIST Standard Reference Database 21, contains crystal data and crystallization conditions for biological macromolecules. The database entries include data abstracted from published crystallographic reports. Each entry consists of information describing the biological macromolecule crystallized and crystal data and the crystallization conditions for each crystal form. The BMCD serves as the NASA Protein Crystal Growth Archive in that it contains protocols and results of crystallization experiments undertaken in microgravity (space). These database entries report the results, whether successful or not, from NASA-sponsored protein crystal growth experiments in microgravity and from microgravity crystallization studies sponsored by other international organizations. The BMCD was designed as a tool to assist x-ray crystallographers in the development of protocols to crystallize biological macromolecules, those that have previously been crystallized, and those that have not been crystallized. PMID:11542472

  14. A survey of commercial object-oriented database management systems

    NASA Technical Reports Server (NTRS)

    Atkins, John

    1992-01-01

    The object-oriented data model is the culmination of over thirty years of database research. Initially, database research focused on the need to provide information in a consistent and efficient manner to the business community. Early data models such as the hierarchical model and the network model met the goal of consistent and efficient access to data and were substantial improvements over simple file mechanisms for storing and accessing data. However, these models required highly skilled programmers to provide access to the data. Consequently, in the early 1970s E. F. Codd, an IBM research computer scientist, proposed a new data model based on the simple mathematical notion of the relation. This model is known as the Relational Model. In the relational model, data is represented in flat tables (or relations) which have no physical or internal links between them. The simplicity of this model fostered the development of powerful but relatively simple query languages that now made data directly accessible to the general database user. Except for large, multi-user database systems, a database professional was in general no longer necessary. Database professionals found that traditional data in the form of character data, dates, and numeric data were easily represented and managed via the relational model. Commercial relational database management systems proliferated and performance of relational databases improved dramatically. However, there was a growing community of potential database users whose needs were not met by the relational model. These users needed to store data with data types not available in the relational model and required a far richer modelling environment than that provided by the relational model. Indeed, the complexity of the objects to be represented in the model mandated a new approach to database technology. The Object-Oriented Model was the result.

  15. Firefighters' hearing: a comparison with population databases from the International Standards Organization.

    PubMed

    Kales, S N; Freyman, R L; Hill, J M; Polyhronopoulos, G N; Aldrich, J M; Christiani, D C

    2001-07-01

    We investigated firefighters' hearing relative to general population data to adjust for age-expected hearing loss. For five groups of male firefighters with increasing mean ages, we compared their hearing thresholds at the 50th and 90th percentiles with normative and age- and sex-matched hearing data from the International Standards Organization (databases A and B). At the 50th percentile, from a mean age of 28 to a mean age of 53 years, relative to databases A and B, the firefighters lost an excess of 19 to 23 dB, 20 to 23 dB, and 16 to 19 dB at 3000, 4000, and 6000 Hz, respectively. At the 90th percentile, from a mean age of 28 to a mean age of 53 years, relative to databases A and B, the firefighters lost an excess of 12 to 20 dB, 38 to 44 dB, 41 to 45 dB, and 22 to 28 dB at 2000, 3000, 4000, and 6000 Hz, respectively. The results are consistent with accelerated hearing loss in excess of age-expected loss among the firefighters, especially at or above the 90th percentile.

  16. [The biomedical periodicals of Hungarian editions--historical overview].

    PubMed

    Berhidi, Anna; Geges, József; Vasas, Lívia

    2006-03-12

    The majority of Hungarian scientific results are published in international periodicals in foreign languages, yet publications in Hungarian scientific periodicals should not be ignored. This study analyses biomedical periodicals of Hungarian edition from different points of view. Based on different databases, a list of 119 titles was compiled, containing both the core and the peripheral journals of the biomedical field. These periodicals were analysed empirically, title by title. Thirteen of the titles have ceased publication; among the remaining 106 Hungarian scientific journals, 10 are published in English. Of the remaining majority, written in Hungarian and published in Hungary, only a few appear in international databases. Although a quarter of the Hungarian biomedical journals meet the requirements for representation in international databases, these periodicals are not indexed. Forty-two biomedical periodicals are available online, although a quarter of these journals have restricted access. Two-thirds of the Hungarian biomedical journals have detailed instructions to authors, which inform publishing doctors and researchers of the requirements of a biomedical periodical. The increasing number of Hungarian biomedical journals published is welcome news, but it is also important that quality publications, which attract many citations, appear in Hungarian journals. The more publications are cited, the more journals and authors gain in prestige at home and internationally.

  17. Food composition database development for between country comparisons.

    PubMed

    Merchant, Anwar T; Dehghan, Mahshid

    2006-01-19

    Nutritional assessment by diet analysis is a two-step process consisting of evaluation of food consumption, and conversion of food into nutrient intake by using a food composition database, which lists the mean nutritional values for a given food portion. Most reports in the literature focus on minimizing errors in estimation of food consumption but the selection of a specific food composition table used in nutrient estimation is also a source of errors. We are conducting a large prospective study internationally and need to compare diet, assessed by food frequency questionnaires, in a comparable manner between different countries. We have prepared a multi-country food composition database for nutrient estimation in all the countries participating in our study. The nutrient database is primarily based on the USDA food composition database, modified appropriately with reference to local food composition tables, and supplemented with recipes of locally eaten mixed dishes. By doing so we have ensured that the units of measurement, method of selection of foods for testing, and assays used for nutrient estimation are consistent and as current as possible, and yet have taken into account some local variations. Using this common metric for nutrient assessment will reduce differential errors in nutrient estimation and improve the validity of between-country comparisons.
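
    Step two of the process described above, converting food consumption into nutrient intake with a composition table, amounts to scaling per-100 g values by the portion consumed and summing; the foods and nutrient values below are hypothetical, not entries from the study's database.

      composition = {                      # nutrient values per 100 g of food
          "white rice, cooked": {"energy_kcal": 130, "protein_g": 2.7},
          "lentils, cooked":    {"energy_kcal": 116, "protein_g": 9.0},
      }
      consumption_g = {"white rice, cooked": 250, "lentils, cooked": 150}

      intake = {}
      for food, grams in consumption_g.items():
          for nutrient, per_100g in composition[food].items():
              intake[nutrient] = intake.get(nutrient, 0.0) + per_100g * grams / 100.0

      print(intake)   # {'energy_kcal': 499.0, 'protein_g': 20.25}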

  18. Analysis of Lunar Highland Regolith Samples From Apollo 16 Drive Core 64001/2 and Lunar Regolith Simulants - an Expanding Comparative Database

    NASA Technical Reports Server (NTRS)

    Schrader, Christian M.; Rickman, Doug; Stoeser, Douglas; Wentworth, Susan; McKay, Dave S.; Botha, Pieter; Butcher, Alan R.; Horsch, Hanna E.; Benedictus, Aukje; Gottlieb, Paul

    2008-01-01

    This slide presentation reviews the work to analyze the lunar highland regolith samples that came from the Apollo 16 core sample 64001/2 and simulants of lunar regolith, and to build a comparative database. The work is part of a larger effort to compile an internally consistent database on lunar regolith (Apollo samples) and lunar regolith simulants in support of a future lunar outpost. The aim is to characterize existing lunar regolith and simulants in terms of particle type, particle size distribution, particle shape distribution, bulk density, and other compositional characteristics, and to evaluate the regolith simulants on the same properties in comparison to the Apollo lunar regolith samples.

  19. Proceedings of the International Conference (7th) on Machine Learning Held in Austin, Texas on 21-23 June 1990

    DTIC Science & Technology

    1990-06-23

    experiment was carried out for I(a2 | C1) = I(a2 | C2) = I(a2 | C3) = 0.33 on both databases. The number of sub-descriptions I(a3 | C1) = I(a3 | C2) = ... for the second database is ... as shown in Table 4. The number of sub-descriptions (I(a1 | C2) + I(a2 | C2)) ... tends to degrade the performance [Chan 75]. Application 2: The database is the 1984 Congressional voting pattern records consisting of

  20. Choosing spouses and houses: Impaired congruence between preference and choice following damage to the ventromedial prefrontal cortex.

    PubMed

    Bowren, Mark D; Croft, Katie E; Reber, Justin; Tranel, Daniel

    2018-03-01

    A well-documented effect of focal ventromedial prefrontal cortex (vmPFC) damage is a deficit in real-world decision making. An important aspect of this deficit may be a deficiency in "internal consistency" during social decision making-that is, impaired congruence between expressed preferences versus actual behavioral choices. An example of low internal consistency would be if one expressed the desire to marry someone with impeccable moral character, yet proceeded to marry someone convicted of multiple felonies. Here, we used a neuropsychological approach to investigate neural correlates of internal consistency in complex decision making. Sixteen individuals with focal vmPFC lesions, 16 brain damage comparison individuals, and 16 normal comparison individuals completed a 3-option forced-choice preference task in which choices were made using attribute sets. Participants also completed visual-analogue preference ratings to indicate how much they liked each option, and rated the influence of each attribute on their decision making. Options were either social (potential spouses) or nonsocial (potential houses). Internal consistency for a trial was defined as agreement between the choice and the most positively rated option. A mixed design analysis of variance revealed that internal consistency between choices and preferences derived from summed attribute ratings was significantly lower for the vmPFC group relative to comparison participants, but only in the social condition (pη2 = .09), 95% CI [.002, .163]. Internal consistency during social decisions may be deficient in patients with vmPFC damage, leading to a discrepancy between preferences and choices. The vmPFC may provide an important neural mechanism for aligning behavioral choices with expressed preferences. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
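
    The trial-level measure defined above (agreement between the behavioral choice and the most positively rated option) can be expressed compactly; the sketch below is illustrative only and the trial data are invented.

```python
# Hedged sketch: per-trial "internal consistency" as agreement between the chosen
# option and the option with the highest preference rating, following the
# definition in the abstract. Data are hypothetical.
def internal_consistency(trials):
    """trials: list of dicts with 'choice' (index 0-2) and 'ratings' (3 visual-analogue scores)."""
    consistent = 0
    for t in trials:
        best = max(range(len(t["ratings"])), key=lambda i: t["ratings"][i])
        consistent += int(t["choice"] == best)
    return consistent / len(trials)

trials = [
    {"choice": 0, "ratings": [8.2, 5.1, 6.4]},   # choice matches the top-rated option
    {"choice": 2, "ratings": [7.0, 6.5, 4.3]},   # choice does not match the top-rated option
]
print(internal_consistency(trials))  # 0.5
```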

  1. Systemic inaccuracies in the National Surgical Quality Improvement Program database: Implications for accuracy and validity for neurosurgery outcomes research.

    PubMed

    Rolston, John D; Han, Seunggu J; Chang, Edward F

    2017-03-01

    The American College of Surgeons (ACS) National Surgical Quality Improvement Program (NSQIP) provides a rich database of North American surgical procedures and their complications. Yet no external source has validated the accuracy of the information within this database. Using records from the 2006 to 2013 NSQIP database, we used two methods to identify errors: (1) mismatches between the Current Procedural Terminology (CPT) code that was used to identify the surgical procedure, and the International Classification of Diseases (ICD-9) post-operative diagnosis: i.e., a diagnosis that is incompatible with a certain procedure. (2) Primary anesthetic and CPT code mismatching: i.e., anesthesia not indicated for a particular procedure. Analyzing data for movement disorders, epilepsy, and tumor resection, we found evidence of CPT code and postoperative diagnosis mismatches in 0.4-100% of cases, depending on the CPT code examined. When analyzing anesthetic data from brain tumor, epilepsy, trauma, and spine surgery, we found evidence of miscoded anesthesia in 0.1-0.8% of cases. National databases like NSQIP are an important tool for quality improvement. Yet all databases are subject to errors, and measures of internal consistency show that errors affect up to 100% of case records for certain procedures in NSQIP. Steps should be taken to improve data collection on the frontend of NSQIP, and also to ensure that future studies with NSQIP take steps to exclude erroneous cases from analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
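
    A hedged sketch of the first consistency check described above: flagging cases whose post-operative ICD-9 diagnosis is incompatible with the CPT procedure code. The code pairings and case records below are illustrative, not NSQIP's actual schema.

```python
# Hedged sketch: flag records whose post-operative ICD-9 diagnosis is not in the
# set of diagnoses compatible with the CPT procedure code. Pairings are invented
# for illustration.
COMPATIBLE_ICD9 = {
    "61510": {"191.9", "225.0"},   # craniotomy for tumor -> brain neoplasm codes (illustrative)
    "61863": {"332.0", "333.1"},   # DBS electrode implant -> movement disorder codes (illustrative)
}

def flag_mismatches(records):
    """records: iterable of (case_id, cpt_code, icd9_postop_dx)."""
    for case_id, cpt, icd9 in records:
        allowed = COMPATIBLE_ICD9.get(cpt)
        if allowed is not None and icd9 not in allowed:
            yield case_id

cases = [("A1", "61510", "191.9"), ("A2", "61510", "540.9")]  # appendicitis coded after craniotomy -> mismatch
print(list(flag_mismatches(cases)))  # ['A2']
```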

  2. The quest for the perfect gravity anomaly: Part 1 - New calculation standards

    USGS Publications Warehouse

    Li, X.; Hildenbrand, T.G.; Hinze, W. J.; Keller, Gordon R.; Ravat, D.; Webring, M.

    2006-01-01

    The North American gravity database, together with databases from Canada, Mexico, and the United States, is being revised to improve coverage, versatility, and accuracy. An important part of this effort is the revision of procedures and standards for calculating gravity anomalies, taking into account our enhanced computational power, modern satellite-based positioning technology, improved terrain databases, and increased interest in more accurately defining different anomaly components. The most striking revision is the use of a single internationally accepted reference ellipsoid for the horizontal and vertical datums of gravity stations as well as for the computation of theoretical gravity. The new standards hardly affect the interpretation of local anomalies, but do improve regional anomalies. Most importantly, the new standards can be applied consistently to gravity database compilations of nations, continents, and even the entire world. © 2005 Society of Exploration Geophysicists.
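
    For illustration, theoretical (normal) gravity on a single reference ellipsoid can be computed with the closed-form Somigliana equation; the sketch below uses the published GRS80 constants, an assumption for the example rather than a statement of the exact ellipsoid adopted by the new standards.

```python
# Hedged sketch: closed-form (Somigliana) normal gravity on the GRS80 reference
# ellipsoid, i.e. the single-ellipsoid theoretical gravity term used in anomaly
# calculation. Constants are the published GRS80 values.
import math

GAMMA_E = 9.7803267715      # normal gravity at the equator, m/s^2 (GRS80)
K       = 0.001931851353    # Somigliana constant (GRS80)
E2      = 0.00669438002290  # first eccentricity squared (GRS80)

def normal_gravity(lat_deg):
    """Theoretical gravity on the ellipsoid surface at geodetic latitude lat_deg."""
    s2 = math.sin(math.radians(lat_deg)) ** 2
    return GAMMA_E * (1.0 + K * s2) / math.sqrt(1.0 - E2 * s2)

print(normal_gravity(0.0))   # ~9.780327 m/s^2 at the equator
print(normal_gravity(90.0))  # ~9.832186 m/s^2 at the pole
```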

  3. The Golosiiv on-line plate archive database, management and maintenance

    NASA Astrophysics Data System (ADS)

    Pakuliak, L.; Sergeeva, T.

    2007-08-01

    We intend to create an online version of the database of the MAO NASU plate archive as VO-compatible structures, in accordance with the principles developed by the International Virtual Observatory Alliance, in order to make it available to the world astronomical community. The online version of the log-book database is built with MySQL and PHP. The data management system provides a user interface, supports detailed traditional form-filling and radial searches of plates, produces auxiliary samplings and listings of each collection, and allows browsing of detailed collection descriptions. The administrative tool allows the database administrator to correct data, add new data sets, and control the integrity and consistency of the database as a whole. The VO-compatible database is being constructed to meet the requirements and principles of international data archives and must be strongly generalized in order to support data mining through standard interfaces and to fit the requirements of the WFPDB Group for plate-catalogue databases. Ongoing enhancement of the database toward the WFPDB brings the problem of data verification to the forefront, as it demands a high degree of data reliability. Data verification is practically endless and inseparable from data management, owing to the diverse nature of data errors and hence the variety of ways of identifying and fixing them. The current status of the MAO NASU glass archive forces activity in both directions simultaneously: enhancement of the log-book database with new sets of observational data, creation of the generalized database, and cross-identification between them. The VO-compatible version of the database is being supplied with digitized plate data obtained with a MicroTek ScanMaker 9800 XL TMA. Scanning is not carried out wholesale but selectively, within the framework of special projects.
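
    A hedged sketch of how a form-filling radial plate search might be expressed against a MySQL log-book table; the table and column names, the credentials, and the mysql-connector-python dependency are assumptions, not the archive's actual schema.

```python
# Hedged sketch: radial search of plate records around given coordinates, using a
# coarse declination bounding box in SQL and an exact angular-distance check in
# Python. Schema and credentials are hypothetical.
import math
import mysql.connector

def plates_within_radius(conn, ra_deg, dec_deg, radius_deg):
    """Return plates whose centre lies within radius_deg of (ra_deg, dec_deg)."""
    dec_lo, dec_hi = dec_deg - radius_deg, dec_deg + radius_deg
    cur = conn.cursor()
    cur.execute(
        "SELECT plate_id, ra_deg, dec_deg, date_obs FROM plates "
        "WHERE dec_deg BETWEEN %s AND %s",
        (dec_lo, dec_hi),
    )
    hits = []
    for plate_id, ra, dec, date_obs in cur.fetchall():
        cos_d = (math.sin(math.radians(dec)) * math.sin(math.radians(dec_deg)) +
                 math.cos(math.radians(dec)) * math.cos(math.radians(dec_deg)) *
                 math.cos(math.radians(ra - ra_deg)))
        if math.degrees(math.acos(min(1.0, max(-1.0, cos_d)))) <= radius_deg:
            hits.append((plate_id, ra, dec, date_obs))
    return hits

# usage (hypothetical credentials and database name):
# conn = mysql.connector.connect(host="localhost", user="archive", password="...", database="maoplates")
# print(plates_within_radius(conn, ra_deg=83.6, dec_deg=22.0, radius_deg=2.5))
```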

  4. Human Connectome Project Informatics: quality control, database services, and data visualization

    PubMed Central

    Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.

    2013-01-01

    The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591

  5. Last Deglacial Sea Level: A Curated Database of Indicators of Past Sea Levels from Biological and Geomorphological Archives

    NASA Astrophysics Data System (ADS)

    Hibbert, F. D.; Williams, F. H.; Fallon, S.; Rohling, E. J.

    2017-12-01

    The last deglacial was an interval of rapid climate and sea-level change, including the collapse of large continental ice sheets. This database collates carefully assessed sea-level data from peer-reviewed sources for the interval 0 to 25 thousand years ago (ka), from the last glacial maximum to the present interglacial conditions. In addition to facilitating site-specific reconstructions of past sea levels, the database provides a suite of data beyond the range of modern/instrumental variability that may help hone future sea-level projections. The database is global in scope, internally consistent, and contains U-series and radiocarbon dated indicators from both biological and geomorphological archives. We focus on far-field data (i.e., away from the sites of the former continental ice sheets), but some key intermediate-field data (i.e., from the Caribbean) are also included. All primary fields (i.e., sample location, elevation, age and context) possess quantified uncertainties, which - in conjunction with available metadata - allow the reconstructed sea levels to be interpreted within both their uncertainties and geological context. Consistent treatment of each of the individual records in the database, and incorporation of fully expressed uncertainties, allows datasets to be easily compared. The compilation contains 145 studies from 40 locations (>2,000 data points) and includes all raw information and metadata.

  6. [Estimators of internal consistency in health research: the use of the alpha coefficient].

    PubMed

    da Silva, Franciele Cascaes; Gonçalves, Elizandra; Arancibia, Beatriz Angélica Valdivia; Bento, Gisele Graziele; Castro, Thiago Luis da Silva; Hernandez, Salma Stephany Soleman; da Silva, Rudney

    2015-01-01

    Academic production in the health sciences has increased, with growing demands for high quality in high-impact publications. One way to address quality is through methods that increase the consistency of data analysis, such as reliability, which, depending on the type of data, can be evaluated with different coefficients, notably the alpha coefficient. On this basis, the present review systematically gathers scientific articles produced in the last five years that made methodological, psychometric use of the α coefficient as an estimator of internal consistency and reliability in the construction, adaptation and validation of instruments. Studies were identified systematically in the databases BioMed Central Journals, Web of Science, Wiley Online Library, Medline, SciELO, Scopus, Journals@Ovid, BMJ and Springer, using inclusion and exclusion criteria. Data were analysed by triangulation, content analysis and descriptive analysis. Most studies were conducted in Iran (f=3), Spain (f=2) and Brazil (f=2). These studies aimed to test the psychometric properties of instruments, with eight studies using the α coefficient to assess reliability and nine to assess internal consistency. All studies were classified as methodological research on the basis of their objectives; in addition, four were also classified as correlational and one as descriptive-correlational. It can be concluded that, although the α coefficient is widely used as one of the main parameters for assessing the internal consistency of questionnaires in the health sciences, its use as an estimator of internal consistency and of confidence in the methodology has attracted criticisms that should be considered.
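
    For reference, the α coefficient discussed above can be computed directly from an item-response matrix; the sketch below uses made-up responses.

```python
# Hedged sketch: Cronbach's alpha for a small hypothetical item-response matrix
# (rows = respondents, columns = items).
import numpy as np

def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

responses = [
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
]
print(round(cronbach_alpha(responses), 3))  # ~0.97 for these made-up, highly correlated items
```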

  7. Comparison of the NCI open database with seven large chemical structural databases.

    PubMed

    Voigt, J H; Bienfait, B; Wang, S; Nicklaus, M C

    2001-01-01

    Eight large chemical databases have been analyzed and compared to each other. Central to this comparison is the open National Cancer Institute (NCI) database, consisting of approximately 250 000 structures. The other databases analyzed are the Available Chemicals Directory ("ACD," from MDL, release 1.99, 3D-version); the ChemACX ("ACX," from CamSoft, Version 4.5); the Maybridge Catalog and the Asinex database (both as distributed by CamSoft as part of ChemInfo 4.5); the Sigma-Aldrich Catalog (CD-ROM, 1999 Version); the World Drug Index ("WDI," Derwent, version 1999.03); and the organic part of the Cambridge Crystallographic Database ("CSD," from Cambridge Crystallographic Data Center, 1999 Version 5.18). The database properties analyzed are internal duplication rates; compounds unique to each database; cumulative occurrence of compounds in an increasing number of databases; overlap of identical compounds between two databases; similarity overlap; diversity; and others. The crystallographic database CSD and the WDI show somewhat less overlap with the other databases than those with each other. In particular the collections of commercial compounds and compilations of vendor catalogs have a substantial degree of overlap among each other. Still, no database is completely a subset of any other, and each appears to have its own niche and thus "raison d'être". The NCI database has by far the highest number of compounds that are unique to it. Approximately 200 000 of the NCI structures were not found in any of the other analyzed databases.
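
    A hedged sketch of the set arithmetic behind statistics such as internal duplication rates, compounds unique to one database, and overlap between collections, using canonical structure identifiers; the miniature databases below are invented.

```python
# Hedged sketch: per-database entry counts, internal duplicates, and compounds
# unique to each collection, computed from canonical structure identifiers
# (e.g. InChIKeys). The tiny example "databases" are invented.
def overlap_stats(databases):
    """databases: dict name -> list of canonical structure IDs."""
    unique_sets = {name: set(ids) for name, ids in databases.items()}
    stats = {}
    for name, ids in databases.items():
        others = set().union(*(s for n, s in unique_sets.items() if n != name))
        stats[name] = {
            "entries": len(ids),
            "internal_duplicates": len(ids) - len(unique_sets[name]),
            "unique_to_this_db": len(unique_sets[name] - others),
        }
    return stats

dbs = {
    "NCI": ["A", "B", "C", "D", "D"],   # one internal duplicate
    "ACD": ["B", "C", "E"],
    "WDI": ["C", "F"],
}
for name, s in overlap_stats(dbs).items():
    print(name, s)
# NCI: 1 duplicate, 2 unique compounds; ACD: 1 unique; WDI: 1 unique
```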

  8. 76 FR 5106 - Deposit Requirements for Registration of Automated Databases That Predominantly Consist of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-28

    ... Registration of Automated Databases That Predominantly Consist of Photographs AGENCY: Copyright Office, Library... regarding electronic registration of automated databases that consist predominantly of photographs and group... applications for automated databases that consist predominantly of photographs. The proposed amendments would...

  9. The Gypsy Database (GyDB) of mobile genetic elements

    PubMed Central

    Lloréns, C.; Futami, R.; Bezemer, D.; Moya, A.

    2008-01-01

    In this article, we introduce the Gypsy Database (GyDB) of mobile genetic elements, an in-progress database devoted to the non-redundant analysis and evolutionary-based classification of mobile genetic elements. This first version covers eukaryotic Ty3/Gypsy and Retroviridae long terminal repeat (LTR) retroelements. Phylogenetic analyses based on the gag-pro-pol internal region commonly presented by these two groups strongly support a number of previously described Ty3/Gypsy lineages originally reported from reverse-transcriptase (RT) analyses. Vertebrate retroviruses (Retroviridae) also form several monophyletic groups consistent with the genera proposed by the ICTV nomenclature, as well as with the current tendency to classify both endogenous and exogenous retroviruses into three major classes (I, II and III). Our inference indicates that all protein domains encoded by the gag-pro-pol internal region of these two groups collectively present a shared evolutionary history, which may be used as a main criterion to differentiate their molecular diversity in a comprehensive collection of phylogenies and non-redundant molecular profiles useful in identifying new Ty3/Gypsy and Retroviridae species. The GyDB project is available at http://gydb.uv.es. PMID:17895280

  10. Project Summary Report

    NASA Technical Reports Server (NTRS)

    Upchurch, Christopher

    2011-01-01

    The project, the development of resource management applications, consisted entirely of my own effort. From deliverable requirements provided by my mentor, plus some functional requirements added through design reviews, it was my responsibility to implement the requested features as well as possible, given the resources available. For the most part, development work consisted of database programming and functional testing using real resource data. Additional projects I worked on included some firing room console training, configuring the new NE-A microcontroller development lab network, mentoring high school CubeSat students, and managing the NE interns' component of the mentor appreciation ceremony.

  11. Schwannomatosis

    MedlinePlus

    ... Schwannomatosis Database Schwannomatosis Resources NF Registry International Schwannomatosis Database Because of the small number of people that ... and how to treat it, the International Schwannomatosis Database (ISD) project is proposing to bring together people ...

  12. The DNA Data Bank of Japan launches a new resource, the DDBJ Omics Archive of functional genomics experiments.

    PubMed

    Kodama, Yuichi; Mashima, Jun; Kaminuma, Eli; Gojobori, Takashi; Ogasawara, Osamu; Takagi, Toshihisa; Okubo, Kousaku; Nakamura, Yasukazu

    2012-01-01

    The DNA Data Bank of Japan (DDBJ; http://www.ddbj.nig.ac.jp) maintains and provides archival, retrieval and analytical resources for biological information. The central DDBJ resource consists of public, open-access nucleotide sequence databases including raw sequence reads, assembly information and functional annotation. Database content is exchanged with EBI and NCBI within the framework of the International Nucleotide Sequence Database Collaboration (INSDC). In 2011, DDBJ launched two new resources: the 'DDBJ Omics Archive' (DOR; http://trace.ddbj.nig.ac.jp/dor) and BioProject (http://trace.ddbj.nig.ac.jp/bioproject). DOR is an archival database of functional genomics data generated by microarray and highly parallel new generation sequencers. Data are exchanged between the ArrayExpress at EBI and DOR in the common MAGE-TAB format. BioProject provides an organizational framework to access metadata about research projects and the data from the projects that are deposited into different databases. In this article, we describe major changes and improvements introduced to the DDBJ services, and the launch of two new resources: DOR and BioProject.

  13. Hydroacoustic propagation grids for the CTBT knowledge databases: BBN technical memorandum W1303

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. Angell

    1998-05-01

    The Hydroacoustic Coverage Assessment Model (HydroCAM) has been used to develop components of the hydroacoustic knowledge database required by operational monitoring systems, particularly the US National Data Center (NDC). The database, which consists of travel time, amplitude correction and travel time standard deviation grids, is planned to support source location, discrimination and estimation functions of the monitoring network. The grids will also be used under the current BBN subcontract to support an analysis of the performance of the International Monitoring System (IMS) and national sensor systems. This report describes the format and contents of the hydroacoustic knowledgebase grids, and the procedures and model parameters used to generate these grids. Comparisons between the knowledge grids, measured data and other modeled results are presented to illustrate the strengths and weaknesses of the current approach. A recommended approach for augmenting the knowledge database with a database of expected spectral/waveform characteristics is provided in the final section of the report.

  14. PHASS99: A software program for retrieving and decoding the radiometric ages of igneous rocks from the international database IGBADAT

    NASA Astrophysics Data System (ADS)

    Al-Mishwat, Ali T.

    2016-05-01

    PHASS99 is a FORTRAN program designed to retrieve and decode radiometric and other physical age information of igneous rocks contained in the international database IGBADAT (Igneous Base Data File). In the database, ages are stored in a proprietary format using mnemonic representations. The program can handle up to 99 ages in an igneous rock specimen and caters to forty radiometric age systems. The radiometric age alphanumeric strings assigned to each specimen description in the database consist of four components: the numeric age and its exponential modifier, a four-character mnemonic method identification, a two-character mnemonic name of the analysed material, and the reference number in the rock group bibliography vector. For each specimen, the program searches for radiometric age strings, extracts them, parses them, decodes the different age components, and converts them to high-level English equivalents. IGBADAT and similarly-structured files are used for input. The output includes three files: a flat raw ASCII text file containing the retrieved radiometric age information, a generic spreadsheet-compatible file for data import to spreadsheets, and an error file. PHASS99 builds on the old decoder program TSTPHA (Test Physical Age) and greatly expands its capabilities. PHASS99 is simple, user friendly, fast, efficient, and does not require users to have knowledge of programming.
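
    As a purely hypothetical illustration of the four-component age strings described above (the real IGBADAT encoding is proprietary and not reproduced in the abstract), a decoder might look like the following; every mnemonic and the delimiter layout are invented.

```python
# Hedged sketch: decoding a four-component age string (numeric age with exponential
# modifier, method mnemonic, material mnemonic, bibliography reference number).
# The "/"-delimited layout and all mnemonics are invented, not the IGBADAT format.
METHODS = {"KARG": "K-Ar", "RBSR": "Rb-Sr", "UPBZ": "U-Pb zircon"}   # invented mnemonics
MATERIALS = {"WR": "whole rock", "BI": "biotite", "ZR": "zircon"}    # invented mnemonics

def decode_age_string(s):
    """e.g. '1.25E2/KARG/BI/17' -> age 125 Ma by K-Ar on biotite, reference 17."""
    age_part, method, material, ref = s.split("/")
    return {
        "age_ma": float(age_part),              # exponential modifier handled by float()
        "method": METHODS.get(method, method),
        "material": MATERIALS.get(material, material),
        "reference": int(ref),
    }

print(decode_age_string("1.25E2/KARG/BI/17"))
# {'age_ma': 125.0, 'method': 'K-Ar', 'material': 'biotite', 'reference': 17}
```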

  15. Fostering International Collaboration in Birth Defects Research and Prevention: A Perspective From the International Clearinghouse for Birth Defects Surveillance and Research

    PubMed Central

    Botto, Lorenzo D.; Robert-Gnansia, Elisabeth; Siffel, Csaba; Harris, John; Borman, Barry; Mastroiacovo, Pierpaolo

    2006-01-01

    The International Clearinghouse for Birth Defects Surveillance and Research, formerly known as the International Clearinghouse of Birth Defects Monitoring Systems, consists of 40 registries worldwide that collaborate in monitoring 40 types of birth defects. Clearinghouse activities include the sharing and joint monitoring of birth defect data, epidemiologic and public health research, and capacity building, with the goal of reducing disease and promoting healthy birth outcomes through primary prevention. We discuss 3 of these activities: the collaborative assessment of the potential teratogenicity of first-trimester use of medications (the MADRE project), an example of the intersection of surveillance and research; the international databases of people with orofacial clefts, an example of the evolution from surveillance to outcome research; and the study of genetic polymorphisms, an example of collaboration in public health genetics. PMID:16571708

  16. Usefulness of a centralized system of data collection for the development of an international multicentre registry of spondyloarthritis

    PubMed Central

    Schiotis, Ruxandra; Font, Pilar; Zarco, Pedro; Almodovar, Raquel; Gratacós, Jordi; Mulero, Juan; Juanola, Xavier; Montilla, Carlos; Moreno, Estefanía; Ariza Ariza, Rafael; Collantes-Estevez, Eduardo

    2011-01-01

    Objective. To present the usefulness of a centralized system of data collection for the development of an international multicentre registry of SpA. Method. The originality of this registry lies in the creation of a virtual network of researchers within a computerized Internet database. From its conception, the registry was meant to be a dynamic acquisition system. Results. REGISPONSER has two development phases (Conception and Universalization) and gathers several evolving secondary projects (REGISPONSER-EARLY, REGISPONSER-AS, ESPERANZA and RESPONDIA). Each sub-project answered the need for more specific and complete patient data, even from the onset of the disease, so as to obtain a well-defined picture of the SpA spectrum in the Spanish population. Conclusion. REGISPONSER is the first dynamic SpA database composed of cohorts with a significant number of patients distributed by specific diagnosis, and it provides basic, specific information on the sub-cohorts that is useful for patient evaluation in ambulatory rheumatology consultation. PMID:20823095

  17. Countermeasure Evaluation and Validation Project (CEVP) Database Requirement Documentation

    NASA Technical Reports Server (NTRS)

    Shin, Sung Y.

    2003-01-01

    The initial focus of the project by the JSC laboratories will be to develop, test and implement a standardized complement of integrated physiological tests (the Integrated Testing Regimen, ITR) that will examine both system and intersystem function and will be used to validate and certify candidate countermeasures. The ITR will consist of medical requirements (MRs), non-MR core ITR tests, and countermeasure-specific testing. Non-MR and countermeasure-specific test data will be archived in a database specific to the CEVP. Development of a CEVP Database will be critical to documenting the progress of candidate countermeasures. The goal of this work is a fully functional software system that will integrate computer-based data collection and storage with secure, efficient, and practical distribution of that data over the Internet. This system will provide the foundation for a new level of interagency and international cooperation in scientific experimentation and research, supporting intramural, international, and extramural collaboration through management and distribution of the CEVP data. The research performed this summer covers the first two phases of the project. The first phase is a requirements analysis, which identifies the expected behavior of the system under normal and abnormal conditions, the factors that could affect the system's ability to produce this behavior, and the internal features needed to reduce the risk of unexpected or unwanted behaviors. The second phase is the design of data entry and data retrieval screens for a working model of the Ground Data Database. The final report provides the requirements for the CEVP system in a variety of ways, so that both the development team and JSC technical management have a thorough understanding of how the system is expected to behave.

  18. 41 CFR 60-1.12 - Record retention.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... individual for a particular position, such as on-line resumes or internal resume databases, records... recordkeeping with respect to internal resume databases, the contractor must maintain a record of each resume added to the database, a record of the date each resume was added to the database, the position for...

  19. 41 CFR 60-1.12 - Record retention.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... individual for a particular position, such as on-line resumes or internal resume databases, records... recordkeeping with respect to internal resume databases, the contractor must maintain a record of each resume added to the database, a record of the date each resume was added to the database, the position for...

  20. 41 CFR 60-1.12 - Record retention.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... individual for a particular position, such as on-line resumes or internal resume databases, records... recordkeeping with respect to internal resume databases, the contractor must maintain a record of each resume added to the database, a record of the date each resume was added to the database, the position for...

  1. 41 CFR 60-1.12 - Record retention.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... individual for a particular position, such as on-line resumes or internal resume databases, records... recordkeeping with respect to internal resume databases, the contractor must maintain a record of each resume added to the database, a record of the date each resume was added to the database, the position for...

  2. Accelerated Leach Testing of GLASS: ALTGLASS Version 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trivelpiece, Cory L.; Jantzen, Carol M.; Crawford, Charles L.

    The Accelerated Leach Testing of GLASS (ALTGLASS) database is a collection of data from short- and long-term product consistency tests (PCT, ASTM C1285 A and B) on high level waste (HLW) as well as low activity waste (LAW) glasses. The database provides both U.S. and international researchers with an archive of experimental data for the purpose of studying, modeling, or validating existing models of nuclear waste glass corrosion. The ALTGLASS database is maintained and updated by researchers at the Savannah River National Laboratory (SRNL). This newest version, ALTGLASS Version 3.0, has been updated with an additional 503 rows of data representing PCT results from corrosion experiments conducted in the United States by the Savannah River National Laboratory, Pacific Northwest National Laboratory, Argonne National Laboratory, and the Vitreous State Laboratory (SRNL, PNNL, ANL, VSL, respectively) as well as the National Nuclear Laboratory (NNL) in the United Kingdom.

  3. Thermodynamic properties for arsenic minerals and aqueous species

    USGS Publications Warehouse

    Nordstrom, D. Kirk; Majzlan, Juraj; Königsberger, Erich; Bowell, Robert J.; Alpers, Charles N.; Jamieson, Heather E.; Nordstrom, D. Kirk; Majzlan, Juraj

    2014-01-01

    Quantitative geochemical calculations are not possible without thermodynamic databases and considerable advances in the quantity and quality of these databases have been made since the early days of Lewis and Randall (1923), Latimer (1952), and Rossini et al. (1952). Oelkers et al. (2009) wrote, “The creation of thermodynamic databases may be one of the greatest advances in the field of geochemistry of the last century.” Thermodynamic data have been used for basic research needs and for a countless variety of applications in hazardous waste management and policy making (Zhu and Anderson 2002; Nordstrom and Archer 2003; Bethke 2008; Oelkers and Schott 2009). The challenge today is to evaluate thermodynamic data for internal consistency, to reach a better consensus of the most reliable properties, to determine the degree of certainty needed for geochemical modeling, and to agree on priorities for further measurements and evaluations.

  4. Content-based video indexing and searching with wavelet transformation

    NASA Astrophysics Data System (ADS)

    Stumpf, Florian; Al-Jawad, Naseer; Du, Hongbo; Jassim, Sabah

    2006-05-01

    Biometric databases form an essential tool in the fight against international terrorism, organised crime and fraud. Various government and law enforcement agencies have their own biometric databases consisting of combination of fingerprints, Iris codes, face images/videos and speech records for an increasing number of persons. In many cases personal data linked to biometric records are incomplete and/or inaccurate. Besides, biometric data in different databases for the same individual may be recorded with different personal details. Following the recent terrorist atrocities, law enforcing agencies collaborate more than before and have greater reliance on database sharing. In such an environment, reliable biometric-based identification must not only determine who you are but also who else you are. In this paper we propose a compact content-based video signature and indexing scheme that can facilitate retrieval of multiple records in face biometric databases that belong to the same person even if their associated personal data are inconsistent. We shall assess the performance of our system using a benchmark audio visual face biometric database that has multiple videos for each subject but with different identity claims. We shall demonstrate that retrieval of relatively small number of videos that are nearest, in terms of the proposed index, to any video in the database results in significant proportion of that individual biometric data.

  5. 4. International reservoir characterization technical conference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-04-01

    This volume contains the Proceedings of the Fourth International Reservoir Characterization Technical Conference held March 2-4, 1997 in Houston, Texas. The theme for the conference was Advances in Reservoir Characterization for Effective Reservoir Management. On March 2, 1997, the DOE Class Workshop kicked off with tutorials by Dr. Steve Begg (BP Exploration) and Dr. Ganesh Thakur (Chevron). Tutorial presentations are not included in these Proceedings but may be available from the authors. The conference consisted of the following topics: data acquisition; reservoir modeling; scaling reservoir properties; and managing uncertainty. Selected papers have been processed separately for inclusion in the Energy Science and Technology database.

  6. Atlantic Ocean CARINA data: overview and salinity adjustments

    NASA Astrophysics Data System (ADS)

    Tanhua, T.; Steinfeldt, R.; Key, R. M.; Brown, P.; Gruber, N.; Wanninkhof, R.; Perez, F.; Körtzinger, A.; Velo, A.; Schuster, U.; van Heuven, S.; Bullister, J. L.; Stendardo, I.; Hoppema, M.; Olsen, A.; Kozyr, A.; Pierrot, D.; Schirnick, C.; Wallace, D. W. R.

    2009-08-01

    Water column data of carbon and carbon-relevant hydrographic and hydrochemical parameters from 188 previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon IN the Atlantic). The data have gone through rigorous quality control procedures to assure the highest possible quality and consistency. The data for the pertinent parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions, i.e. Arctic, Atlantic and Southern Ocean. Ninety-eight of the cruises in the CARINA database were conducted in the Atlantic Ocean, defined here as the region south of the Greenland-Iceland-Scotland Ridge and north of about 30° S. Here we present an overview of the Atlantic Ocean synthesis of the CARINA data and the adjustments that were applied to the data product. We also report details of the secondary QC for salinity for this data set. Procedures of quality control - including crossover analysis between stations and inversion analysis of all crossover data - are briefly described. Adjustments to salinity measurements were applied to the data from 10 cruises in the Atlantic Ocean region. Based on our analysis we estimate the internal accuracy of the CARINA-ATL salinity data to be 4.1 ppm. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s (Key et al., 2004), and is now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.
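
    A minimal sketch of the crossover idea behind the secondary quality control described above: deep-water values from two cruises sampling the same location are differenced to estimate a constant offset, and offsets from many crossovers are then combined into per-cruise adjustments. The depth threshold, matching rule and numbers below are assumptions, not the CARINA procedure.

```python
# Hedged sketch (illustrative, not the CARINA code): estimate the salinity offset
# between two cruises at a crossover by differencing their deep-water profiles on
# common depth levels. Many such offsets would then be combined (e.g. by weighted
# least squares) into per-cruise adjustments.
import statistics

def crossover_offset(profile_a, profile_b, min_depth=1500.0):
    """profile_*: list of (depth_m, salinity). Returns mean deep-water difference A - B."""
    deep_a = {round(d, -2): s for d, s in profile_a if d >= min_depth}
    deep_b = {round(d, -2): s for d, s in profile_b if d >= min_depth}
    common = sorted(set(deep_a) & set(deep_b))
    return statistics.mean(deep_a[d] - deep_b[d] for d in common)

a = [(1500, 34.915), (2000, 34.903), (2500, 34.897)]
b = [(1500, 34.911), (2000, 34.899), (2500, 34.894)]
print(round(crossover_offset(a, b), 4))  # 0.0037 -> cruise A reads ~0.004 higher than B
```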

  7. 77 FR 40268 - Deposit Requirements for Registration of Automated Databases That Predominantly Consist of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-09

    ... Registration of Automated Databases That Predominantly Consist of Photographs AGENCY: Copyright Office, Library... the deposit requirements for applications for automated databases that consist predominantly of... authorship, the deposits for such databases include the image of each photograph in which copyright is...

  8. Spatial variation of volcanic rock geochemistry in the Virunga Volcanic Province: Statistical analysis of an integrated database

    NASA Astrophysics Data System (ADS)

    Barette, Florian; Poppe, Sam; Smets, Benoît; Benbakkar, Mhammed; Kervyn, Matthieu

    2017-10-01

    We present an integrated, spatially-explicit database of existing geochemical major-element analyses available from (post-) colonial scientific reports, PhD Theses and international publications for the Virunga Volcanic Province, located in the western branch of the East African Rift System. This volcanic province is characterised by alkaline volcanism, including silica-undersaturated, alkaline and potassic lavas. The database contains a total of 908 geochemical analyses of eruptive rocks for the entire volcanic province with a localisation for most samples. A preliminary analysis of the overall consistency of the database, using statistical techniques on sets of geochemical analyses with contrasted analytical methods or dates, demonstrates that the database is consistent. We applied a principal component analysis and cluster analysis on whole-rock major element compositions included in the database to study the spatial variation of the chemical composition of eruptive products in the Virunga Volcanic Province. These statistical analyses identify spatially distributed clusters of eruptive products. The known geochemical contrasts are highlighted by the spatial analysis, such as the unique geochemical signature of Nyiragongo lavas compared to other Virunga lavas, the geochemical heterogeneity of the Bulengo area, and the trachyte flows of Karisimbi volcano. Most importantly, we identified separate clusters of eruptive products which originate from primitive magmatic sources. These lavas of primitive composition are preferentially located along NE-SW inherited rift structures, often at distance from the central Virunga volcanoes. Our results illustrate the relevance of a spatial analysis on integrated geochemical data for a volcanic province, as a complement to classical petrological investigations. This approach indeed helps to characterise geochemical variations within a complex of magmatic systems and to identify specific petrologic and geochemical investigations that should be tackled within a study area.
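
    A hedged sketch of the statistical treatment described above: principal component analysis followed by k-means clustering of whole-rock major-element analyses. The oxide values are invented and scikit-learn is assumed to be available; this is not the authors' workflow.

```python
# Hedged sketch: PCA then k-means clustering on whole-rock major-element analyses,
# the kind of grouping used to map spatial geochemical variation. Values are
# invented; scikit-learn is an assumed dependency.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# columns: SiO2, TiO2, Al2O3, FeOt, MgO, CaO, Na2O, K2O (wt%, hypothetical)
X = np.array([
    [45.1, 2.9, 13.2, 11.8, 8.9, 10.5, 3.1, 2.4],
    [44.6, 3.1, 12.8, 12.1, 9.4, 10.9, 2.9, 2.2],
    [61.8, 0.8, 16.9,  5.1, 1.2,  2.4, 5.6, 5.3],   # evolved (trachytic) composition
    [46.0, 2.7, 13.5, 11.4, 8.1, 10.1, 3.3, 2.6],
    [60.9, 0.9, 17.2,  5.4, 1.4,  2.7, 5.4, 5.1],
])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(labels)  # e.g. [0 0 1 0 1]: mafic samples grouped apart from evolved samples
```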

  9. Protein Information Resource: a community resource for expert annotation of protein data

    PubMed Central

    Barker, Winona C.; Garavelli, John S.; Hou, Zhenglin; Huang, Hongzhan; Ledley, Robert S.; McGarvey, Peter B.; Mewes, Hans-Werner; Orcutt, Bruce C.; Pfeiffer, Friedhelm; Tsugita, Akira; Vinayaka, C. R.; Xiao, Chunlin; Yeh, Lai-Su L.; Wu, Cathy

    2001-01-01

    The Protein Information Resource, in collaboration with the Munich Information Center for Protein Sequences (MIPS) and the Japan International Protein Information Database (JIPID), produces the most comprehensive and expertly annotated protein sequence database in the public domain, the PIR-International Protein Sequence Database. To provide timely and high quality annotation and promote database interoperability, the PIR-International employs rule-based and classification-driven procedures based on controlled vocabulary and standard nomenclature and includes status tags to distinguish experimentally determined from predicted protein features. The database contains about 200 000 non-redundant protein sequences, which are classified into families and superfamilies and their domains and motifs identified. Entries are extensively cross-referenced to other sequence, classification, genome, structure and activity databases. The PIR web site features search engines that use sequence similarity and database annotation to facilitate the analysis and functional identification of proteins. The PIR-International databases and search tools are accessible on the PIR web site at http://pir.georgetown.edu/ and at the MIPS web site at http://www.mips.biochem.mpg.de. The PIR-International Protein Sequence Database and other files are also available by FTP. PMID:11125041

  10. The National Heart, Lung, and Blood Institute Recipient Epidemiology and Donor Evaluation Study (REDS-III): A research program striving to improve blood donor and transfusion recipient outcomes

    PubMed Central

    Kleinman, Steven; Busch, Michael P; Murphy, Edward L; Shan, Hua; Ness, Paul; Glynn, Simone A.

    2014-01-01

    Background The Recipient Epidemiology and Donor Evaluation Study -III (REDS-III) is a 7-year multicenter transfusion safety research initiative launched in 2011 by the National Heart, Lung, and Blood Institute. Study design The domestic component involves 4 blood centers, 12 hospitals, a data coordinating center, and a central laboratory. The international component consists of distinct programs in Brazil, China, and South Africa which involve US and in-country investigators. Results REDS-III is using two major methods to address key research priorities in blood banking/transfusion medicine. First, there will be numerous analyses of large “core” databases; the international programs have each constructed a donor/donation database while the domestic program has established a detailed research database that links data from blood donors and their donations, the components made from these donations, and data extracts from the electronic medical records of the recipients of these components. Secondly, there are more than 25 focused research protocols involving transfusion recipients, blood donors, or both that are either in progress or scheduled to begin within the next 3 years. Areas of study include transfusion epidemiology and blood utilization; transfusion outcomes; non-infectious transfusion risks; HIV-related safety issues (particularly in the international programs); emerging infectious agents; blood component quality; donor health and safety; and other donor issues. Conclusions It is intended that REDS-III serve as an impetus for more widespread recipient and linked donor-recipient research in the US as well as to help assure a safe and available blood supply in the US and in international locations. PMID:24188564

  11. Application of ISO standard 27048: dose assessment for the monitoring of workers for internal radiation exposure.

    PubMed

    Henrichs, K

    2011-03-01

    Besides ongoing developments in the dosimetry of incorporated radionuclides, there are various efforts to improve the monitoring of workers for potential or real intakes of radionuclides. The disillusioning experience with numerous intercomparison projects identified substantial differences between national regulations, concepts, applied programmes and methods, and dose assessment procedures. Measured activities were not directly comparable because of significant differences between measuring frequencies and methods, but also results of case studies for dose assessments revealed differences of orders of magnitude. Besides the general common interest in reliable monitoring results, at least the cross-border activities of workers (e.g. nuclear power plant services) require consistent approaches and comparable results. The International Standardization Organization therefore initiated projects to standardise programmes for the monitoring of workers, the requirements for measuring laboratories and the processes for the quantitative evaluation of monitoring results in terms of internal assessed doses. The strength of the concepts applied by the international working group consists in a unified approach defining the requirements, databases and processes. This paper is intended to give a short introduction into the standardization project followed by a more detailed description of the dose assessment standard, which will be published in the very near future.

  12. ThermoFit: A Set of Software Tools, Protocols and Schema for the Organization of Thermodynamic Data and for the Development, Maintenance, and Distribution of Internally Consistent Thermodynamic Data/Model Collections

    NASA Astrophysics Data System (ADS)

    Ghiorso, M. S.

    2013-12-01

    Internally consistent thermodynamic databases are critical resources that facilitate the calculation of heterogeneous phase equilibria and thereby support geochemical, petrological, and geodynamical modeling. These 'databases' are actually derived data/model systems that depend on a diverse suite of physical property measurements, calorimetric data, and experimental phase equilibrium brackets. In addition, such databases are calibrated with the adoption of various models for extrapolation of heat capacities and volumetric equations of state to elevated temperature and pressure conditions. Finally, these databases require specification of thermochemical models for the mixing properties of solid, liquid, and fluid solutions, which are often rooted in physical theory and, in turn, depend on additional experimental observations. The process of 'calibrating' a thermochemical database involves considerable effort and an extensive computational infrastructure. Because of these complexities, the community tends to rely on a small number of thermochemical databases, generated by a few researchers; these databases often have limited longevity and are universally difficult to maintain. ThermoFit is a software framework and user interface whose aim is to provide a modeling environment that facilitates creation, maintenance and distribution of thermodynamic data/model collections. Underlying ThermoFit are data archives of fundamental physical property, calorimetric, crystallographic, and phase equilibrium constraints that provide the essential experimental information from which thermodynamic databases are traditionally calibrated. ThermoFit standardizes schema for accessing these data archives and provides web services for data mining these collections. Beyond simple data management and interoperability, ThermoFit provides a collection of visualization and software modeling tools that streamline the model/database generation process. Most notably, ThermoFit facilitates the rapid visualization of predicted model outcomes and permits the user to modify these outcomes using tactile- or mouse-based GUI interaction, allowing real-time updates that reflect users' choices, preferences, and priorities involving derived model results. This ability permits some resolution of the problem of correlated model parameters in the common situation where thermodynamic models must be calibrated from inadequate data resources. It also allows modeling constraints to be imposed using natural data and observations (i.e. petrologic or geochemical intuition). Once formulated, ThermoFit facilitates deployment of data/model collections by automated creation of web services. Users consume these services via web, Excel, or desktop clients. ThermoFit is currently under active development and not yet generally available; a limited-capability prototype system has been coded for Macintosh computers and used to construct thermochemical models for H2O-CO2 mixed fluid saturation in silicate liquids. The longer-term goal is to release ThermoFit as a web portal application client with server-based cloud computations supporting the modeling environment.
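
    As a toy example of the kind of calibration step such data/model systems rest on (not ThermoFit itself), the sketch below retrieves a reaction enthalpy and entropy by least squares from (T, P) points assumed, for simplicity, to lie exactly on a univariant equilibrium boundary with constant volume change; all numbers are invented.

```python
# Hedged sketch: for a univariant reaction with roughly constant dV, boundary
# points (T, P) satisfy dH - T*dS + (P - 1)*dV = 0, so dH and dS can be regressed
# simultaneously from experimental constraints (treated here as exact boundary
# points). All values are invented.
import numpy as np

dV = 2.0  # J/bar, assumed known from volume measurements

# (T in K, P in bar) pairs assumed to lie on the reaction boundary
T = np.array([800.0, 900.0, 1000.0, 1100.0])
P = np.array([4000.0, 6000.0, 8000.0, 10000.0])

# Solve least squares for [dH, dS] in: dH - T*dS = -(P - 1)*dV
A = np.column_stack([np.ones_like(T), -T])
b = -(P - 1.0) * dV
(dH, dS), *_ = np.linalg.lstsq(A, b, rcond=None)
print(round(dH, 1), round(dS, 3))  # reaction enthalpy (J/mol) and entropy (J/(mol K))
```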

  13. ICCS 2009 User Guide for the International Database. Supplement 1: International Version of the ICCS 2009 Questionnaires

    ERIC Educational Resources Information Center

    Brese, Falk; Jung, Michael; Mirazchiyski, Plamen; Schulz, Wolfram; Zuehlke, Olaf

    2011-01-01

    This document presents Supplement 1 of "The International Civic and Citizenship Education Study (ICCS) 2009 International Database," which includes data for all questionnaires administered as part of the ICCS 2009 assessment. This supplement contains the international version of the ICCS 2009 questionnaires in the following seven…

  14. CARINA alkalinity data in the Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Velo, A.; Perez, F. F.; Brown, P.; Tanhua, T.; Schuster, U.; Key, R. M.

    2009-08-01

    Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA data base were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; Arctic, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these, 75 cruises report alkalinity values. Here we present details of the secondary QC on alkalinity for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the alkalinity values for 16 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis we estimate the internal accuracy of the CARINA-ATL alkalinity data to be 3.3 μmol kg-1. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.

  15. CARINA: nutrient data in the Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Tanhua, T.; Brown, P. J.; Key, R. M.

    2009-11-01

    Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA data base were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; Arctic Mediterranean Seas, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these 84 cruises report nitrate values, 79 silicate, and 78 phosphate. Here we present details of the secondary QC for nutrients for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the nutrient values for 43 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s (Key et al., 2004). Based on our analysis we estimate the internal accuracy of the CARINA-ATL nutrient data to be: nitrate 1.5%; phosphate 2.6%; silicate 3.1%. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.

  16. CARINA: nutrient data in the Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Tanhua, T.; Brown, P. J.; Key, R. M.

    2009-07-01

    Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA data base were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; Arctic, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these 84 cruises report nitrate values, 79 silicate, and 78 phosphate. Here we present details of the secondary QC for nutrients for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the nutrient values for 43 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s (Key et al., 2004). Based on our analysis we estimate the internal accuracy of the CARINA-ATL nutrient data to be: nitrate 1.5%; phosphate 2.6%; silicate 3.1%. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.

  17. CARINA alkalinity data in the Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Velo, A.; Perez, F. F.; Brown, P.; Tanhua, T.; Schuster, U.; Key, R. M.

    2009-11-01

    Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA data base were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; Arctic, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these, 75 cruises report alkalinity values. Here we present details of the secondary QC on alkalinity for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the alkalinity values for 16 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis we estimate the internal accuracy of the CARINA-ATL alkalinity data to be 3.3 μmol kg-1. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.

  18. Relationship between office and home blood pressure with increasing age: The International Database of HOme blood pressure in relation to Cardiovascular Outcome (IDHOCO).

    PubMed

    Ntineri, Angeliki; Stergiou, George S; Thijs, Lutgarde; Asayama, Kei; Boggia, José; Boubouchairopoulou, Nadia; Hozawa, Atsushi; Imai, Yutaka; Johansson, Jouni K; Jula, Antti M; Kollias, Anastasios; Luzardo, Leonella; Niiranen, Teemu J; Nomura, Kyoko; Ohkubo, Takayoshi; Tsuji, Ichiro; Tzourio, Christophe; Wei, Fang-Fei; Staessen, Jan A

    2016-08-01

    Home blood pressure (HBP) measurements are known to be lower than conventional office blood pressure (OBP) measurements. However, this difference might not be consistent across the entire age range and has not been adequately investigated. We assessed the relationship between OBP and HBP with increasing age using the International Database of HOme blood pressure in relation to Cardiovascular Outcome (IDHOCO). OBP, HBP and their difference were assessed across different decades of age. A total of 5689 untreated subjects aged 18-97 years, who had at least two OBP and HBP measurements, were included. Systolic OBP and HBP increased across older age categories (from 112 to 142 mm Hg and from 109 to 136 mm Hg, respectively), with OBP being higher than HBP by ∼7 mm Hg in subjects aged >30 years and by less in younger subjects (P=0.001). Both diastolic OBP and HBP increased until the age of ∼50 years (from 71 to 79 mm Hg and from 66 to 76 mm Hg, respectively), with OBP being consistently higher than HBP and a trend toward a decreased OBP-HBP difference with aging (P<0.001). Determinants of a larger OBP-HBP difference were younger age, sustained hypertension, nonsmoking and negative cardiovascular disease history. These data suggest that in the general adult population, HBP is consistently lower than OBP across all decades, but their difference might vary between age groups. Further research is needed to confirm these findings in younger and older subjects and in hypertensive individuals.
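
    The age-stratified comparison described here reduces to grouping paired office and home readings by decade of age and summarising the office-home difference per stratum. A minimal pandas sketch with fabricated values (not the IDHOCO data; column names are hypothetical):

        import pandas as pd

        # Hypothetical subject-level data: mean office and home systolic BP per subject.
        df = pd.DataFrame({
            "age":        [25, 28, 34, 38, 47, 52, 68, 75],
            "sbp_office": [118, 116, 124, 127, 131, 135, 142, 147],
            "sbp_home":   [115, 114, 118, 121, 124, 128, 136, 140],
        })

        df["obp_minus_hbp"] = df["sbp_office"] - df["sbp_home"]
        df["age_decade"] = (df["age"] // 10) * 10

        # Mean office-home difference and subject count per decade of age.
        summary = df.groupby("age_decade")["obp_minus_hbp"].agg(["mean", "count"])
        print(summary)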

  19. Strengths and Difficulties Questionnaire: internal validity and reliability for New Zealand preschoolers.

    PubMed

    Kersten, Paula; Vandal, Alain C; Elder, Hinemoa; McPherson, Kathryn M

    2018-04-21

    This observational study examines the internal construct validity, internal consistency and cross-informant reliability of the Strengths and Difficulties Questionnaire (SDQ) in a New Zealand preschool population across four ethnicity strata (New Zealand European, Māori, Pasifika, Asian). Rasch analysis was employed to examine internal validity on a subsample of 1000 children. Internal consistency (n=29 075) and cross-informant reliability (n=17 006) were examined using correlations, intraclass correlation coefficients and Cronbach's alpha on the sample available for such analyses. Data were used from a national SDQ database provided by the funder, pertaining to New Zealand-domiciled children aged 4 and 5 and scored by their parents and teachers. The five subscales do not fit the Rasch model (as indicated by the overall fit statistics), contain items that are biased (differential item functioning (DIF)) by key variables, suffer from floor and ceiling effects and have unacceptable internal consistency. After dealing with DIF, the Total Difficulty scale does fit the Rasch model and has good internal consistency. Parent/teacher inter-rater reliability was unacceptably low for all subscales. The five SDQ subscales are not valid and not suitable for use in their own right in New Zealand. We have provided a conversion table for the Total Difficulty scale, which takes account of bias by ethnic group. Clinicians should use this conversion table in order to reconcile DIF by culture in final scores. It is advisable to use both parents' and teachers' feedback when considering children's need for referral for further assessment. Future work should examine whether validity is impacted by different language versions used in the same country. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
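
    Internal consistency in studies of this kind is usually quantified with Cronbach's alpha over item-level scores. The sketch below computes alpha from its standard formula on synthetic responses; it is illustrative only and unrelated to the SDQ data.

        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """items: 2-D array, rows = respondents, columns = scale items."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

        # Synthetic responses: five items driven by one latent trait plus noise.
        rng = np.random.default_rng(0)
        latent = rng.normal(size=(200, 1))
        responses = latent + rng.normal(scale=0.8, size=(200, 5))

        print(f"alpha = {cronbach_alpha(responses):.2f}")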

  20. Assembly: a resource for assembled genomes at NCBI

    PubMed Central

    Kitts, Paul A.; Church, Deanna M.; Thibaud-Nissen, Françoise; Choi, Jinna; Hem, Vichet; Sapojnikov, Victor; Smith, Robert G.; Tatusova, Tatiana; Xiang, Charlie; Zherikov, Andrey; DiCuccio, Michael; Murphy, Terence D.; Pruitt, Kim D.; Kimchi, Avi

    2016-01-01

    The NCBI Assembly database (www.ncbi.nlm.nih.gov/assembly/) provides stable accessioning and data tracking for genome assembly data. The model underlying the database can accommodate a range of assembly structures, including sets of unordered contig or scaffold sequences, bacterial genomes consisting of a single complete chromosome, or complex structures such as a human genome with modeled allelic variation. The database provides an assembly accession and version to unambiguously identify the set of sequences that make up a particular version of an assembly, and tracks changes to updated genome assemblies. The Assembly database reports metadata such as assembly names, simple statistical reports of the assembly (number of contigs and scaffolds, contiguity metrics such as contig N50, total sequence length and total gap length) as well as the assembly update history. The Assembly database also tracks the relationship between an assembly submitted to the International Nucleotide Sequence Database Consortium (INSDC) and the assembly represented in the NCBI RefSeq project. Users can find assemblies of interest by querying the Assembly Resource directly or by browsing available assemblies for a particular organism. Links in the Assembly Resource allow users to easily download sequence and annotations for current versions of genome assemblies from the NCBI genomes FTP site. PMID:26578580
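
    Assemblies can also be located programmatically through NCBI's E-utilities, which expose the resource as db=assembly. Below is a minimal standard-library sketch; the search term is only an example, and the summary field names should be verified against the current esummary schema rather than taken as definitive.

        import json
        import urllib.parse
        import urllib.request

        BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

        # Search the Assembly database for an organism of interest (example term).
        query = urllib.parse.urlencode({
            "db": "assembly",
            "term": "Escherichia coli[Organism]",
            "retmode": "json",
            "retmax": "5",
        })
        with urllib.request.urlopen(f"{BASE}/esearch.fcgi?{query}") as resp:
            uids = json.load(resp)["esearchresult"]["idlist"]

        # Fetch summary records for the returned UIDs.
        query = urllib.parse.urlencode({"db": "assembly", "id": ",".join(uids), "retmode": "json"})
        with urllib.request.urlopen(f"{BASE}/esummary.fcgi?{query}") as resp:
            result = json.load(resp)["result"]

        for uid in result["uids"]:
            # Field names assumed from the JSON docsum; .get() avoids KeyError if they differ.
            print(uid, result[uid].get("assemblyname"), result[uid].get("assemblyaccession"))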

  1. A database of biological and geomorphological sea-level markers from the Last Glacial Maximum to present

    PubMed Central

    Hibbert, F.D.; Williams, F.H.; Fallon, S.J.; Rohling, E.J.

    2018-01-01

    The last deglacial was an interval of rapid climate and sea-level change, including the collapse of large continental ice sheets. This database collates carefully assessed sea-level data from peer-reviewed sources for the interval 0 to 25 thousand years ago (ka), from the Last Glacial Maximum to the present interglacial. In addition to facilitating site-specific reconstructions of past sea levels, the database provides a suite of data beyond the range of modern/instrumental variability that may help hone future sea-level projections. The database is global in scope, internally consistent, and contains U-series and radiocarbon dated indicators from both biological and geomorphological archives. We focus on far-field data (i.e., away from the sites of the former continental ice sheets), but some key intermediate (i.e., from the Caribbean) data are also included. All primary fields (i.e., sample location, elevation, age and context) possess quantified uncertainties, which—in conjunction with available metadata—allow the reconstructed sea levels to be interpreted within both their uncertainties and geological context. PMID:29809175

  2. Maximizing the use of Special Olympics International's Healthy Athletes database: A call to action.

    PubMed

    Lloyd, Meghann; Foley, John T; Temple, Viviene A

    2018-02-01

    There is a critical need for high-quality population-level data related to the health of individuals with intellectual disabilities. For more than 15 years Special Olympics International has been conducting free Healthy Athletes screenings at local, national and international events. The Healthy Athletes database is the largest known international database specifically on the health of people with intellectual disabilities; however, it is relatively under-utilized by the research community. A consensus meeting with two dozen North American researchers, stakeholders, clinicians and policymakers took place in Toronto, Canada. The purpose of the meeting was to: 1) establish the perceived utility of the database, and 2) identify and prioritize 3-5 specific priorities related to using the Healthy Athletes database to promote the health of individuals with intellectual disabilities. There was unanimous agreement from the meeting participants that this database represents an immense opportunity, both from the data already collected and from data that will be collected in the future. The 3 top priorities for the database were deemed to be: 1) establish the representativeness of data collected on Special Olympics athletes compared to the general population with intellectual disabilities, 2) create a scientific advisory group for Special Olympics International, and 3) use the data to improve Special Olympics programs around the world. The Special Olympics Healthy Athletes database includes data not found in any other source and should be used, in partnership with Special Olympics International, by researchers to significantly increase our knowledge and understanding of the health of individuals with intellectual disabilities. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. GLIMS Glacier Database: Status and Challenges

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Racoviteanu, A.; Khalsa, S. S.; Armstrong, R.

    2008-12-01

    GLIMS (Global Land Ice Measurements from Space) is an international initiative to map the world's glaciers and to build a GIS database that is usable via the World Wide Web. The GLIMS programme includes 70 institutions, and 25 Regional Centers (RCs), who analyze satellite imagery to map glaciers in their regions of expertise. The analysis results are collected at the National Snow and Ice Data Center (NSIDC) and ingested into the GLIMS Glacier Database. The database contains approximately 80 000 glacier outlines, half the estimated total on Earth. In addition, the database contains metadata on approximately 200 000 ASTER images acquired over glacierized terrain. Glacier data and the ASTER metadata can be viewed and searched via interactive maps at http://glims.org/. As glacier mapping with GLIMS has progressed, various hurdles have arisen that have required solutions. For example, the GLIMS community has formulated definitions for how to delineate glaciers with different complicated morphologies and how to deal with debris cover. Experiments have been carried out to assess the consistency of the database, and protocols have been defined for the RCs to follow in their mapping. Hurdles still remain. In June 2008, a workshop was convened in Boulder, Colorado to address issues such as mapping debris-covered glaciers, mapping ice divides, and performing change analysis using two different glacier inventories. This contribution summarizes the status of the GLIMS Glacier Database and steps taken to ensure high data quality.

  4. A prediction model-based algorithm for computer-assisted database screening of adverse drug reactions in the Netherlands.

    PubMed

    Scholl, Joep H G; van Hunsel, Florence P A M; Hak, Eelko; van Puijenbroek, Eugène P

    2018-02-01

    The statistical screening of pharmacovigilance databases containing spontaneously reported adverse drug reactions (ADRs) is mainly based on disproportionality analysis. The aim of this study was to improve the efficiency of full database screening using a prediction model-based approach. A logistic regression-based prediction model containing 5 candidate predictors was developed and internally validated using the Summary of Product Characteristics as the gold standard for the outcome. All drug-ADR associations, with the exception of those related to vaccines, with a minimum of 3 reports formed the training data for the model. Performance was based on the area under the receiver operating characteristic curve (AUC). Results were compared with the current method of database screening based on the number of previously analyzed associations. A total of 25 026 unique drug-ADR associations formed the training data for the model. The final model contained all 5 candidate predictors (number of reports, disproportionality, reports from healthcare professionals, reports from marketing authorization holders, Naranjo score). The AUC for the full model was 0.740 (95% CI; 0.734-0.747). The internal validity was good based on the calibration curve and bootstrapping analysis (AUC after bootstrapping = 0.739). Compared with the old method, the AUC increased from 0.649 to 0.740, and the proportion of potential signals increased by approximately 50% (from 12.3% to 19.4%). A prediction model-based approach can be a useful tool to create priority-based listings for signal detection in databases consisting of spontaneous ADRs. © 2017 The Authors. Pharmacoepidemiology & Drug Safety Published by John Wiley & Sons Ltd.
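
    The abstract describes a five-predictor logistic regression scored by the area under the ROC curve (AUC). A rough scikit-learn sketch on synthetic data is shown below; the predictors mirror those named in the abstract, but the data and coefficients are fabricated for illustration and do not reconstruct the published model.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)
        n = 5000

        # Synthetic stand-ins for the five candidate predictors.
        X = np.column_stack([
            rng.poisson(5, n),        # number of reports
            rng.normal(0, 1, n),      # disproportionality measure
            rng.integers(0, 2, n),    # report from healthcare professional
            rng.integers(0, 2, n),    # report from marketing authorization holder
            rng.normal(0, 1, n),      # Naranjo score
        ])

        # Fabricated outcome: "association listed in the Summary of Product Characteristics".
        logit = 0.3 * X[:, 0] / 5 + 0.8 * X[:, 1] + 0.4 * X[:, 2] + 0.5 * X[:, 4] - 1.0
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
        print(f"AUC = {auc:.3f}")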

  5. Using an International p53 Mutation Database as a Foundation for an Online Laboratory in an Upper Level Undergraduate Biology Class

    ERIC Educational Resources Information Center

    Melloy, Patricia G.

    2015-01-01

    A two-part laboratory exercise was developed to enhance classroom instruction on the significance of p53 mutations in cancer development. Students were asked to mine key information from an international database of p53 genetic changes related to cancer, the IARC TP53 database. Using this database, students designed several data mining activities…

  6. Draft secure medical database standard.

    PubMed

    Pangalos, George

    2002-01-01

    Medical database security is a particularly important issue for all healthcare establishments. Medical information systems are intended to support a wide range of pertinent health issues today, for example: assure the quality of care, support effective management of the health services institutions, monitor and contain the cost of care, implement technology into care without violating social values, ensure the equity and availability of care, and preserve humanity despite the proliferation of technology. In this context, medical database security aims primarily to support: high availability, accuracy and consistency of the stored data, the medical professional secrecy and confidentiality, and the protection of the privacy of the patient. These properties, though of a technical nature, basically require that the system is actually helpful for medical care and not harmful to patients. These latter properties require in turn not only that fundamental ethical principles are not violated by employing database systems, but that they are effectively enforced by technical means. This document reviews the existing and emerging work on the security of medical database systems. It presents in detail the problems and requirements related to medical database security. It addresses the problems of medical database security policies, secure design methodologies and implementation techniques. It also describes the current legal framework and regulatory requirements for medical database security. The issue of medical database security guidelines is also examined in detail. The current national and international efforts in the area are studied. It also gives an overview of the research work in the area. The document also presents in detail the most complete set of security guidelines known to us for the development and operation of medical database systems.

  7. Service employees give as they get: internal service as a moderator of the service climate-service outcomes link.

    PubMed

    Ehrhart, Karen Holcombe; Witt, L A; Schneider, Benjamin; Perry, Sara Jansen

    2011-03-01

    We lend theoretical insight to the service climate literature by exploring the joint effects of branch service climate and the internal service provided to the branch (the service received from corporate units to support external service delivery) on customer-rated service quality. We hypothesized that service climate is related to service quality most strongly when the internal service quality received is high, providing front-line employees with the capability to deliver what the service climate motivates them to do. We studied 619 employees and 1,973 customers in 36 retail branches of a bank. We aggregated employee perceptions of the internal service quality received from corporate units and the local service climate and external customer perceptions of service quality to the branch level of analysis. Findings were consistent with the hypothesis that high-quality internal service is necessary for branch service climate to yield superior external customer service quality. PsycINFO Database Record (c) 2011 APA, all rights reserved.

  8. [Interpretation in the Danish health-care system].

    PubMed

    Lund Hansen, Marianne Taulo; Nielsen, Signe Smith

    2013-03-04

    Communication between health professionals and patients is central to treatment and patient safety in the health-care system. This systematic review examines the last ten years of specialist literature concerning interpretation in the Danish health-care system. A structured search of two databases, screening of references, and literature recommended by two scientists led to the identification of seven relevant articles. The review showed that professional interpreters were not used consistently when needed. Family members were also used as interpreters. These results were supported by international investigations.

  9. Data Analysis of Seismic Sequence in Central Italy in 2016 using CTBTO- International Monitoring System

    NASA Astrophysics Data System (ADS)

    Mumladze, Tea; Wang, Haijun; Graham, Gerhard

    2017-04-01

    The seismic network that forms the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) will ultimately consist of 170 seismic stations (50 primary and 120 auxiliary) in 76 countries around the world. The network is still under development, but currently more than 80% of it is in operation. The objective of seismic monitoring is to detect and locate underground nuclear explosions. However, the data from the IMS can also be widely used for scientific and civil purposes. In this study we present the results of data analysis of the seismic sequence in 2016 in Central Italy. Several hundred earthquakes were recorded for this sequence by the seismic stations of the IMS. All events were accurately located by the analysts of the International Data Centre (IDC) of the CTBTO. In this study we will present the epicentral and magnitude distribution, station recordings and teleseismic phases as obtained from the Reviewed Event Bulletin (REB). We will also present a comparison of the database of the IDC with the databases of the European-Mediterranean Seismological Centre (EMSC) and U.S. Geological Survey (USGS). The present work shows that IMS data can be used for earthquake sequence analyses and can play an important role in seismological research.

  10. Development and Validation of a Social Capital Questionnaire for Adolescent Students (SCQ-AS)

    PubMed Central

    Paiva, Paula Cristina Pelli; de Paiva, Haroldo Neves; de Oliveira Filho, Paulo Messias; Lamounier, Joel Alves; Ferreira, Efigênia Ferreira e; Ferreira, Raquel Conceição; Kawachi, Ichiro; Zarzar, Patrícia Maria

    2014-01-01

    Objectives Social capital has been studied due to its contextual influence on health. However, no specific assessment tool has been developed and validated for the measurement of social capital among 12-year-old adolescent students. The aim of the present study was to develop and validate a quick, simple assessment tool to measure social capital among adolescent students. Methods A questionnaire was developed based on a review of relevant literature. To this end, searches were made of the Scientific Electronic Library Online, Latin American and Caribbean Health Sciences, The Cochrane Library, ISI Web of Knowledge, International Database for Medical Literature and PubMed Central bibliographical databases from September 2011 to January 2014 for papers addressing assessment tools for the evaluation of social capital. Focus groups were also conducted with adolescent students as well as health, education and social professionals. The final assessment tool was administered to a convenience sample from two public schools (79 students) and one private school (22 students), comprising a final sample of 101 students. Reliability and internal consistency were evaluated using the Kappa coefficient and Cronbach's alpha coefficient, respectively. Content validity was determined by expert consensus as well as exploratory and confirmatory factor analysis. Results The final version of the questionnaire was made up of 12 items. The total scale demonstrated very good internal consistency (Cronbach's alpha: 0.71). Reproducibility was also very good, as the Kappa coefficient was higher than 0.72 for the majority of items (range: 0.63 to 0.97). Factor analysis grouped the 12 items into four subscales: School Social Cohesion, School Friendships, Neighborhood Social Cohesion and Trust (school and neighborhood). Conclusions The present findings indicate the validity and reliability of the Social Capital Questionnaire for Adolescent Students. PMID:25093409
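
    Item-level reproducibility of the kind reported here is commonly assessed with Cohen's kappa on test-retest responses. A minimal sketch with invented responses (not the study data), using scikit-learn:

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical responses to one questionnaire item at two administrations
        # (e.g. 0 = disagree, 1 = neutral, 2 = agree) for ten adolescents.
        first_admin  = [2, 1, 0, 2, 2, 1, 0, 1, 2, 0]
        second_admin = [2, 1, 0, 2, 1, 1, 0, 1, 2, 0]

        print(f"kappa = {cohen_kappa_score(first_admin, second_admin):.2f}")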

  11. 76 FR 4072 - Registration of Claims of Copyright

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-24

    ... registration of automated databases that predominantly consist of photographs, and applications for group... to submit electronic applications to register copyrights of such photographic databases or of groups... automated databases, an electronic application for group registration of an automated database that consists...

  12. TIMSS 2011 User Guide for the International Database. Supplement 1: International Version of the TIMSS 2011 Background and Curriculum Questionnaires

    ERIC Educational Resources Information Center

    Foy, Pierre, Ed.; Arora, Alka, Ed.; Stanco, Gabrielle M., Ed.

    2013-01-01

    The TIMSS 2011 International Database includes data for all questionnaires administered as part of the TIMSS 2011 assessment. This supplement contains the international version of the TIMSS 2011 background questionnaires and curriculum questionnaires in the following 10 sections: (1) Fourth Grade Student Questionnaire; (2) Fourth Grade Home…

  13. An International Aerospace Information System: A Cooperative Opportunity.

    ERIC Educational Resources Information Center

    Blados, Walter R.; Cotter, Gladys A.

    1992-01-01

    Introduces and discusses ideas and issues relevant to the international unification of scientific and technical information (STI) through development of an international aerospace database (IAD). Specific recommendations for improving the National Aeronautics and Space Administration Aerospace Database (NAD) and for implementing IAD are given.…

  14. Towards a Global Service Registry for the World-Wide LHC Computing Grid

    NASA Astrophysics Data System (ADS)

    Field, Laurence; Alandes Pradillo, Maria; Di Girolamo, Alessandro

    2014-06-01

    The World-Wide LHC Computing Grid encompasses a set of heterogeneous information systems, from central portals such as the Open Science Grid's Information Management System and the Grid Operations Centre Database, to the WLCG information system, where the information sources are the Grid services themselves. Providing a consistent view of the information, which involves synchronising all these information systems, is a challenging activity that has led the LHC virtual organisations to create their own configuration databases. This experience, whereby each virtual organisation's configuration database interfaces with multiple information systems, has resulted in the duplication of effort, especially relating to the use of manual checks for the handling of inconsistencies. The Global Service Registry aims to address this issue by providing a centralised service that aggregates information from multiple information systems. It shows both information on registered resources (i.e. what should be there) and available resources (i.e. what is there). The main purpose is to simplify the synchronisation of the virtual organisations' own configuration databases, which are used for job submission and data management, through the provision of a single interface for obtaining all the information. By centralising the information, automated consistency and validation checks can be performed to improve the overall quality of information provided. Although internally the GLUE 2.0 information model is used for the purpose of integration, the Global Service Registry is not dependent on any particular information model for ingestion or dissemination. The intention is to allow the virtual organisations' configuration databases to be decoupled from the underlying information systems in a transparent way and hence simplify any possible future migration due to the evolution of those systems. This paper presents the Global Service Registry architecture, its advantages compared to the current situation and how it can support the evolution of information systems.
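
    Once the registered and discovered views are aggregated, the consistency checks described above reduce to comparing two sets of service identifiers. A toy sketch with hypothetical endpoints (not actual WLCG service names or the registry's real data model):

        # Hypothetical endpoints: what the VO configuration says should exist,
        # versus what the aggregated information systems actually report.
        registered = {"srm://se01.example.org", "cream://ce01.example.org", "xrootd://dc01.example.org"}
        available  = {"srm://se01.example.org", "cream://ce01.example.org", "arc://ce02.example.org"}

        missing    = registered - available   # registered but not found: stale registration or outage
        unexpected = available - registered   # found but not registered: candidate for registration or cleanup

        print("missing:", sorted(missing))
        print("unexpected:", sorted(unexpected))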

  15. An international aerospace information system - A cooperative opportunity

    NASA Technical Reports Server (NTRS)

    Blados, Walter R.; Cotter, Gladys A.

    1992-01-01

    This paper presents for consideration new possibilities for uniting the various aerospace database efforts toward a cooperative international aerospace database initiative that can optimize the cost-benefit equation for all members. The development of astronautics and aeronautics in individual nations has led to initiatives for national aerospace databases. Technological developments in information technology and science, as well as the reality of scarce resources, makes it necessary to reconsider the mutually beneficial possibilities offered by cooperation and international resource sharing.

  16. [Presence of the biomedical periodicals of Hungarian editions in international databases].

    PubMed

    Vasas, Lívia; Hercsel, Imréné

    2006-01-15

    The majority of Hungarian scientific results in medical and related sciences are published in scientific periodicals of foreign edition with high impact factor (IF) values, and they appear in the international scientific literature in foreign languages. In this study the authors dealt only with the presence and registered citations in international databases of those periodicals that had been published in Hungary and/or in cooperation with foreign publishing companies. The examination went back to the year 1980 and covered a 25-year period. 110 periodicals were selected for more detailed examination. The authors analyzed the situation of the current periodicals in the three most often visited databases (MEDLINE, EMBASE, Web of Science), and found that the biomedical scientific periodicals of Hungarian interest were not represented with reasonable emphasis in the relevant international bibliographic databases. Because of the large amount of data, the scientific literature of medicine and related sciences could not be covered in its entirety; this publication may nevertheless provide useful information for interested readers and draw the attention of those concerned.

  17. Development and evaluation of the Internalized Racism in Asian Americans Scale (IRAAS).

    PubMed

    Choi, Andrew Young; Israel, Tania; Maeda, Hotaka

    2017-01-01

    This article presents the development and psychometric evaluation of the Internalized Racism in Asian Americans Scale (IRAAS), which was designed to measure the degree to which Asian Americans internalized hostile attitudes and negative messages targeted toward their racial identity. Items were developed on the basis of prior literature, vetted through expert feedback and cognitive interviews, and administered to 655 Asian American participants through Amazon Mechanical Turk. Exploratory factor analysis with a random subsample (n = 324) yielded a psychometrically robust preliminary measurement model consisting of 3 factors: Self-Negativity, Weakness Stereotypes, and Appearance Bias. Confirmatory factor analysis with a separate subsample (n = 331) indicated that the proposed correlated factors model was strongly consistent with the observed data. Factor determinacies were high and demonstrated that the specified items adequately measured their intended factors. Bifactor modeling further indicated that this multidimensionality could be univocally represented for the purpose of measurement, including the use of a mean total score representing a single continuum of internalized racism on which individuals vary. The IRAAS statistically predicted depressive symptoms, and demonstrated statistically significant correlations in theoretically expected directions with four dimensions of collective self-esteem. These results provide initial validity evidence supporting the use of the IRAAS to measure aspects of internalized racism in this population. Limitations and research implications are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. PIRLS 2011 User Guide for the International Database. Supplement 1: International Version of the PIRLS 2011, Background Questionnaires and Curriculum Questionnaire

    ERIC Educational Resources Information Center

    Foy, Pierre, Ed.; Drucker, Kathleen T., Ed.

    2013-01-01

    The PIRLS 2011 international database includes data for all questionnaires administered as part of the PIRLS 2011 assessment. This supplement contains the international version of the PIRLS 2011 background questionnaires and curriculum questionnaires in the following 5 sections: (1) Student Questionnaire; (2) Home Questionnaire (Learning to Read…

  19. TEDS-M 2008 User Guide for the International Database. Supplement 2: National Adaptations of the TEDS-M Questionnaires

    ERIC Educational Resources Information Center

    Brese, Falk, Ed.

    2012-01-01

    This supplement contains all adaptations made by countries to the international version of the TEDS-M questionnaires under careful supervision of and approval by the TEDS-M International Study Center at Michigan State University. This information provides users of the TEDS-M International Database with a guide to evaluate the availability of…

  20. THE AIMS AND ACTIVITIES OF THE INTERNATIONAL NETWORK OF NUCLEAR STRUCTURE AND DECAY DATA EVALUATORS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NICHOLS,A.L.; TULI, J.K.

    The International Network of Nuclear Structure and Decay Data (NSDD) Evaluators consists of a number of evaluation groups and data service centers in several countries that appreciate the merits of working together to maintain and ensure the quality and comprehensive content of the ENSDF database (Evaluated Nuclear Structure Data File). Biennial meetings of the network are held under the auspices of the International Atomic Energy Agency (IAEA) to assign evaluation responsibilities, monitor progress, discuss improvements and emerging difficulties, and agree on actions to be undertaken by individual members. The evaluated data and bibliographic details are made available to users via various media, such as the journals "Nuclear Physics A" and "Nuclear Data Sheets", the World Wide Web, CD-ROM, wall charts of the nuclides and the "Nuclear Wallet Cards". While the ENSDF master database is maintained by the US National Nuclear Data Center at the Brookhaven National Laboratory, these data are also available from other nuclear data centers including the IAEA Nuclear Data Section. The Abdus Salam International Centre for Theoretical Physics (ICTP), Trieste, Italy, in cooperation with the IAEA, organizes workshops on NSDD at regular intervals. The primary aims of these particular workshops are to provide hands-on training in the data evaluation processes, and to encourage new evaluators to participate in NSDD activities. The technical contents of these NSDD workshops are described, along with the rationale for the inclusion of various topics.

  1. New standards for reducing gravity data: The North American gravity database

    USGS Publications Warehouse

    Hinze, W. J.; Aiken, C.; Brozena, J.; Coakley, B.; Dater, D.; Flanagan, G.; Forsberg, R.; Hildenbrand, T.; Keller, Gordon R.; Kellogg, J.; Kucks, R.; Li, X.; Mainville, A.; Morin, R.; Pilkington, M.; Plouff, D.; Ravat, D.; Roman, D.; Urrutia-Fucugauchi, J.; Veronneau, M.; Webring, M.; Winester, D.

    2005-01-01

    The North American gravity database as well as databases from Canada, Mexico, and the United States are being revised to improve their coverage, versatility, and accuracy. An important part of this effort is revising procedures for calculating gravity anomalies, taking into account our enhanced computational power, improved terrain databases and datums, and increased interest in more accurately defining long-wavelength anomaly components. Users of the databases may note minor differences between previous and revised database values as a result of these procedures. Generally, the differences do not impact the interpretation of local anomalies but do improve regional anomaly studies. The most striking revision is the use of the internationally accepted terrestrial ellipsoid for the height datum of gravity stations rather than the conventionally used geoid or sea level. Principal facts of gravity observations and anomalies based on both revised and previous procedures together with germane metadata will be available on an interactive Web-based data system as well as from national agencies and data centers. The use of the revised procedures is encouraged for gravity data reduction because of the widespread use of the global positioning system in gravity fieldwork and the need for increased accuracy and precision of anomalies and consistency with North American and national databases. Anomalies based on the revised standards should be preceded by the adjective "ellipsoidal" to differentiate anomalies calculated using heights with respect to the ellipsoid from those based on conventional elevations referenced to the geoid. © 2005 Society of Exploration Geophysicists. All rights reserved.
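
    The "ellipsoidal" anomalies referred to above use ellipsoidal (GPS) heights rather than geoid-referenced elevations in the standard reduction. The sketch below shows the basic arithmetic: normal gravity from the GRS80 Somigliana closed form plus a first-order free-air term. The station values are invented, and the constants and gradient are quoted from memory and should be verified against the revised standards before any real use.

        import math

        def normal_gravity_grs80(lat_deg: float) -> float:
            """Normal gravity on the GRS80 ellipsoid in mGal (Somigliana closed form)."""
            sin2 = math.sin(math.radians(lat_deg)) ** 2
            gamma_e = 978032.67715          # equatorial normal gravity, mGal (assumed GRS80 value)
            k = 0.001931851353              # normal gravity constant (assumed GRS80 value)
            e2 = 0.00669438002290           # first eccentricity squared (assumed GRS80 value)
            return gamma_e * (1 + k * sin2) / math.sqrt(1 - e2 * sin2)

        # Hypothetical station: observed gravity (mGal), latitude (deg), ellipsoidal height (m).
        g_obs, lat, h_ell = 979790.0, 40.0, 1250.0

        free_air_correction = 0.3086 * h_ell                     # first-order gradient, mGal
        anomaly = g_obs - normal_gravity_grs80(lat) + free_air_correction
        print(f"ellipsoidal free-air anomaly = {anomaly:.1f} mGal")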

  2. Fundamentals of the NEA Thermochemical Database and its influence over national nuclear programs on the performance assessment of deep geological repositories.

    PubMed

    Ragoussi, Maria-Eleni; Costa, Davide

    2017-03-14

    For the last 30 years, the NEA Thermochemical Database (TDB) Project (www.oecd-nea.org/dbtdb/) has been developing a chemical thermodynamic database for elements relevant to the safety of radioactive waste repositories, providing data that are vital to support the geochemical modeling of such systems. The recommended data are selected on the basis of strict review procedures and are characterized by their consistency. The results of these efforts are freely available, and have become an international point of reference in the field. As a result, a number of important national initiatives with regard to waste management programs have used the NEA TDB as their basis, both in terms of recommended data and guidelines. In this article we describe the fundamentals and achievements of the project together with the characteristics of some databases developed in national nuclear waste disposal programs that have been influenced by the NEA TDB. We also give some insights on how this work could be seen as an approach to be used in broader areas of environmental interest. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. On the advancement of highly cited research in China: An analysis of the Highly Cited database.

    PubMed

    Li, John Tianci

    2018-01-01

    This study investigates the progress of highly cited research in China from 2001 to 2016 through the analysis of the Highly Cited database. The Highly Cited database, compiled by Clarivate Analytics, comprises the world's most influential researchers in the 22 Essential Science Indicator fields as catalogued by the Web of Science. The database is considered an international standard for the measurement of national and institutional highly cited research output. Overall, we found a consistent and substantial increase in Highly Cited Researchers from China during the timespan. The Chinese institutions with the most Highly Cited Researchers (the Chinese Academy of Sciences, Tsinghua University, Peking University, Zhejiang University, the University of Science and Technology of China, and BGI Shenzhen) are all top-ten universities or primary government research institutions. Further evaluation of separate fields of research and government funding data from the National Natural Science Foundation of China revealed disproportionate growth efficiencies among the separate divisions of the National Natural Science Foundation. The most development occurred in the fields of Chemistry, Materials Sciences, and Engineering, whereas the least development occurred in Economics and Business, Health Sciences, and Life Sciences.

  4. MouseNet database: digital management of a large-scale mutagenesis project.

    PubMed

    Pargent, W; Heffner, S; Schäble, K F; Soewarto, D; Fuchs, H; Hrabé de Angelis, M

    2000-07-01

    The Munich ENU Mouse Mutagenesis Screen is a large-scale mutant production, phenotyping, and mapping project. It encompasses two animal breeding facilities and a number of screening groups located in the general area of Munich. A central database is required to manage and process the immense amount of data generated by the mutagenesis project. This database, which we named MouseNet©, runs on a Sybase platform and will finally store and process all data from the entire project. In addition, the system comprises a portfolio of functions needed to support the workflow management of the core facility and the screening groups. MouseNet© will make all of the data available to the participating screening groups, and later to the international scientific community. MouseNet© will consist of three major software components: the Animal Management System (AMS), the Sample Tracking System (STS) and the Result Documentation System (RDS). MouseNet© provides the following major advantages: being accessible from different client platforms via the Internet; being a full-featured multi-user system (including access restriction and data locking mechanisms); relying on a professional RDBMS (relational database management system) which runs on a UNIX server platform; and supplying workflow functions and a variety of plausibility checks.

  5. IMGT/3Dstructure-DB and IMGT/StructuralQuery, a database and a tool for immunoglobulin, T cell receptor and MHC structural data

    PubMed Central

    Kaas, Quentin; Ruiz, Manuel; Lefranc, Marie-Paule

    2004-01-01

    IMGT/3Dstructure-DB and IMGT/Structural-Query are a novel 3D structure database and a new tool for immunological proteins. They are part of IMGT, the international ImMunoGenetics information system®, a high-quality integrated knowledge resource specializing in immunoglobulins (IG), T cell receptors (TR), major histocompatibility complex (MHC) and related proteins of the immune system (RPI) of human and other vertebrate species, which consists of databases, Web resources and interactive on-line tools. IMGT/3Dstructure-DB data are described according to the IMGT Scientific chart rules based on the IMGT-ONTOLOGY concepts. IMGT/3Dstructure-DB provides IMGT gene and allele identification of IG, TR and MHC proteins with known 3D structures, domain delimitations, amino acid positions according to the IMGT unique numbering and renumbered coordinate flat files. Moreover IMGT/3Dstructure-DB provides 2D graphical representations (or Collier de Perles) and results of contact analysis. The IMGT/StructuralQuery tool allows search of this database based on specific structural characteristics. IMGT/3Dstructure-DB and IMGT/StructuralQuery are freely available at http://imgt.cines.fr. PMID:14681396

  6. Production and distribution of scientific and technical databases - Comparison among Japan, US and Europe

    NASA Astrophysics Data System (ADS)

    Onodera, Natsuo; Mizukami, Masayuki

    This paper estimates several quantitative indices of the production and distribution of scientific and technical databases based on various recent publications and attempts to compare these indices internationally. Raw data used for the estimation are drawn mainly from the Database Directory (published by MITI) for database production and from some domestic and foreign study reports for database revenues. The ratios of these indices among Japan, the US and Europe for database usage are similar to those for general scientific and technical activities such as population and R&D expenditures. However, Japanese contributions to the production, revenue and cross-border distribution of databases are still lower than those of the US and European countries. An international comparison of relative database activities between the public and private sectors is also discussed.

  7. NIST/ASME Steam Properties Database

    National Institute of Standards and Technology Data Gateway

    SRD 10 NIST/ASME Steam Properties Database (PC database for purchase)   Based upon the International Association for the Properties of Water and Steam (IAPWS) 1995 formulation for the thermodynamic properties of water and the most recent IAPWS formulations for transport and other properties, this updated version provides water properties over a wide range of conditions according to the accepted international standards.

  8. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...

  9. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...

  10. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...

  11. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...

  12. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...

  13. Performance Evaluation of a Database System in a Multiple Backend Configurations,

    DTIC Science & Technology

    1984-10-01

    leaving a system process, the internal performance measurements of MMSD have been carried out. Methodologies for constructing test databases...access directory data via the AT, EDIT, and CDT. In designing the test database, one of the key concepts is the choice of the directory attributes in...internal timing. These requests are selected since they retrieve the smallest portion of the test database and the processing time for each request is

  14. A central database for the Global Terrestrial Network for Permafrost (GTN-P)

    NASA Astrophysics Data System (ADS)

    Elger, Kirsten; Lanckman, Jean-Pierre; Lantuit, Hugues; Karlsson, Ævar Karl; Johannsson, Halldór

    2013-04-01

    The Global Terrestrial Network for Permafrost (GTN-P) is the primary international observing network for permafrost sponsored by the Global Climate Observing System (GCOS) and the Global Terrestrial Observing System (GTOS), and managed by the International Permafrost Association (IPA). It monitors the Essential Climate Variable (ECV) permafrost, which consists of permafrost temperature and active-layer thickness, with the long-term goal of obtaining a comprehensive view of the spatial structure, trends, and variability of changes in the active layer and permafrost. The network's two international monitoring components are (1) CALM (Circumpolar Active Layer Monitoring) and (2) the Thermal State of Permafrost (TSP), which is made of an extensive borehole network covering all permafrost regions. Both programs have been thoroughly overhauled during the International Polar Year 2007-2008 and extended their coverage to provide a true circumpolar network stretching over both hemispheres. GTN-P has gained considerable visibility in the science community in providing the baseline against which models are globally validated and incorporated in climate assessments. Yet it was until now operated on a voluntary basis, and is now being redesigned to meet the increasing expectations from the science community. To update the network's objectives and deliver the best possible products to the community, the IPA organized a workshop to define the users' needs and requirements for the production, archival, storage and dissemination of the permafrost data products it manages. From the beginning, GTN-P data were "outfitted" with an open data policy with free data access via the World Wide Web. The existing data, however, are far from homogeneous: they are not yet optimized for databases, there is no framework for data reporting or archival, and data documentation is incomplete. As a result, and despite the utmost relevance of permafrost in the Earth's climate system, the data have not been used by as many researchers as intended by the initiators of these global programs. The European Union project PAGE21 created opportunities to develop this central database for GTN-P data during the duration of the project and beyond. The database aims to be the one location where the researcher can find data, metadata and information on all relevant parameters for a specific site. Each component of the Data Management System (DMS), including parameters, data levels and metadata formats, was developed in cooperation with GTN-P and the IPA. The general framework of the GTN-P DMS is based on an object-oriented model (OOM) and implemented into a spatial database. To ensure interoperability and enable potential inter-database search, field names follow international metadata standards. The outputs of the DMS will be tailored to the needs of the modeling community as well as to those of other stakeholders. In particular, new products will be developed in partnership with the IPA and other relevant international organizations to raise awareness on permafrost in the policy-making arena. The DMS will be released to a broader public in May 2013 and we expect to have the first active data upload - via an online interface - after 2013's summer field season.

  15. International Energy: Subject Thesaurus. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The International Energy Agency: Subject Thesaurus contains the standard vocabulary of indexing terms (descriptors) developed and structured to build and maintain energy information databases. Involved in this cooperative task are (1) the technical staff of the USDOE Office of Scientific and Technical Information (OSTI) in cooperation with the member countries of the International Energy Agency's Energy Technology Data Exchange (ETDE) and (2) the International Atomic Energy Agency's International Nuclear Information System (INIS) staff representing the more than 100 countries and organizations that record and index information for the international nuclear information community. ETDE member countries are also members of INIS. Nuclear information prepared for INIS by ETDE member countries is included in the ETDE Energy Database, which contains the online equivalent of the printed INIS Atomindex. Indexing terminology is therefore cooperatively standardized for use in both information systems. This structured vocabulary reflects the scope of international energy research, development, and technological programs. The terminology of this thesaurus aids in subject searching on commercial systems, such as "Energy Science & Technology" by DIALOG Information Services, "Energy" by STN International and the "ETDE Energy Database" by SilverPlatter. It is also the thesaurus for the Integrated Technical Information System (ITIS) online databases of the US Department of Energy.

  16. Denver International Airport sensor processing and database

    DOT National Transportation Integrated Search

    2000-03-01

    Data processing and database design is described for an instrumentation system installed on runway 34R at Denver International Airport (DIA). Static (low-speed) and dynamic (high-speed) sensors are installed in the pavement. The static sensors includ...

  17. Standardisation of the FAERS database: a systematic approach to manually recoding drug name variants.

    PubMed

    Wong, Carmen K; Ho, Samuel S; Saini, Bandana; Hibbs, David E; Fois, Romano A

    2015-07-01

    The US Food and Drug Administration Adverse Event Reporting System (FAERS), one of the world's largest spontaneous reporting systems, is difficult to use because of report duplication and a lack of standardisation in the recording of drug names. Unresolved data quality issues may distort statistical analyses, rendering the results difficult to interpret when detecting and monitoring adverse effects of pharmaceutical products. The aim of this study was to develop and implement a data cleaning protocol to identify and resolve drug nomenclature issues. The key 'data treatment' plan involved standardising drug names held in the FAERS database. Four million five hundred and six thousand five hundred and seventy-seven Individual Safety Reports submitted to the FAERS between 1 January 2003 and 31 August 2012 were included in this study. OpenRefine was used to standardise drug name variants in the database such that they were consistent with international non-proprietary nomenclature defined by the World Health Organisation Anatomical Therapeutic Chemical classification. Drug variants where generic constituents could not be confidently determined, undecipherable drug names and non-medicinal products were retained verbatim. After the standardisation process, more than 16 611 916 drug entries were cleaned to their relevant international non-proprietary name. The cleaned drug table comprised 71 858 drug name variants and included both standardised and original terms. Ninety-nine per cent of drug names were standardised using this method. The millions of reports enclosed in the FAERS contain valuable information that is of interest to pharmacovigilance, toxicology and post-marketing surveillance researchers. With the standardisation of the drug nomenclature, the database can be better utilised by research groups around the world. Copyright © 2015 John Wiley & Sons, Ltd.
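
    The recoding step described here (mapping free-text drug name variants onto international non-proprietary names) can be approximated by a lookup table backed by fuzzy matching, with unmatched variants retained for manual review. The tiny thesaurus below is invented for illustration and is not the protocol or tooling used in the study.

        import difflib

        # Tiny hypothetical thesaurus: variant spellings / brand names -> INN.
        thesaurus = {
            "paracetamol": "paracetamol",
            "acetaminophen": "paracetamol",
            "tylenol": "paracetamol",
            "atorvastatin": "atorvastatin",
            "lipitor": "atorvastatin",
        }

        def to_inn(raw_name):
            name = raw_name.strip().lower()
            if name in thesaurus:
                return thesaurus[name]
            # Fall back to fuzzy matching for misspellings; below the cutoff the
            # variant is left unmapped (None) so it can be reviewed manually.
            match = difflib.get_close_matches(name, thesaurus.keys(), n=1, cutoff=0.85)
            return thesaurus[match[0]] if match else None

        for raw in ["Tylenol", "atorvastatine", "unknown herbal mix"]:
            print(raw, "->", to_inn(raw))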

  18. Discrepancy Reporting Management System

    NASA Technical Reports Server (NTRS)

    Cooper, Tonja M.; Lin, James C.; Chatillon, Mark L.

    2004-01-01

    Discrepancy Reporting Management System (DRMS) is a computer program designed for use in the stations of NASA's Deep Space Network (DSN) to help establish the operational history of equipment items; acquire data on the quality of service provided to DSN customers; enable measurement of service performance; provide early insight into the need to improve processes, procedures, and interfaces; and enable the tracing of a data outage to a change in software or hardware. DRMS is a Web-based software system designed to include a distributed database and replication feature to achieve location-specific autonomy while maintaining a consistent high quality of data. DRMS incorporates commercial Web and database software. DRMS collects, processes, replicates, communicates, and manages information on spacecraft data discrepancies, equipment resets, and physical equipment status, and maintains an internal station log. All discrepancy reports (DRs), Master discrepancy reports (MDRs), and Reset data are replicated to a master server at NASA's Jet Propulsion Laboratory; Master DR data are replicated to all the DSN sites; and Station Logs are internal to each of the DSN sites and are not replicated. Data are validated according to several logical mathematical criteria. Queries can be performed on any combination of data.

  19. International forensic automotive paint database

    NASA Astrophysics Data System (ADS)

    Bishea, Gregory A.; Buckle, Joe L.; Ryland, Scott G.

    1999-02-01

    The Technical Working Group for Materials Analysis (TWGMAT) is supporting an international forensic automotive paint database. The Federal Bureau of Investigation and the Royal Canadian Mounted Police (RCMP) are collaborating on this effort through TWGMAT. This paper outlines the support and further development of the RCMP's Automotive Paint Database, `Paint Data Query'. This cooperative agreement augments and supports a current, validated, searchable, automotive paint database that is used to identify make(s), model(s), and year(s) of questioned paint samples in hit-and-run fatalities and other associated investigations involving automotive paint.

  20. INCIDENCE AND PREVALENCE OF ACROMEGALY IN THE UNITED STATES: A CLAIMS-BASED ANALYSIS.

    PubMed

    Broder, Michael S; Chang, Eunice; Cherepanov, Dasha; Neary, Maureen P; Ludlam, William H

    2016-11-01

    Acromegaly, a rare endocrine disorder, results from excessive growth hormone secretion, leading to multisystem-associated morbidities. Using 2 large nationwide databases, we estimated the annual incidence and prevalence of acromegaly in the U.S. We used 2008 to 2013 data from the Truven Health MarketScan® Commercial Claims and Encounters Database and IMS Health PharMetrics healthcare insurance claims databases, with health plan enrollees <65 years of age. Study patients had ≥2 claims with acromegaly (International Classification of Diseases, 9th Revision, Clinical Modification Code [ICD-9-CM] 253.0), or 1 claim with acromegaly and 1 claim for pituitary tumor, pituitary surgery, or cranial stereotactic radiosurgery. Annual incidence was calculated for each year from 2009 to 2013, and prevalence in 2013. Estimates were stratified by age and sex. Incidence was up to 11.7 cases per million person-years (PMPY) in MarketScan and 9.6 cases PMPY in PharMetrics. Rates were similar by sex but typically lowest in ≤17-year-olds and higher in >24-year-olds. The prevalence estimates were 87.8 and 71.0 per million per year in MarketScan and PharMetrics, respectively. Prevalence consistently increased with age but was similar by sex in each database. The current U.S. incidence of acromegaly may be up to 4 times higher and prevalence may be up to 50% higher than previously reported in European studies. Our findings correspond with the estimates reported by a recent U.S. study that used a single managed care database, supporting the robustness of these estimates in this population. Our study indicates there are approximately 3,000 new cases of acromegaly per year, with a prevalence of about 25,000 acromegaly patients in the U.S. Abbreviations: CT = computed tomography; GH = growth hormone; IGF-1 = insulin-like growth factor 1; ICD-9-CM = International Classification of Diseases, 9th Revision, Clinical Modification; MRI = magnetic resonance imaging; PMPY = per million person-years.
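
    Incidence quoted as cases per million person-years (PMPY) is the number of new cases divided by total enrolled follow-up time, scaled to one million. A toy calculation with invented figures (not the MarketScan or PharMetrics values):

        # Hypothetical claims cohort for one calendar year.
        new_cases = 235            # enrollees newly meeting the acromegaly case definition
        person_years = 21_400_000  # total enrolled follow-up time in the same year

        incidence_pmpy = new_cases / person_years * 1_000_000
        print(f"incidence = {incidence_pmpy:.1f} cases per million person-years")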

  1. ReMatch: a web-based tool to construct, store and share stoichiometric metabolic models with carbon maps for metabolic flux analysis.

    PubMed

    Pitkänen, Esa; Akerlund, Arto; Rantanen, Ari; Jouhten, Paula; Ukkonen, Esko

    2008-08-25

    ReMatch is a web-based, user-friendly tool that constructs stoichiometric network models for metabolic flux analysis, integrating user-developed models into a database collected from several comprehensive metabolic data resources, including KEGG, MetaCyc and ChEBI. Particularly, ReMatch augments the metabolic reactions of the model with carbon mappings to facilitate (13)C metabolic flux analysis. The construction of a network model consisting of biochemical reactions is the first step in most metabolic modelling tasks. This model construction can be a tedious task as the required information is usually scattered across many separate databases whose interoperability is suboptimal, due to the heterogeneous naming conventions of metabolites in different databases. Another, particularly severe data integration problem is faced in (13)C metabolic flux analysis, where the mappings of carbon atoms from substrates into products in the model are required. ReMatch has been developed to solve the above data integration problems. First, ReMatch matches the imported user-developed model against the internal ReMatch database while considering a comprehensive metabolite name thesaurus. This, together with wild card support, allows the user to specify the model quickly without having to look the names up manually. Second, ReMatch is able to augment reactions of the model with carbon mappings, obtained either from the internal database or given by the user with an easy-to-use tool. The constructed models can be exported into 13C-FLUX and SBML file formats. Further, a stoichiometric matrix and visualizations of the network model can be generated. The constructed models of metabolic networks can be optionally made available to the other users of ReMatch. Thus, ReMatch provides a common repository for metabolic network models with carbon mappings for the needs of the metabolic flux analysis community. ReMatch is freely available for academic use at http://www.cs.helsinki.fi/group/sysfys/software/rematch/.
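
    The stoichiometric matrix mentioned above has one row per metabolite and one column per reaction, with negative coefficients for consumption and positive coefficients for production. Below is a small sketch building such a matrix from a toy reaction set; it illustrates the data structure only and is not ReMatch's internal representation.

        import numpy as np

        # Toy network: each reaction maps metabolite -> signed stoichiometric coefficient.
        reactions = {
            "hexokinase": {"glucose": -1, "ATP": -1, "G6P": 1, "ADP": 1},
            "ATP_drain":  {"ATP": -1, "ADP": 1},
            "G6P_sink":   {"G6P": -1},
        }

        metabolites = sorted({m for coeffs in reactions.values() for m in coeffs})
        S = np.zeros((len(metabolites), len(reactions)))
        for j, coeffs in enumerate(reactions.values()):
            for met, coeff in coeffs.items():
                S[metabolites.index(met), j] = coeff

        print(metabolites)
        print(S)   # rows: metabolites, columns: reactions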

  2. Atmospheric Probe Model: Construction and Wind Tunnel Tests

    NASA Technical Reports Server (NTRS)

    Vogel, Jerald M.

    1998-01-01

    The material contained in this document represents a summary of the results of a low speed wind tunnel test program to determine the performance of an atmospheric probe at low speed. The probe configuration tested consists of a 2/3 scale model constructed from a combination of hard maple wood and aluminum stock. The model design includes approximately 130 surface static pressure taps. Additional hardware incorporated in the baseline model provides a mechanism for simulating external and internal trailing edge split flaps for probe flow control. Test matrix parameters include probe side slip angle, external/internal split flap deflection angle, and trip strip applications. Test output database includes surface pressure distributions on both inner and outer annular wings and probe center line velocity distributions from forward probe to aft probe locations.

  3. DESPIC: Detecting Early Signatures of Persuasion in Information Cascades

    DTIC Science & Technology

    2015-08-27

    Includes a publication, "... over NoSQL Databases," in the Proceedings of the 14th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGrid 2014), Chicago, IL, USA, 26 May 2014. Using distributed NoSQL databases, including HBase and Riak, we finalized the requirements of the optimal computational architecture to support our framework.

  4. Application of a 5-tiered scheme for standardized classification of 2,360 unique mismatch repair gene variants in the InSiGHT locus-specific database.

    PubMed

    Thompson, Bryony A; Spurdle, Amanda B; Plazzer, John-Paul; Greenblatt, Marc S; Akagi, Kiwamu; Al-Mulla, Fahd; Bapat, Bharati; Bernstein, Inge; Capellá, Gabriel; den Dunnen, Johan T; du Sart, Desiree; Fabre, Aurelie; Farrell, Michael P; Farrington, Susan M; Frayling, Ian M; Frebourg, Thierry; Goldgar, David E; Heinen, Christopher D; Holinski-Feder, Elke; Kohonen-Corish, Maija; Robinson, Kristina Lagerstedt; Leung, Suet Yi; Martins, Alexandra; Moller, Pal; Morak, Monika; Nystrom, Minna; Peltomaki, Paivi; Pineda, Marta; Qi, Ming; Ramesar, Rajkumar; Rasmussen, Lene Juel; Royer-Pokora, Brigitte; Scott, Rodney J; Sijmons, Rolf; Tavtigian, Sean V; Tops, Carli M; Weber, Thomas; Wijnen, Juul; Woods, Michael O; Macrae, Finlay; Genuardi, Maurizio

    2014-02-01

    The clinical classification of hereditary sequence variants identified in disease-related genes directly affects clinical management of patients and their relatives. The International Society for Gastrointestinal Hereditary Tumours (InSiGHT) undertook a collaborative effort to develop, test and apply a standardized classification scheme to constitutional variants in the Lynch syndrome-associated genes MLH1, MSH2, MSH6 and PMS2. Unpublished data submission was encouraged to assist in variant classification and was recognized through microattribution. The scheme was refined by multidisciplinary expert committee review of the clinical and functional data available for variants, applied to 2,360 sequence alterations, and disseminated online. Assessment using validated criteria altered classifications for 66% of 12,006 database entries. Clinical recommendations based on transparent evaluation are now possible for 1,370 variants that were not obviously protein truncating from nomenclature. This large-scale endeavor will facilitate the consistent management of families suspected to have Lynch syndrome and demonstrates the value of multidisciplinary collaboration in the curation and classification of variants in public locus-specific databases.

  5. Searching for disability in electronic databases of published literature.

    PubMed

    Walsh, Emily S; Peterson, Jana J; Judkins, Dolores Z

    2014-01-01

    As researchers in disability and health conduct systematic reviews with greater frequency, the definition of disability used in these reviews gains importance. Translating a comprehensive conceptual definition of "disability" into an operational definition that utilizes electronic databases in the health sciences is a difficult step necessary for performing systematic literature reviews in the field. Consistency of definition across studies will help build a body of evidence that is comparable and amenable to synthesis. To illustrate a process for operationalizing the World Health Organization's International Classification of Functioning, Disability and Health concept of disability for MEDLINE, PsycINFO, and CINAHL databases. We created an electronic search strategy in conjunction with a reference librarian and an expert panel. Quality control steps included comparison of search results to results of a search for a specific disabling condition and to articles nominated by the expert panel. The complete search strategy is presented. Results of the quality control steps indicated that our strategy was sufficiently sensitive and specific. Our search strategy will be valuable to researchers conducting literature reviews on broad populations with disabilities. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Application of a five-tiered scheme for standardized classification of 2,360 unique mismatch repair gene variants lodged on the InSiGHT locus-specific database

    PubMed Central

    Plazzer, John-Paul; Greenblatt, Marc S.; Akagi, Kiwamu; Al-Mulla, Fahd; Bapat, Bharati; Bernstein, Inge; Capellá, Gabriel; den Dunnen, Johan T.; du Sart, Desiree; Fabre, Aurelie; Farrell, Michael P.; Farrington, Susan M.; Frayling, Ian M.; Frebourg, Thierry; Goldgar, David E.; Heinen, Christopher D.; Holinski-Feder, Elke; Kohonen-Corish, Maija; Robinson, Kristina Lagerstedt; Leung, Suet Yi; Martins, Alexandra; Moller, Pal; Morak, Monika; Nystrom, Minna; Peltomaki, Paivi; Pineda, Marta; Qi, Ming; Ramesar, Rajkumar; Rasmussen, Lene Juel; Royer-Pokora, Brigitte; Scott, Rodney J.; Sijmons, Rolf; Tavtigian, Sean V.; Tops, Carli M.; Weber, Thomas; Wijnen, Juul; Woods, Michael O.; Macrae, Finlay; Genuardi, Maurizio

    2015-01-01

    Clinical classification of sequence variants identified in hereditary disease genes directly affects clinical management of patients and their relatives. The International Society for Gastrointestinal Hereditary Tumours (InSiGHT) undertook a collaborative effort to develop, test and apply a standardized classification scheme to constitutional variants in the Lynch Syndrome genes MLH1, MSH2, MSH6 and PMS2. Unpublished data submission was encouraged to assist variant classification, and recognized by microattribution. The scheme was refined by multidisciplinary expert committee review of clinical and functional data available for variants, applied to 2,360 sequence alterations, and disseminated online. Assessment using validated criteria altered classifications for 66% of 12,006 database entries. Clinical recommendations based on transparent evaluation are now possible for 1,370 variants not obviously protein-truncating from nomenclature. This large-scale endeavor will facilitate consistent management of suspected Lynch Syndrome families, and demonstrates the value of multidisciplinary collaboration for curation and classification of variants in public locus-specific databases. PMID:24362816

  7. The Nencki Affective Picture System (NAPS): introduction to a novel, standardized, wide-range, high-quality, realistic picture database.

    PubMed

    Marchewka, Artur; Zurawski, Łukasz; Jednoróg, Katarzyna; Grabowska, Anna

    2014-06-01

    Selecting appropriate stimuli to induce emotional states is essential in affective research. Only a few standardized affective stimulus databases have been created for auditory, language, and visual materials. Numerous studies have extensively employed these databases using both behavioral and neuroimaging methods. However, some limitations of the existing databases have recently been reported, including limited numbers of stimuli in specific categories or poor picture quality of the visual stimuli. In the present article, we introduce the Nencki Affective Picture System (NAPS), which consists of 1,356 realistic, high-quality photographs that are divided into five categories (people, faces, animals, objects, and landscapes). Affective ratings were collected from 204 mostly European participants. The pictures were rated according to the valence, arousal, and approach-avoidance dimensions using computerized bipolar semantic slider scales. Normative ratings for the categories are presented for each dimension. Validation of the ratings was obtained by comparing them to ratings generated using the Self-Assessment Manikin and the International Affective Picture System. In addition, the physical properties of the photographs are reported, including luminance, contrast, and entropy. The new database, with accompanying ratings and image parameters, allows researchers to select a variety of visual stimulus materials specific to their experimental questions of interest. The NAPS system is freely accessible to the scientific community for noncommercial use by request at http://naps.nencki.gov.pl .
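
    The physical image properties mentioned (luminance, contrast, entropy) can be computed roughly as follows; this is a generic sketch for a grayscale image, and the exact definitions used by the NAPS authors may differ:

```python
# Sketch: mean luminance, RMS contrast and Shannon entropy for a grayscale image.
# Illustrative definitions only; the NAPS authors' exact measures may differ.
import numpy as np

def image_properties(gray: np.ndarray) -> dict:
    """gray: 2-D array of pixel intensities in [0, 255]."""
    luminance = gray.mean()
    contrast = gray.std()                       # RMS contrast
    hist, _ = np.histogram(gray, bins=256, range=(0, 256), density=True)
    p = hist[hist > 0]
    entropy = -(p * np.log2(p)).sum()           # Shannon entropy in bits
    return {"luminance": luminance, "contrast": contrast, "entropy": entropy}

demo = np.random.default_rng(0).integers(0, 256, size=(64, 64)).astype(float)
print(image_properties(demo))
```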

  8. Expanding Internationally: OCLC Gears Up.

    ERIC Educational Resources Information Center

    Chepesiuk, Ron

    1997-01-01

    Describes the Online Computer Library Center (OCLC) efforts in China, Germany, Canada, Scotland, Jamaica and Brazil. Discusses FirstSearch, an end-user reference service, and WorldCat, a bibliographic database. Highlights international projects developing increased OCLC online availability, database loading software, CD-ROM cataloging,…

  9. Measuring spirituality and religiosity in clinical research: a systematic review of instruments available in the Portuguese language.

    PubMed

    Lucchetti, Giancarlo; Lucchetti, Alessandra Lamas Granero; Vallada, Homero

    2013-01-01

    Despite numerous spirituality and/or religiosity (S/R) measurement tools for use in research worldwide, there is little information on S/R instruments in the Portuguese language. The aim of the present study was to map out the S/R scales available for research in the Portuguese language. Systematic review of studies found in databases. A systematic review was conducted in three phases. Phases 1 and 2: articles in Portuguese, Spanish and English, published up to November 2011, dealing with the Portuguese translation and/or validation of S/R measurement tools for clinical research, were selected from six databases. Phase 3: the instruments were grouped according to authorship, cross-cultural adaptation, internal consistency, concurrent and discriminative validity and test-retest procedures. Twenty instruments were found. Forty-five percent of these evaluated religiosity, 40% spirituality, 10% religious/spiritual coping and 5% S/R. Among these, 90% had been produced in (n = 3) or translated to (n = 15) Brazilian Portuguese and two (10%) solely to European Portuguese. Nevertheless, the majority of the instruments had not undergone in-depth psychometric analysis. Only 40% of the instruments presented concurrent validity, 45% discriminative validity and 15% a test-retest procedure. The characteristics of each instrument were analyzed separately, yielding advantages, disadvantages and psychometric properties. Currently, 20 instruments for measuring S/R are available in the Portuguese language. Most have been translated (n = 15) or developed (n = 3) in Brazil and present good internal consistency. Nevertheless, few instruments have been assessed regarding all their psychometric qualities.

  10. Objectively measured physical activity and sedentary time in youth: the International children's accelerometry database (ICAD).

    PubMed

    Cooper, Ashley R; Goodman, Anna; Page, Angie S; Sherar, Lauren B; Esliger, Dale W; van Sluijs, Esther M F; Andersen, Lars Bo; Anderssen, Sigmund; Cardon, Greet; Davey, Rachel; Froberg, Karsten; Hallal, Pedro; Janz, Kathleen F; Kordas, Katarzyna; Kreimler, Susi; Pate, Russ R; Puder, Jardena J; Reilly, John J; Salmon, Jo; Sardinha, Luis B; Timperio, Anna; Ekelund, Ulf

    2015-09-17

    Physical activity and sedentary behaviour in youth have been reported to vary by sex, age, weight status and country. However, supporting data are often self-reported and/or do not encompass a wide range of ages or geographical locations. This study aimed to describe objectively-measured physical activity and sedentary time patterns in youth. The International Children's Accelerometry Database (ICAD) consists of ActiGraph accelerometer data from 20 studies in ten countries, processed using common data reduction procedures. Analyses were conducted on 27,637 participants (2.8-18.4 years) who provided at least three days of valid accelerometer data. Linear regression was used to examine associations between age, sex, weight status, country and physical activity outcomes. Boys were less sedentary and more active than girls at all ages. After 5 years of age there was an average cross-sectional decrease of 4.2% in total physical activity with each additional year of age, due mainly to lower levels of light-intensity physical activity and greater time spent sedentary. Physical activity did not differ by weight status in the youngest children, but from age seven onwards, overweight/obese participants were less active than their normal weight counterparts. Physical activity varied between samples from different countries, with a 15-20% difference between the highest and lowest countries at age 9-10 and a 26-28% difference at age 12-13. Physical activity differed between samples from different countries, but the associations between demographic characteristics and physical activity were consistently observed. Further research is needed to explore environmental and sociocultural explanations for these differences.
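
    A schematic of the kind of linear regression described, using invented data and statsmodels as one possible library; the ICAD analyses additionally accounted for study-level structure and other covariates not shown here:

```python
# Sketch: relating a physical-activity outcome to age, sex and weight status
# with ordinary least squares. Data and coefficients are invented.
import numpy as np
import statsmodels.api as sm  # one possible library choice

rng = np.random.default_rng(1)
n = 200
age = rng.uniform(3, 18, n)
male = rng.integers(0, 2, n)
overweight = rng.integers(0, 2, n)
# toy outcome: accelerometer counts per minute decline with age, higher in boys
cpm = 800 - 20 * age + 60 * male - 40 * overweight + rng.normal(0, 50, n)

X = sm.add_constant(np.column_stack([age, male, overweight]))
model = sm.OLS(cpm, X).fit()
print(model.params)  # intercept, age, sex and weight-status coefficients
```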

  11. Introduction of the American Academy of Facial Plastic and Reconstructive Surgery FACE TO FACE Database.

    PubMed

    Abraham, Manoj T; Rousso, Joseph J; Hu, Shirley; Brown, Ryan F; Moscatello, Augustine L; Finn, J Charles; Patel, Neha A; Kadakia, Sameep P; Wood-Smith, Donald

    2017-07-01

    The American Academy of Facial Plastic and Reconstructive Surgery FACE TO FACE database was created to gather and organize patient data primarily from international humanitarian surgical mission trips, as well as local humanitarian initiatives. Similar to cloud-based Electronic Medical Records, this web-based user-generated database allows for more accurate tracking of provider and patient information and outcomes, regardless of site, and is useful when coordinating follow-up care for patients. The database is particularly useful on international mission trips as there are often different surgeons who may provide care to patients on subsequent missions, and patients who may visit more than 1 mission site. Ultimately, by pooling data across multiple sites and over time, the database has the potential to be a useful resource for population-based studies and outcome data analysis. The objective of this paper is to delineate the process involved in creating the AAFPRS FACE TO FACE database, to assess its functional utility, to draw comparisons to electronic medical records systems that are now widely implemented, and to explain the specific benefits and disadvantages of the use of the database as it was implemented on recent international surgical mission trips.

  12. Description of the process used to create 1992 Hanford Mortality Study database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, E.S.; Buchanan, J.A.; Holter, N.A.

    1992-12-01

    An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files maintained by the Hanford Environmental Health Foundation (HEHF) and including demographic data and job histories; the Hanford Mortality (HMO) files also maintained by HEHF and including information on deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files maintained by PNL's Health Physics Department and containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.

  13. Description of the process used to create 1992 Hanford Mortality Study database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, E. S.; Buchanan, J. A.; Holter, N. A.

    1992-12-01

    An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files maintained by the Hanford Environmental Health Foundation (HEHF) and including demographic data and job histories; the Hanford Mortality (HMO) files also maintained by HEHF and including information on deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files maintained by PNL's Health Physics Department and containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.

  14. From Chaos to Content: An Integrated Approach to Government Web Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demuth, Nora H.; Knudson, Christa K.

    2005-01-03

    The web development team of the Environmental Technology Directorate (ETD) at the U.S. Department of Energy’s Pacific Northwest National Laboratory (PNNL) redesigned the ETD website as a database-driven system, powered by the newly designed ETD Common Information System (ETD-CIS). The ETD website was redesigned in response to an analysis that showed the previous ETD websites were inefficient, costly, and lacking in a consistent focus. Redesigned and newly created websites based on a new ETD template provide a consistent image, meet or exceed accessibility standards, and are linked through a common database. The protocols used in developing the ETD website support integration of further organizational sites and facilitate internal use by staff and training on ETD website development and maintenance. Other PNNL organizations have approached the ETD web development team with an interest in applying the methods established by the ETD system. The ETD system protocol could potentially be used by other DOE laboratories to improve their website efficiency and content focus. “The tools by which we share science information must be as extraordinary as the information itself.” – DOE Science Director Raymond Orbach

  15. Evaluation of the mining techniques in constructing a traditional Chinese-language nursing recording system.

    PubMed

    Liao, Pei-Hung; Chu, William; Chu, Woei-Chyn

    2014-05-01

    In 2009, the Department of Health, part of Taiwan's Executive Yuan, announced the advent of electronic medical records to reduce medical expenses and facilitate the international exchange of medical record information. An information technology platform for nursing records in medical institutions was then quickly established, which improved nursing information systems and electronic databases. The purpose of the present study was to explore the usability of data mining techniques to enhance completeness and ensure consistency of nursing records in the database system. First, the study used a Chinese word-segmenting system on common and special terms often used by the nursing staff. We also used text-mining techniques to collect keywords and create a keyword lexicon. We then used association rules and an artificial neural network to measure the correlation and forecasting capability for keywords. Finally, nursing staff members were provided with an on-screen pop-up menu to use when establishing nursing records. Our study found that by using mining techniques we were able to create a powerful keyword lexicon and establish a forecasting model for nursing diagnoses, ensuring the consistency of nursing terminology and improving the nursing staff's work efficiency and productivity.
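
    A minimal illustration of deriving association-rule statistics (support and confidence) between co-occurring keywords; the records are invented, and the study's Chinese word-segmentation and neural-network forecasting steps are not shown:

```python
# Sketch: support and confidence for keyword pairs that co-occur in nursing
# records. Toy data only; not the study's actual pipeline.
from itertools import combinations
from collections import Counter

records = [
    {"pain", "analgesic", "vital signs"},
    {"pain", "analgesic"},
    {"wound", "dressing change"},
    {"pain", "vital signs"},
]

pair_counts = Counter()
item_counts = Counter()
for rec in records:
    item_counts.update(rec)
    pair_counts.update(combinations(sorted(rec), 2))

n = len(records)
for (a, b), c in pair_counts.items():
    support = c / n
    confidence = c / item_counts[a]      # confidence of rule a -> b
    print(f"{a} -> {b}: support={support:.2f}, confidence={confidence:.2f}")
```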

  16. An internally consistent set of thermodynamic data for twenty-one CaO-Al2O3-SiO2-H2O phases by linear parametric programming

    NASA Astrophysics Data System (ADS)

    Halbach, Heiner; Chatterjee, Niranjan D.

    1984-11-01

    The technique of linear parametric programming has been applied to derive sets of internally consistent thermodynamic data for 21 condensed phases of the quaternary system CaO-Al2O3-SiO2-H2O (CASH) (Table 4). This was achieved by simultaneously processing: a) calorimetric data for 16 of these phases (Table 1), and b) experimental phase equilibria reversal brackets for 27 reactions (Table 3) involving these phases. Calculation of equilibrium P-T curves of several arbitrarily picked reactions employing the preferred set of internally consistent thermodynamic data from Table 4 shows that the input brackets are invariably satisfied by the calculations (Fig. 2a). By contrast, the same equilibria calculated on the basis of a set of thermodynamic data derived by applying statistical methods to a large body of comparable input data (Haas et al. 1981; Hemingway et al. 1982) do not necessarily agree with the experimental reversal brackets. Prediction of some experimentally investigated phase relations not included into the linear programming input database also appears to be remarkably successful. Indications are, therefore, that the thermodynamic data listed in Table 4 may be used with confidence to predict geologic phase relations in the CASH system with considerable accuracy. For such calculated phase diagrams and their petrological implications, the reader's attention is drawn to the paper by Chatterjee et al. (1984).
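
    A schematic of how experimental reversal half-brackets translate into linear inequality constraints that a linear program can satisfy; the reaction, entropy value and bracket temperatures below are invented, and the paper's actual parametric formulation is far richer:

```python
# Sketch: casting reversal half-brackets as linear inequalities on unknown
# enthalpies of formation and solving with linear programming. Schematic only.
import numpy as np
from scipy.optimize import linprog

# Reaction: A = B (1 mol each). Unknowns x = [dHf_A, dHf_B] in J/mol.
# dG_r(T) = (dHf_B - dHf_A) - T * dS_r, with dS_r treated as known here.
dS_r = 10.0  # J/(mol K), assumed known from calorimetry

# Half-brackets: at 600 K product B grows (dG_r <= 0); at 500 K A grows (dG_r >= 0).
# Rewritten as A_ub @ x <= b_ub:
A_ub = np.array([
    [-1.0,  1.0],   #  (dHf_B - dHf_A) <= 600*dS_r
    [ 1.0, -1.0],   # -(dHf_B - dHf_A) <= -500*dS_r
])
b_ub = np.array([600.0 * dS_r, -500.0 * dS_r])

# Objective (arbitrary for this sketch): minimize the enthalpy difference
# dHf_B - dHf_A subject to the bracket constraints.
res = linprog(c=[-1.0, 1.0], A_ub=A_ub, b_ub=b_ub,
              bounds=[(-1e6, 1e6), (-1e6, 1e6)])
print(res.status, res.x, res.x[1] - res.x[0])
```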

  17. JDD, Inc. Database

    NASA Technical Reports Server (NTRS)

    Miller, David A., Jr.

    2004-01-01

    JDD, Inc. is a maintenance and custodial contracting company whose mission is to provide its clients in the private and government sectors "quality construction, construction management and cleaning services in the most efficient and cost effective manners" (JDD, Inc. Mission Statement). The company provides facilities support for Fort Riley in Fort Riley, Kansas and the NASA John H. Glenn Research Center at Lewis Field here in Cleveland, Ohio. JDD, Inc. is owned and operated by James Vaughn, who started as a painter at NASA Glenn and has been working here for the past seventeen years. This summer I worked under Devan Anderson, who is the safety manager for JDD, Inc. in the Logistics and Technical Information Division (LTID) at Glenn Research Center. The LTID provides all transportation, secretarial, and security needs and contract management of these various services for the center. As safety manager, my mentor provides Occupational Safety and Health Administration (OSHA) compliance for all JDD, Inc. employees and handles all other issues relating to job safety (Environmental Protection Agency issues, workers' compensation, and safety and health training). My summer assignment was not considered "groundbreaking research" like that of many other summer interns in the past, but it is just as important and beneficial to JDD, Inc. I initially created a database in Microsoft Excel to classify and categorize data pertaining to the safety training certification courses instructed by our safety manager during the fiscal year. This early portion of the database consisted only of data from the training certification courses (training field index, and employees who were present at or absent from these courses). Once I completed this phase of the database, I decided to expand it and add as many dimensions to it as possible. Throughout the last seven weeks, I have been compiling more data from day-to-day operations and adding that information to the database. It now consists of seven categories of data (carpet cleaning, forms, NASA Event Schedules, training certifications, wall and vent cleaning, work schedules, and miscellaneous). I also did some field inspecting with the supervisors around the site and was present at all of the training certification courses scheduled since June 2004. My future outlook for the JDD, Inc. database is to have all of the company's information, from future contract proposals and weekly inventory to employee timesheets, in this same database.

  18. Standardizing terminology and definitions of medication adherence and persistence in research employing electronic databases.

    PubMed

    Raebel, Marsha A; Schmittdiel, Julie; Karter, Andrew J; Konieczny, Jennifer L; Steiner, John F

    2013-08-01

    To propose a unifying set of definitions for prescription adherence research utilizing electronic health record prescribing databases, prescription dispensing databases, and pharmacy claims databases and to provide a conceptual framework to operationalize these definitions consistently across studies. We reviewed recent literature to identify definitions in electronic database studies of prescription-filling patterns for chronic oral medications. We then develop a conceptual model and propose standardized terminology and definitions to describe prescription-filling behavior from electronic databases. The conceptual model we propose defines 2 separate constructs: medication adherence and persistence. We define primary and secondary adherence as distinct subtypes of adherence. Metrics for estimating secondary adherence are discussed and critiqued, including a newer metric (New Prescription Medication Gap measure) that enables estimation of both primary and secondary adherence. Terminology currently used in prescription adherence research employing electronic databases lacks consistency. We propose a clear, consistent, broadly applicable conceptual model and terminology for such studies. The model and definitions facilitate research utilizing electronic medication prescribing, dispensing, and/or claims databases and encompasses the entire continuum of prescription-filling behavior. Employing conceptually clear and consistent terminology to define medication adherence and persistence will facilitate future comparative effectiveness research and meta-analytic studies that utilize electronic prescription and dispensing records.
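
    As one concrete illustration of a secondary-adherence metric computed from dispensing records, the sketch below calculates the common proportion-of-days-covered (PDC) measure; it is not the paper's New Prescription Medication Gap measure, and the fill data are invented:

```python
# Sketch: proportion of days covered (PDC) from fill dates and days supplied.
# NOT the paper's New Prescription Medication Gap measure; illustration only.
from datetime import date, timedelta

fills = [(date(2023, 1, 1), 30), (date(2023, 2, 5), 30), (date(2023, 4, 1), 30)]
obs_start, obs_end = date(2023, 1, 1), date(2023, 4, 30)

covered = set()
for fill_date, days_supply in fills:
    for d in range(days_supply):
        day = fill_date + timedelta(days=d)
        if obs_start <= day <= obs_end:
            covered.add(day)

observation_days = (obs_end - obs_start).days + 1
pdc = len(covered) / observation_days
print(f"PDC = {pdc:.2f}")   # e.g. "adherent" if PDC >= 0.80 by a common convention
```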

  19. Vocabulary Control and the Humanities: A Case Study of the "MLA International Bibliography."

    ERIC Educational Resources Information Center

    Stebelman, Scott

    1994-01-01

    Discussion of research in the humanities focuses on the "MLA International Bibliography," the primary database for literary research. Highlights include comparisons to research in the sciences; humanities vocabulary; database search techniques; contextual indexing; examples of searches; thesauri; and software. (43 references) (LRW)

  20. International patent applications for non-injectable naloxone for opioid overdose reversal: Exploratory search and retrieve analysis of the PatentScope database.

    PubMed

    McDonald, Rebecca; Danielsson Glende, Øyvind; Dale, Ola; Strang, John

    2018-02-01

    Non-injectable naloxone formulations are being developed for opioid overdose reversal, but only limited data have been published in the peer-reviewed domain. Through examination of a hitherto-unsearched database, we expand public knowledge of non-injectable formulations, tracing their development and novelty, with the aim to describe and compare their pharmacokinetic properties. (i) The PatentScope database of the World Intellectual Property Organization was searched for relevant English-language patent applications; (ii) Pharmacokinetic data were extracted, collated and analysed; (iii) PubMed was searched using Boolean search query '(nasal OR intranasal OR nose OR buccal OR sublingual) AND naloxone AND pharmacokinetics'. Five hundred and twenty-two PatentScope and 56 PubMed records were identified: three published international patent applications and five peer-reviewed papers were eligible. Pharmacokinetic data were available for intranasal, sublingual, and reference routes. Highly concentrated formulations (10-40 mg/mL) had been developed and tested. Sublingual bioavailability was very low (1%; relative to intravenous). Non-concentrated intranasal spray (1 mg/mL; 1 mL per nostril) had low bioavailability (11%). Concentrated intranasal formulations (≥10 mg/mL) had bioavailability of 21-42% (relative to intravenous) and 26-57% (relative to intramuscular), with peak concentrations (dose-adjusted Cmax = 0.8-1.7 ng/mL) reached in 19-30 min (tmax). Exploratory analysis identified intranasal bioavailability as associated positively with dose and negatively with volume. We find consistent direction of development of intranasal sprays to high-concentration, low-volume formulations with bioavailability in the 20-60% range. These have potential to deliver a therapeutic dose in 0.1 mL volume. [McDonald R, Danielsson Glende Ø, Dale O, Strang J. International patent applications for non-injectable naloxone for opioid overdose reversal: Exploratory search and retrieve analysis of the PatentScope database. Drug Alcohol Rev 2017;00:000-000]. © 2017 Australasian Professional Society on Alcohol and other Drugs.
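
    The dose-adjusted relative bioavailability figures quoted above follow from a simple ratio of dose-normalized exposures; a sketch with invented AUC values:

```python
# Sketch: dose-adjusted relative bioavailability from AUC values, the kind of
# calculation behind figures such as "21-42% relative to intravenous".
# The AUC and dose numbers below are invented for illustration.
def relative_bioavailability(auc_test, dose_test, auc_ref, dose_ref):
    """F_rel = (AUC_test/dose_test) / (AUC_ref/dose_ref)."""
    return (auc_test / dose_test) / (auc_ref / dose_ref)

# e.g. intranasal 4 mg vs intravenous 0.4 mg (hypothetical AUCs, ng*h/mL)
f_rel = relative_bioavailability(auc_test=8.0, dose_test=4.0,
                                 auc_ref=5.0, dose_ref=0.4)
print(f"Relative bioavailability: {f_rel:.0%}")   # -> 16%
```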

  1. Online access to international aerospace science and technology

    NASA Technical Reports Server (NTRS)

    Lahr, Thomas F.; Harrison, Laurie K.

    1993-01-01

    The NASA Aerospace Database contains over 625,000 foreign R&D documents from 1962 to the present from over 60 countries worldwide. In 1991 over 26,000 new non-U.S. entries were added from a variety of innovative exchange programs. An active international acquisitions effort by the NASA STI Program seeks to increase the percentage of foreign data in the coming years, focusing on Japan, the Commonwealth of Independent States, Western Europe, Australia, and Canada. It also has plans to target China, India, Brazil, and Eastern Europe in the future. The authors detail the resources the NASA Aerospace Database offers in the international arena, the methods used to gather this information, and the STI Program's initiatives for maintaining and expanding the percentage of international information in this database.

  2. TEDS-M 2008 User Guide for the International Database. Supplement 3: Variables Derived from the Educator and Future Teacher Data

    ERIC Educational Resources Information Center

    Brese, Falk, Ed.

    2012-01-01

    This supplement contains documentation on all the derived variables contained in the TEDS-M educator and future teacher data files. These derived variables were used to report data in the TEDS-M international reports. The variables that constitute the scales and indices are made available as part of the TEDS-M International Database to be used in…

  3. Hardwood log defect photographic database, software and user's guide

    Treesearch

    R. Edward Thomas

    2009-01-01

    Computer software and user's guide for Hardwood Log Defect Photographic Database. The database contains photographs and information on external hardwood log defects and the corresponding internal characteristics. This database allows users to search for specific defect types, sizes, and locations by tree species. For every defect, the database contains photos of...

  4. Recent International Documents and Journal Articles from the ERIC Database.

    ERIC Educational Resources Information Center

    International Journal of Early Years Education, 1998

    1998-01-01

    Annotates recent international documents and journal articles from the ERIC database. Document topics include racial equality, and balancing early childhood education and work. Journal article topics include foster care in Iraqi Kurdistan; child care in Sweden; teacher-child interaction in Australian centers; teacher education in Brazil, Iceland,…

  5. Knowledge Creation through User-Guided Data Mining: A Database Case

    ERIC Educational Resources Information Center

    Steiger, David M.

    2008-01-01

    This case focuses on learning by applying the four integrating mechanisms of Nonaka's knowledge creation theory: socialization, externalization, combination and internalization. In general, such knowledge creation and internalization (i.e., learning) is critical to database students since they will be expected to apply their specialized database…

  6. Are nutrient databases and nutrient analysis systems ready for the International implications of nutrigenomics?

    USDA-ARS?s Scientific Manuscript database

    Our objective is to discuss the implications internationally of the increased focus on nutrigenomics as the underlying basis for individualized health promotion and chronic disease prevention and the challenges presented to existing nutrient database and nutrient analysis systems by these trends. De...

  7. International migration and caesarean birth: a systematic review and meta-analysis.

    PubMed

    Merry, Lisa; Small, Rhonda; Blondel, Béatrice; Gagnon, Anita J

    2013-01-30

    Perinatal health disparities including disparities in caesarean births have been observed between migrant and non-migrant women and some literature suggests that non-medical factors may be implicated. A systematic review was conducted to determine if migrants in Western industrialized countries consistently have different rates of caesarean than receiving-country-born women and to identify the reasons that explain these differences. Reports were identified by searching 12 literature databases (from inception to January 2012; no language limits) and the web, by bibliographic citation hand-searches and through key informants. Studies that compared caesarean rates between international migrants and non-migrants living in industrialized countries and that did not have a 'fatal flaw' according to the US Preventative Services Task Force criteria were included. Studies were summarized, analyzed descriptively and where possible, meta-analyzed. Seventy-six studies met inclusion criteria. Caesarean rates between migrants and non-migrants differed in 69% of studies. Meta-analyses revealed consistently higher overall caesarean rates for Sub-Saharan African, Somali and South Asian women; higher emergency rates for North African/West Asian and Latin American women; and lower overall rates for Eastern European and Vietnamese women. Evidence to explain the consistently different rates was limited. Frequently postulated risk factors for caesarean included: language/communication barriers, low SES, poor maternal health, GDM/high BMI, feto-pelvic disproportion, and inadequate prenatal care. Suggested protective factors included: a healthy immigrant effect, preference for a vaginal birth, a healthier lifestyle, younger mothers and the use of fewer interventions during childbirth. Certain groups of international migrants consistently have different caesarean rates than receiving-country-born women. There is insufficient evidence to explain the observed differences.
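
    Where meta-analysis was possible, study-level effects can be pooled with a random-effects model; the sketch below shows a generic DerSimonian-Laird pooling of invented log risk ratios, not the review's actual analysis:

```python
# Sketch: DerSimonian-Laird random-effects pooling of study-level log risk
# ratios. Numbers are invented; shown only to illustrate the meta-analysis step.
import numpy as np

log_rr = np.array([np.log(1.4), np.log(1.2), np.log(1.6)])   # per-study log risk ratios
se = np.array([0.10, 0.15, 0.20])                             # their standard errors

w_fixed = 1 / se**2
q = np.sum(w_fixed * (log_rr - np.average(log_rr, weights=w_fixed))**2)
df = len(log_rr) - 1
c = w_fixed.sum() - (w_fixed**2).sum() / w_fixed.sum()
tau2 = max(0.0, (q - df) / c)                                  # between-study variance

w_rand = 1 / (se**2 + tau2)
pooled = np.average(log_rr, weights=w_rand)
pooled_se = np.sqrt(1 / w_rand.sum())
print(f"Pooled RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*pooled_se):.2f}-{np.exp(pooled + 1.96*pooled_se):.2f})")
```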

  8. International migration and caesarean birth: a systematic review and meta-analysis

    PubMed Central

    2013-01-01

    Background Perinatal health disparities including disparities in caesarean births have been observed between migrant and non-migrant women and some literature suggests that non-medical factors may be implicated. A systematic review was conducted to determine if migrants in Western industrialized countries consistently have different rates of caesarean than receiving-country-born women and to identify the reasons that explain these differences. Methods Reports were identified by searching 12 literature databases (from inception to January 2012; no language limits) and the web, by bibliographic citation hand-searches and through key informants. Studies that compared caesarean rates between international migrants and non-migrants living in industrialized countries and that did not have a ‘fatal flaw’ according to the US Preventative Services Task Force criteria were included. Studies were summarized, analyzed descriptively and where possible, meta-analyzed. Results Seventy-six studies met inclusion criteria. Caesarean rates between migrants and non-migrants differed in 69% of studies. Meta-analyses revealed consistently higher overall caesarean rates for Sub-Saharan African, Somali and South Asian women; higher emergency rates for North African/West Asian and Latin American women; and lower overall rates for Eastern European and Vietnamese women. Evidence to explain the consistently different rates was limited. Frequently postulated risk factors for caesarean included: language/communication barriers, low SES, poor maternal health, GDM/high BMI, feto-pelvic disproportion, and inadequate prenatal care. Suggested protective factors included: a healthy immigrant effect, preference for a vaginal birth, a healthier lifestyle, younger mothers and the use of fewer interventions during childbirth. Conclusion Certain groups of international migrants consistently have different caesarean rates than receiving-country-born women. There is insufficient evidence to explain the observed differences. PMID:23360183

  9. Extant and Extinct Lunar Regolith Simulants: Modal Analyses of NU-LHT-1M and -2m, OB-1, JSC-1, JSC-1A and -1AF,FJS-1, and MLS-1

    NASA Technical Reports Server (NTRS)

    Schrader, Christian; Rickman, Doug; McLemore, Carole; Fikes, John; Wilson, Stephen; Stoeser, Doug; Butcher, Alan; Botha, Pieter

    2008-01-01

    This work is part of a larger effort to compile an internally consistent database on lunar regolith (Apollo samples) and lunar regolith simulants. The effort characterizes existing lunar regolith and simulants in terms of: a) particle type; b) particle size distribution; c) particle shape distribution; d) bulk density; and e) other compositional characteristics, and evaluates regolith simulants (Figure of Merit) against lunar regolith (Apollo samples) on these properties. This presentation covers new data on lunar simulants.

  10. Using non-local databases for the environmental assessment of industrial activities: The case of Latin America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osses de Eicker, Margarita, E-mail: Margarita.Osses@empa.c; Hischier, Roland, E-mail: Roland.Hischier@empa.c; Hurni, Hans, E-mail: Hans.Hurni@cde.unibe.c

    2010-04-15

    Nine non-local databases were evaluated with respect to their suitability for the environmental assessment of industrial activities in Latin America. Three assessment methods were considered, namely Life Cycle Assessment (LCA), Environmental Impact Assessment (EIA) and air emission inventories. The analysis focused on data availability in the databases and the applicability of their international data to Latin American industry. The study showed that the European EMEP/EEA Guidebook and the U.S. EPA AP-42 database are the most suitable ones for air emission inventories, whereas the LCI database Ecoinvent is the most suitable one for LCA and EIA. Due to the data coverage in the databases, air emission inventories are easier to develop than LCA or EIA, which require more comprehensive information. One strategy to overcome the limitations of non-local databases for Latin American industry is the combination of validated data from international databases with newly developed local datasets.

  11. The International Experimental Thermal Hydraulic Systems database – TIETHYS: A new NEA validation tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohatgi, Upendra S.

    Nuclear reactor codes require validation with appropriate data representing the plant for specific scenarios. The thermal-hydraulic data is scattered in different locations and in different formats. Some of the data is in danger of being lost. A relational database is being developed to organize the international thermal hydraulic test data for various reactor concepts and different scenarios. At the reactor system level, that data is organized to include separate effect tests and integral effect tests for specific scenarios and corresponding phenomena. The database relies on the phenomena identification sections of expert developed PIRTs. The database will provide a summary of appropriate data, review of facility information, test description, instrumentation, references for the experimental data and some examples of application of the data for validation. The current database platform includes scenarios for PWR, BWR, VVER, and specific benchmarks for CFD modelling data and is to be expanded to include references for molten salt reactors. There are place holders for high temperature gas cooled reactors, CANDU and liquid metal reactors. This relational database is called The International Experimental Thermal Hydraulic Systems (TIETHYS) database and currently resides at Nuclear Energy Agency (NEA) of the OECD and is freely open to public access. Going forward the database will be extended to include additional links and data as they become available. https://www.oecd-nea.org/tiethysweb/
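
    A minimal sketch of the kind of relational layout described (facilities, tests, scenarios and PIRT phenomena linked by keys), using hypothetical table and column names rather than the actual NEA schema:

```python
# Sketch: a toy relational layout for experimental test data. Table and column
# names are hypothetical; the actual TIETHYS schema is not given in the abstract.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE facility (id INTEGER PRIMARY KEY, name TEXT, reactor_type TEXT);
CREATE TABLE test (
    id INTEGER PRIMARY KEY,
    facility_id INTEGER REFERENCES facility(id),
    scenario TEXT,            -- e.g. 'small-break LOCA'
    kind TEXT                 -- 'separate effect' or 'integral effect'
);
CREATE TABLE phenomenon (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE test_phenomenon (     -- many-to-many link, following PIRT entries
    test_id INTEGER REFERENCES test(id),
    phenomenon_id INTEGER REFERENCES phenomenon(id)
);
""")
conn.execute("INSERT INTO facility VALUES (1, 'Example loop', 'PWR')")
conn.execute("INSERT INTO test VALUES (1, 1, 'small-break LOCA', 'integral effect')")
print(conn.execute("SELECT name, scenario FROM test JOIN facility "
                   "ON facility.id = test.facility_id").fetchall())
```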

  12. MIPS: a database for protein sequences, homology data and yeast genome information.

    PubMed Central

    Mewes, H W; Albermann, K; Heumann, K; Liebl, S; Pfeiffer, F

    1997-01-01

    The MIPS group (Martinsried Institute for Protein Sequences) at the Max-Planck-Institute for Biochemistry, Martinsried near Munich, Germany, collects, processes and distributes protein sequence data within the framework of the tripartite association of the PIR-International Protein Sequence Database. MIPS contributes nearly 50% of the data input to the PIR-International Protein Sequence Database. The database is distributed on CD-ROM together with PATCHX, an exhaustive supplement of unique, unverified protein sequences from external sources compiled by MIPS. Through its WWW server (http://www.mips.biochem.mpg.de/) MIPS permits internet access to sequence databases, homology data and to yeast genome information. (i) Sequence similarity results from the FASTA program are stored in the FASTA database for all proteins from PIR-International and PATCHX. The database is dynamically maintained and permits instant access to FASTA results. (ii) Starting with FASTA database queries, proteins have been classified into families and superfamilies (PROT-FAM). (iii) The HPT (hashed position tree) data structure developed at MIPS is a new approach for rapid sequence and pattern searching. (iv) MIPS provides access to the sequence and annotation of the complete yeast genome, the functional classification of yeast genes (FunCat) and its graphical display, the 'Genome Browser'. A CD-ROM based on the JAVA programming language providing dynamic interactive access to the yeast genome and the related protein sequences has been compiled and is available on request. PMID:9016498

  13. Software development, nomenclature schemes, and mapping strategies for an international pediatric cardiac surgery database system.

    PubMed

    Jacobs, Jeffrey P

    2002-01-01

    The field of congenital heart surgery has the opportunity to create the first comprehensive international database for a medical subspecialty. An understanding of the demographics of congenital heart disease and the rapid growth of computer technology leads to the realization that creating a comprehensive international database for pediatric cardiac surgery represents an important and achievable goal. The evolution of computer-based data analysis creates an opportunity to develop software to manage an international congenital heart surgery database and eventually become an electronic medical record. The same database data set for congenital heart surgery is now being used in Europe and North America. Additional work is under way to involve Africa, Asia, Australia, and South America. The almost simultaneous publication of the European Association for Cardio-thoracic Surgery/Society of Thoracic Surgeons coding system and the Association for European Paediatric Cardiology coding system resulted in the potential for multiple coding. Representatives of the Association for European Paediatric Cardiology, Society of Thoracic Surgeons, European Association for Cardio-thoracic Surgery, and European Congenital Heart Surgeons Foundation agree that these hierarchical systems are complementary and not competitive. An international committee will map the two systems. The ideal coding system will permit a diagnosis or procedure to be coded only one time with mapping allowing this code to be used for patient care, billing, practice management, teaching, research, and reporting to governmental agencies. The benefits of international data gathering and sharing are global, with the long-term goal of the continued upgrade in the quality of congenital heart surgery worldwide. Copyright 2002 by W.B. Saunders Company

  14. The strength of primary care in Europe: an international comparative study.

    PubMed

    Kringos, Dionne; Boerma, Wienke; Bourgueil, Yann; Cartier, Thomas; Dedeu, Toni; Hasvold, Toralf; Hutchinson, Allen; Lember, Margus; Oleszczyk, Marek; Rotar Pavlic, Danica; Svab, Igor; Tedeschi, Paolo; Wilm, Stefan; Wilson, Andrew; Windak, Adam; Van der Zee, Jouke; Groenewegen, Peter

    2013-11-01

    A suitable definition of primary care to capture the variety of prevailing international organisation and service-delivery models is lacking. Evaluation of strength of primary care in Europe. International comparative cross-sectional study performed in 2009-2010, involving 27 EU member states, plus Iceland, Norway, Switzerland, and Turkey. Outcome measures covered three dimensions of primary care structure: primary care governance, economic conditions of primary care, and primary care workforce development; and four dimensions of primary care service-delivery process: accessibility, comprehensiveness, continuity, and coordination of primary care. The primary care dimensions were operationalised by a total of 77 indicators for which data were collected in 31 countries. Data sources included national and international literature, governmental publications, statistical databases, and experts' consultations. Countries with relatively strong primary care are Belgium, Denmark, Estonia, Finland, Lithuania, the Netherlands, Portugal, Slovenia, Spain, and the UK. Countries either have many primary care policies and regulations in place, combined with good financial coverage and resources, and adequate primary care workforce conditions, or have consistently only few of these primary care structures in place. There is no correlation between the access, continuity, coordination, and comprehensiveness of primary care of countries. Variation is shown in the strength of primary care across Europe, indicating a discrepancy in the responsibility given to primary care in national and international policy initiatives and the needed investments in primary care to solve, for example, future shortages of workforce. Countries are consistent in their primary care focus on all important structure dimensions. Countries need to improve their primary care information infrastructure to facilitate primary care performance management.

  15. International Soil Carbon Network (ISCN) Database v3-1

    DOE Data Explorer

    Nave, Luke [University of Michigan] (ORCID:0000000182588335); Johnson, Kris [USDA-Forest Service; van Ingen, Catharine [Microsoft Research; Agarwal, Deborah [Lawrence Berkeley National Laboratory] (ORCID:0000000150452396); Humphrey, Marty [University of Virginia; Beekwilder, Norman [University of Virginia

    2016-01-01

    The ISCN is an international scientific community devoted to the advancement of soil carbon research. The ISCN manages an open-access, community-driven soil carbon database. This is version 3-1 of the ISCN Database, released in December 2015. It gathers 38 separate dataset contributions, totalling 67,112 sites with data from 71,198 soil profiles and 431,324 soil layers. For more information about the ISCN, its scientific community and resources, data policies and partner networks visit: http://iscn.fluxdata.org/.

  16. Fine-grained policy control in U.S. Army Research Laboratory (ARL) multimodal signatures database

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly; Grueneberg, Keith; Wood, David; Calo, Seraphin

    2014-06-01

    The U.S. Army Research Laboratory (ARL) Multimodal Signatures Database (MMSDB) consists of a number of colocated relational databases representing a collection of data from various sensors. Role-based access to this data is granted to external organizations such as DoD contractors and other government agencies through a client Web portal. In the current MMSDB system, access control is only at the database and firewall level. In order to offer finer grained security, changes to existing user profile schemas and authentication mechanisms are usually needed. In this paper, we describe a software middleware architecture and implementation that allows fine-grained access control to the MMSDB at a dataset, table, and row level. Result sets from MMSDB queries issued in the client portal are filtered with the use of a policy enforcement proxy, with minimal changes to the existing client software and database. Before resulting data is returned to the client, policies are evaluated to determine if the user or role is authorized to access the data. Policies can be authored to filter data at the row, table or column level of a result set. The system uses various technologies developed in the International Technology Alliance in Network and Information Science (ITA) for policy-controlled information sharing and dissemination1. Use of the Policy Management Library provides a mechanism for the management and evaluation of policies to support finer grained access to the data in the MMSDB system. The GaianDB is a policy-enabled, federated database that acts as a proxy between the client application and the MMSDB system.
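
    A toy illustration of policy-based filtering of query results at the row and column level before they reach the client; the roles, policies and data are invented and do not reflect the ITA Policy Management Library API or the MMSDB schema:

```python
# Sketch: a policy enforcement step that filters result rows and columns by
# role before returning them to a client. Roles, policies and data are invented.
ROLE_POLICIES = {
    "contractor": {
        "allowed_columns": {"sensor_id", "timestamp"},
        "row_filter": lambda row: row["classification"] == "unclassified",
    },
    "analyst": {
        "allowed_columns": {"sensor_id", "timestamp", "signature", "classification"},
        "row_filter": lambda row: True,
    },
}

def enforce(role: str, rows: list[dict]) -> list[dict]:
    policy = ROLE_POLICIES[role]
    kept = [r for r in rows if policy["row_filter"](r)]
    return [{k: v for k, v in r.items() if k in policy["allowed_columns"]} for r in kept]

results = [
    {"sensor_id": 7, "timestamp": "2014-06-01", "signature": "...", "classification": "unclassified"},
    {"sensor_id": 8, "timestamp": "2014-06-02", "signature": "...", "classification": "restricted"},
]
print(enforce("contractor", results))
```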

  17. Overview of the Nordic Seas CARINA data and salinity measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olsen, Are; Key, Robert; Jeansson, Emil

    2009-01-01

    Water column data of carbon and carbon relevant hydrographic and hydrochemical parameters from 188 previously non-publicly available cruises in the Arctic, Atlantic, and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon IN the Atlantic). The data have been subject to rigorous quality control (QC) in order to ensure the highest possible quality and consistency. The data for most of the parameters included were examined in order to quantify systematic biases in the reported values, i.e. secondary quality control. Significant biases have been corrected for in the data products, i.e. the three merged files with measured, calculated and interpolated values for each of the three CARINA regions: the Arctic Mediterranean Seas (AMS), the Atlantic (ATL) and the Southern Ocean (SO). With the adjustments the CARINA database is consistent both internally and with GLODAP (Key et al., 2004) and is suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation. The Arctic Mediterranean Seas include the Arctic Ocean and the Nordic Seas, and the quality control was carried out separately in these two areas. This contribution provides an overview of the CARINA data from the Nordic Seas and summarizes the findings of the QC of the salinity data. One cruise had salinity data that were of questionable quality, and these have been removed from the data product. An evaluation of the consistency of the quality controlled salinity data suggests that they are consistent to at least 0.005.
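
    A crude illustration of the kind of consistency check underlying secondary quality control: comparing deep-water salinity means from two cruises that sampled the same region and flagging offsets larger than the stated 0.005 target (the values are invented, and real crossover analyses are considerably more careful):

```python
# Sketch: a crossover-style offset check between two cruises' deep salinities.
# Values are invented; real secondary QC matches stations and depth ranges.
import numpy as np

cruise_a = np.array([34.906, 34.908, 34.905, 34.907])   # deep salinity, cruise A
cruise_b = np.array([34.903, 34.902, 34.904, 34.903])   # deep salinity, cruise B

offset = cruise_a.mean() - cruise_b.mean()
print(f"Mean offset: {offset:.4f}")
# Flag for adjustment only if the offset exceeds the consistency target (~0.005)
if abs(offset) > 0.005:
    print("Offset exceeds target; consider an additive adjustment to one cruise.")
else:
    print("Cruises consistent within the 0.005 target.")
```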

  18. InterRett, a Model for International Data Collection in a Rare Genetic Disorder

    ERIC Educational Resources Information Center

    Louise, Sandra; Fyfe, Sue; Bebbington, Ami; Bahi-Buisson, Nadia; Anderson, Alison; Pineda, Merce; Percy, Alan; Zeev, Bruria Ben; Wu, Xi Ru; Bao, Xinhua; MacLeod, Patrick; Armstrong, Judith; Leonard, Helen

    2009-01-01

    Rett syndrome (RTT) is a rare genetic disorder within the autistic spectrum. This study compared socio-demographic, clinical and genetic characteristics of the international database, InterRett, and the population-based Australian Rett syndrome database (ARSD). It also explored the strengths and limitations of InterRett in comparison with other…

  19. International energy: Research organizations, 1986--1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, P.; Jordan, S.

    The International Energy: Research Organizations publication contains the standardized names of energy research organizations used in energy information databases. Involved in this cooperative task are (1) the technical staff of the USDOE Office of Scientific and Technical Information (OSTI) in cooperation with the member countries of the Energy Technology Data Exchange (ETDE) and (2) the International Nuclear Information System (INIS). This publication identifies current organizations doing research in all energy fields, standardizes the format for recording these organization names in bibliographic citations, assigns a numeric code to facilitate data entry, and identifies report number prefixes assigned by these organizations. These research organization names may be used in searching the databases "Energy Science Technology" on DIALOG and "Energy" on STN International. These organization names are also used in USDOE databases on the Integrated Technical Information System. Research organizations active in the past five years, as indicated by database records, were identified to form this publication. This directory includes approximately 34,000 organizations that reported energy-related literature from 1986 to 1990 and updates the DOE Energy Data Base: Corporate Author Entries.

  20. IRIS Toxicological Review of Ethyl Tertiary Butyl Ether (Etbe) ...

    EPA Pesticide Factsheets

    In September 2016, EPA released the draft IRIS Toxicological Review of Ethyl Tertiary Butyl Ether (ETBE) for public comment and discussion. The draft assessment was reviewed internally by EPA and by other federal agencies and White House Offices before public release. Consistent with the May 2009 IRIS assessment development process, all written comments on IRIS assessments submitted by other federal agencies and White House Offices are made publicly available. Accordingly, interagency comments and the interagency science consultation materials provided to other agencies, including interagency review drafts of the IRIS Toxicological Review of Ethyl Tertiary Butyl Ether, are posted on this site. EPA is undertaking a new health assessment for ethyl tertiary butyl ether (ETBE) for the Integrated Risk Information System (IRIS). The outcome of this project will be a Toxicological Review and IRIS Summary of ETBE that will be entered into the IRIS database. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information in support of two steps of the risk assessment process, i.e., hazard identification and dose-response evaluation. IRIS assessments are used nationally and internationally in combination with specific situational exposure assessment information.

  1. 19 CFR 351.304 - Establishing business proprietary treatment of information.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... information. 351.304 Section 351.304 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE...) Electronic databases. In accordance with § 351.303(c)(3), an electronic database need not contain brackets... in the database. The public version of the database must be publicly summarized and ranged in...

  2. 19 CFR 351.304 - Establishing business proprietary treatment of information.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... information. 351.304 Section 351.304 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE...) Electronic databases. In accordance with § 351.303(c)(3), an electronic database need not contain brackets... in the database. The public version of the database must be publicly summarized and ranged in...

  3. 19 CFR 351.304 - Establishing business proprietary treatment of information.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... information. 351.304 Section 351.304 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE...) Electronic databases. In accordance with § 351.303(c)(3), an electronic database need not contain brackets... in the database. The public version of the database must be publicly summarized and ranged in...

  4. Spanish personal name variations in national and international biomedical databases: implications for information retrieval and bibliometric studies

    PubMed Central

    Ruiz-Pérez, R.; López-Cózar, E. Delgado; Jiménez-Contreras, E.

    2002-01-01

    Objectives: The study sought to investigate how Spanish names are handled by national and international databases and to identify mistakes that can undermine the usefulness of these databases for locating and retrieving works by Spanish authors. Methods: The authors sampled 172 articles published by authors from the University of Granada Medical School between 1987 and 1996 and analyzed the variations in how each of their names was indexed in Science Citation Index (SCI), MEDLINE, and Índice Médico Español (IME). The number and types of variants that appeared for each author's name were recorded and compared across databases to identify inconsistencies in indexing practices. We analyzed the relationship between variability (number of variants of an author's name) and productivity (number of items the name was associated with as an author), the consequences for retrieval of information, and the most frequent indexing structures used for Spanish names. Results: The proportion of authors who appeared under more than one name was 48.1% in SCI, 50.7% in MEDLINE, and 69.0% in IME. Productivity correlated directly with variability: more than 50% of the authors listed on five to ten items appeared under more than one name in any given database, and close to 100% of the authors listed on more than ten items appeared under two or more variants. Productivity correlated inversely with retrievability: as the number of variants for a name increased, the number of items retrieved under each variant decreased. For the most highly productive authors, the number of items retrieved under each variant tended toward one. The most frequent indexing methods varied between databases. In MEDLINE and IME, names were indexed correctly as “first surname second surname, first name initial middle name initial” (if present) in 41.7% and 49.5% of the records, respectively. However, in SCI, the most frequent method was “first surname, first name initial second name initial” (48.0% of the records) and first surname and second surname run together, first name initial (18.3%). Conclusions: Retrievability on the basis of author's name was poor in all three databases. Each database uses accurate indexing methods, but these methods fail to result in consistency or coherence for specific entries. The likely causes of inconsistency are: (1) use by authors of variants of their names during their publication careers, (2) lack of authority control in all three databases, (3) the use of an inappropriate indexing method for Spanish names in SCI, (4) authors' inconsistent behaviors, and (5) possible editorial interventions by some journals. We offer some suggestions as to how to avert the proliferation of author name variants in the databases. PMID:12398248
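
    A minimal sketch of how the indexing structures described above can fragment a single author identity; all names are hypothetical and the functions only mimic the variant forms reported in the study:

```python
# Hypothetical illustration of the indexing variants described in the study.
# A search under any single variant misses records indexed under the others.

def medline_ime_style(first_surname, second_surname, given_names):
    """'first surname second surname, given-name initials' (the form the study calls correct)."""
    initials = "".join(part[0] for part in given_names.split())
    return f"{first_surname} {second_surname}, {initials}"

def sci_style(first_surname, second_surname, given_names):
    """'first surname, first-name initial + second-surname initial' (a frequent SCI form)."""
    return f"{first_surname}, {given_names.split()[0][0]}{second_surname[0]}"

def run_together_style(first_surname, second_surname, given_names):
    """Both surnames run together, followed by the first-name initial."""
    return f"{first_surname}{second_surname}, {given_names.split()[0][0]}"

author = ("Garcia", "Lopez", "Maria Jose")  # hypothetical author
print({f(*author) for f in (medline_ime_style, sci_style, run_together_style)})
# -> three distinct index entries for the same person
```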

  5. An international comparative family medicine study of the Transition Project data from the Netherlands, Malta and Serbia. Is family medicine an international discipline? Comparing incidence and prevalence rates of reasons for encounter and diagnostic titles of episodes of care across populations.

    PubMed

    Soler, Jean K; Okkes, Inge; Oskam, Sibo; van Boven, Kees; Zivotic, Predrag; Jevtic, Milan; Dobbs, Frank; Lamberts, Henk

    2012-06-01

    This is a study of the epidemiology of family medicine (FM) in three practice populations from the Netherlands, Malta and Serbia. Incidence and prevalence rates, especially of reasons for encounter (RfEs) and episode labels, are compared. Participating family doctors (FDs) recorded details of all their patient contacts in an episode of care (EoC) structure using electronic patient records based on the International Classification of Primary Care (ICPC), collecting data on all elements of the doctor-patient encounter. RfEs presented by the patient, all FD interventions and the diagnostic labels (EoCs labels) recorded for each encounter were classified with ICPC (ICPC-2-E in Malta and Serbia and ICPC-1 in the Netherlands). The content of family practice in the three population databases, incidence and prevalence rates of the common top 20 RfEs and EoCs in the three databases are given. Data that are collected with an episode-based model define incidence and prevalence rates much more precisely. Incidence and prevalence rates reflect the content of the doctor-patient encounter in FM but only from a superficial perspective. However, we found evidence of an international FM core content and a local FM content reflected by important similarities in such distributions. FM is a complex discipline, and the reduction of the content of a consultation into one or more medical diagnoses, ignoring the patient's RfE, is a coarse reduction, which lacks power to fully characterize a population's health care needs. In fact, RfE distributions seem to be more consistent between populations than distributions of EoCs are, in many respects.

  6. Unification - An international aerospace information issue

    NASA Technical Reports Server (NTRS)

    Cotter, Gladys A.; Lahr, Thomas F.

    1992-01-01

    Scientific and Technical Information (STI) represents the results of large investments in research and development (R&D) and the expertise of a nation and is a valuable resource. For more than four decades, NASA and its predecessor organizations have developed and managed the preeminent aerospace information system. NASA obtains foreign materials through its international exchange relationships, continually increasing the comprehensiveness of the NASA Aerospace Database (NAD). The NAD is de facto the international aerospace database. This paper reviews current NASA goals and activities with a view toward maintaining compatibility among international aerospace information systems, eliminating duplication of effort, and sharing resources through international cooperation wherever possible.

  7. Duplicates, redundancies and inconsistencies in the primary nucleotide databases: a descriptive study.

    PubMed

    Chen, Qingyu; Zobel, Justin; Verspoor, Karin

    2017-01-01

    GenBank, the EMBL European Nucleotide Archive and the DNA DataBank of Japan, known collectively as the International Nucleotide Sequence Database Collaboration or INSDC, are the three most significant nucleotide sequence databases. Their records are derived from laboratory work undertaken by different individuals, by different teams, with a range of technologies and assumptions and over a period of decades. As a consequence, they contain a great many duplicates, redundancies and inconsistencies, but neither the prevalence nor the characteristics of various types of duplicates have been rigorously assessed. Existing duplicate detection methods in bioinformatics only address specific duplicate types, with inconsistent assumptions; and the impact of duplicates in bioinformatics databases has not been carefully assessed, making it difficult to judge the value of such methods. Our goal is to assess the scale, kinds and impact of duplicates in bioinformatics databases, through a retrospective analysis of merged groups in INSDC databases. Our outcomes are threefold: (1) We analyse a benchmark dataset consisting of duplicates manually identified in INSDC-a dataset of 67 888 merged groups with 111 823 duplicate pairs across 21 organisms from INSDC databases - in terms of the prevalence, types and impacts of duplicates. (2) We categorize duplicates at both sequence and annotation level, with supporting quantitative statistics, showing that different organisms have different prevalence of distinct kinds of duplicate. (3) We show that the presence of duplicates has practical impact via a simple case study on duplicates, in terms of GC content and melting temperature. We demonstrate that duplicates not only introduce redundancy, but can lead to inconsistent results for certain tasks. Our findings lead to a better understanding of the problem of duplication in biological databases.Database URL: the merged records are available at https://cloudstor.aarnet.edu.au/plus/index.php/s/Xef2fvsebBEAv9w. © The Author(s) 2017. Published by Oxford University Press.
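
    As a rough illustration of the quantities used in the paper's case study, the sketch below (not from the paper; the sequences are invented and the Wallace rule for melting temperature is a rule of thumb valid only for short oligonucleotides) shows how near-duplicate records can yield inconsistent derived values:

```python
# Minimal sketch: GC content and Wallace-rule melting temperature for two
# hypothetical near-duplicate records. Not the paper's data or code.

def gc_content(seq: str) -> float:
    seq = seq.upper()
    return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)

def wallace_tm(seq: str) -> float:
    seq = seq.upper()
    at = sum(seq.count(b) for b in "AT")
    gc = sum(seq.count(b) for b in "GC")
    return 2 * at + 4 * gc  # degrees Celsius; rule of thumb for short oligos only

record_a = "ATGCGCATTA"        # hypothetical original submission
record_b = "ATGCGCATTAGGGCCC"  # hypothetical duplicate with extra bases

for name, seq in (("record_a", record_a), ("record_b", record_b)):
    print(name, round(gc_content(seq), 1), wallace_tm(seq))
```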

  8. Duplicates, redundancies and inconsistencies in the primary nucleotide databases: a descriptive study

    PubMed Central

    Chen, Qingyu; Zobel, Justin; Verspoor, Karin

    2017-01-01

    GenBank, the EMBL European Nucleotide Archive and the DNA DataBank of Japan, known collectively as the International Nucleotide Sequence Database Collaboration or INSDC, are the three most significant nucleotide sequence databases. Their records are derived from laboratory work undertaken by different individuals, by different teams, with a range of technologies and assumptions and over a period of decades. As a consequence, they contain a great many duplicates, redundancies and inconsistencies, but neither the prevalence nor the characteristics of various types of duplicates have been rigorously assessed. Existing duplicate detection methods in bioinformatics only address specific duplicate types, with inconsistent assumptions; and the impact of duplicates in bioinformatics databases has not been carefully assessed, making it difficult to judge the value of such methods. Our goal is to assess the scale, kinds and impact of duplicates in bioinformatics databases, through a retrospective analysis of merged groups in INSDC databases. Our outcomes are threefold: (1) We analyse a benchmark dataset consisting of duplicates manually identified in INSDC—a dataset of 67 888 merged groups with 111 823 duplicate pairs across 21 organisms from INSDC databases – in terms of the prevalence, types and impacts of duplicates. (2) We categorize duplicates at both sequence and annotation level, with supporting quantitative statistics, showing that different organisms have different prevalence of distinct kinds of duplicate. (3) We show that the presence of duplicates has practical impact via a simple case study on duplicates, in terms of GC content and melting temperature. We demonstrate that duplicates not only introduce redundancy, but can lead to inconsistent results for certain tasks. Our findings lead to a better understanding of the problem of duplication in biological databases. Database URL: the merged records are available at https://cloudstor.aarnet.edu.au/plus/index.php/s/Xef2fvsebBEAv9w PMID:28077566

  9. Integrative neuroscience: the role of a standardized database.

    PubMed

    Gordon, E; Cooper, N; Rennie, C; Hermens, D; Williams, L M

    2005-04-01

    Most brain-related databases bring together specialized information, with a growing number that include neuroimaging measures. This article outlines the potential use and insights from the first entirely standardized and centralized database, which integrates information from neuroimaging measures (EEG, event related potential (ERP), structural/functional MRI), arousal (skin conductance responses (SCRs), heart rate, respiration), neuropsychological and personality tests, genomics and demographics: The Brain Resource International Database. It comprises data from over 2000 "normative" subjects and a growing number of patients with neurological and psychiatric illnesses, acquired from over 50 laboratories (in the U.S.A., United Kingdom, Holland, South Africa, Israel and Australia), all with identical equipment and experimental procedures. Three primary goals of this database are to quantify individual differences in normative brain function, to compare an individual's performance to their database peers, and to provide a robust normative framework for clinical assessment and treatment prediction. We present three example demonstrations in relation to these goals. First, we show how consistent age differences may be quantified when large subject numbers are available, using EEG and ERP data from nearly 2000 stringently screened normative subjects. Second, the use of a normalization technique provides a means to compare clinical subjects (50 ADHD subjects in this study) to the normative database with the effects of age and gender taken into account. Third, we show how a profile of EEG/ERP and autonomic measures potentially provides a means to predict treatment response in ADHD subjects. The example data consist of EEG under eyes-open and eyes-closed conditions and ERP data for auditory oddball, working memory and Go-NoGo paradigms. Autonomic measures of skin conductance (tonic skin conductance level, SCL, and phasic skin conductance responses, SCRs) were acquired simultaneously with central EEG/ERP measures. The findings show that the power of large samples, tested using standardized protocols, allows for the quantification of individual differences that can subsequently be used to control such variation and to enhance the sensitivity and specificity of comparisons between normative and clinical groups. In terms of broader significance, the combination of size and multidimensional measures tapping the brain's core cognitive competencies may provide a normative and evidence-based framework for individually-based assessments in "Personalized Medicine."
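
    One common way to implement the kind of peer comparison described above is an age- and sex-stratified z-score; the sketch below is illustrative only, uses invented normative values, and is not the database's published normalization procedure:

```python
# Sketch of an age- and sex-stratified z-score comparison against normative data.
# Illustrative only; the normative values and patient value are invented.

from statistics import mean, stdev

# hypothetical normative ERP amplitudes (microvolts) for one age/sex stratum
normative_stratum = [8.2, 7.9, 9.1, 8.5, 7.6, 8.8, 9.0, 8.1]

def z_score(value, reference):
    return (value - mean(reference)) / stdev(reference)

patient_value = 6.4  # hypothetical patient drawn from the same stratum
print(round(z_score(patient_value, normative_stratum), 2))
# Scores far from 0 flag the individual relative to matched database peers.
```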

  10. FRED, a Front End for Databases.

    ERIC Educational Resources Information Center

    Crystal, Maurice I.; Jakobson, Gabriel E.

    1982-01-01

    FRED (a Front End for Databases) was conceived to alleviate data access difficulties posed by the heterogeneous nature of online databases. A hardware/software layer interposed between users and databases, it consists of three subsystems: user-interface, database-interface, and knowledge base. Architectural alternatives for this database machine…

  11. “NaKnowBase”: A Nanomaterials Relational Database

    EPA Science Inventory

    NaKnowBase is an internal relational database populated with data from peer-reviewed ORD nanomaterials research publications. The database focuses on papers describing the actions of nanomaterials in environmental or biological media including their interactions, transformations...

  12. Examples of Use of SINBAD Database for Nuclear Data and Code Validation

    NASA Astrophysics Data System (ADS)

    Kodeli, Ivan; Žerovnik, Gašper; Milocco, Alberto

    2017-09-01

    The SINBAD database currently contains compilations and evaluations of over 100 shielding benchmark experiments. The SINBAD database is widely used for code and data validation. Materials covered include: air, N, O, H2O, Al, Be, Cu, graphite, concrete, Fe, stainless steel, Pb, Li, Ni, Nb, SiC, Na, W, V and mixtures thereof. Over 40 organisations from 14 countries and 2 international organisations have contributed data and work in support of SINBAD. Examples of the use of the database in the scope of different international projects, such as the Working Party on Evaluation Cooperation of the OECD and the European Fusion Programme, demonstrate the merit and possible usage of the database for the validation of modern nuclear data evaluations and new computer codes.

  13. Planetary Data Archiving Plan at JAXA

    NASA Astrophysics Data System (ADS)

    Shinohara, Iku; Kasaba, Yasumasa; Yamamoto, Yukio; Abe, Masanao; Okada, Tatsuaki; Imamura, Takeshi; Sobue, Shinichi; Takashima, Takeshi; Terazono, Jun-Ya

    After the successful rendezvous of Hayabusa with the asteroid Itokawa and the successful launch of Kaguya to the Moon, the Japanese planetary community has obtained its own full-scale datasets. At this moment, however, these datasets are only available from the data sites managed by each mission team. The databases are individually constructed in different formats, and the user interfaces of these data sites are not compatible with foreign databases. To improve the usability of the planetary archives at JAXA and to enable smooth international data exchange, we are investigating the development of a new planetary database. Within the coming decade, Japan will have rich datasets in the planetary science field, covering Venus (Planet-C), Mercury (BepiColombo), and several missions in the planning phase (small bodies). In order to strongly support international scientific collaboration using these mission archive data, the planned planetary data archive at JAXA should be managed in a unified manner and the database should be constructed in the international planetary database standard style. In this presentation, we show the current status and future plans for planetary data archiving at JAXA.

  14. National security and national competitiveness: Open source solutions; NASA requirements and capabilities

    NASA Technical Reports Server (NTRS)

    Cotter, Gladys A.

    1993-01-01

    Foreign competitors are challenging the world leadership of the U.S. aerospace industry, and increasingly tight budgets everywhere make international cooperation in aerospace science necessary. The NASA STI Program has as part of its mission to support NASA R&D, and to that end has developed a knowledge base of aerospace-related information known as the NASA Aerospace Database. The NASA STI Program is already involved in international cooperation with NATO/AGARD/TIP, CENDI, ICSU/ICSTI, and the U.S. Japan Committee on STI. With the new more open political climate, the perceived dearth of foreign information in the NASA Aerospace Database, and the development of the ESA database and DELURA, the German databases, the NASA STI Program is responding by sponsoring workshops on foreign acquisitions and by increasing its cooperation with international partners and with other U.S. agencies. The STI Program looks to the future of improved database access through networking and a GUI; new media; optical disk, video, and full text; and a Technology Focus Group that will keep the NASA STI Program current with technology.

  15. Solving the Problem: Genome Annotation Standards before the Data Deluge.

    PubMed

    Klimke, William; O'Donovan, Claire; White, Owen; Brister, J Rodney; Clark, Karen; Fedorov, Boris; Mizrachi, Ilene; Pruitt, Kim D; Tatusova, Tatiana

    2011-10-15

    The promise of genome sequencing was that the vast undiscovered country would be mapped out by comparison of the multitude of sequences available and would aid researchers in deciphering the role of each gene in every organism. Researchers recognize that there is a need for high quality data. However, different annotation procedures, numerous databases, and a diminishing percentage of experimentally determined gene functions have resulted in a spectrum of annotation quality. NCBI in collaboration with sequencing centers, archival databases, and researchers, has developed the first international annotation standards, a fundamental step in ensuring that high quality complete prokaryotic genomes are available as gold standard references. Highlights include the development of annotation assessment tools, community acceptance of protein naming standards, comparison of annotation resources to provide consistent annotation, and improved tracking of the evidence used to generate a particular annotation. The development of a set of minimal standards, including the requirement for annotated complete prokaryotic genomes to contain a full set of ribosomal RNAs, transfer RNAs, and proteins encoding core conserved functions, is an historic milestone. The use of these standards in existing genomes and future submissions will increase the quality of databases, enabling researchers to make accurate biological discoveries.

  16. Scales for evaluating self-perceived anxiety levels in patients admitted to intensive care units: a review.

    PubMed

    Perpiñá-Galvañ, Juana; Richart-Martínez, Miguel

    2009-11-01

    To review studies of anxiety in critically ill patients admitted to an intensive care unit to describe the level of anxiety and synthesize the psychometric properties of the instruments used to measure anxiety. The CUIDEN, IME, ISOC, CINAHL, MEDLINE, and PSYCINFO databases for 1995 to 2005 were searched. The search focused on 3 concepts: anxiety, intensive care, and mechanical ventilation for the English-language databases and ansiedad, cuidados intensivos, and ventilación mecánica for the Spanish-language databases. Information was extracted from 18 selected articles on the level of anxiety experienced by patients and the psychometric properties of the instruments used to measure anxiety. Moderate levels of anxiety were reported. Levels were higher in women than in men, and higher in patients undergoing positive pressure ventilation regardless of sex. Most multi-item instruments had high coefficients of internal consistency. The reliability of instruments with only a single item was not demonstrated, even though the instruments had moderate-to-high correlations with other measurements. Midlength scales, such as the anxiety subscale of the Brief Symptom Inventory or the shortened state version of the State-Trait Anxiety Inventory, are best for measuring anxiety in critical care patients.

  17. The North American Forest Database: going beyond national-level forest resource assessment statistics.

    PubMed

    Smith, W Brad; Cuenca Lara, Rubí Angélica; Delgado Caballero, Carina Edith; Godínez Valdivia, Carlos Isaías; Kapron, Joseph S; Leyva Reyes, Juan Carlos; Meneses Tovar, Carmen Lourdes; Miles, Patrick D; Oswalt, Sonja N; Ramírez Salgado, Mayra; Song, Xilong Alex; Stinson, Graham; Villela Gaytán, Sergio Armando

    2018-05-21

    Forests cannot be managed sustainably without reliable data to inform decisions. National Forest Inventories (NFI) tend to report national statistics, with sub-national stratification based on domestic ecological classification systems. It is becoming increasingly important to be able to report statistics on ecosystems that span international borders, as global change and globalization expand stakeholders' spheres of concern. The state of a transnational ecosystem can only be properly assessed by examining the entire ecosystem. In global forest resource assessments, it may be useful to break national statistics down by ecosystem, especially for large countries. The Inventory and Monitoring Working Group (IMWG) of the North American Forest Commission (NAFC) has begun developing a harmonized North American Forest Database (NAFD) for managing forest inventory data, enabling consistent, continental-scale forest assessment supporting ecosystem-level reporting and relational queries. The first iteration of the database contains data describing 1.9 billion ha, including 677.5 million ha of forest. Data harmonization is made challenging by the existence of definitions and methodologies tailored to suit national circumstances, emerging from each country's professional forestry development. This paper reports the methods used to synchronize three national forest inventories, starting with a small suite of variables and attributes.

  18. Solving the Problem: Genome Annotation Standards before the Data Deluge

    PubMed Central

    Klimke, William; O'Donovan, Claire; White, Owen; Brister, J. Rodney; Clark, Karen; Fedorov, Boris; Mizrachi, Ilene; Pruitt, Kim D.; Tatusova, Tatiana

    2011-01-01

    The promise of genome sequencing was that the vast undiscovered country would be mapped out by comparison of the multitude of sequences available and would aid researchers in deciphering the role of each gene in every organism. Researchers recognize that there is a need for high quality data. However, different annotation procedures, numerous databases, and a diminishing percentage of experimentally determined gene functions have resulted in a spectrum of annotation quality. NCBI in collaboration with sequencing centers, archival databases, and researchers, has developed the first international annotation standards, a fundamental step in ensuring that high quality complete prokaryotic genomes are available as gold standard references. Highlights include the development of annotation assessment tools, community acceptance of protein naming standards, comparison of annotation resources to provide consistent annotation, and improved tracking of the evidence used to generate a particular annotation. The development of a set of minimal standards, including the requirement for annotated complete prokaryotic genomes to contain a full set of ribosomal RNAs, transfer RNAs, and proteins encoding core conserved functions, is an historic milestone. The use of these standards in existing genomes and future submissions will increase the quality of databases, enabling researchers to make accurate biological discoveries. PMID:22180819

  19. Aerodynamic Analyses and Database Development for Ares I Vehicle First Stage Separation

    NASA Technical Reports Server (NTRS)

    Pamadi, Bandu N.; Pei, Jing; Pinier, Jeremy T.; Holland, Scott D.; Covell, Peter F.; Klopfer, Goetz H.

    2012-01-01

    This paper presents the aerodynamic analysis and database development for the first stage separation of the Ares I A106 Crew Launch Vehicle configuration. Separate databases were created for the first stage and upper stage. Each database consists of three components: isolated or free-stream coefficients, power-off proximity increments, and power-on proximity increments. The power-on database consists of three parts, all plumes firing at nominal conditions, the one booster deceleration motor out condition, and the one ullage settling motor out condition. The isolated and power-off incremental databases were developed using wind tunnel test data. The power-on proximity increments were developed using CFD solutions.
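
    A minimal sketch of the additive structure described above, with invented table values and a hypothetical lookup interface rather than the actual Ares I database format:

```python
# Total coefficient assembled from an isolated (free-stream) term plus power-off
# and power-on proximity increments, mirroring the three-component structure
# described above. All values and the interface are invented for illustration.

def total_coefficient(mach, separation, isolated, power_off_inc, power_on_inc):
    return (isolated(mach)
            + power_off_inc(mach, separation)
            + power_on_inc(mach, separation))

# toy stand-ins for the wind-tunnel- and CFD-derived tables
isolated = lambda m: 0.45 + 0.02 * m
power_off = lambda m, d: -0.05 * max(0.0, 1.0 - d)
power_on = lambda m, d: 0.08 * max(0.0, 1.0 - d)

print(round(total_coefficient(4.5, 0.3, isolated, power_off, power_on), 3))
```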

  20. Bridging international law and rights-based litigation: mapping health-related rights through the development of the Global Health and Human Rights Database.

    PubMed

    Meier, Benjamin Mason; Cabrera, Oscar A; Ayala, Ana; Gostin, Lawrence O

    2012-06-15

    The O'Neill Institute for National and Global Health Law at Georgetown University, the World Health Organization, and the Lawyers Collective have come together to develop a searchable Global Health and Human Rights Database that maps the intersection of health and human rights in judgments, international and regional instruments, and national constitutions. Where states long remained unaccountable for violations of health-related human rights, litigation has arisen as a central mechanism in an expanding movement to create rights-based accountability. Facilitated by the incorporation of international human rights standards in national law, this judicial enforcement has supported the implementation of rights-based claims, giving meaning to states' longstanding obligations to realize the highest attainable standard of health. Yet despite these advancements, there has been insufficient awareness of the international and domestic legal instruments enshrining health-related rights and little understanding of the scope and content of litigation upholding these rights. As this accountability movement evolves, the Global Health and Human Rights Database seeks to chart this burgeoning landscape of international instruments, national constitutions, and judgments for health-related rights. Employing international legal research to document and catalogue these three interconnected aspects of human rights for the public's health, the Database's categorization by human rights, health topics, and regional scope provides a comprehensive means of understanding health and human rights law. Through these categorizations, the Global Health and Human Rights Database serves as a basis for analogous legal reasoning across states to serve as precedents for future cases, for comparative legal analysis of similar health claims in different country contexts, and for empirical research to clarify the impact of human rights judgments on public health outcomes. Copyright © 2012 Meier, Nygren-Krug, Cabrera, Ayala, and Gostin.

  1. A Quality System Database

    NASA Technical Reports Server (NTRS)

    Snell, William H.; Turner, Anne M.; Gifford, Luther; Stites, William

    2010-01-01

    A quality system database (QSD), and software to administer the database, were developed to support recording of administrative nonconformance activities that involve requirements for documentation of corrective and/or preventive actions, which can include ISO 9000 internal quality audits and customer complaints.

  2. Annual Review of Database Development: 1992.

    ERIC Educational Resources Information Center

    Basch, Reva

    1992-01-01

    Reviews recent trends in databases and online systems. Topics discussed include new access points for established databases; acquisitions, consolidations, and competition between vendors; European coverage; international services; online reference materials, including telephone directories; political and legal materials and public records;…

  3. Going off script: Effects of awe on memory for script-typical and -irrelevant narrative detail.

    PubMed

    Danvers, Alexander F; Shiota, Michelle N

    2017-09-01

    People often filter their experience of new events through knowledge they already have; for example, encoding new events by relying on prototypical event "scripts" at the expense of actual details. Previous research suggests that positive affect often increases this tendency. Three studies assessed whether awe, an emotion elicited by perceived vastness and thought to promote cognitive accommodation, has the opposite effect, reducing rather than increasing reliance on event scripts. True/false questions on details of a short story about a romantic dinner were used to determine whether awe (a) reduces the tendency to impute script-consistent but false details into memory, and/or (b) promotes memory of unexpected details. Across studies we consistently found support for the first effect; evidence for the second was less consistent. Effects were partially mediated by subjective awe, and independent of other aspects of subjective affect. Results suggest that awe reduces reliance on internal knowledge in processing new events. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Report on Legal Protection for Databases. A Report of the Register of Copyrights. August, 1997.

    ERIC Educational Resources Information Center

    Library of Congress, Washington, DC. Copyright Office.

    This report gives an overview of the past and present domestic and international legal framework for database protection. It describes database industry practices in securing protection against unauthorized use and Copyright Office registration practices relating to databases. Finally, it discusses issues raised and concerns expressed in a series…

  5. An international database of radionuclide concentration ratios for wildlife: development and uses.

    PubMed

    Copplestone, D; Beresford, N A; Brown, J E; Yankovich, T

    2013-12-01

    A key element of most systems for assessing the impact of radionuclides on the environment is a means to estimate the transfer of radionuclides to organisms. To facilitate this, an international wildlife transfer database has been developed to provide an online, searchable compilation of transfer parameters in the form of equilibrium-based whole-organism to media concentration ratios. This paper describes the derivation of the wildlife transfer database, the key data sources it contains and highlights the applications for the data. Copyright © 2013 Elsevier Ltd. All rights reserved.
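
    A minimal sketch of the transfer parameter the database compiles, the equilibrium whole-organism to media concentration ratio, with invented numbers:

```python
# CR = activity concentration in the whole organism / activity concentration in
# the reference medium, at equilibrium. Values below are invented.

def concentration_ratio(organism_activity: float, media_activity: float) -> float:
    return organism_activity / media_activity

# hypothetical Cs-137 example: 250 Bq/kg fresh weight in fish, 5 Bq/L in water
print(concentration_ratio(250.0, 5.0))  # -> 50.0
```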

  6. RNAcentral: an international database of ncRNA sequences

    DOE PAGES

    Williams, Kelly Porter

    2014-10-28

    The field of non-coding RNA biology has been hampered by the lack of availability of a comprehensive, up-to-date collection of accessioned RNA sequences. Here we present the first release of RNAcentral, a database that collates and integrates information from an international consortium of established RNA sequence databases. The initial release contains over 8.1 million sequences, including representatives of all major functional classes. A web portal (http://rnacentral.org) provides free access to data, search functionality, cross-references, source code and an integrated genome browser for selected species.

  7. A Molecular Framework for Understanding DCIS

    DTIC Science & Technology

    2016-10-01

    Pathologic and Clinical Annotation Database: A clinical annotation database titled the Breast Oncology Database has been established to … complement the procured SPORE sample characteristics and annotated pathology data. This Breast Oncology Database is an offsite clinical annotation … database adheres to CSMC Enterprise Information Services (EIS) research database security standards. The Breast Oncology Database consists of: Baseline …

  8. Data resource profile: United Nations Children's Fund (UNICEF).

    PubMed

    Murray, Colleen; Newby, Holly

    2012-12-01

    The United Nations Children's Fund (UNICEF) plays a leading role in the collection, compilation, analysis and dissemination of data to inform sound policies, legislation and programmes for promoting children's rights and well-being, and for global monitoring of progress towards the Millennium Development Goals. UNICEF maintains a set of global databases representing nearly 200 countries and covering the areas of child mortality, child health, maternal health, nutrition, immunization, water and sanitation, HIV/AIDS, education and child protection. These databases consist of internationally comparable and statistically sound data, and are updated annually through a process that draws on a wealth of data provided by UNICEF's wide network of >150 field offices. The databases are composed primarily of estimates from household surveys, with data from censuses, administrative records, vital registration systems and statistical models contributing to some key indicators as well. The data are assessed for quality based on a set of objective criteria to ensure that only the most reliable nationally representative information is included. For most indicators, data are available at the global, regional and national levels, plus sub-national disaggregation by sex, urban/rural residence and household wealth. The global databases are featured in UNICEF's flagship publications, inter-agency reports, including the Secretary General's Millennium Development Goals Report and Countdown to 2015, sector-specific reports and statistical country profiles. They are also publicly available on www.childinfo.org, together with trend data and equity analyses.

  9. Posttraumatic growth in bereaved parents: A multidimensional model of associated factors.

    PubMed

    Albuquerque, Sara; Narciso, Isabel; Pereira, Marco

    2018-03-01

    Although the death of a child is a devastating event, recent evidence shows that personal growth is a relevant outcome of parents' grief. This study aimed to examine the factors associated with posttraumatic growth (PTG) and to propose a multidimensional model consisting of sociodemographic, situational, and intrapersonal and interpersonal factors. A sample (N = 197; 89.8% female; mean age = 39.44 years) of bereaved parents completed the Post-Traumatic Growth Inventory-Short Form, the 14-Item Resilience Scale, the Continuing Bonds Scale, and the Dyadic Coping Inventory. The final model consisted of sociodemographic, situational, intrapersonal, and interpersonal factors of PTG, which accounted for 36.7% of the variance. Higher levels of PTG were generally associated with female sex, younger age of the child, higher levels of resilience, higher levels of internalized continuing bonds (i.e., internal representation of the child, maintaining psychological proximity), and higher levels of stress communication by the partner (communicating the stress experience and requesting emotional or practical support). In clinical practice, health professionals assisting bereaved parents should pay attention to men and parents of older children, who might be at higher risk of difficulties in developing PTG. Additionally, promoting a more internalized bond with the child, resilience and dyadic coping, especially stress communication, can constitute important therapeutic goals. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. Belgian health-related data in three international databases

    PubMed Central

    2011-01-01

    Aims of the study: This study examines the availability of Belgian healthcare data in the three main international health databases: the World Health Organization European Health for All Database (WHO-HFA), the Organisation for Economic Co-operation and Development Health Data 2009 and EUROSTAT. Methods: For the indicators present in the three databases, the availability of Belgian data and the source of these data were checked. Main findings: The most important problem concerning the availability of Belgian health-related data in the three major international databases is the lack of recent data. Recent data are available for 27% of the indicators of the WHO-HFA database, 73% of the OECD Health Data, and for half of the Eurostat indicators. Recent data about health status (including mortality-based indicators) are especially lacking. Discussion: Only the availability of the health-related data is studied in this article; the quality of the Belgian data is, however, also important to examine. The main problem concerning the availability of health data is timeliness. One of the causes of this lack of (especially mortality) data is the reform of the Belgian State: mortality data are now provided by the communities, which results in a delay in the delivery of national mortality data. However, several efforts are being made to catch up. PMID:22958554

  11. [Informatics support for risk assessment and identification of preventive measures in small and micro-enterprises: occupational hazard datasheets].

    PubMed

    de Merich, D; Forte, Giulia

    2011-01-01

    Risk assessment is the fundamental process of an enterprise's prevention system and is the principal mandatory provision contained in the Health and Safety Law (Legislative Decree 81/2008) amended by Legislative Decree 106/2009. In order to properly comply with this obligation also in small-sized enterprises, the appropriate regulatory bodies should provide enterprises with standardized tools and methods for identifying, assessing and managing risks. The objective is to assist in particular small and micro-enterprises (SMEs) with risk assessment by providing a flexible tool, standardized in the form of a datasheet, that can be updated with more detailed information on the various work contexts in Italy. Official efforts to provide Italian SMEs with information may initially make use of the findings of research conducted by ISPESL over the past 20 years, thanks in part to cooperation with other institutions (Regions, INAIL-National Insurance Institute for Occupational Accidents and Diseases), which have led to the creation of an information system on prevention consisting of numerous databases, both statistical and documental ("National System of Surveillance on fatal and serious accidents", "National System of Surveillance on work-related diseases", "Sector hazard profiles" database, "Solutions and Best Practices" database, "Technical Guidelines" database, "Training packages for prevention professionals in enterprises" database). With regard to evaluation criteria applicable within the enterprise, combining traditional and uniform areas of assessment (by sector or by risk factor) with assessments by job/occupation has become possible thanks to the cooperation agreement made in 2009 by ISPESL, the ILO (International Labour Organisation) of Geneva and IIOSH (Israel Institute for Occupational Health and Hygiene) regarding the creation of an international database (HDODB) based on risk datasheets per occupation. The project sets out to assist in particular small and micro-enterprises with risk assessment, providing a flexible and standardized tool in the form of a datasheet that can be updated with more detailed information on the various work contexts in Italy. The model proposed by ISPESL selected the ILO's "Hazard Datasheet on Occupation" as an initial information tool to steer efforts to assess and manage hazards in small and micro-enterprises. In addition to being an internationally validated tool, the occupation datasheet has a very simple structure that is effective in communicating and updating information in relation to the local context. In keeping with the logic of providing support to enterprises by means of a collaborative network among institutions, local supervisory services and social partners, standardised hazard assessment procedures should be, irrespective of any legal obligations, the preferred tools of an "updatable information system" capable of supporting the need to improve the process of assessing and managing hazards in enterprises.

  12. GOVERNING GENETIC DATABASES: COLLECTION, STORAGE AND USE

    PubMed Central

    Gibbons, Susan M.C.; Kaye, Jane

    2008-01-01

    This paper provides an introduction to a collection of five papers, published as a special symposium journal issue, under the title: “Governing Genetic Databases: Collection, Storage and Use”. It begins by setting the scene, to provide a backdrop and context for the papers. It describes the evolving scientific landscape around genetic databases and genomic research, particularly within the biomedical and criminal forensic investigation fields. It notes the lack of any clear, coherent or coordinated legal governance regime, either at the national or international level. It then identifies and reflects on key cross-cutting issues and themes that emerge from the five papers, in particular: terminology and definitions; consent; special concerns around population genetic databases (biobanks) and forensic databases; international harmonisation; data protection; data access; boundary-setting; governance; and issues around balancing individual interests against public good values. PMID:18841252

  13. International energy: Research organizations, 1988--1992. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, P.; Jordan, S.

    This publication contains the standardized names of energy research organizations used in energy information databases. Involved in this cooperative task are (1) the technical staff of the US DOE Office of Scientific and Technical Information (OSTI) in cooperation with the member countries of the Energy Technology Data Exchange (ETDE) and (2) the International Nuclear Information System (INIS). ETDE member countries are also members of the International Nuclear Information System (INIS). Nuclear organization names recorded for INIS by these ETDE member countries are also included in the ETDE Energy Database. Therefore, these organization names are cooperatively standardized for use in both information systems. This publication identifies current organizations doing research in all energy fields, standardizes the format for recording these organization names in bibliographic citations, assigns a numeric code to facilitate data entry, and identifies report number prefixes assigned by these organizations. These research organization names may be used in searching the databases "Energy Science & Technology" on DIALOG and "Energy" on STN International. These organization names are also used in USDOE databases on the Integrated Technical Information System. Research organizations active in the past five years, as indicated by database records, were identified to form this publication. This directory includes approximately 31,000 organizations that reported energy-related literature from 1988 to 1992 and updates the DOE Energy Data Base: Corporate Author Entries.

  14. Psychometric evaluation of the Northwick Park Dependency Scale.

    PubMed

    Siegert, Richard J; Turner-Stokes, Lynne

    2010-11-01

    To examine the psychometric properties of the Northwick Park Dependency Scale (NPDS). Review of existing literature and psychometric analysis in relation to other standardized measures of disability in a large neurorehabilitation cohort. A regional post-acute specialist inpatient neurorehabilitation unit in London, UK. A total of 569 inpatients with complex neurological disabilities (350 males, 219 females; mean age 44.4 years). The NPDS, Barthel Index, Functional Independence and Functional Assessment measures. A database search found 5 studies that examined the psychometrics of the NPDS. These supported its validity and reliability. The present study added to these by evaluating the internal consistency, factor structure, discriminatory power and responsiveness to change during rehabilitation. The NPDS was found to have good internal consistency (α = 0.90), suggesting that it can reasonably be summed to a single total score. It discriminated among people with different levels of dependency and was responsive to change, particularly in the higher dependency groups. The NPDS is a psychometrically robust tool, providing a broader range of information on nursing needs than some other commonly-used disability measures. The Special Nursing Needs subscale provides clinically useful information, but its metric properties require further development, which is now underway.
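
    For reference, the internal-consistency statistic reported above (Cronbach's alpha) can be computed as in the following sketch; the item-by-respondent scores are invented and are not the study's data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# The score matrix below is invented for illustration.

def cronbach_alpha(items):
    """items: list of per-item score lists, all of equal length (respondents)."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(item) for item in items) / var(totals))

scores = [  # 3 hypothetical items scored for 5 respondents
    [2, 3, 3, 4, 2],
    [1, 3, 2, 4, 2],
    [2, 2, 3, 5, 1],
]
print(round(cronbach_alpha(scores), 2))
```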

  15. KSC-99pp0331

    NASA Image and Video Library

    1999-03-22

    The Shuttle Radar Topography Mission (SRTM) sits uncovered inside the Multi-Payload Processing Facility. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  16. KSC-99pp0326

    NASA Image and Video Library

    1999-03-24

    The vehicle carrying the Shuttle Radar Topography Mission (SRTM) arrives at the Multi-Payload Processing Facility. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  17. KSC-99pp0327

    NASA Image and Video Library

    1999-03-24

    Inside the Multi-Payload Processing Facility, the lid covering the Shuttle Radar Topography Mission (SRTM) is lifted. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  18. Development of a global land cover characteristics database and IGBP DISCover from 1 km AVHRR data

    USGS Publications Warehouse

    Loveland, Thomas R.; Reed, B.C.; Brown, Jesslyn F.; Ohlen, D.O.; Zhu, Z.; Yang, L.; Merchant, J.W.

    2000-01-01

    Researchers from the U.S. Geological Survey, University of Nebraska-Lincoln and the European Commission's Joint Research Centre, Ispra, Italy produced a 1 km resolution global land cover characteristics database for use in a wide range of continental-to global-scale environmental studies. This database provides a unique view of the broad patterns of the biogeographical and ecoclimatic diversity of the global land surface, and presents a detailed interpretation of the extent of human development. The project was carried out as an International Geosphere-Biosphere Programme, Data and Information Systems (IGBP-DIS) initiative. The IGBP DISCover global land cover product is an integral component of the global land cover database. DISCover includes 17 general land cover classes defined to meet the needs of IGBP core science projects. A formal accuracy assessment of the DISCover data layer will be completed in 1998. The 1 km global land cover database was developed through a continent-by-continent unsupervised classification of 1 km monthly Advanced Very High Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) composites covering 1992-1993. Extensive post-classification stratification was necessary to resolve spectral/temporal confusion between disparate land cover types. The complete global database consists of 961 seasonal land cover regions that capture patterns of land cover, seasonality and relative primary productivity. The seasonal land cover regions were aggregated to produce seven separate land cover data sets used for global environmental modelling and assessment. The data sets include IGBP DISCover, U.S. Geological Survey Anderson System, Simple Biosphere Model, Simple Biosphere Model 2, Biosphere-Atmosphere Transfer Scheme, Olson Ecosystems and Running Global Remote Sensing Land Cover. The database also includes all digital sources that were used in the classification. The complete database can be sourced from the website: http://edcwww.cr.usgs.gov/landdaac/glcc/glcc.html.
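
    A minimal sketch of the NDVI formula underlying the AVHRR composites, together with a toy comparison of pixels by their multi-temporal NDVI profile; this is illustrative only, with invented reflectances, and is not the project's actual processing chain:

```python
# NDVI = (NIR - Red) / (NIR + Red); pixels with similar multi-temporal NDVI
# profiles would be grouped into the same seasonal land cover region by an
# unsupervised classification. Reflectance values are invented.

def ndvi(nir: float, red: float) -> float:
    return (nir - red) / (nir + red)

# hypothetical monthly (red, NIR) reflectances for two pixels
pixel_forest = [(0.04, 0.45), (0.05, 0.50), (0.04, 0.52)]
pixel_desert = [(0.30, 0.35), (0.32, 0.36), (0.31, 0.34)]

profiles = {name: [round(ndvi(nir, red), 2) for red, nir in pixels]
            for name, pixels in (("forest", pixel_forest), ("desert", pixel_desert))}
print(profiles)
```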

  19. Function Point Analysis Depot

    NASA Technical Reports Server (NTRS)

    Muniz, R.; Martinez, El; Szafran, J.; Dalton, A.

    2011-01-01

    The Function Point Analysis (FPA) Depot is a web application originally designed by one of the NE-C3 branch's engineers, Jamie Szafran, and created specifically for the Software Development team of the Launch Control Systems (LCS) project. The application evaluates the work of each developer in order to produce a realistic estimate of the hours to be assigned to a specific development task. The Architect Team made design change requests for the Depot that altered the schema of the application's information; once that information was changed in the database, it also needed to be changed in the graphical user interface (GUI), written in Ruby on Rails (RoR), and in the web service/server side, written in Java, to match the database changes. These changes were made by two interns from NE-C: Ricardo Muniz from NE-C3, who made all the schema changes for the GUI in RoR, and Edwin Martinez from NE-C2, who made all the changes on the Java side.
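
    For context, a standard unadjusted function point count, the kind of sizing measure such a tool supports, looks like the sketch below; the complexity weights are the conventional textbook "average" values, the counts and hours-per-FP rate are invented, and this is not necessarily how the Depot itself computes estimates:

```python
# Unadjusted function points: weighted count of inputs, outputs, inquiries,
# internal logical files and external interface files. Weights are the standard
# "average complexity" values; the task counts below are invented.

AVERAGE_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

def unadjusted_function_points(counts):
    return sum(AVERAGE_WEIGHTS[kind] * n for kind, n in counts.items())

task_counts = {"external_inputs": 3, "external_outputs": 2,
               "external_inquiries": 1, "internal_logical_files": 1,
               "external_interface_files": 0}
ufp = unadjusted_function_points(task_counts)
print(ufp, "UFP ->", ufp * 2.5, "estimated hours at a hypothetical 2.5 h/FP rate")
```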

  20. The electric dipole moment of DNA-binding HU protein calculated by the use of an NMR database.

    PubMed

    Takashima, S; Yamaoka, K

    1999-08-30

    Electric birefringence measurements indicated the presence of a large permanent dipole moment in the HU protein-DNA complex. In order to substantiate this observation, numerical computation of the dipole moment of the HU protein homodimer was carried out using NMR protein databases. The dipole moments of globular proteins have hitherto been calculated with X-ray databases, and NMR data have never been used before. The advantages of NMR databases are: (a) NMR data are obtained, unlike X-ray databases, using protein solutions. Accordingly, this method eliminates the bothersome question as to the possible alteration of the protein structure due to the transition from the crystalline state to the solution state. This question is particularly important for proteins such as HU protein, which has some degree of internal flexibility; (b) the three-dimensional coordinates of hydrogen atoms in protein molecules can be determined with sufficient resolution, and this enables the N-H as well as C=O bond moments to be calculated. Since the NMR database of HU protein from Bacillus stearothermophilus consists of 25 models, the surface charge as well as the core dipole moments were computed for each of these structures. The results of these calculations show that the net permanent dipole moment of the HU protein homodimer is approximately 500-530 D (1 D = 3.33 x 10^-30 C·m) at pH 7.5 and 600-630 D at the isoelectric point (pH 10.5). These permanent dipole moments are unusually large for a small protein of 19.5 kDa. Nevertheless, the result of the numerical calculations is compatible with the electro-optical observation, confirming a very large dipole moment in this protein.
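
    The core of such a calculation is the vector sum of partial charge times atomic position, converted to Debye; the sketch below uses invented coordinates and charges and is not the authors' actual procedure:

```python
# Dipole moment as the vector sum of partial charges times positions, converted
# from e*Angstrom to Debye (1 e*Angstrom ~= 4.803 D). Atom data are invented.

E_ANGSTROM_TO_DEBYE = 4.803

atoms = [
    # (partial charge in e, x, y, z in Angstrom) -- hypothetical values
    (+0.45, 1.2, 0.0, 0.3),
    (-0.45, 0.0, 0.9, 0.0),
    (+0.20, 2.1, 1.5, 0.7),
    (-0.20, 1.0, 2.0, 1.1),
]

mu = [sum(q * r[i] for q, *r in atoms) for i in range(3)]
magnitude = E_ANGSTROM_TO_DEBYE * sum(c * c for c in mu) ** 0.5
print(round(magnitude, 2), "D")
```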

  1. [Violence against women: the role of the health sector in international legislation].

    PubMed

    Ortiz-Barreda, Gaby; Vives-Cases, Carmen

    2012-01-01

    To identify and describe the responsibilities attributed to health administrations in preventing and addressing violence against women in the international legislation on this issue. We carried out a content analysis of the laws on violence against women collected in the following legal databases: the Annual Review of Law of Harvard University, the United Nations' Secretary-General's database on Violence against Women, the International Digest of Health Legislation and Stop Violence against Women. All legal documents explicitly mentioning the participation of the health sector in interventions against violence against women were identified. Subsequently, the interventions selected were classified into primary, secondary and tertiary prevention, as defined by the World Health Organization in its first World Report on Violence and Health (2002). Of the 115 countries analyzed, 55 have laws on violence against women that include the participation of the health sector in interventions concerning this phenomenon. In most of these countries, this participation focusses on reporting detected cases and on providing healthcare and assistance to women referred from police services. We identified 24 laws that explicitly mention the interventions developed by the health sector, mainly consisting of tertiary prevention. The laws of Mexico, Colombia, Argentina, El Salvador, Spain and the Philippines include interventions involving the three levels of prevention. One-fourth of the laws concerning violence against women studied incorporate specific interventions in the health sector, suggesting that a comprehensive approach to the problem is still required. Greater utilization of the potential of this sector is required in interventions to prevent violence against women. Copyright © 2011 SESPAS. Published by Elsevier Espana. All rights reserved.

  2. A systematic review of safety data reporting in clinical trials of vaccines against malaria, tuberculosis, and human immunodeficiency virus.

    PubMed

    Tamminga, Cindy; Kavanaugh, Michael; Fedders, Charlotte; Maiolatesi, Santina; Abraham, Neethu; Bonhoeffer, Jan; Heininger, Ulrich; Vasquez, Carlos S; Moorthy, Vasee S; Epstein, Judith E; Richie, Thomas L

    2013-08-02

    Malaria, tuberculosis (TB) and human immunodeficiency virus (HIV) are diseases with devastating effects on global public health, especially in the developing world. Clinical trials of candidate vaccines for these diseases are being conducted at an accelerating rate, and require accurate and consistent methods for safety data collection and reporting. We performed a systematic review of publications describing the safety results from clinical trials of malaria, TB and HIV vaccines, to ascertain the nature and consistency of safety data collection and reporting. The target for the review was pre-licensure trials for malaria, TB and HIV vaccines published in English from 2000 to 2009. Search strategies were customized for each of the databases utilized (MEDLINE, EMBASE, the Cochrane Database of Systematic Reviews and the Database of Reviews and Effects). Data extracted included age of trial participants, vaccine platform, route and method of vaccine administration, duration of participant follow-up, reporting of laboratory abnormalities, and the type, case definitions, severity, reporting methods and internal reporting consistency of adverse events. Of 2278 publications screened, 124 were eligible for inclusion (malaria: 66, TB: 9, HIV: 49). Safety data reporting was found to be highly variable among publications and often incomplete: overall, 269 overlapping terms were used to describe specific adverse events. 17% of publications did not mention fever. Descriptions of severity or degree of relatedness to immunization of adverse events were frequently omitted. 26% (32/124) of publications failed to report data on serious adverse events. The review demonstrated lack of standardized safety data reporting in trials for vaccines against malaria, TB and HIV. Standardization of safety data collection and reporting should be encouraged to improve data quality and comparability. The search strategy missed studies published in languages other than English and excluded studies reporting on vaccine trials for diseases besides malaria, TB and HIV. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. An Overview of ARL’s Multimodal Signatures Database and Web Interface

    DTIC Science & Technology

    2007-12-01

    ActiveX components, which hindered distribution due to license agreements and run-time license software to use such components. g. Proprietary...Overview The database consists of multimodal signature data files in the HDF5 format. Generally, each signature file contains all the ancillary...only contains information in the database, Web interface, and signature files that is releasable to the public. The Web interface consists of static

  4. Using an international p53 mutation database as a foundation for an online laboratory in an upper level undergraduate biology class.

    PubMed

    Melloy, Patricia G

    2015-01-01

    A two-part laboratory exercise was developed to enhance classroom instruction on the significance of p53 mutations in cancer development. Students were asked to mine key information from an international database of p53 genetic changes related to cancer, the IARC TP53 database. Using this database, students designed several data mining activities to look at the changes in the p53 gene from a number of perspectives, including potential cancer-causing agents leading to particular changes and the prevalence of certain p53 variations in certain cancers. In addition, students gained a global perspective on cancer prevalence in different parts of the world. Students learned how to use the database in the first part of the exercise, and then used that knowledge to search particular cancers and cancer-causing agents of their choosing in the second part of the exercise. Students also connected the information gathered from the p53 exercise to a previous laboratory exercise looking at risk factors for cancer development. The goal of the experience was to increase student knowledge of the link between p53 genetic variation and cancer. Students also were able to walk a similar path through the website as a cancer researcher using the database to enhance bench work-based experiments with complementary large-scale database p53 variation information. © 2014 The International Union of Biochemistry and Molecular Biology.

  5. Overcoming Dietary Assessment Challenges in Low-Income Countries: Technological Solutions Proposed by the International Dietary Data Expansion (INDDEX) Project.

    PubMed

    Coates, Jennifer C; Colaiezzi, Brooke A; Bell, Winnie; Charrondiere, U Ruth; Leclercq, Catherine

    2017-03-16

    An increasing number of low-income countries (LICs) exhibit high rates of malnutrition coincident with rising rates of overweight and obesity. Individual-level dietary data are needed to inform effective responses, yet dietary data from large-scale surveys conducted in LICs remain extremely limited. This discussion paper first seeks to highlight the barriers to collection and use of individual-level dietary data in LICs. Second, it introduces readers to new technological developments and research initiatives to remedy this situation, led by the International Dietary Data Expansion (INDDEX) Project. Constraints to conducting large-scale dietary assessments include significant costs, time burden, technical complexity, and limited investment in dietary research infrastructure, including the necessary tools and databases required to collect individual-level dietary data in large surveys. To address existing bottlenecks, the INDDEX Project is developing a dietary assessment platform for LICs, called INDDEX24, consisting of a mobile application integrated with a web database application, which is expected to facilitate seamless data collection and processing. These tools will be subject to rigorous testing including feasibility, validation, and cost studies. To scale up dietary data collection and use in LICs, the INDDEX Project will also invest in food composition databases, an individual-level dietary data dissemination platform, and capacity development activities. Although the INDDEX Project activities are expected to improve the ability of researchers and policymakers in low-income countries to collect, process, and use dietary data, the global nutrition community is urged to commit further significant investments in order to adequately address the range and scope of challenges described in this paper.

  6. A comprehensive linear programming tool to optimize formulations of ready-to-use therapeutic foods: an application to Ethiopia.

    PubMed

    Ryan, Kelsey N; Adams, Katherine P; Vosti, Stephen A; Ordiz, M Isabel; Cimo, Elizabeth D; Manary, Mark J

    2014-12-01

    Ready-to-use therapeutic food (RUTF) is the standard of care for children suffering from noncomplicated severe acute malnutrition (SAM). The objective was to develop a comprehensive linear programming (LP) tool to create novel RUTF formulations for Ethiopia. A systematic approach that surveyed international and national crop and animal food databases was used to create a global and local candidate ingredient database. The database included information about each ingredient regarding nutrient composition, ingredient category, regional availability, and food safety, processing, and price. An LP tool was then designed to compose novel RUTF formulations. For the example case of Ethiopia, the objective was to minimize the ingredient cost of RUTF; the decision variables were ingredient weights and the extent of use of locally available ingredients, and the constraints were nutritional and product-quality related. Of the new RUTF formulations found by the LP tool for Ethiopia, 32 were predicted to be feasible for creating a paste, and these were prepared in the laboratory. Palatable final formulations contained a variety of ingredients, including fish, different dairy powders, and various seeds, grains, and legumes. Nearly all of the macronutrient values calculated by the LP tool differed by <10% from results produced by laboratory analyses, but the LP tool consistently underestimated total energy. The LP tool can be used to develop new RUTF formulations that make more use of locally available ingredients. This tool has the potential to lead to production of a variety of low-cost RUTF formulations that meet international standards and thereby potentially allow more children to be treated for SAM. © 2014 American Society for Nutrition.
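
    As an editorial illustration of the linear programming formulation described above (minimize ingredient cost subject to nutritional and composition constraints), the following Python sketch sets up a toy RUTF-style problem with scipy.optimize.linprog. The ingredient list, prices, nutrient contents and nutrient minima are invented placeholders, not values from the study.

```python
# Hedged sketch of an RUTF-style least-cost formulation; all numbers are assumed.
import numpy as np
from scipy.optimize import linprog

# Decision variables: grams of each ingredient per 100 g of paste.
ingredients = ["peanut paste", "milk powder", "soy flour", "sugar", "oil"]
cost_per_g = np.array([0.004, 0.010, 0.003, 0.002, 0.005])  # USD/g (assumed)

# Nutrient content per gram of ingredient (assumed): rows = energy (kcal), protein (g), fat (g).
nutrients = np.array([
    [5.90, 5.00, 4.40, 4.00, 9.00],   # energy
    [0.26, 0.36, 0.45, 0.00, 0.00],   # protein
    [0.49, 0.26, 0.20, 0.00, 1.00],   # fat
])

# Illustrative RUTF-style minima per 100 g of product.
min_req = np.array([520.0, 13.0, 26.0])

# linprog minimizes c @ x subject to A_ub @ x <= b_ub, so ">=" constraints are negated.
res = linprog(
    c=cost_per_g,
    A_ub=-nutrients, b_ub=-min_req,
    A_eq=np.ones((1, len(ingredients))), b_eq=[100.0],  # ingredients must sum to 100 g
    bounds=[(0, 100)] * len(ingredients),
    method="highs",
)

if res.success:
    for name, grams in zip(ingredients, res.x):
        print(f"{name:12s} {grams:6.1f} g")
    print(f"cost per 100 g: ${res.fun:.3f}")
```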

  7. Overcoming Dietary Assessment Challenges in Low-Income Countries: Technological Solutions Proposed by the International Dietary Data Expansion (INDDEX) Project

    PubMed Central

    Coates, Jennifer C.; Colaiezzi, Brooke A.; Bell, Winnie; Charrondiere, U. Ruth; Leclercq, Catherine

    2017-01-01

    An increasing number of low-income countries (LICs) exhibit high rates of malnutrition coincident with rising rates of overweight and obesity. Individual-level dietary data are needed to inform effective responses, yet dietary data from large-scale surveys conducted in LICs remain extremely limited. This discussion paper first seeks to highlight the barriers to collection and use of individual-level dietary data in LICs. Second, it introduces readers to new technological developments and research initiatives to remedy this situation, led by the International Dietary Data Expansion (INDDEX) Project. Constraints to conducting large-scale dietary assessments include significant costs, time burden, technical complexity, and limited investment in dietary research infrastructure, including the necessary tools and databases required to collect individual-level dietary data in large surveys. To address existing bottlenecks, the INDDEX Project is developing a dietary assessment platform for LICs, called INDDEX24, consisting of a mobile application integrated with a web database application, which is expected to facilitate seamless data collection and processing. These tools will be subject to rigorous testing including feasibility, validation, and cost studies. To scale up dietary data collection and use in LICs, the INDDEX Project will also invest in food composition databases, an individual-level dietary data dissemination platform, and capacity development activities. Although the INDDEX Project activities are expected to improve the ability of researchers and policymakers in low-income countries to collect, process, and use dietary data, the global nutrition community is urged to commit further significant investments in order to adequately address the range and scope of challenges described in this paper. PMID:28300759

  8. Catalogue of UV sources in the Galaxy

    NASA Astrophysics Data System (ADS)

    Beitia-Antero, L.; Gómez de Castro, A. I.

    2017-03-01

    The Galaxy Evolution Explorer (GALEX) ultraviolet (UV) database contains the largest photometric catalogue in the ultraviolet range; as a result, the GALEX photometric bands, the near-UV band (NUV) and the far-UV band (FUV), have become standards. Nevertheless, the GALEX catalogue includes neither bright UV sources, owing to the high sensitivity of its detectors, nor sources in the Galactic plane. In order to extend the GALEX database for future UV missions, we have obtained synthetic FUV and NUV photometry using the database of UV spectra generated by the International Ultraviolet Explorer (IUE). This database contains 63,755 spectra in the low-dispersion mode (λ/δλ ˜ 300) obtained during its 18-year lifetime. For stellar sources in the IUE database, we have selected spectra with a high signal-to-noise ratio (SNR) and computed FUV and NUV magnitudes using the GALEX transmission curves along with the conversion equations between flux and magnitudes provided by the mission. In addition, we have performed variability tests to determine whether the sources were variable during the IUE observations. As a result, we have generated two different catalogues: one for non-variable stars and another for variable sources. The former contains FUV and NUV magnitudes, while the latter gives the basic information and the FUV magnitude for each observation. The consistency of the magnitudes has been tested using white dwarfs contained in both the GALEX and IUE samples. The catalogues are available through the Centre de Données Stellaires. The sources are distributed throughout the whole sky, with special coverage of the Galactic plane.
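
    To illustrate the synthetic-photometry step described above (folding a spectrum through a transmission curve and converting the band-averaged flux to a magnitude), here is a hedged Python sketch. The toy spectrum, the box-shaped response curve and the use of the AB zero point are illustrative assumptions, not the GALEX or IUE calibration.

```python
# Hedged sketch: band-averaged flux and an AB-style magnitude from a spectrum.
import numpy as np

def synthetic_ab_magnitude(wave_A, flux_flam, resp_wave_A, resp_curve):
    """Band-average f_lambda weighted by a photon-counting response curve and
    convert to an AB magnitude via f_nu at the band's effective wavelength."""
    resp = np.interp(wave_A, resp_wave_A, resp_curve, left=0.0, right=0.0)
    mean_flam = np.trapz(flux_flam * resp * wave_A, wave_A) / np.trapz(resp * wave_A, wave_A)
    lam_eff = np.trapz(resp * wave_A**2, wave_A) / np.trapz(resp * wave_A, wave_A)
    c_A_per_s = 2.99792458e18                      # speed of light in Angstrom/s
    f_nu = mean_flam * lam_eff**2 / c_A_per_s      # erg s^-1 cm^-2 Hz^-1
    return -2.5 * np.log10(f_nu) - 48.60           # AB zero point

# Toy flat spectrum over roughly the IUE range, and a box-shaped "NUV-like" response.
wave = np.linspace(1150.0, 3200.0, 2000)           # Angstrom
flux = np.full_like(wave, 1.0e-14)                 # erg s^-1 cm^-2 A^-1
resp_wave = np.array([1750.0, 1800.0, 2750.0, 2800.0])
resp = np.array([0.0, 1.0, 1.0, 0.0])

print(f"synthetic NUV-like AB magnitude: {synthetic_ab_magnitude(wave, flux, resp_wave, resp):.2f}")
```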

  9. BAO Plate Archive Project

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Gigoyan, K. S.; Gyulzadyan, M. V.; Paronyan, G. M.; Abrahamyan, H. V.; Andreasyan, H. R.; Azatyan, N. M.; Kostandyan, G. R.; Samsonyan, A. L.; Mikayelyan, G. A.; Farmanyan, S. V.; Harutyunyan, V. L.

    2017-12-01

    We present the Byurakan Astrophysical Observatory (BAO) Plate Archive Project, which is aimed at the digitization, extraction and analysis of archival data and at building an electronic database and interactive sky map. The BAO Plate Archive consists of 37,500 photographic plates and films obtained with the 2.6m telescope, the 1m and 0.5m Schmidt telescopes and other smaller ones during 1947-1991. The 2000 plates of the famous Markarian Survey (or First Byurakan Survey, FBS) were digitized in 2002-2005 and the Digitized FBS (DFBS, www.aras.am/Dfbs/dfbs.html) was created. New science projects have been conducted based on this low-dispersion spectroscopic material. Several other smaller digitization projects have been carried out as well, such as part of the Second Byurakan Survey (SBS) plates, photographic chain plates in Coma, where the blazar ON 231 is located, and 2.6m film spectra of FBS blue stellar objects. However, most of the plates and films have not been digitized. In 2015, we started a project covering the digitization of the whole BAO Plate Archive, the creation of an electronic database and its scientific use. The Armenian Virtual Observatory (ArVO, www.aras.am/Arvo/arvo.htm) database will accommodate all new data. The project runs in collaboration with the Armenian Institute of Informatics and Automation Problems (IIAP) and will continue for 4 years, in 2015-2018. The final result will be an electronic database and an online interactive sky map to be used for further research projects. ArVO will provide all standards and tools for efficient usage of the scientific output and its integration into international databases.

  10. Observational constraints on the inter-binary stellar flare hypothesis for the gamma-ray bursts

    NASA Astrophysics Data System (ADS)

    Rao, A. R.; Vahia, M. N.

    1994-01-01

    The Gamma Ray Observatory/Burst and Transient Source Experiment (GRO/BATSE) results on gamma-ray bursts (GRBs) have given an internally consistent set of observations of about 260 GRBs, which have been released for analysis by the BATSE team. Using this database, we investigate our earlier suggestion (Vahia and Rao, 1988) that GRBs are inter-binary stellar flares from a group of objects classified as Magnetically Active Stellar Systems (MASS), which includes flare stars, RS CVn binaries and cataclysmic variables. We show that there exists an observationally consistent parameter space for the number density, scale height and flare luminosity of MASS that explains the complete log(N)-log(P) distribution of GRBs as well as the observed isotropic distribution. We further use this model to predict anisotropy in the GRB distribution at intermediate luminosities. We make definite predictions under the stellar flare hypothesis that can be tested in the near future.

  11. Using Spreadsheets and Internally Consistent Databases to Explore Thermodynamics

    NASA Astrophysics Data System (ADS)

    Dasgupta, S.; Chakraborty, S.

    2003-12-01

    Much common wisdom has been handed down to generations of petrology students in words - a non-exhaustive list may include (a) do not mix data from two different thermodynamic databases, (b) use of different heat capacity functions or extrapolation beyond the P-T range of fit can have disastrous results, (c) consideration of errors in thermodynamic calculations is crucial, (d) consideration of non-ideality, interaction parameters etc. are important in some cases, but not in others. Actual calculations to demonstrate these effects were either too laborious, tedious, time consuming or involved elaborate computer programming beyond the reaches of the average undergraduate. We have produced "Live" thermodynamic tables in the form of ExcelTM spreadsheets based on standard internally consistent thermodynamic databases (e.g. Berman, Holland and Powell) that allow quick, easy and most importantly, transparent manipulation of thermodynamic data to calculate mineral stabilities and to explore the role of different parameters. We have intentionally avoided the use of advanced tools such as macros, and have set up columns of data that are easy to relate to thermodynamic relationships to enhance transparency. The approach consists of the following basic steps: (i) use a simple supporting spreadsheet to enter mineral compositions (in formula units) to obtain a balanced reaction by matrix inversion. (ii) enter the stoichiometry of this reaction in a designated space and a P and T to get the delta G of the reaction (iii) vary P and or T to locate equilibrium through a change of sign of delta G. These results can be collected to explore practically any problem of chemical equilibrium and mineral stability. Some of our favorites include (a) hierarchical addition of complexity to equilibrium calculations - start with a simple end member reaction ignoring heat capacity and volume derivatives, add the effects of these, followed by addition of compositional effects in the form of ideal solutions, add non-ideality next and finally, explore the role of varying parameters in simple models of non-ideality. (b) Arbitrarily change (i.e. simulate error) or mix data from different sources to see the consequences directly. More traditional exercises such as exploration of slopes of reaction in P-T space are trivial, and other thermodynamic tidbits such as "bigger the mineral formula, greater its thermodynamic weight" become apparent to undergraduates early on through such direct handling of data. The overall outcome is a far more quantitative appreciation of mineral stabilities and thermodynamic variables without actually doing any Math!
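
    The spreadsheet workflow described above (enter the reaction stoichiometry, compute delta G at a chosen P and T, then vary P or T until delta G changes sign) can be sketched in a few lines of Python. The reaction properties below are illustrative placeholders rather than values from the Berman or Holland and Powell databases.

```python
# Hedged sketch of the simplest spreadsheet column: delta-G of an end-member reaction
# with constant dH, dS, dV (heat-capacity and volume derivatives ignored), and a
# sign-change search for the equilibrium temperature along an isobar.
import numpy as np

# Reaction properties at the reference state (assumed, not database values):
dH = 85000.0     # J/mol
dS = 170.0       # J/(mol K)
dV = 2.0e-5      # m^3/mol

def delta_G(P_bar, T_K):
    """delta-G of reaction, simplest approximation: dH - T*dS + dV*(P - Pref)."""
    P_Pa, P_ref = P_bar * 1.0e5, 1.0e5
    return dH - T_K * dS + dV * (P_Pa - P_ref)

# Step (iii) of the workflow: vary T at fixed P until delta-G changes sign.
P = 5000.0  # bar
T_grid = np.arange(300.0, 1500.0, 1.0)
G = delta_G(P, T_grid)
idx = np.where(np.sign(G[:-1]) != np.sign(G[1:]))[0]
if idx.size:
    print(f"equilibrium near T = {T_grid[idx[0]]:.0f} K at P = {P:.0f} bar")
```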

  12. Network-based statistical comparison of citation topology of bibliographic databases

    PubMed Central

    Šubelj, Lovro; Fiala, Dalibor; Bajec, Marko

    2014-01-01

    Modern bibliographic databases provide the basis for scientific research and its evaluation. While their content and structure differ substantially, there exist only informal notions on their reliability. Here we compare the topological consistency of citation networks extracted from six popular bibliographic databases including Web of Science, CiteSeer and arXiv.org. The networks are assessed through a rich set of local and global graph statistics. We first reveal statistically significant inconsistencies between some of the databases with respect to individual statistics. For example, the introduced field bow-tie decomposition of DBLP Computer Science Bibliography substantially differs from the rest due to the coverage of the database, while the citation information within arXiv.org is the most exhaustive. Finally, we compare the databases over multiple graph statistics using the critical difference diagram. The citation topology of DBLP Computer Science Bibliography is the least consistent with the rest, while, not surprisingly, Web of Science is significantly more reliable from the perspective of consistency. This work can serve either as a reference for scholars in bibliometrics and scientometrics or a scientific evaluation guideline for governments and research agencies. PMID:25263231
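
    A minimal sketch of the kind of graph statistics used in such a comparison is given below, computed with networkx on a small random directed graph standing in for a citation network; the statistics shown (degree, clustering, component size) are generic examples, not the exact measure set of the study.

```python
# Hedged sketch: a few local and global statistics of a citation-style directed graph.
import networkx as nx

# Stand-in for a citation network: nodes are papers, edges point from citing to cited.
G = nx.gnp_random_graph(500, 0.01, seed=42, directed=True)

stats = {
    "nodes": G.number_of_nodes(),
    "edges": G.number_of_edges(),
    "mean in-degree": sum(d for _, d in G.in_degree()) / G.number_of_nodes(),
    "clustering (undirected)": nx.average_clustering(G.to_undirected()),
    "largest weakly connected comp.": len(max(nx.weakly_connected_components(G), key=len)),
}
for name, value in stats.items():
    if isinstance(value, float):
        print(f"{name:32s} {value:.3f}")
    else:
        print(f"{name:32s} {value}")
```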

  13. Aviation Safety Issues Database

    NASA Technical Reports Server (NTRS)

    Morello, Samuel A.; Ricks, Wendell R.

    2009-01-01

    The aviation safety issues database was instrumental in the refinement and substantiation of the National Aviation Safety Strategic Plan (NASSP). The issues database is a comprehensive set of issues from an extremely broad base of aviation functions, personnel, and vehicle categories, both national and international. Several aviation safety stakeholders, such as the Commercial Aviation Safety Team (CAST), have already used the database. This broader interest was the genesis for making the database publicly accessible and for writing this report.

  14. The CIS Database: Occupational Health and Safety Information Online.

    ERIC Educational Resources Information Center

    Siegel, Herbert; Scurr, Erica

    1985-01-01

    Describes document acquisition, selection, indexing, and abstracting and discusses online searching of the CIS database, an online system produced by the International Occupational Safety and Health Information Centre. This database comprehensively covers information in the field of occupational health and safety. Sample searches and search…

  15. Database Development for Electrical, Electronic, and Electromechanical (EEE) Parts for the International Space Station Alpha

    NASA Technical Reports Server (NTRS)

    Wassil-Grimm, Andrew D.

    1997-01-01

    More effective electronic communication processes are needed to transfer contractor and international partner data into NASA and prime contractor baseline database systems. It is estimated that the International Space Station Alpha (ISSA) parts database will contain up to one million parts, each of which may require approximately one thousand bytes of data. The resulting gigabyte database must provide easy access to users who will be preparing multiple analyses and reports in order to verify as-designed, as-built, launch, on-orbit, and return configurations for up to 45 missions associated with the construction of the ISSA. Additionally, Internet access to this database is strongly indicated to allow multiple-user access from clients located in many foreign countries. This summer's project involved familiarization with and evaluation of the ISSA Electrical, Electronic, and Electromechanical (EEE) parts data and the process of electronically managing these data. Particular attention was devoted to improving the interfaces among the many elements of the ISSA information system and its global customers and suppliers. Additionally, prototype queries were developed to facilitate the identification of data changes in the database, verification that the designs used only approved parts, and certification that the flight hardware containing EEE parts was ready for flight. This project also resulted in specific recommendations to NASA for further development in the area of EEE parts database development and usage.

  16. [Analysis of relation between the development of study and literatures about benign positional paroxysmal vertigo published international and domestic].

    PubMed

    Jia, Jianping; Sun, Xiaohui; Dai, Song; Sang, Yuehong

    2016-01-01

    Benign paroxysmal positional vertigo (BPPV) is a common vestibular disorder that causes vertigo, and research on it has progressed rapidly in recent years. To analyse this growth, we searched the international literature published before 2014, year by year, in the PubMed, ScienceDirect and WILEY databases, and the domestic literature published before 2015 in the CNKI, VIP and Wanfang Data databases, using "benign paroxysmal positional vertigo" as the keyword. We then performed regression analysis on the retrieved counts to determine the pattern of data growth and the main factors affecting the future development of BPPV research, and analysed the BPPV papers published in domestic and international journals. PubMed contained 808 articles, ScienceDirect 177 and WILEY 46, for a total of 1,038 international articles; CNKI contained 440 articles, VIP 580 and Wanfang Data 449, for a total of 1,469 domestic articles. The cumulative volume of the BPPV literature shows a rising trend, and the scatter plot follows an exponential curve, growing slowly at first and rapidly in recent years. The international literature suggests three stages in the development of BPPV research: an exploration period (before 1985), a breakthrough period (1986-1998) and a deepening stage (after 1998). The Chinese literature likewise shows three stages: a blank period (before 1982), an enlightenment period (1982-2004) and a deepening stage (after 2004). Throughout this progression, many outstanding scholars have played an important role in domestic research and have had a worldwide influence.

  17. Adding glycaemic index and glycaemic load functionality to DietPLUS, a Malaysian food composition database and diet intake calculator.

    PubMed

    Shyam, Sangeetha; Wai, Tony Ng Kock; Arshad, Fatimah

    2012-01-01

    This paper outlines the methodology used to add glycaemic index (GI) and glycaemic load (GL) functionality to DietPLUS, a Microsoft Excel-based Malaysian food composition database and diet intake calculator. Locally determined GI values and published international GI databases were used as the source of GI values. Previously published methodology for GI value assignment was modified to add GI and GL calculators to the database. Two popular local low-GI foods were added to the DietPLUS database, bringing the total number of foods in the database to 838. Overall, of the 539 major carbohydrate foods in the Malaysian Food Composition Database, 243 (45%) food items had local Malaysian values or were directly matched to the international GI database, and another 180 (33%) were linked to closely related foods in the GI databases used. The mean ± SD dietary GI and GL of the dietary intake of 63 women with previous gestational diabetes mellitus, calculated using DietPLUS version 3, were 62 ± 6 and 142 ± 45, respectively. These values were comparable to those reported in other local studies. DietPLUS version 3, a simple Microsoft Excel-based programme, aids the calculation of dietary GI and GL for Malaysian diets based on food records.
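
    The GI/GL arithmetic that such a calculator performs can be illustrated with a short Python sketch: the glycaemic load of each food is its GI times its available carbohydrate divided by 100, and the dietary GI is the carbohydrate-weighted mean of the food GIs. The food items and GI values below are invented for illustration and are not taken from the DietPLUS database.

```python
# Hedged sketch of per-meal GL and carbohydrate-weighted dietary GI; values are assumed.
foods = [
    # (name, GI, available carbohydrate per portion in g)
    ("white rice", 73, 45.0),
    ("dhal",       29, 12.0),
    ("banana",     51, 23.0),
]

total_carb = sum(carb for _, _, carb in foods)
meal_gl = sum(gi * carb / 100.0 for _, gi, carb in foods)          # sum of per-food GLs
dietary_gi = sum(gi * carb for _, gi, carb in foods) / total_carb  # carb-weighted mean GI

print(f"meal glycaemic load: {meal_gl:.1f}")
print(f"dietary glycaemic index: {dietary_gi:.1f}")
```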

  18. The epidemiological modelling of dysthymia: application for the Global Burden of Disease Study 2010.

    PubMed

    Charlson, Fiona J; Ferrari, Alize J; Flaxman, Abraham D; Whiteford, Harvey A

    2013-10-01

    In order to capture the differences in burden between the subtypes of depression, the Global Burden of Disease 2010 Study for the first time estimated the burden of dysthymia and major depressive disorder separately from the previously used umbrella term 'unipolar depression'. A global summary of epidemiological parameters is a necessary input to burden of disease calculations for 21 world regions, for males and females, and for the years 1990, 2005 and 2010. This paper reports findings from a systematic review of global epidemiological data and the subsequent development of an internally consistent epidemiological model of dysthymia. A systematic search was conducted to identify data sources for the prevalence, incidence, remission and excess mortality of dysthymia using the Medline, PsycINFO and EMBASE electronic databases and grey literature. DisMod-MR, a Bayesian meta-regression tool, was used to check the epidemiological parameters for internal consistency and to predict estimates for world regions with no or few data. The systematic review identified 38 studies meeting the inclusion criteria, which provided 147 data points for 30 countries in 13 of 21 world regions. Prevalence increases in the early ages, peaking at around 50 years. Females have a higher prevalence of dysthymia than males. Global pooled prevalence remained constant across time points at 1.55% (95%CI 1.50-1.60). There was very little regional variation in prevalence estimates. There were eight GBD world regions for which we found no data and for which DisMod-MR had to impute estimates. The addition of internally consistent epidemiological estimates by world region, age, sex and year for dysthymia contributed to a more comprehensive estimate of mental health burden in GBD 2010. © 2013 Elsevier B.V. All rights reserved.

  19. [Comparing different treatments for femoral neck fracture of displacement type in the elderly:a meta analysis].

    PubMed

    Zhao, Wenbo; Tu, Chongqi; Zhang, Hui; Fang, Yue; Wang, Guanglin; Liu, Lei

    2014-04-01

    To compare the effectiveness and safety of internal fixation and total hip arthroplasty in elderly patients with displaced femoral neck fracture through a meta-analysis. Studies comparing internal fixation with total hip arthroplasty in elderly patients with displaced femoral neck fracture were identified from the PubMed, EMBase, Cochrane Library, CMB, CNKI and MEDLINE databases. Data analysis was performed using RevMan 5.2.6 (the Cochrane Collaboration). Six published randomized controlled trials including 627 patients were suitable for the review: 286 cases in the internal fixation group and 341 cases in the total hip arthroplasty group. The meta-analysis indicated statistically significant differences between the two groups in postoperative quality of life as reflected by the Harris scale (RR = 0.82, 95%CI: 0.72-0.93, P < 0.05), the reoperation rate (RR = 5.81, 95%CI: 3.09-10.95, P < 0.05) and the rate of major complications (RR = 3.60, 95%CI: 2.29-5.67, P < 0.05). There was no difference in mortality at 1 and 5 years postoperatively (P > 0.05). For elderly patients with displaced femoral neck fracture, there is no statistically significant difference in postoperative mortality between the two groups, but postoperative quality of life and the safety of the operation are worse in the internal fixation group than in the total hip arthroplasty group.
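
    For readers unfamiliar with the underlying arithmetic, the Python sketch below shows how per-trial risk ratios and a fixed-effect inverse-variance pooled estimate with a 95% confidence interval are computed; the event counts are invented placeholders and do not reproduce the trials included in this meta-analysis.

```python
# Hedged sketch of risk-ratio pooling (fixed effect, inverse variance); counts are assumed.
import math

# (events_fixation, n_fixation, events_arthroplasty, n_arthroplasty) per trial
trials = [(30, 100, 8, 110), (25, 90, 6, 115), (18, 96, 5, 116)]

log_rrs, weights = [], []
for a, n1, c, n2 in trials:
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)   # standard error of log(RR)
    log_rrs.append(math.log(rr))
    weights.append(1 / se**2)

pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))
lo, hi = math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled)
print(f"pooled RR = {math.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```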

  20. Measurement properties of patient-reported outcome measures (PROMs) used in adult patients with chronic kidney disease: A systematic review

    PubMed Central

    Kyte, Derek; Cockwell, Paul; Marshall, Tom; Gheorghe, Adrian; Keeley, Thomas; Slade, Anita; Calvert, Melanie

    2017-01-01

    Background Patient-reported outcome measures (PROMs) can provide valuable information which may assist with the care of patients with chronic kidney disease (CKD). However, given the large number of measures available, it is unclear which PROMs are suitable for use in research or clinical practice. To address this we comprehensively evaluated studies that assessed the measurement properties of PROMs in adults with CKD. Methods Four databases were searched; reference list and citation searching of included studies was also conducted. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist was used to appraise the methodological quality of the included studies and to inform a best evidence synthesis for each PROM. Results The search strategy retrieved 3,702 titles/abstracts. After 288 duplicates were removed, 3,414 abstracts were screened and 71 full-text articles were retrieved for further review. Of these, 24 full-text articles were excluded as they did not meet the eligibility criteria. Following reference list and citation searching, 19 articles were retrieved bringing the total number of papers included in the final analysis to 66. There was strong evidence supporting internal consistency and moderate evidence supporting construct validity for the Kidney Disease Quality of Life-36 (KDQOL-36) in pre-dialysis patients. In the dialysis population, the KDQOL-Short Form (KDQOL-SF) had strong evidence for internal consistency and structural validity and moderate evidence for test-retest reliability and construct validity while the KDQOL-36 had moderate evidence of internal consistency, test-retest reliability and construct validity. The End Stage Renal Disease-Symptom Checklist Transplantation Module (ESRD-SCLTM) demonstrated strong evidence for internal consistency and moderate evidence for test-retest reliability, structural and construct validity in renal transplant recipients. Conclusions We suggest considering the KDQOL-36 for use in pre-dialysis patients; the KDQOL-SF or KDQOL-36 for dialysis patients and the ESRD-SCLTM for use in transplant recipients. However, further research is required to evaluate the measurement error, structural validity, responsiveness and patient acceptability of PROMs used in CKD. PMID:28636678

  1. New design and facilities for the International Database for Absolute Gravity Measurements (AGrav): A support for the Establishment of a new Global Absolute Gravity Reference System

    NASA Astrophysics Data System (ADS)

    Wziontek, Hartmut; Falk, Reinhard; Bonvalot, Sylvain; Rülke, Axel

    2017-04-01

    After about 10 years of successful joint operation by BGI and BKG, the International Database for Absolute Gravity Measurements "AGrav" (see references below) has undergone a major revision. The outdated web interface was replaced by a responsive, high-level web application framework based on Python and built on top of Pyramid. Functionality was added, such as interactive time series plots and a report generator, and the interactive map-based station overview was completely updated, now comprising clustering and the classification of stations. Furthermore, the database backend was migrated to PostgreSQL for better support of the application framework and long-term availability. As comparisons of absolute gravimeters (AGs) become essential to realizing a precise and uniform gravity standard, the database was extended to document the results at international and regional level, including those performed at monitoring stations equipped with superconducting gravimeters (SGs). This makes it possible to link different AGs and to trace their equivalence back to the key comparisons under the auspices of the International Committee for Weights and Measures (CIPM) as the best metrological realization of the absolute gravity standard. In this way the new AGrav database accommodates the demands of the new Global Absolute Gravity Reference System as recommended by IAG Resolution No. 2 adopted in Prague in 2015. The new database will be presented with a focus on the new user interface and new functionality, and all institutions involved in absolute gravimetry are called on to participate and contribute their information in order to build up the most complete picture of high-precision absolute gravimetry and improve its visibility. A Digital Object Identifier (DOI) will be provided by BGI to contributors to give better traceability and to facilitate the referencing of their gravity surveys. Links and references: BGI mirror site: http://bgi.obs-mip.fr/data-products/Gravity-Databases/Absolute-Gravity-data/ BKG mirror site: http://agrav.bkg.bund.de/agrav-meta/ Wilmes, H., Wziontek, H., Falk, R., Bonvalot, S. (2009). AGrav - the New Absolute Gravity Database and a Proposed Cooperation with the GGP Project. J. of Geodynamics, 48, pp. 305-309. doi:10.1016/j.jog.2009.09.035. Wziontek, H., Wilmes, H., Bonvalot, S. (2011). AGrav: An international database for absolute gravity measurements. In Geodesy for Planet Earth (S. Kenyon et al., eds). IAG Symposia, 136, 1035-1040, Springer, Berlin. doi:10.1007/978-3-642-20338-1_130.
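
    As a hedged illustration of the kind of high-level Python framework mentioned above, the following minimal Pyramid application exposes a JSON list of stations; the route name, station fields and data are placeholders and do not reflect the actual AGrav schema or code.

```python
# Hedged sketch: a minimal Pyramid app serving a JSON station list (placeholder data).
from wsgiref.simple_server import make_server
from pyramid.config import Configurator

# Stand-in for rows that would come from the PostgreSQL backend.
STATIONS = [
    {"id": "WETT", "name": "Wettzell", "lat": 49.144, "lon": 12.879},
    {"id": "BADH", "name": "Bad Homburg", "lat": 50.229, "lon": 8.611},
]

def list_stations(request):
    return STATIONS  # rendered as JSON by the 'json' renderer

if __name__ == "__main__":
    with Configurator() as config:
        config.add_route("stations", "/stations")
        config.add_view(list_stations, route_name="stations", renderer="json")
        app = config.make_wsgi_app()
    make_server("0.0.0.0", 6543, app).serve_forever()
```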

  2. REPDOSE: A database on repeated dose toxicity studies of commercial chemicals--A multifunctional tool.

    PubMed

    Bitsch, A; Jacobi, S; Melber, C; Wahnschaffe, U; Simetska, N; Mangelsdorf, I

    2006-12-01

    A database for repeated dose toxicity data has been developed. Studies were selected on the basis of data quality, and review documents or risk assessments were used to obtain a pre-screened selection of available valid data. The structures of the chemicals were kept rather simple so that they fall into well-defined chemical categories. The database consists of three core data sets for each chemical: (1) structural features and physico-chemical data, (2) data on study design, and (3) study results. To allow consistent queries, a high degree of standardization was applied: categories and glossaries were developed for the relevant parameters. At present, the database contains 364 chemicals investigated in 1018 studies, which resulted in a total of 6002 specific effects. Standard queries have been developed that allow analysis of the influence of structural features or physico-chemical data on LOELs, target organs and effects. Furthermore, the database can be used as an expert system. First queries have shown that it is a very valuable tool.

  3. Market Pressure and Government Intervention in the Administration and Development of Molecular Databases.

    ERIC Educational Resources Information Center

    Sillince, J. A. A.; Sillince, M.

    1993-01-01

    Discusses molecular databases and the role that government and private companies play in their administration and development. Highlights include copyright and patent issues relating to public databases and the information contained in them; data quality; data structures and technological questions; the international organization of molecular…

  4. ExplorEnz: the primary source of the IUBMB enzyme list

    PubMed Central

    McDonald, Andrew G.; Boyce, Sinéad; Tipton, Keith F.

    2009-01-01

    ExplorEnz is the MySQL database that is used for the curation and dissemination of the International Union of Biochemistry and Molecular Biology (IUBMB) Enzyme Nomenclature. A simple web-based query interface is provided, along with an advanced search engine for more complex Boolean queries. The WWW front-end is accessible at http://www.enzyme-database.org, from where downloads of the database as SQL and XML are also available. An associated form-based curatorial application has been developed to facilitate the curation of enzyme data as well as the internal and public review processes that occur before an enzyme entry is made official. Suggestions for new enzyme entries, or modifications to existing ones, can be made using the forms provided at http://www.enzyme-database.org/forms.php. PMID:18776214

  5. The International Outer Planets Watch atmospheres node database of giant-planet images

    NASA Astrophysics Data System (ADS)

    Hueso, R.; Legarreta, J.; Sánchez-Lavega, A.; Rojas, J. F.; Gómez-Forrellad, J. M.

    2011-10-01

    The Atmospheres Node of the International Outer Planets Watch (IOPW) aims to encourage the observation and study of the atmospheres of the giant planets. One of its main activities is to provide interaction between the professional and amateur astronomical communities by maintaining an online, fully searchable database of images of the giant planets obtained by amateur astronomers and available to both professionals and amateurs [1]. The IOPW database contains about 13,000 image observations of Jupiter and Saturn obtained in the visible range, with a few contributions for Uranus and Neptune. We describe the organization and structure of the database as posted on the Internet and, in particular, the PVOL (Planetary Virtual Observatory & Laboratory) software designed to manage the site and based on concepts from Virtual Observatory projects.

  6. Defining traumatic brain injury in children and youth using international classification of diseases version 10 codes: a systematic review protocol.

    PubMed

    Chan, Vincy; Thurairajah, Pravheen; Colantonio, Angela

    2013-11-13

    Although healthcare administrative data are commonly used for traumatic brain injury research, there is currently no consensus or consistency on using the International Classification of Diseases version 10 codes to define traumatic brain injury among children and youth. This protocol is for a systematic review of the literature to explore the range of International Classification of Diseases version 10 codes that are used to define traumatic brain injury in this population. The databases MEDLINE, MEDLINE In-Process, Embase, PsychINFO, CINAHL, SPORTDiscus, and Cochrane Database of Systematic Reviews will be systematically searched. Grey literature will be searched using Grey Matters and Google. Reference lists of included articles will also be searched. Articles will be screened using predefined inclusion and exclusion criteria and all full-text articles that meet the predefined inclusion criteria will be included for analysis. The study selection process and reasons for exclusion at the full-text level will be presented using a PRISMA study flow diagram. Information on the data source of included studies, year and location of study, age of study population, range of incidence, and study purpose will be abstracted into a separate table and synthesized for analysis. All International Classification of Diseases version 10 codes will be listed in tables and the codes that are used to define concussion, acquired traumatic brain injury, head injury, or head trauma will be identified. The identification of the optimal International Classification of Diseases version 10 codes to define this population in administrative data is crucial, as it has implications for policy, resource allocation, planning of healthcare services, and prevention strategies. It also allows for comparisons across countries and studies. This protocol is for a review that identifies the range and most common diagnoses used to conduct surveillance for traumatic brain injury in children and youth. This is an important first step in reaching an appropriate definition using International Classification of Diseases version 10 codes and can inform future work on reaching consensus on the codes to define traumatic brain injury for this vulnerable population.

  7. Database of Standardized Questionnaires About Walking & Bicycling

    Cancer.gov

    This database contains questionnaire items and a list of validation studies for standardized items related to walking and biking. The items come from multiple national and international physical activity questionnaires.

  8. NREL: U.S. Life Cycle Inventory Database - About the LCI Database Project

    Science.gov Websites

    About the LCI Database Project The U.S. Life Cycle Inventory (LCI) Database is a publicly available data collection and analysis methods. Finding consistent and transparent LCI data for life cycle and maintain the database. The 2009 U.S. Life Cycle Inventory (LCI) Data Stakeholder meeting was an

  9. P43-S Computational Biology Applications Suite for High-Performance Computing (BioHPC.net)

    PubMed Central

    Pillardy, J.

    2007-01-01

    One of the challenges of high-performance computing (HPC) is user accessibility. At the Cornell University Computational Biology Service Unit, which is also a Microsoft HPC institute, we have developed a computational biology application suite that allows researchers from biological laboratories to submit their jobs to the parallel cluster through an easy-to-use Web interface. Through this system, we are providing users with popular bioinformatics tools including BLAST, HMMER, InterproScan, and MrBayes. The system is flexible and can be easily customized to include other software. It is also scalable; the installation on our servers currently processes approximately 8500 job submissions per year, many of them requiring massively parallel computations. It also has a built-in user management system, which can limit software and/or database access to specified users. TAIR, the major database of the plant model organism Arabidopsis, and SGN, the international tomato genome database, are both using our system for storage and data analysis. The system consists of a Web server running the interface (ASP.NET C#), Microsoft SQL server (ADO.NET), compute cluster running Microsoft Windows, ftp server, and file server. Users can interact with their jobs and data via a Web browser, ftp, or e-mail. The interface is accessible at http://cbsuapps.tc.cornell.edu/.

  10. Acupuncture for chronic prostatitis: A systematic review and meta-analysis protocol.

    PubMed

    Peng, Tianzhong; Cheng, Ying; Jin, Yuhao; Xu, Na; Guo, Taipin

    2018-04-01

    Chronic prostatitis (CP) is a prevalent genitourinary condition. Considering its safety profile, acupuncture can be an option for treating CP symptoms. The aim of this work is to undertake a systematic review to estimate the effectiveness and safety of acupuncture for CP. In August 2018, we will search for all randomized controlled trials on CP in MEDLINE, the Cochrane Library, Web of Science, EMBASE, Springer, the WHO International Clinical Trials Registry Platform (ICTRP), the China National Knowledge Infrastructure (CNKI), Wanfang, the Chinese Biomedical Literature Database (CBM), PsycInfo, the Chinese Scientific Journal Database (VIP), and other available resources. Languages will be limited to English and Chinese. The search terms will be "acupuncture," "chronic prostatitis," "non-bacterial prostatitis," and "abacterial prostatitis," and duplicates will be screened out. The primary outcomes consist of the improvement rate and pain relief evaluated by the National Institutes of Health Chronic Prostatitis Symptom Index (NIH-CPSI). Secondary outcomes include the recurrence rate and side effects, such as pneumothorax, discomfort, and infection. This study will provide an evidence-based review and clear evidence to assess the effectiveness and side effects of acupuncture for chronic prostatitis. No ethical approval is required, and the results will be disseminated in print or by electronic copies. CRD42018088834.

  11. Modeling and Prediction of Solvent Effect on Human Skin Permeability using Support Vector Regression and Random Forest.

    PubMed

    Baba, Hiromi; Takahara, Jun-ichi; Yamashita, Fumiyoshi; Hashida, Mitsuru

    2015-11-01

    The solvent effect on skin permeability is important for assessing the effectiveness and toxicological risk of new dermatological formulations in pharmaceuticals and cosmetics development. The solvent effect occurs by diverse mechanisms, which could be elucidated by efficient and reliable prediction models. However, such prediction models have been hampered by the small variety of permeants and mixture components archived in databases and by low predictive performance. Here, we propose a solution to both problems. We first compiled a novel large database of 412 samples from 261 structurally diverse permeants and 31 solvents reported in the literature. The data were carefully screened to ensure their collection under consistent experimental conditions. To construct a high-performance predictive model, we then applied support vector regression (SVR) and random forest (RF) with greedy stepwise descriptor selection to our database. The models were internally and externally validated. The SVR achieved higher performance statistics than RF. The (externally validated) determination coefficient, root mean square error, and mean absolute error of SVR were 0.899, 0.351, and 0.268, respectively. Moreover, because all descriptors are fully computational, our method can predict as-yet unsynthesized compounds. Our high-performance prediction model offers an attractive alternative to permeability experiments for pharmaceutical and cosmetic candidate screening and optimizing skin-permeable topical formulations.
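
    A minimal sketch of the modelling setup described above (support vector regression with internal cross-validation and an external held-out test set, reporting R², RMSE and MAE) is given below using scikit-learn; the descriptor matrix is random placeholder data, not the curated permeability database.

```python
# Hedged sketch: SVR with internal cross-validation and external hold-out evaluation.
import numpy as np
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(412, 20))                                  # 412 samples x 20 toy descriptors
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=412)   # synthetic target (e.g., log kp)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
grid = GridSearchCV(model, {"svr__C": [1, 10, 100], "svr__epsilon": [0.01, 0.1]}, cv=5)
grid.fit(X_train, y_train)                                      # internal validation

y_pred = grid.predict(X_test)                                   # external validation
print(f"R^2  = {r2_score(y_test, y_pred):.3f}")
print(f"RMSE = {np.sqrt(mean_squared_error(y_test, y_pred)):.3f}")
print(f"MAE  = {mean_absolute_error(y_test, y_pred):.3f}")
```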

  12. Three Library and Information Science Databases Revisited: Currency, Coverage and Overlap, Interindexing Consistency.

    ERIC Educational Resources Information Center

    Blackwell, Michael Lind

    This study evaluates the "Education Resources Information Center" (ERIC), "Library and Information Science Abstracts" (LISA), and "Library Literature" (LL) databases, determining how long the databases take to enter records (indexing delay), how much duplication of effort exists among the three databases (indexing…

  13. Current Abstracts Nuclear Reactors and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bales, J.D.; Hicks, S.C.

    1993-01-01

    This publication, Nuclear Reactors and Technology (NRT), announces on a monthly basis the current worldwide information available from the open literature on nuclear reactors and technology, including all aspects of power reactors, components and accessories, fuel elements, control systems, and materials. This publication contains the abstracts of DOE reports, journal articles, conference papers, patents, theses, and monographs added to the Energy Science and Technology Database during the past month. Also included are US information obtained through acquisition programs or interagency agreements and international information obtained through the International Energy Agency's Energy Technology Data Exchange or government-to-government agreements. The digests in NRT and other citations to information on nuclear reactors back to 1948 are available for online searching and retrieval on the Energy Science and Technology Database and Nuclear Science Abstracts (NSA) database. Current information, added daily to the Energy Science and Technology Database, is available to DOE and its contractors through the DOE Integrated Technical Information System. Customized profiles can be developed to provide current information to meet each user's needs.

  14. Interrater agreement in visual scoring of neonatal seizures based on majority voting on a web-based system: The Neoguard EEG database.

    PubMed

    Dereymaeker, Anneleen; Ansari, Amir H; Jansen, Katrien; Cherian, Perumpillichira J; Vervisch, Jan; Govaert, Paul; De Wispelaere, Leen; Dielman, Charlotte; Matic, Vladimir; Dorado, Alexander Caicedo; De Vos, Maarten; Van Huffel, Sabine; Naulaers, Gunnar

    2017-09-01

    To assess interrater agreement based on majority voting in the visual scoring of neonatal seizures. An online platform was designed based on a multicentre seizure EEG database. Consensus decisions were based on 'majority voting', and interrater agreement was estimated using Fleiss' kappa. The influence of different factors on agreement was determined. A total of 1919 events extracted from 280 h of EEG from 71 neonates were reviewed by 4 raters. Majority voting was applied to assign a seizure/non-seizure classification. 44% of events were classified with high, 36% with moderate, and 20% with poor agreement, resulting in a kappa value of 0.39. 68% of events were labelled as seizures, and in 46%, all raters were convinced the events were electrographic seizures. The most common seizure duration was <30 s, and raters agreed best for seizures lasting 60-120 s. There was a significant difference in the electrographic characteristics of seizures versus dubious events, with seizures having longer duration and higher power and amplitude. There is wide variability in identifying rhythmic ictal and non-ictal EEG events, and only the most robust ictal patterns are consistently agreed upon. Database composition and electrographic characteristics are important factors that influence interrater agreement. The use of well-described databases and the input of different experts will improve neonatal EEG interpretation and help to develop uniform seizure definitions, useful for evidence-based studies of seizure recognition and management. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
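
    The agreement statistic used above can be illustrated with a short Python sketch of Fleiss' kappa for a seizure/non-seizure rating task with four raters; the vote counts are toy data, not the Neoguard events.

```python
# Hedged sketch: Fleiss' kappa for N items rated by the same number of raters.
import numpy as np

def fleiss_kappa(counts):
    """counts: (N_items, K_categories) array of how many raters chose each category."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum(axis=1)[0]                       # raters per item (assumed constant)
    p_cat = counts.sum(axis=0) / counts.sum()       # overall category proportions
    P_i = np.sum(counts * (counts - 1), axis=1) / (n * (n - 1))   # per-item agreement
    P_bar, P_e = P_i.mean(), np.sum(p_cat ** 2)
    return (P_bar - P_e) / (1 - P_e)

# Toy matrix: each row is one EEG event, columns = [votes "seizure", votes "non-seizure"].
votes = [[4, 0], [4, 0], [3, 1], [2, 2], [0, 4], [1, 3], [4, 0], [2, 2]]
print(f"Fleiss' kappa = {fleiss_kappa(votes):.2f}")
```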

  15. Herpesvirus systematics

    PubMed Central

    Davison, Andrew J.

    2010-01-01

    This paper is about the taxonomy and genomics of herpesviruses. Each theme is presented as a digest of current information flanked by commentaries on past activities and future directions. The International Committee on Taxonomy of Viruses recently instituted a major update of herpesvirus classification. The former family Herpesviridae was elevated to a new order, the Herpesvirales, which now accommodates 3 families, 3 subfamilies, 17 genera and 90 species. Future developments will include revisiting the herpesvirus species definition and the criteria used for taxonomic assignment, particularly in regard to the possibilities of classifying the large number of herpesviruses detected only as DNA sequences by polymerase chain reaction. Nucleotide sequence accessions in primary databases, such as GenBank, consist of the sequences plus annotations of the genetic features. The quality of these accessions is important because they provide a knowledge base that is used widely by the research community. However, updating the accessions to take account of improved knowledge is essentially reserved to the original depositors, and this activity is rarely undertaken. Thus, the primary databases are likely to become antiquated. In contrast, secondary databases are open to curation by experts other than the original depositors, thus increasing the likelihood that they will remain up to date. One of the most promising secondary databases is RefSeq, which aims to furnish the best available annotations for complete genome sequences. Progress in regard to improving the RefSeq herpesvirus accessions is discussed, and insights into particular aspects of herpesvirus genomics arising from this work are reported. PMID:20346601

  16. Data Resource Profile: United Nations Children’s Fund (UNICEF)

    PubMed Central

    Murray, Colleen; Newby, Holly

    2012-01-01

    The United Nations Children’s Fund (UNICEF) plays a leading role in the collection, compilation, analysis and dissemination of data to inform sound policies, legislation and programmes for promoting children’s rights and well-being, and for global monitoring of progress towards the Millennium Development Goals. UNICEF maintains a set of global databases representing nearly 200 countries and covering the areas of child mortality, child health, maternal health, nutrition, immunization, water and sanitation, HIV/AIDS, education and child protection. These databases consist of internationally comparable and statistically sound data, and are updated annually through a process that draws on a wealth of data provided by UNICEF’s wide network of >150 field offices. The databases are composed primarily of estimates from household surveys, with data from censuses, administrative records, vital registration systems and statistical models contributing to some key indicators as well. The data are assessed for quality based on a set of objective criteria to ensure that only the most reliable nationally representative information is included. For most indicators, data are available at the global, regional and national levels, plus sub-national disaggregation by sex, urban/rural residence and household wealth. The global databases are featured in UNICEF’s flagship publications, inter-agency reports, including the Secretary General’s Millennium Development Goals Report and Countdown to 2015, sector-specific reports and statistical country profiles. They are also publicly available on www.childinfo.org, together with trend data and equity analyses. PMID:23211414

  17. Glycemic Index Diet: What's Behind the Claims

    MedlinePlus

    ... choices for people with diabetes. An international GI database is maintained by Sydney University Glycemic Index Research Services in Sydney, Australia. The database contains the results of studies conducted there and ...

  18. Producing a Climate-Quality Database of Global Upper Ocean Profile Temperatures - The IQuOD (International Quality-controlled Ocean Database) Project.

    NASA Astrophysics Data System (ADS)

    Sprintall, J.; Cowley, R.; Palmer, M. D.; Domingues, C. M.; Suzuki, T.; Ishii, M.; Boyer, T.; Goni, G. J.; Gouretski, V. V.; Macdonald, A. M.; Thresher, A.; Good, S. A.; Diggs, S. C.

    2016-02-01

    Historical ocean temperature profile observations provide a critical element for a host of ocean and climate research activities. These include providing initial conditions for seasonal-to-decadal prediction systems, evaluating past variations in sea level and Earth's energy imbalance, ocean state estimation for studying variability and change, and climate model evaluation and development. The International Quality controlled Ocean Database (IQuOD) initiative represents a community effort to create the most globally complete temperature profile dataset, with (intelligent) metadata and assigned uncertainties. With an internationally coordinated effort organized by oceanographers, with data and ocean instrumentation expertise, and in close consultation with end users (e.g., climate modelers), the IQuOD initiative will assess and maximize the potential of an irreplaceable collection of ocean temperature observations (tens of millions of profiles collected at a cost of tens of billions of dollars, since 1772) to fulfil the demand for a climate-quality global database that can be used with greater confidence in a vast range of climate change related research and services of societal benefit. Progress towards version 1 of the IQuOD database, ongoing and future work will be presented. More information on IQuOD is available at www.iquod.org.

  19. Adaptive plasticity in speech perception: Effects of external information and internal predictions.

    PubMed

    Guediche, Sara; Fiez, Julie A; Holt, Lori L

    2016-07-01

    When listeners encounter speech under adverse listening conditions, adaptive adjustments in perception can improve comprehension over time. In some cases, these adaptive changes require the presence of external information that disambiguates the distorted speech signals, whereas in other cases mere exposure is sufficient. Both external (e.g., written feedback) and internal (e.g., prior word knowledge) sources of information can be used to generate predictions about the correct mapping of a distorted speech signal. We hypothesize that these predictions provide a basis for determining the discrepancy between the expected and actual speech signal that can be used to guide adaptive changes in perception. This study provides the first empirical investigation that manipulates external and internal factors through (a) the availability of explicit external disambiguating information via the presence or absence of postresponse orthographic information paired with a repetition of the degraded stimulus, and (b) the accuracy of internally generated predictions; an acoustic distortion is introduced either abruptly or incrementally. The results demonstrate that the impact of external information on adaptive plasticity is contingent upon whether the intelligibility of the stimuli permits accurate internally generated predictions during exposure. External information sources enhance adaptive plasticity only when input signals are severely degraded and cannot reliably access internal predictions. This is consistent with a computational framework for adaptive plasticity in which error-driven supervised learning relies on the ability to compute sensory prediction error signals from both internal and external sources of information. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  20. FIREMON Database

    Treesearch

    John F. Caratti

    2006-01-01

    The FIREMON database software allows users to enter, store, analyze, and summarize plot data, photos, and related documents. The FIREMON database software consists of a Java application and a Microsoft® Access database. The Java application provides the user interface with FIREMON data through data entry forms, data summary reports, and other data management tools...

  1. Hypersonic and Supersonic Flow Roadmaps Using Bibliometrics and Database Tomography.

    ERIC Educational Resources Information Center

    Kostoff, R. N.; Eberhart, Henry J.; Toothman, Darrell Ray

    1999-01-01

    Database Tomography (DT) is a textual database-analysis system consisting of algorithms for extracting multiword phrase frequencies and proximities from a large textual database, to augment interpretative capabilities of the expert human analyst. Describes use of the DT process, supplemented by literature bibliometric analyses, to derive technical…
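    Database Tomography, as described here, rests on extracting multiword phrase frequencies from a large text collection. The following is a minimal sketch of that idea under simple assumptions (lower-case word tokenization, a fixed phrase length); the function name and the sample text are illustrative and are not taken from the DT system itself.

        # Minimal sketch of multiword phrase-frequency extraction (illustrative only).
        from collections import Counter
        import re

        def phrase_frequencies(text, n=2):
            """Count contiguous n-word phrases (here: bigrams) in a body of text."""
            words = re.findall(r"[a-z]+", text.lower())
            phrases = [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]
            return Counter(phrases)

        abstracts = [
            "hypersonic flow over blunt bodies",
            "supersonic flow and hypersonic flow experiments",
        ]
        counts = phrase_frequencies(" ".join(abstracts), n=2)
        print(counts.most_common(3))  # most frequent two-word phrases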

  2. KSC-99pp0313

    NASA Image and Video Library

    1999-03-23

    In the Multi-Payload Processing Facility, Mary Reaves (left) and Richard Rainen, with the Jet Propulsion Laboratory, check out the carrier and horizontal antenna mast for the STS-99 Shuttle Radar Topography Mission (SRTM). The SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during an 11-day mission in September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  3. KSC-99pp0503

    NASA Image and Video Library

    1999-05-07

    Inside the Space Station Processing Facility, the Shuttle Radar Topography Mission (SRTM) is maneuvered into place to prepare it for launch targeted for September 1999. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  4. KSC-99pp0312

    NASA Image and Video Library

    1999-03-23

    In the Multi-Payload Processing Facility, Beverly St. Ange, with the Jet Propulsion Laboratory, wires a biopod, a component of the STS-99 Shuttle Radar Topography Mission (SRTM). The SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during an 11-day mission in September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  5. KSC-99pp0330

    NASA Image and Video Library

    1999-03-24

    The Shuttle Radar Topography Mission (SRTM) sits inside the Multi-Payload Processing Facility after the SRTM's cover was removed. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  6. KSC-99pp0329

    NASA Image and Video Library

    1999-03-24

    Inside the Multi-Payload Processing Facility, the Shuttle Radar Topography Mission (SRTM) is revealed after the lid of its container was removed. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  7. KSC-99pp0328

    NASA Image and Video Library

    1999-03-24

    Inside the Multi-Payload Processing Facility, the lid covering the Shuttle Radar Topography Mission (SRTM) is lifted from the crate. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission scheduled for September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  8. KSC-99pp0502

    NASA Image and Video Library

    1999-05-07

    The Shuttle Radar Topography Mission (SRTM) is moved into the Space Station Processing Facility to prepare it for launch targeted for September 1999. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  9. New geothermal database for Utah

    USGS Publications Warehouse

    Blackett, Robert E.; ,

    1993-01-01

    The Utah Geological Survey compiled a preliminary database consisting of over 800 records on thermal wells and springs in Utah with temperatures of 20°C or greater. Each record consists of 35 fields, including location of the well or spring, temperature, depth, flow rate, and chemical analyses of water samples. Developed for applications on personal computers, the database will be useful for geochemical, statistical, and other geothermal-related studies. A preliminary map of thermal wells and springs in Utah, which accompanies the database, could eventually incorporate heat-flow information, bottom-hole temperatures from oil and gas wells, traces of Quaternary faults, and locations of young volcanic centers.
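    As a rough illustration of how such a record-oriented database might be queried on a personal computer, the sketch below assumes a small SQLite table carrying a handful of the fields named above; the schema, column names, and values are hypothetical, not the UGS record layout.

        # Hypothetical thermal-well table and a query for the 20°C inclusion threshold.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE thermal_sites (
                           site_id INTEGER PRIMARY KEY,
                           latitude REAL, longitude REAL,
                           temperature_c REAL, depth_m REAL, flow_lpm REAL)""")
        con.executemany("INSERT INTO thermal_sites VALUES (?, ?, ?, ?, ?, ?)",
                        [(1, 38.57, -112.85, 85.0, 300.0, 40.0),
                         (2, 41.50, -112.00, 22.5, 60.0, 5.0)])

        # Records meeting the 20°C-or-greater criterion used for the database.
        for row in con.execute("SELECT site_id, temperature_c FROM thermal_sites "
                               "WHERE temperature_c >= 20 ORDER BY temperature_c DESC"):
            print(row)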

  10. Microcomputer-Based Access to Machine-Readable Numeric Databases.

    ERIC Educational Resources Information Center

    Wenzel, Patrick

    1988-01-01

    Describes the use of microcomputers and relational database management systems to improve access to numeric databases by the Data and Program Library Service at the University of Wisconsin. The internal records management system, in-house reference tools, and plans to extend these tools to the entire campus are discussed. (3 references) (CLB)

  11. The Database Business: Managing Today--Planning for Tomorrow. Issues and Futures.

    ERIC Educational Resources Information Center

    Aitchison, T. M.; And Others

    1988-01-01

    Current issues and the future of the database business are discussed in five papers. Topics covered include aspects relating to the quality of database production; international ownership in the U.S. information marketplace; an overview of pricing strategies in the electronic information industry; and pricing issues from the viewpoints of online…

  12. Interactive Scene Analysis Module - A sensor-database fusion system for telerobotic environments

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; Vazquez, Sixto L.; Goode, Plesent W.

    1992-01-01

    Accomplishing a task with telerobotics typically involves a combination of operator control/supervision and a 'script' of preprogrammed commands. These commands usually assume that the location of various objects in the task space conforms to some internal representation (database) of that task space. The ability to quickly and accurately verify the task environment against the internal database would improve the robustness of these preprogrammed commands. In addition, the on-line initialization and maintenance of a task space database is difficult for operators using Cartesian coordinates alone. This paper describes the Interactive Scene Analysis Module (ISAM), developed to provide taskspace database initialization and verification utilizing 3-D graphic overlay modelling, video imaging, and laser radar based range imaging. Through the fusion of taskspace database information and image sensor data, a verifiable taskspace model is generated providing location and orientation data for objects in a task space. This paper also describes applications of the ISAM in the Intelligent Systems Research Laboratory (ISRL) at NASA Langley Research Center, and discusses its performance relative to representation accuracy and operator interface efficiency.

  13. Model-Based Systems

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    2007-01-01

    Engineers who design systems using text specification documents focus their work upon the completed system to meet performance, time, and budget goals. Consistency and integrity are difficult to maintain within text documents for a single complex system, and more difficult to maintain as several systems are combined into higher-level systems, are maintained over decades, and evolve technically and in performance through updates. This system design approach frequently results in major changes during the system integration and test phase, and in time and budget overruns. Engineers who build system specification documents within a model-based systems environment go a step further and aggregate all of the data. They interrelate all of the data to ensure consistency and integrity. After the model is constructed, the various system specification documents are prepared, all from the same database. Because the consistency and integrity of the model are assured, the consistency and integrity of the various specification documents are ensured as well. This article attempts to define model-based systems relative to such an environment. The intent is to expose the complexity of the enabling problem by outlining what is needed, why it is needed, and how these needs are being addressed by international standards writing teams.

  14. ITS-90 Thermocouple Database

    National Institute of Standards and Technology Data Gateway

    SRD 60 NIST ITS-90 Thermocouple Database (Web, free access)   Web version of Standard Reference Database 60 and NIST Monograph 175. The database gives temperature -- electromotive force (emf) reference functions and tables for the letter-designated thermocouple types B, E, J, K, N, R, S and T. These reference functions have been adopted as standards by the American Society for Testing and Materials (ASTM) and the International Electrotechnical Commission (IEC).
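    The reference functions in this database are polynomial relations between temperature and emf. The sketch below evaluates such a polynomial by Horner's scheme; the coefficients shown are placeholders for illustration only and are not the published ITS-90 values, which are tabulated in SRD 60 and NIST Monograph 175.

        # Sketch: evaluate a reference polynomial E(t90) = sum c_i * t^i (emf in microvolts).
        def emf_microvolts(t_celsius, coeffs):
            """Evaluate the polynomial by Horner's scheme; coeffs[i] multiplies t^i."""
            e = 0.0
            for c in reversed(coeffs):
                e = e * t_celsius + c
            return e

        # Placeholder coefficients, NOT the ITS-90 values.
        placeholder_coeffs = [0.0, 38.7, 3.3e-2]
        print(emf_microvolts(100.0, placeholder_coeffs))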

  15. Variability in Standard Outcomes of Posterior Lumbar Fusion Determined by National Databases.

    PubMed

    Joseph, Jacob R; Smith, Brandon W; Park, Paul

    2017-01-01

    National databases are used with increasing frequency in spine surgery literature to evaluate patient outcomes. The differences between individual databases in relationship to outcomes of lumbar fusion are not known. We evaluated the variability in standard outcomes of posterior lumbar fusion between the University HealthSystem Consortium (UHC) database and the Healthcare Cost and Utilization Project National Inpatient Sample (NIS). NIS and UHC databases were queried for all posterior lumbar fusions (International Classification of Diseases, Ninth Revision code 81.07) performed in 2012. Patient demographics, comorbidities (including obesity), length of stay (LOS), in-hospital mortality, and complications such as urinary tract infection, deep venous thrombosis, pulmonary embolism, myocardial infarction, durotomy, and surgical site infection were collected using specific International Classification of Diseases, Ninth Revision codes. Analysis included 21,470 patients from the NIS database and 14,898 patients from the UHC database. Demographic data were not significantly different between databases. Obesity was more prevalent in UHC (P = 0.001). Mean LOS was 3.8 days in NIS and 4.55 in UHC (P < 0.0001). Complications were significantly higher in UHC, including urinary tract infection, deep venous thrombosis, pulmonary embolism, myocardial infarction, surgical site infection, and durotomy. In-hospital mortality was similar between databases. NIS and UHC databases had similar demographic patient populations undergoing posterior lumbar fusion. However, the UHC database reported significantly higher complication rate and longer LOS. This difference may reflect academic institutions treating higher-risk patients; however, a definitive reason for the variability between databases is unknown. The inability to precisely determine the basis of the variability between databases highlights the limitations of using administrative databases for spinal outcome analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Post flood damage data collection and assessment in Albania based on DesInventar methodology

    NASA Astrophysics Data System (ADS)

    Toto, Emanuela; Massabo, Marco; Deda, Miranda; Rossello, Laura

    2015-04-01

    In 2013, a collection of disaster loss data based on DesInventar was implemented in Albania. The DesInventar system consists of a methodology and software tool for the systematic collection, documentation, and analysis of disaster loss data. The main sources of information for the Albanian database were the Albanian Ministry of Internal Affairs, the National Library, and the State Archive. For floods specifically, the database contains nearly 900 datasets covering a period of 148 years (from 1865 to 2013). The data are georeferenced to the administrative units of Albania: regions, provinces, and municipalities. The datasets describe each event by reporting the date of occurrence, the duration, the localization in administrative units, and the cause. Additional information covers the effects and damage that the event caused to people (deaths, injured, missing, affected, relocated, evacuated, victims) and to houses (damaged or destroyed). Other quantitative indicators are the losses in local currency or US dollars, the damage to roads, the crops affected, the cattle lost, and the involvement of social elements over the territory such as education and health centers. Qualitative indicators simply register the sectors (e.g., transportation, communications, relief, agriculture, water supply, sewerage, power and energy, industries, education, health, other sectors) that were affected. Through queries and analysis of the collected data it was possible to identify the most affected areas, the economic loss, the damage to agriculture, the houses and people affected, and many other variables. The regions most vulnerable to past floods in Albania were identified, as were the rivers that cause the most damage in the country. Other analyses help to estimate the damage and losses during the main flood events of recent years, which occurred in 2010 and 2011, and to recognize the most affected sectors. The database was used to find the most frequent drivers of floods and to identify the areas with the highest priority for intervention and the areas with the highest economic loss. In the future, the loss and damage database could inform interventions for risk mitigation and decision-making processes. The database can also be used to build empirical loss exceedance curves, which give the average number of times per year that a certain level of loss is exceeded. Users of the database information can be researchers, students, citizens, and policy makers. The operators of the National Operative Center for Civil Emergencies (Albanian Ministry of Internal Affairs) use the database daily to insert new data. At present there is no entity in Albania in charge of registering the damage and consequences of floods in a systematic and organized way. In this sense, DesInventar provides a basis for the future and helps to identify priorities for creating a national database.
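    As a rough illustration of the empirical loss exceedance curve mentioned above, the sketch below assumes a flat list of per-event losses and a known record length in years; all figures are invented for illustration and are not drawn from the Albanian database.

        # Sketch: empirical loss exceedance curve from per-event losses (invented data).
        import numpy as np

        def exceedance_curve(event_losses, record_years):
            """Return losses (descending) and the rate at which each level is exceeded."""
            losses = np.sort(np.asarray(event_losses, dtype=float))[::-1]
            rate = np.arange(1, losses.size + 1) / record_years  # events per year >= loss
            return losses, rate

        losses, rate = exceedance_curve([5e6, 1.2e6, 8e5, 3e5, 9e4], record_years=148)
        for loss, r in zip(losses, rate):
            print(f"loss >= {loss:,.0f} occurs about {r:.3f} times per year")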

  17. Cryptanalysis of Password Protection of Oracle Database Management System (DBMS)

    NASA Astrophysics Data System (ADS)

    Koishibayev, Timur; Umarova, Zhanat

    2016-04-01

    This article discusses the encryption algorithms currently available in the Oracle database, as well as a proposed upgraded encryption algorithm consisting of four steps. In conclusion, an analysis of password encryption in Oracle Database is presented.

  18. Development of an updated PBPK model for trichloroethylene and metabolites in mice, and its application to discern the role of oxidative metabolism in TCE-induced hepatomegaly.

    PubMed

    Evans, M V; Chiu, W A; Okino, M S; Caldwell, J C

    2009-05-01

    Trichloroethylene (TCE) is a lipophilic solvent rapidly absorbed and metabolized via oxidation and conjugation to a variety of metabolites that cause toxicity to several internal targets. Increases in liver weight (hepatomegaly) have been reported to occur quickly in rodents after TCE exposure, with liver tumor induction reported in mice after long-term exposure. An integrated dataset for gavage and inhalation TCE exposure and oral data for exposure to two of its oxidative metabolites (TCA and DCA) was used, in combination with an updated and more accurate physiologically-based pharmacokinetic (PBPK) model, to examine the question as to whether the presence of TCA in the liver is responsible for TCE-induced hepatomegaly in mice. The updated PBPK model was used to help discern the quantitative contribution of metabolites to this effect. The update of the model was based on a detailed evaluation of predictions from previously published models and additional preliminary analyses based on gas uptake inhalation data in mice. The parameters of the updated model were calibrated using Bayesian methods with an expanded pharmacokinetic database consisting of oral, inhalation, and iv studies of TCE administration as well as studies of TCE metabolites in mice. The dose-response relationships for hepatomegaly derived from the multi-study database showed that the proportionality of dose to response for TCE- and DCA-induced hepatomegaly is not observed for administered doses of TCA in the studied range. The updated PBPK model was used to make a quantitative comparison of internal dose of metabolized and administered TCA. While the internal dose of TCA predicted by modeling of TCE exposure (i.e., mg TCA/kg-d) showed a linear relationship with hepatomegaly, the slope of the relationship was much greater than that for directly administered TCA. Thus, the degree of hepatomegaly induced per unit of TCA produced through TCE oxidation is greater than that expected per unit of TCA administered directly, which is inconsistent with the hypothesis that TCA alone accounts for TCE-induced hepatomegaly. In addition, TCE-induced hepatomegaly showed a much more consistent relationship with PBPK model predictions of total oxidative metabolism than with predictions of TCE area-under-the-curve in blood, consistent with toxicity being induced by oxidative metabolites rather than the parent compound. Therefore, these results strongly suggest that oxidative metabolites in addition to TCA are necessary contributors to TCE-induced liver weight changes in mice.

  19. Multimethod assessment of psychopathy in relation to factors of internalizing and externalizing from the Personality Assessment Inventory: the impact of method variance and suppressor effects.

    PubMed

    Blonigen, Daniel M; Patrick, Christopher J; Douglas, Kevin S; Poythress, Norman G; Skeem, Jennifer L; Lilienfeld, Scott O; Edens, John F; Krueger, Robert F

    2010-03-01

    Research to date has revealed divergent relations across factors of psychopathy measures with criteria of internalizing (INT; anxiety, depression) and externalizing (EXT; antisocial behavior, substance use). However, failure to account for method variance and suppressor effects has obscured the consistency of these findings across distinct measures of psychopathy. Using a large correctional sample, the current study employed a multimethod approach to psychopathy assessment (self-report, interview and file review) to explore convergent and discriminant relations between factors of psychopathy measures and latent criteria of INT and EXT derived from the Personality Assessment Inventory (Morey, 2007). Consistent with prediction, scores on the affective-interpersonal factor of psychopathy were negatively associated with INT and negligibly related to EXT, whereas scores on the social deviance factor exhibited positive associations (moderate and large, respectively) with both INT and EXT. Notably, associations were highly comparable across the psychopathy measures when accounting for method variance (in the case of EXT) and when assessing for suppressor effects (in the case of INT). Findings are discussed in terms of implications for clinical assessment and evaluation of the validity of interpretations drawn from scores on psychopathy measures. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  20. The Pittsburgh sleep quality index as a screening tool for sleep dysfunction in clinical and non-clinical samples: A systematic review and meta-analysis.

    PubMed

    Mollayeva, Tatyana; Thurairajah, Pravheen; Burton, Kirsteen; Mollayeva, Shirin; Shapiro, Colin M; Colantonio, Angela

    2016-02-01

    This review appraises the process of development and the measurement properties of the Pittsburgh sleep quality index (PSQI), gauging its potential as a screening tool for sleep dysfunction in non-clinical and clinical samples; it also compares non-clinical and clinical populations in terms of PSQI scores. MEDLINE, Embase, PsycINFO, and HAPI databases were searched. Critical appraisal of studies of measurement properties was performed using COSMIN. Of 37 reviewed studies, 22 examined construct validity, 19 - known-group validity, 15 - internal consistency, and three - test-retest reliability. Study quality ranged from poor to excellent, with the majority designated fair. Internal consistency, based on Cronbach's alpha, was good. Discrepancies were observed in factor analytic studies. In non-clinical and clinical samples with known differences in sleep quality, the PSQI global scores and all subscale scores, with the exception of sleep disturbance, differed significantly. The best evidence synthesis for the PSQI showed strong reliability and validity, and moderate structural validity in a variety of samples, suggesting the tool fulfills its intended utility. A taxonometric analysis can contribute to better understanding of sleep dysfunction as either a dichotomous or continuous construct. Copyright © 2015 Elsevier Ltd. All rights reserved.
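    Internal consistency in the reviewed studies was summarized with Cronbach's alpha. The sketch below shows the standard computation, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score), applied to invented item-level responses; it is an illustration of the statistic, not of the PSQI data themselves.

        # Sketch: Cronbach's alpha on item-level scores (rows = respondents, cols = items).
        import numpy as np

        def cronbach_alpha(items):
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_variances = items.var(axis=0, ddof=1).sum()
            total_variance = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_variances / total_variance)

        scores = [[2, 3, 2, 3],   # invented responses for four items
                  [1, 1, 2, 1],
                  [3, 3, 3, 2],
                  [2, 2, 1, 2]]
        print(round(cronbach_alpha(scores), 3))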

  1. Statistical modeling of occupational chlorinated solvent exposures for case–control studies using a literature-based database

    PubMed Central

    Hein, Misty J.; Waters, Martha A.; Ruder, Avima M.; Stenzel, Mark R.; Blair, Aaron; Stewart, Patricia A.

    2010-01-01

    Objectives: Occupational exposure assessment for population-based case–control studies is challenging due to the wide variety of industries and occupations encountered by study participants. We developed and evaluated statistical models to estimate the intensity of exposure to three chlorinated solvents—methylene chloride, 1,1,1-trichloroethane, and trichloroethylene—using a database of air measurement data and associated exposure determinants. Methods: A measurement database was developed after an extensive review of the published industrial hygiene literature. The database of nearly 3000 measurements or summary measurements included sample size, measurement characteristics (year, duration, and type), and several potential exposure determinants associated with the measurements: mechanism of release (e.g. evaporation), process condition, temperature, usage rate, type of ventilation, location, presence of a confined space, and proximity to the source. The natural log-transformed measurement levels in the exposure database were modeled as a function of the measurement characteristics and exposure determinants using maximum likelihood methods. Assuming a single lognormal distribution of the measurements, an arithmetic mean exposure intensity level was estimated for each unique combination of exposure determinants and decade. Results: The proportions of variability in the measurement data explained by the modeled measurement characteristics and exposure determinants were 36, 38, and 54% for methylene chloride, 1,1,1-trichloroethane, and trichloroethylene, respectively. Model parameter estimates for the exposure determinants were in the anticipated direction. Exposure intensity estimates were plausible and exhibited internal consistency, but the ability to evaluate validity was limited. Conclusions: These prediction models can be used to estimate chlorinated solvent exposure intensity for jobs reported by population-based case–control study participants that have sufficiently detailed information regarding the exposure determinants. PMID:20418277
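    As a rough sketch of the modelling approach described above, the example below regresses log-transformed measurement levels on two hypothetical exposure determinants and converts a predicted log-scale mean to an arithmetic mean using exp(mu + sigma^2/2); ordinary least squares on the log scale stands in for the paper's maximum-likelihood fit, and the data and determinant names are invented, not the published database.

        # Sketch: log-linear exposure model with an arithmetic-mean back-transform.
        import numpy as np

        # ln(concentration) for six hypothetical measurements
        log_conc = np.log([12.0, 3.5, 45.0, 7.2, 19.0, 2.1])
        # Design matrix: intercept, local exhaust ventilation (0/1), near source (0/1)
        X = np.array([[1, 0, 1], [1, 1, 0], [1, 0, 1],
                      [1, 1, 1], [1, 0, 0], [1, 1, 0]], dtype=float)

        beta, *_ = np.linalg.lstsq(X, log_conc, rcond=None)
        resid = log_conc - X @ beta
        sigma2 = resid @ resid / (len(log_conc) - X.shape[1])  # residual variance (log scale)

        combo = np.array([1, 0, 1])                 # one determinant combination
        print(np.exp(combo @ beta + sigma2 / 2))    # estimated arithmetic mean intensity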

  2. Electronic medical record: research tool for pancreatic cancer?

    PubMed

    Arous, Edward J; McDade, Theodore P; Smith, Jillian K; Ng, Sing Chau; Sullivan, Mary E; Zottola, Ralph J; Ranauro, Paul J; Shah, Shimul A; Whalen, Giles F; Tseng, Jennifer F

    2014-04-01

    A novel data warehouse based on automated retrieval from an institutional health care information system (HIS) was made available to be compared with a traditional prospectively maintained surgical database. A newly established institutional data warehouse at a single-institution academic medical center, autopopulated by the HIS, was queried for International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) diagnosis codes for pancreatic neoplasm. Patients with ICD-9-CM diagnosis codes for pancreatic neoplasm were captured. A parallel query was performed using a prospective database populated by manual entry. Duplicated patients and those unique to either data set were identified. All patients were manually reviewed to determine the accuracy of diagnosis. A total of 1107 patients with pancreatic neoplasm were identified from the HIS-linked data set from 1999-2009. Of these, 254 (22.9%) patients were also captured by the surgical database, whereas 853 (77.1%) patients were only in the HIS-linked data set. Manual review of the HIS-only group demonstrated that 45.0% of patients were without identifiable pancreatic pathology, suggesting erroneous capture, whereas 36.3% of patients were consistent with pancreatic neoplasm and 18.7% with other pancreatic pathology. Of the 394 patients identified by the surgical database, 254 (64.5%) patients were captured by HIS, whereas 140 (35.5%) patients were not. Manual review of patients only captured by the surgical database demonstrated 85.9% with pancreatic neoplasm and 14.1% with other pancreatic pathology. Finally, review of the 254-patient overlap demonstrated that 80.3% of patients had pancreatic neoplasm and 19.7% had other pancreatic pathology. These results suggest the need for cautious interpretation of administrative data that rely only on ICD-9-CM diagnosis codes, and for clinical correlation through previously validated mechanisms. Published by Elsevier Inc.

  3. Nursing Child Assessment Satellite Training Parent-Child Interaction Scales: Comparing American and Canadian Normative and High-Risk Samples.

    PubMed

    Letourneau, Nicole L; Tryphonopoulos, Panagiota D; Novick, Jason; Hart, J Martha; Giesbrecht, Gerald; Oxford, Monica L

    Many nurses rely on the American Nursing Child Assessment Satellite Training (NCAST) Parent-Child Interaction (PCI) Teaching and Feeding Scales to identify and target interventions for families affected by severe/chronic stressors (e.g. postpartum depression (PPD), intimate partner violence (IPV), low-income). However, the NCAST Database that provides normative data for comparisons may not apply to Canadian families. The purpose of this study was to compare NCAST PCI scores in Canadian and American samples and to assess the reliability of the NCAST PCI Scales in Canadian samples. This secondary analysis employed independent samples t-tests (p < 0.005) to compare PCI between the American NCAST Database and Canadian high-risk (families with PPD, exposure to IPV or low-income) and community samples. Cronbach's alphas were calculated for the Canadian and American samples. In both American and Canadian samples, belonging to a high-risk population reduced parents' abilities to engage in sensitive and responsive caregiving (i.e. healthy serve and return relationships) as measured by the PCI Scales. NCAST Database mothers were more effective at executing caregiving responsibilities during PCI compared to the Canadian community sample, while infants belonging to the Canadian community sample provided clearer cues to caregivers during PCI compared to those of the NCAST Database. Internal consistency coefficients for the Canadian samples were generally acceptable. The NCAST Database can be reliably used for assessing PCI in normative and high-risk Canadian families. Canadian nurses can be assured that the PCI Scales adequately identify risks and can help target interventions to promote optimal parent-child relationships and ultimately child development. Crown Copyright © 2018. Published by Elsevier Inc. All rights reserved.

  4. POLARIS: A 30-meter probabilistic soil series map of the contiguous United States

    USGS Publications Warehouse

    Chaney, Nathaniel W; Wood, Eric F; McBratney, Alexander B; Hempel, Jonathan W; Nauman, Travis; Brungard, Colby W.; Odgers, Nathan P

    2016-01-01

    A new complete map of soil series probabilities has been produced for the contiguous United States at a 30 m spatial resolution. This innovative database, named POLARIS, is constructed using available high-resolution geospatial environmental data and a state-of-the-art machine learning algorithm (DSMART-HPC) to remap the Soil Survey Geographic (SSURGO) database. This 9 billion grid cell database is possible using available high performance computing resources. POLARIS provides a spatially continuous, internally consistent, quantitative prediction of soil series. It offers potential solutions to the primary weaknesses in SSURGO: 1) unmapped areas are gap-filled using survey data from the surrounding regions, 2) the artificial discontinuities at political boundaries are removed, and 3) the use of high resolution environmental covariate data leads to a spatial disaggregation of the coarse polygons. The geospatial environmental covariates that have the largest role in assembling POLARIS over the contiguous United States (CONUS) are fine-scale (30 m) elevation data and coarse-scale (~ 2 km) estimates of the geographic distribution of uranium, thorium, and potassium. A preliminary validation of POLARIS using the NRCS National Soil Information System (NASIS) database shows variable performance over CONUS. In general, the best performance is obtained at grid cells where DSMART-HPC is most able to reduce the chance of misclassification. The important role of environmental covariates in limiting prediction uncertainty suggests including additional covariates is pivotal to improving POLARIS' accuracy. This database has the potential to improve the modeling of biogeochemical, water, and energy cycles in environmental models; enhance availability of data for precision agriculture; and assist hydrologic monitoring and forecasting to ensure food and water security.

  5. Comparison of National Operative Mortality in Gastroenterological Surgery Using Web-based Prospective Data Entry Systems.

    PubMed

    Anazawa, Takayuki; Paruch, Jennifer L; Miyata, Hiroaki; Gotoh, Mitsukazu; Ko, Clifford Y; Cohen, Mark E; Hirahara, Norimichi; Zhou, Lynn; Konno, Hiroyuki; Wakabayashi, Go; Sugihara, Kenichi; Mori, Masaki

    2015-12-01

    International collaboration is important in healthcare quality evaluation; however, few international comparisons of general surgery outcomes have been accomplished. Furthermore, predictive model application for risk stratification has not been internationally evaluated. The National Clinical Database (NCD) in Japan was developed in collaboration with the American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP), with a goal of creating a standardized surgery database for quality improvement. The study aimed to compare the consistency and impact of risk factors of 3 major gastroenterological surgical procedures in Japan and the United States (US) using web-based prospective data entry systems: right hemicolectomy (RH), low anterior resection (LAR), and pancreaticoduodenectomy (PD). Data from NCD and ACS-NSQIP, collected over 2 years, were examined. Logistic regression models were used for predicting 30-day mortality for both countries. Models were exchanged and evaluated to determine whether the models built for one population were accurate for the other population. We obtained data for 113,980 patients; 50,501 (Japan: 34,638; US: 15,863), 42,770 (Japan: 35,445; US: 7325), and 20,709 (Japan: 15,527; US: 5182) underwent RH, LAR, and PD, respectively. Thirty-day mortality rates for RH were 0.76% (Japan) and 1.88% (US); rates for LAR were 0.43% versus 1.08%; and rates for PD were 1.35% versus 2.57%. Patient background, comorbidities, and practice style were different between Japan and the US. In the models, the odds ratio for each variable was similar between NCD and ACS-NSQIP. Local risk models could predict mortality using local data, but could not accurately predict mortality using data from other countries. We demonstrated the feasibility and efficacy of international collaborative research between Japan and the US, but found that local risk models remain essential for quality improvement.
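    A minimal sketch of the model-exchange idea, fitting a 30-day mortality logistic regression on one population and scoring another population with it, is shown below. The predictors, the synthetic data, and the scikit-learn implementation are assumptions for illustration and are not the NCD or ACS-NSQIP models.

        # Sketch: fit a mortality model on one registry's records, score another's.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        # Hypothetical predictors: scaled age, urgency indicator, comorbidity indicator
        X_local = rng.random((500, 3))
        y_local = rng.binomial(1, 0.05 + 0.10 * X_local[:, 0])  # synthetic 30-day deaths

        model = LogisticRegression().fit(X_local, y_local)       # "local" registry model

        X_other = rng.random((200, 3))                            # the other registry's cohort
        predicted_risk = model.predict_proba(X_other)[:, 1]
        print(predicted_risk.mean())   # compare with the observed mortality in that cohort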

  6. The Model Parameter Estimation Experiment (MOPEX): Its structure, connection to other international initiatives and future directions

    USGS Publications Warehouse

    Wagener, T.; Hogue, T.; Schaake, J.; Duan, Q.; Gupta, H.; Andreassian, V.; Hall, A.; Leavesley, G.

    2006-01-01

    The Model Parameter Estimation Experiment (MOPEX) is an international project aimed at developing enhanced techniques for the a priori estimation of parameters in hydrological models and in land surface parameterization schemes connected to atmospheric models. The MOPEX science strategy involves: database creation, a priori parameter estimation methodology development, parameter refinement or calibration, and the demonstration of parameter transferability. A comprehensive MOPEX database has been developed that contains historical hydrometeorological data and land surface characteristics data for many hydrological basins in the United States (US) and in other countries. This database is being continuously expanded to include basins from various hydroclimatic regimes throughout the world. MOPEX research has largely been driven by a series of international workshops that have brought interested hydrologists and land surface modellers together to exchange knowledge and experience in developing and applying parameter estimation techniques. With its focus on parameter estimation, MOPEX plays an important role in the international context of other initiatives such as GEWEX, HEPEX, PUB and PILPS. This paper outlines the MOPEX initiative, discusses its role in the scientific community, and briefly states future directions.

  7. Data, knowledge and method bases in chemical sciences. Part IV. Current status in databases.

    PubMed

    Braibanti, Antonio; Rao, Rupenaguntla Sambasiva; Rao, Gollapalli Nagesvara; Ramam, Veluri Anantha; Rao, Sattiraju Veera Venkata Satyanarayana

    2002-01-01

    Computer-readable databases have become an integral part of chemical research, from planning data acquisition through to interpreting the information generated. The databases available today are numerical, spectral, and bibliographic. Data representation by different schemes--relational, hierarchical, and object-oriented--is demonstrated. A quality index (QI) throws light on the quality of the data. The objectives, prospects, and impact of database activity on expert systems are discussed. The number and size of corporate databases available on international networks have grown beyond a manageable level, leading to databases about their contents. Subsets of corporate or small databases have been developed by groups of chemists. The features and role of knowledge-based or intelligent databases are described.

  8. International LCA

    EPA Science Inventory

    To provide global guidance on the establishment and maintenance of LCA databases, as the basis for improved dataset exchangeability and interlinkages of databases worldwide. Increase the credibility of existing LCA data, the generation of more data and their overall accessibilit...

  9. An Update on Electronic Information Sources.

    ERIC Educational Resources Information Center

    Ackerman, Katherine

    1987-01-01

    This review of new developments and products in online services discusses trends in travel related services; full text databases; statistical source databases; an emphasis on regional and international business news; and user friendly systems. (Author/CLB)

  10. Crystallography Open Database – an open-access collection of crystal structures

    PubMed Central

    Gražulis, Saulius; Chateigner, Daniel; Downs, Robert T.; Yokochi, A. F. T.; Quirós, Miguel; Lutterotti, Luca; Manakova, Elena; Butkus, Justas; Moeck, Peter; Le Bail, Armel

    2009-01-01

    The Crystallography Open Database (COD), which is a project that aims to gather all available inorganic, metal–organic and small organic molecule structural data in one database, is described. The database adopts an open-access model. The COD currently contains ∼80 000 entries in crystallographic information file format, with nearly full coverage of the International Union of Crystallography publications, and is growing in size and quality. PMID:22477773

  11. What can we learn from a decade of database audits? The Duke Clinical Research Institute experience, 1997--2006.

    PubMed

    Rostami, Reza; Nahm, Meredith; Pieper, Carl F

    2009-04-01

    Despite a pressing and well-documented need for better sharing of information on clinical trials data quality assurance methods, many research organizations remain reluctant to publish descriptions of and results from their internal auditing and quality assessment methods. We present findings from a review of a decade of internal data quality audits performed at the Duke Clinical Research Institute, a large academic research organization that conducts data management for a diverse array of clinical studies, both academic and industry-sponsored. In so doing, we hope to stimulate discussions that could benefit the wider clinical research enterprise by providing insight into methods of optimizing data collection and cleaning, ultimately helping patients and furthering essential research. We present our audit methodologies, including sampling methods, audit logistics, sample sizes, counting rules used for error rate calculations, and characteristics of audited trials. We also present database error rates as computed according to two analytical methods, which we address in detail, and discuss the advantages and drawbacks of two auditing methods used during this 10-year period. Our review of the DCRI audit program indicates that higher data quality may be achieved from a series of small audits throughout the trial rather than through a single large database audit at database lock. We found that error rates trended upward from year to year in the period characterized by traditional audits performed at database lock (1997-2000), but consistently trended downward after periodic statistical process control type audits were instituted (2001-2006). These increases in data quality were also associated with cost savings in auditing, estimated at 1000 h per year, or the efforts of one-half of a full time equivalent (FTE). Our findings are drawn from retrospective analyses and are not the result of controlled experiments, and may therefore be subject to unanticipated confounding. In addition, the scope and type of audits we examine here are specific to our institution, and our results may not be broadly generalizable. Use of statistical process control methodologies may afford advantages over more traditional auditing methods, and further research will be necessary to confirm the reliability and usability of such techniques. We believe that open and candid discussion of data quality assurance issues among academic and clinical research organizations will ultimately benefit the entire research community in the coming era of increased data sharing and re-use.
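    The statistical-process-control-style auditing described above can be illustrated with a simple p-chart: each interim audit's error proportion is compared against three-sigma binomial limits around the pooled rate. The counting rules and audit figures below are invented for illustration and are not DCRI's procedure.

        # Sketch: p-chart control limits for a series of small interim database audits.
        import math

        # (fields checked, fields in error) for successive interim audits -- invented figures
        audits = [(2000, 12), (1800, 9), (2200, 15), (2100, 8)]
        p_bar = sum(e for _, e in audits) / sum(n for n, _ in audits)  # pooled error rate

        for n, errors in audits:
            p = errors / n
            sigma = math.sqrt(p_bar * (1 - p_bar) / n)
            ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)
            status = "in control" if lcl <= p <= ucl else "investigate"
            print(f"n={n} error rate={p:.4f} limits=({lcl:.4f}, {ucl:.4f}) -> {status}")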

  12. Distributed Episodic Exploratory Planning (DEEP)

    DTIC Science & Technology

    2008-12-01

    For DEEP, Hibernate offered the advantage of abstracting SQL through HQL, so that any database with a Java Database Connectivity (JDBC) driver can be used… A blackboard architecture (JDB, Java Distributed Blackboard) was selected because of its opportunistic reasoning capabilities and implemented in Java for platform independence; Java was chosen for ease of…

  13. The Teachers' Choices Cognate Database for K-3 Teachers of Latino English Learners

    ERIC Educational Resources Information Center

    Montelongo, José A.; Hernández, Anita C.

    2013-01-01

    The purpose of the present paper is to introduce the Teachers' Choices Cognate Database. English-Spanish cognates are words that are orthographically and semantically identical or nearly identical in both English and Spanish. To create this free online database, the cognates from every one of the 146 International Reading Association's…

  14. Potential use of routine databases in health technology assessment.

    PubMed

    Raftery, J; Roderick, P; Stevens, A

    2005-05-01

    To develop criteria for classifying databases in relation to their potential use in health technology (HT) assessment and to apply them to a list of databases of relevance in the UK. To explore the extent to which prioritized databases could pick up those HTs being assessed by the National Coordinating Centre for Health Technology Assessment (NCCHTA) and the extent to which these databases have been used in HT assessment. To explore the validation of the databases and their cost. Electronic databases. Key literature sources. Experienced users of routine databases. A 'first principles' examination of the data necessary for each type of HT assessment was carried out, supplemented by literature searches and a historical review. The principal investigators applied the criteria to the databases. Comments of the 'keepers' of the prioritized databases were incorporated. Details of 161 topics funded by the NHS R&D Health Technology Assessment (HTA) programme were reviewed iteratively by the principal investigators. Uses of databases in HTAs were identified by literature searches, which included the title of each prioritized database as a keyword. Annual reports of databases were examined and 'keepers' queried. The validity of each database was assessed using criteria based on a literature search and involvement by the authors in a national academic network. The costs of databases were established from annual reports, enquiries to 'keepers' of databases and 'guesstimates' based on cost per record. For assessing effectiveness, equity and diffusion, routine databases were classified into three broad groups: (1) group I databases, identifying both HTs and health states, (2) group II databases, identifying the HTs, but not a health state, and (3) group III databases, identifying health states, but not an HT. Group I datasets were disaggregated into clinical registries, clinical administrative databases and population-oriented databases. Group III were disaggregated into adverse event reporting, confidential enquiries, disease-only registers and health surveys. Databases in group I can be used not only to assess effectiveness but also to assess diffusion and equity. Databases in group II can only assess diffusion. Group III has restricted scope for assessing HTs, except for analysis of adverse events. For use in costing, databases need to include unit costs or prices. Some databases included unit cost as well as a specific HT. A list of around 270 databases was identified at the level of UK, England and Wales or England (over 1000 including Scotland, Wales and Northern Ireland). Allocation of these to the above groups identified around 60 databases with some potential for HT assessment, roughly half to group I. Eighteen clinical registers were identified as having the greatest potential although the clinical administrative datasets had potential mainly owing to their inclusion of a wide range of technologies. Only two databases were identified that could directly be used in costing. The review of the potential capture of HTs prioritized by the UK's NHS R&D HTA programme showed that only 10% would be captured in these databases, mainly drugs prescribed in primary care. The review of the use of routine databases in any form of HT assessment indicated that clinical registers were mainly used for national comparative audit. Some databases have only been used in annual reports, usually time trend analysis. A few peer-reviewed papers used a clinical register to assess the effectiveness of a technology. 
Accessibility is suggested as a barrier to using most databases. Clinical administrative databases (group Ib) have mainly been used to build population needs indices and performance indicators. A review of the validity of the databases used showed that although internal consistency checks were common, relatively few had any form of external audit. Some comparative audit databases have data scrutinised by participating units. Issues around coverage and coding have, in general, received little attention. NHS funding of databases has been mainly for 'Central Returns' for management purposes, which excludes those databases with the greatest potential for HT assessment. Funding for databases varied, but some are unfunded, relying on goodwill. The total cost of databases in group I plus selected databases from groups II and III has been estimated at £50 million, or around 0.1% of annual NHS spend. A few databases with limited potential for HT assessment account for the bulk of spending. Suggestions for policy include clarification of responsibility for the strategic development of databases, improved resourcing, and issues around coding, confidentiality, ownership and access, maintenance of clinical support, optimal use of information technology, filling gaps and remedying deficiencies. Recommendations for researchers include closer policy links between routine data and R&D, and selective investment in the more promising databases. Recommended research topics include optimal capture and coding of the range of HTs, international comparisons of the role, funding and use of routine data in healthcare systems, and use of routine databases in trials and in modelling. Independent evaluations are recommended for information strategies (such as those around the National Service Frameworks and various collaborations) and for electronic patient and health records.

  15. How to measure the internationality of scientific publications.

    PubMed

    Buela-Casal, Gualberto; Zych, Izabela

    2012-01-01

    Although the term "internationality" has never been defined by consensus, it is commonly used as a synonym of quality. Even though its meaning has never been established, internationality is frequently used to evaluate scientists, publications, or universities in many different countries. The present investigation is based on the opinions of members of the scientific community about the meaning of the concept "internationality", represented by a broad sample of 16,056 scientists from 109 countries working in all the fields of knowledge defined by UNESCO. The sample was randomly selected from the Web of Science database from among the scientists who have published at least one article in one of the journals indexed by the database. A questionnaire based on eleven criteria was designed for the purpose of the study. As a result, the first measure of internationality has been obtained. The most important criteria of internationality are: the publication language, online access, and international publication standards. There are significant differences among geographic zones and fields of knowledge.

  16. Atypical EEG power correlates with indiscriminately friendly behavior in internationally adopted children.

    PubMed

    Tarullo, Amanda R; Garvin, Melissa C; Gunnar, Megan R

    2011-03-01

    While effects of institutional care on behavioral development have been studied extensively, effects on neural systems underlying these socioemotional and attention deficits are only beginning to be examined. The current study assessed electroencephalogram (EEG) power in 18-month-old internationally adopted, postinstitutionalized children (n = 37) and comparison groups of nonadopted children (n = 47) and children internationally adopted from foster care (n = 39). For their age, postinstitutionalized children had an atypical EEG power distribution, with relative power concentrated in lower frequency bands compared with nonadopted children. Both internationally adopted groups had lower absolute alpha power than nonadopted children. EEG power was not related to growth at adoption or to global cognitive ability. Atypical EEG power distribution at 18 months predicted indiscriminate friendliness and poorer inhibitory control at 36 months. Both postinstitutionalized and foster care children were more likely than nonadopted children to exhibit indiscriminate friendliness. Results are consistent with a cortical hypoactivation model of the effects of early deprivation on neural development and provide initial evidence associating this atypical EEG pattern with indiscriminate friendliness. Outcomes observed in the foster care children raise questions about the specificity of institutional rearing as a risk factor and emphasize the need for broader consideration of the effects of early deprivation and disruptions in care. PsycINFO Database Record (c) 2011 APA, all rights reserved.

  17. Building a pantheoretical model of dehumanization with transgender men: Integrating objectification and minority stress theories.

    PubMed

    Velez, Brandon L; Breslow, Aaron S; Brewster, Melanie E; Cox, Robert; Foster, Aasha B

    2016-10-01

    With a national sample of 304 transgender men, the present study tested a pantheoretical model of dehumanization (Moradi, 2013) with hypotheses derived from objectification theory (Fredrickson & Roberts, 1997), minority stress theory (Meyer, 2003), and prior research regarding men's body image concerns. Specifically, we tested common objectification theory constructs (internalization of sociocultural standards of attractiveness [SSA], body surveillance, body satisfaction) as direct and indirect predictors of compulsive exercise. We also examined the roles of transgender-specific minority stress variables-antitransgender discrimination and transgender identity congruence-in the model. Results of a latent variable structural equation model yielded mixed support for the posited relations. The direct and indirect interrelations of internalization of SSA, body surveillance, and body satisfaction were consistent with prior objectification theory research, but only internalization of SSA yielded a significant direct relation with compulsive exercise. In addition, neither internalization of SSA nor body surveillance yielded significant indirect relations with compulsive exercise. However, antitransgender discrimination yielded predicted indirect relations with body surveillance, body satisfaction, and compulsive exercise, with transgender congruence playing a key mediating role in most of these relations. The implications of this pantheoretical model for research and practice with transgender men are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  18. Automatic public access to documents and maps stored on an internal secure system.

    NASA Astrophysics Data System (ADS)

    Trench, James; Carter, Mary

    2013-04-01

    The Geological Survey of Ireland operates a document management system that stores documents and maps internally in high resolution, in a highly secure environment, and provides them to an external service where they are automatically presented in a lower resolution to members of the public. Security is managed through roles and individual users, for which role level and folder level can be set. The application is an electronic document/data management (EDM) system with an integrated Geographical Information System (GIS) component that allows users to query an interactive map of Ireland for data relating to a particular area of interest. The data stored in the database consist of Bedrock Field Sheets, Bedrock Notebooks, Bedrock Maps, Geophysical Surveys, Geotechnical Maps & Reports, Groundwater, GSI Publications, Marine, Mine Records, Mineral Localities, Open File, Quaternary and Unpublished Reports. The Konfig application tool is both an internal and a public-facing application. Internally, it acts as a tool for entering high-resolution data, which are stored in a high-resolution vault. The public-facing application mirrors the internal application and differs only in that it converts the high-resolution data into a low-resolution format stored in a low-resolution vault, making the data web-friendly for the end user to download.

  19. How far has The Korean Journal of Internal Medicine advanced in terms of journal metrics?

    PubMed

    Huh, Sun

    2013-11-01

    The Korean Journal of Internal Medicine has already been valued as an international journal, according to a citation analysis in 2011. Now, 2 years later, I would like to confirm how much the Journal has advanced from the point of view of journal metrics by looking at the impact factor, cites per document (2 years), SCImago Journal Rank (SJR), and the Hirsch index. These were obtained from a variety of databases, such as the Korean Medical Citation Index, KoreaMed Synapse, Web of Science, JCR Web, and SCImago Journal & Country Rank. The manually calculated 2012 impact factor was 1.252 in the Web of Science, with a ranking of 70/151 (46.4%) in the category of general and internal medicine. Cites per documents (2 years) for 2012 was 1.619, with a ranking of 267/1,588 (16.8%) in the category of medicine (miscellaneous). The 2012 SJR was 0.464, with a ranking of 348/1,588 (21.9%) in the category of medicine (miscellaneous). The Hirsch index from KoreaMed Synapse, Web of Science, and SCImago Journal & Country Rank were 12, 15, and 19, respectively. In comparison with data from 2010, the values of all the journal metrics increased consistently. These results reflect favorably on the increased competency of editors and authors of The Korean Journal of Internal Medicine.

  20. How far has The Korean Journal of Internal Medicine advanced in terms of journal metrics?

    PubMed Central

    2013-01-01

    The Korean Journal of Internal Medicine has already been valued as an international journal, according to a citation analysis in 2011. Now, 2 years later, I would like to confirm how much the Journal has advanced from the point of view of journal metrics by looking at the impact factor, cites per document (2 years), SCImago Journal Rank (SJR), and the Hirsch index. These were obtained from a variety of databases, such as the Korean Medical Citation Index, KoreaMed Synapse, Web of Science, JCR Web, and SCImago Journal & Country Rank. The manually calculated 2012 impact factor was 1.252 in the Web of Science, with a ranking of 70/151 (46.4%) in the category of general and internal medicine. Cites per document (2 years) for 2012 was 1.619, with a ranking of 267/1,588 (16.8%) in the category of medicine (miscellaneous). The 2012 SJR was 0.464, with a ranking of 348/1,588 (21.9%) in the category of medicine (miscellaneous). The Hirsch indices from KoreaMed Synapse, Web of Science, and SCImago Journal & Country Rank were 12, 15, and 19, respectively. In comparison with data from 2010, the values of all the journal metrics increased consistently. These results reflect favorably on the increased competency of editors and authors of The Korean Journal of Internal Medicine. PMID:24307835

  1. VO-Dance: an IVOA tool to easily publish data into the VO, and its extension to planetology requests

    NASA Astrophysics Data System (ADS)

    Smareglia, R.; Capria, M. T.; Molinaro, M.

    2012-09-01

    Data publishing through self-standing portals can be joined to VO resource publishing, i.e. astronomical resources deployed through VO-compliant services. Since the IVOA (International Virtual Observatory Alliance) provides many protocols and standards for the various data flavors (images, spectra, catalogues … ), and since the data center aims to grow the number of archives and services it hosts and provides, the idea arose to find a way to easily deploy and maintain VO resources. VO-Dance is a Java web application developed at IA2 that addresses this idea by creating, in a dynamical way, VO resources out of database tables or views. It is structured to be potentially DBMS and platform independent and consists of three main components, including an internal DB to store resource descriptions and data model metadata, and a restful web application to deploy the resources to the VO community. Its extension to planetology requests is under study, to make the best use of INAF software development effort and archive efficiency.

  2. The behaviour and sexual health of young international travellers (backpackers) in Australia.

    PubMed

    McNulty, A M; Egan, C; Wand, H; Donovan, B

    2010-06-01

    To study the demographics, risk behaviours and morbidity of young long-term international travellers (backpackers) attending a sexual health service in Sydney, Australia. Data on new patients were extracted from the Sydney Sexual Health Centre database for the period 1998 to 2006. The sexual risk behaviours and morbidity of the backpackers were compared with other patients of a similar age. The 5698 backpackers who attended the centre reported higher numbers of sexual partners (three or more partners in the past 3 months, 18% vs 12%, p<0.001) and a greater proportion drank alcohol at hazardous levels (22%) than the comparison group (9%, p<0.001). Rates of consistent (100%) condom use in the past 3 months were low in both backpackers (22%) and the comparison population (19%). Backpackers had higher rates of genital chlamydia infection (7% vs 5%, p<0.001) and reported higher rates of previous sexually transmitted infections (15% vs 10%, p<0.001). Backpackers should be a priority population for sexual health promotion and access to services.

  3. Utilisation, Reliability and Validity of Clinical Evaluation Exercise in Otolaryngology Training.

    PubMed

    Awad, Z; Hayden, L; Muthuswamy, K; Tolley, N S

    2015-10-01

    To investigate the utilisation, reliability and validity of the clinical evaluation exercise (CEX) in otolaryngology training. Retrospective database analysis. Online assessment database. We analysed all CEXs submitted by north London core (CT) and speciality trainees (ST) in otolaryngology from 2010 to 2013. Outcome measures were the internal consistency of the 7 CEX items, each rated as O (outstanding), S (satisfactory) or D (development required), and an overall performance rating (pS) of 1-4 assessed against completion-of-training level. Receiver operating characteristic analysis was used to describe CEX sensitivity and specificity. Overall score (cS), pS and the number of 'D'-rated items were used to investigate construct validity. One thousand one hundred and sixty CEXs from 45 trainees were included. CEX showed good internal consistency (Cronbach's alpha = 0.85). CEX was highly sensitive (99%), yet not specific (6%). cS and pS for STs were higher than for CTs (99.1% ± 0.4 versus 96.6% ± 0.8 and 3.06 ± 0.05 versus 1.92 ± 0.04, respectively; P < 0.001). pS showed a significant stepwise increase from CT1 to ST6 (P < 0.001). In contrast, cS only showed improvement up to ST4 (P = 0.025). The most frequently utilised item, 'management and follow-up planning', was found to be the best predictor of cS and pS (rs = +0.69 and +0.21, respectively). CEX is reliable in assessing early-years otolaryngology trainees in clinical examination, but not at a higher level. It has the potential to be used in a summative capacity in selecting trainees for ST positions. This would also encourage trainees to master all domains of otolaryngology clinical examination by the end of CT. © 2015 John Wiley & Sons Ltd.
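
    As background to the internal-consistency figure quoted above, Cronbach's alpha can be computed from a matrix of item scores as sketched below; the numeric coding of the O/S/D ratings and the example scores are invented for illustration and are not the study's data.

        import numpy as np

        def cronbach_alpha(scores):
            """scores: 2-D array, rows = assessments, columns = the 7 CEX items.
            alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1)
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars.sum() / total_var)

        # Hypothetical ratings for five assessments, coded as D=0, S=1, O=2.
        ratings = [[1, 1, 2, 1, 1, 2, 1],
                   [0, 1, 1, 1, 0, 1, 1],
                   [2, 2, 2, 1, 2, 2, 2],
                   [1, 0, 1, 0, 1, 1, 0],
                   [2, 1, 2, 2, 2, 2, 1]]
        print(round(cronbach_alpha(ratings), 2))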

  4. IRIS TOXICOLOGICAL REVIEW AND SUMMARY ...

    EPA Pesticide Factsheets

    EPA's assessment of the noncancer health effects and carcinogenic potential of Beryllium was added to the IRIS database in 1998. The IRIS program is updating the IRIS assessment for Beryllium. This update will incorporate health effects information published since the last assessment was prepared as well as new risk assessment methods. The IRIS assessment for Beryllium will consist of an updated Toxicological Review and IRIS Summary. The Toxicological Review is a critical review of the physicochemical and toxicokinetic properties of the chemical and its toxicity in humans and experimental systems. The assessment will present reference values for noncancer effects of Beryllium (RfD and RfC) and a cancer assessment. The Toxicological Review and IRIS Summary will be subject to internal peer consultation, Agency and Interagency review, and external scientific peer review. The final products will constitute the Agency's opinion on the toxicity of Beryllium. Beryllium is a light alkaline earth metal used in metal alloys and in high-performance products in the metallurgical, aerospace, and nuclear industries. According to the Superfund database, beryllium is found in over 300 NPL sites

  5. Read Code Quality Assurance

    PubMed Central

    Schulz, Erich; Barrett, James W.; Price, Colin

    1998-01-01

    As controlled clinical vocabularies assume an increasing role in modern clinical information systems, so the issue of their quality demands greater attention. In order to meet the resulting stringent criteria for completeness and correctness, a quality assurance system comprising a database of more than 500 rules is being developed and applied to the Read Thesaurus. The authors discuss the requirement to apply quality assurance processes to their dynamic editing database in order to ensure the quality of exported products. Sources of errors include human, hardware, and software factors as well as new rules and transactions. The overall quality strategy includes prevention, detection, and correction of errors. The quality assurance process encompasses simple data specification, internal consistency, inspection procedures and, eventually, field testing. The quality assurance system is driven by a small number of tables and UNIX scripts, with “business rules” declared explicitly as Structured Query Language (SQL) statements. Concurrent authorship, client-server technology, and an initial failure to implement robust transaction control have all provided valuable lessons. The feedback loop for error management needs to be short. PMID:9670131

  6. Read Code quality assurance: from simple syntax to semantic stability.

    PubMed

    Schulz, E B; Barrett, J W; Price, C

    1998-01-01

    As controlled clinical vocabularies assume an increasing role in modern clinical information systems, so the issue of their quality demands greater attention. In order to meet the resulting stringent criteria for completeness and correctness, a quality assurance system comprising a database of more than 500 rules is being developed and applied to the Read Thesaurus. The authors discuss the requirement to apply quality assurance processes to their dynamic editing database in order to ensure the quality of exported products. Sources of errors include human, hardware, and software factors as well as new rules and transactions. The overall quality strategy includes prevention, detection, and correction of errors. The quality assurance process encompasses simple data specification, internal consistency, inspection procedures and, eventually, field testing. The quality assurance system is driven by a small number of tables and UNIX scripts, with "business rules" declared explicitly as Structured Query Language (SQL) statements. Concurrent authorship, client-server technology, and an initial failure to implement robust transaction control have all provided valuable lessons. The feedback loop for error management needs to be short.
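
    The idea of declaring quality-assurance "business rules" as SQL statements can be sketched roughly as below; the table layout, rule names and rule text are invented for illustration and do not reflect the actual Read Thesaurus schema or rule set.

        import sqlite3

        # Hypothetical concept table standing in for an editing database.
        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE concept (code TEXT PRIMARY KEY, term TEXT, parent_code TEXT);
            INSERT INTO concept VALUES ('A1', 'Asthma', NULL);
            INSERT INTO concept VALUES ('A1.1', 'Severe asthma', 'A1');
            INSERT INTO concept VALUES ('B9', '', 'ZZ');   -- violates both rules below
        """)

        # Each business rule is a SELECT statement returning the offending rows.
        rules = {
            "term must not be empty":
                "SELECT code FROM concept WHERE term = ''",
            "parent code must exist":
                """SELECT c.code FROM concept c
                   LEFT JOIN concept p ON c.parent_code = p.code
                   WHERE c.parent_code IS NOT NULL AND p.code IS NULL""",
        }

        for name, sql in rules.items():
            offenders = [row[0] for row in conn.execute(sql)]
            if offenders:
                print(f"Rule violated: {name} -> {offenders}")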

  7. Impacts of European drought events: insights from an international database of text-based reports

    NASA Astrophysics Data System (ADS)

    Stahl, K.; Kohn, I.; Blauhut, V.; Urquijo, J.; De Stefano, L.; Acacio, V.; Dias, S.; Stagge, J. H.; Tallaksen, L. M.; Kampragou, E.; Van Loon, A. F.; Barker, L. J.; Melsen, L. A.; Bifulco, C.; Musolino, D.; de Carli, A.; Massarutto, A.; Assimacopoulos, D.; Van Lanen, H. A. J.

    2015-09-01

    Drought is a natural hazard that can cause a wide range of impacts affecting the environment, society, and the economy. Assessing and reducing vulnerability to these impacts for regions beyond the local scale, spanning political and sectoral boundaries, requires systematic and detailed data regarding impacts. This study presents an assessment of the diversity of drought impacts across Europe based on the European Drought Impact report Inventory (EDII), a unique research database that has collected close to 5000 impact reports from 33 European countries. The reported drought impacts were classified into major impact categories, each of which had a number of subtypes. The distribution of these categories and types was then analyzed over time, by country, across Europe and for particular drought events. The results show that impacts on agriculture and public water supply dominate the collection of drought impact reports for most countries and for all major drought events since the 1970s, while the number and relative fractions of reported impacts in other sectors can vary regionally and from event to event. The data also show that reported impacts have increased over time as more media and website information has become available and environmental awareness has increased. Even though the distribution of impact categories is relatively consistent across Europe, the details of the reports show some differences. They confirm severe impacts in southern regions (particularly on agriculture and public water supply) and sector-specific impacts in central and northern regions (e.g. on forestry or energy production). As a text-based database, the EDII presents a new challenge for quantitative analysis; however, it provides a new and more comprehensive view of drought impacts. Related studies have already developed statistical techniques to evaluate the link between drought indices and impacts using the EDII. The EDII is a living database and is a promising source for further research on drought impacts, vulnerabilities, and risks across Europe. A key result is the documentation of the extensive variety of impacts found across Europe. This data coverage may help drought policy planning at national to international levels.

  8. ERIC: Sphinx or Golden Griffin?

    ERIC Educational Resources Information Center

    Lopez, Manuel D.

    1989-01-01

    Evaluates the Educational Resources Information Center (ERIC) database. Summarizes ERIC's history and organization, and discusses criticisms concerning access, currency, and database content. Reviews role of component clearinghouses, indexing practices, thesaurus structure, international coverage, and comparative studies. Finds ERIC a valuable…

  9. Patent Family Databases.

    ERIC Educational Resources Information Center

    Simmons, Edlyn S.

    1985-01-01

    Reports on retrieval of patent information online and includes definition of patent family, basic and equivalent patents, "parents and children" applications, designated states, patent family databases--International Patent Documentation Center, World Patents Index, APIPAT (American Petroleum Institute), CLAIMS (IFI/Plenum). A table…

  10. Online-data Bases On Natural-hazard Research, Early-warning Systems and Operative Disaster Prevention Programs

    NASA Astrophysics Data System (ADS)

    Hermanns, R. L.; Zentel, K.-O.; Wenzel, F.; Hövel, M.; Hesse, A.

    In order to benefit from synergies and to avoid replication in the field of disaster reduction programs and related scientific projects, it is important to create an overview of the state of the art, the fields of activity and their key aspects. Therefore, the German Committee for Disaster Reduction intends to document projects and institutions related to natural disaster prevention in three databases. One database is designed to document scientific programs and projects related to natural hazards. In a first step, data acquisition concentrated on projects carried out by German institutions. In a second step, projects from all other European countries will be archived. The second database focuses on projects on early-warning systems and has no regional limit. Data mining started in November 2001 and will be finished soon. The third database documents operational projects dealing with disaster prevention and concentrates on international projects or internationally funded projects. These databases will be available on the internet at the end of spring 2002 (http://www.dkkv.org) and will be updated continuously. They will allow rapid and concise access to information on various international projects, provide up-to-date descriptions, and facilitate exchange, as all relevant information including contact addresses is available to the public. The aim of this contribution is to present the concepts and the work done so far, to invite participation, and to contact other organizations with similar objectives.

  11. Health Technology Assessment in nursing: a literature review.

    PubMed

    Ramacciati, N

    2013-03-01

    The Health Technology Assessment (HTA) approach, which provides scientific support for the decisions taken within the health field, is of increasing importance worldwide. In a context of limited resources, HTA has the potential of being an efficient tool for addressing the sustainability problems and the allocation choices arising from the constant increase in demand. This study aims to investigate HTA use in nursing, both in terms of quantifying HTA evaluations of nursing phenomena which have been conducted and in terms of the extent to which nursing has used the HTA approach. The Italian context has been analysed because of the growing diffusion of the HTA in Italy along with the recent developments in the nursing profession. A narrative review of international literature was undertaken using the following databases: HTA, PubMed, CINAHL, ILISI. Seventy evaluation studies on nursing were identified from the HTA database (1.12% of all studies in the database). The areas of nursing intervention and the country of origin of the studies were identified. Two nursing studies on the HTA approach were found in the PubMed, CINAHL and HTA databases. The first focused on the evaluation of nursing technology process and analysed 126 studies in six main thematic areas; the second was a systematic review on HTA in nursing and analysed 192 studies (46 meta-analyses, 31 Finnish primary studies, 117 international primary studies). Three Italian studies were identified from the ILISI database and Italian grey literature. In the international literature, although analyses regarding the efficacy of nursing interventions have been conducted, there are to date very few research projects that focus exclusively on the HTA process as applied to nursing. The recent development of a standardized nursing language coupled with the open debate as to which research method (qualitative vs. quantitative) best serves to 'read' nursing phenomena may explain the scarce diffusion of HTA in the field of nursing. © 2012 The Author. International Nursing Review © 2012 International Council of Nurses.

  12. Preparing College Students To Search Full-Text Databases: Is Instruction Necessary?

    ERIC Educational Resources Information Center

    Riley, Cheryl; Wales, Barbara

    Full-text databases allow Central Missouri State University's clients to access some of the serials that libraries have had to cancel due to escalating subscription costs; EbscoHost, the subject of this study, is one such database. The database is available free to all Missouri residents. A survey was designed consisting of 21 questions intended…

  13. 75 FR 36536 - Office of the Special Inspector General for the Troubled Asset Relief Program; Privacy Act of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-28

    ... identifying individual criminal offenders and alleged offenders and consisting only of identifying data and... to 5 U.S.C. 552a(j)(2): DO .220--SIGTARP Hotline Database. DO .221--SIGTARP Correspondence Database. DO .222--SIGTARP Investigative MIS Database. DO .223--SIGTARP Investigative Files Database. DO .224...

  14. Interactive, Automated Management of Icing Data

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.

    2009-01-01

    IceVal DatAssistant is software that provides an automated, interactive solution for the management of data from research on aircraft icing. This software consists primarily of (1) a relational database component used to store ice shape and airfoil coordinates and associated data on operational and environmental test conditions and (2) a graphically oriented database access utility, used to upload, download, process, and/or display data selected by the user. The relational database component consists of a Microsoft Access 2003 database file with nine tables containing data of different types. Included in the database are the data for all publicly releasable ice tracings with complete and verifiable test conditions from experiments conducted to date in the Glenn Research Center Icing Research Tunnel. Ice shapes from computational simulations with the corresponding conditions performed utilizing the latest version of the LEWICE ice shape prediction code are likewise included, and are linked to the equivalent experimental runs. The database access component includes ten Microsoft Visual Basic 6.0 (VB) form modules and three VB support modules. Together, these modules enable uploading, downloading, processing, and display of all data contained in the database. This component also affords the capability to perform various database maintenance functions, for example compacting the database or creating a new, fully initialized but empty database file.
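
    A toy version of the experiment-to-simulation linkage described above, pairing each experimental run with its computational counterpart, is sketched below using Python's built-in sqlite3 module; the table names and columns are invented and are not the IceVal DatAssistant schema.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE experiment (run_id INTEGER PRIMARY KEY, airfoil TEXT,
                                     temperature_c REAL, lwc_g_m3 REAL);
            CREATE TABLE simulation (sim_id INTEGER PRIMARY KEY, run_id INTEGER,
                                     code_name TEXT,
                                     FOREIGN KEY (run_id) REFERENCES experiment(run_id));
            INSERT INTO experiment VALUES (1, 'NACA 0012', -10.0, 0.5);
            INSERT INTO simulation VALUES (100, 1, 'LEWICE');
        """)

        # Retrieve each experimental run together with its linked simulation.
        query = """SELECT e.run_id, e.airfoil, e.temperature_c, s.code_name
                   FROM experiment e JOIN simulation s ON s.run_id = e.run_id"""
        for row in conn.execute(query):
            print(row)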

  15. Infant-Feeding Intentions and Practices of Internal Medicine Physicians.

    PubMed

    Sattari, Maryam; Serwint, Janet R; Shuster, Jonathan J; Levine, David M

    2016-05-01

    Personal breastfeeding behavior of physician mothers is associated with their clinical breastfeeding advocacy, which in turn impacts patients' breastfeeding behavior. Internists can play an important role in breastfeeding advocacy as they usually come in contact with mothers longitudinally. To explore the personal infant-feeding decisions and behavior of physician mothers in internal medicine (IM). Physicians with current or previous IM training were isolated from our "Breastfeeding Among Physicians" database. The data in the database were gathered from cross-sectional surveys of 130 physician volunteers, mainly affiliated with the Johns Hopkins University School of Medicine (Baltimore, MD) and the University of Florida College of Medicine (Gainesville, FL). Seventy-two mothers reported current or previous IM training and had 196 infants. Breastfeeding rates were 96% at birth, 77% at 6 months, and 40% at 12 months. Exclusive breastfeeding rates were 78% at birth, 67% at 3 months, and 30% at 6 months. While maternal goal for breastfeeding duration correlated with duration of both exclusive and any breastfeeding, there was a consistent and appreciable disparity between maternal duration goal and actual breastfeeding duration. The participants reported work-related reasons for early supplementation and breastfeeding cessation. We have described for the first time in the literature the personal infant-feeding intentions and behavior of a cohort of IM physician mothers. Workplace interventions to enable internists to maintain breastfeeding after return to work and to achieve their breastfeeding goals might improve the health of these mothers and their infants and positively impact their clinical breastfeeding advocacy.

  16. Investigative and extrapolative role of microRNAs' genetic expression in breast carcinoma.

    PubMed

    Usmani, Ambreen; Shoro, Amir Ali; Shirazi, Bushra; Memon, Zahida

    2016-01-01

    MicroRNAs (miRs) are non-coding ribonucleic acids consisting of about 18-22 nucleotide bases. Expression of several miRs can be altered in breast carcinomas in comparison to healthy breast tissue, or between various subtypes of breast cancer. They act as either oncogenes or tumor suppressors, which shows that their expression is altered in cancers. Some miRs are specifically associated with breast cancer and are affected by cancer-restricted signaling pathways, e.g. downstream of estrogen receptor-α or HER2/neu. The connection of multiple miRs with breast cancer, and the fact that most of these post-transcriptional structures may transform complex functional networks of mRNAs, identify them as potential investigative, extrapolative and predictive tumor markers, as well as possible targets for treatment. The investigative tools currently available are RNA-based molecular techniques. An additional advantage of miRs in oncology is that they are remarkably stable and are detectable in serum and plasma. A literature search was performed using the PubMed database; the keywords used were microRNA (52 searches) AND breast cancer (169 searches). PERN was accessed through the Bahria University database; this included literature and articles from international sources, and 2 articles from Pakistan on this topic were consulted (one in an international journal and one in a local journal). Of these, 49 articles that discussed the relation of microRNA genetic expression to breast cancer were shortlisted. These articles were consulted for this review.

  17. The VO-Dance web application at the IA2 data center

    NASA Astrophysics Data System (ADS)

    Molinaro, Marco; Knapic, Cristina; Smareglia, Riccardo

    2012-09-01

    The Italian center for Astronomical Archives (IA2, http://ia2.oats.inaf.it) is a national infrastructure project of the Italian National Institute for Astrophysics (Istituto Nazionale di AstroFisica, INAF) that provides services for the astronomical community. Besides data hosting for the Large Binocular Telescope (LBT) Corporation, the Galileo National Telescope (Telescopio Nazionale Galileo, TNG) Consortium and other telescopes and instruments, IA2 offers proprietary and public data access through user portals (both developed and mirrored) and deploys resources complying with the Virtual Observatory (VO) standards. Archiving systems and web interfaces are developed to be extremely flexible about adding new instruments from other telescopes. VO resource publishing, along with the data access portals, implements the International Virtual Observatory Alliance (IVOA) protocols, providing astronomers with new ways of analyzing data. Given the large variety of data flavours and IVOA standards, the need arises for tools to easily accomplish data ingestion and data publishing. This paper describes the VO-Dance tool, which IA2 started developing to address VO resource publishing in a dynamical way from already existing database tables or views. The tool consists of a Java web application, potentially DBMS and platform independent, that internally stores the services' metadata and information, exposes restful endpoints to accept VO queries for these services, and dynamically translates calls to these endpoints into SQL queries coherent with the published table or view. In response to the call, VO-Dance translates the database answer back in a VO-compliant way.
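
    The dynamic translation step described above, turning a restful call into an SQL query against a published table or view, might look roughly like the sketch below; the metadata record, parameter names and bounding-box query shape are assumptions for illustration and are not the actual VO-Dance implementation.

        # Invented metadata describing one published resource.
        resource_metadata = {
            "table": "photometry_catalogue",    # assumed table or view name
            "ra_col": "ra", "dec_col": "dec",   # assumed column mapping
        }

        def build_query(meta, ra, dec, radius):
            """Map (RA, DEC, SR) request parameters onto an SQL query.
            A simple bounding box stands in for a proper angular cone search."""
            return (f"SELECT * FROM {meta['table']} "
                    f"WHERE {meta['ra_col']} BETWEEN {ra - radius} AND {ra + radius} "
                    f"AND {meta['dec_col']} BETWEEN {dec - radius} AND {dec + radius}")

        print(build_query(resource_metadata, ra=150.1, dec=2.2, radius=0.05))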

  18. Incident reporting in one UK accident and emergency department.

    PubMed

    Tighe, Catherine M; Woloshynowych, Maria; Brown, Ruth; Wears, Bob; Vincent, Charles

    2006-01-01

    Greater focus is needed on improving patient safety in modern healthcare systems and the first step to achieving this is to reliably identify the safety issues arising in healthcare. Research has shown the accident and emergency (A&E) department to be a particularly problematic environment where safety is a concern due to various factors, such as the range, nature and urgency of presenting conditions and the high turnover of patients. As in all healthcare environments, clinical incident reporting in A&E is an important tool for detecting safety issues which can result in identifying solutions, learning from error and enhancing patient safety. This tool must be responsive and flexible to the local circumstances and work for the department to support the clinical governance agenda. In this paper, we describe the local processes for reporting and reviewing clinical incidents in one A&E department in a London teaching hospital and report recent changes to the system within the department. We used the historical data recorded on the Trust incident database as a representation of the information that would be available to the department in order to identify the high risk areas. In this paper, we evaluate the internal processes, the information available on the database and make recommendations to assist the emergency department in their internal processes. These will strengthen the internal review and staff feedback system so that the department can learn from incidents in a consistent manner. The process was reviewed by detailed examination of the centrally held electronic record (Datix database) of all incidents reported in a one year period. The nature of the incident and the level and accuracy of information provided in the incident reports were evaluated. There were positive aspects to the established system including evidence of positive changes made as a result of the reporting process, new initiatives to feed back to staff, and evolution of the programme for reporting and discussing the incidents internally. There appeared to be a mismatch between the recorded events and the category allocated to the incident in the historical record. In addition, the database did not contain complete information for every incident, contributory factors were rarely recorded and relatively large numbers of incidents were recorded as "other" in the type of incident. Difficulty in updating the system was also observed, as there is at least a month's time lag between the reporting of an incident and the discussion/resolution of issues at the local departmental clinical risk management committee meetings. We used Leape's model for assessing the reporting system as a whole and found the system in the department to be relatively safe, fairly easy to use and moderately effective. Recommendations as a result of this study include the introduction of an electronic reporting system, limiting the number of staff who categorise the incidents (using clear definitions for classifications, including a structured framework for contributory factors), and a process that allows incidents to be updated on the database locally after the discussion. This research may have implications for the incident reporting process in other specialities as well as in other hospitals.

  19. International Database of Volcanic Ash Impacts

    NASA Astrophysics Data System (ADS)

    Wallace, K.; Cameron, C.; Wilson, T. M.; Jenkins, S.; Brown, S.; Leonard, G.; Deligne, N.; Stewart, C.

    2015-12-01

    Volcanic ash creates extensive impacts to people and property, yet we lack a global ash impacts catalog to organize, distribute, and archive this important information. Critical impact information is often stored in ephemeral news articles or other isolated resources, which cannot be queried or located easily. A global ash impacts database would improve 1) warning messages, 2) public and lifeline emergency preparation, and 3) eruption response and recovery. Ashfall can have varying consequences, such as disabling critical lifeline infrastructure (e.g. electrical generation and transmission, water supplies, telecommunications, aircraft and airports) or merely creating limited and expensive inconvenience to local communities. Impacts to the aviation sector can be a far-reaching global issue. The international volcanic ash impacts community formed a committee to develop a database to catalog the impacts of volcanic ash. We identify three user populations for this database: 1) research teams, who would use the database to assist in systematic collection, recording, and storage of ash impact data, and to prioritize impact assessment trips and lab experiments 2) volcanic risk assessment scientists who rely on impact data for assessments (especially vulnerability/fragility assessments); a complete dataset would have utility for global, regional, national and local scale risk assessments, and 3) citizen science volcanic hazard reporting. Publication of an international ash impacts database will encourage standardization and development of best practices for collecting and reporting impact information. Data entered will be highly categorized, searchable, and open source. Systematic cataloging of impact data will allow users to query the data and extract valuable information to aid in the development of improved emergency preparedness, response and recovery measures.

  20. HepSEQ: International Public Health Repository for Hepatitis B

    PubMed Central

    Gnaneshan, Saravanamuttu; Ijaz, Samreen; Moran, Joanne; Ramsay, Mary; Green, Jonathan

    2007-01-01

    HepSEQ is a repository for an extensive library of public health and molecular data relating to hepatitis B virus (HBV) infection collected from international sources. It is hosted by the Centre for Infections, Health Protection Agency (HPA), England, United Kingdom. This repository has been developed as a web-enabled, quality-controlled database to act as a tool for surveillance, HBV case management and for research. The web front-end for the database system can be accessed from . The format of the database system allows for comprehensive molecular, clinical and epidemiological data to be deposited into a functional database, to search and manipulate the stored data and to extract and visualize the information on epidemiological, virological, clinical, nucleotide sequence and mutational aspects of HBV infection through web front-end. Specific tools, built into the database, can be utilized to analyse deposited data and provide information on HBV genotype, identify mutations with known clinical significance (e.g. vaccine escape, precore and antiviral-resistant mutations) and carry out sequence homology searches against other deposited strains. Further mechanisms are also in place to allow specific tailored searches of the database to be undertaken. PMID:17130143

  1. TIMSS 2011 User Guide for the International Database. Supplement 2: National Adaptations of International Background Questionnaires

    ERIC Educational Resources Information Center

    Foy, Pierre, Ed.; Arora, Alka, Ed.; Stanco, Gabrielle M., Ed.

    2013-01-01

    This supplement describes national adaptations made to the international version of the TIMSS 2011 background questionnaires. This information provides users with a guide to evaluate the availability of internationally comparable data for use in secondary analyses involving the TIMSS 2011 background variables. Background questionnaire adaptations…

  2. Deaths from international terrorism compared with road crash deaths in OECD countries

    PubMed Central

    Wilson, N; Thomson, G

    2005-01-01

    Methods: Data on deaths from international terrorism (US State Department database) were collated (1994–2003) and compared to the road injury deaths (year 2000 and 2001 data) from the OECD International Road Transport Accident Database. Results: In the 29 OECD countries for which comparable data were available, the annual average death rate from road injury was approximately 390 times that from international terrorism. The ratio of annual road to international terrorism deaths (averaged over 10 years) was lowest for the United States at 142 times. In 2001, road crash deaths in the US were equal to those from a September 11 attack every 26 days. Conclusions: There is a large difference in the magnitude of these two causes of deaths from injury. Policy makers need to be aware of this when allocating resources to preventing these two avoidable causes of mortality. PMID:16326764

  3. The Decision Making Trial and Evaluation Laboratory (Dematel) and Analytic Network Process (ANP) for Safety Management System Evaluation Performance

    NASA Astrophysics Data System (ADS)

    Rolita, Lisa; Surarso, Bayu; Gernowo, Rahmat

    2018-02-01

    In order to improve airport safety management system (SMS) performance, an evaluation system is required to address current shortcomings and maximize safety. This study suggests integrating the DEMATEL and ANP methods in decision-making processes, analyzing causal relations between the relevant criteria and supporting effective analysis-based decisions. The DEMATEL method builds on the ANP method by identifying the interdependencies between criteria. The input consists of questionnaire data obtained online and then stored in an online database. The questionnaire data are then processed using the DEMATEL and ANP methods to determine the relationships between criteria and to identify the criteria that need to be evaluated. The case studies for this evaluation system were Adi Sutjipto International Airport, Yogyakarta (JOG); Ahmad Yani International Airport, Semarang (SRG); and Adi Sumarmo International Airport, Surakarta (SOC). The integration ranks the SMS performance criterion weights in descending order as follows: safety and destination policy, safety risk management, healthcare, and safety awareness. Sturges' formula classified the results into nine grades: JOG and SRG airports were in grade 8, while SOC airport was in grade 7.
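
    To make the combination of methods concrete, the sketch below shows the core DEMATEL step, normalising a direct-influence matrix and computing the total-relation matrix, together with Sturges' rule for choosing the number of grades; the influence values and sample size are invented and are not the study's questionnaire data.

        import math
        import numpy as np

        # Hypothetical 4x4 direct-influence matrix among the four SMS criteria.
        D = np.array([[0, 3, 2, 4],
                      [2, 0, 1, 3],
                      [1, 2, 0, 2],
                      [3, 2, 2, 0]], dtype=float)

        # DEMATEL: normalise by the largest row sum, then T = X (I - X)^-1.
        X = D / D.sum(axis=1).max()
        T = X @ np.linalg.inv(np.eye(4) - X)

        prominence = T.sum(axis=1) + T.sum(axis=0)   # overall importance of each criterion
        relation = T.sum(axis=1) - T.sum(axis=0)     # net cause (+) or net effect (-)
        print(np.round(prominence, 2), np.round(relation, 2))

        # Sturges' rule for the number of grades, with an assumed number of responses.
        n = 250
        print(math.ceil(1 + math.log2(n)))           # about nine grades for n = 250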

  4. Clinical characteristics predicting internal neurofibromas in 357 children with neurofibromatosis-1: results from a cross-sectional study

    PubMed Central

    2012-01-01

    Objective To identify clinical characteristics associated with internal neurofibromas in children with NF1, as a means of ensuring the early identification of patients at high risk for malignant peripheral nerve-sheath tumors developed from preexisting internal neurofibromas. Patients and methods We used data from two NF1 populations, in France and North America, respectively. The French database comprised 1083 patients meeting NIH diagnostic criteria for NF1 and the Neurofibromatosis Institute Database of North America comprised 703 patients. Patients younger than 17 years of age were eligible for our study if they had been evaluated for internal neurofibromas using computed tomography and/or magnetic resonance imaging. Clinical characteristics associated with internal neurofibromas by univariate analysis (P ≤ 0.15) were entered into a multiple logistic regression model after checking for potential interactions and confounding. Multiple imputation was used for missing values. Results Among the 746 children in the two databases, 357 (48%) met our inclusion criteria. Their mean age was 7.7 ± 5.0 years and there were 192 (53.8%) males. Internal neurofibromas were present in 35 (9.8%) patients. Internal neurofibromas developed earlier in females than in males and their prevalence increased during adolescence. Factors independently associated with internal neurofibromas were age (OR = 1.16 [1.07-1.27]), xanthogranulomas (OR = 5.85 [2.18-15.89]) and presence of both subcutaneous and plexiform neurofibromas (OR = 6.80 [1.52-30.44]). Conclusions Several easily recognizable clinical characteristics indicate a high risk of internal neurofibromas in children with NF1 and, therefore, a need for very close monitoring. PMID:22943186

  5. TOPDOM: database of conservatively located domains and motifs in proteins.

    PubMed

    Varga, Julia; Dobson, László; Tusnády, Gábor E

    2016-09-01

    The TOPDOM database, originally created as a collection of domains and motifs located consistently on the same side of the membrane in α-helical transmembrane proteins, has been updated and extended by taking into consideration consistently localized domains and motifs in globular proteins, too. By taking advantage of the recently developed CCTOP algorithm to determine the type of a protein and predict topology in the case of transmembrane proteins, and by applying a thorough search for domains and motifs as well as utilizing the most up-to-date version of all source databases, we managed to reach a 6-fold increase in the size of the whole database and a 2-fold increase in the number of transmembrane proteins. The TOPDOM database is available at http://topdom.enzim.hu. The webpage utilizes the common Apache, PHP5 and MySQL software to provide the user interface for accessing and searching the database. The database itself is generated on a high-performance computer. Contact: tusnady.gabor@ttk.mta.hu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  6. 76 FR 67732 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... proposed information collection project: ``Nursing Home Survey on Patient Safety Culture Comparative... Nursing Home Survey on Patient Safety Culture Comparative Database The Agency for Healthcare Research and... Culture (Nursing Home SOPS) Comparative Database. The Nursing Home SOPS Comparative Database consists of...

  7. Development and Implementation of a Segment/Junction Box Level Database for the ITS Fiber Optic Conduit Network

    DOT National Transportation Integrated Search

    2012-03-01

    This project initiated the development of a computerized database of ITS facilities, including conduits, junction boxes, cameras, connections, etc. The current system consists of a database of conduit sections of various lengths. Over the length ...

  8. Cross-checking of Large Evaluated and Experimental Nuclear Reaction Databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeydina, O.; Koning, A.J.; Soppera, N.

    2014-06-15

    Automated methods are presented for the verification of large experimental and evaluated nuclear reaction databases (e.g. EXFOR, JEFF, TENDL). These methods allow an assessment of the overall consistency of the data and detect aberrant values in both evaluated and experimental databases.
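
    One simple way to flag aberrant values of the kind these automated checks target, without claiming anything about the actual EXFOR/TENDL algorithms, is a robust outlier test on values reported for the same reaction and energy, as in the sketch below with invented numbers.

        import statistics

        # Hypothetical cross sections (barns) reported by different entries for the
        # same reaction at the same energy; one value is clearly aberrant.
        values = [1.02, 0.98, 1.05, 0.97, 1.01, 9.87, 1.00]

        median = statistics.median(values)
        mad = statistics.median(abs(v - median) for v in values)  # median absolute deviation

        # Flag points whose robust z-score exceeds a chosen threshold (here 5, an assumption).
        for v in values:
            z = 0.6745 * (v - median) / mad if mad else 0.0
            if abs(z) > 5:
                print(f"aberrant value: {v} (robust z = {z:.1f})")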

  9. A database application for wilderness character monitoring

    Treesearch

    Ashley Adams; Peter Landres; Simon Kingston

    2012-01-01

    The National Park Service (NPS) Wilderness Stewardship Division, in collaboration with the Aldo Leopold Wilderness Research Institute and the NPS Inventory and Monitoring Program, developed a database application to facilitate tracking and trend reporting in wilderness character. The Wilderness Character Monitoring Database allows consistent, scientifically based...

  10. A psycholinguistic database for traditional Chinese character naming.

    PubMed

    Chang, Ya-Ning; Hsu, Chun-Hsien; Tsai, Jie-Li; Chen, Chien-Liang; Lee, Chia-Ying

    2016-03-01

    In this study, we aimed to provide a large-scale set of psycholinguistic norms for 3,314 traditional Chinese characters, along with their naming reaction times (RTs), collected from 140 Chinese speakers. The lexical and semantic variables in the database include frequency, regularity, familiarity, consistency, number of strokes, homophone density, semantic ambiguity rating, phonetic combinability, semantic combinability, and the number of disyllabic compound words formed by a character. Multiple regression analyses were conducted to examine the predictive powers of these variables for the naming RTs. The results demonstrated that these variables could account for a significant portion of variance (55.8%) in the naming RTs. An additional multiple regression analysis was conducted to demonstrate the effects of consistency and character frequency. Overall, the regression results were consistent with the findings of previous studies on Chinese character naming. This database should be useful for research into Chinese language processing, Chinese education, or cross-linguistic comparisons. The database can be accessed via an online inquiry system (http://ball.ling.sinica.edu.tw/namingdatabase/index.html).
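
    A minimal version of the kind of multiple regression reported here, predicting naming RTs from a few lexical variables, is sketched below with a handful of invented data points; it is not the authors' analysis pipeline or data.

        import numpy as np

        # Invented predictors (log frequency, number of strokes, consistency)
        # and naming RTs in milliseconds for six hypothetical characters.
        X = np.array([[3.2,  8, 0.9],
                      [1.1, 15, 0.4],
                      [2.5, 10, 0.7],
                      [0.8, 18, 0.3],
                      [2.9,  9, 0.8],
                      [1.6, 13, 0.5]])
        rt = np.array([520, 640, 560, 680, 530, 610])

        # Ordinary least squares with an intercept column.
        X1 = np.column_stack([np.ones(len(rt)), X])
        coef, *_ = np.linalg.lstsq(X1, rt, rcond=None)

        fitted = X1 @ coef
        r_squared = 1 - ((rt - fitted) ** 2).sum() / ((rt - rt.mean()) ** 2).sum()
        print(np.round(coef, 1), round(r_squared, 3))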

  11. Aping humans: age and sex effects in chimpanzee (Pan troglodytes) and human (Homo sapiens) personality.

    PubMed

    King, James E; Weiss, Alexander; Sisco, Melissa M

    2008-11-01

    Ratings of 202 chimpanzees on 43 personality descriptor adjectives were used to calculate scores on five domains analogous to the human Five-Factor Model and a chimpanzee-specific Dominance domain. Male and female chimpanzees were divided into five age groups ranging from juvenile to old adult. Internal consistencies and interrater reliabilities of factors were stable across age groups and approximately 6.8 year retest reliabilities were high. Age-related declines in Extraversion and Openness and increases in Agreeableness and Conscientiousness paralleled human age differences. The mean change in absolute standardized units for all five factors was virtually identical in humans and chimpanzees after adjustment for different developmental rates. Consistent with their aggressive behavior in the wild, male chimpanzees were rated as more aggressive, emotional, and impulsive than females. Chimpanzee sex differences in personality were greater than comparable human gender differences. These findings suggest that chimpanzee and human personality develop via an unfolding maturational process. (PsycINFO Database Record (c) 2008 APA, all rights reserved).

  12. The national assessment of shoreline change: A GIS compilation of vector shorelines and associated shoreline change data for the New England and Mid-Atlantic Coasts

    USGS Publications Warehouse

    Himmelstoss, Emily A.; Kratzmann, Meredith G.; Hapke, Cheryl; Thieler, E. Robert; List, Jeffrey

    2010-01-01

    Sandy ocean beaches are a popular recreational destination, often surrounded by communities containing valuable real estate. Development is on the rise despite the fact that coastal infrastructure is subjected to flooding and erosion. As a result, there is an increased demand for accurate information regarding past and present shoreline changes. The U.S. Geological Survey's National Assessment of Shoreline Change Project has compiled a comprehensive database of digital vector shorelines and shoreline-change rates for the New England and Mid-Atlantic Coasts. There is currently no widely accepted standard for analyzing shoreline change. Existing measurement and rate-calculation methods vary from study to study and preclude combining results into statewide or regional assessments. The impetus behind the National Assessment project was to develop a standardized method that is consistent from coast to coast for measuring changes in shoreline position. The goal was to facilitate the process of periodically and systematically updating the results in an internally consistent manner.

  13. Component, Context and Manufacturing Model Library (C2M2L)

    DTIC Science & Technology

    2013-03-01

    Penn State team were stored in a relational database for easy access, storage and maintainability. The relational database consisted of a PostGres ...file into a format that can be imported into the PostGres database. This same custom application was used to generate Microsoft Excel templates...Press Break Forming Equipment 4.14 Manufacturing Model Library Database Structure The data storage mechanism for the ARL PSU MML was a PostGres database

  14. KSC-99pp0311

    NASA Image and Video Library

    1999-03-23

    In the Multi-Payload Processing Facility, Mary Reaves and Richard Rainen, with the Jet Propulsion Laboratory, work on the carrier and horizontal antenna mast for the STS-99 Shuttle Radar Topography Mission (SRTM) while Larry Broms watches. The SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during an 11-day mission in September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  15. KSC-99pp0505

    NASA Image and Video Library

    1999-05-07

    In the Space Station Processing Facility (SSPF), workers (lower right) disconnect the transport vehicle from the Shuttle Radar Topography Mission (SRTM) after moving it into the building for pre-launch preparations. The primary payload on mission STS-99, the SRTM consists of a specially modified radar system that will fly onboard the Space Shuttle during the 11-day mission targeted for launch in September 1999. This radar system will gather data that will result in the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth

  16. Measuring children's regulation of emotion-expressive behavior.

    PubMed

    Bar-Haim, Yair; Bar-Av, Gali; Sadeh, Avi

    2011-04-01

    Emotion regulation has become a pivotal concept in developmental and clinical research. However, the measurement of regulatory processes has proved extremely difficult, particularly in the context of within-subject designs. Here, we describe a formal conceptualization and a new experimental procedure, the Balloons Game, to measure a regulatory component of emotion-expressive behavior. We present the internal consistency and stability of the indices derived from the Balloons Game in a sample of 121 kindergarten children. External validation against measures that have been associated with emotion regulation processes is also provided. The findings suggest that the Balloons Game provides a reliable tool for the study of regulation of emotion expression in young children. PsycINFO Database Record (c) 2011 APA, all rights reserved.

  17. Deaths from international terrorism compared with road crash deaths in OECD countries.

    PubMed

    Wilson, N; Thomson, G

    2005-12-01

    To estimate the relative number of deaths in member countries of the Organisation for Economic Co-operation and Development (OECD) from international terrorism and road crashes. Data on deaths from international terrorism (US State Department database) were collated (1994-2003) and compared to the road injury deaths (year 2000 and 2001 data) from the OECD International Road Transport Accident Database. In the 29 OECD countries for which comparable data were available, the annual average death rate from road injury was approximately 390 times that from international terrorism. The ratio of annual road to international terrorism deaths (averaged over 10 years) was lowest for the United States at 142 times. In 2001, road crash deaths in the US were equal to those from a September 11 attack every 26 days. There is a large difference in the magnitude of these two causes of deaths from injury. Policy makers need to be aware of this when allocating resources to preventing these two avoidable causes of mortality.
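
    The headline ratios can be reproduced with straightforward arithmetic, as in the sketch below; the input figures are rounded, assumed values chosen only to illustrate the calculation and are not the study's source data.

        # Illustrative arithmetic only; all counts below are assumed round figures.
        us_road_deaths_2001 = 42000        # assumed annual US road crash deaths
        sept_11_deaths = 2979              # assumed death toll used for the comparison
        avg_annual_terrorism_deaths = 296  # assumed 10-year US average

        road_deaths_per_day = us_road_deaths_2001 / 365
        print("days of road deaths equal to one September 11:",
              round(sept_11_deaths / road_deaths_per_day))              # about 26
        print("road-to-terrorism death ratio:",
              round(us_road_deaths_2001 / avg_annual_terrorism_deaths)) # about 142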

  18. A hospital-wide clinical findings dictionary based on an extension of the International Classification of Diseases (ICD).

    PubMed

    Bréant, C; Borst, F; Campi, D; Griesser, V; Momjian, S

    1999-01-01

    The use of a controlled vocabulary set in a hospital-wide clinical information system is of crucial importance for many departmental database systems to communicate and exchange information. In the absence of an internationally recognized clinical controlled vocabulary set, a new extension of the International statistical Classification of Diseases (ICD) is proposed. It expands the scope of the standard ICD beyond diagnosis and procedures to clinical terminology. In addition, the common Clinical Findings Dictionary (CFD) further records the definition of clinical entities. The construction of the vocabulary set and the CFD is incremental and manual. Tools have been implemented to facilitate the tasks of defining/maintaining/publishing dictionary versions. The design of database applications in the integrated clinical information system is driven by the CFD which is part of the Medical Questionnaire Designer tool. Several integrated clinical database applications in the field of diabetes and neuro-surgery have been developed at the HUG.

  19. A hospital-wide clinical findings dictionary based on an extension of the International Classification of Diseases (ICD).

    PubMed Central

    Bréant, C.; Borst, F.; Campi, D.; Griesser, V.; Momjian, S.

    1999-01-01

    The use of a controlled vocabulary set in a hospital-wide clinical information system is of crucial importance for many departmental database systems to communicate and exchange information. In the absence of an internationally recognized clinical controlled vocabulary set, a new extension of the International statistical Classification of Diseases (ICD) is proposed. It expands the scope of the standard ICD beyond diagnosis and procedures to clinical terminology. In addition, the common Clinical Findings Dictionary (CFD) further records the definition of clinical entities. The construction of the vocabulary set and the CFD is incremental and manual. Tools have been implemented to facilitate the tasks of defining/maintaining/publishing dictionary versions. The design of database applications in the integrated clinical information system is driven by the CFD which is part of the Medical Questionnaire Designer tool. Several integrated clinical database applications in the field of diabetes and neuro-surgery have been developed at the HUG. PMID:10566451

  20. Intimate partner violence during pregnancy and behavioral problems in children and adolescents: a meta-analysis.

    PubMed

    Silva, Elisabete P; Lemos, Andrea; Andrade, Carlos H S; Ludermir, Ana B

    2018-03-21

    To evaluate the association of intimate partner violence during the gestational period and the development of externalizing and internalizing behavioral problems in children and adolescents. A meta-analysis of cohort and case-control studies was performed, using studies selected from electronic databases. Eligible studies included women who experienced intimate partner violence during pregnancy and their children's behavioral problems. These problems encompass two groups: externalizing problems (expressed by hyperactivity, aggressive and challenging behavior, and delinquency) and internalizing problems (represented by depressive moods, anxiety, and somatic symptoms). The risk of bias was assessed by the Newcastle-Ottawa Quality Assessment Scale (NOS) and the quality of evidence by the Grading of Recommendations, Assessment, Development and Evaluation (GRADE). RevMan 5.3 software was used for the meta-analysis. Of the 687 eligible articles, only seven met all inclusion criteria and consisted of 12,250 mother/child pairs. The age range of the assessed children varied from 10 months to 16 years. The odds of internalizing problems in children exposed to prenatal violence were two-fold higher (OR=2.10, 95% CI: 1.17-3.76) and that of externalizing problems were 1.9-fold higher (95% CI: 1.28-2.83), when compared to children of unexposed mothers. The results of this study are consistent with the hypothesis that women's exposure to intimate partner violence during pregnancy may be associated with behavioral problems of their children, emphasizing the need for greater understanding about the vulnerability of children to adversity in early ages. Copyright © 2018 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
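
    A bare-bones version of the pooling step behind such summary odds ratios, a fixed-effect inverse-variance combination of log odds ratios, is sketched below with invented study values; the published analysis used RevMan 5.3 and may have applied a different model.

        import math

        # Invented per-study odds ratios with 95% confidence intervals.
        studies = [(2.4, 1.3, 4.4),
                   (1.8, 1.0, 3.2),
                   (2.0, 0.9, 4.5)]

        weights, weighted_logs = [], []
        for or_, lo, hi in studies:
            log_or = math.log(or_)
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from the CI width
            w = 1 / se ** 2                                  # inverse-variance weight
            weights.append(w)
            weighted_logs.append(w * log_or)

        pooled_log = sum(weighted_logs) / sum(weights)
        pooled_se = math.sqrt(1 / sum(weights))
        print("pooled OR:", round(math.exp(pooled_log), 2),
              "95% CI:", round(math.exp(pooled_log - 1.96 * pooled_se), 2),
              "to", round(math.exp(pooled_log + 1.96 * pooled_se), 2))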

  1. Olympic Information in the SPORT Database.

    ERIC Educational Resources Information Center

    Belna, Alison M.; And Others

    1984-01-01

    Profiles the SPORT database, produced by Sport Information Resource Centre, Ottawa, Ontario, which provides extensive coverage of individual sports including practice, training and equipment, recreation, sports medicine, physical education, sport facilities, and international sport history. Olympic coverage in SPORT, sports sciences, online…

  2. PHYTOTOX: DATABASE DEALING WITH THE EFFECT OF ORGANIC CHEMICALS ON TERRESTRIAL VASCULAR PLANTS

    EPA Science Inventory

    A new database, PHYTOTOX, dealing with the direct effects of exogenously supplied organic chemicals on terrestrial vascular plants is described. The database consists of two files, a Reference File and Effects File. The Reference File is a bibliographic file of published research...

  3. The Starlite Project

    DTIC Science & Technology

    1990-09-01

    conflicts. The current prototyping tool also provides a multiversion data object control mechanism. From a series of experiments, we found that the...performance of a multiversion distributed database system is quite sensitive to the size of read-sets and write-sets of transactions. A multiversion database...510-512. (18) Son, S. H. and N. Haghighi, "Performance Evaluation of Multiversion Database Systems," Sixth IEEE International Conference on Data

  4. Are Bibliographic Management Software Search Interfaces Reliable?: A Comparison between Search Results Obtained Using Database Interfaces and the EndNote Online Search Function

    ERIC Educational Resources Information Center

    Fitzgibbons, Megan; Meert, Deborah

    2010-01-01

    The use of bibliographic management software and its internal search interfaces is now pervasive among researchers. This study compares the results between searches conducted in academic databases' search interfaces versus the EndNote search interface. The results show mixed search reliability, depending on the database and type of search…

  5. Software database creation for investment property measurement according to international standards

    NASA Astrophysics Data System (ADS)

    Ponomareva, S. V.; Merzliakova, N. A.

    2018-05-01

    The article deals with investment property measurement and accounting problems at the international, national and enterprise levels. The need to create the software for investment property measurement according to International Accounting Standards was substantiated. The necessary software functions and the processes were described.

  6. 16 CFR 1102.16 - Additional information.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... PUBLICLY AVAILABLE CONSUMER PRODUCT SAFETY INFORMATION DATABASE Content Requirements § 1102.16 Additional... in the Database any additional information it determines to be in the public interest, consistent...

  7. 16 CFR 1102.16 - Additional information.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... PUBLICLY AVAILABLE CONSUMER PRODUCT SAFETY INFORMATION DATABASE Content Requirements § 1102.16 Additional... in the Database any additional information it determines to be in the public interest, consistent...

  8. Citation Analysis of The Korean Journal of Internal Medicine from KoMCI, Web of Science, and Scopus

    PubMed Central

    2011-01-01

    The Korean Journal of Internal Medicine (KJIM) is an international journal published in English by the Korean Association of Internal Medicine. To understand the journal's position in three different databases, its citation indicators were examined. From the Korean Medical Citation Index (KoMCI), Web of Science, and Scopus, citation indicators such as the impact factor, SCImago Journal Rank (SJR), and Hirsch index were calculated by year. The KJIM 2010 impact factor increased to 0.623 in Web of Science; the 2009 impact factor in KoMCI was 0.149. The 2009 SJR in Scopus was 0.073, with a ranking of 27/72 (37.5%) in the category of internal medicine and 414/1,618 (25.6%) in the category of medicine, miscellaneous. The Hirsch indexes from KoMCI, Web of Science, and Scopus were 5, 14, and 16, respectively. The KJIM is now cited more by international researchers than by Korean researchers, indicating that the content of the journal is now valued at the international level. PMID:21437155
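    Of the indicators listed, the Hirsch index has the simplest operational definition: a journal (or author) has index h if h of its articles have been cited at least h times each. A minimal sketch, using invented citation counts rather than KJIM data:

        def h_index(citations):
            """Largest h such that h papers have at least h citations each."""
            counts = sorted(citations, reverse=True)
            h = 0
            for rank, cites in enumerate(counts, start=1):
                if cites >= rank:
                    h = rank
                else:
                    break
            return h

        # Hypothetical citation counts per article.
        print(h_index([10, 8, 5, 4, 3, 0]))  # -> 4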

  9. Citation analysis of The Korean Journal of Internal Medicine from KoMCI, Web of Science, and Scopus.

    PubMed

    Huh, Sun

    2011-03-01

    The Korean Journal of Internal Medicine (KJIM) is an international journal published in English by the Korean Association of Internal Medicine. To understand the journal's position in three different databases, its citation indicators were examined. From the Korean Medical Citation Index (KoMCI), Web of Science, and Scopus, citation indicators such as the impact factor, SCImago Journal Rank (SJR), and Hirsch index were calculated by year. The KJIM 2010 impact factor increased to 0.623 in Web of Science; the 2009 impact factor in KoMCI was 0.149. The 2009 SJR in Scopus was 0.073, with a ranking of 27/72 (37.5%) in the category of internal medicine and 414/1,618 (25.6%) in the category of medicine, miscellaneous. The Hirsch indexes from KoMCI, Web of Science, and Scopus were 5, 14, and 16, respectively. The KJIM is now cited more by international researchers than by Korean researchers, indicating that the content of the journal is now valued at the international level.

  10. Are nutrition messages lost in transmission? Assessing the quality and consistency of diabetes guideline recommendations on the delivery of nutrition therapy.

    PubMed

    Hale, Kelli; Capra, Sandra; Bauer, Judy

    2016-12-01

    To provide an overview of (1) the consistency of Type 2 Diabetes Clinical Practice Guidelines recommendations on the delivery of nutrition therapy and (2) Clinical Practice Guideline quality. Large international clinical practice guideline repositories, diabetes organisation websites, and electronic databases (Pubmed, Scopus), were searched to identify Clinical Practice Guidelines for adults with type 2 diabetes published 2005 to August 2014. Recommendations on the delivery of nutrition therapy were extracted and inductive content analysis was used to analyse consistency. Two researchers independently assessed guideline quality using the AGREE II tool. Nine topics were identified from the recommendations. Overall the consistency of the recommendations was related to guideline type. Compared with nutrition-specific guidelines, the broad ones had a broader focus and included more patient-focused recommendations. The ten Clinical Practice Guidelines assessed included six broad guidelines and four nutrition specific guidelines. Based on AGREE II analysis, the broad guidelines were higher quality than nutrition-specific ones. Broad Clinical Practice Guidelines were higher quality and included more patient-focused recommendations than nutrition-specific ones. Our findings suggest a need for nutrition-specific guidelines to be modified to include greater patient-focus, or for practitioners delivering nutrition therapy to adopt broad Clinical Practice Guidelines. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. International Collaboration in Data Management for Scientific Ocean Drilling: Preserving Legacy Data While Implementing New Requirements.

    NASA Astrophysics Data System (ADS)

    Rack, F. R.

    2005-12-01

    The Integrated Ocean Drilling Program (IODP: 2003-2013 initial phase) is the successor to the Deep Sea Drilling Project (DSDP: 1968-1983) and the Ocean Drilling Program (ODP: 1985-2003). These earlier scientific drilling programs amassed collections of sediment and rock cores (over 300 kilometers stored in four repositories) and data organized in distributed databases and in print or electronic publications. International members of the IODP have established, through memoranda, the right to have access to: (1) all data, samples, scientific and technical results, all engineering plans, data or other information produced under contract to the program; and (2) all data from geophysical and other site surveys performed in support of the program which are used for drilling planning. The challenge that faces the individual platform operators and management of IODP is to find the right balance and appropriate synergies among the needs, expectations and requirements of stakeholders. The evolving model for IODP database services consists of the management and integration of data collected onboard the various IODP platforms (including downhole logging and syn-cruise site survey information), legacy data from DSDP and ODP, data derived from post-cruise research and publications, and other IODP-relevant information types, to form a common, program-wide IODP information system (e.g., IODP Portal) which will be accessible to both researchers and the public. The JANUS relational database of ODP was introduced in 1997 and the bulk of ODP shipboard data has been migrated into this system, which comprises a relational data model of over 450 tables. The JANUS database includes paleontological, lithostratigraphic, chemical, physical, sedimentological, and geophysical data from a global distribution of sites. For ODP Legs 100 through 210, and including IODP Expeditions 301 through 308, JANUS has been used to store data from 233,835 meters of recovered core, comprising 38,039 cores and 202,281 core sections stored in repositories, from which 2,299,180 samples have been taken for scientists and other users (http://iodp.tamu.edu/janusweb/general/dbtable.cgi). JANUS and other IODP databases are viewed as components of an evolving distributed network of databases, supported by metadata catalogs and middleware with XML workflows, that are intended to provide access to DSDP/ODP/IODP cores and sample-based data as well as other distributed geoscience data collections (e.g., CHRONOS, PetDB, SedDB). These data resources can be explored through the use of emerging data visualization environments, such as GeoWall; CoreWall (http://www.evl.uic.edu/cavern/corewall), a multi-screen display for viewing cores and related data; GeoWall-2; LambdaVision, a very-high-resolution networked environment for data exploration and visualization; and others. The U.S. Implementing Organization (USIO) for the IODP, also known as the JOI Alliance, is a partnership between Joint Oceanographic Institutions (JOI), Texas A&M University, and Lamont-Doherty Earth Observatory of Columbia University. JOI is a consortium of 20 premier oceanographic research institutions that serves the U.S. scientific community by leading large-scale, global research programs in scientific ocean drilling and ocean observing. For more than 25 years, JOI has helped facilitate discovery and advance global understanding of the Earth and its oceans through excellence in program management.

  12. Where are Romanian biomedical journals now and what does the future hold for them? A scientometric analysis.

    PubMed

    Dumitrascu, Dan L

    2018-01-01

    There is a competition between scientific journals in order to achieve leadership in their scientific field. There are several Romanian biomedical journals which are published in English and a smaller number in Romanian. We need a periodical analysis of their visibility and ranking according to scientometric measures. We searched all biomedical journals indexed in international databases: Web of Science, PubMed, Scopus, Embase, Google Scholar. We analyzed their evaluation factors. Several journals from Romania in the biomedical field are indexed in international databases. Their scientometric indexes are not high. The best journal was acquired by an international publisher and is no longer listed for Romania. There are several Romanian biomedical journals indexed in international databases that deserve periodical analysis. There is a need to improve their ranking.

  13. Consistent Query Answering of Conjunctive Queries under Primary Key Constraints

    ERIC Educational Resources Information Center

    Pema, Enela

    2014-01-01

    An inconsistent database is a database that violates one or more of its integrity constraints. In reality, violations of integrity constraints arise frequently under several different circumstances. Inconsistent databases have long posed the challenge to develop suitable tools for meaningful query answering. A principled approach for querying…

  14. A Graphical Database Interface for Casual, Naive Users.

    ERIC Educational Resources Information Center

    Burgess, Clifford; Swigger, Kathleen

    1986-01-01

    Describes the design of a database interface for infrequent users of computers which consists of a graphical display of a model of a database and a natural language query language. This interface was designed for and tested with physicians at the University of Texas Health Science Center in Dallas. (LRW)

  15. Bibliographical database of radiation biological dosimetry and risk assessment: Part 1, through June 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straume, T.; Ricker, Y.; Thut, M.

    1988-08-29

    This database was constructed to support research in radiation biological dosimetry and risk assessment. Relevant publications were identified through detailed searches of national and international electronic databases and through our personal knowledge of the subject. Publications were numbered and keyworded, and referenced in an electronic data-retrieval system that permits quick access through computerized searches on publication number, authors, key words, title, year, and journal name. Photocopies of all publications contained in the database are maintained in a file that is numerically arranged by citation number. This report of the database is provided as a useful reference and overview. It should be emphasized that the database will grow as new citations are added to it. With that in mind, we arranged this report in order of ascending citation number so that follow-up reports will simply extend this document. The database cites 1212 publications. Publications are from 119 different scientific journals; 27 of these journals are cited at least 5 times. It also contains references to 42 books and published symposia, and 129 reports. Information relevant to radiation biological dosimetry and risk assessment is widely distributed among the scientific literature, although a few journals clearly dominate. The four journals publishing the largest number of relevant papers are Health Physics, Mutation Research, Radiation Research, and International Journal of Radiation Biology. Publications in Health Physics make up almost 10% of the current database.

  16. Drinking Water Database

    NASA Technical Reports Server (NTRS)

    Murray, ShaTerea R.

    2004-01-01

    This summer I had the opportunity to work in the Environmental Management Office (EMO) under the Chemical Sampling and Analysis Team, or CS&AT. This team's mission is to support Glenn Research Center (GRC) and EMO by providing chemical sampling and analysis services and expert consulting. Services include sampling and chemical analysis of water, soil, fuels, oils, paint, insulation materials, etc. One of this team's major projects is the Drinking Water Project. This project is performed on Glenn's water coolers and ten percent of its sinks every two years. For the past two summers an intern had been putting together a database for this team to record the tests they had performed. She had successfully created a database but hadn't worked out all the quirks. So this summer William Wilder (an intern from Cleveland State University) and I worked together to perfect her database. We began by finding out exactly what every member of the team thought about the database and what they would change, if anything. After collecting this data we both had to take some courses in Microsoft Access in order to fix the problems. Next we looked at exactly how the database worked from the outside inward. Then we began trying to change the database, but we quickly found out that this would be virtually impossible.

  17. The IUGS/IAGC Task Group on Global Geochemical Baselines

    USGS Publications Warehouse

    Smith, David B.; Wang, Xueqiu; Reeder, Shaun; Demetriades, Alecos

    2012-01-01

    The Task Group on Global Geochemical Baselines, operating under the auspices of both the International Union of Geological Sciences (IUGS) and the International Association of Geochemistry (IAGC), has the long-term goal of establishing a global geochemical database to document the concentration and distribution of chemical elements in the Earth’s surface or near-surface environment. The database and accompanying element distribution maps represent a geochemical baseline against which future human-induced or natural changes to the chemistry of the land surface may be recognized and quantified. In order to accomplish this long-term goal, the activities of the Task Group include: (1) developing partnerships with countries conducting broad-scale geochemical mapping studies; (2) providing consultation and training in the form of workshops and short courses; (3) organizing periodic international symposia to foster communication among the geochemical mapping community; (4) developing criteria for certifying those projects whose data are acceptable in a global geochemical database; (5) acting as a repository for data collected by those projects meeting the criteria for standardization; (6) preparing complete metadata for the certified projects; and (7) preparing, ultimately, a global geochemical database. This paper summarizes the history and accomplishments of the Task Group since its first predecessor project was established in 1988.

  18. An Initial Design of ISO 19152:2012 LADM Based Valuation and Taxation Data Model

    NASA Astrophysics Data System (ADS)

    Çağdaş, V.; Kara, A.; van Oosterom, P.; Lemmen, C.; Işıkdağ, Ü.; Kathmann, R.; Stubkjær, E.

    2016-10-01

    A fiscal registry or database is supposed to record geometric, legal, physical, economic, and environmental characteristics in relation to property units, which are subject to immovable property valuation and taxation. Apart from procedural standards, there is no internationally accepted data standard that defines the semantics of fiscal databases. The ISO 19152:2012 Land Administration Domain Model (LADM), as an international land administration standard, focuses on legal requirements but treats the specification of external information systems, including valuation and taxation databases, as out of scope. However, it provides a formalism that allows for an extension responding to fiscal requirements. This paper introduces an initial version of a LADM Fiscal Extension Module for the specification of databases used in immovable property valuation and taxation. The extension module is designed to facilitate all stages of immovable property taxation, namely the identification of properties and taxpayers, assessment of properties through single or mass appraisal procedures, automatic generation of sales statistics, and the management of tax collection, dealing with arrears and appeals. It is expected that the initial version will be refined through further activities held by a possible joint working group under FIG Commission 7 (Cadastre and Land Management) and FIG Commission 9 (Valuation and the Management of Real Estate) in collaboration with other relevant international bodies.

  19. The International Project. Progress Report.

    ERIC Educational Resources Information Center

    Rutimann, Hans

    The International Project of the Commission on Preservation and Access was begun in June 1988 to explore the feasibility of creating an international database of preserved materials. Its main goals are to: (1) determine the extent to which preservation records exist in other countries; (2) identify the difficulties in converting records to…

  20. International Student Perceptions of Information Needs and Use

    ERIC Educational Resources Information Center

    Yi, Zhixian

    2007-01-01

    This study examines international student information needs and whether education level, age, and gender affect their information use. An e-mail survey revealed that international students need information that supports their academic courses, and those with higher education levels use databases, remote access to library offerings, and e-journals…

  1. A dynamic clinical dental relational database.

    PubMed

    Taylor, D; Naguib, R N G; Boulton, S

    2004-09-01

    The traditional approach to relational database design is based on the logical organization of data into a number of related normalized tables. One assumption is that the nature and structure of the data are known at the design stage. In the case of designing a relational database to store historical dental epidemiological data from individual clinical surveys, the structure of the data is not known until the data are presented for inclusion in the database. This paper addresses the issues involved in the theoretical design of a dynamic clinical database capable of adapting its internal table structure to accommodate clinical survey data, and presents a prototype database application capable of processing, displaying, and querying the dental data.
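    One established way to let a relational database absorb survey data whose structure is unknown until the data arrives is an entity-attribute-value (EAV) layout, in which each measured item becomes a row rather than a fixed column. The SQLite sketch below illustrates that general idea; the table and field names are invented and are not the schema proposed by the authors.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        cur = conn.cursor()

        # Generic tables: surveys, the attributes each survey happens to report,
        # and one row per (survey, subject, attribute) observation.
        cur.executescript("""
        CREATE TABLE survey (id INTEGER PRIMARY KEY, name TEXT, year INTEGER);
        CREATE TABLE attribute (id INTEGER PRIMARY KEY, survey_id INTEGER, name TEXT);
        CREATE TABLE observation (
            survey_id INTEGER, subject_id INTEGER, attribute_id INTEGER, value TEXT);
        """)

        cur.execute("INSERT INTO survey VALUES (1, 'Hypothetical caries survey', 1998)")
        cur.execute("INSERT INTO attribute VALUES (1, 1, 'dmft_score')")
        cur.execute("INSERT INTO observation VALUES (1, 101, 1, '3')")

        # Querying pivots the generic rows back into survey-specific fields.
        cur.execute("""
            SELECT o.subject_id, a.name, o.value
            FROM observation o JOIN attribute a ON a.id = o.attribute_id
            WHERE o.survey_id = 1
        """)
        print(cur.fetchall())  # -> [(101, 'dmft_score', '3')]

    The trade-off of an EAV design is that per-survey integrity constraints and efficient querying have to be enforced in application logic rather than in the fixed schema.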

  2. Academic impact of a public electronic health database: bibliometric analysis of studies using the general practice research database.

    PubMed

    Chen, Yu-Chun; Wu, Jau-Ching; Haschler, Ingo; Majeed, Azeem; Chen, Tzeng-Ji; Wetter, Thomas

    2011-01-01

    Studies that use electronic health databases as research material are becoming popular, but the influence of a single electronic health database has not been well investigated. The United Kingdom's General Practice Research Database (GPRD) is one of the few electronic health databases publicly available to academic researchers. This study analyzed studies that used the GPRD to demonstrate the scientific production and academic impact generated by a single public health database. A total of 749 studies published between 1995 and 2009 with 'General Practice Research Database' as their topic, defined as GPRD studies, were extracted from Web of Science. By the end of 2009, the GPRD had attracted 1251 authors from 22 countries and been used extensively in 749 studies published in 193 journals across 58 study fields. Each GPRD study was cited on average 2.7 times by successive studies. Moreover, the total number of GPRD studies increased rapidly, and it is expected to reach 1500 by 2015, twice the number accumulated by the end of 2009. Since 17 of the most prolific authors (1.4% of all authors) contributed nearly half (47.9%) of GPRD studies, success in conducting GPRD studies may accumulate. The GPRD was used mainly in, but not limited to, the three study fields of "Pharmacology and Pharmacy", "General and Internal Medicine", and "Public, Environmental and Occupational Health". The UK and United States were the two most active regions of GPRD studies. One-third of GPRD studies were internationally co-authored. A public electronic health database such as the GPRD will promote scientific production in many ways. Data owners of electronic health databases at a national level should consider how to reduce access barriers and make data more available for research.

  3. SCEGRAM: An image database for semantic and syntactic inconsistencies in scenes.

    PubMed

    Öhlschläger, Sabine; Võ, Melissa Le-Hoa

    2017-10-01

    Our visual environment is not random, but follows compositional rules according to what objects are usually found where. Despite the growing interest in how such semantic and syntactic rules - a scene grammar - enable effective attentional guidance and object perception, no common image database containing highly controlled object-scene modifications has been publicly available. Such a database is essential in minimizing the risk that low-level features drive high-level effects of interest, which is being discussed as a possible source of controversial study results. To generate the first database of this kind - SCEGRAM - we took photographs of 62 real-world indoor scenes in six consistency conditions that contain semantic and syntactic (both mild and extreme) violations as well as their combinations. Importantly, scenes were always paired, so that an object was semantically consistent in one scene (e.g., ketchup in kitchen) and inconsistent in the other (e.g., ketchup in bathroom). Low-level salience did not differ between object-scene conditions and was generally moderate. Additionally, SCEGRAM contains consistency ratings for every object-scene condition, as well as object-absent scenes and object-only images. Finally, a cross-validation using eye-movements replicated previous results of longer dwell times for both semantic and syntactic inconsistencies compared to consistent controls. In sum, the SCEGRAM image database is the first to contain well-controlled semantic and syntactic object-scene inconsistencies that can be used in a broad range of cognitive paradigms (e.g., verbal and pictorial priming, change detection, object identification, etc.) including paradigms addressing developmental aspects of scene grammar. SCEGRAM can be retrieved for research purposes from http://www.scenegrammarlab.com/research/scegram-database/.

  4. 16 CFR § 1102.16 - Additional information.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... PUBLICLY AVAILABLE CONSUMER PRODUCT SAFETY INFORMATION DATABASE Content Requirements § 1102.16 Additional... in the Database any additional information it determines to be in the public interest, consistent...

  5. INFOMAT: The international materials assessment and application centre's internet gateway

    NASA Astrophysics Data System (ADS)

    Branquinho, Carmen Lucia; Colodete, Leandro Tavares

    2004-08-01

    INFOMAT is an electronic directory structured to facilitate the search and retrieval of materials science and technology information sources. Linked to the homepage of the International Materials Assessment and Application Centre, INFOMAT presents descriptions of 392 proprietary databases with links to their host systems as well as direct links to over 180 public domain databases and over 2,400 web sites. Among the web sites are associations/unions, governmental and non-governmental institutions, industries, library holdings, market statistics, news services, on-line publications, standardization and intellectual property organizations, and universities/research groups.

  6. Sociologie de la lecture et de la bibliotheque: Choix de dix ans de la litterature speciale hongroise 1978-1987. Contribution a la Base de donnees internationale de Bibliologie (Sociology of Reading and Library Sociology: A Selection from the Hungarian Literature of a Decade 1978-1987. Contribution to the International Database of Bibliology).

    ERIC Educational Resources Information Center

    Kaposvari-Danyi, Eva, Comp.; Lorincz, Judit, Comp.

    This 175-item bibliography was compiled as the Hungarian contribution to an international database. It includes books, chapters of books, periodical articles, manuscripts, and dissertations that deal with bibliology (i.e., the sociology and psychology of book and library use). Citations are restricted to works of Hungarian authors published in…

  7. Rasch analysis of the Multiple Sclerosis Impact Scale (MSIS-29)

    PubMed Central

    Ramp, Melina; Khan, Fary; Misajon, Rose Anne; Pallant, Julie F

    2009-01-01

    Background: Multiple Sclerosis (MS) is a degenerative neurological disease that causes impairments, including spasticity, pain, fatigue, and bladder dysfunction, which negatively impact on quality of life. The Multiple Sclerosis Impact Scale (MSIS-29) is a disease-specific health-related quality of life (HRQoL) instrument, developed using the patient's perspective on disease impact. It consists of two subscales assessing the physical (MSIS-29-PHYS) and psychological (MSIS-29-PSYCH) impact of MS. Although previous studies have found support for the psychometric properties of the MSIS-29 using traditional methods of scale evaluation, the scale has not been subjected to a detailed Rasch analysis. Therefore, the objective of this study was to use Rasch analysis to assess the internal validity of the scale, and its response format, item fit, targeting, internal consistency and dimensionality. Methods: Ninety-two persons with definite MS residing in the community were recruited from a tertiary hospital database. Patients completed the MSIS-29 as part of a larger study. Rasch analysis was undertaken to assess the psychometric properties of the MSIS-29. Results: Rasch analysis showed overall support for the psychometric properties of the two MSIS-29 subscales, however it was necessary to reduce the response format of the MSIS-29-PHYS to a 3-point response scale. Both subscales were unidimensional, had good internal consistency, and were free from item bias for sex and age. Dimensionality testing indicated it was not appropriate to combine the two subscales to form a total MSIS score. Conclusion: In this first study to use Rasch analysis to fully assess the psychometric properties of the MSIS-29 support was found for the two subscales but not for the use of the total scale. Further use of Rasch analysis on the MSIS-29 in larger and broader samples is recommended to confirm these findings. PMID:19545445

  8. The Joint Committee for Traceability in Laboratory Medicine (JCTLM) - its history and operation.

    PubMed

    Jones, Graham R D; Jackson, Craig

    2016-01-30

    The Joint Committee for Traceability in Laboratory Medicine (JCTLM) was formed to bring together the sciences of metrology, laboratory medicine and laboratory quality management. The aim of this collaboration is to support worldwide comparability and equivalence of measurement results in clinical laboratories for the purpose of improving healthcare. The JCTLM has its origins in the activities of international metrology treaty organizations, professional societies and federations devoted to improving measurement quality in physical, chemical and medical sciences. The three founding organizations, the International Committee for Weights and Measures (CIPM), the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) and the International Laboratory Accreditation Cooperation (ILAC) are the leaders of this activity. The main service of the JCTLM is a web-based database with a list of reference materials, reference methods and reference measurement services meeting appropriate international standards. This database allows manufacturers to select references for assay traceability and provides support for suppliers of these services. As of mid 2015 the database lists 295 reference materials for 162 analytes, 170 reference measurement procedures for 79 analytes and 130 reference measurement services for 39 analytes. There remains a need for the development and implementation of metrological traceability in many areas of laboratory medicine and the JCTLM will continue to promote these activities into the future. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. A new test for the assessment of working memory in clinical settings: Validation and norming of a month ordering task.

    PubMed

    Buekenhout, Imke; Leitão, José; Gomes, Ana A

    2018-05-24

    Month ordering tasks have been used in experimental settings to obtain measures of working memory (WM) capacity in older/clinical groups based solely on their face validity. We sought to assess the appropriateness of using a month ordering task in other contexts, including clinical settings, as a psychometrically sound WM assessment. To this end, we constructed a month ordering task (ucMOT), studied its reliability (internal consistency and temporal stability), and gathered construct-related and criterion-related validity evidence for its use as a WM assessment. The ucMOT proved to be internally consistent and temporally stable, and analyses of the criterion-related validity evidence revealed that its scores predicted the efficiency of language comprehension processes known to depend crucially on WM resources, namely, processes involved in pronoun interpretation. Furthermore, all ucMOT items discriminated between younger and older age groups; the global scores were significantly correlated with scores on well-established WM tasks and presented lower correlations with instruments that evaluate different (although related) processes, namely, inhibition and processing speed. We conclude that the ucMOT possesses solid psychometric properties. Accordingly, we acquired normative data for the Portuguese population, which we present as a regression-based algorithm that yields z scores adjusted for age, gender, and years of formal education. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. A systematic review of the factor structure and reliability of the Spence Children's Anxiety Scale.

    PubMed

    Orgilés, Mireia; Fernández-Martínez, Iván; Guillén-Riquelme, Alejandro; Espada, José P; Essau, Cecilia A

    2016-01-15

    The Spence Children's Anxiety Scale (SCAS) is a widely used instrument for assessing symptoms of anxiety disorders among children and adolescents. Previous studies have demonstrated its good reliability for children and adolescents from different backgrounds. However, remarkable variability in the reliability of the SCAS across studies and inconsistent results regarding its factor structure have been found. The present study aims to examine the SCAS factor structure by means of a systematic review with narrative synthesis, the mean reliability of the SCAS by means of a meta-analysis, and the influence of moderators on the SCAS reliability. Databases employed to collect the studies included Google Scholar, PsycARTICLES, PsycINFO, Web of Science, and Scopus since 1997. Twenty-nine and 32 studies, which examined the factor structure and the internal consistency of the SCAS, respectively, were included. The SCAS was found to have strong internal consistency, influenced by different moderators. The systematic review demonstrated that the original six-factor model was supported by most studies. Factorial invariance studies (across age, gender, country) and test-retest reliability of the SCAS were not examined in this study. It is concluded that the SCAS is a reliable instrument for cross-cultural use, and it is suggested that the original six-factor model is appropriate for cross-cultural application. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Uniform standards for genome databases in forest and fruit trees

    USDA-ARS?s Scientific Manuscript database

    TreeGenes and tfGDR serve the international forestry and fruit tree genomics research communities, respectively. These databases hold similar sequence data and provide resources for the submission and recovery of this information in order to enable comparative genomics research. Large-scale genotype...

  12. The database of the Nikolaev Astronomical Observatory as a unit of an international virtual observatory

    NASA Astrophysics Data System (ADS)

    Protsyuk, Yu.; Pinigin, G.; Shulga, A.

    2005-06-01

    Results of the development and organization of the digital database of the Nikolaev Astronomical Observatory (NAO) are presented. At present, three telescopes are connected to the local area network of NAO. All the data obtained and the results of data processing are entered into the common database of NAO. The daily average volume of new astronomical information obtained from the CCD instruments ranges from 300 MB up to 2 GB, depending on the purposes and conditions of observations. The overwhelming majority of the data are stored in the FITS format. Development and further improvement of storage standards and of procedures for data handling and data processing are being carried out. It is planned to create an astronomical web portal offering interactive access to databases and telescopes. In the future, this resource may become part of an international virtual observatory. Prototype search tools have been built using PHP and MySQL. Efforts are being made to obtain additional Internet connectivity.

  13. TWRS technical baseline database manager definition document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acree, C.D.

    1997-08-13

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

  14. Informatics and data quality at collaborative multicenter Breast and Colon Cancer Family Registries.

    PubMed

    McGarvey, Peter B; Ladwa, Sweta; Oberti, Mauricio; Dragomir, Anca Dana; Hedlund, Erin K; Tanenbaum, David Michael; Suzek, Baris E; Madhavan, Subha

    2012-06-01

    Quality control and harmonization of data is a vital and challenging undertaking for any successful data coordination center and a responsibility shared between the multiple sites that produce, integrate, and utilize the data. Here we describe a coordinated effort between scientists and data managers in the Cancer Family Registries to implement a data governance infrastructure consisting of both organizational and technical solutions. The technical solution uses a rule-based validation system that facilitates error detection and correction for data centers submitting data to a central informatics database. Validation rules comprise both standard checks on allowable values and a crosscheck of related database elements for logical and scientific consistency. Evaluation over a 2-year timeframe showed a significant decrease in the number of errors in the database and a concurrent increase in data consistency and accuracy.
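    The rule-based validation described here combines two kinds of checks: per-field rules on allowable values and crosschecks between related fields for logical consistency. A minimal Python sketch of that pattern follows; the field names and rules are invented for illustration and are not the registries' actual rules.

        # Hypothetical record submitted by a data center.
        record = {"sex": "F", "age_at_diagnosis": 62, "current_age": 58}

        # Standard checks on allowable values.
        field_rules = {
            "sex": lambda v: v in {"F", "M"},
            "age_at_diagnosis": lambda v: 0 <= v <= 120,
            "current_age": lambda v: 0 <= v <= 120,
        }

        # Crosschecks of related elements for logical and scientific consistency.
        cross_rules = [
            ("age at diagnosis must not exceed current age",
             lambda r: r["age_at_diagnosis"] <= r["current_age"]),
        ]

        errors = ["bad value for %s" % field
                  for field, ok in field_rules.items() if not ok(record[field])]
        errors += [message for message, ok in cross_rules if not ok(record)]
        print(errors)  # -> ['age at diagnosis must not exceed current age']

    Flagged records would then be returned to the submitting center for correction, which is the feedback loop that produced the reported drop in error counts over time.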

  15. Informatics and data quality at collaborative multicenter Breast and Colon Cancer Family Registries

    PubMed Central

    McGarvey, Peter B; Ladwa, Sweta; Oberti, Mauricio; Dragomir, Anca Dana; Hedlund, Erin K; Tanenbaum, David Michael; Suzek, Baris E

    2012-01-01

    Quality control and harmonization of data is a vital and challenging undertaking for any successful data coordination center and a responsibility shared between the multiple sites that produce, integrate, and utilize the data. Here we describe a coordinated effort between scientists and data managers in the Cancer Family Registries to implement a data governance infrastructure consisting of both organizational and technical solutions. The technical solution uses a rule-based validation system that facilitates error detection and correction for data centers submitting data to a central informatics database. Validation rules comprise both standard checks on allowable values and a crosscheck of related database elements for logical and scientific consistency. Evaluation over a 2-year timeframe showed a significant decrease in the number of errors in the database and a concurrent increase in data consistency and accuracy. PMID:22323393

  16. Fullerene data mining using bibliometrics and database tomography

    PubMed

    Kostoff; Braun; Schubert; Toothman; Humenik

    2000-01-01

    Database tomography (DT) is a textual database analysis system consisting of two major components: (1) algorithms for extracting multiword phrase frequencies and phrase proximities (physical closeness of the multiword technical phrases) from any type of large textual database, to augment (2) interpretative capabilities of the expert human analyst. DT was used to derive technical intelligence from a fullerenes database derived from the Science Citation Index and the Engineering Compendex. Phrase frequency analysis by the technical domain experts provided the pervasive technical themes of the fullerenes database, and phrase proximity analysis provided the relationships among the pervasive technical themes. Bibliometric analysis of the fullerenes literature supplemented the DT results with author/journal/institution publication and citation data. Comparisons of fullerenes results with past analyses of similarly structured near-earth space, chemistry, hypersonic/supersonic flow, aircraft, and ship hydrodynamics databases are made. One important finding is that many of the normalized bibliometric distribution functions are extremely consistent across these diverse technical domains and could reasonably be expected to apply to broader chemical topics than fullerenes that span multiple structural classes. Finally, lessons learned about integrating the technical domain experts with the data mining tools are presented.
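    The two DT components named above, multiword phrase frequency and phrase proximity, can be approximated in a few lines. The Python sketch below uses simple whitespace bigrams and same-document co-occurrence as stand-ins for the authors' extraction algorithms; the two sample abstracts are invented.

        from collections import Counter
        from itertools import combinations

        abstracts = [
            "carbon nanotube synthesis using fullerene precursors",
            "fullerene precursors improve carbon nanotube yield",
        ]

        def phrases(text, n=2):
            words = text.lower().split()
            return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

        # Component 1: multiword phrase frequencies across the corpus.
        freq = Counter(p for a in abstracts for p in phrases(a))
        print(freq.most_common(3))

        # Component 2: phrase proximity, approximated here as co-occurrence of
        # frequent phrases within the same abstract.
        top = {p for p, _ in freq.most_common(5)}
        proximity = Counter()
        for a in abstracts:
            present = sorted(top & set(phrases(a)))
            proximity.update(combinations(present, 2))
        print(proximity.most_common(3))

    In the actual DT system the frequency and proximity lists are handed to domain experts, who interpret the pervasive themes and their relationships; the code only produces the raw counts.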

  17. The Forest Inventory and Analysis Database Version 4.0: Database Description and Users Manual for Phase 3

    Treesearch

    Christopher W. Woodall; Barbara L. Conkling; Michael C. Amacher; John W. Coulston; Sarah Jovan; Charles H. Perry; Beth Schulz; Gretchen C. Smith; Susan Will Wolf

    2010-01-01

    Describes the structure of the Forest Inventory and Analysis Database (FIADB) 4.0 for phase 3 indicators. The FIADB structure provides a consistent framework for storing forest health monitoring data across all ownerships for the entire United States. These data are available to the public.

  18. A Database Design and Development Case: NanoTEK Networks

    ERIC Educational Resources Information Center

    Ballenger, Robert M.

    2010-01-01

    This case provides a real-world project-oriented case study for students enrolled in a management information systems, database management, or systems analysis and design course in which database design and development are taught. The case consists of a business scenario to provide background information and details of the unique operating…

  19. Radio-Frequency Tank Eigenmode Sensor for Propellant Quantity Gauging

    NASA Technical Reports Server (NTRS)

    Zimmerli, Gregory A.; Buchanan, David A.; Follo, Jeffrey C.; Vaden, Karl R.; Wagner, James D.; Asipauskas, Marius; Herlacher, Michael D.

    2010-01-01

    Although there are several methods for determining liquid level in a tank, there are no proven methods to quickly gauge the amount of propellant in a tank while it is in low gravity or under low-settling thrust conditions where propellant sloshing is an issue. Having the ability to quickly and accurately gauge propellant tanks in low-gravity is an enabling technology that would allow a spacecraft crew or mission control to always know the amount of propellant onboard, thus increasing the chances for a successful mission. The Radio Frequency Mass Gauge (RFMG) technique measures the electromagnetic eigenmodes, or natural resonant frequencies, of a tank containing a dielectric fluid. The essential hardware components consist of an RF network analyzer that measures the reflected power from an antenna probe mounted internal to the tank. At a resonant frequency, there is a drop in the reflected power, and these inverted peaks in the reflected power spectrum are identified as the tank eigenmode frequencies using a peak-detection software algorithm. This information is passed to a pattern-matching algorithm, which compares the measured eigenmode frequencies with a database of simulated eigenmode frequencies at various fill levels. A best match between the simulated and measured frequency values occurs at some fill level, which is then reported as the gauged fill level. The database of simulated eigenmode frequencies is created by using RF simulation software to calculate the tank eigenmodes at various fill levels. The input to the simulations consists of a fairly high-fidelity tank model with proper dimensions and including internal tank hardware, the dielectric properties of the fluid, and a defined liquid/vapor interface. Because of small discrepancies between the model and actual hardware, the measured empty tank spectra and simulations are used to create a set of correction factors for each mode (typically in the range of 0.999 to 1.001), which effectively accounts for the small discrepancies. These correction factors are multiplied to the modes at all fill levels. By comparing several measured modes with the simulations, it is possible to accurately gauge the amount of propellant in the tank. An advantage of the RFMG approach of applying computer simulations and a pattern-matching algorithm is that the
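    The pattern-matching step described above reduces to choosing the simulated fill level whose corrected eigenmode frequencies best agree with the measured ones. The Python sketch below illustrates one plausible matching criterion (RMS deviation); all frequencies and fill levels are made-up numbers, not RFMG data.

        import math

        # Hypothetical simulated eigenmode frequencies (MHz) at a few fill levels,
        # assumed to already include the per-mode empty-tank correction factors.
        simulated = {
            0.25: [432.1, 505.8, 611.0],
            0.50: [418.7, 489.2, 596.4],
            0.75: [405.3, 472.9, 580.1],
        }
        measured = [419.0, 488.8, 596.9]  # peak frequencies from the spectrum

        def rms_error(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

        best_fill = min(simulated, key=lambda f: rms_error(simulated[f], measured))
        print("Gauged fill level: %.0f%%" % (100 * best_fill))  # -> 50%

    In practice the comparison would run over a much finer grid of fill levels and more modes, but the selection principle is the same.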

  20. Readiness of food composition databases and food component analysis systems for nutrigenomics

    USDA-ARS?s Scientific Manuscript database

    The study objective was to discuss the international implications of using nutrigenomics as the basis for individualized health promotion and chronic disease prevention and the challenges it presents to existing nutrient databases and nutrient analysis systems. Definitions and research methods of nu...

  1. [Over- or underestimated? Bibliographic survey of the biomedical periodicals published in Hungary].

    PubMed

    Berhidi, Anna; Horváth, Katalin; Horváth, Gabriella; Vasas, Lívia

    2013-06-30

    This publication, based on an article published in 2006, emphasises the qualities of the current biomedical periodicals published in Hungary. The aim of this study was to analyse how Hungarian journals meet the requirements of scientific quality and international visibility. The authors evaluated 93 Hungarian biomedical periodicals against 4 viewpoints for each of the two criteria mentioned above. 35% of the analysed journals meet the attributes of scientific quality, 5% those of international visibility, 6% fulfill all examined criteria, and 25% are indexed in international databases. The 6 Hungarian biomedical periodicals covered by each of the three main bibliographic databases (Medline, Scopus, Web of Science) have the best qualities. The authors recommend improving both scientific quality and international visibility. The basis of qualitative adequacy is accurate author guidelines; English-language titles, abstracts, and keywords for the articles; and the ability to publish on time.

  2. Virtual Manufacturing Techniques Designed and Applied to Manufacturing Activities in the Manufacturing Integration and Technology Branch

    NASA Technical Reports Server (NTRS)

    Shearrow, Charles A.

    1999-01-01

    One of the identified goals of EM3 is to implement virtual manufacturing by the end of the year 2000. To realize this goal of a true virtual manufacturing enterprise, the initial development of a machinability database and the supporting infrastructure must be completed. This will consist of containing the existing EM-NET problems and developing machine, tooling, and common materials databases. To integrate the virtual manufacturing enterprise with normal day-to-day operations, a parallel virtual manufacturing machinability database, virtual manufacturing database, virtual manufacturing paradigm, implementation/integration procedure, and testable verification models must be developed. Common and virtual machinability databases will include the four distinct areas of machine tools, available tooling, common machine tool loads, and a materials database. The machine tools database will include the machine envelope, special machine attachments, tooling capacity, location within NASA-JSC or with a contractor, and availability/scheduling. The tooling database will include available standard tooling, custom in-house tooling, tool properties, and availability. The common materials database will include materials thickness ranges, strengths, types, and their availability. The virtual manufacturing databases will consist of virtual machines and virtual tooling directly related to the common and machinability databases. The items to be completed are the design and construction of the machinability databases, a virtual manufacturing paradigm for NASA-JSC, an implementation timeline, a VNC model of one bridge mill, and troubleshooting of existing software and hardware problems with EN4NET. The final step of this virtual manufacturing project will be to integrate other production sites into the databases, bringing JSC's EM3 into position to become a clearing house for NASA's digital manufacturing needs and creating a true virtual manufacturing enterprise.

  3. Biomedical journals and databases in Russia and Russian language in the former Soviet Union and beyond

    PubMed Central

    Vlassov, Vasiliy V; Danishevskiy, Kirill D

    2008-01-01

    In the 20th century, Russian biomedical science experienced a decline from the blossom of the early years to a drastic state. Through the first decades of the USSR, it was transformed to suit the ideological requirements of a totalitarian state and biased directives of communist leaders. Later, depressing economic conditions and isolation from the international research community further impeded its development. Contemporary Russia has inherited a system of medical education quite different from the west as well as counterproductive regulations for the allocation of research funding. The methodology of medical and epidemiological research in Russia is largely outdated. Epidemiology continues to focus on infectious disease and results of the best studies tend to be published in international periodicals. MEDLINE continues to be the best database to search for Russian biomedical publications, despite only a small proportion being indexed. The database of the Moscow Central Medical Library is the largest national database of medical periodicals, but does not provide abstracts and full subject heading codes, and it does not cover even the entire collection of the Library. New databases and catalogs (e.g. Panteleimon) that have appeared recently are incomplete and do not enable effective searching. PMID:18826569

  4. Inequality of obesity and socioeconomic factors in Iran: a systematic review and meta- analyses

    PubMed Central

    Djalalinia, Shirin; Peykari, Niloofar; Qorbani, Mostafa; Larijani, Bagher; Farzadfar, Farshad

    2015-01-01

    Background: Socioeconomic status and demographic factors, such as education, occupation, place of residence, gender, age, and marital status, have been reported to be associated with obesity. We conducted a systematic review to summarize the evidence on associations between socioeconomic factors and obesity/overweight in the Iranian population. Methods: We systematically searched the international databases ISI, PubMed/Medline, and Scopus, and the national databases Iran-medex, Irandoc, and the Scientific Information Database (SID). We refined data for associations between socioeconomic factors and obesity/overweight by sex, age, province, and year. There were no limitations on time or language. Results: Based on our search strategy we found 151 records; of them 139 were from international databases and the remaining 12 were obtained from national databases. After removing duplicates, via the refining steps, only 119 articles were found related to our study domains. Extracted results were based on data from 146,596 persons in the included studies. Increased age, low educational level, being married, residence in urban areas, as well as female sex were clearly associated with obesity. Conclusion: The results could be useful for better health policy and more planned studies in this field. They could also be used for future complementary analyses. PMID:26793632

  5. Inequality of obesity and socioeconomic factors in Iran: a systematic review and meta- analyses.

    PubMed

    Djalalinia, Shirin; Peykari, Niloofar; Qorbani, Mostafa; Larijani, Bagher; Farzadfar, Farshad

    2015-01-01

    Socioeconomic status and demographic factors, such as education, occupation, place of residence, gender, age, and marital status, have been reported to be associated with obesity. We conducted a systematic review to summarize the evidence on associations between socioeconomic factors and obesity/overweight in the Iranian population. We systematically searched the international databases ISI, PubMed/Medline, and Scopus, and the national databases Iran-medex, Irandoc, and the Scientific Information Database (SID). We refined data for associations between socioeconomic factors and obesity/overweight by sex, age, province, and year. There were no limitations on time or language. Based on our search strategy we found 151 records; of them 139 were from international databases and the remaining 12 were obtained from national databases. After removing duplicates, via the refining steps, only 119 articles were found related to our study domains. Extracted results were based on data from 146,596 persons in the included studies. Increased age, low educational level, being married, residence in urban areas, as well as female sex were clearly associated with obesity. These results could be useful for better health policy and more planned studies in this field. They could also be used for future complementary analyses.

  6. Biomedical journals and databases in Russia and Russian language in the former Soviet Union and beyond.

    PubMed

    Vlassov, Vasiliy V; Danishevskiy, Kirill D

    2008-09-30

    In the 20th century, Russian biomedical science experienced a decline from the blossom of the early years to a drastic state. Through the first decades of the USSR, it was transformed to suit the ideological requirements of a totalitarian state and biased directives of communist leaders. Later, depressing economic conditions and isolation from the international research community further impeded its development. Contemporary Russia has inherited a system of medical education quite different from the west as well as counterproductive regulations for the allocation of research funding. The methodology of medical and epidemiological research in Russia is largely outdated. Epidemiology continues to focus on infectious disease and results of the best studies tend to be published in international periodicals. MEDLINE continues to be the best database to search for Russian biomedical publications, despite only a small proportion being indexed. The database of the Moscow Central Medical Library is the largest national database of medical periodicals, but does not provide abstracts and full subject heading codes, and it does not cover even the entire collection of the Library. New databases and catalogs (e.g. Panteleimon) that have appeared recently are incomplete and do not enable effective searching.

  7. The Untapped Promise of Secondary Data Sets in International and Comparative Education Policy Research

    ERIC Educational Resources Information Center

    Chudagr, Amita; Luschei, Thomas F.

    2016-01-01

    The objective of this commentary is to call attention to the feasibility and importance of large-scale, systematic, quantitative analysis in international and comparative education research. We contend that although many existing databases are under- or unutilized in quantitative international-comparative research, these resources present the…

  8. An algorithm recommendation for the management of knee osteoarthritis in Europe and internationally: a report from a task force of the European Society for Clinical and Economic Aspects of Osteoporosis and Osteoarthritis (ESCEO).

    PubMed

    Bruyère, Olivier; Cooper, Cyrus; Pelletier, Jean-Pierre; Branco, Jaime; Luisa Brandi, Maria; Guillemin, Francis; Hochberg, Marc C; Kanis, John A; Kvien, Tore K; Martel-Pelletier, Johanne; Rizzoli, René; Silverman, Stuart; Reginster, Jean-Yves

    2014-12-01

    Existing practice guidelines for osteoarthritis (OA) analyze the evidence behind each proposed treatment but do not prioritize the interventions in a given sequence. The objective was to develop a treatment algorithm recommendation that is easier to interpret for the prescribing physician based on the available evidence and that is applicable in Europe and internationally. The knee was used as the model OA joint. ESCEO assembled a task force of 13 international experts (rheumatologists, clinical epidemiologists, and clinical scientists). Existing guidelines were reviewed; all interventions listed and recent evidence were retrieved using established databases. A first schematic flow chart with treatment prioritization was discussed in a 1-day meeting and shaped to the treatment algorithm. Fine-tuning occurred by electronic communication and three consultation rounds until consensus. Basic principles consist of the need for a combined pharmacological and non-pharmacological treatment with a core set of initial measures, including information access/education, weight loss if overweight, and an appropriate exercise program. Four multimodal steps are then established. Step 1 consists of background therapy, either non-pharmacological (referral to a physical therapist for re-alignment treatment if needed and sequential introduction of further physical interventions initially and at any time thereafter) or pharmacological. The latter consists of chronic Symptomatic Slow-Acting Drugs for OA (e.g., prescription glucosamine sulfate and/or chondroitin sulfate) with paracetamol at-need; topical NSAIDs are added in the still symptomatic patient. Step 2 consists of the advanced pharmacological management in the persistent symptomatic patient and is centered on the use of oral COX-2 selective or non-selective NSAIDs, chosen based on concomitant risk factors, with intra-articular corticosteroids or hyaluronate for further symptom relief if insufficient. In Step 3, the last pharmacological attempts before surgery are represented by weak opioids and other central analgesics. Finally, Step 4 consists of end-stage disease management and surgery, with classical opioids as a difficult-to-manage alternative when surgery is contraindicated. The proposed treatment algorithm may represent a new framework for the development of future guidelines for the management of OA, more easily accessible to physicians. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Malignant tumors of the liver in children.

    PubMed

    Aronson, Daniel C; Meyers, Rebecka L

    2016-10-01

    This article aims to give an overview of pediatric liver tumors, in particular of the two most frequently occurring groups, hepatoblastomas and hepatocellular carcinomas. The focus lies on achievements gained through worldwide collaboration. We present recent advances in insight, treatment results, and future questions to be asked. Increasing international collaboration between the four major pediatric liver tumor study groups (SIOPEL, GPOH, COG, and JPLT) may serve as a paradigm for approaching rare tumors. This international effort has been catalyzed by the formation of a large collaborative database by the Children's Hepatic tumors International Collaboration (CHIC). Interrogation of this database has led to a new universal risk stratification system for hepatoblastoma using PRETEXT/POSTTEXT staging as a backbone. Pathologists in this international collaboration have established a new histopathological consensus classification for pediatric liver tumors. Concomitantly there have been advances in chemotherapy options, an increased role of liver transplantation for unresectable tumors, and a web portal system developed at www.siopel.org for international education, consultation, and collaboration. These achievements will be further tested and validated in the upcoming Paediatric Hepatic International Tumour Trial (PHITT). Copyright © 2016 Elsevier Inc. All rights reserved.

  10. What can we learn from a decade of database audits? The Duke Clinical Research Institute experience, 1997–2006

    PubMed Central

    Rostami, Reza; Nahm, Meredith; Pieper, Carl F.

    2011-01-01

    Background Despite a pressing and well-documented need for better sharing of information on clinical trials data quality assurance methods, many research organizations remain reluctant to publish descriptions of and results from their internal auditing and quality assessment methods. Purpose We present findings from a review of a decade of internal data quality audits performed at the Duke Clinical Research Institute, a large academic research organization that conducts data management for a diverse array of clinical studies, both academic and industry-sponsored. In so doing, we hope to stimulate discussions that could benefit the wider clinical research enterprise by providing insight into methods of optimizing data collection and cleaning, ultimately helping patients and furthering essential research. Methods We present our audit methodologies, including sampling methods, audit logistics, sample sizes, counting rules used for error rate calculations, and characteristics of audited trials. We also present database error rates as computed according to two analytical methods, which we address in detail, and discuss the advantages and drawbacks of two auditing methods used during this ten-year period. Results Our review of the DCRI audit program indicates that higher data quality may be achieved from a series of small audits throughout the trial rather than through a single large database audit at database lock. We found that error rates trended upward from year to year in the period characterized by traditional audits performed at database lock (1997–2000), but consistently trended downward after periodic statistical process control type audits were instituted (2001–2006). These increases in data quality were also associated with cost savings in auditing, estimated at 1000 hours per year, or the efforts of one-half of a full time equivalent (FTE). Limitations Our findings are drawn from retrospective analyses and are not the result of controlled experiments, and may therefore be subject to unanticipated confounding. In addition, the scope and type of audits we examine here are specific to our institution, and our results may not be broadly generalizable. Conclusions Use of statistical process control methodologies may afford advantages over more traditional auditing methods, and further research will be necessary to confirm the reliability and usability of such techniques. We believe that open and candid discussion of data quality assurance issues among academic and clinical research organizations will ultimately benefit the entire research community in the coming era of increased data sharing and re-use. PMID:19342467
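
    A counting rule of the kind mentioned above reduces to errors found per fields audited. The minimal Python sketch below illustrates that arithmetic for a series of small periodic audits versus one large audit at database lock; the sample sizes, error counts, and per-10,000 scaling are invented for illustration and are not the DCRI's actual counting rules.

```python
# A minimal sketch of one common counting rule for database audit error rates:
# errors found divided by fields audited, expressed per 10,000 fields.
# The numbers and the per-10,000 convention are illustrative assumptions.

def error_rate_per_10000(errors_found: int, fields_audited: int) -> float:
    """Return the audit error rate scaled to errors per 10,000 fields."""
    if fields_audited == 0:
        raise ValueError("fields_audited must be positive")
    return 10_000 * errors_found / fields_audited

# Periodic (SPC-style) audits: several small samples over the life of a trial.
periodic_audits = [(3, 2_500), (1, 2_500), (4, 2_500), (2, 2_500)]
rates = [error_rate_per_10000(e, n) for e, n in periodic_audits]
print("per-audit rates:", rates)

# Single audit at database lock: one large sample of the same total size.
errors = sum(e for e, _ in periodic_audits)
fields = sum(n for _, n in periodic_audits)
print("single-audit rate:", error_rate_per_10000(errors, fields))
```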

  11. The AMMA information system

    NASA Astrophysics Data System (ADS)

    Brissebrat, Guillaume; Fleury, Laurence; Boichard, Jean-Luc; Cloché, Sophie; Eymard, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim; Asencio, Nicole; Favot, Florence; Roussot, Odile

    2013-04-01

    The AMMA information system aims at expediting the communication of data and scientific results within the AMMA community and beyond. It has already been adopted as the data management system by several projects and is meant to become a reference information system about the West Africa area for the whole scientific community. The AMMA database and the associated online tools have been developed and are managed by two French teams (IPSL Database Centre, Palaiseau and OMP Data Service, Toulouse). The complete system has been fully duplicated and is operated by the AGRHYMET Regional Centre in Niamey, Niger. The AMMA database contains a wide variety of datasets: about 250 local observation datasets that cover geophysical components (atmosphere, ocean, soil, vegetation) and human activities (agronomy, health...), coming from either operational networks or scientific experiments and including historical data in West Africa from 1850; 1350 outputs of a socio-economics questionnaire; 60 operational satellite products and several research products; and 10 output sets of meteorological and ocean operational models plus 15 of research simulations. Database users can access all the data using either the portal http://database.amma-international.org or http://amma.agrhymet.ne/amma-data. Different modules are available. The complete catalogue gives access to metadata (i.e. information about the datasets) that are compliant with international standards (ISO 19115, INSPIRE...). Registration pages enable users to read and sign the data and publication policy and to apply for a user database account. The data access interface enables users to easily build a data extraction request by selecting various criteria such as location, time, and parameters. At present, the AMMA database counts more than 740 registered users and processes about 80 data requests every month. In order to monitor day-to-day meteorological and environmental information over West Africa, quick-look and report display websites have been developed. They met the operational needs of the observational teams during the AMMA 2006 (http://aoc.amma-international.org) and FENNEC 2011 (http://fenoc.sedoo.fr) campaigns, but they also enable scientific teams to share physical indices along the monsoon season (http://misva.sedoo.fr from 2011). A collaborative WIKINDX tool has been set up online in order to manage scientific publications and communications of interest to AMMA (http://biblio.amma-international.org). The bibliographic database now counts about 1200 references and is the most exhaustive document collection about the African monsoon available to all. Every scientist is invited to make use of the different AMMA online tools and data. Scientists or project leaders who have data management needs for existing or future datasets over West Africa are welcome to use the AMMA database framework and to contact ammaAdmin@sedoo.fr.

  12. Spectr-W3 Online Database On Atomic Properties Of Atoms And Ions

    NASA Astrophysics Data System (ADS)

    Faenov, A. Ya.; Magunov, A. I.; Pikuz, T. A.; Skobelev, I. Yu.; Loboda, P. A.; Bakshayev, N. N.; Gagarin, S. V.; Komosko, V. V.; Kuznetsov, K. S.; Markelenkov, S. A.

    2002-10-01

    Recent progress in novel information technologies based on the World-Wide Web (WWW) gives a new possibility for a worldwide exchange of atomic spectral and collisional data. This facilitates joint efforts of the international scientific community in basic and applied research, promising technological developments, and university education programs. Special-purpose atomic databases (ADBs) are needed for the effective employment of large-scale datasets. The ADB SPECTR developed at MISDC of VNIIFTRI has been used during the last decade in several laboratories in the world, including RFNC-VNIITF. The DB SPECTR accumulates a considerable amount of atomic data (about 500,000 records). These data were extracted from publications on experimental and theoretical studies in atomic physics, astrophysics, and plasma spectroscopy during the last few decades. The information for atoms and ions comprises the ionization potentials, the energy levels, the wavelengths and transition probabilities, and, to a lesser extent, the autoionization rates and the electron-ion collision cross-sections and rates. The data are supplied with source references and comments elucidating the details of computations or measurements. Our goal is to create an interactive WWW information resource based on the extended and updated Web-oriented database version SPECTR-W3 and its further integration into the family of specialized atomic databases on the Internet. The new version will incorporate novel experimental and theoretical data. An appropriate revision of the previously accumulated data will be performed from the viewpoint of their consistency with the current state of the art. We are particularly interested in cooperation for storing the atomic collision data. Presently, a software shell with an up-to-date Web interface is being developed to work with the SPECTR-W3 database. The shell includes subsystems for information retrieval, input, update, and output in/from the database and offers users a range of capabilities to formulate queries with various modes of search prescription and to present the information in tabular, graphic, and alphanumeric form using text and HTML document formats. The SPECTR-W3 website is being arranged now and is expected to be freely accessible round-the-clock on a dedicated Web server at RFNC-VNIITF. The website is being created using advanced Internet technologies, database development techniques, and up-to-date software from leading software manufacturers. The SPECTR-W3 ADB front page will also include a feedback channel for user comments and proposals, as well as hyperlinks to the websites of other ADBs and research centers in Europe, the USA, and the Middle and Far East that run investigations in atomic physics, plasma spectroscopy, astrophysics, and adjacent areas. The effort is being supported by the International Science and Technology Center under project #1785-01.

  13. Infant-Feeding Intentions and Practices of Internal Medicine Physicians

    PubMed Central

    Serwint, Janet R.; Shuster, Jonathan J.; Levine, David M.

    2016-01-01

    Abstract Background: Personal breastfeeding behavior of physician mothers is associated with their clinical breastfeeding advocacy, which in turn impacts patients' breastfeeding behavior. Internists can play an important role in breastfeeding advocacy as they usually come in contact with mothers longitudinally. Objective: To explore the personal infant-feeding decisions and behavior of physician mothers in internal medicine (IM). Materials and Methods: Physicians with current or previous IM training were isolated from our “Breastfeeding Among Physicians” database. The data in the database were gathered from cross-sectional surveys of 130 physician volunteers, mainly affiliated with the Johns Hopkins University School of Medicine (Baltimore, MD) and the University of Florida College of Medicine (Gainesville, FL). Results: Seventy-two mothers reported current or previous IM training and had 196 infants. Breastfeeding rates were 96% at birth, 77% at 6 months, and 40% at 12 months. Exclusive breastfeeding rates were 78% at birth, 67% at 3 months, and 30% at 6 months. While maternal goal for breastfeeding duration correlated with duration of both exclusive and any breastfeeding, there was a consistent and appreciable disparity between maternal duration goal and actual breastfeeding duration. The participants reported work-related reasons for early supplementation and breastfeeding cessation. Conclusions: We have described for the first time in the literature the personal infant-feeding intentions and behavior of a cohort of IM physician mothers. Workplace interventions to enable internists to maintain breastfeeding after return to work and to achieve their breastfeeding goals might improve the health of these mothers and their infants and positively impact their clinical breastfeeding advocacy. PMID:26918534

  14. Specialty of prescribers associated with prescription opioid fatalities in Utah, 2002-2010.

    PubMed

    Porucznik, Christina A; Johnson, Erin M; Rolfs, Robert T; Sauer, Brian C

    2014-01-01

    Opioid adverse events are widespread, and deaths have been directly attributed to opioids prescribed by medical professionals. Little information exists on the amount of opioids various medical specialties prescribe and the opioid fatality rate that would be expected if prescription opioid-related deaths were independent of medical specialty. The objectives were to compute the incidence of prescription opioid fatalities by medical specialty in Utah and to calculate the attributable risk (AR) of opioid fatality by medical specialty. The study used a prevalence database design linking the Utah Controlled Substance Database (CSD) for prescribing data with Utah Medical Examiner data to identify prescription opioid fatalities. ARs were calculated for each medical specialty and year. Opioid prescriptions are common, with 23,302,892 recorded in the CSD for 2002-2010, 0.64% of which were associated with a fatality. We attached specialty to 90.2% of opioid prescriptions. Family medicine and internal medicine physicians wrote the largest proportion of prescriptions (24.1% and 10.8%) and were associated with the greatest number of prescription opioid fatalities. The number of active prescriptions at time of death decreased each year. The AR of fatality by provider specialty varied each year, with some specialties, such as pain medicine and anesthesiology, consistently associated with more fatalities per 1,000 opioid prescriptions than internal medicine physicians in the same year. Primary care providers were the most frequent prescribers and the most often associated with opioid fatalities; they should be targeted for education about safe prescribing, along with specialties that prescribe less frequently but are associated with a positive AR for opioid fatality. Wiley Periodicals, Inc.

  15. Map and database of Quaternary faults in Venezuela and its offshore regions

    USGS Publications Warehouse

    Audemard, F.A.; Machette, M.N.; Cox, J.W.; Dart, R.L.; Haller, K.M.

    2000-01-01

    As part of the International Lithosphere Program’s “World Map of Major Active Faults,” the U.S. Geological Survey is assisting in the compilation of a series of digital maps of Quaternary faults and folds in Western Hemisphere countries. The maps show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds. They are accompanied by databases that describe these features and document current information on their activity in the Quaternary. The project is a key part of the Global Seismic Hazards Assessment Program (ILP Project II-0) for the International Decade for Natural Disaster Reduction. The project is sponsored by the International Lithosphere Program and funded by the USGS’s National Earthquake Hazards Reduction Program. The primary elements of the project are general supervision and interpretation of geologic/tectonic information, data compilation and entry for the fault catalog, database design and management, and digitization and manipulation of data in ARC/INFO. For the compilation of data, we engaged experts in Quaternary faulting, neotectonics, paleoseismology, and seismology.

  16. International spinal cord injury endocrine and metabolic extended data set.

    PubMed

    Bauman, W A; Wecht, J M; Biering-Sørensen, F

    2017-05-01

    The objective of this study was to develop the International Spinal Cord Injury (SCI) Endocrine and Metabolic Extended Data Set (ISCIEMEDS) within the framework of the International SCI Data Sets that would facilitate consistent collection and reporting of endocrine and metabolic findings in the SCI population. This study was conducted in an international setting. The ISCIEMEDS was developed by a working group. The initial ISCIEMEDS was revised based on suggestions from members of the International SCI Data Sets Committee, the International Spinal Cord Society (ISCoS) Executive and Scientific Committees, the American Spinal Injury Association (ASIA) Board, other interested organizations, societies and individual reviewers. The data set was posted for two months on the ISCoS and ASIA websites for comments. Variable names were standardized, and a suggested database structure for the ISCIEMEDS was provided by the Common Data Elements (CDEs) project at the National Institute of Neurological Disorders and Stroke (NINDS) of the US National Institutes of Health (NIH), and is available at https://commondataelements.ninds.nih.gov/SCI.aspx#tab=Data_Standards. The final ISCIEMEDS contains questions on the endocrine and metabolic conditions related to SCI. Because the information may be collected at any time, the date of data collection is important to determine the time after SCI. ISCIEMEDS includes information on carbohydrate metabolism (6 variables), calcium and bone metabolism (12 variables), thyroid function (9 variables), adrenal function (2 variables), gonadal function (7 variables), pituitary function (6 variables), sympathetic nervous system function (1 variable) and renin-aldosterone axis function (2 variables). The complete instructions for data collection and the data sheet itself are freely available on the website of ISCoS (http://www.iscos.org.uk/international-sci-data-sets).

  17. Institutional Review Board approval and innovation in urology: current practice and safety issues.

    PubMed

    Sundaram, Varun; Vemana, Goutham; Bhayani, Sam B

    2014-02-01

    To retrospectively review recent publications describing novel procedures/techniques, and describe the Institutional Review Board (IRB)/ethics approval process and potential ethical dilemmas in their reporting. We searched PubMed for papers about innovative or novel procedures/techniques between 2011 and August 2012. A query of titles/abstracts in the Journal of Urology, Journal of Endourology, European Urology, BJU International, and Urology identified relevant papers. These results were reviewed for human studies that described an innovative technique, procedure, approach, initial series, and/or used new technology. In all, 91 papers met criteria for inclusion; 25 from the Journal of Endourology, 14 from the Journal of Urology, nine from European Urology, 15 from the BJU International and 28 from Urology. IRB/ethics approval was given for an experimental procedure or database in 24% and 22%, respectively. IRB/ethics approval was not mentioned in 52.7% of studies. Published IRB/ethics approvals for innovative techniques are heterogeneous including database, retrospective, and prospective approvals. Given the concept that innovations are likely not in the legal or ethical standard of care, strong consideration should be given to obtaining IRB/ethics approval before the actual procedure, instead of approval to merely report database outcomes. © 2013 The Authors. BJU International © 2013 BJU International.

  18. Validation of intellectual disability coding through hospital morbidity records using an intellectual disability population-based database in Western Australia.

    PubMed

    Bourke, Jenny; Wong, Kingsley; Leonard, Helen

    2018-01-23

    To investigate how well intellectual disability (ID) can be ascertained using hospital morbidity data compared with a population-based data source. All children born in 1983-2010 with a hospital admission in the Western Australian Hospital Morbidity Data System (HMDS) were linked with the Western Australian Intellectual Disability Exploring Answers (IDEA) database. The International Classification of Diseases hospital codes consistent with ID were also identified. The characteristics of those children identified with ID through either or both sources were investigated. Of the 488 905 individuals in the study, 10 218 (2.1%) were identified with ID in either IDEA or HMDS with 1435 (14.0%) individuals identified in both databases, 8305 (81.3%) unique to the IDEA database and 478 (4.7%) unique to the HMDS dataset only. Of those unique to the HMDS dataset, about a quarter (n=124) had died before 1 year of age and most of these (75%) before 1 month. Children with ID who were also coded as such in the HMDS data were more likely to be aged under 1 year, female, non-Aboriginal and have a severe level of ID, compared with those not coded in the HMDS data. The sensitivity of using HMDS to identify ID was 14.7%, whereas the specificity was much higher at 99.9%. Hospital morbidity data are not a reliable source for identifying ID within a population, and epidemiological researchers need to take these findings into account in their study design. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
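
    The reported sensitivity and specificity can be reproduced directly from the counts given in the abstract, treating the IDEA register as the reference standard and HMDS coding as the test; the short Python check below does exactly that.

```python
# A worked check of the reported sensitivity and specificity of HMDS coding
# against the IDEA register. Counts are taken from the abstract.

total = 488_905     # children in the linked cohort
both = 1_435        # identified with ID in both IDEA and HMDS
idea_only = 8_305   # identified in IDEA only (missed by HMDS coding)
hmds_only = 478     # coded in HMDS only (not in IDEA)

true_pos = both
false_neg = idea_only
false_pos = hmds_only
true_neg = total - (true_pos + false_neg + false_pos)

sensitivity = true_pos / (true_pos + false_neg)
specificity = true_neg / (true_neg + false_pos)

print(f"sensitivity = {sensitivity:.1%}")   # ~14.7%
print(f"specificity = {specificity:.1%}")   # ~99.9%
```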

  19. PFR²: a curated database of planktonic foraminifera 18S ribosomal DNA as a resource for studies of plankton ecology, biogeography and evolution.

    PubMed

    Morard, Raphaël; Darling, Kate F; Mahé, Frédéric; Audic, Stéphane; Ujiié, Yurika; Weiner, Agnes K M; André, Aurore; Seears, Heidi A; Wade, Christopher M; Quillévéré, Frédéric; Douady, Christophe J; Escarguel, Gilles; de Garidel-Thoron, Thibault; Siccha, Michael; Kucera, Michal; de Vargas, Colomban

    2015-11-01

    Planktonic foraminifera (Rhizaria) are ubiquitous marine pelagic protists producing calcareous shells with conspicuous morphology. They play an important role in the marine carbon cycle, and their exceptional fossil record serves as the basis for biochronostratigraphy and past climate reconstructions. A major worldwide sampling effort over the last two decades has resulted in the establishment of multiple large collections of cryopreserved individual planktonic foraminifera samples. Thousands of 18S rDNA partial sequences have been generated, representing all major known morphological taxa across their worldwide oceanic range. This comprehensive data coverage provides an opportunity to assess patterns of molecular ecology and evolution in a holistic way for an entire group of planktonic protists. We combined all available published and unpublished genetic data to build PFR², the Planktonic foraminifera Ribosomal Reference database. The first version of the database includes 3322 reference 18S rDNA sequences belonging to 32 of the 47 known morphospecies of extant planktonic foraminifera, collected from 460 oceanic stations. All sequences have been rigorously taxonomically curated using a six-rank annotation system fully resolved to the morphological species level and linked to a series of metadata. The PFR² website, available at http://pfr2.sb-roscoff.fr, allows downloading the entire database or specific sections, as well as the identification of new planktonic foraminiferal sequences. Its novel, fully documented curation process integrates advances in morphological and molecular taxonomy. It allows for an increase in its taxonomic resolution and assures that integrity is maintained by including a complete contingency tracking of annotations and assuring that the annotations remain internally consistent. © 2015 John Wiley & Sons Ltd.

  20. Importance of being indexed in important databases--effect on the quantity of published articles in JBUON.

    PubMed

    Vuckovic-Dekic, Ljiljana; Gavrilovic, Dusica

    2016-01-01

    To investigate the dynamics of indexing the Journal of the Balkan Union of Oncology (JBUON) in important biomedical databases, the effects on the quantity and type of published articles, and the countries of the (co)authors of these papers. The process of JBUON indexing started with EMBASE/Excerpta Medica, continued in 2006 with PubMed/MEDLINE and every second year thereafter in other important biomedical databases, until 2012 when JBUON became an Open Access journal (for more information please visit www.jbuon.com). Including the next two years for monitoring the effect of the last indexing, we analyzed 9 volumes consisting of 36 issues published from January 2006 to December 2014, with regard to the number and category of articles, the contribution of authors from Balkan and non-Balkan countries, and the (co)authorship of the published articles. In the period 2006-2014, 1165 articles of different categories were published in JBUON. Each indexing step immediately increased the submission rate and enlarged the number of publications, original papers in particular, in every volume of JBUON. Authors from Balkan countries contributed to 80.7% of all articles. The average number of coauthors per original article grew slowly and was higher at the end of the investigated period than at the start (6.6 and 5.8, respectively). The progressive coverage of JBUON in important biomedical databases and its visibility at the international level attracted the attention of a large readership, and the submission rate and the number of published articles grew significantly, particularly the number of original papers. This is the most important consequence of the editorial policy and will hopefully lead to even more progress of JBUON in the near future.

  1. A validated case definition for chronic rhinosinusitis in administrative data: a Canadian perspective.

    PubMed

    Rudmik, Luke; Xu, Yuan; Kukec, Edward; Liu, Mingfu; Dean, Stafford; Quan, Hude

    2016-11-01

    Pharmacoepidemiological research using administrative databases has become increasingly popular for chronic rhinosinusitis (CRS); however, without a validated case definition the cohort evaluated may be inaccurate, resulting in biased and incorrect outcomes. The objective of this study was to develop and validate a generalizable administrative database case definition for CRS using International Classification of Diseases, 9th edition (ICD-9)-coded claims. A random sample of 100 patients with a guideline-based diagnosis of CRS and 100 control patients were selected and then linked to a Canadian physician claims database from March 31, 2010, to March 31, 2015. The proportion of CRS ICD-9-coded claims (473.x and 471.x) for each of these 200 patients was reviewed and the validity of 7 different ICD-9-based coding algorithms was evaluated. The CRS case definition of ≥2 claims with a CRS ICD-9 code (471.x or 473.x) within 2 years of the reference case provides balanced validity, with a sensitivity of 77% and a specificity of 79%. Applying this CRS case definition to the claims database produced a CRS cohort of 51,000 patients with characteristics that were consistent with published demographics and rates of comorbid asthma, allergic rhinitis, and depression. This study has validated several coding algorithms; based on the results, a case definition of ≥2 physician claims of CRS (ICD-9 of 471.x or 473.x) within 2 years provides an optimal level of validity. Future studies will need to validate this administrative case definition from different health system perspectives and using larger retrospective chart reviews from multiple providers. © 2016 ARS-AAOA, LLC.
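
    As a rough illustration of how such a claims-based case definition can be applied, the Python/pandas sketch below flags patients with at least two ICD-9 471.x or 473.x claims falling within a two-year window; the column names and example claims are hypothetical and are not the study's data.

```python
# A minimal pandas sketch of the validated case definition: at least two
# physician claims coded ICD-9 471.x or 473.x within a two-year window.
# The data frame below is invented purely for illustration.

import pandas as pd

claims = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 3],
    "icd9":       ["473.0", "471.9", "401.1", "473.2", "473.8", "471.0"],
    "date": pd.to_datetime(["2011-02-01", "2012-01-15", "2012-06-01",
                            "2010-05-10", "2013-07-20", "2014-03-03"]),
})

# Keep only CRS-compatible codes (471.x or 473.x).
is_crs = claims["icd9"].str.startswith("471") | claims["icd9"].str.startswith("473")
crs = claims[is_crs]

def meets_definition(dates: pd.Series, window_days: int = 730) -> bool:
    """>=2 CRS-coded claims with at least two falling within the window."""
    d = dates.sort_values().reset_index(drop=True)
    return any((d.iloc[i + 1] - d.iloc[i]).days <= window_days
               for i in range(len(d) - 1))

cases = crs.groupby("patient_id")["date"].apply(meets_definition)
print(cases)  # patient 1 -> True; patient 2 -> False (claims >2 years apart); patient 3 -> False
```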

  2. A technique for routinely updating the ITU-R database using radio occultation electron density profiles

    NASA Astrophysics Data System (ADS)

    Brunini, Claudio; Azpilicueta, Francisco; Nava, Bruno

    2013-09-01

    Well credited and widely used ionospheric models, such as the International Reference Ionosphere or NeQuick, describe the variation of the electron density with height by means of a piecewise profile tied to the F2-peak parameters: the electron density, NmF2, and the height, hmF2. Accurate values of these parameters are crucial for retrieving reliable electron density estimations from those models. When direct measurements of these parameters are not available, the models compute them using the so-called ITU-R database, which was established in the early 1960s. This paper presents a technique aimed at routinely updating the ITU-R database using radio occultation electron density profiles derived from GPS measurements gathered from low Earth orbit satellites. Before being used, these radio occultation profiles are validated by fitting an electron density model to them. A re-weighted Least Squares algorithm is used to down-weight unreliable measurements (occasionally, entire profiles) and to retrieve NmF2 and hmF2 values, together with their error estimates, from the profiles. These values are used to update the database monthly; the database consists of two sets of ITU-R-like coefficients that could easily be implemented in the IRI or NeQuick models. The technique was tested with radio occultation electron density profiles that are delivered to the community by the COSMIC/FORMOSAT-3 mission team. Tests were performed for solstice and equinox seasons under high and low solar activity conditions. The global mean error of the resulting maps, estimated by the Least Squares technique, is equivalent to about 7% of the estimated value for the F2-peak electron density and ranges from 2.0 to 5.6 km (about 2%) for the height.
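
    The Python sketch below illustrates the general idea of fitting an electron density model to a profile with iteratively re-weighted least squares and reading off the peak parameters; the Chapman-layer model, the weighting scheme, and all numbers are illustrative assumptions rather than the authors' exact formulation.

```python
# A minimal sketch: fit a simple Chapman-layer profile to a noisy electron
# density profile, iteratively down-weighting points with large residuals,
# and read off the F2-peak density (NmF2) and height (hmF2).

import numpy as np
from scipy.optimize import curve_fit

def chapman(h, nm, hm, H):
    """Chapman layer: peak density nm, peak height hm, scale height H."""
    z = (h - hm) / H
    return nm * np.exp(0.5 * (1.0 - z - np.exp(-z)))

rng = np.random.default_rng(0)
h = np.linspace(150.0, 600.0, 90)                        # altitude grid, km
true = chapman(h, 8.0e11, 300.0, 55.0)                   # "true" profile, el/m^3
ne = true * (1.0 + 0.03 * rng.standard_normal(h.size))   # noisy observation
ne[40] *= 3.0                                            # one unreliable sample

params = (ne.max(), float(h[np.argmax(ne)]), 50.0)       # initial guess
weights = np.ones_like(ne)
for _ in range(5):                                       # re-weighting loop
    sigma = 1.0 / np.sqrt(weights)
    params, _ = curve_fit(chapman, h, ne, p0=params, sigma=sigma)
    resid = ne - chapman(h, *params)
    scale = 1.4826 * np.median(np.abs(resid))            # robust residual scale
    weights = np.where(np.abs(resid) < 3 * scale, 1.0, 0.01)

nmf2, hmf2, _ = params
print(f"NmF2 ~ {nmf2:.2e} el/m^3, hmF2 ~ {hmf2:.1f} km")
```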

  3. Validation of intellectual disability coding through hospital morbidity records using an intellectual disability population-based database in Western Australia

    PubMed Central

    Bourke, Jenny; Wong, Kingsley

    2018-01-01

    Objectives To investigate how well intellectual disability (ID) can be ascertained using hospital morbidity data compared with a population-based data source. Design, setting and participants All children born in 1983–2010 with a hospital admission in the Western Australian Hospital Morbidity Data System (HMDS) were linked with the Western Australian Intellectual Disability Exploring Answers (IDEA) database. The International Classification of Diseases hospital codes consistent with ID were also identified. Main outcome measures The characteristics of those children identified with ID through either or both sources were investigated. Results Of the 488 905 individuals in the study, 10 218 (2.1%) were identified with ID in either IDEA or HMDS with 1435 (14.0%) individuals identified in both databases, 8305 (81.3%) unique to the IDEA database and 478 (4.7%) unique to the HMDS dataset only. Of those unique to the HMDS dataset, about a quarter (n=124) had died before 1 year of age and most of these (75%) before 1 month. Children with ID who were also coded as such in the HMDS data were more likely to be aged under 1 year, female, non-Aboriginal and have a severe level of ID, compared with those not coded in the HMDS data. The sensitivity of using HMDS to identify ID was 14.7%, whereas the specificity was much higher at 99.9%. Conclusion Hospital morbidity data are not a reliable source for identifying ID within a population, and epidemiological researchers need to take these findings into account in their study design. PMID:29362262

  4. Implementing model-based system engineering for the whole lifecycle of a spacecraft

    NASA Astrophysics Data System (ADS)

    Fischer, P. M.; Lüdtke, D.; Lange, C.; Roshani, F.-C.; Dannemann, F.; Gerndt, A.

    2017-09-01

    Design information of a spacecraft is collected over all phases in the lifecycle of a project. A lot of this information is exchanged between different engineering tasks and business processes. In some lifecycle phases, model-based system engineering (MBSE) has introduced system models and databases that help to organize such information and to keep it consistent for everyone. Nevertheless, none of the existing databases has yet addressed the whole lifecycle. Virtual Satellite is the MBSE database developed at DLR. It has been used for quite some time in Phase A studies and is currently being extended for use in the whole lifecycle of spacecraft projects. Since it is unforeseeable which future use cases such a database needs to support in all these different projects, the underlying data model has to provide tailoring and extension mechanisms for its conceptual data model (CDM). This paper explains these mechanisms as they are implemented in Virtual Satellite, which enable extending the CDM along the project without corrupting already stored information. As an upcoming major use case, Virtual Satellite will be implemented as the MBSE tool in the S2TEP project. This project provides a new satellite bus for internal research and several different payload missions in the future. This paper explains how Virtual Satellite will be used to manage configuration control problems associated with such a multi-mission platform. It discusses how the S2TEP project starts using the software for collecting the first design information from concurrent engineering studies, then making use of the extension mechanisms of the CDM to introduce further information artefacts such as a functional electrical architecture, thus linking more and more processes into an integrated MBSE approach.
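
    As a loose illustration of such an extension mechanism, the Python sketch below lets new categories of information be registered and attached to existing model elements without altering data stored earlier; the class and category names are invented and do not reflect the Virtual Satellite data model itself.

```python
# A generic sketch of an extensible conceptual data model: elements carry
# typed "categories" of data that can be registered later in the project
# without touching what is already stored. All names are illustrative.

from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class Category:
    name: str
    attributes: Dict[str, type]          # attribute name -> expected type

@dataclass
class Element:
    name: str
    data: Dict[str, Dict[str, Any]] = field(default_factory=dict)

    def assign(self, cat: Category, values: Dict[str, Any]) -> None:
        for attr, typ in cat.attributes.items():
            if not isinstance(values[attr], typ):
                raise TypeError(f"{attr} must be {typ.__name__}")
        self.data[cat.name] = values      # earlier categories remain untouched

# Early phase: only a mass budget category exists.
mass = Category("massParameters", {"mass_kg": float})
reaction_wheel = Element("ReactionWheel")
reaction_wheel.assign(mass, {"mass_kg": 2.3})

# Later phase: a new category is registered and applied to the same element.
electrical = Category("electricalInterface", {"voltage_v": float, "bus": str})
reaction_wheel.assign(electrical, {"voltage_v": 28.0, "bus": "payload"})
print(reaction_wheel.data)
```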

  5. Nucleotide Sequence Database Comparison for Routine Dermatophyte Identification by Internal Transcribed Spacer 2 Genetic Region DNA Barcoding.

    PubMed

    Normand, A C; Packeu, A; Cassagne, C; Hendrickx, M; Ranque, S; Piarroux, R

    2018-05-01

    Conventional dermatophyte identification is based on morphological features. However, recent studies have proposed to use the nucleotide sequences of the rRNA internal transcribed spacer (ITS) region as an identification barcode of all fungi, including dermatophytes. Several nucleotide databases are available to compare sequences and thus identify isolates; however, these databases often contain mislabeled sequences that impair sequence-based identification. We evaluated five of these databases on a clinical isolate panel. We selected 292 clinical dermatophyte strains that were prospectively subjected to an ITS2 nucleotide sequence analysis. Sequences were analyzed against the databases, and the results were compared to clusters obtained via DNA alignment of sequence segments. The DNA tree served as the identification standard throughout the study. According to the ITS2 sequence identification, the majority of strains (255/292) belonged to the genus Trichophyton, mainly T. rubrum complex (n = 184), T. interdigitale (n = 40), T. tonsurans (n = 26), and T. benhamiae (n = 5). Other genera included Microsporum (e.g., M. canis [n = 21], M. audouinii [n = 10], Nannizzia gypsea [n = 3], and Epidermophyton [n = 3]). Species-level identification of T. rubrum complex isolates was an issue. Overall, ITS DNA sequencing is a reliable tool to identify dermatophyte species given that a comprehensive and correctly labeled database is consulted. Since many inaccurate identification results exist in the DNA databases used for this study, reference databases must be verified frequently and amended in line with the current revisions of fungal taxonomy. Before describing a new species or adding a new DNA reference to the available databases, its position in the phylogenetic tree must be verified. Copyright © 2018 American Society for Microbiology.

  6. Mugshot Identification Database (MID)

    National Institute of Standards and Technology Data Gateway

    NIST Mugshot Identification Database (MID) (Web, free access)   NIST Special Database 18 is being distributed for use in development and testing of automated mugshot identification systems. The database consists of three CD-ROMs, containing a total of 3248 images of variable size using lossless compression. A newer version of the compression/decompression software on the CDROM can be found at the website http://www.nist.gov/itl/iad/ig/nigos.cfm as part of the NBIS package.

  7. Key comparison BIPM.RI(I)-K5 of the air-kerma standards of the SMU, Slovakia and the BIPM in 137Cs gamma radiation

    NASA Astrophysics Data System (ADS)

    Kessler, C.; Burns, D.; Durný, N.

    2018-01-01

    The first direct comparison of the standards for air kerma of the Slovak Institute of Metrology (SMU), Slovakia and of the Bureau International des Poids et Mesures (BIPM) was carried out in the 137Cs radiation beam of the BIPM in June 2017. The comparison result, evaluated as a ratio of the SMU and the BIPM standards for air kerma, is 1.0051 with a combined standard uncertainty of 2.7 × 10⁻³. The results for an indirect comparison made at the same time are consistent with the direct results at the level of 2 parts in 10⁴. The results are analysed and presented in terms of degrees of equivalence, suitable for entry in the BIPM key comparison database. Main text To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCRI, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
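
    For readers unfamiliar with how such results are usually tabulated, the short Python example below turns the reported ratio and uncertainty into a degree of equivalence with a k = 2 expanded uncertainty; this is the conventional presentation for bilateral comparisons of this kind, shown for illustration rather than quoted from the report.

```python
# A small worked example of expressing a key-comparison result as a degree
# of equivalence (D = R - 1, expanded uncertainty U = 2 * u_c). The formula
# is the conventional presentation and is assumed here, not quoted.

ratio = 1.0051          # SMU / BIPM air-kerma ratio from the abstract
u_combined = 2.7e-3     # combined standard uncertainty of the ratio

degree_of_equivalence = ratio - 1.0
expanded_uncertainty = 2.0 * u_combined   # coverage factor k = 2

print(f"D = {degree_of_equivalence:.4f} +/- {expanded_uncertainty:.4f} (k = 2)")
# An indirect comparison agreeing with the direct one to 2 parts in 1e4 means
# the two ratios differ by only about 0.0002, well within this uncertainty.
```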

  8. Key comparison BIPM.RI(I)-K1 of the air-kerma standards of the SMU, Slovakia and the BIPM in 60Co gamma radiation

    NASA Astrophysics Data System (ADS)

    Kessler, C.; Burns, D.; Durný, N.

    2018-01-01

    A key comparison of the standards for air kerma of the Slovak Institute of Metrology (SMU), Slovakia and of the Bureau International des Poids et Mesures (BIPM) was carried out in the 60Co radiation beam of the BIPM in June 2017. The comparison result, evaluated as a ratio of the SMU and the BIPM standards for air kerma, is 1.0042 with a combined standard uncertainty of 2.7 × 10⁻³. The results for an indirect comparison made at the same time are consistent with the direct results at the level of 2 parts in 10⁴. The results are analysed and presented in terms of degrees of equivalence, suitable for entry in the BIPM key comparison database. Main text To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCRI, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  9. A meta-analysis of efficacy of Morus alba Linn. to improve blood glucose and lipid profile.

    PubMed

    Phimarn, Wiraphol; Wichaiyo, Kittisak; Silpsavikul, Khuntawan; Sungthong, Bunleu; Saramunee, Kritsanee

    2017-06-01

    Previous studies have reported that Morus alba may improve blood glucose and lipid profiles, but the evidence from these studies is not consistent. This meta-analysis aimed to evaluate the efficacy of products derived from M. alba on blood glucose and lipid levels. Literature was reviewed via international databases (PubMed, PubMed Central, ScienceDirect, and SciSearch) and Thai databases. Thirteen RCTs of high quality, assessed by Jadad score, were included. M. alba produced a significant reduction in postprandial glucose (PPG) at 30 min (MD -1.04, 95% CI -1.36, -0.73), 60 min (MD -0.87, 95% CI -1.27, -0.48) and 90 min (MD -0.55, 95% CI -0.87, -0.22). No difference was found in the levels of other glycaemic (FBS, HbA1C, or HOMA-IR) or lipidaemic (TC, TG, LDL, or HDL) markers. Serious adverse effects were found neither in the control group nor in the group that received M. alba. Products derived from M. alba can effectively contribute to the reduction of PPG levels, but large-scale RCTs would be informative.
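
    Pooled mean differences such as those quoted above are typically obtained by inverse-variance weighting; the Python sketch below shows that calculation on invented study values, which are not the trial data analyzed here.

```python
# A minimal sketch of inverse-variance (fixed-effect) pooling of mean
# differences, the kind of calculation behind results such as
# "MD -1.04, 95% CI -1.36 to -0.73". Study values are invented.

import math

# (mean difference, standard error) per study -- illustrative numbers only
studies = [(-1.1, 0.25), (-0.9, 0.30), (-1.2, 0.22)]

weights = [1.0 / se**2 for _, se in studies]
pooled_md = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

ci_low = pooled_md - 1.96 * pooled_se
ci_high = pooled_md + 1.96 * pooled_se
print(f"pooled MD = {pooled_md:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```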

  10. Key comparison BIPM.RI(I)-K1 of the air-kerma standards of the MKEH, Hungary and the BIPM in 60Co gamma radiation

    NASA Astrophysics Data System (ADS)

    Kessler, C.; Burns, D.; Machula, G.

    2018-01-01

    A comparison of the standards for air kerma of the Hungarian Trade Licensing Office (MKEH), Hungary and of the Bureau International des Poids et Mesures (BIPM) was carried out in the 60Co radiation beam of the BIPM in March 2016. The comparison result, evaluated as a ratio of the MKEH and the BIPM standards for air kerma, is 1.0047 with a combined standard uncertainty of 1.9 × 10⁻³. The results for an indirect comparison made at the same time are consistent with the direct results at the level of 2.6 parts in 10³. The results are analysed and presented in terms of degrees of equivalence, suitable for entry in the BIPM key comparison database. Main text To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCRI, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  11. Measurement of Air Pollution from Satellites (MAPS) 1994 Correlative Atmospheric Carbon Monoxide Mixing Ratios (DB-1020)

    DOE Data Explorer

    Novelli, Paul [NOAA Climate Monitoring and Diagnostics Lab (CMDL), Boulder, Colorado; Masarie, Ken [Cooperative Institute for Research in Environmental Sciences (CIRES), University of Colorado, Boulder, Colorado

    1998-01-01

    This database offers select carbon monoxide (CO) mixing ratios from eleven field and aircraft measurement programs around the world. Carbon monoxide mixing ratios in the middle troposphere have been examined for short periods of time by using the Measurement of Air Pollution from Satellites (MAPS) instrument. MAPS measures CO from a space platform, using gas filter correlation radiometry. During the 1981 and 1984 MAPS flights, measurement validation was attempted by comparing space-based measurements of CO to those made in the middle troposphere from aircraft. Before the 1994 MAPS flights aboard the space shuttle Endeavour, a correlative measurement team was assembled to provide the National Aeronautics and Space Administration (NASA) with results of their CO field measurement programs during the April and October shuttle missions. To maximize the usefulness of these correlative data, team members agreed to participate in an intercomparison of CO measurements. The correlative data presented in this database provide an internally consistent, ground-based picture of CO in the lower atmosphere during Spring and Fall 1994. The data show the regional importance of two CO sources: fossil-fuel burning in urbanized areas and biomass burning in regions in the Southern Hemisphere.

  12. Association of major depression and diabetes in medically indigent Puerto Rican adults.

    PubMed

    Disdier-Flores, Orville M

    2010-03-01

    Studies have found that major depression and diabetes mellitus are strongly associated. The main goal of this study was to evaluate the association between major depression and diabetes in a large medically indigent population of Puerto Rican adults living on the island. A secondary database analysis with a cross-sectional design was used for this study. Participants were selected from the Puerto Rico Commonwealth Health Plan database, beneficiaries of the public health sector. Adult subjects with at least one claim during 2002 were included. The final sample consisted of 1,026,625 insured adults. The International Classification of Diseases (ICD-9) was used for disease classification. The prevalence of diabetes was 14.6% in subjects with major depression and 9.7% in those without major depression (POR 1.59, p < 0.001). The strength of this association remained after adjusting for obesity and sex. The prevalence of diabetes appears to be significantly higher in Puerto Rican adults with major depression compared with those without this psychiatric disorder. Longitudinal prospective studies and randomized controlled trials are needed to shed light on the temporal or causal relationship and to test whether effective prevention and treatment can reduce the risk of developing diabetes.
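
    The reported prevalence odds ratio of 1.59 follows directly from the two prevalences; the short Python check below reproduces it.

```python
# A worked check of the reported prevalence odds ratio (POR). The prevalences
# come from the abstract; the odds-ratio formula is standard.

p_depressed = 0.146      # diabetes prevalence among adults with major depression
p_not_depressed = 0.097  # diabetes prevalence among adults without major depression

odds_depressed = p_depressed / (1.0 - p_depressed)
odds_not_depressed = p_not_depressed / (1.0 - p_not_depressed)

por = odds_depressed / odds_not_depressed
print(f"POR = {por:.2f}")   # ~1.59, matching the reported value
```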

  13. The development of the Adolescent Nervios Scale: preliminary findings.

    PubMed

    Livanis, Andrew; Tryon, Georgiana Shick

    2010-01-01

    This paper details the construction of a scale to measure the culture-bound syndrome of nervios in Latino early adolescents, ages 11 to 14. Informed by nervios literature and experts, we developed the 31-item Adolescent Nervios Scale (ANS) with items comprised of symptoms representing various psychiatric conditions common to Western culture. In contrast to 277 non-Latino early adolescents who responded to the items as representing disparate constructs, 307 Latino early adolescents responded to ANS items in a unitary fashion. For Latino early adolescents, the ANS demonstrated good internal consistency and stability as well as concurrent, discriminative, and criterion-based validity. The results support the measurement of nervios and its relationship to the school performance and adjustment of Latino youth. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  14. Current status of the international Halley Watch infrared net archive

    NASA Technical Reports Server (NTRS)

    Mcguinness, Brian B.

    1988-01-01

    The primary purposes of the Halley Watch have been to promote Halley observations, coordinate and standardize the observing where useful, and to archive the results in a database readily accessible to cometary scientists. The intention of IHW is to store the observations themselves, along with any information necessary to allow users to understand and use the data, but to exclude interpretations of these data. Each of the archives produced by the IHW will appear in two versions: a printed archive and a digital archive on CD-ROMs. The archive is expected to have a very long lifetime. The IHW has already produced an archive for P/Crommelin. This consists of one printed volume and two 1600 bpi tapes. The Halley archive will contain at least twenty gigabytes of information.

  15. Collection and utilization of Japanese scientific and technological information in Europe and U.S.A. - Report on the Berlin Conference 1989 -

    NASA Astrophysics Data System (ADS)

    Miyakawa, Takayasu; Miwa, Makiko; Obara, Michio

    The 2nd International Conference on Japanese Information in Science, Technology and Commerce was held on October 23-25, 1989 at the Japanisch-Deutsches Zentrum Berlin, Federal Republic of Germany. In the two years since the previous conference at Warwick, England, in 1987, much progress was made in collecting, using and evaluating Japanese scientific, technological and industrial information in Western countries. At the same time, the overseas supply of Japanese databases and information by Japanese governmental and private organizations has improved in many respects. Papers were presented and valuable exchanges of opinions and experiences took place. The Conference consisted of 11 sessions covering trends and policies, various information sources, analysis and distribution, Japanese-language and kanji processing, and direct connection with Japan.

  16. IRIS Toxicological Review of Benzo[a]pyrene (Interagency ...

    EPA Pesticide Factsheets

    In January 2017, EPA finalized the IRIS assessment of Benzo[a]pyrene. The Toxicological Review was reviewed internally by EPA and by other federal agencies and White House Offices before public release. Consistent with the May 2009 IRIS assessment development process, all written comments on IRIS assessments submitted by other federal agencies and White House Offices are made publicly available. Accordingly, interagency comments and the interagency science discussion materials provided to other agencies, including interagency review drafts of the IRIS Toxicological Review of Benzo[a]pyrene are posted on this site. EPA is undertaking an update of the Integrated Risk Information System (IRIS) health assessment for benzo[a]pyrene (BaP). The outcome of this project is an updated Toxicological Review and IRIS Summary for BaP that will be entered into the IRIS database.

  17. Usefulness and accuracy of MALDI-TOF mass spectrometry as a supplementary tool to identify mosquito vector species and to invest in development of international database.

    PubMed

    Raharimalala, F N; Andrianinarivomanana, T M; Rakotondrasoa, A; Collard, J M; Boyer, S

    2017-09-01

    Arthropod-borne diseases are important causes of morbidity and mortality. The identification of vector species relies mainly on morphological features and/or molecular biology tools. The first method requires specific technical skills and may result in misidentifications, and the second method is time-consuming and expensive. The aim of the present study is to assess the usefulness and accuracy of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) as a supplementary tool with which to identify mosquito vector species and to invest in the creation of an international database. A total of 89 specimens belonging to 10 mosquito species were selected for the extraction of proteins from legs and for the establishment of a reference database. A blind test with 123 mosquitoes was performed to validate the MS method. Results showed that: (a) the spectra obtained in the study with a given species differed from the spectra of the same species collected in another country, which highlights the need for an international database; (b) MALDI-TOF MS is an accurate method for the rapid identification of mosquito species that are referenced in a database; (c) MALDI-TOF MS allows the separation of groups or complex species, and (d) laboratory specimens undergo a loss of proteins compared with those isolated in the field. In conclusion, MALDI-TOF MS is a useful supplementary tool for mosquito identification and can help inform vector control. © 2017 The Royal Entomological Society.

  18. Sports Information Online: Searching the SPORT Database and Tips for Finding Sports Medicine Information Online.

    ERIC Educational Resources Information Center

    Janke, Richard V.; And Others

    1988-01-01

    The first article describes SPORT, a database providing international coverage of athletics and physical education, and compares it to other online services in terms of coverage, thesauri, possible search strategies, and actual usage. The second article reviews available online information on sports medicine. (CLB)

  19. 75 FR 20981 - Proposed Information Collection; Comment Request; Educational Partnership Program (EPP) and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-22

    ... completion of a NOAA student scholar reference form in support of the scholarship application by academic... internal tracking purposes. NOAA OEd grantees are required to update the student tracker database with the... tracker database form, 16 hours; graduate application form, 8 hours; undergraduate application form, 8...

  20. Handling of the demilitarized zone using service providers in SAP

    NASA Astrophysics Data System (ADS)

    Iovan, A.; Robu, R.

    2016-02-01

    External collaboration requires data access from the Internet. In a trusted Internet collaboration scenario where the external user works on the same data as the internal user, direct access to the data in the Intranet is required. The paper presents a solution for accessing certain data in the Enterprise Resource Planning system, with the user interface on a system in the demilitarized zone and the database on a system located in the trusted area. Using the Service Provider Interface framework, connections between separate systems can be created in different areas of the network. The paper demonstrates how to connect the two systems, one in the demilitarized zone and one in the trusted area, using SAP ERP 6.0 with Enhancement Package 7. In order to use the Service Provider Interface, the SAP Business Suite Foundation component must be installed in both systems. The advantage of using the Service Provider Interface framework is that the external user works on the same data as the internal user (and not on copies). This ensures data consistency and less overhead for backup and security systems.
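
    The sketch below illustrates the underlying service-provider pattern in Python rather than ABAP: a facade in the demilitarized zone holds no data and forwards every request over a narrow interface to a provider in the trusted zone. All class names and the authorization check are invented for illustration and are not part of the SAP framework.

```python
# A language-agnostic sketch of the DMZ/trusted-zone delegation pattern.
# The DMZ-facing facade stores nothing; it forwards authorized requests to a
# provider in the trusted zone, so external users see the same records as
# internal users rather than copies. All names are illustrative.

from abc import ABC, abstractmethod

class OrderProvider(ABC):
    """Service interface implemented by the trusted (Intranet) system."""
    @abstractmethod
    def get_order(self, order_id: str) -> dict: ...

class TrustedZoneProvider(OrderProvider):
    def get_order(self, order_id: str) -> dict:
        # In a real setup this would read the ERP database in the trusted zone.
        return {"id": order_id, "status": "released"}

class DmzFacade:
    """Runs in the DMZ; holds no data, only forwards authorized requests."""
    def __init__(self, provider: OrderProvider):
        self._provider = provider

    def get_order(self, user: str, order_id: str) -> dict:
        if not user.startswith("ext_"):          # placeholder authorization check
            raise PermissionError("external users only")
        return self._provider.get_order(order_id)

facade = DmzFacade(TrustedZoneProvider())
print(facade.get_order("ext_partner", "4711"))
```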

  1. Front-End and Back-End Database Design and Development: Scholar's Academy Case Study

    ERIC Educational Resources Information Center

    Parks, Rachida F.; Hall, Chelsea A.

    2016-01-01

    This case study consists of a real database project for a charter school--Scholar's Academy--and provides background information on the school and its cafeteria processing system. Also included are functional requirements and some illustrative data. Students are tasked with the design and development of a database for the purpose of improving the…

  2. The 2002 RPA Plot Summary database users manual

    Treesearch

    Patrick D. Miles; John S. Vissage; W. Brad Smith

    2004-01-01

    Describes the structure of the RPA 2002 Plot Summary database and provides information on generating estimates of forest statistics from these data. The RPA 2002 Plot Summary database provides a consistent framework for storing forest inventory data across all ownerships across the entire United States. The data represent the best available data as of October 2001....

  3. Checkpointing and Recovery in Distributed and Database Systems

    ERIC Educational Resources Information Center

    Wu, Jiang

    2011-01-01

    A transaction-consistent global checkpoint of a database records a state of the database which reflects the effect of only completed transactions and not the results of any partially executed transactions. This thesis establishes the necessary and sufficient conditions for a checkpoint of a data item (or the checkpoints of a set of data items) to…

  4. Structured Forms Reference Set of Binary Images (SFRS)

    National Institute of Standards and Technology Data Gateway

    NIST Structured Forms Reference Set of Binary Images (SFRS) (Web, free access)   The NIST Structured Forms Database (Special Database 2) consists of 5,590 pages of binary, black-and-white images of synthesized documents. The documents in this database are 12 different tax forms from the IRS 1040 Package X for the year 1988.

  5. Antiepileptic drug use in seven electronic health record databases in Europe: a methodologic comparison.

    PubMed

    de Groot, Mark C H; Schuerch, Markus; de Vries, Frank; Hesse, Ulrik; Oliva, Belén; Gil, Miguel; Huerta, Consuelo; Requena, Gema; de Abajo, Francisco; Afonso, Ana S; Souverein, Patrick C; Alvarez, Yolanda; Slattery, Jim; Rottenkolber, Marietta; Schmiedl, Sven; Van Dijk, Liset; Schlienger, Raymond G; Reynolds, Robert; Klungel, Olaf H

    2014-05-01

    The annual prevalence of antiepileptic drug (AED) prescribing reported in the literature differs considerably among European countries due to the use of different types of data sources, time periods, population distributions, and methodologic differences. This study aimed to measure the prevalence of AED prescribing across seven European routine health care databases in Spain, Denmark, The Netherlands, the United Kingdom, and Germany using a standardized methodology and to investigate sources of variation. Analyses of the annual prevalence of AEDs were stratified by sex, age, and AED. Overall prevalences were standardized to the European 2008 reference population. Prevalence of any AED varied from 88 per 10,000 persons (The Netherlands) to 144 per 10,000 in Spain and Denmark in 2001. In all databases, prevalence increased linearly each year since 2001, by 6% per year in Denmark up to 15% per year in Spain. This increase could be attributed entirely to an increase in "new," recently marketed AEDs, while the prevalence of AEDs that have been available since the mid-1990s hardly changed. AED use increased with age for both female and male patients up to the ages of 80 to 89 years and tended to be somewhat higher in female than in male patients between the ages of 40 and 70. No differences between databases in the number of AEDs used simultaneously by a patient were found. We showed that during the study period of 2001-2009, AED prescribing increased in five European Union (EU) countries and that this increase was due entirely to the newer AEDs marketed since the 1990s. Using a standardized methodology, we showed consistent trends across databases and countries over time. Differences in age and sex distribution explained only part of the variation between countries; therefore, the remaining variation in AED use must originate from other differences in national health care systems. Wiley Periodicals, Inc. © 2014 International League Against Epilepsy.
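
    Standardizing to a common reference population is a straightforward weighted sum of age-specific prevalences; the Python sketch below shows the arithmetic with invented age bands, prevalences, and reference weights, not the study's data.

```python
# A minimal sketch of direct standardization: age-specific prevalences are
# weighted by a common reference population so that countries with different
# age structures can be compared. All numbers below are invented.

age_specific = {            # prevalence of AED use per 10,000, by age band
    "0-19": 40, "20-39": 70, "40-59": 110, "60-79": 180, "80+": 240,
}
reference_share = {         # share of a hypothetical reference population
    "0-19": 0.22, "20-39": 0.27, "40-59": 0.27, "60-79": 0.19, "80+": 0.05,
}

standardized = sum(age_specific[band] * reference_share[band]
                   for band in age_specific)
print(f"age-standardized prevalence: {standardized:.0f} per 10,000")
```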

  6. Lessons Learned From Developing Reactor Pressure Vessel Steel Embrittlement Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jy-An John

    Materials behavior caused by neutron irradiation in fission and/or fusion environments can hardly be understood without practical examination. An easily accessible materials information system with a large materials database, running on capable computers, is necessary for the design of nuclear materials and for analyses or simulations of these phenomena. The Embrittlement Data Base (EDB) developed at ORNL is such a comprehensive collection of data. The EDB contains power reactor pressure vessel surveillance data, material test reactor data, foreign reactor data (obtained through bilateral agreements authorized by the NRC), and fracture toughness data. The lessons learned from building the EDB program and the associated database management activity regarding material database design methodology, architecture, and the embedded QA protocol are described in this report. The development of the IAEA International Database on Reactor Pressure Vessel Materials (IDRPVM) and a comparison of the EDB and IAEA IDRPVM databases are also provided, along with the recommended database QA protocol and database infrastructure.

  7. A spatial classification and database for management, research, and policy making: The Great Lakes aquatic habitat framework

    USGS Publications Warehouse

    Wang, Lizhu; Riseng, Catherine M.; Mason, Lacey; Werhrly, Kevin; Rutherford, Edward; McKenna, James E.; Castiglione, Chris; Johnson, Lucinda B.; Infante, Dana M.; Sowa, Scott P.; Robertson, Mike; Schaeffer, Jeff; Khoury, Mary; Gaiot, John; Hollenhurst, Tom; Brooks, Colin N.; Coscarelli, Mark

    2015-01-01

    Managing the world's largest and most complex freshwater ecosystem, the Laurentian Great Lakes, requires a spatially hierarchical basin-wide database of ecological and socioeconomic information that is comparable across the region. To meet such a need, we developed a spatial classification framework and database — Great Lakes Aquatic Habitat Framework (GLAHF). GLAHF consists of catchments, coastal terrestrial, coastal margin, nearshore, and offshore zones that encompass the entire Great Lakes Basin. The catchments captured in the database as river pour points or coastline segments are attributed with data known to influence physicochemical and biological characteristics of the lakes from the catchments. The coastal terrestrial zone consists of 30-m grid cells attributed with data from the terrestrial region that has direct connection with the lakes. The coastal margin and nearshore zones consist of 30-m grid cells attributed with data describing the coastline conditions, coastal human disturbances, and moderately to highly variable physicochemical and biological characteristics. The offshore zone consists of 1.8-km grid cells attributed with data that are spatially less variable compared with the other aquatic zones. These spatial classification zones and their associated data are nested within lake sub-basins and political boundaries and allow the synthesis of information from grid cells to classification zones, within and among political boundaries, lake sub-basins, Great Lakes, or within the entire Great Lakes Basin. This spatially structured database could help the development of basin-wide management plans, prioritize locations for funding and specific management actions, track protection and restoration progress, and conduct research for science-based decision making.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbott, Jennifer; Sandberg, Tami

    The Wind-Wildlife Impacts Literature Database (WILD), formerly known as the Avian Literature Database, was created in 1997. The goal of the database was to begin tracking research on the potential impact of wind energy development on birds. The Avian Literature Database was originally housed on a proprietary platform called Livelink ECM from OpenText and maintained by in-house technical staff. The initial set of records was added by library staff. A vital part of the newly launched Drupal-based WILD database is the Bibliography module. Many of the resources included in the database have digital object identifiers (DOIs). The bibliographic information for any item that has a DOI can be imported into the database using this module, which greatly reduces the amount of manual data entry required to add records. The content available in WILD is international in scope, which can be easily discerned by looking at the tags available in the browse menu.

  9. Privacy-Aware Location Database Service for Granular Queries

    NASA Astrophysics Data System (ADS)

    Kiyomoto, Shinsaku; Martin, Keith M.; Fukushima, Kazuhide

    Future mobile markets are expected to increasingly embrace location-based services. This paper presents a new system architecture for location-based services, which consists of a location database and distributed location anonymizers. The service is privacy-aware in the sense that the location database always maintains a degree of anonymity. The location database service permits three different levels of query and can thus be used to implement a wide range of location-based services. Furthermore, the architecture is scalable and employs simple functions that are similar to those found in general database systems.

  10. Why do we want the right to die? A systematic review of the international literature on the views of patients, carers and the public on assisted dying.

    PubMed

    Hendry, Maggie; Pasterfield, Diana; Lewis, Ruth; Carter, Ben; Hodgson, Daniel; Wilkinson, Clare

    2013-01-01

    Assisted dying is legal in four European countries and three American states. Elsewhere, particularly in more affluent or mainly Protestant countries, it remains controversial. Dominant headlines feature professional (medical, legal, religious) arguments versus celebrity campaigners; ordinary people are less clearly represented. To synthesise the international evidence of people's views and attitudes towards assisted dying in order to inform current debate about this controversial issue. Systematic review and mixed method synthesis of qualitative and survey data. Eleven electronic databases from inception to October 2011; bibliographies of included studies. Two reviewers independently screened papers and appraised quality. Qualitative results were extracted verbatim; survey results were summarised in a table. Qualitative data were synthesised using framework methods and survey results integrated where they supported, contrasted or added to the themes identified. Sixteen qualitative studies and 94 surveys were included; many participants considered the immediate relevance of assisted dying for them. Themes related to poor quality of life, a good quality of death, potential abuse of assisted dying and the importance of individual stance. People valued autonomy in death as much as in life. Attitudes were diverse, complex and related to definitions of unbearable suffering including physical, psycho-social and existential factors and were consistent regardless of social, economic, legal and health-care contexts. Our review sheds light on ordinary people's perspectives about assisted dying, when they are ill or disabled. Unbearable suffering is a key construct, and common factors are revealed that lead people to ask for help to die. The consistency of international views indicates a mandate for legislative and medical systems worldwide to listen and understand this.

  11. Linguistic validation and reliability properties are weak investigated of most dementia-specific quality of life measurements-a systematic review.

    PubMed

    Dichter, Martin Nikolaus; Schwab, Christian G G; Meyer, Gabriele; Bartholomeyczik, Sabine; Halek, Margareta

    2016-02-01

    For people with dementia, the concept of quality of life (Qol) reflects the disease's impact on the whole person. Thus, Qol is an increasingly used outcome measure in dementia research. This systematic review was performed to identify available dementia-specific Qol measurements and to assess the quality of linguistic validations and reliability studies of these measurements (PROSPERO 2013: CRD42014008725). The MEDLINE, CINAHL, EMBASE, PsycINFO, and Cochrane Methodology Register databases were systematically searched without any date restrictions. Forward and backward citation tracking were performed on the basis of selected articles. A total of 70 articles addressing 19 dementia-specific Qol measurements were identified; nine measurements had been adapted for use in countries other than their country of origin. The quality of the linguistic validations varied from insufficient to good. Internal consistency was the most frequently tested reliability property. Most of the reliability studies lacked internal validity. Qol measurements for dementia are insufficiently linguistically validated and not well tested for reliability. None of the identified measurements can be recommended without further research. The application of international guidelines and quality criteria is strongly recommended for the performance of linguistic validations and reliability studies of dementia-specific Qol measurements. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Theory analysis for Pender's health promotion model (HPM) by Barnum's criteria: a critical perspective.

    PubMed

    Khoshnood, Zohreh; Rayyani, Masoud; Tirgari, Batool

    2018-01-13

    Background: Analysis of nursing theoretical works and their role in knowledge development is presented as an essential process of critical reflection. The health promotion model (HPM) focuses on helping people achieve higher levels of well-being and identifies background factors that influence health behaviors. Objectives: This paper aims to evaluate and critique the HPM using Barnum's criteria. Methods: The present study reviewed books and articles retrieved from the ProQuest, PubMed, and Blackwell databases. The method of evaluation for this model is based on Barnum's criteria for the analysis, application, and evaluation of nursing theories. The criteria selected by Barnum embrace both internal and external criticism. Internal criticism deals with how theory components fit with each other (the internal construction of the theory), and external criticism deals with the way in which the theory relates to the extended world (considering the theory in its relationships to human beings, nursing, and health). Results: The electronic database search yielded over 27,717 titles and abstracts. Following removal of duplicates, 18,963 titles and abstracts were screened using the inclusion criteria and 1278 manuscripts were retrieved. Of these, 80 were specific to the HPM and 23 to the analysis of any nursing theory relating to the aim of this article. After final selection using the inclusion criteria for this review, 28 manuscripts were identified as examining the factors contributing to theory analysis. Evaluation of the health promotion theory showed that the philosophical claims and their content are consistent and clear. The HPM has a logical structure and has been applied to diverse age groups from differing cultures with varying health concerns. Conclusion: Among the strategies for theory critique, the Barnum approach is structured and accurate, and considers theory in its relationship to human beings, community psychiatric nursing, and health. According to Pender, nursing assessment, diagnosis, and interventions are used to operationalize the HPM through practical application and research.

  13. Brazilian Science between National and Foreign Journals: Methodology for Analyzing the Production and Impact in Emerging Scientific Communities.

    PubMed

    Strehl, Letícia; Calabró, Luciana; Souza, Diogo Onofre; Amaral, Lívio

    2016-01-01

    In recent decades, we have observed an intensification of science, technology and innovation activities in Brazil. The increase in production of scientific papers indexed in international databases, however, has not been accompanied by an equivalent increase in the impact of publications. This paper presents a methodology for analyzing production and the impact of certain research areas in Brazil related to two aspects: the origin of the journals (national or foreign) and international collaboration. These two variables were selected for being of particular importance in understanding the context of scientific production and communication in countries with emerging economies. The sample consisted of papers written by Brazilian researchers in 19 subfields of knowledge published from 2002 to 2011, totaling 85,082 papers. To calculate the impact, we adopted a normalized indicator called the relative subfield citedness (Rw) using a window of 5 years to obtain measurements evaluated in 2 different years: 2007 and 2012. The data on papers and citations were collected from the Web of Science database. From the results, we note that most of the subfields have presented, from one quinquennium to another, improved performance in the world production rankings. Regarding publication in national and foreign journals, we observed that the distribution of each subfield's production between national and foreign journals tended to be maintained over time. Specifically, for impact, we identified a lower Rw pattern for Brazilian papers when they were published in national journals in all subfields. When Brazilian papers are published in foreign journals, we observed a higher impact for those papers, even surpassing the average global impact in some subfields. For international collaboration, we analyzed the percentage of participation of foreign researchers and the connection between collaboration and the impact of papers, especially emphasizing the distinction of hyperauthorship papers in terms of production and impact.

  14. ANZSoilML: An Australian - New Zealand standard for exchange of soil data

    NASA Astrophysics Data System (ADS)

    Simons, Bruce; Wilson, Peter; Ritchie, Alistair; Cox, Simon

    2013-04-01

    The Australian-New Zealand soil information exchange standard (ANZSoilML) is a GML-based standard designed to allow the discovery, query and delivery of soil and landscape data via standard Open Geospatial Consortium (OGC) Web Feature Services. ANZSoilML modifies the Australian soil exchange standard (OzSoilML), which is based on the Australian Soil Information Transfer and Evaluation System (SITES) database design and exchange protocols, to meet the New Zealand National Soils Database requirements. The most significant change was the removal of the lists of CodeList terms in OzSoilML, which were based on the field methods specified in the 'Australian Soil and Land Survey Field Handbook'. These were replaced with empty CodeLists as placeholders for external vocabularies to allow the use of New Zealand vocabularies without violating the data model. Testing of the use of these separately governed Australian and New Zealand vocabularies has commenced. ANZSoilML attempts to accommodate the proposed International Organization for Standardization ISO/DIS 28258 standard for soil quality. For the most part, ANZSoilML is consistent with the ISO model, although major differences arise as a result of: • The need to specify the properties appropriate for each feature type; • The inclusion of soil-related 'Landscape' features; • Allowing the mapping of soil surfaces, bodies, layers and horizons, independent of the soil profile; • Allowing specification of the relationships between the various soil features; • Specifying soil horizons as specialisations of soil layers; • Removing duplication of features provided by the ISO Observation & Measurements standard. The International Union of Soil Sciences (IUSS) Working Group on Soil Information Standards (WG-SIS) aims to develop, promote and maintain a standard to facilitate the exchange of soils data and information. Developing an international exchange standard that is compatible with existing and emerging national and regional standards is a considerable challenge. ANZSoilML is proposed as a profile of the more generalised SoilML model being progressed through the IUSS Working Group.

  15. Brazilian Science between National and Foreign Journals: Methodology for Analyzing the Production and Impact in Emerging Scientific Communities

    PubMed Central

    Calabró, Luciana; Souza, Diogo Onofre; Amaral, Lívio

    2016-01-01

    In recent decades, we have observed an intensification of science, technology and innovation activities in Brazil. The increase in production of scientific papers indexed in international databases, however, has not been accompanied by an equivalent increase in the impact of publications. This paper presents a methodology for analyzing production and the impact of certain research areas in Brazil related to two aspects: the origin of the journals (national or foreign) and international collaboration. These two variables were selected for being of particular importance in understanding the context of scientific production and communication in countries with emerging economies. The sample consisted of papers written by Brazilian researchers in 19 subfields of knowledge published from 2002 to 2011, totaling 85,082 papers. To calculate the impact, we adopted a normalized indicator called the relative subfield citedness (Rw) using a window of 5 years to obtain measurements evaluated in 2 different years: 2007 and 2012. The data on papers and citations were collected from the Web of Science database. From the results, we note that most of the subfields have presented, from one quinquennium to another, improved performance in the world production rankings. Regarding publication in national and foreign journals, we observed that the distribution of each subfield's production between national and foreign journals tended to be maintained over time. Specifically, for impact, we identified a lower Rw pattern for Brazilian papers when they were published in national journals in all subfields. When Brazilian papers are published in foreign journals, we observed a higher impact for those papers, even surpassing the average global impact in some subfields. For international collaboration, we analyzed the percentage of participation of foreign researchers and the connection between collaboration and the impact of papers, especially emphasizing the distinction of hyperauthorship papers in terms of production and impact. PMID:27171223

  16. Diabetic retinopathy screening using deep neural network.

    PubMed

    Ramachandran, Nishanthan; Hong, Sheng Chiong; Sime, Mary J; Wilson, Graham A

    2017-09-07

    There is a burgeoning interest in the use of deep neural networks in diabetic retinal screening. To determine whether a deep neural network could satisfactorily detect diabetic retinopathy that requires referral to an ophthalmologist from a local diabetic retinal screening programme and an international database. Retrospective audit. Diabetic retinal photos from the Otago database photographed during October 2016 (485 photos), and 1200 photos from the Messidor international database. Receiver operating characteristic curve to illustrate the ability of a deep neural network to identify referable diabetic retinopathy (moderate or worse diabetic retinopathy or exudates within one disc diameter of the fovea). Area under the receiver operating characteristic curve, sensitivity and specificity. For detecting referable diabetic retinopathy, the deep neural network had an area under the receiver operating characteristic curve of 0.901 (95% confidence interval 0.807-0.995), with 84.6% sensitivity and 79.7% specificity for Otago and 0.980 (95% confidence interval 0.973-0.986), with 96.0% sensitivity and 90.0% specificity for Messidor. This study has shown that a deep neural network can detect referable diabetic retinopathy with sensitivities and specificities close to or better than 80% from both an international and a domestic (New Zealand) database. We believe that deep neural networks can be integrated into community screening once they can successfully detect both diabetic retinopathy and diabetic macular oedema. © 2017 Royal Australian and New Zealand College of Ophthalmologists.
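    The performance figures reported in this record (area under the ROC curve plus sensitivity and specificity at an operating point) can be computed from per-image network scores with standard tooling. The sketch below assumes scikit-learn and uses synthetic labels and scores, not the study's data.

```python
# Sketch of computing AUC, sensitivity and specificity for a referable-DR
# classifier from per-image scores. Labels and scores are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=500)                               # 1 = referable DR
y_score = np.clip(y_true * 0.6 + rng.normal(0.3, 0.2, 500), 0, 1)   # network output

auc = roc_auc_score(y_true, y_score)
fpr, tpr, thresholds = roc_curve(y_true, y_score)

# Pick the operating point closest to the top-left corner of the ROC curve.
best = np.argmin(np.hypot(fpr, 1 - tpr))
print(f"AUC={auc:.3f}  sensitivity={tpr[best]:.3f}  specificity={1 - fpr[best]:.3f}")
```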

  17. IDOMAL: an ontology for malaria.

    PubMed

    Topalis, Pantelis; Mitraka, Elvira; Bujila, Ioana; Deligianni, Elena; Dialynas, Emmanuel; Siden-Kiamos, Inga; Troye-Blomberg, Marita; Louis, Christos

    2010-08-10

    Ontologies are rapidly becoming a necessity for the design of efficient information technology tools, especially databases, because they permit the organization of stored data using logical rules and defined terms that are understood by both humans and machines. As a consequence, both the usage and the interoperability of databases and related resources are enhanced. It is hoped that IDOMAL, the ontology of malaria, will prove a valuable instrument when implemented in both malaria research and control measures. The OBOEdit2 software was used for the construction of the ontology. IDOMAL is based on the Basic Formal Ontology (BFO) and follows the rules set by the OBO Foundry consortium. The first version of the malaria ontology covers both clinical and epidemiological aspects of the disease, as well as disease and vector biology. IDOMAL is meant to later become the nucleation site for a much larger ontology of vector-borne diseases, which will itself be an extension of a large ontology of infectious diseases (IDO). The latter is currently being developed in the framework of a large international collaborative effort. IDOMAL, already freely available in its first version, will form part of a suite of ontologies that will be used to drive IT tools and databases specifically constructed to help control malaria and, later, other vector-borne diseases. This suite already consists of the ontology described here as well as the one on insecticide resistance that has been available for some time. Additional components are being developed and introduced into IDOMAL.

  18. Assessment on EXPERT Descent and Landing System Aerodynamics

    NASA Astrophysics Data System (ADS)

    Wong, H.; Muylaert, J.; Northey, D.; Riley, D.

    2009-01-01

    EXPERT is a re-entry vehicle designed for validation of aero-thermodynamic models, numerical schemes in Computational Fluid Dynamics codes and test facilities for measuring flight data under an Earth re-entry environment. This paper addresses the design of the descent and landing sequence for EXPERT. It includes the descent sequence, the choice of drogue and main parachutes, and the parachute deployment condition, which can be supersonic or subsonic. The analysis is based mainly on an engineering tool, PASDA, together with some hand calculations for parachute sizing and design. The tool consists of a detailed 6-DoF simulation performed with the aerodynamics database of the vehicle, an empirical wakes model and the International Standard Atmosphere database. The aerodynamics database for the vehicle is generated from DNW experimental data and CFD codes within the framework of an ESA contract to CIRA. The analysis is presented in terms of altitude, velocity, accelerations, angle-of-attack, pitch angle and angle of rigging line. Discussion of the advantages and disadvantages of each parachute deployment condition is included, in addition to some comparison with the available data based on a Monte-Carlo method from a Russian company, FSUE NIIPS. The performance of EXPERT is shown to be strongly sensitive to wind speed. Supersonic deployment of the drogue shows better stability, at the expense of a larger G-load, than subsonic deployment. Further optimization of the parachute design is necessary in order to fulfill all the EXPERT specifications.

  19. Telescience Support Center Data System Software

    NASA Technical Reports Server (NTRS)

    Rahman, Hasan

    2010-01-01

    The Telescience Support Center (TSC) team has developed a database-driven, increment-specific Data Requirement Document (DRD) generation tool that automates much of the work required for generating and formatting the DRD. It creates a database to load the required changes to configure the TSC data system, thus eliminating a substantial amount of labor in database entry and formatting. The TSC database contains the TSC systems configuration, along with the experimental data, in which human physiological data must be de-commutated in real time. The data for each experiment also must be cataloged and archived for future retrieval. TSC software provides tools and resources for ground operation and data distribution to remote users consisting of PIs (principal investigators), bio-medical engineers, scientists, engineers, payload specialists, and computer scientists. Operations support is provided for computer systems access, detailed networking, and mathematical and computational problems of the International Space Station telemetry data. User training is provided for on-site staff and biomedical researchers and other remote personnel in the usage of the space-bound services via the Internet, which enables significant resource savings for the physical facility along with the time savings versus traveling to NASA sites. The software used in support of the TSC could easily be adapted to other Control Center applications. This would include not only other NASA payload monitoring facilities, but also other types of control activities, such as monitoring and control of the electric grid, chemical, or nuclear plant processes, air traffic control, and the like.

  20. The structure and dipole moment of globular proteins in solution and crystalline states: use of NMR and X-ray databases for the numerical calculation of dipole moment.

    PubMed

    Takashima, S

    2001-04-05

    The large dipole moment of globular proteins has been well known from detailed studies using dielectric relaxation and electro-optical methods. The search for the origin of these dipole moments, however, must be based on detailed knowledge of protein structure at atomic resolution. At present, we have two sources of information on the structure of protein molecules: (1) x-ray databases obtained in the crystalline state; and (2) NMR databases obtained in the solution state. While x-ray databases consist of only one model, NMR databases, because of the fluctuation of protein folding in solution, consist of a number of models, thus enabling the computation of the dipole moment to be repeated for all these models. The aim of this work, using these databases, is a detailed investigation of the interdependence between the structure and the dipole moment of protein molecules. The dipole moment of protein molecules has roughly two components: one is due to surface charges, and the other, the core dipole moment, is due to polar groups such as N-H and C=O bonds. The computation of the surface charge dipole moment consists of two steps: (A) calculation of the pK shifts of charged groups due to electrostatic interactions; and (B) calculation of the dipole moment using the pK values corrected for electrostatic shifts. The dipole moments of several proteins were computed using both NMR and x-ray databases. The results of these two sets of calculations are, with a few exceptions, in good agreement with one another and also with measured dipole moments.
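    The surface-charge contribution described in this record reduces to a vector sum of point charges about a reference point. The sketch below shows that sum for hypothetical coordinates and pK-adjusted charges; none of the values are taken from the paper.

```python
# Sketch of a surface-charge dipole moment as the vector sum q_i * (r_i - r_ref).
# Coordinates (angstroms) and partial charges (elementary charges) are hypothetical;
# in practice the charges would come from pK values corrected for electrostatic
# shifts, and the coordinates from an NMR or X-ray structural model.
import numpy as np

E_ANGSTROM_TO_DEBYE = 4.803   # 1 e*angstrom is roughly 4.8 Debye

def dipole_moment(coords, charges):
    coords = np.asarray(coords, dtype=float)
    charges = np.asarray(charges, dtype=float)
    reference = coords.mean(axis=0)                     # reference point for the moment
    mu_vec = (charges[:, None] * (coords - reference)).sum(axis=0)
    return np.linalg.norm(mu_vec) * E_ANGSTROM_TO_DEBYE

coords = [[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [5.0, 8.0, 0.0]]
charges = [-1.0, +1.0, +0.5]   # pK-adjusted charges at titratable sites (illustrative)
print(f"{dipole_moment(coords, charges):.1f} D")
```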

  1. A computer-based information system for epilepsy and electroencephalography.

    PubMed

    Finnerup, N B; Fuglsang-Frederiksen, A; Røssel, P; Jennum, P

    1999-08-01

    This paper describes a standardised computer-based information system for electroencephalography (EEG) focusing on epilepsy. The system was developed using a prototyping approach. It is based on international recommendations for EEG examination, interpretation and terminology, international guidelines for epidemiological studies on epilepsy and classification of epileptic seizures and syndromes and international classification of diseases. It is divided into: (1) clinical information and epilepsy relevant data; and (2) EEG data, which is hierarchically structured including description and interpretation of EEG. Data is coded but is supplemented with unrestricted text. The resulting patient database can be integrated with other clinical databases and with the patient record system and may facilitate clinical and epidemiological research and development of standards and guidelines for EEG description and interpretation. The system is currently used for teleconsultation between Gentofte and Lisbon.

  2. A Unified Approach to Joint Regional/Teleseismic Calibration and Event Location with a 3D Earth Model

    DTIC Science & Technology

    2010-09-01

    raytracing and travel-time calculation in 3D Earth models, such as the finite-difference eikonal method (e.g., Podvin and Lecomte, 1991), fast...by Reiter and Rodi (2009) in constructing JWM. Two teleseismic data sets were considered, both extracted from the EHB database (Engdahl et al...extracted from the updated EHB database distributed by the International Seismological Centre (http://www.isc.ac.uk/EHB/index.html). The new database

  3. Development of expert systems for analyzing electronic documents

    NASA Astrophysics Data System (ADS)

    Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.

    2018-05-01

    The paper analyses a Database Management System (DBMS). Expert systems, databases, and database technology have become an essential component of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment; it consists of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans and Alfresco, together with system components previously developed at Tomsk State University, Faculty of Innovative Technology.

  4. RNAcentral: A vision for an international database of RNA sequences

    PubMed Central

    Bateman, Alex; Agrawal, Shipra; Birney, Ewan; Bruford, Elspeth A.; Bujnicki, Janusz M.; Cochrane, Guy; Cole, James R.; Dinger, Marcel E.; Enright, Anton J.; Gardner, Paul P.; Gautheret, Daniel; Griffiths-Jones, Sam; Harrow, Jen; Herrero, Javier; Holmes, Ian H.; Huang, Hsien-Da; Kelly, Krystyna A.; Kersey, Paul; Kozomara, Ana; Lowe, Todd M.; Marz, Manja; Moxon, Simon; Pruitt, Kim D.; Samuelsson, Tore; Stadler, Peter F.; Vilella, Albert J.; Vogel, Jan-Hinnerk; Williams, Kelly P.; Wright, Mathew W.; Zwieb, Christian

    2011-01-01

    During the last decade there has been a great increase in the number of noncoding RNA genes identified, including new classes such as microRNAs and piRNAs. There is also a large growth in the amount of experimental characterization of these RNA components. Despite this growth in information, it is still difficult for researchers to access RNA data, because key data resources for noncoding RNAs have not yet been created. The most pressing omission is the lack of a comprehensive RNA sequence database, much like UniProt, which provides a comprehensive set of protein knowledge. In this article we propose the creation of a new open public resource that we term RNAcentral, which will contain a comprehensive collection of RNA sequences and fill an important gap in the provision of biomedical databases. We envision RNA researchers from all over the world joining a federated RNAcentral network, contributing specialized knowledge and databases. RNAcentral would centralize key data that are currently held across a variety of databases, allowing researchers instant access to a single, unified resource. This resource would facilitate the next generation of RNA research and help drive further discoveries, including those that improve food production and human and animal health. We encourage additional RNA database resources and research groups to join this effort. We aim to obtain international network funding to further this endeavor. PMID:21940779

  5. Software Tools Streamline Project Management

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard, Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query-Based Document Management (QBDM), is a tool that enables content or context searches, either simple or hierarchical, across a variety of databases. The system enables users to specify notification subscriptions where they associate "contexts of interest" and "events of interest" to one or more documents or collection(s) of documents. Based on these subscriptions, users receive notification when the events of interest occur within the contexts of interest for associated document or collection(s) of documents. Users can also associate at least one notification time as part of the notification subscription, with at least one option for the time period of notifications.

  6. Nuclear Science References Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pritychenko, B., E-mail: pritychenko@bnl.gov; Běták, E.; Singh, B.

    2014-06-15

    The Nuclear Science References (NSR) database, together with its associated Web interface, is the world's only comprehensive source of easily accessible low- and intermediate-energy nuclear physics bibliographic information for more than 210,000 articles since the beginning of nuclear science. The weekly-updated NSR database provides essential support for nuclear data evaluation, compilation and research activities. The principles of the database and Web application development and maintenance are described. Examples of nuclear structure, reaction and decay applications are specifically included. The complete NSR database is freely available at the websites of the National Nuclear Data Center (http://www.nndc.bnl.gov/nsr) and the International Atomic Energy Agency (http://www-nds.iaea.org/nsr).

  7. The Impact of the Programme for International Student Assessment on Academic Journals

    ERIC Educational Resources Information Center

    Dominguez, Maria; Vieira, Maria-Jose; Vidal, Javier

    2012-01-01

    The aim of this study is to assess the impact of PISA (Programme for International Student Assessment) on international scientific journals. A bibliometric analysis was conducted of publications included in three main scientific publication databases: Eric, EBSCOhost and the ISI Web of Knowledge, from 2002 to 2010. The paper focused on four main…

  8. TEDS-M 2008 User Guide for the International Database

    ERIC Educational Resources Information Center

    Brese, Falk, Ed.

    2012-01-01

    The Teacher Education Study in Mathematics or TEDS-M is a study conducted under the aegis of the International Association for the Evaluation of Educational Achievement (IEA). The lead research center for the study is the International Study Center at Michigan State University (ISC/MSU). The ISC/MSU worked from 2006 to 2011 with the International…

  9. A database of natural products and chemical entities from marine habitat

    PubMed Central

    Babu, Padavala Ajay; Puppala, Suma Sree; Aswini, Satyavarapu Lakshmi; Vani, Metta Ramya; Kumar, Chinta Narasimha; Prasanna, Tallapragada

    2008-01-01

    The marine compound database consists of marine natural products and chemical entities, collected from various literature sources, which are known to possess bioactivity against human diseases. The database is constructed using HTML. The 182 compounds, organized into 12 categories, are provided with the source, compound name, 2-dimensional structure, bioactivity and clinical trial information. The database is freely available online and can be accessed at http://www.progenebio.in/mcdb/index.htm PMID:19238254

  10. International exploration of Mars. A special bibliography

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This bibliography lists 173 reports, articles, and other documents introduced into the NASA Scientific and Technical Information Database on the exploration of Mars. Historical references are cited for background. The bibliography was created for the 1991 session of the International Space University.

  11. An alternative database approach for management of SNOMED CT and improved patient data queries.

    PubMed

    Campbell, W Scott; Pedersen, Jay; McClay, James C; Rao, Praveen; Bastola, Dhundy; Campbell, James R

    2015-10-01

    SNOMED CT is the international lingua franca of terminologies for human health. Based in Description Logics (DL), the terminology enables data queries that incorporate inferences between data elements as well as those relationships that are explicitly stated. However, the ontologic and polyhierarchical nature of the SNOMED CT concept model makes it difficult to implement in its entirety within electronic health record systems that largely employ object-oriented or relational database architectures. The result is a reduction of data richness, limitation of query capability, and increased systems overhead. The hypothesis of this research was that a graph database (graph DB) architecture using SNOMED CT as the basis for the data model, and subsequently modeling patient data upon the semantic core of SNOMED CT, could exploit the full value of the terminology to enrich and support advanced querying of patient data sets. The hypothesis was tested by instantiating a graph DB with the fully classified SNOMED CT concept model. The graph DB instance was tested for integrity by calculating the transitive closure table for the SNOMED CT hierarchy and comparing the results with transitive closure tables created using current, validated methods. The graph DB was then populated with 461,171 anonymized patient record fragments and over 2.1 million associated SNOMED CT clinical findings. Queries, including concept negation and disjunction, were then run against the graph database and against an enterprise Oracle relational database (RDBMS) holding the same patient data sets. The graph DB was then populated with laboratory data encoded using LOINC and medication data encoded with RxNorm, and complex queries were performed using LOINC, RxNorm and SNOMED CT to identify uniquely described patient populations. A graph database instance was successfully created for two international releases of SNOMED CT and two US SNOMED CT editions. Transitive closure tables and descriptive statistics generated using the graph database were identical to those generated using validated methods. Patient queries produced patient counts identical to those from the Oracle RDBMS, with comparable times. Database queries involving the defining attributes of SNOMED CT concepts were possible with the graph DB; the same queries could not be directly performed with the Oracle RDBMS representation of the patient data and required the creation and use of external terminology services. Further, queries of undefined depth were successful in identifying unknown relationships between patient cohorts. The results of this study supported the hypothesis that a patient database built upon and around the semantic model of SNOMED CT is possible. The model supported queries that leveraged all aspects of the SNOMED CT logical model to produce clinically relevant query results. Logical disjunction and negation queries were possible using the data model, as were queries that extended beyond the structural IS_A hierarchy of SNOMED CT to include queries that employed defining attribute-values of SNOMED CT concepts as search parameters. As medical terminologies such as SNOMED CT continue to expand, they will become more complex, and model consistency will be more difficult to assure. Simultaneously, consumers of data will increasingly demand improvements to query functionality to accommodate additional granularity of clinical concepts without sacrificing speed. This new line of research provides an alternative approach to instantiating and querying patient data represented using advanced computable clinical terminologies. Copyright © 2015 Elsevier Inc. All rights reserved.
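    The integrity check described in this record, computing a transitive closure table for the SNOMED CT IS_A hierarchy, is a plain graph-reachability computation. The sketch below illustrates it on a toy hierarchy; the concept identifiers are illustrative, not real SNOMED CT codes.

```python
# Sketch of a transitive-closure computation for an IS_A hierarchy stored as
# child -> parent edges: emit every (descendant, ancestor) pair.
# The toy concept identifiers are illustrative, not real SNOMED CT codes.
from collections import defaultdict

def transitive_closure(is_a_edges):
    parents = defaultdict(set)
    for child, parent in is_a_edges:
        parents[child].add(parent)

    closure = set()
    for concept in list(parents):
        stack = list(parents[concept])
        seen = set()
        while stack:                       # depth-first walk up the hierarchy
            ancestor = stack.pop()
            if ancestor in seen:
                continue
            seen.add(ancestor)
            closure.add((concept, ancestor))
            stack.extend(parents.get(ancestor, ()))
    return closure

edges = [("C3", "C2"), ("C2", "C1"), ("C4", "C2")]   # toy IS_A statements
print(sorted(transitive_closure(edges)))
# [('C2', 'C1'), ('C3', 'C1'), ('C3', 'C2'), ('C4', 'C1'), ('C4', 'C2')]
```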

  12. The International Permafrost Association: current initiatives for cryospheric research

    NASA Astrophysics Data System (ADS)

    Schollaen, Karina; Lewkowicz, Antoni G.; Christiansen, Hanne H.; Romanovsky, Vladimir E.; Lantuit, Hugues; Schrott, Lothar; Sergeev, Dimitry; Wei, Ma

    2015-04-01

    The International Permafrost Association (IPA), founded in 1983, has as its objectives to foster the dissemination of knowledge concerning permafrost and to promote cooperation among persons and national or international organizations engaged in scientific investigation and engineering work on permafrost. The IPA's primary responsibilities are convening International Permafrost Conferences, undertaking special projects such as preparing databases, maps, bibliographies, and glossaries, and coordinating international field programs and networks. Membership is through adhering national or multinational organizations or as individuals in countries where no Adhering Body exists. The IPA is governed by its Executive Committee and a Council consisting of representatives from 26 Adhering Bodies having interests in some aspect of theoretical, basic and applied frozen ground research, including permafrost, seasonal frost, artificial freezing and periglacial phenomena. This presentation details the IPA core products, achievements and activities as well as current projects in cryospheric research. One of the most important core products is the circumpolar permafrost map. The IPA also fosters and supports the activities of the Global Terrestrial Network on Permafrost (GTN-P) sponsored by the Global Terrestrial Observing System, GTOS, and the Global Climate Observing System, GCOS, whose long-term goal is to obtain a comprehensive view of the spatial structure, trends, and variability of changes in the active layer thickness and permafrost temperature. A further important initiative of the IPA is its biannual, competitively funded Action Groups, which work towards the production of well-defined products over a period of two years. Current IPA Action Groups are working on highly topical and interdisciplinary issues, such as the development of a regional Palaeo-map of Permafrost in Eurasia, the integration of multidisciplinary knowledge about the use of thermokarst and permafrost landscapes, and defining permafrost research priorities - a roadmap for the future. The latter project is a joint effort with the Climate and Cryosphere initiative (CliC) and a contribution to the upcoming International Conference on Arctic Research Planning III (ICARP III). The product stemming from the effort will consist of a journal publication listing permafrost research priorities and putting them into context. In all of these activities, the IPA emphasizes the involvement of young researchers (especially through the Permafrost Young Researchers Network and APECS) as well as its collaboration with international partner organizations such as IASC, SCAR, CliC, IACS, IUGS and WMO.

  13. Aerodynamic Analyses and Database Development for Ares I Vehicle First Stage Separation

    NASA Technical Reports Server (NTRS)

    Pamadi, Bandu N.; Pei, Jing; Pinier, Jeremy T.; Klopfer, Goetz H.; Holland, Scott D.; Covell, Peter F.

    2011-01-01

    This paper presents the aerodynamic analysis and database development for first stage separation of the Ares I A106 crew launch vehicle configuration. Separate 6-DOF databases were created for the first stage and the upper stage, and each database consists of three components: (a) isolated or freestream coefficients, (b) power-off proximity increments, and (c) power-on proximity increments. The isolated and power-off incremental databases were developed using data from 1% scaled model tests in AEDC VKF Tunnel A. The power-on proximity increments were developed using OVERFLOW CFD solutions. The database also includes incremental coefficients for one BDM failure scenario and one USM failure scenario.

  14. ECOTOX knowledgebase: New tools for data visualization and database interoperability

    EPA Science Inventory

    The ECOTOXicology knowledgebase (ECOTOX) is a comprehensive, curated database that summarizes toxicology data from single chemical exposure studies to terrestrial and aquatic organisms. The ECOTOX Knowledgebase provides risk assessors and researchers consistent information on toxi...

  15. Advanced Traffic Management Systems (ATMS) research analysis database system

    DOT National Transportation Integrated Search

    2001-06-01

    The ATMS Research Analysis Database Systems (ARADS) consists of a Traffic Software Data Dictionary (TSDD) and a Traffic Software Object Model (TSOM) for application to microscopic traffic simulation and signal optimization domains. The purpose of thi...

  16. Regional heterogeneity in limbic maturational changes: evidence from integrating cortical thickness, volumetric and diffusion tensor imaging measures.

    PubMed

    Grieve, Stuart M; Korgaonkar, Mayuresh S; Clark, C Richard; Williams, Leanne M

    2011-04-01

    Magnetic resonance imaging (MRI) studies of structural brain development have suggested that the limbic system is relatively preserved in comparison to other brain regions with healthy aging. The goal of this study was to systematically investigate age-related changes of the limbic system using measures of cortical thickness, volumetric and diffusion characteristics. We also investigated if the "relative preservation" concept is consistent across the individual sub-regions of the limbic system. T1 weighted structural MRI and Diffusion Tensor Imaging data from 476 healthy participants from the Brain Resource International Database was used for this study. Age-related changes in grey matter (GM)/white matter (WM) volume, cortical thickness, diffusional characteristics for the pericortical WM and for the fiber tracts associated with the limbic regions were quantified. A regional variability in the aging patterns across the limbic system was present. Four important patterns of age-related changes were highlighted for the limbic sub-regions: 1. early maturation of GM with late loss in the hippocampus and amygdala; 2. an extreme pattern of GM preservation in the entorhinal cortex; 3. a flat pattern of reduced GM loss in the anterior cingulate and the parahippocampus and; 4. accelerated GM loss in the isthmus and posterior cingulate. The GM volumetric data and cortical thickness measures proved to be internally consistent, while the diffusional measures provided complementary data that seem consistent with the GM trends identified. This heterogeneity can be hypothesized to be associated with age-related changes of cognitive function specialized for that region and direct connections to the other brain regions sub-serving these functions. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Psychometric analysis of the PTSD Checklist-5 (PCL-5) among treatment-seeking military service members.

    PubMed

    Wortmann, Jennifer H; Jordan, Alexander H; Weathers, Frank W; Resick, Patricia A; Dondanville, Katherine A; Hall-Clark, Brittany; Foa, Edna B; Young-McCaughan, Stacey; Yarvis, Jeffrey S; Hembree, Elizabeth A; Mintz, Jim; Peterson, Alan L; Litz, Brett T

    2016-11-01

    The Posttraumatic Stress Disorder Checklist (PCL-5; Weathers et al., 2013) was recently revised to reflect the changed diagnostic criteria for posttraumatic stress disorder (PTSD) in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5; American Psychiatric Association, 2013). We investigated the psychometric properties of PCL-5 scores in a large cohort (N = 912) of military service members seeking PTSD treatment while stationed in garrison. We examined the internal consistency, convergent and discriminant validity, and DSM-5 factor structure of PCL-5 scores, their sensitivity to clinical change relative to PTSD Symptom Scale-Interview (PSS-I; Foa, Riggs, Dancu, & Rothbaum, 1993) scores, and their diagnostic utility for predicting a PTSD diagnosis based on various measures and scoring rules. PCL-5 scores exhibited high internal consistency. There was strong agreement between the order of hypothesized and observed correlations among PCL-5 and criterion measure scores. The best-fitting structural model was a 7-factor hybrid model (Armour et al., 2015), which demonstrated closer fit than all other models evaluated, including the DSM-5 model. The PCL-5's sensitivity to clinical change, pre- to posttreatment, was comparable with that of the PSS-I. Optimally efficient cut scores for predicting PTSD diagnosis were consistent with prior research with service members (Hoge, Riviere, Wilk, Herrell, & Weathers, 2014). The results indicate that the PCL-5 is a psychometrically sound measure of DSM-5 PTSD symptoms that is useful for identifying provisional PTSD diagnostic status, quantifying PTSD symptom severity, and detecting clinical change over time in PTSD symptoms among service members seeking treatment. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
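    The internal consistency reported in this record is conventionally summarized with Cronbach's alpha. The sketch below computes alpha from a respondents-by-items score matrix; the item scores are synthetic, not PCL-5 data.

```python
# Sketch of Cronbach's alpha, the usual summary of internal consistency for a
# symptom checklist. The item scores below are synthetic, not study data.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_variances = items.var(axis=0, ddof=1).sum()
    total_score_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - sum_item_variances / total_score_variance)

rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))                                   # shared severity factor
scores = np.clip(np.round(latent + rng.normal(scale=0.7, size=(200, 20)) + 2), 0, 4)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```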

  18. A Tephra Database With an Intelligent Correlation System, Mono-Inyo Volcanic Chain, CA

    NASA Astrophysics Data System (ADS)

    Bursik, M.; Rogova, G.

    2004-12-01

    We are assembling a web-accessible, relational database of information on past eruptions of the Mono-Inyo volcanic chain, eastern California. The PostgreSQL database structure follows the North American Data Model and CordLink. The database allows us to extract the features diagnostic of particular pyroclastic layers, as well as lava domes and flows. The features include depth in the section, layer thickness and internal stratigraphy, mineral assemblage, major and trace element composition, tephra componentry and granulometry, and radiocarbon age. Our working hypotheses are that (1) the database will prove useful for unraveling the complex recent volcanic history of the Mono-Inyo chain, and (2) this will be aided by the use of an intelligent correlation system integrated into the database system. The Mono-Inyo chain consists of domes, craters and flows that stretch for 50 km north-south, subparallel to the Sierran range front fault system. Almost all eruptions within the chain probably occurred less than 50,000 years ago. Because of the variety of magma and eruption types, and the migration of source regions in time and space, it is nontrivial to discern patterns of behaviour. We have explored the use of multiple artificial neural networks combined within the framework of the Dempster-Shafer theory of evidence to construct a hybrid information processing system as an aid in the correlation of Mono-Inyo pyroclastic layers. It is hoped that such a system could provide information useful for discerning eruptive patterns that would otherwise be difficult to sort and categorize. In a test case on tephra layers at known sites, the intelligent correlation system was able to categorize observations correctly 96% of the time. In a test case with layers at one unknown site, and using a pairwise comparison of the unknown site with the known sites, a one-to-one correlation between the unknown site and the known sites was found to sometimes be poor. Such a result could be used to aid a stratigrapher in rethinking or questioning a proposed correlation. This rethinking might not happen without the input from the intelligent system.
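    The evidence-combination step described in this record rests on Dempster's rule, which fuses the mass assignments produced by individual classifiers and renormalizes away conflicting mass. The sketch below applies the rule to two hypothetical mass functions over candidate tephra layers; the layer labels and masses are illustrative only, not values from the study.

```python
# Sketch of Dempster's rule of combination for fusing two classifiers' mass
# assignments over candidate tephra layers. Labels and masses are hypothetical.
from itertools import product

def combine(m1, m2):
    """Dempster's rule over focal sets given as frozensets of layer labels."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:                          # agreeing mass goes to the intersection
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:                              # disjoint focal sets produce conflict
            conflict += wa * wb
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

L = frozenset  # shorthand for focal sets
net1 = {L({"layer_A"}): 0.6, L({"layer_A", "layer_B"}): 0.4}
net2 = {L({"layer_A"}): 0.5, L({"layer_B"}): 0.3, L({"layer_A", "layer_B"}): 0.2}
print(combine(net1, net2))   # mass concentrates on layer_A after combination
```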

  19. A Gromov-Hausdorff Framework with Diffusion Geometry for Topologically-Robust Non-Rigid Shape Matching

    DTIC Science & Technology

    2009-02-01

    topology changes. We used a subset of the TOSCA shape database , [10], consisting of four different objects: cat, dog, male, and female. Each of the...often encountered as acquisition imperfections when the shapes are acquired using a 3D scanner. We used a subset of the TOSCA shape database , consisting...object recognition, Point Based Graphics, Prague, 2007. 18 44. A. Spira and R. Kimmel, An efficient solution to the eikonal equation on parametric

  20. Protein, fat, moisture, and cooking yields from a nationwide study of retail beef cuts.

    USDA-ARS's Scientific Manuscript database

    Nutrient data from the U.S. Department of Agriculture (USDA) are an important resource for U.S. and international databases. To ensure the data for retail beef cuts in USDA’s National Nutrient Database for Standard Reference (SR) are current, a comprehensive, nationwide, multiyear study was conducte...

  1. Fatty acid, cholesterol, vitamin, and mineral content of cooked beef cuts from a national study

    USDA-ARS's Scientific Manuscript database

    The U.S. Department of Agriculture (USDA) provides foundational nutrient data for U.S. and international databases. For currency of retail beef data in USDA’s database, a nationwide comprehensive study obtained samples by primal categories using a statistically based sampling plan, resulting in 72 ...

  2. Improving the Scalability of an Exact Approach for Frequent Item Set Hiding

    ERIC Educational Resources Information Center

    LaMacchia, Carolyn

    2013-01-01

    Technological advances have led to the generation of large databases of organizational data recognized as an information-rich, strategic asset for internal analysis and sharing with trading partners. Data mining techniques can discover patterns in large databases including relationships considered strategically relevant to the owner of the data.…

  3. A cluster-based approach to selecting representative stimuli from the International Affective Picture System (IAPS) database.

    PubMed

    Constantinescu, Alexandra C; Wolters, Maria; Moore, Adam; MacPherson, Sarah E

    2017-06-01

    The International Affective Picture System (IAPS; Lang, Bradley, & Cuthbert, 2008) is a stimulus database that is frequently used to investigate various aspects of emotional processing. Despite its extensive use, selecting IAPS stimuli for a research project is not usually done according to an established strategy, but rather is tailored to individual studies. Here we propose a standard, replicable method for stimulus selection based on cluster analysis, which re-creates the group structure that is most likely to have produced the valence, arousal, and dominance norms associated with the IAPS images. Our method includes screening the database for outliers, identifying a suitable clustering solution, and then extracting the desired number of stimuli on the basis of their level of certainty of belonging to the cluster they were assigned to. Our method preserves statistical power in studies by maximizing the likelihood that the stimuli belong to the cluster structure fitted to them, and by filtering stimuli according to their certainty of cluster membership. In addition, although our cluster-based method is illustrated using the IAPS, it can be extended to other stimulus databases.
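    The selection strategy described in this record can be sketched with any mixture-based clustering that yields posterior membership probabilities. The example below uses a Gaussian mixture from scikit-learn on synthetic valence/arousal/dominance norms (not IAPS data) and keeps, per cluster, the stimuli with the highest certainty of membership.

```python
# Sketch of the cluster-then-filter selection strategy: fit a mixture model to
# valence/arousal/dominance norms and keep, per cluster, the stimuli with the
# highest posterior probability of membership. The norms here are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
norms = np.vstack([rng.normal(loc, 0.6, size=(60, 3))          # synthetic VAD norms
                   for loc in ([2.5, 5.5, 4.0], [7.0, 3.0, 6.0], [5.0, 6.5, 5.0])])

gmm = GaussianMixture(n_components=3, random_state=0).fit(norms)
labels = gmm.predict(norms)
certainty = gmm.predict_proba(norms).max(axis=1)    # certainty of cluster membership

n_per_cluster = 10
for k in range(3):
    members = np.where(labels == k)[0]
    best = members[np.argsort(certainty[members])[::-1][:n_per_cluster]]
    print(f"cluster {k}: picked stimuli indices {best.tolist()}")
```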

  4. Nuclear Reactors and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cason, D.L.; Hicks, S.C.

    1992-01-01

    This publication Nuclear Reactors and Technology (NRT) announces on a monthly basis the current worldwide information available from the open literature on nuclear reactors and technology, including all aspects of power reactors, components and accessories, fuel elements, control systems, and materials. This publication contains the abstracts of DOE reports, journal articles, conference papers, patents, theses, and monographs added to the Energy Science and Technology Database during the past month. Also included are US information obtained through acquisition programs or interagency agreements and international information obtained through the International Energy Agency's Energy Technology Data Exchange or government-to-government agreements. The digests in NRT and other citations to information on nuclear reactors back to 1948 are available for online searching and retrieval on the Energy Science and Technology Database and Nuclear Science Abstracts (NSA) database. Current information, added daily to the Energy Science and Technology Database, is available to DOE and its contractors through the DOE Integrated Technical Information System. Customized profiles can be developed to provide current information to meet each user's needs.

  5. Sports medicine clinical trial research publications in academic medical journals between 1996 and 2005: an audit of the PubMed MEDLINE database.

    PubMed

    Nichols, A W

    2008-11-01

    To identify sports medicine-related clinical trial research articles in the PubMed MEDLINE database published between 1996 and 2005 and conduct a review and analysis of topics of research, experimental designs, journals of publication and the internationality of authorships. Sports medicine research is international in scope with improving study methodology and an evolution of topics. Structured review of articles identified in a search of a large electronic medical database. PubMed MEDLINE database. Sports medicine-related clinical research trials published between 1996 and 2005. Review and analysis of articles that meet inclusion criteria. Articles were examined for study topics, research methods, experimental subject characteristics, journal of publication, lead authors and journal countries of origin and language of publication. The search retrieved 414 articles, of which 379 (345 English language and 34 non-English language) met the inclusion criteria. The number of publications increased steadily during the study period. Randomised clinical trials were the most common study type and the "diagnosis, management and treatment of sports-related injuries and conditions" was the most popular study topic. The knee, ankle/foot and shoulder were the most frequent anatomical sites of study. Soccer players and runners were the favourite study subjects. The American Journal of Sports Medicine had the highest number of publications and shared the greatest international diversity of authorships with the British Journal of Sports Medicine. The USA, Australia, Germany and the UK produced a good number of the lead authorships. In all, 91% of articles and 88% of journals were published in English. Sports medicine-related research is internationally diverse, clinical trial publications are increasing and the sophistication of research design may be improving.

  6. MIPS: a database for genomes and protein sequences.

    PubMed Central

    Mewes, H W; Heumann, K; Kaps, A; Mayer, K; Pfeiffer, F; Stocker, S; Frishman, D

    1999-01-01

    The Munich Information Center for Protein Sequences (MIPS-GSF), Martinsried near Munich, Germany, develops and maintains genome-oriented databases. The amount of available sequence data is increasing rapidly, but the capacity for qualified manual annotation at the sequence databases is not. Therefore, our strategy aims to cope with the data stream by the comprehensive application of analysis tools to sequences of complete genomes, the systematic classification of protein sequences and the active support of sequence analysis and functional genomics projects. This report describes the systematic and up-to-date analysis of genomes (PEDANT), a comprehensive database of the yeast genome (MYGD), a database reflecting the progress in sequencing the Arabidopsis thaliana genome (MATD), the database of assembled, annotated human EST clusters (MEST), and the collection of protein sequence data within the framework of the PIR-International Protein Sequence Database (described elsewhere in this volume). MIPS provides access through its WWW server (http://www.mips.biochem.mpg.de) to a spectrum of generic databases, including the above mentioned as well as a database of protein families (PROTFAM), the MITOP database, and the all-against-all FASTA database. PMID:9847138

  7. Strengthening primary health care in low- and middle-income countries: generating evidence through evaluation.

    PubMed

    Rule, John; Ngo, Duc Anh; Oanh, Tran Thi Mai; Asante, Augustine; Doyle, Jennifer; Roberts, Graham; Taylor, Richard

    2014-07-01

    Since the publication of the World Health Report 2008, there has been renewed interest in the potential of primary health care (PHC) to deliver global health policy agendas. The WHO Western Pacific Regional Strategy 2010 states that health systems in low- and middle-income countries (LMICs) can be strengthened using PHC values as core principles. This review article explores the development of an evidence-based approach for assessing the effectiveness of PHC programs and interventions in LMICs. A realist review method was used to investigate whether there is any internationally consistent approach to evaluating PHC. Studies from LMICs using an explicit methodology or framework for measuring PHC effectiveness were collated. Databases of published articles were searched, and a review of gray literature was undertaken to identify relevant reports. The review found no consistent approach for assessing the effectiveness of PHC interventions in LMICs. An innovative approach used in China, which developed a set of core community health facility indicators based on stakeholder input, does show some potential for use in other LMIC contexts. © 2013 APJPH.

  8. A comprehensive clinical assessment tool to inform policy and practice: applications of the minimum data set.

    PubMed

    Mor, Vincent

    2004-04-01

    The Minimum Data Set (MDS) for nursing home (NH) resident assessment, designed to assess elders' functional status and care needs, exemplifies how the information needs of clinical practice are congruent with those of research. Building on a review of the published literature, this article describes the development of the MDS, its reliability and validity testing, as well as the variety of different policy and research uses to which it has been applied. Interrater reliability of items and internal consistency of MDS summary scales are generally good to excellent. Validation studies reveal good correspondence to research quality instruments for cognition, activities of daily living, and diagnoses, with more variable results for vision, pain, mood, and behavior scales. To date, no consistent evidence suggests that applications of MDS data for case-mix reimbursement and quality indicator monitoring systematically bias the data. Although facility variation in data quality could compromise some applications, creation of the MDS as a clinical tool for care planning provides an example of how assessment tools with clinical use can be used in administrative databases for research and policy applications.

  9. State Analysis Database Tool

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Bennett, Matthew

    2006-01-01

    The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
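
    A minimal sketch of the kind of record the description above implies (states as momentary conditions, models that say which other states affect them, and a consistency check between the two) is shown below. The class names, fields, and example state variables are hypothetical and are not taken from the actual State Analysis Database schema.

```python
from dataclasses import dataclass, field

@dataclass
class StateVariable:
    """A state: one momentary condition of the evolving system (hypothetical schema)."""
    name: str
    description: str

@dataclass
class Model:
    """A model: describes how one state evolves and which other states affect it."""
    state: str                                        # the state this model describes
    affected_by: list = field(default_factory=list)   # names of influencing states
    description: str = ""

# A toy in-memory stand-in for the State Analysis Database.
states = {s.name: s for s in [
    StateVariable("battery_soc", "Battery state of charge"),
    StateVariable("heater_power", "Heater power draw"),
]}
models = [Model("battery_soc", affected_by=["heater_power"],
                description="SOC decreases with total power draw")]

# Requirements and operations plans would reference these same records, which is how
# the methodology keeps them consistent with the model descriptions.
for m in models:
    assert m.state in states and all(a in states for a in m.affected_by)
print("models consistent with state definitions")
```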

  10. Validation of chronic obstructive pulmonary disease (COPD) diagnoses in healthcare databases: a systematic review protocol.

    PubMed

    Rimland, Joseph M; Abraha, Iosief; Luchetta, Maria Laura; Cozzolino, Francesco; Orso, Massimiliano; Cherubini, Antonio; Dell'Aquila, Giuseppina; Chiatti, Carlos; Ambrosio, Giuseppe; Montedori, Alessandro

    2016-06-01

    Healthcare databases are useful sources to investigate the epidemiology of chronic obstructive pulmonary disease (COPD), to assess longitudinal outcomes in patients with COPD, and to develop disease management strategies. However, in order to constitute a reliable source for research, healthcare databases need to be validated. The aim of this protocol is to perform the first systematic review of studies reporting the validation of codes related to COPD diagnoses in healthcare databases. MEDLINE, EMBASE, Web of Science and the Cochrane Library databases will be searched using appropriate search strategies. Studies that evaluated the validity of COPD codes (such as the International Classification of Diseases 9th Revision and 10th Revision system; the READ Codes system or the International Classification of Primary Care) in healthcare databases will be included. Inclusion criteria will be: (1) the presence of a reference standard case definition for COPD; (2) the presence of at least one test measure (eg, sensitivity, positive predictive values, etc); and (3) the use of a healthcare database (including administrative claims databases, electronic healthcare databases or COPD registries) as a data source. Pairs of reviewers will independently abstract data using standardised forms and will assess quality using a checklist based on the Standards for Reporting of Diagnostic accuracy (STARD) criteria. This systematic review protocol has been produced in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocol (PRISMA-P) 2015 statement. Ethics approval is not required. Results of this study will be submitted to a peer-reviewed journal for publication. The results from this systematic review will be used for outcome research on COPD and will serve as a guide to identify appropriate case definitions of COPD, and reference standards, for researchers involved in validating healthcare databases. CRD42015029204. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  11. Seventy Years of RN Effectiveness: A Database Development Project to Inform Best Practice.

    PubMed

    Lulat, Zainab; Blain-McLeod, Julie; Grinspun, Doris; Penney, Tasha; Harripaul-Yhap, Anastasia; Rey, Michelle

    2018-03-23

    The appropriate nursing staff mix is imperative to the provision of quality care. Nurse staffing levels and staff mix vary from country to country, as well as between care settings. Understanding how staffing skill mix impacts patient, organizational, and financial outcomes is critical in order to allow policymakers and clinicians to make evidence-informed staffing decisions. This paper reports on the methodology for creation of an electronic database of studies exploring the effectiveness of Registered Nurses (RNs) on clinical and patient outcomes, organizational and nurse outcomes, and financial outcomes. Comprehensive literature searches were conducted in four electronic databases. Inclusion criteria for the database included studies published from 1946 to 2016, peer-reviewed international literature, and studies focused on RNs in all health-care disciplines, settings, and sectors. Masters-prepared nurse researchers conducted title and abstract screening and relevance review to determine eligibility of studies for the database. High-level analysis was conducted to determine key outcomes and the frequency at which they appeared within the database. Of the initial 90,352 records, a total of 626 abstracts were included within the database. Studies were organized into three groups corresponding to clinical and patient outcomes, organizational and nurse-related outcomes, and financial outcomes. Organizational and nurse-related outcomes represented the largest category in the database with 282 studies, followed by clinical and patient outcomes with 244 studies, and lastly financial outcomes, which included 124 studies. The comprehensive database of evidence for RN effectiveness is freely available at https://rnao.ca/bpg/initiatives/RNEffectiveness. The database will serve as a resource for the Registered Nurses' Association of Ontario, as well as a tool for researchers, clinicians, and policymakers for making evidence-informed staffing decisions. © 2018 The Authors. Worldviews on Evidence-Based Nursing published by Wiley Periodicals, Inc. on behalf of Sigma Theta Tau International The Honor Society of Nursing.

  12. National briefing summaries: Nuclear fuel cycle and waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, K.J.; Bradley, D.J.; Fletcher, J.F.

    Since 1976, the International Program Support Office (IPSO) at the Pacific Northwest Laboratory (PNL) has collected and compiled publicly available information concerning foreign and international radioactive waste management programs. This National Briefing Summaries is a printout of an electronic database that has been compiled and is maintained by the IPSO staff. The database contains current information concerning the radioactive waste management programs (with supporting information on nuclear power and the nuclear fuel cycle) of most of the nations (except eastern European countries) that now have or are contemplating nuclear power, and of the multinational agencies that are active in radioactive waste management. Information in this document is included for three additional countries (China, Mexico, and USSR) compared to the prior issue. The database and this document were developed in response to needs of the US Department of Energy.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kogalovskii, M.R.

    This paper presents a review of problems related to statistical database systems, which are widespread in various fields of activity. Statistical databases (SDBs) are databases that contain data used for statistical analysis. Topics under consideration are: SDB peculiarities, properties of data models adequate for SDB requirements, metadata functions, null-value problems, SDB compromise protection problems, stored data compression techniques, and statistical data representation means. Also examined is whether present Database Management Systems (DBMS) satisfy the SDB requirements. Some current research directions in SDB systems are considered.

  14. Multidimensional classification of magma types for altered igneous rocks and application to their tectonomagmatic discrimination and igneous provenance of siliciclastic sediments

    NASA Astrophysics Data System (ADS)

    Verma, Surendra P.; Rivera-Gómez, M. Abdelaly; Díaz-González, Lorena; Pandarinath, Kailasa; Amezcua-Valdez, Alejandra; Rosales-Rivera, Mauricio; Verma, Sanjeet K.; Quiroz-Ruiz, Alfredo; Armstrong-Altrin, John S.

    2017-05-01

    A new multidimensional scheme consistent with the International Union of Geological Sciences (IUGS) is proposed for the classification of igneous rocks in terms of four magma types: ultrabasic, basic, intermediate, and acid. Our procedure is based on an extensive database of major-element compositions of a total of 33,868 relatively fresh rock samples having a multinormal distribution (initial database with 37,215 samples). The multinormal distribution of the database, in terms of log-ratios of the samples, was ascertained with a new computer program, DOMuDaF, in which the discordancy test was applied at the 99.9% confidence level. Isometric log-ratio (ilr) transformation was used to provide overall percent correct classification of 88.7%, 75.8%, 88.0%, and 80.9% for ultrabasic, basic, intermediate, and acid rocks, respectively. Given the known mathematical and uncertainty propagation properties, this transformation could be adopted for routine applications. The incorrect classification was mainly for the "neighbour" magma types, e.g., basic for ultrabasic and vice versa. Some of these misclassifications do not have any effect on multidimensional tectonic discrimination. For an efficient application of this multidimensional scheme, a new computer program MagClaMSys_ilr (MagClaMSys-Magma Classification Major-element based System) was written, which is available for on-line processing on http://tlaloc.ier.unam.mx/index.html. This classification scheme was tested with newly compiled data for relatively fresh Neogene igneous rocks and was found to be consistent with the conventional IUGS procedure. The new scheme was successfully applied to inter-laboratory data for three geochemical reference materials (basalts JB-1 and JB-1a, and andesite JA-3) from Japan and showed that the inferred magma types are consistent with the rock name (basic for basalts JB-1 and JB-1a and intermediate for andesite JA-3). The scheme was also successfully applied to five case studies of older Archaean to Mesozoic igneous rocks. Similar or more reliable results were obtained from existing tectonomagmatic discrimination diagrams when used in conjunction with the new computer program as compared to the IUGS scheme. The application to three case studies of igneous provenance of sedimentary rocks was demonstrated as a novel approach. Finally, we show that the new scheme is more robust against post-emplacement compositional changes than the conventional IUGS procedure.
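
    For readers unfamiliar with the isometric log-ratio (ilr) transformation used above, the sketch below computes standard pivot ilr coordinates for a single hypothetical major-element composition. The basis shown (Egozcue-style pivot coordinates) is one common choice; the MagClaMSys_ilr program may use a different, rotation-equivalent basis, and the oxide values are invented for illustration.

```python
import numpy as np

def ilr(x):
    """Isometric log-ratio transform of a composition x (strictly positive parts).

    Uses pivot coordinates: z_i = sqrt(i/(i+1)) * ln(g(x_1..x_i) / x_{i+1}),
    i = 1..D-1, where g() is the geometric mean. The result is invariant to the
    closure constant, so closing to 100% beforehand is optional.
    """
    x = np.asarray(x, dtype=float)
    if np.any(x <= 0):
        raise ValueError("ilr requires strictly positive parts")
    logx = np.log(x)
    d = len(x)
    z = np.empty(d - 1)
    for i in range(1, d):
        gm_log = logx[:i].mean()                  # log geometric mean of the first i parts
        z[i - 1] = np.sqrt(i / (i + 1)) * (gm_log - logx[i])
    return z

# Hypothetical major-element composition (wt%), closed to 1 before transforming.
oxides = np.array([49.2, 1.8, 15.3, 10.1, 7.6, 11.0, 2.7, 0.5])
print(ilr(oxides / oxides.sum()))
```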

  15. Solubility Database

    National Institute of Standards and Technology Data Gateway

    SRD 106 IUPAC-NIST Solubility Database (Web, free access)   These solubilities are compiled from 18 volumes of the International Union of Pure and Applied Chemistry (IUPAC)-NIST Solubility Data Series. The database includes liquid-liquid, solid-liquid, and gas-liquid systems. Typical solvents and solutes include water, seawater, heavy water, inorganic compounds, and a variety of organic compounds such as hydrocarbons, halogenated hydrocarbons, alcohols, acids, esters and nitrogen compounds. There are over 67,500 solubility measurements and over 1800 references.

  16. "XANSONS for COD": a new small BOINC project in crystallography

    NASA Astrophysics Data System (ADS)

    Neverov, Vladislav S.; Khrapov, Nikolay P.

    2018-04-01

    "XANSONS for COD" (http://xansons4cod.com) is a new BOINC project aimed at creating the open-access database of simulated x-ray and neutron powder diffraction patterns for nanocrystalline phase of materials from the collection of the Crystallography Open Database (COD). The project uses original open-source software XaNSoNS to simulate diffraction patterns on CPU and GPU. This paper describes the scientific problem this project solves, the project's internal structure, its operation principles and organization of the final database.

  17. A 5.8S nuclear ribosomal RNA gene sequence database: applications to ecology and evolution

    NASA Technical Reports Server (NTRS)

    Cullings, K. W.; Vogler, D. R.

    1998-01-01

    We compiled a 5.8S nuclear ribosomal gene sequence database for animals, plants, and fungi using both newly generated and GenBank sequences. We demonstrate the utility of this database as an internal check to determine whether the target organism and not a contaminant has been sequenced, as a diagnostic tool for ecologists and evolutionary biologists to determine the placement of asexual fungi within larger taxonomic groups, and as a tool to help identify fungi that form ectomycorrhizae.

  18. Application of new type of distributed multimedia databases to networked electronic museum

    NASA Astrophysics Data System (ADS)

    Kuroda, Kazuhide; Komatsu, Naohisa; Komiya, Kazumi; Ikeda, Hiroaki

    1999-01-01

    Recently, various kinds of multimedia application systems have actively been developed based on advanced high-speed communication networks, computer processing technologies, and digital content-handling technologies. Against this background, this paper proposes a new distributed multimedia database system which can effectively perform a new function of cooperative retrieval among distributed databases. The proposed system introduces a new concept of a 'Retrieval manager' which functions as an intelligent controller so that the user can recognize a set of distributed databases as one logical database. The logical database dynamically generates and performs a preferred combination of retrieval parameters on the basis of both directory data and the system environment. Moreover, a concept of a 'domain' is defined in the system as a managing unit of retrieval. Retrieval can be performed effectively by cooperative processing among multiple domains. A communication language and protocols are also defined in the system; these are used in every communication action in the system. A language interpreter in each machine translates the communication language into the internal language used by that machine. Because of the language interpreter, internal modules such as the DBMS and user interface modules can be selected freely. A concept of a 'content-set' is also introduced. A content-set is defined as a package of contents. Contents in a content-set are related to each other, and the system handles a content-set as one object. The user terminal can effectively control the display of retrieved contents by referring to data indicating the relations among the contents in the content-set. In order to verify the function of the proposed system, a networked electronic museum was experimentally built. The results of this experiment indicate that the proposed system can effectively retrieve the target contents under the control of a number of distributed domains. The results also indicate that the system can work effectively even as the system becomes large.
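
    The 'Retrieval manager' and 'domain' concepts described above amount to federating one query over several content collections and merging the hits so that the user sees a single logical database. The toy sketch below illustrates only that idea, in-process; the class and domain names, record format, and keyword matching are hypothetical simplifications, and the real system would dispatch queries over the network using its defined communication language.

```python
from dataclasses import dataclass

@dataclass
class Domain:
    """One managing unit of retrieval: a named collection of content records.
    (Hypothetical stand-in for a domain's database; real domains would be remote.)"""
    name: str
    records: list  # list of (content_id, keyword-set) tuples

class RetrievalManager:
    """Lets the user treat several distributed domains as one logical database."""
    def __init__(self, domains):
        self.domains = domains

    def search(self, keyword):
        # Dispatch the query to every domain and merge the results,
        # tagging each hit with the domain it came from.
        hits = []
        for dom in self.domains:
            hits.extend((dom.name, cid) for cid, kws in dom.records if keyword in kws)
        return hits

# Hypothetical museum domains and one cooperative retrieval across them.
manager = RetrievalManager([
    Domain("paintings", [("P-001", {"ukiyo-e", "edo"}), ("P-002", {"oil", "meiji"})]),
    Domain("ceramics", [("C-101", {"edo", "imari"})]),
])
print(manager.search("edo"))
```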

  19. ECOTOX Knowledgebase: New tools for data visualization and database interoperability -Poster

    EPA Science Inventory

    The ECOTOXicology knowledgebase (ECOTOX) is a comprehensive, curated database that summarizes toxicology data from single chemical exposure studies to terrestrial and aquatic organisms. The ECOTOX Knowledgebase provides risk assessors and researchers consistent information on tox...

  20. ECOTOX Knowledgebase: New tools for data visualization and database interoperability (poster)

    EPA Science Inventory

    The ECOTOXicology knowledgebase (ECOTOX) is a comprehensive, curated database that summarizes toxicology data from single chemical exposure studies to terrestrial and aquatic organisms. The ECOTOX Knowledgebase provides risk assessors and researchers consistent information on tox...

  1. Content Based Image Matching for Planetary Science

    NASA Astrophysics Data System (ADS)

    Deans, M. C.; Meyer, C.

    2006-12-01

    Planetary missions generate large volumes of data. With the MER rovers still functioning on Mars, PDS contains over 7200 released images from the Microscopic Imagers alone. These data products are only searchable by keys such as the Sol, spacecraft clock, or rover motion counter index, with little connection to the semantic content of the images. We have developed a method for matching images based on the visual textures in images. For every image in a database, a series of filters compute the image response to localized frequencies and orientations. Filter responses are turned into a low dimensional descriptor vector, generating a 37 dimensional fingerprint. For images such as the MER MI, this represents a compression ratio of 99.9965% (the fingerprint is approximately 0.0035% the size of the original image). At query time, fingerprints are quickly matched to find images with similar appearance. Image databases containing several thousand images are preprocessed offline in a matter of hours. Image matches from the database are found in a matter of seconds. We have demonstrated this image matching technique using three sources of data. The first database consists of 7200 images from the MER Microscopic Imager. The second database consists of 3500 images from the Narrow Angle Mars Orbital Camera (MOC-NA), which were cropped into 1024×1024 sub-images for consistency. The third database consists of 7500 scanned archival photos from the Apollo Metric Camera. Example query results from all three data sources are shown. We have also carried out user tests to evaluate matching performance by hand labeling results. User tests verify approximately 20% false positive rate for the top 14 results for MOC NA and MER MI data. This means typically 10 to 12 results out of 14 match the query image sufficiently. This represents a powerful search tool for databases of thousands of images where the a priori match probability for an image might be less than 1%. Qualitatively, correct matches can also be confirmed by verifying MI images taken in the same z-stack, or MOC image tiles taken from the same image strip. False negatives are difficult to quantify as it would mean finding matches in the database of thousands of images that the algorithm did not detect.
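
    The fingerprinting idea described above (a bank of filters tuned to localized frequencies and orientations, reduced to a short descriptor vector, then matched by distance at query time) can be sketched with a small Gabor filter bank. The example below assumes scikit-image is available; the filter frequencies and orientations, the 24-dimensional descriptor it produces, and the random stand-in images are illustrative choices, not the 37-dimensional descriptor or filters used by the authors.

```python
import numpy as np
from skimage.filters import gabor

def texture_fingerprint(image,
                        frequencies=(0.1, 0.2, 0.3),
                        thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Hypothetical texture fingerprint: mean and std of Gabor response magnitude
    over a small bank of frequencies and orientations."""
    feats = []
    for f in frequencies:
        for theta in thetas:
            real, imag = gabor(image, frequency=f, theta=theta)
            mag = np.hypot(real, imag)
            feats.extend([mag.mean(), mag.std()])
    return np.array(feats)

def query(db_fingerprints, q, k=5):
    """Return indices of the k database images whose fingerprints are closest to q."""
    d = np.linalg.norm(db_fingerprints - q, axis=1)
    return np.argsort(d)[:k]

# Toy usage with random "images"; real use would precompute fingerprints offline.
rng = np.random.default_rng(0)
db = np.stack([texture_fingerprint(rng.random((64, 64))) for _ in range(20)])
print(query(db, texture_fingerprint(rng.random((64, 64)))))
```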

  2. Virus taxonomy: the database of the International Committee on Taxonomy of Viruses (ICTV)

    PubMed Central

    Dempsey, Donald M; Hendrickson, Robert Curtis; Orton, Richard J; Siddell, Stuart G; Smith, Donald B

    2018-01-01

    Abstract The International Committee on Taxonomy of Viruses (ICTV) is charged with the task of developing, refining, and maintaining a universal virus taxonomy. This task encompasses the classification of virus species and higher-level taxa according to the genetic and biological properties of their members; naming virus taxa; maintaining a database detailing the currently approved taxonomy; and providing the database, supporting proposals, and other virus-related information from an open-access, public web site. The ICTV web site (http://ictv.global) provides access to the current taxonomy database in online and downloadable formats, and maintains a complete history of virus taxa back to the first release in 1971. The ICTV has also published the ICTV Report on Virus Taxonomy starting in 1971. This Report provides a comprehensive description of all virus taxa covering virus structure, genome structure, biology and phylogenetics. The ninth ICTV report, published in 2012, is available as an open-access online publication from the ICTV web site. The current, 10th report (http://ictv.global/report/), is being published online, and is replacing the previous hard-copy edition with a completely open access, continuously updated publication. No other database or resource exists that provides such a comprehensive, fully annotated compendium of information on virus taxa and taxonomy. PMID:29040670

  3. IMGT, the International ImMunoGeneTics database.

    PubMed Central

    Lefranc, M P; Giudicelli, V; Busin, C; Bodmer, J; Müller, W; Bontrop, R; Lemaitre, M; Malik, A; Chaume, D

    1998-01-01

    IMGT, the international ImMunoGeneTics database, is an integrated database specialising in Immunoglobulins (Ig), T cell Receptors (TcR) and Major Histocompatibility Complex (MHC) of all vertebrate species, created by Marie-Paule Lefranc, CNRS, Montpellier II University, Montpellier, France (lefranc@ligm.crbm.cnrs-mop.fr). IMGT includes three databases: LIGM-DB (for Ig and TcR), MHC/HLA-DB and PRIMER-DB (the last two in development). IMGT comprises expertly annotated sequences and alignment tables. LIGM-DB contains more than 23 000 Immunoglobulin and T cell Receptor sequences from 78 species. MHC/HLA-DB contains Class I and Class II Human Leucocyte Antigen alignment tables. An IMGT tool, DNAPLOT, developed for Ig, TcR and MHC sequence alignments, is also available. IMGT works in close collaboration with the EMBL database. IMGT goals are to establish a common data access to all immunogenetics data, including nucleotide and protein sequences, oligonucleotide primers, gene maps and other genetic data of Ig, TcR and MHC molecules, and to provide a graphical user friendly data access. IMGT has important implications in medical research (repertoire in autoimmune diseases, AIDS, leukemias, lymphomas), therapeutical approaches (antibody engineering), genome diversity and genome evolution studies. IMGT is freely available at http://imgt.cnusc.fr:8104 PMID:9399859

  4. Incidence and risk factors for surgical site infection after open reduction and internal fixation of tibial plateau fracture: A systematic review and meta-analysis.

    PubMed

    Shao, Jiashen; Chang, Hengrui; Zhu, Yanbin; Chen, Wei; Zheng, Zhanle; Zhang, Huixin; Zhang, Yingze

    2017-05-01

    This study aimed to quantitatively summarize the risk factors associated with surgical site infection after open reduction and internal fixation of tibial plateau fracture. Medline, Embase, CNKI, Wanfang database and Cochrane central database were searched for relevant original studies from database inception to October 2016. Eligible studies had to meet quality assessment criteria according to the Newcastle-Ottawa Scale, and had to evaluate the risk factors for surgical site infection after open reduction and internal fixation of tibial plateau fracture. Stata 11.0 software was used for this meta-analysis. Eight studies involving 2214 cases of tibial plateau fracture treated by open reduction and internal fixation and 219 cases of surgical site infection were included in this meta-analysis. The following parameters were identified as significant risk factors for surgical site infection after open reduction and internal fixation of tibial plateau fracture (p < 0.05): open fracture (OR 3.78; 95% CI 2.71-5.27), compartment syndrome (OR 3.53; 95% CI 2.13-5.86), operative time (OR 2.15; 95% CI 1.53-3.02), tobacco use (OR 2.13; 95% CI 1.13-3.99), and external fixation (OR 2.07; 95% CI 1.05-4.09). Other factors, including male sex, were not identified as risk factors for surgical site infection. Patients with the abovementioned medical conditions are at risk of surgical site infection after open reduction and internal fixation of tibial plateau fracture. Surgeons should be cognizant of these risks and give relevant preoperative advice. Copyright © 2017. Published by Elsevier Ltd.
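
    As a reminder of how the effect sizes above are computed at the single-study level, the sketch below derives an odds ratio and Wald 95% confidence interval from one hypothetical 2x2 table. Pooling across studies, as done in the meta-analysis with Stata, additionally requires a fixed- or random-effects model and is not shown; the counts are invented.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with infection, b = exposed without,
    c = unexposed with infection, d = unexposed without.
    (Single-study calculation; pooling across studies needs a meta-analysis model.)"""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, (lo, hi)

# Hypothetical counts for one study: open fracture vs surgical site infection.
print(odds_ratio_ci(a=20, b=80, c=30, d=470))
```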

  5. Seismic Search Engine: A distributed database for mining large scale seismic data

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Vaidya, S.; Kuzma, H. A.

    2009-12-01

    The International Monitoring System (IMS) of the CTBTO collects terabytes worth of seismic measurements from many receiver stations situated around the earth with the goal of detecting underground nuclear testing events and distinguishing them from other benign, but more common events such as earthquakes and mine blasts. The International Data Center (IDC) processes and analyzes these measurements, as they are collected by the IMS, to summarize event detections in daily bulletins. Thereafter, the data measurements are archived into a large format database. Our proposed Seismic Search Engine (SSE) will facilitate a framework for data exploration of the seismic database as well as the development of seismic data mining algorithms. Analogous to GenBank, the annotated genetic sequence database maintained by NIH, through SSE, we intend to provide public access to seismic data and a set of processing and analysis tools, along with community-generated annotations and statistical models to help interpret the data. SSE will implement queries as user-defined functions composed from standard tools and models. Each query is compiled and executed over the database internally before reporting results back to the user. Since queries are expressed with standard tools and models, users can easily reproduce published results within this framework for peer-review and making metric comparisons. As an illustration, an example query is “what are the best receiver stations in East Asia for detecting events in the Middle East?” Evaluating this query involves listing all receiver stations in East Asia, characterizing known seismic events in that region, and constructing a profile for each receiver station to determine how effective its measurements are at predicting each event. The results of this query can be used to help prioritize how data is collected, identify defective instruments, and guide future sensor placements.
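
    The example query quoted above ("what are the best receiver stations in East Asia for detecting events in the Middle East?") can be sketched as a user-defined function composed over two tables. The pandas sketch below uses invented table layouts, column names, station codes, and detection flags purely to illustrate how such a query might be expressed; it is not the SSE API.

```python
import pandas as pd

# Hypothetical tables standing in for the archive (all column names invented).
stations = pd.DataFrame({"sta": ["MJAR", "USRK", "KSRS"],
                         "region": ["East Asia", "East Asia", "East Asia"]})
detections = pd.DataFrame({"sta": ["MJAR", "MJAR", "USRK", "KSRS", "USRK"],
                           "event_region": ["Middle East"] * 5,
                           "detected": [1, 0, 1, 1, 0]})

def best_stations(stations, detections, sta_region, event_region, k=2):
    """User-defined query: rank stations in `sta_region` by their detection
    rate for events in `event_region` (a sketch of the SSE query idea)."""
    stas = stations.loc[stations.region == sta_region, "sta"]
    hits = detections[detections.sta.isin(stas) &
                      (detections.event_region == event_region)]
    return (hits.groupby("sta")["detected"].mean()
                .sort_values(ascending=False).head(k))

print(best_stations(stations, detections, "East Asia", "Middle East"))
```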

  6. Attrition and success rates of accelerated students in nursing courses: a systematic review.

    PubMed

    Doggrell, Sheila Anne; Schaffer, Sally

    2016-01-01

    There is a comprehensive literature on the academic outcomes (attrition and success) of students in traditional/baccalaureate nursing programs, but much less is known about the academic outcomes of students in accelerated nursing programs. The aim of this systematic review is to report on the attrition and success rates (either internal examination or NCLEX-RN) of accelerated students, compared to traditional students. For the systematic review, the databases (Pubmed, Cinahl and PsychINFO) and Google Scholar were searched using the search terms 'accelerated' or 'accreditation for prior learning', 'fast-track' or 'top up' and 'nursing' with 'attrition' or 'retention' or 'withdrawal' or 'success' from 1994 to January 2016. All relevant articles were included, regardless of quality. The findings of 19 studies of attrition rates and/or success rates for accelerated students are reported. For international accelerated students, there were only three studies, which are heterogeneous, and have major limitations. One of three studies has lower attrition rates, and one has shown higher success rates, than traditional students. In contrast, another study has shown high attrition and low success for international accelerated students. For graduate accelerated students, most of the studies are high quality, and showed that they have rates similar or better than traditional students. Thus, five of six studies have shown similar or lower attrition rates. Four of these studies with graduate accelerated students and an additional seven studies of success rates only, have shown similar or better success rates, than traditional students. There are only three studies of non-university graduate accelerated students, and these had weaknesses, but were consistent in reporting higher attrition rates than traditional students. The paucity and weakness of information available makes it unclear as to the attrition and/or success of international accelerated students in nursing programs. The good information available suggests that accelerated programs may be working reasonably well for the graduate students. However, the limited information available for non-university graduate students is weak, but consistent, in suggesting they may struggle in accelerated courses. Further studies are needed to determine the attrition and success rates of accelerated students, particularly for international and non-university graduate students.

  7. Update in Outpatient General Internal Medicine: Practice-Changing Evidence Published in 2015.

    PubMed

    Szostek, Jason H; Wieland, Mark L; Post, Jason A; Sundsted, Karna K; Mauck, Karen F

    2016-08-01

    Identifying new practice-changing articles is challenging. To determine the 2015 practice-changing articles most relevant to outpatient general internal medicine, 3 internists independently reviewed the titles and abstracts of original articles, synopses of single studies and syntheses, and databases of syntheses. For original articles, internal medicine journals with the 7 highest impact factors were reviewed: New England Journal of Medicine, Lancet, Journal of the American Medical Association (JAMA), British Medical Journal, Public Library of Science Medicine, Annals of Internal Medicine, and JAMA Internal Medicine. For synopses of single studies and syntheses, collections in American College of Physicians Journal Club, Journal Watch, and Evidence-Based Medicine were reviewed. For databases of synthesis, Evidence Updates and the Cochrane Library were reviewed. More than 100 articles were identified. Criteria for inclusion were as follows: clinical relevance, potential for practice change, and strength of evidence. Clusters of important articles around one topic were considered as a single-candidate series. The 5 authors used a modified Delphi method to reach consensus on inclusion of 7 topics for in-depth appraisal. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Brief Report: Databases in the Asia-Pacific Region: The Potential for a Distributed Network Approach.

    PubMed

    Lai, Edward Chia-Cheng; Man, Kenneth K C; Chaiyakunapruk, Nathorn; Cheng, Ching-Lan; Chien, Hsu-Chih; Chui, Celine S L; Dilokthornsakul, Piyameth; Hardy, N Chantelle; Hsieh, Cheng-Yang; Hsu, Chung Y; Kubota, Kiyoshi; Lin, Tzu-Chieh; Liu, Yanfang; Park, Byung Joo; Pratt, Nicole; Roughead, Elizabeth E; Shin, Ju-Young; Watcharathanakij, Sawaeng; Wen, Jin; Wong, Ian C K; Yang, Yea-Huei Kao; Zhang, Yinghong; Setoguchi, Soko

    2015-11-01

    This study describes the availability and characteristics of databases in Asian-Pacific countries and assesses the feasibility of a distributed network approach in the region. A web-based survey was conducted among investigators using healthcare databases in the Asia-Pacific countries. Potential survey participants were identified through the Asian Pharmacoepidemiology Network. Investigators from a total of 11 databases participated in the survey. Database sources included four nationwide claims databases from Japan, South Korea, and Taiwan; two nationwide electronic health records from Hong Kong and Singapore; a regional electronic health record from western China; two electronic health records from Thailand; and cancer and stroke registries from Taiwan. We identified 11 databases with capabilities for distributed network approaches. Many country-specific coding systems and terminologies have been already converted to international coding systems. The harmonization of health expenditure data is a major obstacle for future investigations attempting to evaluate issues related to medical costs.

  9. Antidepressants for depressive disorder in children and adolescents: a database of randomised controlled trials.

    PubMed

    Zhang, Yuqing; Zhou, Xinyu; Pu, Juncai; Zhang, Hanping; Yang, Lining; Liu, Lanxiang; Zhou, Chanjuan; Yuan, Shuai; Jiang, Xiaofeng; Xie, Peng

    2018-05-31

    In recent years, whether, when and how to use antidepressants to treat depressive disorder in children and adolescents has been hotly debated. Relevant evidence on this topic has increased rapidly. In this paper, we present the construction and content of a database of randomised controlled trials of antidepressants to treat depressive disorder in children and adolescents. This database can be freely accessed via our website and will be regularly updated. Major bibliographic databases (PubMed, the Cochrane Library, Web of Science, Embase, CINAHL, PsycINFO and LiLACS), international trial registers and regulatory agencies' websites were systematically searched for published and unpublished studies up to April 30, 2017. We included randomised controlled trials in which the efficacy or tolerability of any oral antidepressant was compared with that of a control group or any other treatment. In total, 7377 citations from bibliographical databases and 3289 from international trial registers and regulatory agencies' websites were identified. Of these, 53 trials were eligible for inclusion in the final database. Selected data were extracted from each study, including characteristics of the participants (the study population, setting, diagnostic criteria, type of depression, age, sex, and comorbidity), characteristics of the treatment conditions (the treatment conditions, general information, and detail of pharmacotherapy and psychotherapy) and study characteristics (the sponsor, country, number of sites, blinding method, sample size, treatment duration, depression scales, other scales, and primary outcome measure used, and side-effect monitoring method). Moreover, the risk of bias for each trial was assessed. This database provides information on nearly all randomised controlled trials of antidepressants in children and adolescents. By using this database, researchers can improve research efficiency, avoid inadvertent errors and easily focus on the targeted subgroups in which they are interested. Authors of subsequent reviews should use this database only to check that they have completed a comprehensive search, rather than relying solely on the data it contains. We expect this database could help to promote research on evidence-based practice in the treatment of depressive disorder in children and adolescents. The database can be freely accessed on our website: http://xiepengteam.cn/research/evidence-based-medicine .

  10. The Alaska Volcano Observatory Website a Tool for Information Management and Dissemination

    NASA Astrophysics Data System (ADS)

    Snedigar, S. F.; Cameron, C. E.; Nye, C. J.

    2006-12-01

    The Alaska Volcano Observatory's (AVO's) website served as a primary information management tool during the 2006 eruption of Augustine Volcano. The AVO website is dynamically generated from a database back-end. This system enabled AVO to quickly and easily update the website, and provide content based on user-queries to the database. During the Augustine eruption, the new AVO website was heavily used by members of the public (up to 19 million hits per day), and this was largely because the AVO public pages were an excellent source of up-to-date information. There are two different, yet fully integrated parts of the website. An external, public site (www.avo.alaska.edu) allows the general public to track eruptive activity by viewing the latest photographs, webcam images, webicorder graphs, and official information releases about activity at the volcano, as well as maps, previous eruption information, bibliographies, and rich information about other Alaska volcanoes. The internal half of the website hosts diverse geophysical and geological data (as browse images) in a format equally accessible by AVO staff in different locations. In addition, an observation log allows users to enter information about anything from satellite passes to seismic activity to ash fall reports into a searchable database. The individual(s) on duty at the watch office use forms on the internal website to post a summary of the latest activity directly to the public website, ensuring that the public website is always up to date. The internal website also serves as a starting point for monitoring Alaska's volcanoes. AVO's extensive image database allows AVO personnel to upload many photos, diagrams, and videos which are then available to be browsed by anyone in the AVO community. Selected images are viewable from the public page. The primary webserver is housed at the University of Alaska Fairbanks, and holds a MySQL database with over 200 tables and several thousand lines of php code gluing the database and website together. The database currently holds 95 GB of data. Webcam images and webicorder graphs are pulled from servers in Anchorage every few minutes. Other servers in Fairbanks generate earthquake location plots and spectrograms.

  11. Publication rates of public health theses in international and national peer-review journals in Turkey.

    PubMed

    Sipahi, H; Durusoy, R; Ergin, I; Hassoy, H; Davas, A; Karababa, Ao

    2012-01-01

    A thesis is an important part of specialisation and doctorate education and requires intense work. The aim of this study was to investigate the publication rates of Turkish Public Health Doctorate Theses (PHDT) and Public Health Specialization (PHST) theses in international and Turkish national peer-review journals and to analyze the distribution of research areas. A list of all theses up to 30 September 2009 was retrieved from the theses database of the Council of Higher Education of the Republic of Turkey. The publication rates of these theses were found by searching PubMed, Science Citation Index-Expanded, Turkish Academic Network and Information Center (ULAKBIM) Turkish Medical Database, and Turkish Medline databases for the names of thesis author and mentor. The theses which were published in journals indexed either in PubMed or SCI-E were considered as international publications. Our search yielded a total of 538 theses (243 PHDT, 295 PHST). It was found that the overall publication rate in Turkish national journals was 18%. The overall publication rate in international journals was 11.9%. Overall the most common research area was occupational health. Publication rates of Turkish PHDT and PHST are low. A better understanding of factors affecting this publication rate is important for public health issues, where national data is vital for better intervention programs and the development of better public health policies.

  12. Publication Rates of Public Health Theses in International and National Peer-Review Journals in Turkey

    PubMed Central

    Sipahi, H; Durusoy, R; Ergin, I; Hassoy, H; Davas, A; Karababa, AO

    2012-01-01

    Background: A thesis is an important part of specialisation and doctorate education and requires intense work. The aim of this study was to investigate the publication rates of Turkish Public Health Doctorate Theses (PHDT) and Public Health Specialization (PHST) theses in international and Turkish national peer-review journals and to analyze the distribution of research areas. Methods: A list of all theses up to 30 September 2009 was retrieved from the theses database of the Council of Higher Education of the Republic of Turkey. The publication rates of these theses were found by searching PubMed, Science Citation Index-Expanded, Turkish Academic Network and Information Center (ULAKBIM) Turkish Medical Database, and Turkish Medline databases for the names of thesis author and mentor. The theses which were published in journals indexed either in PubMed or SCI-E were considered as international publications. Results: Our search yielded a total of 538 theses (243 PHDT, 295 PHST). It was found that the overall publication rate in Turkish national journals was 18%. The overall publication rate in international journals was 11.9%. Overall the most common research area was occupational health. Conclusion: Publication rates of Turkish PHDT and PHST are low. A better understanding of factors affecting this publication rate is important for public health issues, where national data is vital for better intervention programs and the development of better public health policies. PMID:23193503

  13. Nursing diagnoses for the elderly using the International Classification for Nursing Practice and the activities of living model.

    PubMed

    de Medeiros, Ana Claudia Torres; da Nóbrega, Maria Miriam Lima; Rodrigues, Rosalina Aparecida Partezani; Fernandes, Maria das Graças Melo

    2013-01-01

    To develop nursing diagnosis statements for the elderly based on the Activities of Living Model and on the International Classification for Nursing Practice. A descriptive and exploratory study, carried out in two stages: 1) collection of terms and concepts considered clinically and culturally relevant for nursing care delivered to the elderly, in order to develop a database of terms, and 2) development of nursing diagnosis statements for the elderly in primary health care, based on the guidelines of the International Council of Nurses and on the database of terms for nursing practice involving the elderly. 414 terms were identified and submitted to the content validation process, with the participation of ten nursing experts, which resulted in 263 validated terms. These terms were submitted to cross-mapping with the terms of the International Classification for Nursing Practice, resulting in the identification of 115 listed terms and 148 non-listed terms, which constituted the database of terms, from which 127 nursing diagnosis statements were prepared and classified into factors that affect the performance of the elderly's activities of living - 69 into biological factors, 19 into psychological, 31 into sociocultural, five into environmental, and three into political-economic factors. After clinical validation, these statements can serve as a guide for nursing consultations with elderly patients, without ignoring clinical experience, critical thinking and decision-making.

  14. Analysis of the Internal Consistency of Three Autism Scales. Brief Report.

    ERIC Educational Resources Information Center

    Sturmey, Peter; And Others

    1992-01-01

    Analyses of the internal consistency of three autism scales--the Autism Behavior Checklist (ABC), the Real Life Rating Scale (RLRS), and the Childhood Autism Rating Scale (CARS)--were conducted with 34 children with pervasive developmental disabilities. Good internal consistency was found for the CARS. Adequate full-scale consistency was found for…

  15. External validation and comparison with other models of the International Metastatic Renal-Cell Carcinoma Database Consortium prognostic model: a population-based study

    PubMed Central

    Heng, Daniel Y C; Xie, Wanling; Regan, Meredith M; Harshman, Lauren C; Bjarnason, Georg A; Vaishampayan, Ulka N; Mackenzie, Mary; Wood, Lori; Donskov, Frede; Tan, Min-Han; Rha, Sun-Young; Agarwal, Neeraj; Kollmannsberger, Christian; Rini, Brian I; Choueiri, Toni K

    2014-01-01

    Summary Background The International Metastatic Renal-Cell Carcinoma Database Consortium model offers prognostic information for patients with metastatic renal-cell carcinoma. We tested the accuracy of the model in an external population and compared it with other prognostic models. Methods We included patients with metastatic renal-cell carcinoma who were treated with first-line VEGF-targeted treatment at 13 international cancer centres and who were registered in the Consortium’s database but had not contributed to the initial development of the Consortium Database model. The primary endpoint was overall survival. We compared the Database Consortium model with the Cleveland Clinic Foundation (CCF) model, the International Kidney Cancer Working Group (IKCWG) model, the French model, and the Memorial Sloan-Kettering Cancer Center (MSKCC) model by concordance indices and other measures of model fit. Findings Overall, 1028 patients were included in this study, of whom 849 had complete data to assess the Database Consortium model. Median overall survival was 18·8 months (95% CI 17·6–21·4). The predefined Database Consortium risk factors (anaemia, thrombocytosis, neutrophilia, hypercalcaemia, Karnofsky performance status <80%, and <1 year from diagnosis to treatment) were independent predictors of poor overall survival in the external validation set (hazard ratios ranged between 1·27 and 2·08, concordance index 0·71, 95% CI 0·68–0·73). When patients were segregated into three risk categories, median overall survival was 43·2 months (95% CI 31·4–50·1) in the favourable risk group (no risk factors; 157 patients), 22·5 months (18·7–25·1) in the intermediate risk group (one to two risk factors; 440 patients), and 7·8 months (6·5–9·7) in the poor risk group (three or more risk factors; 252 patients; p<0·0001; concordance index 0·664, 95% CI 0·639–0·689). 672 patients had complete data to test all five models. The concordance index of the CCF model was 0·662 (95% CI 0·636–0·687), of the French model 0·640 (0·614–0·665), of the IKCWG model 0·668 (0·645–0·692), and of the MSKCC model 0·657 (0·632–0·682). The reported versus predicted number of deaths at 2 years was most similar in the Database Consortium model compared with the other models. Interpretation The Database Consortium model is now externally validated and can be applied to stratify patients by risk in clinical trials and to counsel patients about prognosis. PMID:23312463
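
    The concordance indices reported above measure how often a model ranks pairs of patients correctly. A minimal sketch of Harrell's concordance index for right-censored survival data is given below; the survival times, event indicators, and risk scores are invented, and the published analysis may handle ties and censoring details differently.

```python
import numpy as np

def harrell_c(time, event, risk):
    """Harrell's concordance index for right-censored survival data.
    A pair (i, j) is comparable if the subject with the shorter observed time had an
    event; it is concordant if that subject also has the higher predicted risk."""
    time, event, risk = map(np.asarray, (time, event, risk))
    concordant = comparable = 0.0
    n = len(time)
    for i in range(n):
        for j in range(n):
            if event[i] and time[i] < time[j]:      # subject i failed first
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1
                elif risk[i] == risk[j]:
                    concordant += 0.5               # ties in risk count as half
    return concordant / comparable

# Hypothetical data: survival time (months), event indicator, model risk score.
print(harrell_c(time=[5, 12, 30, 44], event=[1, 1, 0, 1], risk=[0.9, 0.7, 0.4, 0.2]))
```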

  16. A brief tool to differentiate factors contributing to insomnia complaints.

    PubMed

    Townsend, Donald; Kazaglis, Louis; Savik, Kay; Smerud, Adam; Iber, Conrad

    2017-03-01

    A complaint of insomnia may have many causes. A brief tool examining contributing factors may be useful for nonsleep specialists. This study describes the development of the Insomnia Symptoms Assessment (ISA) for examining insomnia complaints. ISA questions were designed to identify symptoms that may represent 1 of 8 possible factors contributing to insomnia symptoms, including delayed sleep phase syndrome (DSPS), shift work sleep disorder (SWSD), obstructive sleep apnea (OSA), mental health, chronic pain, restless leg syndrome (RLS), poor sleep hygiene, and psychophysiological insomnia (PI). The ISA was completed by 346 new patients. Patients met with a sleep specialist who determined primary and secondary diagnoses. Mean age was 45 (18-85) years and 51% were male. Exploratory factor analysis (n = 217) and confirmatory factor analysis (n = 129) supported 5 factors with good internal consistency (Cronbach's alpha), including RLS (.72), OSA (.60), SWSD (.67), DSPS (.64), and PI (.80). Thirty percent had 1 sleep diagnosis with a mean of 2.2 diagnoses per patient. No diagnosis was entered for 1.2% of patients. The receiver operating characteristics were examined and the area under the curves calculated as an indication of convergent validity for the primary diagnosis (N = 346) were .97 for SWSD, .78 for OSA, .67 for DSPS, .54 for PI, and .80 for RLS. The ISA demonstrated good internal consistency and corresponds well to expert diagnoses. Next steps include setting sensitivity/specificity cutoffs to suggest initial treatment recommendations for use in other settings. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
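
    The internal consistency figures quoted above are Cronbach's alpha values. For reference, the sketch below computes Cronbach's alpha from a small, invented item-response matrix using the usual formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses to a 4-item subscale (rows = respondents).
responses = np.array([[3, 4, 3, 4],
                      [2, 2, 3, 2],
                      [4, 5, 4, 5],
                      [1, 2, 1, 2],
                      [3, 3, 4, 3]])
print(round(cronbach_alpha(responses), 3))
```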

  17. Evaluation of the psychometric properties of the Pediatric Parenting Stress Inventory (PPSI).

    PubMed

    Devine, Katie A; Heckler, Charles E; Katz, Ernest R; Fairclough, Diane L; Phipps, Sean; Sherman-Bien, Sandra; Dolgin, Michael J; Noll, Robert B; Askins, Martha A; Butler, Robert W; Sahler, Olle Jane Z

    2014-02-01

    This work evaluated the psychometric properties of the Pediatric Parenting Stress Inventory (PPSI), a new measure of problems and distress experienced by parents of children with chronic illnesses. This secondary data analysis used baseline data from 1 sample of English-, Spanish-, and Hebrew-speaking mothers of children recently diagnosed with cancer (n = 449) and 1 sample of English- and Spanish-speaking mothers of children recently diagnosed with cancer (n = 399) who participated in 2 problem-solving skills training interventions. The PPSI was administered at baseline with other measures of maternal distress. Factor structure was evaluated using exploratory factor analysis (EFA) on the first sample and confirmatory factor analysis (CFA) on both samples. Internal consistency was evaluated using Cronbach's alpha. Construct validity was assessed via Spearman correlations with measures of maternal distress. EFA resulted in a stable four-factor solution with 35 items. CFA indicated that the four-factor solution demonstrated reasonable fit in both samples. Internal consistency of the subscales and full scale was adequate to excellent. Construct validity was supported by moderate to strong correlations with measures of maternal distress, depression, and posttraumatic stress symptoms. The PPSI demonstrated good psychometric properties in assessing current problems and distress experienced by mothers of children newly diagnosed with cancer. This tool may be used to identify individualized targets for intervention in families of children with cancer. Future studies could evaluate the utility and psychometrics of the PPSI with other pediatric populations. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  18. Databases in the Central Government: State-of-the-art and the Future

    NASA Astrophysics Data System (ADS)

    Ohashi, Tomohiro

    The Management and Coordination Agency of the Prime Minister’s Office conducted a questionnaire survey of all Japanese Ministries and Agencies in November 1985 on the present status of databases produced, or planned to be produced, by the central government. According to the results, 132 databases had been produced across 19 Ministries and Agencies. Many of these databases were held by the Defence Agency, the Ministry of Construction, the Ministry of Agriculture, Forestry & Fisheries, and the Ministry of International Trade & Industry, and were in the fields of architecture & civil engineering, science & technology, R&D, agriculture, forestry, and fishery. However, only 39 percent of the produced databases were available to other Ministries and Agencies, while 60 percent were unavailable to them, largely because they were in-house databases. This article outlines the survey results and introduces the databases produced by the central government under four headings: (1) databases commonly used by all Ministries and Agencies, (2) integrated databases, (3) statistical databases, and (4) bibliographic databases. Future problems are also described from the viewpoints of technology development and mutual use of databases.

  19. 47 CFR 1.10005 - What is IBFS?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    § 1.10005 What is IBFS? (a) The International Bureau Filing System (IBFS) is a database...

  20. 47 CFR 1.10005 - What is IBFS?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    § 1.10005 What is IBFS? (a) The International Bureau Filing System (IBFS) is a database...

  1. Structured Forms Reference Set of Binary Images II (SFRS2)

    National Institute of Standards and Technology Data Gateway

    NIST Structured Forms Reference Set of Binary Images II (SFRS2) (Web, free access)   The second NIST database of structured forms (Special Database 6) consists of 5,595 pages of binary, black-and-white images of synthesized documents containing hand-print. The documents in this database are 12 different tax forms from the IRS 1040 Package X for the year 1988.

  2. National assessment of shoreline change: A GIS compilation of vector shorelines and associated shoreline change data for the sandy shorelines of Kauai, Oahu, and Maui, Hawaii

    USGS Publications Warehouse

    Romine, Bradley M.; Fletcher, Charles H.; Genz, Ayesha S.; Barbee, Matthew M.; Dyer, Matthew; Anderson, Tiffany R.; Lim, S. Chyn; Vitousek, Sean; Bochicchio, Christopher; Richmond, Bruce M.

    2012-01-01

    Sandy ocean beaches are a popular recreational destination, and often are surrounded by communities with valuable real estate. Development is increasing despite the fact that coastal infrastructure may be repeatedly subjected to flooding and erosion. As a result, the demand for accurate information regarding past and present shoreline changes is increasing. Working with researchers from the University of Hawaii, investigators with the U.S. Geological Survey's National Assessment of Shoreline Change Project have compiled a comprehensive database of digital vector shorelines and shoreline-change rates for the islands of Kauai, Oahu, and Maui, Hawaii. No widely accepted standard for analyzing shoreline change currently exists. Current measurement and rate-calculation methods vary from study to study, precluding the combination of study results into statewide or regional assessments. The impetus behind the National Assessment was to develop a standardized method for measuring changes in shoreline position that is consistent from coast to coast. The goal was to facilitate the process of periodically and systematically updating the measurements in an internally consistent manner. A detailed report on shoreline change for Kauai, Maui, and Oahu that contains a discussion of the data presented here is available and cited in the Geospatial Data section of this report.
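
    The standardized measurement concept is a rate of change in shoreline position over time at fixed shore-perpendicular transects. The sketch below fits a simple linear-regression rate (and an end-point rate) to invented positions for one transect; it illustrates the idea only and is not the project's software or data.

      # Illustrative shoreline change rates for a single transect.
      # Positions (metres seaward of a baseline) and survey years are invented.
      import numpy as np

      years     = np.array([1950.0, 1967.0, 1975.0, 1988.0, 2005.0])
      positions = np.array([ 112.0,  108.5,  104.0,  101.5,   95.0])

      slope, intercept = np.polyfit(years, positions, 1)
      print(f"linear regression rate: {slope:.2f} m/yr")   # negative => erosion

      # End-point rate for comparison (first vs. last survey only).
      epr = (positions[-1] - positions[0]) / (years[-1] - years[0])
      print(f"end-point rate: {epr:.2f} m/yr")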

  3. The National assessment of shoreline change—A GIS compilation of vector shorelines and associated shoreline change data for the Pacific Northwest coast

    USGS Publications Warehouse

    Kratzmann, Meredith G.; Himmelstoss, Emily A.; Ruggiero, Peter; Thieler, E. Robert; Reid, David

    2013-01-01

    Sandy ocean beaches are a popular recreational destination and are often surrounded by communities with valuable real estate. Development along sandy coastal areas is increasing despite the fact that coastal infrastructure may be repeatedly subjected to flooding and erosion. As a result, the demand for accurate information regarding past and present shoreline changes is increasing. Investigators with the U.S. Geological Survey's National Assessment of Shoreline Change Project have compiled a comprehensive database of digital vector shorelines and rates of shoreline change for the Pacific Northwest coast including the states of Washington and Oregon. No widely accepted standard for analyzing shoreline change currently exists. Current measurement and methods for calculating rates of change vary from study to study, precluding the combination of study results into statewide or regional assessments. The impetus behind the national assessment was to develop a standardized method that is consistent from coast to coast for measuring changes in shoreline position. The goal was to facilitate the process of periodically and systematically updating the measurements in an internally consistent manner. A detailed report on shoreline change for the Pacific Northwest coast that contains a discussion of the data presented here is available and cited in the Geospatial Data section of this report.

  4. Bibliometric trend and patent analysis in nano-alloys research for period 2000-2013.

    PubMed

    Živković, Dragana; Niculović, Milica; Manasijević, Dragan; Minić, Duško; Ćosović, Vladan; Sibinović, Maja

    2015-05-04

    This paper presents an overview of the current situation in nano-alloy investigations based on bibliometric and patent analysis. Bibliometric data for the period from 2000 to September 2013 were obtained using the Scopus database as the selected index database; the analyzed parameters were the number of scientific papers per year, authors, countries, affiliations, subject areas, and document types. Analysis of nano-alloy patents was done with a dedicated patent database, using the International Patent Classification and Patent Scope, for the period from 2003 to 2013. The information extracted from this database was the number of patents, patent classification by country, patent applicants, main inventors, and publication date.

  5. Bibliometric trend and patent analysis in nano-alloys research for period 2000-2013.

    PubMed

    Živković, Dragana; Niculović, Milica; Manasijević, Dragan; Minić, Duško; Ćosović, Vladan; Sibinović, Maja

    2015-01-01

    This paper presents an overview of the current situation in nano-alloy investigations based on bibliometric and patent analysis. Bibliometric data for the period 2000 to 2013 were obtained using the Scopus database as the selected index database; the analyzed parameters were the number of scientific papers per year, authors, countries, affiliations, subject areas, and document types. Analysis of nano-alloy patents was done with a dedicated patent database, using the International Patent Classification and Patent Scope, for the period 2003 to 2013. The information extracted from this database was the number of patents, patent classification by country, patent applicants, main inventors, and publication date.
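
    The year-by-year and per-country tallies such an analysis reports can be reduced to simple counting over exported records; the sketch below does this for a few invented entries and is not Scopus or Patent Scope output.

      # Illustrative tallies over invented bibliographic records.
      from collections import Counter

      records = [
          {"year": 2005, "country": "RS", "doc_type": "Article"},
          {"year": 2005, "country": "CN", "doc_type": "Article"},
          {"year": 2011, "country": "US", "doc_type": "Conference Paper"},
          {"year": 2012, "country": "CN", "doc_type": "Article"},
      ]

      papers_per_year = Counter(r["year"] for r in records)
      papers_per_country = Counter(r["country"] for r in records)
      print(sorted(papers_per_year.items()))        # [(2005, 2), (2011, 1), (2012, 1)]
      print(papers_per_country.most_common())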

  6. Application of Knowledge Discovery in Databases Methodologies for Predictive Models for Pregnancy Adverse Events

    ERIC Educational Resources Information Center

    Taft, Laritza M.

    2010-01-01

    In its report "To Err is Human", The Institute of Medicine recommended the implementation of internal and external voluntary and mandatory automatic reporting systems to increase detection of adverse events. Knowledge Discovery in Databases (KDD) allows the detection of patterns and trends that would be hidden or less detectable if analyzed by…

  7. Nutrient database improvement project: Separable components and proximate composition of retail cuts from the beef loin and round

    USDA-ARS?s Scientific Manuscript database

    Beef nutrition research has become increasingly important domestically and internationally for the beef industry and its consumers. The objective of this study was to analyze the nutrient composition of ten beef loin and round cuts to update the nutrient data in the USDA National Nutrient Database f...

  8. Marginal regression analysis of recurrent events with coarsened censoring times.

    PubMed

    Hu, X Joan; Rosychuk, Rhonda J

    2016-12-01

    Motivated by an ongoing pediatric mental health care (PMHC) study, this article presents weakly structured methods for analyzing doubly censored recurrent event data where only coarsened information on censoring is available. The study extracted administrative records of emergency department visits from provincial health administrative databases. The available information of each individual subject is limited to a subject-specific time window determined up to concealed data. To evaluate time-dependent effect of exposures, we adapt the local linear estimation with right censored survival times under the Cox regression model with time-varying coefficients (cf. Cai and Sun, Scandinavian Journal of Statistics 2003, 30, 93-111). We establish the pointwise consistency and asymptotic normality of the regression parameter estimator, and examine its performance by simulation. The PMHC study illustrates the proposed approach throughout the article. © 2016, The International Biometric Society.
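
    For orientation, the Cox model with time-varying coefficients referred to above is conventionally written as follows, with the local linear idea being to approximate the coefficient function near each time point of interest within a bandwidth h. This is the standard textbook formulation, not a restatement of the authors' exact estimator.

      % Cox regression with time-varying coefficients (standard formulation)
      \lambda\{t \mid Z(t)\} = \lambda_0(t)\,\exp\{\beta(t)^{\top} Z(t)\}
      % Local linear approximation of beta(.) near a fixed time t_0,
      % fitted by maximizing a kernel-weighted local partial likelihood:
      \beta(u) \approx \beta(t_0) + \beta'(t_0)\,(u - t_0), \qquad |u - t_0| \le h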

  9. Assessing guilt toward the former spouse.

    PubMed

    Wietzker, Anne; Buysse, Ann

    2012-09-01

    Divorce is often accompanied by feelings of guilt toward the former spouse. So far, no scale has been available to measure such feelings. For this purpose, the authors developed the Guilt in Separation Scale (GiSS). Content validity was assured by using experts and lay experts to generate and select items. Exploratory analyses were run on samples of 214 divorced individuals and confirmatory analyses on 458 individuals who were in the process of divorcing. Evidence was provided for the reliability and construct validity of the GiSS. The internal consistency was high (α = .91), as were the 6-month and 12-month test-retest reliabilities (r = .72 and r = .76, respectively). The GiSS was related to shame, regret, compassion, locus of cause of the separation, unfaithfulness, and psychological functioning. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  10. The CTBTO Link to the database of the International Seismological Centre (ISC)

    NASA Astrophysics Data System (ADS)

    Bondar, I.; Storchak, D. A.; Dando, B.; Harris, J.; Di Giacomo, D.

    2011-12-01

    The CTBTO Link to the database of the International Seismological Centre (ISC) is a project to provide access to seismological data sets maintained by the ISC using specially designed interactive tools. The Link is open to National Data Centres and to the CTBTO. By means of graphical interfaces and database queries tailored to the needs of the monitoring community, the users are given access to a multitude of products. These include the ISC and ISS bulletins, covering the seismicity of the Earth since 1904; nuclear and chemical explosions; the EHB bulletin; the IASPEI Reference Event list (ground truth database); and the IDC Reviewed Event Bulletin. The searches are divided into three main categories: the Area Based Search (a spatio-temporal search based on the ISC Bulletin), the REB Search (a spatio-temporal search based on specific events in the REB), and the IMS Station Based Search (a search for historical patterns in the reports of seismic stations close to a particular IMS seismic station). The outputs are HTML-based web pages with a simplified version of the ISC Bulletin showing the most relevant parameters, with access to ISC, GT, EHB and REB Bulletins in IMS1.0 format for single or multiple events. The CTBTO Link offers a tool to view REB events in context within the historical seismicity, look at observations reported by non-IMS networks, and investigate station histories and residual patterns for stations registered in the International Seismographic Station Registry.

  11. The Protein-DNA Interface database

    PubMed Central

    2010-01-01

    The Protein-DNA Interface database (PDIdb) is a repository containing relevant structural information of Protein-DNA complexes solved by X-ray crystallography and available at the Protein Data Bank. The database includes a simple functional classification of the protein-DNA complexes that consists of three hierarchical levels: Class, Type and Subtype. This classification has been defined and manually curated by humans based on the information gathered from several sources that include PDB, PubMed, CATH, SCOP and COPS. The current version of the database contains only structures with resolution of 2.5 Å or higher, accounting for a total of 922 entries. The major aim of this database is to contribute to the understanding of the main rules that underlie the molecular recognition process between DNA and proteins. To this end, the database is focused on each specific atomic interface rather than on the separated binding partners. Therefore, each entry in this database consists of a single and independent protein-DNA interface. We hope that PDIdb will be useful to many researchers working in fields such as the prediction of transcription factor binding sites in DNA, the study of specificity determinants that mediate enzyme recognition events, engineering and design of new DNA binding proteins with distinct binding specificity and affinity, among others. Finally, due to its friendly and easy-to-use web interface, we hope that PDIdb will also serve educational and teaching purposes. PMID:20482798
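
    A sketch of how an interface-centric entry with the three-level classification might be represented in code; the field names and example values are invented for illustration and are not PDIdb's actual schema.

      # Hypothetical representation of a PDIdb-style entry (field names invented).
      from dataclasses import dataclass

      @dataclass
      class InterfaceEntry:
          pdb_id: str          # source structure in the Protein Data Bank
          resolution: float    # angstroms; the database keeps 2.5 or better
          cls: str             # Class: top level of the functional classification
          type_: str           # Type: second level
          subtype: str         # Subtype: third level

      entries = [
          InterfaceEntry("1ABC", 1.9, "Enzyme", "Nuclease", "Restriction"),
          InterfaceEntry("2XYZ", 2.4, "Regulatory", "Transcription factor", "Zinc finger"),
      ]

      # Keep only entries meeting the stated resolution cut-off.
      kept = [e for e in entries if e.resolution <= 2.5]
      print(len(kept), "entries at 2.5 Angstrom or better")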

  12. The Protein-DNA Interface database.

    PubMed

    Norambuena, Tomás; Melo, Francisco

    2010-05-18

    The Protein-DNA Interface database (PDIdb) is a repository containing relevant structural information of Protein-DNA complexes solved by X-ray crystallography and available at the Protein Data Bank. The database includes a simple functional classification of the protein-DNA complexes that consists of three hierarchical levels: Class, Type and Subtype. This classification has been defined and manually curated by humans based on the information gathered from several sources that include PDB, PubMed, CATH, SCOP and COPS. The current version of the database contains only structures with resolution of 2.5 Å or higher, accounting for a total of 922 entries. The major aim of this database is to contribute to the understanding of the main rules that underlie the molecular recognition process between DNA and proteins. To this end, the database is focused on each specific atomic interface rather than on the separated binding partners. Therefore, each entry in this database consists of a single and independent protein-DNA interface. We hope that PDIdb will be useful to many researchers working in fields such as the prediction of transcription factor binding sites in DNA, the study of specificity determinants that mediate enzyme recognition events, engineering and design of new DNA binding proteins with distinct binding specificity and affinity, among others. Finally, due to its friendly and easy-to-use web interface, we hope that PDIdb will also serve educational and teaching purposes.

  13. Endo-beta-N-acetylglucosaminidase, an enzyme involved in processing of free oligosaccharides in the cytosol.

    PubMed

    Suzuki, Tadashi; Yano, Keiichi; Sugimoto, Seiji; Kitajima, Ken; Lennarz, William J; Inoue, Sadako; Inoue, Yasuo; Emori, Yasufumi

    2002-07-23

    Formation of oligosaccharides occurs both in the cytosol and in the lumen of the endoplasmic reticulum (ER). Luminal oligosaccharides are transported into the cytosol to ensure that they do not interfere with proper functioning of the glycan-dependent quality control machinery in the lumen of the ER for newly synthesized glycoproteins. Once in the cytosol, free oligosaccharides are catabolized, possibly to maximize the reutilization of the component sugars. An endo-beta-N-acetylglucosaminidase (ENGase) is a key enzyme involved in the processing of free oligosaccharides in the cytosol. This enzyme activity has been widely described in animal cells, but the gene encoding this enzyme activity has not been reported. Here, we report the identification of the gene encoding human cytosolic ENGase. After 11 steps, the enzyme was purified 150,000-fold to homogeneity from hen oviduct, and several internal amino acid sequences were analyzed. Based on the internal sequence and examination of expressed sequence tag (EST) databases, we identified the human orthologue of the purified protein. The human protein consists of 743 aa and has no apparent signal sequence, supporting the idea that this enzyme is localized in the cytosol. By expressing the cDNA of the putative human ENGase in COS-7 cells, the enzyme activity in the soluble fraction was enhanced 100-fold over the basal level, confirming that the human gene identified indeed encodes for ENGase. Careful gene database surveys revealed the occurrence of ENGase homologues in Drosophila melanogaster, Caenorhabditis elegans, and Arabidopsis thaliana, indicating the broad occurrence of ENGase in higher eukaryotes. This gene was expressed in a variety of human tissues, suggesting that this enzyme is involved in basic biological processes in eukaryotic cells.

  14. Diagnostic rate of primary aldosteronism in Emilia-Romagna, Northern Italy, during 16 years (2000-2015).

    PubMed

    Rossi, Ermanno; Perazzoli, Franco; Negro, Aurelio; Magnani, Antonia

    2017-08-01

    Although primary aldosteronism is considered the most common form of endocrine hypertension, the diagnostic rate of primary aldosteronism in the territory is unknown. The aims of the current study were to compare the number of patients discharged with International Classification of Diseases 9 Clinical Modification codes compatible with primary aldosteronism from all the hospitals in Emilia-Romagna during 16 years (from 2000 to 2015) with the number of expected cases of primary aldosteronism, and to compare the number of patients with primary aldosteronism who underwent adrenalectomy in the period 2000-2015 with the number of expected cases of unilateral primary aldosteronism. We accessed the Database of the Emilia-Romagna Health Service to select all patients from the age of 20 years discharged with International Classification of Diseases 9 Clinical Modification codes compatible with primary aldosteronism and, among them, those who underwent adrenalectomy in the same period. The prevalence of hypertension in Emilia-Romagna from the age of 20 years was drawn from the Health Search Database. The population from the age of 20 years in Emilia-Romagna was drawn from the Italian National Statistical Institute. We hypothesized a prevalence of primary aldosteronism of 5% among hypertensive patients and a prevalence of unilateral subtypes of 30% among the primary aldosteronism patients. A total of 992 patients were discharged with codes consistent with primary aldosteronism during the 16 years in Emilia-Romagna, that is, 1.9% of the expected cases of primary aldosteronism. A total of 160 of them underwent adrenalectomy in the same period, corresponding to 1% of the expected cases of unilateral primary aldosteronism in Emilia-Romagna. Our results clearly indicate that primary aldosteronism is dramatically underdiagnosed and undertreated.
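
    The reported percentages can be reproduced with back-of-envelope arithmetic; the number of expected cases below is back-calculated from the paper's own figures purely for illustration.

      # Back-of-envelope reconstruction of the reported percentages (illustrative).
      observed_discharges    = 992     # discharges with PA-compatible ICD-9-CM codes, 2000-2015
      observed_adrenalectomy = 160

      # "992 patients = 1.9% of expected cases" implies roughly 52,000 expected PA cases.
      expected_pa = observed_discharges / 0.019
      expected_unilateral = expected_pa * 0.30          # assumed 30% unilateral subtype

      print(f"diagnosed: {observed_discharges / expected_pa:.1%} of expected PA")
      print(f"operated:  {observed_adrenalectomy / expected_unilateral:.1%} of expected unilateral PA")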

  15. International Space Station Mechanisms and Maintenance Flight Control Documentation and Training Development

    NASA Technical Reports Server (NTRS)

    Daugherty, Colin C.

    2010-01-01

    International Space Station (ISS) crew and flight controller training documentation is used to aid in training operations. The Generic Simulations References SharePoint (Gen Sim) site is a database used as an aid during flight simulations. The Gen Sim site is used to make individual mission segment timelines, data, and flight information easily accessible to instructors. The Waste and Hygiene Compartment (WHC) training schematic includes simple and complex fluid schematics, as well as overall hardware locations. It is used as a teaching aid during WHC lessons for both ISS crew and flight controllers. ISS flight control documentation is used to support all aspects of ISS mission operations. The Quick Look Database and Consolidated Tool Page are imagery-based references used in real-time to help the Operations Support Officer (OSO) find data faster and improve discussions with the Flight Director and Capsule Communicator (CAPCOM). A Quick Look page was created for the Permanent Multipurpose Module (PMM) by locating photos of the module interior, labeling specific hardware, and organizing them in schematic form to match the layout of the PMM interior. A Tool Page was created for the Maintenance Work Area (MWA) by gathering images, detailed drawings, safety information, procedures, certifications, demonstration videos, and general facts of each MWA component and displaying them in an easily accessible and consistent format. Participation in ISS mechanisms and maintenance lessons, mission simulation On-the-Job Training (OJT), and real-time flight OJT was used as an opportunity to train for day-to-day operations as an OSO, as well as learn how to effectively respond to failures and emergencies during mission simulations and real-time flight operations.

  16. Diversity and clinical impact of Acinetobacter baumannii colonization and infection at a military medical center.

    PubMed

    Petersen, Kyle; Cannegieter, Suzanne C; van der Reijden, Tanny J; van Strijen, Beppie; You, David M; Babel, Britta S; Philip, Andrew I; Dijkshoorn, Lenie

    2011-01-01

    The epidemiology of Acinetobacter baumannii emerging in combat casualties is poorly understood. We analyzed 65 (54 nonreplicate) Acinetobacter isolates from 48 patients (46 hospitalized and 2 outpatient trainees entering the military) from October 2004 to October 2005 for genotypic similarities, time-space relatedness, and antibiotic susceptibility. Clinical and surveillance cultures were compared by amplified fragment length polymorphism (AFLP) genomic fingerprinting to each other and to strains of a reference database. Antibiotic susceptibility was determined, and multiplex PCR was performed for OXA-23-like, -24-like, -51-like, and -58-like carbapenemases. Records were reviewed for overlapping hospital stays of the most frequent genotypes, and risk ratios were calculated for any association of genotype with severity of Acute Physiology and Chronic Health Evaluation II (APACHE II) score or injury severity score (ISS) and previous antibiotic use. Nineteen genotypes were identified; two predominated, one consistent with an emerging novel international clone and the other unique to our database. Both predominant genotypes were carbapenem resistant, were present at another hospital before patients' admission to our facility, and were associated with higher APACHE II scores, higher ISSs, and previous carbapenem antibiotics in comparison with other genotypes. One predominated in wound and respiratory isolates, and the other predominated in wound and skin surveillance samples. Several other genotypes were identified as European clones I to III. Acinetobacter genotypes from recruits upon entry to the military, unlike those in hospitalized patients, did not include carbapenem-resistant genotypes. Acinetobacter species isolated from battlefield casualties are diverse, including genotypes belonging to European clones I to III. Two carbapenem-resistant genotypes were epidemic, one of which appeared to belong to a novel international clone.

  17. Isopropanolic black cohosh extract and recurrence-free survival after breast cancer.

    PubMed

    Henneicke-von Zepelin, H H; Meden, H; Kostev, K; Schröder-Bernhardi, D; Stammwitz, U; Becher, H

    2007-03-01

    To investigate the influence of an isopropanolic Cimicifuga racemosa extract (iCR) on recurrence-free survival after breast cancer, including estrogen-dependent tumors. This pharmacoepidemiologic observational retrospective cohort study examined breast cancer patients treated at general, gynecological and internal facilities linked to a medical database in Germany. The main endpoint was disease-free survival following a diagnosis of breast cancer. The impact of treatment with iCR following diagnosis was analyzed by Cox proportional hazards models, controlling for age and other confounders. Of 18,861 patients, a total of 1,102 had received an iCR therapy. The mean overall observation time was 3.6 years. Results showed that iCR was not associated with an increase in the risk of recurrence but was associated with prolonged disease-free survival. After 2 years following initial diagnosis, 14% of the control group had developed a recurrence, while the iCR group reached this proportion after 6.5 years. The primary Cox regression model controlling for age, tamoxifen use and other confounders demonstrated a protractive effect of iCR on the rate of recurrence (hazard ratio 0.83, 95% confidence interval 0.69-0.99). This effect remained consistent throughout all variations of the statistical model, including subgroup analyses. TNM status was unknown but did not bias the iCR treatment decision, as investigated separately. Hence, it was assumed to be equally distributed between treatment groups. Correlation analyses showed good internal and external validity of the database. An increase in the risk of breast cancer recurrence for women having had iCR treatment, compared to women not treated with iCR, is unlikely.
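
    A minimal sketch of a confounder-adjusted Cox proportional hazards fit of the kind described, using the lifelines library on synthetic data; the column names and values are placeholders, not the study's cohort or variables.

      # Illustrative Cox proportional hazards fit on synthetic data (lifelines assumed installed).
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(7)
      n = 500
      df = pd.DataFrame({
          "icr":       rng.integers(0, 2, n),        # 1 = exposed to the extract
          "age":       rng.normal(58, 10, n),
          "tamoxifen": rng.integers(0, 2, n),
      })
      # Synthetic recurrence-free time (years) and event indicator.
      risk = 0.03 * (df["age"] - 58) - 0.2 * df["icr"]
      df["time"] = rng.exponential(scale=4 * np.exp(-risk))
      df["event"] = rng.integers(0, 2, n)

      cph = CoxPHFitter()
      cph.fit(df, duration_col="time", event_col="event")
      print(cph.hazard_ratios_)   # hazard ratio for "icr", adjusted for age and tamoxifen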

  18. Informetrics: Exploring Databases as Analytical Tools.

    ERIC Educational Resources Information Center

    Wormell, Irene

    1998-01-01

    Advanced online search facilities and information retrieval techniques have increased the potential of bibliometric research. Discusses three case studies carried out by the Centre for Informetric Studies at the Royal School of Library Science (Denmark) on the internationality of international journals, informetric analyses on the World Wide Web,…

  19. Academic Impact of a Public Electronic Health Database: Bibliometric Analysis of Studies Using the General Practice Research Database

    PubMed Central

    Chen, Yu-Chun; Wu, Jau-Ching; Haschler, Ingo; Majeed, Azeem; Chen, Tzeng-Ji; Wetter, Thomas

    2011-01-01

    Background Studies that use electronic health databases as research material are becoming popular, but the influence of a single electronic health database has not been well investigated. The United Kingdom's General Practice Research Database (GPRD) is one of the few electronic health databases publicly available to academic researchers. This study analyzed studies that used the GPRD to demonstrate the scientific production and academic impact of a single public health database. Methodology and Findings A total of 749 studies published between 1995 and 2009 with ‘General Practice Research Database’ as their topic, defined as GPRD studies, were extracted from Web of Science. By the end of 2009, the GPRD had attracted 1251 authors from 22 countries and been used extensively in 749 studies published in 193 journals across 58 study fields. On average, each GPRD study was cited 2.7 times by successive studies. Moreover, the total number of GPRD studies increased rapidly, and it is expected to reach 1500 by 2015, twice the number accumulated by the end of 2009. Since 17 of the most prolific authors (1.4% of all authors) contributed nearly half (47.9%) of GPRD studies, success in conducting GPRD studies may accumulate. The GPRD was used mainly in, but not limited to, the three study fields of “Pharmacology and Pharmacy”, “General and Internal Medicine”, and “Public, Environmental and Occupational Health”. The UK and United States were the two most active regions for GPRD studies. One-third of GPRD studies were internationally co-authored. Conclusions A public electronic health database such as the GPRD will promote scientific production in many ways. Data owners of electronic health databases at a national level should consider how to reduce access barriers and make data more available for research. PMID:21731733

  20. Summary of Research 1992

    DTIC Science & Technology

    1992-12-01

    Hsiao, D. K., "Federated Databases and Systems: A Tutorial on Their Data Sharing," The International Journal on Very Large Data Bases (VLDB Journal), Vol. 1, No. 1, July 1992. Hsiao, D. K., "Federated Databases and Systems: A Tutorial on Their Resource Consolidation," The International Journal on Very Large Data Bases (VLDB Journal), Vol. 1, No. 2... "...Game: Normal Approximation," accepted for publication by the International Journal of Game Theory; considers extensions of games and possible applications.
