Sample records for including database management

  1. Database Access Systems.

    ERIC Educational Resources Information Center

    Dalrymple, Prudence W.; Roderer, Nancy K.

    1994-01-01

    Highlights the changes that occurred from 1987 to 1993 in database access systems. Topics addressed include types of databases, including CD-ROMs; the end-user interface; database selection; database access management, including library instruction and use of primary literature; economic issues; database users; the search process; and improving…

  2. [Selected aspects of computer-assisted literature management].

    PubMed

    Reiss, M; Reiss, G

    1998-01-01

    We report our own experiences with a bibliographic database manager. Bibliographic database managers are used to manage information resources: specifically, to maintain a database of references and to create bibliographies and reference lists for written works. A database manager allows the user to enter summary information (a record) for articles, book sections, books, dissertations, conference proceedings, and so on. Other features may include the ability to import references from different sources, such as MEDLINE. The word-processing components generate reference lists and bibliographies in a variety of styles and can build a reference list directly from a word-processor manuscript. The function and use of the software package EndNote 2 for Windows are described. Its advantages in meeting different requirements for citation style and for the sort order of reference lists are emphasized.

  3. Database Management: Building, Changing and Using Databases. Collected Papers and Abstracts of the Mid-Year Meeting of the American Society for Information Science (15th, Portland, Oregon, May 1986).

    ERIC Educational Resources Information Center

    American Society for Information Science, Washington, DC.

    This document contains abstracts of papers on database design and management which were presented at the 1986 mid-year meeting of the American Society for Information Science (ASIS). Topics considered include: knowledge representation in a bilingual art history database; proprietary database design; relational database design; in-house databases;…

  4. [The future of clinical laboratory database management system].

    PubMed

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of clinical laboratory database management systems, the difference between the Clinical Laboratory Information System and the Clinical Laboratory System is explained in this study. Three kinds of database management systems (DBMS) are considered: the relational model, the tree model, and the network model. The relational model was found to be the best DBMS for the clinical laboratory database, based on our experience and on the development of several clinical laboratory expert systems. As future clinical laboratory database management systems, an IC card system connected to an automatic chemical analyzer was proposed for personal health data management, and a microscope/video system was proposed for dynamic data management of leukocytes and bacteria.

  5. Implementation of a data management software system for SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, Kenneth

    1986-01-01

    The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST to the RIM7 database THISRM are discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for the initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports that cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could also use INSIGHT2 and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.

  6. Application of cloud database in the management of clinical data of patients with skin diseases.

    PubMed

    Mao, Xiao-fei; Liu, Rui; DU, Wei; Fan, Xue; Chen, Dian; Zuo, Ya-gang; Sun, Qiu-ning

    2015-04-01

    To evaluate the needs and applications of a cloud database in the daily practice of a dermatology department. The cloud database was established for systemic scleroderma and localized scleroderma. Paper forms were used to record the original data, including personal information, pictures, specimens, blood biochemical indicators, skin lesions, and scores on self-rating scales; the results were then entered into the cloud database. The applications of the cloud database in the dermatology department were summarized and analyzed. The personal and clinical information of 215 systemic scleroderma patients and 522 localized scleroderma patients was entered and analyzed using the cloud database. Disease status, quality of life, and prognosis were obtained by statistical calculation. The cloud database can efficiently and rapidly store and manage the data of patients with skin diseases. As a simple, prompt, safe, and convenient tool, it can be used in patient information management, clinical decision making, and scientific research.

  7. [Establishment of a regional pelvic trauma database in Hunan Province].

    PubMed

    Cheng, Liang; Zhu, Yong; Long, Haitao; Yang, Junxiao; Sun, Buhua; Li, Kanghua

    2017-04-28

    To establish a database for pelvic trauma in Hunan Province and to start the work of a multicenter pelvic trauma registry.
    Methods: To establish the database, the literature relevant to pelvic trauma was screened, experience from established trauma databases in China and abroad was drawn upon, and the actual conditions of pelvic trauma rescue in Hunan Province were considered. The database was built on PostgreSQL and the programming language Java 1.6.
    Results: The complex procedure for pelvic trauma rescue was described structurally. The contents of the database include general patient information, injury condition, prehospital rescue, condition on admission, treatment in hospital, status on discharge, diagnosis, classification, complications, trauma scoring, and therapeutic effect. The database can be accessed through the Internet via a browser/server architecture. Its functions include patient information management, data export, history query, progress reporting, video and image management, and personal information management.
    Conclusion: A whole-life-cycle pelvic trauma database has been established for the first time in China. It is scientific, functional, practical, and user-friendly.
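
    The registry architecture this record describes (a PostgreSQL backend with per-module functions such as history query) can be illustrated with a minimal relational sketch. The sketch below uses Python's built-in sqlite3 in place of the PostgreSQL backend the authors used, and the table and column names are illustrative assumptions, not the registry's actual schema:

```python
import sqlite3

# In-memory stand-in for the registry's relational backend.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (
    patient_id INTEGER PRIMARY KEY,
    name TEXT,
    admitted TEXT
);
CREATE TABLE trauma_score (
    patient_id INTEGER REFERENCES patient(patient_id),
    scale TEXT,      -- e.g. a severity scoring scale
    value INTEGER
);
""")
conn.execute("INSERT INTO patient VALUES (1, 'anonymized', '2017-01-15')")
conn.execute("INSERT INTO trauma_score VALUES (1, 'ISS', 25)")

# History query: scores for one patient, joined to patient information.
row = conn.execute(
    "SELECT p.name, s.scale, s.value FROM patient p "
    "JOIN trauma_score s ON s.patient_id = p.patient_id "
    "WHERE p.patient_id = ?", (1,)
).fetchone()
print(row)  # ('anonymized', 'ISS', 25)
```

    The same join pattern extends to the other content areas the registry records (prehospital rescue, complications, therapeutic effect): one table per area, keyed on the patient identifier.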

  8. Flight Deck Interval Management Display [Elements, Information and Annunciations Database User Guide]

    NASA Technical Reports Server (NTRS)

    Lancaster, Jeff; Dillard, Michael; Alves, Erin; Olofinboba, Olu

    2014-01-01

    The User Guide details the Access Database provided with the Flight Deck Interval Management (FIM) Display Elements, Information, & Annunciations program. The goal of this User Guide is to support ease of use and the ability to quickly retrieve and select items of interest from the Database. The Database includes FIM Concepts identified in a literature review preceding the publication of this document. Only items that are directly related to FIM (e.g., spacing indicators), which change or enable FIM (e.g., menu with control buttons), or which are affected by FIM (e.g., altitude reading) are included in the database. The guide has been expanded from previous versions to cover database structure, content, and search features with voiced explanations.

  9. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2013-06-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, relational databases, and NoSQL databases. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system. Copyright 2013 by John Wiley & Sons, Inc.
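
    The trade-off between the systems this unit compares can be seen in miniature: a flat file answers a lookup only by scanning every record, while an indexed file answers the same lookup directly. A toy Python sketch (the strain records are invented for illustration):

```python
# Flat file: every lookup is a linear scan over all records.
flat = ["ybr032\tYeast strain, mutagenesis hit",
        "ybr033\tYeast strain, wild type"]

def flat_lookup(key):
    for line in flat:
        k, desc = line.split("\t")
        if k == key:
            return desc
    return None

# Indexed store: one index build, then constant-time lookups.
index = dict(line.split("\t") for line in flat)

assert flat_lookup("ybr033") == index["ybr033"] == "Yeast strain, wild type"
```

    A relational system adds declarative queries and joins on top of such indexes; the unit's guidelines are essentially about when that extra machinery pays for itself.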

  10. The Perfect Marriage: Integrated Word Processing and Data Base Management Programs.

    ERIC Educational Resources Information Center

    Pogrow, Stanley

    1983-01-01

    Discussion of database integration and how it operates includes recommendations on compatible brand name word processing and database management programs, and a checklist for evaluating essential and desirable features of the available programs. (MBR)

  11. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2002-08-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, and relational databases, as well as ACeDB. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system.

  12. A New Approach To Secure Federated Information Bases Using Agent Technology.

    ERIC Educational Resources Information Center

    Weippi, Edgar; Klug, Ludwig; Essmayr, Wolfgang

    2003-01-01

    Discusses database agents which can be used to establish federated information bases by integrating heterogeneous databases. Highlights include characteristics of federated information bases, including incompatible database management systems, schemata, and frequently changing context; software agent technology; Java agents; system architecture;…

  13. Netlib services and resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Browne, S.V.; Green, S.C.; Moore, K.

    1994-04-01

    The Netlib repository, maintained by the University of Tennessee and Oak Ridge National Laboratory, contains freely available software, documents, and databases of interest to the numerical, scientific computing, and other communities. This report includes both the Netlib User's Guide and the Netlib System Manager's Guide, and contains information about Netlib's databases, interfaces, and system implementation. The Netlib repository's databases include the Performance Database, the Conferences Database, and the NA-NET mail forwarding and Whitepages Databases. A variety of user interfaces enable users to access the Netlib repository in the manner most convenient and compatible with their networking capabilities. These interfaces include the Netlib email interface, the Xnetlib X Windows client, the netlibget command-line TCP/IP client, anonymous FTP, anonymous RCP, and gopher.

  14. Databases for multilevel biophysiology research available at Physiome.jp.

    PubMed

    Asai, Yoshiyuki; Abe, Takeshi; Li, Li; Oka, Hideki; Nomura, Taishin; Kitano, Hiroaki

    2015-01-01

    Physiome.jp (http://physiome.jp) is a portal site inaugurated in 2007 to support model-based research in physiome and systems biology. At Physiome.jp, several tools and databases are available to support construction of physiological, multi-hierarchical, large-scale models. There are three databases in Physiome.jp, housing mathematical models, morphological data, and time-series data. In late 2013, the site was fully renovated, and in May 2015, new functions were implemented to provide information infrastructure to support collaborative activities for developing models and performing simulations within the database framework. This article describes updates to the databases implemented since 2013, including cooperation among the three databases, interactive model browsing, user management, version management of models, management of parameter sets, and interoperability with applications.

  15. An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.

    ERIC Educational Resources Information Center

    Chen, I-Min A.; Markowitz, Victor M.

    1995-01-01

    Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…

  16. Tautomerism in chemical information management systems

    NASA Astrophysics Data System (ADS)

    Warr, Wendy A.

    2010-06-01

    Tautomerism has an impact on many of the processes in chemical information management systems including novelty checking during registration into chemical structure databases; storage of structures; exact and substructure searching in chemical structure databases; and depiction of structures retrieved by a search. The approaches taken by 27 different software vendors and database producers are compared. It is hoped that this comparison will act as a discussion document that could ultimately improve databases and software for researchers in the future.

  17. Legacy2Drupal - Conversion of an existing oceanographic relational database to a semantically enabled Drupal content management system

    NASA Astrophysics Data System (ADS)

    Maffei, A. R.; Chandler, C. L.; Work, T.; Allen, J.; Groman, R. C.; Fox, P. A.

    2009-12-01

    Content Management Systems (CMSs) provide powerful features that can be of use to oceanographic (and other geoscience) data managers. However, in many instances, geoscience data management offices have previously designed customized schemas for their metadata. The WHOI Ocean Informatics initiative and the NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) have jointly sponsored a project to port an existing relational database containing oceanographic metadata, along with an existing interface coded in ColdFusion middleware, to a Drupal6 Content Management System. The goal was to translate all the existing database tables, input forms, website reports, and other features present in the existing system to employ Drupal CMS features. The replacement features include Drupal content types, CCK node-reference fields, themes, RDB, SPARQL, workflow, and a number of other supporting modules. Strategic use of Drupal6 CMS features enables three separate but complementary interfaces that provide access to oceanographic research metadata via the MySQL database: 1) a Drupal6-powered front end; 2) a standard SQL port (used to provide a MapServer interface to the metadata and data); and 3) a SPARQL port (feeding a new faceted search capability being developed). Future plans include the creation of science ontologies, by scientist/technologist teams, that will drive the semantically enabled faceted search capabilities planned for the site. Incorporation of semantic technologies included in the future Drupal 7 core release is also anticipated. Using a public-domain CMS as opposed to proprietary middleware, and taking advantage of the many features of Drupal 6 that are designed to support semantically enabled interfaces, will help prepare the BCO-DMO database for interoperability with other ecosystem databases.

  18. Enabling heterogenous multi-scale database for emergency service functions through geoinformation technologies

    NASA Astrophysics Data System (ADS)

    Bhanumurthy, V.; Venugopala Rao, K.; Srinivasa Rao, S.; Ram Mohan Rao, K.; Chandra, P. Satya; Vidhyasagar, J.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Geographical Information Science (GIS) has now graduated from the traditional desktop system to the Internet system. Internet GIS is emerging as one of the most promising technologies for addressing Emergency Management. Web services with different privileges play an important role in disseminating emergency services to decision makers. The spatial database is one of the most important components in the successful implementation of Emergency Management. It contains spatial data in the form of raster and vector layers linked with non-spatial information. Comprehensive data are required to handle emergency situations in their different phases. The database elements comprise core data, hazard-specific data, corresponding attribute data, and live data coming from remote locations. Core data sets are the minimum data required to handle disasters, including base, thematic, and infrastructure layers. Disaster-specific information is required to handle a particular situation such as flood, cyclone, forest fire, earthquake, landslide, or drought. In addition, Emergency Management requires many types of data with spatial and temporal attributes that should be made available to the key players in the right format at the right time. The vector database needs to be complemented with satellite imagery of the required resolution for visualisation and analysis in disaster management. The database is therefore interconnected and comprehensive enough to meet the requirements of Emergency Management. This kind of integrated, comprehensive, and structured database is required to deliver the right information at the right time to the right people. However, building a spatial database for Emergency Management is a challenging task because of key issues such as availability of data, sharing policies, compatible geospatial standards, and data interoperability.
    Therefore, to facilitate using, sharing, and integrating spatial data, standards need to be defined for building emergency database systems. These cover aspects such as i) data integration procedures, namely a standard coding scheme, schema, metadata format, and spatial format; ii) database organisation mechanisms covering data management, catalogues, and data models; and iii) database dissemination through a suitable environment as a standard service for effective service dissemination. The National Database for Emergency Management (NDEM) is such a comprehensive database for addressing disasters in India at the national level. This paper explains standards for integrating and organising multi-scale and multi-source data and for enabling effective emergency response using customized user interfaces for NDEM. It presents a standard procedure for building comprehensive emergency information systems that enable emergency-specific functions through geospatial technologies.

  19. A Database Management System for Interlibrary Loan.

    ERIC Educational Resources Information Center

    Chang, Amy

    1990-01-01

    Discusses the increasing complexity of dealing with interlibrary loan requests and describes a database management system for interlibrary loans used at Texas Tech University. System functions are described, including file control, records maintenance, and report generation, and the impact on staff productivity is discussed. (CLB)

  20. Database & information tools for transportation research management : Connecticut transportation research peer exchange report of a thematic peer exchange.

    DOT National Transportation Integrated Search

    2006-05-01

    Specific objectives of the Peer Exchange were to discuss and exchange information about databases and other software used to support the program cycles managed by state transportation research offices. Elements of the program cycle include:…

  21. The future application of GML database in GIS

    NASA Astrophysics Data System (ADS)

    Deng, Yuejin; Cheng, Yushu; Jing, Lianwen

    2006-10-01

    In 2004, the Geography Markup Language (GML) Implementation Specification (version 3.1.1) was published by the Open Geospatial Consortium, Inc. More and more applications in geospatial data sharing and interoperability now depend on GML. The primary purpose of GML is the exchange and transport of geo-information through standard modeling and encoding of geographic phenomena. In applications, however, the problem arises of how to organize and access large volumes of GML data effectively; research on GML databases focuses on this problem. The effective storage of GML data is a hot topic in the GIS community today. A GML Database Management System (GDBMS) mainly deals with the storage and management of GML data. Two types of XML database are currently distinguished: native XML databases and XML-enabled databases. Since GML is an application of the XML standard to geographic data, XML database systems can also be used for the management of GML. In this paper, we review the state of the art of XML databases, including storage, indexing, query languages, and management systems, and then move on to the GML database. Finally, the future prospects of GML databases in GIS applications are presented.
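
    Because GML is namespaced XML, the XML tooling the paper surveys applies to it directly. A minimal Python sketch of pulling a coordinate out of a GML point with the standard library's XML parser (the fragment is hand-written for illustration):

```python
import xml.etree.ElementTree as ET

GML_NS = "http://www.opengis.net/gml"
doc = """<gml:Point xmlns:gml="http://www.opengis.net/gml" srsName="EPSG:4326">
  <gml:pos>30.0 114.3</gml:pos>
</gml:Point>"""

# Parse the GML feature and read its coordinate pair.
point = ET.fromstring(doc)
lat, lon = map(float, point.find(f"{{{GML_NS}}}pos").text.split())
print(lat, lon)  # 30.0 114.3
```

    A native XML database stores such documents as-is and indexes their element paths; an XML-enabled database shreds them into relational tables, which is the design choice the paper's GDBMS discussion turns on.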

  22. Configuration management program plan for Hanford site systems engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kellie, C.L.

    This plan establishes the integrated management program for the evolving technical baseline developed through the systems engineering process. This configuration management program aligns with the criteria identified in the DOE Standard, DOE-STD-1073-93. Included are specific requirements for control of the systems engineering RDD-100 database, and electronic data incorporated in the database that establishes the Hanford Site Technical Baseline.

  23. Choosing the Right Database Management Program.

    ERIC Educational Resources Information Center

    Vockell, Edward L.; Kopenec, Donald

    1989-01-01

    Provides a comparison of four database management programs commonly used in schools: AppleWorks, the DOS 3.3 and ProDOS versions of PFS, and MECC's Data Handler. Topics discussed include information storage, spelling checkers, editing functions, search strategies, graphs, printout formats, library applications, and HyperCard. (LRW)

  24. Interactive, Automated Management of Icing Data

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.

    2009-01-01

    IceVal DatAssistant is software (see figure) that provides an automated, interactive solution for the management of data from research on aircraft icing. This software consists primarily of (1) a relational database component used to store ice shape and airfoil coordinates and associated data on operational and environmental test conditions and (2) a graphically oriented database access utility used to upload, download, process, and/or display data selected by the user. The relational database component consists of a Microsoft Access 2003 database file with nine tables containing data of different types. Included in the database are the data for all publicly releasable ice tracings with complete and verifiable test conditions from experiments conducted to date in the Glenn Research Center Icing Research Tunnel. Ice shapes from computational simulations with the corresponding conditions, performed using the latest version of the LEWICE ice shape prediction code, are likewise included and are linked to the equivalent experimental runs. The database access component includes ten Microsoft Visual Basic 6.0 (VB) form modules and three VB support modules. Together, these modules enable uploading, downloading, processing, and display of all data contained in the database. This component also affords the capability to perform various database maintenance functions, for example compacting the database or creating a new, fully initialized but empty database file.

  25. Do Librarians Really Do That? Or Providing Custom, Fee-Based Services.

    ERIC Educational Resources Information Center

    Whitmore, Susan; Heekin, Janet

    This paper describes some of the fee-based, custom services provided by National Institutes of Health (NIH) Library to NIH staff, including knowledge management, clinical liaisons, specialized database searching, bibliographic database development, Web resource guide development, and journal management. The first section discusses selecting the…

  26. Implementation of Risk Management in NASA's CEV Project- Ensuring Mission Success

    NASA Astrophysics Data System (ADS)

    Perera, Jeevan; Holsomback, Jerry D.

    2005-12-01

    Most project managers know that Risk Management (RM) is essential to good project management. At NASA, standards and procedures to manage risk through a tiered approach have been developed, from the global agency-wide requirements down to a program or project's implementation. The basic methodology of NASA's risk management strategy includes processes to identify, analyze, plan, track, control, communicate, and document risks. The identification, characterization, mitigation plan, and mitigation responsibilities associated with specific risks are documented to help communicate, manage, and effectuate appropriate closure. This approach helps to ensure more consistent documentation and assessment and provides a means of archiving lessons learned for future identification or mitigation activities. A new risk database and management tool was developed by NASA in 2002 and has since been used successfully to communicate, document, and manage a number of diverse risks for the International Space Station, the Space Shuttle, and several other NASA projects and programs, including at the Johnson Space Center. Organizations use this database application to manage and track each risk and to gain insight into impacts from other organizations' viewpoints in order to develop integrated solutions. Schedule, cost, technical, and safety issues are tracked in detail through this system. Risks are tagged within the system to ensure proper review, coordination, and management at the necessary management level. The database is intended as a day-to-day tool for organizations to manage their risks and elevate those issues that need coordination from above. Each risk is assigned to a managing organization and a specific risk owner, who generates mitigation plans as appropriate. In essence, the risk owner is responsible for shepherding the risk through closure. The individual who identifies a new risk is not necessarily assigned as the risk owner.
    Whoever is in the best position to effectuate comprehensive closure is assigned as the risk owner. Each mitigation plan includes the specific tasks that will be conducted to decrease the likelihood of the risk occurring and/or lessen the severity of the consequences if it does occur. As each mitigation task is completed, the responsible managing organization records the completion of the task in the risk database and then re-scores the risk in light of the task's results. By keeping scores updated, a managing organization's current top risks and risk posture can be readily identified, along with the status of any risk in the system. A number of metrics measure risk process trends from data contained in the database. This allows for trend analysis to further improve the process and assist in the management of all risks. The metrics also scrutinize both the effectiveness of and compliance with risk management requirements. The risk database is an evolving tool and will be continuously improved with capabilities requested by the NASA project community. This paper presents the basic foundations of risk management, the elements necessary for effective risk management, and the capabilities of this new risk database, and describes how it is implemented to support NASA's risk management needs.
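
    The re-scoring step this abstract describes is commonly implemented as a likelihood-consequence matrix. The 5x5 scale in this Python sketch is a conventional assumption, not necessarily the scale used in NASA's actual tool, and the risk record is invented:

```python
def risk_score(likelihood, consequence):
    """Score a risk on a conventional 5x5 matrix; higher means riskier."""
    assert 1 <= likelihood <= 5 and 1 <= consequence <= 5
    return likelihood * consequence

risk = {"id": "CEV-042", "likelihood": 4, "consequence": 5}
before = risk_score(risk["likelihood"], risk["consequence"])

# Completing a mitigation task lowers the likelihood; the risk owner
# then re-scores the risk, as described in the abstract.
risk["likelihood"] = 2
after = risk_score(risk["likelihood"], risk["consequence"])
print(before, after)  # 20 10
```

    Sorting all open risks by their current score is what yields the "current top risks" view mentioned above.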

  27. Tourism through Travel Club: A Database Project

    ERIC Educational Resources Information Center

    Pratt, Renée M. E.; Smatt, Cindi T.; Wynn, Donald E.

    2017-01-01

    This applied database exercise utilizes a scenario-based case study to teach the basics of Microsoft Access and database management in introductory information systems and database courses. The case includes background information on a start-up business (i.e., Carol's Travel Club), a description of functional business requirements,…

  28. A Summary of Pavement and Material-Related Databases within the Texas Department of Transportation

    DOT National Transportation Integrated Search

    1999-09-01

    This report summarizes important content and operational details about five different materials and pavement databases currently used by the Texas Department of Transportation (TxDOT). These databases include the Pavement Management Information Syste...

  29. Integrating RFID technique to design mobile handheld inventory management system

    NASA Astrophysics Data System (ADS)

    Huang, Yo-Ping; Yen, Wei; Chen, Shih-Chung

    2008-04-01

    An RFID-based mobile handheld inventory management system is proposed in this paper. Differing from the manual inventory management method, the proposed system works on the personal digital assistant (PDA) with an RFID reader. The system identifies electronic tags on the properties and checks the property information in the back-end database server through a ubiquitous wireless network. The system also provides a set of functions to manage the back-end inventory database and assigns different levels of access privilege according to various user categories. In the back-end database server, to prevent improper or illegal accesses, the server not only stores the inventory database and user privilege information, but also keeps track of the user activities in the server including the login and logout time and location, the records of database accessing, and every modification of the tables. Some experimental results are presented to verify the applicability of the integrated RFID-based mobile handheld inventory management system.
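
    The back-end pattern this abstract describes, a tag lookup gated by a privilege check and recorded in an activity log, can be sketched in a few lines. The tag IDs, roles, and record layout below are invented for illustration, not taken from the paper:

```python
import datetime

inventory = {"E200-3412": {"item": "Projector", "location": "Room 201"}}
privileges = {"auditor": {"read"}, "manager": {"read", "write"}}
audit_log = []

def handle_scan(user, role, tag_id):
    """Look up a scanned tag and record the access, mimicking the
    back-end server's privilege check and user-activity tracking."""
    if "read" not in privileges.get(role, set()):
        raise PermissionError(role)
    audit_log.append((user, tag_id, datetime.datetime.now().isoformat()))
    return inventory.get(tag_id)

record = handle_scan("alice", "auditor", "E200-3412")
print(record["item"])  # Projector
```

    In the paper's system the PDA performs the scan and the lookup runs on the database server over the wireless network; only the division of labor differs from this single-process sketch.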

  30. Improving data management and dissemination in web based information systems by semantic enrichment of descriptive data aspects

    NASA Astrophysics Data System (ADS)

    Gebhardt, Steffen; Wehrmann, Thilo; Klinger, Verena; Schettler, Ingo; Huth, Juliane; Künzer, Claudia; Dech, Stefan

    2010-10-01

    The German-Vietnamese water-related information system for the Mekong Delta (WISDOM) project supports business processes in Integrated Water Resources Management in Vietnam. Multiple disciplines bring together earth- and ground-based observation themes, such as environmental monitoring, water management, demographics, economy, information technology, and infrastructural systems. This paper introduces the components of the web-based WISDOM system, including the data, logic, and presentation tiers. It focuses on the data models upon which the database management system is built, including techniques for tagging or linking metadata with the stored information. The model also uses ordered groupings of spatial, thematic, and temporal reference objects to semantically tag datasets and enable fast data retrieval, such as finding all data in a specific administrative unit belonging to a specific theme. A spatial extension to the PostgreSQL database is employed. This object-relational database was chosen over a purely relational one so that spatial objects can be tagged to tabular data, improving the retrieval of census and observational data at regional, provincial, and local levels. Because the spatial database handles raster data poorly, a "work-around" was built into WISDOM to permit efficient management of both raster and vector data. The data model also incorporates styling aspects of the spatial datasets through Styled Layer Descriptor (SLD) and Web Map Service (WMS) layer specifications, allowing retrieval of rendered maps. Metadata elements of the spatial data are based on the ISO 19115 standard. XML-structured information of the SLD and metadata is stored in an XML database. The data models and the data management system are robust for managing the large quantity of spatial objects, sensor observations, census data, and documents.
The operational WISDOM information system prototype contains modules for data management, automatic data integration, and web services for data retrieval, analysis, and distribution. The graphical user interfaces facilitate metadata cataloguing, data warehousing, web sensor data analysis and thematic mapping.
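The reference-object tagging scheme described above can be sketched in miniature. Everything below is illustrative only: the table names, the `find` helper, and the sample data are invented, and SQLite stands in for the PostgreSQL back end.

```python
import sqlite3

# Illustrative sketch of the reference-object tagging idea (table names and
# sample data are invented; SQLite stands in for the PostgreSQL back end).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dataset    (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE ref_object (id INTEGER PRIMARY KEY, kind TEXT, value TEXT);
CREATE TABLE tag        (dataset_id INTEGER, ref_id INTEGER);
""")
conn.executemany("INSERT INTO dataset VALUES (?, ?)",
                 [(1, "census_2009"), (2, "water_quality_2009")])
conn.executemany("INSERT INTO ref_object VALUES (?, ?, ?)",
                 [(10, "spatial",  "Can Tho province"),
                  (11, "thematic", "demographics"),
                  (12, "thematic", "water management")])
conn.executemany("INSERT INTO tag VALUES (?, ?)",
                 [(1, 10), (1, 11), (2, 10), (2, 12)])

def find(pairs):
    """Names of datasets tagged with ALL of the given (kind, value) pairs."""
    cond = " OR ".join(["(r.kind = ? AND r.value = ?)"] * len(pairs))
    params = [p for pair in pairs for p in pair] + [len(pairs)]
    rows = conn.execute(f"""
        SELECT d.name FROM dataset d
        JOIN tag t        ON t.dataset_id = d.id
        JOIN ref_object r ON r.id = t.ref_id
        WHERE {cond}
        GROUP BY d.id HAVING COUNT(DISTINCT r.id) = ?""", params)
    return [name for (name,) in rows]

# "All data in a specific administrative unit belonging to a specific theme":
print(find([("spatial", "Can Tho province"), ("thematic", "demographics")]))
```

Requiring a match on every pair (the `HAVING` clause) is what turns the flat tag table into the fast conjunctive retrieval the abstract describes.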

  11. Configuration management program plan for Hanford site systems engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, A.G.

    This plan establishes the integrated configuration management program for the evolving technical baseline developed through the systems engineering process. This configuration management program aligns with the criteria identified in the DOE Standard, DOE-STD-1073-93. Included are specific requirements for control of the systems engineering RDD-100 database, and electronic data incorporated in the database that establishes the Hanford site technical baseline.

  12. Database Access Manager for the Software Engineering Laboratory (DAMSEL) user's guide

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Operating instructions for the Database Access Manager for the Software Engineering Laboratory (DAMSEL) system are presented. Step-by-step instructions for performing various data entry and report generation activities are included. Sample sessions showing the user interface display screens are also included. Instructions for generating reports are accompanied by sample outputs for each of the reports. The document groups the available software functions by the classes of users that may access them.

  13. The Cocoa Shop: A Database Management Case

    ERIC Educational Resources Information Center

    Pratt, Renée M. E.; Smatt, Cindi T.

    2015-01-01

    This is an example of a real-world applicable case study, which includes background information on a small local business (i.e., TCS), description of functional business requirements, and sample data. Students are asked to design and develop a database to improve the management of the company's customers, products, and purchases by emphasizing…

  14. Federated Web-accessible Clinical Data Management within an Extensible NeuroImaging Database

    PubMed Central

    Keator, David B.; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R.; Bockholt, Jeremy; Grethe, Jeffrey S.

    2010-01-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site. PMID:20567938

  15. Federated web-accessible clinical data management within an extensible neuroimaging database.

    PubMed

    Ozyurt, I Burak; Keator, David B; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R; Bockholt, Jeremy; Grethe, Jeffrey S

    2010-12-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site.
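The parallel query builder/result combiner can be illustrated with a toy federated query. The site tables, columns, and threshold below are invented, and SQLite stands in for the federated HID back ends; this is a sketch of the pattern, not the NITRC code.

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

# Toy federation: each "site" is its own database holding the same schema.
# Site data, columns, and the score threshold are invented for illustration.
def make_site_db(rows):
    db = sqlite3.connect(":memory:", check_same_thread=False)
    db.execute("CREATE TABLE assessment (subject TEXT, score REAL)")
    db.executemany("INSERT INTO assessment VALUES (?, ?)", rows)
    return db

sites = [
    make_site_db([("sub-01", 24.0), ("sub-02", 29.5)]),
    make_site_db([("sub-10", 27.0)]),
]

def query_site(db, min_score):
    # The per-site part of the query: runs independently at each site.
    return db.execute(
        "SELECT subject, score FROM assessment WHERE score >= ?",
        (min_score,)).fetchall()

def federated_query(min_score):
    # Dispatch the same query to every site in parallel, then combine.
    with ThreadPoolExecutor() as pool:
        parts = pool.map(lambda db: query_site(db, min_score), sites)
    return sorted(row for part in parts for row in part)

print(federated_query(25))  # hits combined across both site databases
```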

  16. Managing Automation: A Process, Not a Project.

    ERIC Educational Resources Information Center

    Hoffmann, Ellen

    1988-01-01

    Discussion of issues in management of library automation includes: (1) hardware, including systems growth and contracts; (2) software changes, vendor relations, local systems, and microcomputer software; (3) item and authority databases; (4) automation and library staff, organizational structure, and managing change; and (5) environmental issues,…

  17. The Fabric for Frontier Experiments Project at Fermilab

    NASA Astrophysics Data System (ADS)

    Kirby, Michael

    2014-06-01

The FabrIc for Frontier Experiments (FIFE) project is a new, far-reaching initiative within the Fermilab Scientific Computing Division to drive the future of computing services for experiments at FNAL and elsewhere. It is a collaborative effort between computing professionals and experiment scientists to produce an end-to-end, fully integrated set of services for computing on the grid and clouds, managing data, accessing databases, and collaborating within experiments. FIFE includes 1) easy-to-use job submission services for processing physics tasks on the Open Science Grid and elsewhere; 2) an extensive data management system for managing local and remote caches, cataloging, querying, moving, and tracking the use of data; 3) custom and generic database applications for calibrations, beam information, and other purposes; 4) collaboration tools including an electronic log book, speakers bureau database, and experiment membership database. All of these aspects will be discussed in detail. FIFE sets the direction of computing at Fermilab experiments now and in the future, and therefore is a major driver in the design of computing services worldwide.

  18. Developing an ontological explosion knowledge base for business continuity planning purposes.

    PubMed

    Mohammadfam, Iraj; Kalatpour, Omid; Golmohammadi, Rostam; Khotanlou, Hasan

    2013-01-01

    Industrial accidents are among the most known challenges to business continuity. Many organisations have lost their reputation following devastating accidents. To manage the risks of such accidents, it is necessary to accumulate sufficient knowledge regarding their roots, causes and preventive techniques. The required knowledge might be obtained through various approaches, including databases. Unfortunately, many databases are hampered by (among other things) static data presentations, a lack of semantic features, and the inability to present accident knowledge as discrete domains. This paper proposes the use of Protégé software to develop a knowledge base for the domain of explosion accidents. Such a structure has a higher capability to improve information retrieval compared with common accident databases. To accomplish this goal, a knowledge management process model was followed. The ontological explosion knowledge base (EKB) was built for further applications, including process accident knowledge retrieval and risk management. The paper will show how the EKB has a semantic feature that enables users to overcome some of the search constraints of existing accident databases.

  19. Database Software for Social Studies. A MicroSIFT Quarterly Report.

    ERIC Educational Resources Information Center

    Weaver, Dave

    The report describes and evaluates the use of a set of learning tools called database managers and their creation of databases to help teach problem solving skills in social studies. Details include the design, building, and use of databases in a social studies setting, along with advantages and disadvantages of using them. The three types of…

  20. VIEWCACHE: An incremental pointer-based access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, N.; Sellis, Timos

    1992-01-01

One of the biggest problems facing NASA today is providing scientists efficient access to a large number of distributed databases. Our pointer-based incremental database access method, VIEWCACHE, provides such an interface for accessing distributed data sets and directories. VIEWCACHE allows database browsing and searches that perform inter-database cross-referencing with no actual data movement between database sites. This organization and processing is especially suitable for managing astrophysics databases which are physically distributed all over the world. Once the search is complete, the set of collected pointers to the desired data is cached. VIEWCACHE includes spatial access methods for accessing image data sets, which allow much easier query formulation by referring directly to the image and very efficient search for objects contained within a two-dimensional window. We will develop and optimize a VIEWCACHE External Gateway Access to database management systems to facilitate distributed database search.
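A hedged miniature of the pointer-caching idea follows; all names, coordinates, and the `Pointer` type are invented for illustration, whereas the real VIEWCACHE operates on distributed catalogs and image archives.

```python
from dataclasses import dataclass

# Illustrative sketch only: a cross-database search collects *pointers* to
# matching records rather than moving the data, and the cached pointer set
# then supports a 2-D window query.
@dataclass(frozen=True)
class Pointer:
    site: str      # which remote database holds the object
    key: int       # record key at that site
    x: float       # object position, used for window queries
    y: float

def cross_reference(catalogs, predicate):
    """Collect pointers from several catalogs without fetching payloads."""
    return [p for cat in catalogs for p in cat if predicate(p)]

def window(cache, x0, y0, x1, y1):
    """Pointers whose position falls inside the 2-D window."""
    return [p for p in cache if x0 <= p.x <= x1 and y0 <= p.y <= y1]

site_a = [Pointer("A", 1, 10.0, 20.0), Pointer("A", 2, 80.0, 5.0)]
site_b = [Pointer("B", 7, 12.5, 22.0)]

cache = cross_reference([site_a, site_b], lambda p: True)  # cached result set
hits = window(cache, 9.0, 18.0, 15.0, 25.0)
print([(p.site, p.key) for p in hits])  # pointers only; no data moved yet
```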

  1. Facilitating quality control for spectra assignments of small organic molecules: nmrshiftdb2--a free in-house NMR database with integrated LIMS for academic service laboratories.

    PubMed

    Kuhn, Stefan; Schlörer, Nils E

    2015-08-01

With its laboratory information management system, nmrshiftdb2 supports the integration of electronic lab administration and management into academic NMR facilities. It also offers the setup of a local database while granting full access to nmrshiftdb2's World Wide Web database. On the one hand, this freely available system allows the submission of orders for measurement, transfers recorded data automatically or manually, and enables downloading of spectra via a web interface, as well as integrated access to the prediction, search, and assignment tools of the NMR database for lab users. On the other hand, for staff and lab administration, the flow of all orders can be supervised; administrative tools also include user and hardware management, a statistics function for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database use a web interface as front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.

  2. How I do it: a practical database management system to assist clinical research teams with data collection, organization, and reporting.

    PubMed

    Lee, Howard; Chapiro, Julius; Schernthaner, Rüdiger; Duran, Rafael; Wang, Zhijun; Gorodetski, Boris; Geschwind, Jean-François; Lin, MingDe

    2015-04-01

    The objective of this study was to demonstrate that an intra-arterial liver therapy clinical research database system is a more workflow efficient and robust tool for clinical research than a spreadsheet storage system. The database system could be used to generate clinical research study populations easily with custom search and retrieval criteria. A questionnaire was designed and distributed to 21 board-certified radiologists to assess current data storage problems and clinician reception to a database management system. Based on the questionnaire findings, a customized database and user interface system were created to perform automatic calculations of clinical scores including staging systems such as the Child-Pugh and Barcelona Clinic Liver Cancer, and facilitates data input and output. Questionnaire participants were favorable to a database system. The interface retrieved study-relevant data accurately and effectively. The database effectively produced easy-to-read study-specific patient populations with custom-defined inclusion/exclusion criteria. The database management system is workflow efficient and robust in retrieving, storing, and analyzing data. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
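As a sketch of the kind of automatic score calculation mentioned, the standard Child-Pugh computation fits in a few lines. The thresholds below follow the commonly published cutoffs (points 1-3 per parameter; totals 5-6 = class A, 7-9 = B, 10-15 = C); the study's actual implementation is not shown in the abstract and may differ.

```python
# Standard Child-Pugh scoring, per the commonly published cutoffs.
# This is an illustrative sketch, not the paper's implementation.
def child_pugh(bilirubin_mg_dl, albumin_g_dl, inr, ascites, encephalopathy):
    pts = 0
    pts += 1 if bilirubin_mg_dl < 2 else 2 if bilirubin_mg_dl <= 3 else 3
    pts += 1 if albumin_g_dl > 3.5 else 2 if albumin_g_dl >= 2.8 else 3
    pts += 1 if inr < 1.7 else 2 if inr <= 2.3 else 3
    pts += {"none": 1, "mild": 2, "severe": 3}[ascites]
    pts += {"none": 1, "grade 1-2": 2, "grade 3-4": 3}[encephalopathy]
    grade = "A" if pts <= 6 else "B" if pts <= 9 else "C"
    return pts, grade

print(child_pugh(1.5, 3.8, 1.2, "none", "none"))  # (5, 'A')
print(child_pugh(2.5, 3.0, 2.0, "mild", "none"))  # (9, 'B')
```

Automating such derived scores in the database, rather than computing them by hand in a spreadsheet, is precisely the workflow gain the study reports.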

  3. 48 CFR 52.204-13 - System for Award Management Maintenance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... (SAM) database means that— (1) The Contractor has entered all mandatory information, including the DUNS... database; (2) The Contractor has completed the Core, Assertions, Representations and Certifications, and Points of Contact sections of the registration in the SAM database; (3) The Government has validated all...

  4. Salary Management System for Small and Medium-sized Enterprises

    NASA Astrophysics Data System (ADS)

    Hao, Zhang; Guangli, Xu; Yuhuan, Zhang; Yilong, Lei

In the past, wage entry, calculation, and totaling in Small and Medium-sized Enterprises (SMEs) had to be done manually; the data volume is large, processing is slow, and errors are easy to make, resulting in low efficiency. The main purpose of this paper is to present the basis of a salary management system: establishing a scientific database and a computerized payroll system that replaces much of the former manual work, reduces duplicated staff labor, and improves working efficiency. The system addresses the actual needs of SMEs through in-depth study and practice of the C/S (client/server) mode, the PowerBuilder 10.0 development tool, databases, and the SQL language, and it completes the requirements analysis, database design, and application design and development of a payroll system. Wage, department, unit, and personnel database files are included, and the system provides data management, department management, personnel management, and other functions; query, add, delete, and modify operations are realized through control and management of the database. The design is reasonable, the functionality fairly complete, and testing has shown stable operation that meets the basic needs of the work.
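The payroll schema the abstract outlines might look like the following sketch. SQLite replaces the paper's PowerBuilder/SQL stack, and the table names and sample values are invented.

```python
import sqlite3

# A minimal sketch of the payroll tables the paper describes (wages,
# departments, personnel); names and data are illustrative only.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE department (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE employee   (id INTEGER PRIMARY KEY, name TEXT, dept_id INTEGER);
CREATE TABLE wage       (employee_id INTEGER, month TEXT, base REAL, bonus REAL);
""")
db.execute("INSERT INTO department VALUES (1, 'Sales')")
db.executemany("INSERT INTO employee VALUES (?, ?, ?)",
               [(1, "Li", 1), (2, "Wang", 1)])
db.executemany("INSERT INTO wage VALUES (?, ?, ?, ?)",
               [(1, "2024-01", 5000, 300), (2, "2024-01", 5200, 0)])

# The wage total that used to be computed by hand becomes a single query.
total = db.execute("""
    SELECT d.name, SUM(w.base + w.bonus)
    FROM wage w JOIN employee e ON e.id = w.employee_id
                JOIN department d ON d.id = e.dept_id
    WHERE w.month = '2024-01' GROUP BY d.id
""").fetchone()
print(total)  # ('Sales', 10500.0)
```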

  5. VIEWCACHE: An incremental pointer-based access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, N.; Sellis, Timos

    1993-01-01

    One of the biggest problems facing NASA today is to provide scientists efficient access to a large number of distributed databases. Our pointer-based incremental data base access method, VIEWCACHE, provides such an interface for accessing distributed datasets and directories. VIEWCACHE allows database browsing and search performing inter-database cross-referencing with no actual data movement between database sites. This organization and processing is especially suitable for managing Astrophysics databases which are physically distributed all over the world. Once the search is complete, the set of collected pointers pointing to the desired data are cached. VIEWCACHE includes spatial access methods for accessing image datasets, which provide much easier query formulation by referring directly to the image and very efficient search for objects contained within a two-dimensional window. We will develop and optimize a VIEWCACHE External Gateway Access to database management systems to facilitate database search.

  6. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    NASA Astrophysics Data System (ADS)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of the massive amounts of data generated by the MUSER, this paper proposes a novel data management technique called the negative database (ND) and uses it to implement a data management system for the MUSER. Built on a key-value database, the ND technique makes full use of the complement set of the observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when the time needed to derive absent records is taken into account, its overall performance, including querying and deriving data, is comparable with that of an RDBMS. The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required by next-generation telescopes.
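The complement-set idea can be shown in a toy form. This is illustrative only and not MUSER's actual ND implementation, which is built on a key-value store.

```python
# Toy sketch of the negative-database idea: rather than storing every
# observed record of an otherwise-regular sequence, store the *complement*
# (the records that are absent) and derive presence on demand.
def build_negative_db(expected_keys, observed_keys):
    """Store only the missing keys -- usually a much smaller set."""
    return set(expected_keys) - set(observed_keys)

def present(negative_db, key):
    """A record exists iff it is NOT in the negative database."""
    return key not in negative_db

expected = range(1_000)  # e.g. frame numbers expected in one interval
observed = [k for k in expected if k not in (17, 523)]
nd = build_negative_db(expected, observed)

print(len(nd))          # 2 -- only the absent records are stored
print(present(nd, 42))  # True
print(present(nd, 17))  # False
```

When almost every expected record actually arrives, the negative set stays tiny, which is the storage saving the experiments report.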

  7. Small Business Innovations (Automated Information)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Bruce G. Jackson & Associates Document Director is an automated tool that combines word processing and database management technologies to offer the flexibility and convenience of text processing with the linking capability of database management. Originally developed for NASA, it provides a means to collect and manage information associated with requirements development. The software system was used by NASA in the design of the Assured Crew Return Vehicle, as well as by other government and commercial organizations including the Southwest Research Institute.

  8. A new Volcanic managEment Risk Database desIgn (VERDI): Application to El Hierro Island (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Bartolini, S.; Becerril, L.; Martí, J.

    2014-11-01

One of the most important issues in modern volcanology is the assessment of volcanic risk, which will depend - among other factors - on both the quantity and quality of the available data and an optimum storage mechanism. This will require the design of purpose-built databases that take into account data format and availability and afford easy data storage and sharing, and will provide for a more complete risk assessment that combines different analyses but avoids any duplication of information. Data contained in any such database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. The crux of the matter is ensuring that VERDI will serve as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure and the possibilities it offers for data organization are shown here through its application to El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool that will assist in conducting volcanic risk assessment and management.

  9. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage of database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the process of applying the results of the analysis of users' information needs and the rationale for the use of classifiers.

  10. Construction of a Linux based chemical and biological information system.

    PubMed

    Molnár, László; Vágó, István; Fehér, András

    2003-01-01

    A chemical and biological information system with a Web-based easy-to-use interface and corresponding databases has been developed. The constructed system incorporates all chemical, numerical and textual data related to the chemical compounds, including numerical biological screen results. Users can search the database by traditional textual/numerical and/or substructure or similarity queries through the web interface. To build our chemical database management system, we utilized existing IT components such as ORACLE or Tripos SYBYL for database management and Zope application server for the web interface. We chose Linux as the main platform, however, almost every component can be used under various operating systems.

  11. Software Engineering Laboratory (SEL) database organization and user's guide, revision 2

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Bristow, John

    1992-01-01

    The organization of the Software Engineering Laboratory (SEL) database is presented. Included are definitions and detailed descriptions of the database tables and views, the SEL data, and system support data. The mapping from the SEL and system support data to the base table is described. In addition, techniques for accessing the database through the Database Access Manager for the SEL (DAMSEL) system and via the ORACLE structured query language (SQL) are discussed.

  12. Software Engineering Laboratory (SEL) database organization and user's guide

    NASA Technical Reports Server (NTRS)

    So, Maria; Heller, Gerard; Steinberg, Sandra; Spiegel, Douglas

    1989-01-01

    The organization of the Software Engineering Laboratory (SEL) database is presented. Included are definitions and detailed descriptions of the database tables and views, the SEL data, and system support data. The mapping from the SEL and system support data to the base tables is described. In addition, techniques for accessing the database, through the Database Access Manager for the SEL (DAMSEL) system and via the ORACLE structured query language (SQL), are discussed.

  13. Design, Development and Utilization Perspectives on Database Management Systems

    ERIC Educational Resources Information Center

    Shneiderman, Ben

    1977-01-01

    This paper reviews the historical development of integrated data base management systems and examines competing approaches. Topics include management and utilization, implementation and design, query languages, security, integrity, privacy and concurrency. (Author/KP)

  14. Starbase Data Tables: An ASCII Relational Database for Unix

    NASA Astrophysics Data System (ADS)

    Roll, John

    2011-11-01

    Database management is an increasingly important part of astronomical data analysis. Astronomers need easy and convenient ways of storing, editing, filtering, and retrieving data about data. Commercial databases do not provide good solutions for many of the everyday and informal types of database access astronomers need. The Starbase database system with simple data file formatting rules and command line data operators has been created to answer this need. The system includes a complete set of relational and set operators, fast search/index and sorting operators, and many formatting and I/O operators. Special features are included to enhance the usefulness of the database when manipulating astronomical data. The software runs under UNIX, MSDOS and IRAF.
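The flavor of such relational operators over plain ASCII tables can be sketched as follows. The format here is a generic tab-separated table with a header line, not the exact Starbase dialect, and the star data are invented.

```python
# Sketch of simple relational operators over ASCII tables (generic
# tab-separated format with a header line; not the Starbase dialect).
def read_table(text):
    lines = text.strip().splitlines()
    header = lines[0].split("\t")
    return [dict(zip(header, line.split("\t"))) for line in lines[1:]]

def select(rows, pred):        # relational selection
    return [r for r in rows if pred(r)]

def join(left, right, key):    # natural join on one column
    index = {r[key]: r for r in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

stars = read_table("name\tra\nVega\t279.2\nSirius\t101.3")
mags = read_table("name\tmag\nVega\t0.03\nSirius\t-1.46")

bright = select(join(stars, mags, "name"), lambda r: float(r["mag"]) < 0)
print([r["name"] for r in bright])  # ['Sirius']
```

Because each operator consumes and produces the same plain-text table shape, operators compose on the command line or in scripts, which is the informal, everyday access pattern the abstract argues commercial databases serve poorly.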

  15. MicroUse: The Database on Microcomputer Applications in Libraries and Information Centers.

    ERIC Educational Resources Information Center

    Chen, Ching-chih; Wang, Xiaochu

    1984-01-01

    Describes MicroUse, a microcomputer-based database on microcomputer applications in libraries and information centers which was developed using relational database manager dBASE II. The description includes its system configuration, software utilized, the in-house-developed dBASE programs, multifile structure, basic functions, MicroUse records,…

  16. The Database Business: Managing Today--Planning for Tomorrow. Issues and Futures.

    ERIC Educational Resources Information Center

    Aitchison, T. M.; And Others

    1988-01-01

    Current issues and the future of the database business are discussed in five papers. Topics covered include aspects relating to the quality of database production; international ownership in the U.S. information marketplace; an overview of pricing strategies in the electronic information industry; and pricing issues from the viewpoints of online…

  17. Towards G2G: Systems of Technology Database Systems

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David

    2005-01-01

We present an approach and methodology for developing Government-to-Government (G2G) Systems of Technology Database Systems. G2G will deliver technologies for distributed and remote integration of technology data for internal use in analysis and planning as well as for external communications. G2G enables NASA managers, engineers, operational teams and information systems to "compose" technology roadmaps and plans by selecting, combining, extending, specializing and modifying components of technology database systems. G2G will interoperate information and knowledge distributed across the organizational entities involved, which is ideal for NASA's future Exploration Enterprise. Key contributions of the G2G system will include an integrated approach that sustains effective management of technology investments while allowing the various technology database systems to be independently managed. The integration technology will comply with emerging open standards. Applications can thus be customized for local needs while enabling an integrated management-of-technology approach that serves the global needs of NASA. The G2G capabilities will use NASA's breakthrough in database "composition" and integration technology, will use and advance emerging open standards, and will use commercial information technologies to enable effective Systems of Technology Database systems.

  18. The Fabric for Frontier Experiments Project at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirby, Michael

    2014-01-01

The FabrIc for Frontier Experiments (FIFE) project is a new, far-reaching initiative within the Fermilab Scientific Computing Division to drive the future of computing services for experiments at FNAL and elsewhere. It is a collaborative effort between computing professionals and experiment scientists to produce an end-to-end, fully integrated set of services for computing on the grid and clouds, managing data, accessing databases, and collaborating within experiments. FIFE includes 1) easy-to-use job submission services for processing physics tasks on the Open Science Grid and elsewhere, 2) an extensive data management system for managing local and remote caches, cataloging, querying, moving, and tracking the use of data, 3) custom and generic database applications for calibrations, beam information, and other purposes, 4) collaboration tools including an electronic log book, speakers bureau database, and experiment membership database. All of these aspects will be discussed in detail. FIFE sets the direction of computing at Fermilab experiments now and in the future, and therefore is a major driver in the design of computing services worldwide.

  19. Clinical Databases for Chest Physicians.

    PubMed

    Courtwright, Andrew M; Gabriel, Peter E

    2018-04-01

    A clinical database is a repository of patient medical and sociodemographic information focused on one or more specific health condition or exposure. Although clinical databases may be used for research purposes, their primary goal is to collect and track patient data for quality improvement, quality assurance, and/or actual clinical management. This article aims to provide an introduction and practical advice on the development of small-scale clinical databases for chest physicians and practice groups. Through example projects, we discuss the pros and cons of available technical platforms, including Microsoft Excel and Access, relational database management systems such as Oracle and PostgreSQL, and Research Electronic Data Capture. We consider approaches to deciding the base unit of data collection, creating consensus around variable definitions, and structuring routine clinical care to complement database aims. We conclude with an overview of regulatory and security considerations for clinical databases. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.

  20. The ATLAS TAGS database distribution and management - Operational challenges of a multi-terabyte distributed database

    NASA Astrophysics Data System (ADS)

    Viegas, F.; Malon, D.; Cranshaw, J.; Dimitrov, G.; Nowak, M.; Nairz, A.; Goossens, L.; Gallas, E.; Gamboa, C.; Wong, A.; Vinek, E.

    2010-04-01

    The TAG files store summary event quantities that allow a quick selection of interesting events. This data will be produced at a nominal rate of 200 Hz, and is uploaded into a relational database for access from websites and other tools. The estimated database volume is 6TB per year, making it the largest application running on the ATLAS relational databases, at CERN and at other voluntary sites. The sheer volume and high rate of production makes this application a challenge to data and resource management, in many aspects. This paper will focus on the operational challenges of this system. These include: uploading the data from files to the CERN's and remote sites' databases; distributing the TAG metadata that is essential to guide the user through event selection; controlling resource usage of the database, from the user query load to the strategy of cleaning and archiving of old TAG data.
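TAG-style event selection can be sketched with a toy relational table; the column names and cuts below are invented for illustration, and SQLite stands in for the CERN Oracle back end.

```python
import sqlite3

# Illustrative sketch of TAG-style event selection: summary quantities per
# event go into a relational table so interesting events can be picked out
# with a cheap query before touching the full event data. Column names and
# cuts are invented for this example.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE tag
              (run INTEGER, event INTEGER, n_muons INTEGER, missing_et REAL)""")
db.executemany("INSERT INTO tag VALUES (?, ?, ?, ?)", [
    (1, 1001, 0, 12.0),
    (1, 1002, 2, 55.4),
    (2, 2001, 1, 80.1),
])

# Quick selection: events with at least one muon and large missing energy.
hits = db.execute("""SELECT run, event FROM tag
                     WHERE n_muons >= 1 AND missing_et > 50
                     ORDER BY run, event""").fetchall()
print(hits)  # [(1, 1002), (2, 2001)]
```

Only the (run, event) identifiers of the selected events come back from the query, which is why the TAG database can stay useful even as the full event store grows by terabytes per year.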

  1. PARPs database: A laboratory information management system (LIMS) for protein-protein interaction data mining

    PubMed Central

    Droit, Arnaud; Hunter, Joanna M; Rouleau, Michèle; Ethier, Chantal; Picard-Cloutier, Aude; Bourgais, David; Poirier, Guy G

    2007-01-01

    Background In the "post-genome" era, mass spectrometry (MS) has become an important method for the analysis of proteins and the rapid advancement of this technique, in combination with other proteomics methods, results in an increasing amount of proteome data. This data must be archived and analysed using specialized bioinformatics tools. Description We herein describe "PARPs database," a data analysis and management pipeline for liquid chromatography tandem mass spectrometry (LC-MS/MS) proteomics. PARPs database is a web-based tool whose features include experiment annotation, protein database searching, protein sequence management, as well as data-mining of the peptides and proteins identified. Conclusion Using this pipeline, we have successfully identified several interactions of biological significance between PARP-1 and other proteins, namely RFC-1, 2, 3, 4 and 5. PMID:18093328

  2. Slushie World: An In-Class Access Database Tutorial

    ERIC Educational Resources Information Center

    Wynn, Donald E., Jr.; Pratt, Renée M. E.

    2015-01-01

    The Slushie World case study is designed to teach the basics of Microsoft Access and database management over a series of three 75-minute class sessions. Students are asked to build a basic database to track sales and inventory for a small business. Skills to be learned include table creation, data entry and importing, form and report design,…

  3. A RESEARCH DATABASE FOR IMPROVED DATA MANAGEMENT AND ANALYSIS IN LONGITUDINAL STUDIES

    PubMed Central

    BIELEFELD, ROGER A.; YAMASHITA, TOYOKO S.; KEREKES, EDWARD F.; ERCANLI, EHAT; SINGER, LYNN T.

    2014-01-01

    We developed a research database for a five-year prospective investigation of the medical, social, and developmental correlates of chronic lung disease during the first three years of life. We used the Ingres database management system and the Statit statistical software package. The database includes records containing 1300 variables each, the results of 35 psychological tests, each repeated five times (providing longitudinal data on the child, the parents, and behavioral interactions), both raw and calculated variables, and both missing and deferred values. The four-layer menu-driven user interface incorporates automatic activation of complex functions to handle data verification, missing and deferred values, static and dynamic backup, determination of calculated values, display of database status, reports, bulk data extraction, and statistical analysis. PMID:7596250
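
    The abstract's distinction between missing values (never available) and deferred values (expected later) can be sketched as follows; the sentinel encoding and field names here are hypothetical illustrations, not the Ingres implementation described above.

```python
# Hypothetical sentinels: a "missing" value will never be collected, while a
# "deferred" value is pending and should be filled in at a later visit.
MISSING = "missing"
DEFERRED = "deferred"

def summarize(record):
    """Count observed, missing, and deferred values in one record."""
    counts = {"observed": 0, MISSING: 0, DEFERRED: 0}
    for value in record.values():
        if value == MISSING or value == DEFERRED:
            counts[value] += 1
        else:
            counts["observed"] += 1
    return counts

# Invented example visit record for one child in a longitudinal study.
visit = {"weight_kg": 12.4, "bayley_score": DEFERRED, "apgar_5min": MISSING}
print(summarize(visit))   # {'observed': 1, 'missing': 1, 'deferred': 1}
```

    Keeping the two states distinct matters for longitudinal studies: deferred values feed follow-up work lists, while missing values must be handled by the statistical analysis.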

  4. Technology for organization of the onboard system for processing and storage of ERS data for ultrasmall spacecraft

    NASA Astrophysics Data System (ADS)

    Strotov, Valery V.; Taganov, Alexander I.; Konkin, Yuriy V.; Kolesenkov, Aleksandr N.

    2017-10-01

    Processing and analysing Earth remote sensing data on board an ultra-small spacecraft is a relevant task, given the significant energy cost of data transfer and the low performance of onboard computers. This raises the issue of effective and reliable storage of the overall information flow obtained from onboard data-collection systems, including Earth remote sensing data, in a specialized database. The paper considers the peculiarities of operating a database management system with a multilevel memory structure. For data storage, a format has been developed that describes the physical structure of the database and contains the parameters required for loading information. This structure reduces the memory occupied by the database because key values need not be stored separately. The paper presents the architecture of a relational database management system designed for embedding into the onboard software of an ultra-small spacecraft. With this system, a database for storing various information, including Earth remote sensing data, can be developed for subsequent processing. The suggested architecture places low demands on the computing power and memory resources available on board an ultra-small spacecraft. Data integrity is ensured during input and modification of the structured information.
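
    The idea of a physical storage format that avoids storing key values separately can be sketched with fixed-width records whose key is implicit in the record's offset; the field layout below is a hypothetical illustration, not the format developed in the paper.

```python
import struct

# Fixed-width record layout: the key never appears in storage, because
# key = offset // RECORD_SIZE. Fields (sensor id, timestamp, value) are invented.
RECORD = struct.Struct("<HIf")   # uint16 sensor id, uint32 timestamp, float32 value
RECORD_SIZE = RECORD.size        # 10 bytes per record, no per-record key overhead

def write_record(buf, key, sensor_id, timestamp, value):
    """Store a record at the slot implied by its key."""
    RECORD.pack_into(buf, key * RECORD_SIZE, sensor_id, timestamp, value)

def read_record(buf, key):
    """Recover a record from the slot implied by its key."""
    return RECORD.unpack_from(buf, key * RECORD_SIZE)

storage = bytearray(RECORD_SIZE * 100)     # preallocated space for 100 records
write_record(storage, 7, 3, 1700000000, 21.5)
print(read_record(storage, 7))   # (3, 1700000000, 21.5)
```

    On memory-constrained hardware this trade (positional keys for stored keys) saves space at the cost of fixing the key range in advance, which matches the paper's emphasis on low memory requirements.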

  5. The use of intelligent database systems in acute pancreatitis--a systematic review.

    PubMed

    van den Heever, Marc; Mittal, Anubhav; Haydock, Matthew; Windsor, John

    2014-01-01

    Acute pancreatitis (AP) is a complex disease with multiple aetiological factors, wide-ranging severity, and multiple challenges to effective triage and management. Databases, data mining, and machine learning algorithms (MLAs), including artificial neural networks (ANNs), may assist by storing and interpreting data from multiple sources, potentially improving clinical decision-making. The aims were to: 1) identify database technologies used to store AP data; 2) collate and categorise variables stored in AP databases; 3) identify the MLA technologies, including ANNs, used to analyse AP data; and 4) identify clinical and non-clinical benefits and obstacles in establishing a national or international AP database. A comprehensive systematic search of online reference databases was performed. The predetermined inclusion criteria were all papers discussing 1) databases, 2) data mining, or 3) MLAs pertaining to AP, independently assessed by two reviewers with conflicts resolved by a third author. Forty-three papers were included. Three data mining technologies and five ANN methodologies were reported in the literature, and 187 collected variables were identified. ANNs increase the accuracy of severity prediction: one study showed ANNs had a sensitivity of 0.89 and a specificity of 0.96 six hours after admission, compared with 0.80 and 0.85 respectively for APACHE II (cutoff score ≥8). Reported problems with databases included incomplete data, lack of clinical data, and diagnostic unreliability. This is the first systematic review examining the use of databases, MLAs and ANNs in the management of AP. The clinical benefits these technologies have over current systems and other advantages to adopting them are identified. Copyright © 2013 IAP and EPC. Published by Elsevier B.V. All rights reserved.
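
    The sensitivity and specificity figures quoted above come from a standard confusion-matrix calculation; a minimal sketch with invented toy labels (not the review's data):

```python
def sens_spec(y_true, y_pred):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP) for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Toy labels: 1 = severe pancreatitis, 0 = mild (invented for illustration).
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
sensitivity, specificity = sens_spec(y_true, y_pred)
print(sensitivity, specificity)   # 0.75 0.8333...
```

    The comparison in the review (ANN 0.89/0.96 vs. APACHE II 0.80/0.85) is exactly this pair of ratios computed on each predictor's outputs against the observed outcomes.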

  6. Video Games for Diabetes Self-Management: Examples and Design Strategies

    PubMed Central

    Lieberman, Debra A.

    2012-01-01

    The July 2012 issue of the Journal of Diabetes Science and Technology includes a special symposium called “Serious Games for Diabetes, Obesity, and Healthy Lifestyle.” As part of the symposium, this article focuses on health behavior change video games that are designed to improve and support players’ diabetes self-management. Other symposium articles include one that recommends theory-based approaches to the design of health games and identifies areas in which additional research is needed, followed by five research articles presenting studies of the design and effectiveness of games and game technologies that require physical activity in order to play. This article briefly describes 14 diabetes self-management video games, and, when available, cites research findings on their effectiveness. The games were found by searching the Health Games Research online searchable database, three bibliographic databases (ACM Digital Library, PubMed, and Social Sciences Databases of CSA Illumina), and the Google search engine, using the search terms “diabetes” and “game.” Games were selected if they addressed diabetes self-management skills. PMID:22920805

  8. STI Handbook: Guidelines for Producing, Using, and Managing Scientific and Technical Information in the Department of the Navy. A Handbook for Navy Scientists and Engineers on the Use of Scientific and Technical Information

    DTIC Science & Technology

    1992-02-01

    Excerpts from the handbook address: what information should be included in the TR (technical report) database; the types of media that can be used to submit information to the TR database (reports, contract administration documents, regulations, commercially published books); and DTIC's WUIS database, which is used to control and report technical and management data and summarizes ongoing research and technology.

  9. Design and deployment of a large brain-image database for clinical and nonclinical research

    NASA Astrophysics Data System (ADS)

    Yang, Guo Liang; Lim, Choie Cheio Tchoyoson; Banukumar, Narayanaswami; Aziz, Aamer; Hui, Francis; Nowinski, Wieslaw L.

    2004-04-01

    An efficient database is an essential component for organizing diverse information on image metadata and patient information for research in medical imaging. This paper describes the design, development, and deployment of a large database system serving as a brain image repository that can be used across different platforms in various medical research projects. It forms the infrastructure that links hospitals and institutions together and shares data among them. The database contains patient-, pathology-, image-, research- and management-specific data. The functionalities of the database system include image uploading, storage, indexing, downloading, and sharing, as well as database querying and management, with security and data anonymization concerns well taken care of. The database is structured as a multi-tier client-server architecture comprising a relational database management system, a security layer, an application layer, and a user interface. An image source adapter has been developed to handle most of the popular image formats. The database has a user interface based on web browsers and is easy to use. We used the Java programming language for its platform independence and vast function libraries. The brain image database can sort data according to clinically relevant information, which can be used effectively in research from the clinicians' point of view. The database is suitable for validation of algorithms on large populations of cases. Medical images for processing can be identified and organized based on information in image metadata. Clinical research in various pathologies can thus be performed with greater efficiency, and large image repositories can be managed more effectively. A prototype of the system has been installed in a few hospitals and is working to the satisfaction of the clinicians.

  10. Computers and Library Management.

    ERIC Educational Resources Information Center

    Cooke, Deborah M.; And Others

    1985-01-01

    This five-article section discusses changes in the management of the school library resulting from use of the computer. Topics covered include data management programs (record keeping, word processing, and bibliographies); practical applications of a database; evaluation of "Circulation Plus" software; ergonomics and computers; and…

  11. Managing Documents in the Wider Area: Intelligent Document Management.

    ERIC Educational Resources Information Center

    Bittleston, Richard

    1995-01-01

    Discusses techniques for managing documents in wide area networks, reviews technique limitations, and offers recommendations to database designers. Presented techniques include: increasing bandwidth, reducing data traffic, synchronizing documentation, partial synchronization, audit trials, navigation, and distribution control and security. Two…

  12. USGS Nonindigenous Aquatic Species database with a focus on the introduced fishes of the lower Tennessee and Cumberland drainages

    USGS Publications Warehouse

    Fuller, Pamela L.; Cannister, Matthew; Johansen, Rebecca; Estes, L. Dwayne; Hamilton, Steven W.; Barrass, Andrew N.

    2013-01-01

    The Nonindigenous Aquatic Species (NAS) database (http://nas.er.usgs.gov) functions as a national repository and clearinghouse for occurrence data for introduced species within the United States. Included is locality information on over 1,100 species of vertebrates, invertebrates, and vascular plants introduced as early as 1850. Taxa include foreign (exotic) species and species native to North America that have been transported outside of their natural range. Locality data are obtained from published and unpublished literature, state, federal and local monitoring programs, museum accessions, on-line databases, websites, professional communications and on-line reporting forms. The NAS web site provides immediate access to new occurrence records through a real-time interface with the NAS database. Visitors to the web site are presented with a set of pre-defined queries that generate lists of species according to state or hydrologic basin of interest. Fact sheets, distribution maps, and information on new occurrences are updated as new records and information become available. The NAS database allows resource managers to learn of new introductions reported in their region or nearby regions, improving response time. Conversely, managers are encouraged to report their observations of new occurrences to the NAS database so information can be disseminated to other managers, researchers, and the public. In May 2004, the NAS database incorporated an Alert System to notify registered users of new introductions as part of a national early detection/rapid response system. Users can register to receive alerts based on geographic or taxonomic criteria. The NAS database was used to identify 23 fish species introduced into the lower Tennessee and Cumberland drainages. 
Most of these are sport fish stocked to support fisheries, but the list also includes accidental and illegal introductions such as Asian Carps, clupeids, various species popular in the aquarium trade, and Atlantic Needlefish (Strongylura marina) that was introduced via the newly-constructed Tennessee-Tombigbee Canal.
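
    The Alert System's matching of new occurrence records against registered geographic or taxonomic criteria can be sketched as follows; the field names, subscription format, and data are illustrative, not the real NAS schema.

```python
# Invented subscriptions: each user registers a state and/or a taxon of
# interest; None means "no restriction" on that criterion.
subscriptions = [
    {"user": "manager_a", "state": "TN", "taxon": None},
    {"user": "manager_b", "state": None, "taxon": "Hypophthalmichthys"},
]

def matching_subscribers(record, subscriptions):
    """Return users whose geographic AND taxonomic criteria match the record."""
    hits = []
    for sub in subscriptions:
        state_ok = sub["state"] is None or sub["state"] == record["state"]
        taxon_ok = sub["taxon"] is None or sub["taxon"] == record["genus"]
        if state_ok and taxon_ok:
            hits.append(sub["user"])
    return hits

# A hypothetical new occurrence: a bigheaded carp reported in Kentucky.
new_record = {"genus": "Hypophthalmichthys", "state": "KY"}
print(matching_subscribers(new_record, subscriptions))   # ['manager_b']
```

    The value for early detection/rapid response lies entirely in this fan-out step: a single reported occurrence immediately reaches every manager whose region or taxa of concern it touches.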

  13. The Marshall Islands Data Management Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoker, A.C.; Conrado, C.L.

    1995-09-01

    This report is a resource document of the methods and procedures currently used in the Data Management Program of the Marshall Islands Dose Assessment and Radioecology Project. Since 1973, over 60,000 environmental samples have been collected. Our program includes relational database design, programming and maintenance; sample and information management; sample tracking; quality control; and data entry, evaluation and reduction. The usefulness of scientific databases depends on careful planning to fulfill the requirements of any large research program. Compilation of scientific results requires consolidation of information from several databases and incorporation of new information as it is generated. The success in combining and organizing all radionuclide analyses, sample information, and statistical results into a readily accessible form is critical to our project.

  14. Insight: An ontology-based integrated database and analysis platform for epilepsy self-management research.

    PubMed

    Sahoo, Satya S; Ramesh, Priya; Welter, Elisabeth; Bukach, Ashley; Valdez, Joshua; Tatsuoka, Curtis; Bamps, Yvan; Stoll, Shelley; Jobst, Barbara C; Sajatovic, Martha

    2016-10-01

    We present Insight, an integrated database and analysis platform for epilepsy self-management research, developed as part of the national Managing Epilepsy Well Network. Insight is the only available informatics platform for accessing and analyzing integrated data from multiple epilepsy self-management research studies, with several new data management features and user-friendly functionalities. The features of Insight include: (1) use of Common Data Elements defined by members of the research community and an epilepsy domain ontology for data integration and querying; (2) visualization tools to support real-time exploration of data distribution across research studies; and (3) an interactive visual query interface for provenance-enabled research cohort identification. The Insight platform contains data from five completed epilepsy self-management research studies covering various categories of data, including depression, quality of life, seizure frequency, and socioeconomic information. The data represent over 400 participants with 7552 data points. The Insight data exploration and cohort identification query interface has been developed using Ruby on Rails web technology and the open source Web Ontology Language Application Programming Interface to support ontology-based reasoning. We have developed an efficient ontology management module that automatically updates the ontology mappings each time a new version of the Epilepsy and Seizure Ontology is released. The Insight platform features a role-based access control module to authenticate and effectively manage user access to different research studies. User access to Insight is managed by the Managing Epilepsy Well Network database steering committee, consisting of representatives of all current collaborating centers of the Managing Epilepsy Well Network.
New research studies are being continuously added to the Insight database and the size as well as the unique coverage of the dataset allows investigators to conduct aggregate data analysis that will inform the next generation of epilepsy self-management studies. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
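
    The role-based access control described above amounts to checking a user's roles against per-study grants; a toy sketch with invented role and study names, not Insight's actual module:

```python
# Hypothetical grant table: which studies each role may access.
access_grants = {
    "network_admin": {"study_1", "study_2", "study_3", "study_4", "study_5"},
    "site_analyst":  {"study_2"},
}

def can_access(user_roles, study):
    """A user may access a study if any of their roles grants it."""
    return any(study in access_grants.get(role, set()) for role in user_roles)

print(can_access(["site_analyst"], "study_2"))   # True
print(can_access(["site_analyst"], "study_3"))   # False
```

    Centralizing the grant table is what lets a steering committee, rather than individual study teams, manage who sees which research data.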

  15. The Clinical Next-Generation Sequencing Database: A Tool for the Unified Management of Clinical Information and Genetic Variants to Accelerate Variant Pathogenicity Classification.

    PubMed

    Nishio, Shin-Ya; Usami, Shin-Ichi

    2017-03-01

    Recent advances in next-generation sequencing (NGS) have given rise to new challenges due to the difficulties in variant pathogenicity interpretation and large dataset management, including many kinds of public population databases as well as public or commercial disease-specific databases. Here, we report a new database development tool, named the "Clinical NGS Database," for improving the clinical NGS workflow through the unified management of variant information and clinical information. This database software offers a two-feature approach to variant pathogenicity classification. The first is a phenotype similarity-based approach: the database allows easy comparison of the detailed phenotype of each patient with the average phenotype associated with the same gene mutation, at the variant or gene level, and makes it possible to quickly browse patients carrying the same gene mutation. The other is a statistical approach to variant pathogenicity classification based on the odds ratio for comparisons between cases and controls for each inheritance mode (families with apparently autosomal dominant inheritance vs. controls, and families with apparently autosomal recessive inheritance vs. controls). A number of case studies are also presented to illustrate the utility of this database. © 2016 The Authors. Human Mutation published by Wiley Periodicals, Inc.
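
    The odds ratio underlying the statistical approach described above is straightforward to compute from case/control allele counts; the counts below are invented for illustration, not drawn from the paper:

```python
def odds_ratio(case_alt, case_ref, control_alt, control_ref):
    """Odds ratio from a 2x2 table of alternate/reference allele counts."""
    return (case_alt / case_ref) / (control_alt / control_ref)

# Hypothetical counts: variant seen in 20 of 100 case alleles
# vs. 5 of 200 control alleles.
print(odds_ratio(20, 80, 5, 195))   # (20/80) / (5/195) ≈ 9.75
```

    An odds ratio well above 1 in the matching inheritance-mode comparison is the kind of evidence the tool uses to flag a variant as a pathogenicity candidate; in practice a confidence interval or Fisher's exact test would accompany the point estimate.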

  16. United States Army Medical Materiel Development Activity: 1997 Annual Report.

    DTIC Science & Technology

    1997-01-01

    Excerpts describe the Activity's business planning and execution information management systems: the Project Management Division Database (PMDD), the Product Management Database System (PMDS), and a Special Users Database System. The existing ...System (FMS) were investigated; new Product Managers and Project Managers were added into PMDS and PMDD, and a separate division, Support, was...

  17. Technology for Space Station Evolution: the Data Management System

    NASA Technical Reports Server (NTRS)

    Abbott, L.

    1990-01-01

    Viewgraphs on the data management system (DMS) for Space Station evolution are presented. Topics covered include the DMS architecture and implementation approach, and an overview of the runtime object database.

  18. Learning Asset Technology Integration Support Tool Design Document

    DTIC Science & Technology

    2010-05-11

    Excerpts note that LATIST is built with the Hypertext Preprocessor (PHP) language and MySQL, a relational database management system that can also be used for content management. The LATIST tool will be implemented on a WordPress platform with MySQL as the database, and the system must work effectively with MySQL. When designing the LATIST system, several considerations must be accounted for in the working prototype; these include: DAU...

  19. Schema Versioning for Multitemporal Relational Databases.

    ERIC Educational Resources Information Center

    De Castro, Cristina; Grandi, Fabio; Scalas, Maria Rita

    1997-01-01

    Investigates new design options for extended schema versioning support for multitemporal relational databases. Discusses the improved functionalities they may provide. Outlines options and basic motivations for the new design solutions, as well as techniques for the management of proposed schema versioning solutions, includes algorithms and…

  20. Scale-Independent Relational Query Processing

    DTIC Science & Technology

    2013-10-04

    Excerpts note that open-...source options are also available, including PostgreSQL, MySQL, and SQLite; these modern relational databases are generally very complex software systems. Cited references include ...and Their Application to Data Stream Management (IGI Global, 2010) and George Reese, Database Programming with JDBC and Java, Second Edition...

  1. The Education of Librarians for Data Administration.

    ERIC Educational Resources Information Center

    Koenig, Michael E. D.; Kochoff, Stephen T.

    1983-01-01

    Argues that the increasing importance of database management systems (DBMS) and recognition of the information dependency of business planning are creating new job opportunities for librarians/information technicians. Highlights include development and functions of DBMSs, data and database administration, potential for librarians, and implications…

  2. [Health management system in outpatient follow-up of kidney transplantation patients].

    PubMed

    Zhang, Hong; Xie, Jinliang; Yao, Hui; Liu, Ling; Tan, Jianwen; Geng, Chunmi

    2014-07-01

    The aim was to develop a health management system for outpatient follow-up of kidney transplant patients. Access 2010 database software was used to establish the health management system for kidney transplantation patients under the Windows XP operating system. Database management and post-operative follow-up of kidney transplantation patients were realized through six function modules: data input, data query, data printing, questionnaire survey, data export, and follow-up management. The system worked stably and reliably, and data input was easy and fast; querying, counting, and printing were convenient. A health management system for patients after kidney transplantation not only reduces the work pressure on follow-up staff but also improves the efficiency of outpatient follow-up.

  3. Organization's Orderly Interest Exploration: Inception, Development and Insights of AIAA's Topics Database

    NASA Technical Reports Server (NTRS)

    Marshall, Joseph R.; Morris, Allan T.

    2007-01-01

    Since 2003, AIAA's Computer Systems and Software Systems Technical Committees (TCs) have developed a database that helps technical committee management map technical topics to their members. This Topics/Interest (T/I) database grew out of a collection of charts and spreadsheets maintained by the TCs. Since its inception, the tool has evolved into a multi-dimensional database whose dimensions include the importance, interest, and expertise of TC members and whether or not a member and/or a TC is actively involved with a topic. In 2005, the database was expanded to include the TCs in AIAA's Information Systems Group, and then expanded further to include all AIAA TCs. It was field-tested at an AIAA Technical Activities Committee (TAC) Workshop in early 2006 through live access by over 80 users. Through the use of the topics database, TC and program committee (PC) members can accomplish relevant tasks such as identifying topic experts (for Aerospace America articles or external contacts), determining the interests of members, identifying overlapping topics between diverse TCs and PCs, guiding new member drives, and revealing emerging topics. This paper describes the origins, inception, initial development, field test, and current version of the tool, and elucidates the benefits and insights gained by using the database to aid the management of various TC functions. Suggestions are provided to guide future development of the database so that it can provide dynamic and system-level benefits to AIAA that currently do not exist in any technical organization.

  4. Globe Teachers Guide and Photographic Data on the Web

    NASA Technical Reports Server (NTRS)

    Kowal, Dan

    2004-01-01

    The task of managing the GLOBE Online Teacher's Guide during this time period focused on transforming the technology behind the delivery system of this document. The web application transformed from a flat-file retrieval system to a dynamic database access approach. The new methodology utilizes Java Server Pages (JSP) on the front end and an Oracle relational database on the back end. This new approach allows users of the web site, mainly teachers, to access content efficiently by grade level and/or by investigation or educational concept area. Moreover, teachers can gain easier access to data sheets and lab and field guides. The new online guide also included updated content for all GLOBE protocols. The GLOBE web management team was given documentation for maintaining the new application. Instructions for modifying the JSP templates and managing database content were included in this document. It was delivered to the team by the end of October 2003. The National Geophysical Data Center (NGDC) continued to manage the school study site photos on the GLOBE website. 333 study site photo images were added to the GLOBE database and posted on the web during this same time period for 64 schools. Documentation for processing study site photos was also delivered to the new GLOBE web management team. Lastly, assistance was provided in transferring reference applications such as the Cloud and LandSat quizzes and the Earth Systems Online Poster from NGDC servers to GLOBE servers, along with documentation for maintaining these applications.

  5. 7 CFR 274.3 - Retailer management.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... retailer, and it must include acceptable privacy and security features. Such systems shall only be... terminals that are capable of relaying electronic transactions to a central database computer for... specifications prior to implementation of the EBT system to enable third party processors to access the database...

  6. REFGEN and TREENAMER: Automated Sequence Data Handling for Phylogenetic Analysis in the Genomic Era

    PubMed Central

    Leonard, Guy; Stevens, Jamie R.; Richards, Thomas A.

    2009-01-01

    The phylogenetic analysis of nucleotide sequences and increasingly that of amino acid sequences is used to address a number of biological questions. Access to extensive datasets, including numerous genome projects, means that standard phylogenetic analyses can include many hundreds of sequences. Unfortunately, most phylogenetic analysis programs do not tolerate the sequence naming conventions of genome databases. Managing large numbers of sequences and standardizing sequence labels for use in phylogenetic analysis programs can be a time-consuming and laborious task. Here we report the availability of an online resource for the management of gene sequences recovered from public access genome databases such as GenBank. These web utilities include the facility for renaming every sequence in a FASTA alignment file, with each sequence label derived from a user-defined combination of the species name and/or database accession number. This facility enables the user to keep track of the branching order of the sequences/taxa during multiple tree calculations and re-optimisations. After phylogenetic analysis, these webpages can then be used to rename every label in the subsequent tree files (with a user-defined combination of species name and/or database accession number). Together these programs drastically reduce the time required for managing sequence alignments and labelling phylogenetic figures. Additional features of our platform include the automatic removal of identical accession numbers (recorded in the report file) and generation of species and accession number lists for use in supplementary materials or figure legends. PMID:19812722
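
    The header-renaming facility described above can be sketched in a few lines; the parsing assumes simple ">accession description" headers, which is an assumption for illustration, not REFGEN's actual logic.

```python
def rename_fasta(text, labels):
    """Replace each '>header' line with a standardized label keyed by the
    leading accession token; sequence lines pass through unchanged."""
    out = []
    for line in text.splitlines():
        if line.startswith(">"):
            accession = line[1:].split()[0]
            out.append(">" + labels.get(accession, accession))
        else:
            out.append(line)
    return "\n".join(out)

# Invented example: labels combine species name and accession number, the
# user-defined combination the abstract describes.
fasta = (">AB123456 Homo sapiens actin mRNA\nATGGCC\n"
         ">XY987654 Mus musculus actin\nATGGCA")
labels = {"AB123456": "Homo_sapiens_AB123456",
          "XY987654": "Mus_musculus_XY987654"}
print(rename_fasta(fasta, labels))
```

    Stable, short labels of this form survive phylogenetic analysis programs that truncate or reject raw database headers, and the same mapping can be applied in reverse to relabel the resulting tree files.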

  7. Difficulties and challenges associated with literature searches in operating room management, complete with recommendations.

    PubMed

    Wachtel, Ruth E; Dexter, Franklin

    2013-12-01

    The purpose of this article is to teach operating room managers, financial analysts, and those with a limited knowledge of search engines, including PubMed, how to locate articles they need in the areas of operating room and anesthesia group management. Many physicians are unaware of current literature in their field and evidence-based practices. The most common source of information is colleagues. Many people making management decisions do not read published scientific articles. Databases such as PubMed are available to search for such articles. Other databases, such as citation indices and Google Scholar, can be used to uncover additional articles. Nevertheless, most people who do not know how to use these databases are reluctant to utilize help resources when they do not know how to accomplish a task. Most people are especially reluctant to use on-line help files. Help files and search databases are often difficult to use because they have been designed for users already familiar with the field. The help files and databases have specialized vocabularies unique to the application. MeSH terms in PubMed are not useful alternatives for operating room management, an important limitation, because MeSH is the default when search terms are entered in PubMed. Librarians or those trained in informatics can be valuable assets for searching unusual databases, but they must possess the domain knowledge relative to the subject they are searching. The search methods we review are especially important when the subject area (e.g., anesthesia group management) is so specific that only 1 or 2 articles address the topic of interest. The materials are presented broadly enough that the reader can extrapolate the findings to other areas of clinical and management issues in anesthesiology.

  8. National briefing summaries: Nuclear fuel cycle and waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, K.J.; Bradley, D.J.; Fletcher, J.F.

    Since 1976, the International Program Support Office (IPSO) at the Pacific Northwest Laboratory (PNL) has collected and compiled publicly available information concerning foreign and international radioactive waste management programs. This National Briefing Summaries is a printout of an electronic database that has been compiled and is maintained by the IPSO staff. The database contains current information concerning the radioactive waste management programs (with supporting information on nuclear power and the nuclear fuel cycle) of most of the nations (except eastern European countries) that now have or are contemplating nuclear power, and of the multinational agencies that are active in radioactive waste management. Information in this document is included for three additional countries (China, Mexico, and USSR) compared to the prior issue. The database and this document were developed in response to needs of the US Department of Energy.

  9. An online database for IHN virus in Pacific Salmonid fish: MEAP-IHNV

    USGS Publications Warehouse

    Kurath, Gael

    2012-01-01

    The MEAP-IHNV database provides access to detailed data for anyone interested in IHNV molecular epidemiology, such as fish health professionals, fish culture facility managers, and academic researchers. The flexible search capabilities enable the user to generate various output formats, including tables and maps, which should assist users in developing and testing hypotheses about how IHNV moves across landscapes and changes over time. The MEAP-IHNV database is available online at http://gis.nacse.org/ihnv/ (fig. 1). The database contains records that provide background information and genetic sequencing data for more than 1,000 individual field isolates of the fish virus Infectious hematopoietic necrosis virus (IHNV), and is updated approximately annually. It focuses on IHNV isolates collected throughout western North America from 1966 to the present. The database also includes a small number of IHNV isolates from Eastern Russia. By engaging the expertise of the broader community of colleagues interested in IHNV, our goal is to enhance the overall understanding of IHNV epidemiology, including defining sources of disease outbreaks and viral emergence events, identifying virus traffic patterns and potential reservoirs, and understanding how human management of salmonid fish culture affects disease. Ultimately, this knowledge can be used to develop new strategies to reduce the effect of IHN disease in cultured and wild fish.

  10. Office Computer Software: A Comprehensive Review of Software Programs.

    ERIC Educational Resources Information Center

    Secretary, 1992

    1992-01-01

    Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)

  11. Science Inventory Products About Land and Waste Management Research

    EPA Pesticide Factsheets

    Resources from the Science Inventory database of EPA's Office of Research and Development, as well as EPA's Science Matters journal, include research on managing contaminated sites and ground water modeling and decontamination technologies.

  12. Land and Waste Management Research Publications in the Science Inventory

    EPA Pesticide Factsheets

    Resources from the Science Inventory database of EPA's Office of Research and Development, as well as EPA's Science Matters journal, include research on managing contaminated sites and ground water modeling and decontamination technologies.

  13. Searching for religion and mental health studies required health, social science, and grey literature databases.

    PubMed

    Wright, Judy M; Cottrell, David J; Mir, Ghazala

    2014-07-01

    To determine the optimal databases to search for studies of faith-sensitive interventions for treating depression, we examined 23 health, social science, religious, and grey literature databases that had been searched for an evidence synthesis. Databases were prioritized by yield of (1) search results, (2) potentially relevant references identified during screening, (3) included references contained in the synthesis, and (4) included references that were available in the database. We assessed the impact of databases beyond MEDLINE, EMBASE, and PsycINFO by their ability to supply studies identifying new themes and issues, and we identified pragmatic workload factors that influence database selection. PsycINFO was the best-performing database within all priority lists. ArabPsyNet, CINAHL, Dissertations and Theses, EMBASE, Global Health, Health Management Information Consortium, MEDLINE, PsycINFO, and Sociological Abstracts were essential for our searches to retrieve the included references. Citation tracking activities and the personal library of one of the research teams contributed a significant number of unique, relevant references. Religion studies databases (Am Theo Lib Assoc, FRANCIS) did not provide unique, relevant references. Literature searches for reviews and evidence syntheses of religion and health studies should include social science, grey literature, and non-Western databases, personal libraries, and citation tracking activities. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Working with Data: Discovering Knowledge through Mining and Analysis; Systematic Knowledge Management and Knowledge Discovery; Text Mining; Methodological Approach in Discovering User Search Patterns through Web Log Analysis; Knowledge Discovery in Databases Using Formal Concept Analysis; Knowledge Discovery with a Little Perspective.

    ERIC Educational Resources Information Center

    Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.

    2000-01-01

    These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)

  15. MouseNet database: digital management of a large-scale mutagenesis project.

    PubMed

    Pargent, W; Heffner, S; Schäble, K F; Soewarto, D; Fuchs, H; Hrabé de Angelis, M

    2000-07-01

    The Munich ENU Mouse Mutagenesis Screen is a large-scale mutant production, phenotyping, and mapping project. It encompasses two animal breeding facilities and a number of screening groups located in the general area of Munich. A central database is required to manage and process the immense amount of data generated by the mutagenesis project. This database, which we named MouseNet(c), runs on a Sybase platform and will ultimately store and process all data from the entire project. In addition, the system comprises a portfolio of functions needed to support the workflow management of the core facility and the screening groups. MouseNet(c) will make all of the data available to the participating screening groups, and later to the international scientific community. MouseNet(c) will consist of three major software components: the Animal Management System (AMS), the Sample Tracking System (STS), and the Result Documentation System (RDS). MouseNet(c) provides the following major advantages: it is accessible from different client platforms via the Internet; it is a full-featured multi-user system (including access restriction and data locking mechanisms); it relies on a professional RDBMS (relational database management system) running on a UNIX server platform; and it supplies workflow functions and a variety of plausibility checks.

  16. IceVal DatAssistant: An Interactive, Automated Icing Data Management System

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.; Wright, William B.

    2008-01-01

    As with any scientific endeavor, the foundation of icing research at the NASA Glenn Research Center (GRC) is the data acquired during experimental testing. In the case of the GRC Icing Branch, an important part of this data consists of ice tracings taken following tests carried out in the GRC Icing Research Tunnel (IRT), as well as the associated operational and environmental conditions documented during these tests. Over the years, the large number of experimental runs completed has served to emphasize the need for a consistent strategy for managing this data. To address the situation, the Icing Branch has recently elected to implement the IceVal DatAssistant automated data management system. With the release of this system, all publicly available IRT-generated experimental ice shapes with complete and verifiable conditions have now been compiled into one electronically-searchable database. Simulation software results for the equivalent conditions, generated using the latest version of the LEWICE ice shape prediction code, are likewise included and are linked to the corresponding experimental runs. In addition to this comprehensive database, the IceVal system also includes a graphically-oriented database access utility, which provides reliable and easy access to all data contained in the database. In this paper, the issues surrounding historical icing data management practices are discussed, as well as the anticipated benefits to be achieved as a result of migrating to the new system. A detailed description of the software system features and database content is also provided; and, finally, known issues and plans for future work are presented.

  18. DataHub knowledge based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Collins, Donald J.; Doyle, Richard J.; Jacobson, Allan S.

    1991-01-01

    Viewgraphs on DataHub knowledge based assistance for science visualization and analysis using large distributed databases. Topics covered include: DataHub functional architecture; data representation; logical access methods; preliminary software architecture; LinkWinds; data knowledge issues; expert systems; and data management.

  19. 23 CFR 970.210 - Federal lands bridge management system (BMS).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Section 970.210 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS... needs using, as a minimum, the following components: (1) A database and an ongoing program for the... BMS. The minimum BMS database shall include: (i) Data described by the inventory section of the...

  20. Information management systems for pharmacogenomics.

    PubMed

    Thallinger, Gerhard G; Trajanoski, Slave; Stocker, Gernot; Trajanoski, Zlatko

    2002-09-01

    The value of high-throughput genomic research is dramatically enhanced by association with key patient data. These data are generally available but of disparate quality and not typically directly associated. A system that could bring these disparate data sources into a common resource connected with functional genomic data would be tremendously advantageous. However, the integration of clinical data and the accurate interpretation of the generated functional genomic data require the development of information management systems capable of effectively capturing the data, as well as tools to make that data accessible to the laboratory scientist or the clinician. In this review these challenges and current information technology solutions associated with the management, storage, and analysis of high-throughput data are highlighted. It is suggested that the development of a pharmacogenomic data management system which integrates public and proprietary databases, clinical datasets, and data mining tools embedded in a high-performance computing environment should include the following components: parallel processing systems, storage technologies, network technologies, databases and database management systems (DBMS), and application services.

  1. Application of China's National Forest Continuous Inventory database.

    PubMed

    Xie, Xiaokui; Wang, Qingli; Dai, Limin; Su, Dongkai; Wang, Xinchuang; Qi, Guang; Ye, Yujing

    2011-12-01

    The maintenance of a timely, reliable and accurate spatial database on current forest ecosystem conditions and changes is essential to characterize and assess forest resources and support sustainable forest management. Information for such a database can be obtained only through a continuous forest inventory. The National Forest Continuous Inventory (NFCI) is the first level of China's three-tiered inventory system. The NFCI is administered by the State Forestry Administration; data are acquired by five inventory institutions around the country. Several important components of the database include land type, forest classification, and age class/age group. The NFCI database in China is constructed based on 5-year inventory periods, resulting in some of the data not being timely when reports are issued. To address this problem, a forest growth simulation model has been developed to update the database for years between the periodic inventories. In order to aid in forest plan design and management, a three-dimensional virtual reality system of forest landscapes for selected units in the database (compartment or sub-compartment) has also been developed based on Virtual Reality Modeling Language. In addition, a transparent internet publishing system for a spatial database based on open source WebGIS (UMN MapServer) has been designed and utilized to enhance public understanding and encourage free participation of interested parties in the development, implementation, and planning of sustainable forest management.

  2. The liver tissue bank and clinical database in China.

    PubMed

    Yang, Yuan; Liu, Yi-Min; Wei, Ming-Yue; Wu, Yi-Fei; Gao, Jun-Hui; Liu, Lei; Zhou, Wei-Ping; Wang, Hong-Yang; Wu, Meng-Chao

    2010-12-01

    To develop a standardized and well-rounded resource for hepatology research, the National Liver Tissue Bank (NLTB) Project began in 2008 in China to assemble well-characterized, optimally preserved liver tumor tissue and an associated clinical database. From Dec 2008 to Jun 2010, over 3000 individuals were enrolled as liver tumor donors to the NLTB, including 2317 cases of newly diagnosed hepatocellular carcinoma (HCC) and about 1000 cases of diagnosed benign or malignant liver tumors. The clinical database and sample store can be managed easily and reliably with the data management platform used. We believe that these high-quality samples with a detailed information database will become a cornerstone of hepatology research, especially in studies exploring the diagnosis and new treatments for HCC and other liver diseases.

  3. Land, Oil Spill, and Waste Management Research Publications in the Science Inventory

    EPA Pesticide Factsheets

    Resources from the Science Inventory database of EPA's Office of Research and Development, as well as EPA's Science Matters journal, include research on managing contaminated sites and ground water modeling and decontamination technologies.

  4. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  5. Using relational databases for improved sequence similarity searching and large-scale genomic analyses.

    PubMed

    Mackey, Aaron J; Pearson, William R

    2004-10-01

    Relational databases are designed to integrate diverse types of information and manage large sets of search results, greatly simplifying genome-scale analyses. They are essential for the management and analysis of large-scale sequence analyses, and can also be used to improve the statistical significance of similarity searches by focusing on subsets of sequence libraries most likely to contain homologs. This unit describes how to use relational databases to improve the efficiency of sequence similarity searching and demonstrates various large-scale genomic analyses of homology-related data. It covers the installation and use of a simple protein sequence database, seqdb_demo, which is used as the basis for the other protocols; these include basic use of the database to generate a novel sequence library subset, extending and using seqdb_demo for the storage of sequence similarity search results, and making use of various kinds of stored search results to address aspects of comparative genomic analysis.
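
    As a rough illustration of the approach this unit describes, the sketch below uses SQLite to store a tiny protein library, define a taxon-restricted subset as a view, and join stored similarity-search hits back to their annotation. The unit's actual seqdb_demo schema is richer; every table, column, and accession name here is an assumption made for the example.

```python
import sqlite3

# In-memory database with an illustrative (not seqdb_demo's) schema.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE protein (acc TEXT PRIMARY KEY, taxon TEXT, seq TEXT);
CREATE TABLE hit (query TEXT, subject TEXT, evalue REAL,
                  FOREIGN KEY (subject) REFERENCES protein(acc));
""")
db.executemany("INSERT INTO protein VALUES (?,?,?)", [
    ("P1", "human", "MKV..."), ("P2", "yeast", "MST..."),
    ("P3", "human", "MAL...")])

# A library subset (e.g. one taxon) expressed as a view; searching it
# instead of the full library sharpens the search statistics.
db.execute("CREATE VIEW human_lib AS SELECT * FROM protein WHERE taxon='human'")

# Store similarity-search results, then join significant hits back to
# their annotation with a plain SQL query.
db.executemany("INSERT INTO hit VALUES (?,?,?)",
               [("Q1", "P1", 1e-30), ("Q1", "P3", 0.5)])
rows = db.execute("""SELECT h.subject, p.taxon, h.evalue
                     FROM hit h JOIN protein p ON p.acc = h.subject
                     WHERE h.evalue < 1e-3 ORDER BY h.evalue""").fetchall()
print(rows)
```

    The same pattern scales to millions of sequences once indexes are added on the join and filter columns.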

  6. Data, Data Everywhere but Not a Byte to Read: Managing Monitoring Information.

    ERIC Educational Resources Information Center

    Stafford, Susan G.

    1993-01-01

    Describes the Forest Science Data Bank that contains 2,400 data sets from over 350 existing ecological studies. Database features described include involvement of the scientific community; database documentation; data quality assurance; security; data access and retrieval; and data import/export flexibility. Appendices present the Quantitative…

  7. SPIRES Tailored to a Special Library: A Mainframe Answer for a Small Online Catalog.

    ERIC Educational Resources Information Center

    Newton, Mary

    1989-01-01

    Describes the design and functions of a technical library database maintained on a mainframe computer and supported by the SPIRES database management system. The topics covered include record structures, vocabulary control, input procedures, searching features, time considerations, and cost effectiveness. (three references) (CLB)

  8. Intelligent community management system based on the devicenet fieldbus

    NASA Astrophysics Data System (ADS)

    Wang, Yulan; Wang, Jianxiong; Liu, Jiwen

    2013-03-01

    With the rapid development of the national economy and the improvement of people's living standards, people are making higher demands on their living environment, and higher requirements are being placed on estate management content, management efficiency, and service quality. This paper analyzes in depth the structure and composition of the intelligent community. According to users' requirements and related specifications, it implements a district management system that includes basic information management (housing information, household information, administrator accounts, passwords, etc.); service management (standard property costs, collection of property charges, arrears history, and other property expenses); security management (household gas, water, and electricity safety, and security of the district and other public places); and systems management (database backup, database restore, and log management). The article also analyzes the intelligent community system and proposes an architecture based on B/S (browser/server) technology, achieving networked device management with a friendly, easy-to-use, unified human-machine interface.

  9. Research information needs on terrestrial vertebrate species of the interior Columbia basin and northern portions of the Klamath and Great Basins: a research, development, and application database.

    Treesearch

    Bruce G. Marcot

    1997-01-01

    Research information needs on selected invertebrates and all vertebrates of the interior Columbia River basin and adjacent areas in the United States were collected into a research, development, and application database as part of the Interior Columbia Basin Ecosystem Management Project. The database includes 482 potential research study topics on 232 individual...

  10. Respiratory cancer database: An open access database of respiratory cancer gene and miRNA.

    PubMed

    Choubey, Jyotsna; Choudhari, Jyoti Kant; Patel, Ashish; Verma, Mukesh Kumar

    2017-01-01

    The Respiratory cancer database (RespCanDB) is a genomic and proteomic database of cancers of the respiratory organs. It also includes information on medicinal plants used for the treatment of various respiratory cancers, with the structures of their active constituents, as well as pharmacological and chemical information on drugs associated with various respiratory cancers. Data in RespCanDB have been manually collected from published research articles and from other databases, and integrated using MySQL, a relational database management system. MySQL manages all data in the back-end and provides commands to retrieve and store the data in the database. The web interface of the database has been built in ASP. RespCanDB is expected to contribute to the scientific community's understanding of respiratory cancer biology as well as to the development of new ways of diagnosing and treating respiratory cancer. Currently, the database contains the oncogenomic information of lung cancer, laryngeal cancer, and nasopharyngeal cancer. Data for other cancers, such as oral and tracheal cancers, will be added in the near future. The URL of RespCanDB is http://ridb.subdic-bioinformatics-nitrr.in/.

  11. Project management tool

    NASA Technical Reports Server (NTRS)

    Maluf, David A. (Inventor); Bell, David G. (Inventor); Gurram, Mohana M. (Inventor); Gawdiak, Yuri O. (Inventor)

    2009-01-01

    A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as a monthly report, a task plan report, a budget report and a risk management report, are generated and made available for display or further analysis. An extensible database allows searching for information based upon context and upon content.

  12. An online database of nuclear electromagnetic moments

    NASA Astrophysics Data System (ADS)

    Mertzimekis, T. J.; Stamou, K.; Psaltis, A.

    2016-01-01

    Measurements of nuclear magnetic dipole and electric quadrupole moments are considered quite important for the understanding of nuclear structure both near and far from the valley of stability. The recent advent of radioactive beams has resulted in a plethora of new, continuously flowing experimental data on nuclear structure - including nuclear moments - which complicates information management. A new, dedicated, public, and user-friendly online database (http://magneticmoments.info) has been created, comprising experimental data on nuclear electromagnetic moments. The present database supersedes existing printed compilations, includes non-evaluated series of data and relevant metadata, and puts strong emphasis on bimonthly updates. The scope, features, and extensions of the database are reported.

  13. Current Standardization and Cooperative Efforts Related to Industrial Information Infrastructures.

    DTIC Science & Technology

    1993-05-01

    Data Management Systems: components used to store, manage, and retrieve data. Data management includes knowledge bases, database management... Application Development Tools and Methods: X/Open and POSIX APIs; Integrated Design Support System (IDS); Knowledge-Based Systems (KBS); Application... IDEF1X; Yourdon; Jackson System Design (JSD); Knowledge-Based Systems (KBSs); Structured Systems Development (SSD); Semantic Unification Meta-Model

  14. A DBMS architecture for global change research

    NASA Astrophysics Data System (ADS)

    Hachem, Nabil I.; Gennert, Michael A.; Ward, Matthew O.

    1993-08-01

    The goal of this research is the design and development of an integrated system for the management of very large scientific databases, cartographic/geographic information processing, and exploratory scientific data analysis for global change research. The system will represent both spatial and temporal knowledge about natural and man-made entities on the earth's surface, following an object-oriented paradigm. A user will be able to derive, modify, and apply procedures to perform operations on the data, including comparison, derivation, prediction, validation, and visualization. This work represents an effort to extend database technology with an intrinsic class of operators that is extensible and responds to the growing needs of scientific research. Of significance is the integration of many diverse forms of data into the database, including cartography, geography, hydrography, hypsography, images, and urban planning data. Equally important is the maintenance of metadata, that is, data about the data, such as coordinate transformation parameters, map scales, and audit trails of previous processing operations. This project will impact the fields of geographical information systems and global change research as well as the database community. It will provide an integrated database management testbed for scientific research, and a testbed for the development of analysis tools to understand and predict global change.

  15. CampusGIS of the University of Cologne: a tool for orientation, navigation, and management

    NASA Astrophysics Data System (ADS)

    Baaser, U.; Gnyp, M. L.; Hennig, S.; Hoffmeister, D.; Köhn, N.; Laudien, R.; Bareth, G.

    2006-10-01

    The working group for GIS and Remote Sensing at the Department of Geography at the University of Cologne has established a WebGIS called the CampusGIS of the University of Cologne. The overall task of the CampusGIS is the connection of several existing databases at the University of Cologne with spatial data. These existing databases comprise data about staff, buildings, rooms, lectures, and general infrastructure such as bus stops. This information was not yet linked to its spatial location, so a GIS-based method was developed to link all the different databases to spatial entities. Following the philosophy of the CampusGIS, an online GUI was programmed that enables users to search for staff, buildings, or institutions. The query results are linked to the GIS database, which allows visualization of the spatial location of the searched entity. The system was established in 2005 and has been operational since early 2006. This contribution focuses on further developments, presenting first results of (i) including routing services in the CampusGIS, (ii) programming GUIs for mobile devices, and (iii) including infrastructure management tools. Consequently, the CampusGIS is not only available for spatial information retrieval and orientation; it also serves for on-campus navigation and administrative management.

  16. G-Hash: Towards Fast Kernel-based Similarity Search in Large Graph Databases.

    PubMed

    Wang, Xiaohong; Smalter, Aaron; Huan, Jun; Lushington, Gerald H

    2009-01-01

    Structured data, including sets, sequences, trees, and graphs, pose significant challenges to fundamental aspects of data management such as efficient storage, indexing, and similarity search. With the fast accumulation of graph databases, similarity search in graph databases has emerged as an important research topic. Graph similarity search has applications in a wide range of domains including cheminformatics, bioinformatics, sensor network management, social network management, and XML documents, among others. Most current graph indexing methods focus on subgraph query processing, i.e., determining the set of database graphs that contains the query graph, and hence do not directly support similarity search. In data mining and machine learning, various graph kernel functions have been designed to capture the intrinsic similarity of graphs. Though successful in constructing accurate predictive and classification models for supervised learning, graph kernel functions (i) have high computational complexity and (ii) are non-trivial to index in a graph database. Our objective is to bridge graph kernel functions and similarity search in graph databases by proposing (i) a novel kernel-based similarity measurement and (ii) an efficient indexing structure for graph data management. Our similarity measurement builds upon local features extracted from each node and its neighboring nodes in a graph. A hash table is utilized to support efficient storage and fast search of the extracted local features. Using the hash table, a graph kernel function is defined to capture the intrinsic similarity of graphs and to support fast similarity query processing. We have implemented our method, which we have named G-hash, and have demonstrated its utility on large chemical graph databases. Our results show that the G-hash method achieves state-of-the-art performance for k-nearest neighbor (k-NN) classification. Most importantly, the new similarity measurement and the index structure are scalable to large databases, with smaller indexing size, faster index construction time, and faster query processing time than state-of-the-art indexing methods such as C-tree, gIndex, and GraphGrep.
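
    The core idea (features local to each node and its neighbors, counted in a hash table, with the kernel computed as a dot product of feature counts) can be sketched as follows. This is a deliberately simplified illustration of the technique, not the published G-hash implementation.

```python
from collections import Counter

def local_features(nodes, edges):
    """nodes: {node_id: label}; edges: iterable of (u, v) pairs.
    Each node's feature combines its own label with the sorted labels
    of its neighbors -- a simplified stand-in for G-hash's local features.
    Counter acts as the hash table of feature counts."""
    nbrs = {n: [] for n in nodes}
    for u, v in edges:
        nbrs[u].append(nodes[v])
        nbrs[v].append(nodes[u])
    return Counter((nodes[n], tuple(sorted(nbrs[n]))) for n in nodes)

def kernel(g1, g2):
    """Graph similarity as the dot product of hashed feature counts."""
    f1, f2 = local_features(*g1), local_features(*g2)
    return sum(c * f2[feat] for feat, c in f1.items())

# Two toy molecular graphs (labels are atom types): C-O-C and C-C-O.
g_a = ({1: "C", 2: "O", 3: "C"}, [(1, 2), (2, 3)])
g_b = ({1: "C", 2: "C", 3: "O"}, [(1, 2), (2, 3)])
print(kernel(g_a, g_a), kernel(g_a, g_b))
```

    Because features are hashed independently of any particular query, a database of graphs can be indexed once and then compared against a query graph with simple count lookups.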

  17. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Each frame transform is updated by its owner: FM updates site and saved frames for the surface tree, and as the rover drives to a new area, a new site frame with an incremented site index can be created. Several clients, including ARM and RSM (Remote Sensing Mast), update the rover frames that they own. Through the onboard centralized FM frame tree database, client modules can query transforms between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of the frame tree eliminates cumbersome, error-prone calculations of coordinate entries for commands and thus simplifies flight operations significantly.
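
    The kind of any-two-frames query such a frame tree supports can be illustrated with a toy example. This sketch composes translations only (the flight software composes full 6-DOF poses), and all frame names and offsets below are hypothetical.

```python
# Toy frame tree: frame -> (parent, offset from parent), translations only.
FRAMES = {
    "site_3": (None,     (0.0, 0.0, 0.0)),   # root of this subtree
    "rover":  ("site_3", (4.0, 2.0, 0.0)),
    "rsm":    ("rover",  (0.5, 0.0, 1.9)),   # mast head
    "arm":    ("rover",  (1.0, -0.3, 0.4)),  # arm base
}

def to_root(frame):
    """Accumulate a frame's offset all the way up to the tree root."""
    x = y = z = 0.0
    while frame is not None:
        frame, (dx, dy, dz) = FRAMES[frame][0], FRAMES[frame][1]
        x, y, z = x + dx, y + dy, z + dz
    return (x, y, z)

def transform(src, dst):
    """Offset of dst relative to src, composed via their shared root."""
    (sx, sy, sz), (dx, dy, dz) = to_root(src), to_root(dst)
    return (dx - sx, dy - sy, dz - sz)

t = transform("rsm", "arm")
print(t)
```

    A client module needs only the two frame names; the tree walk replaces the hand-computed coordinate entries the abstract describes as error-prone.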

  18. UniGene Tabulator: a full parser for the UniGene format.

    PubMed

    Lenzi, Luca; Frabetti, Flavia; Facchin, Federica; Casadei, Raffaella; Vitale, Lorenza; Canaider, Silvia; Carinci, Paolo; Zannotti, Maria; Strippoli, Pierluigi

    2006-10-15

    UniGene Tabulator 1.0 provides a solution for full parsing of the UniGene flat file format; it implements a structured graphical representation of each data field present in UniGene following import into a common database management system usable on a personal computer. This database includes related tables for sequence, protein similarity, sequence-tagged site (STS), and transcript map interval (TXMAP) data, plus a summary table in which each record represents a UniGene cluster. UniGene Tabulator enables full local management of UniGene data, allowing parsing, querying, indexing, retrieval, export, and analysis of UniGene data in relational database form, usable on Macintosh (OS X 10.3.9 or later) and Windows (2000 with Service Pack 4, or XP with Service Pack 2 or later) operating systems. The current release, including both FileMaker runtime applications, is freely available at http://apollo11.isto.unibo.it/software/

  19. Customized laboratory information management system for a clinical and research leukemia cytogenetics laboratory.

    PubMed

    Bakshi, Sonal R; Shukla, Shilin N; Shah, Pankaj M

    2009-01-01

    We developed a Microsoft Access-based laboratory management system to facilitate database management for leukemia patients referred for cytogenetic tests, namely karyotyping and fluorescence in situ hybridization (FISH). The database is custom-made for entry of patient data, clinical details, sample details, and cytogenetic test results, and for data mining in various ongoing research areas. A number of clinical research laboratory-related tasks are carried out faster using specific "queries." These tasks include tracking the clinical progression of a particular patient across multiple visits, treatment response, morphological and cytogenetic response, survival time, automatic grouping of patients by inclusion criteria for a research project, tracking various sample-processing steps, turnaround time, and revenue generated. Since 2005 we have collected over 5,000 samples. The database is easily updated and is being adapted for various data maintenance and mining needs.
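
    The kind of "query" the abstract describes, such as computing per-sample turnaround time, can be sketched in SQL. The table and column names below are invented for illustration (shown via Python's sqlite3 rather than the actual Access database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE samples (
    sample_id INTEGER PRIMARY KEY,
    patient_id INTEGER,
    received DATE,      -- date the sample arrived in the lab
    reported DATE       -- date the cytogenetic result was reported
);
INSERT INTO samples VALUES
    (1, 101, '2009-01-05', '2009-01-12'),
    (2, 101, '2009-03-02', '2009-03-06'),
    (3, 102, '2009-01-07', '2009-01-21');
""")

# Turnaround time in days for each sample.
tat = conn.execute("""
    SELECT sample_id, julianday(reported) - julianday(received) AS days
    FROM samples ORDER BY sample_id
""").fetchall()
print(tat)  # [(1, 7.0), (2, 4.0), (3, 14.0)]
```

    A similar join against a visits table would give the multiple-visit clinical progression tracking mentioned above.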

  20. A blue carbon soil database: Tidal wetland stocks for the US National Greenhouse Gas Inventory

    NASA Astrophysics Data System (ADS)

    Feagin, R. A.; Eriksson, M.; Hinson, A.; Najjar, R. G.; Kroeger, K. D.; Herrmann, M.; Holmquist, J. R.; Windham-Myers, L.; MacDonald, G. M.; Brown, L. N.; Bianchi, T. S.

    2015-12-01

    Coastal wetlands contain large reservoirs of carbon, and in 2015 the US National Greenhouse Gas Inventory began the work of placing blue carbon within the national regulatory context. The potential value of a wetland carbon stock, in relation to its location, could soon be influential in determining governmental policy and management activities, or in stimulating market-based CO2 sequestration projects. To meet the national need for high-resolution maps, a blue carbon stock database was developed linking National Wetlands Inventory datasets with the USDA Soil Survey Geographic Database. Users of the database can identify the economic potential for carbon conservation or restoration projects within specific estuarine basins, states, wetland types, physical parameters, and land management activities. The database is geared towards both national-level assessments and local-level inquiries. Spatial analysis of the stocks shows high variance within individual estuarine basins, largely dependent on geomorphic position on the landscape, though there are continental-scale trends in the carbon distribution as well. Future plans include linking this database with a sedimentary accretion database to predict carbon flux in US tidal wetlands.

  1. National Databases for Neurosurgical Outcomes Research: Options, Strengths, and Limitations.

    PubMed

    Karhade, Aditya V; Larsen, Alexandra M G; Cote, David J; Dubois, Heloise M; Smith, Timothy R

    2017-08-05

    Quality improvement, value-based care delivery, and personalized patient care depend on robust clinical, financial, and demographic data streams of neurosurgical outcomes. The neurosurgical literature lacks a comprehensive review of large national databases. To assess the strengths and limitations of various resources for outcomes research in neurosurgery. A review of the literature was conducted to identify surgical outcomes studies using national data sets. The databases were assessed for the availability of patient demographics and clinical variables, longitudinal follow-up of patients, strengths, and limitations. The number of unique patients contained within each data set ranged from thousands (Quality Outcomes Database [QOD]) to hundreds of millions (MarketScan). Databases with both clinical and financial data included PearlDiver, Premier Healthcare Database, Vizient Clinical Data Base and Resource Manager, and the National Inpatient Sample. Outcomes collected by databases included patient-reported outcomes (QOD); 30-day morbidity, readmissions, and reoperations (National Surgical Quality Improvement Program); and disease incidence and disease-specific survival (Surveillance, Epidemiology, and End Results-Medicare). The strengths of large databases included large numbers of rare pathologies and multi-institutional nationally representative sampling; the limitations of these databases included variable data veracity, variable data completeness, and missing disease-specific variables. The improvement of existing large national databases and the establishment of new registries will be crucial to the future of neurosurgical outcomes research. Copyright © 2017 by the Congress of Neurological Surgeons

  2. Applications of GIS and database technologies to manage a Karst Feature Database

    USGS Publications Warehouse

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications, in both GIS and a Database Management System (DBMS), have been developed for the KFD of Minnesota. These applications were used to manage and enhance the usability of the KFD. Structured Query Language (SQL) was used to manage database transactions and to support the functionality of the user interfaces. The database administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology used in designing this DBMS is applicable to developing GIS-based databases for analyzing and managing geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst, and the long-term goal is to expand this database to manage and study karst features at national and global scales.
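
    The transaction-based consistency the abstract mentions can be sketched with SQL (shown via Python's sqlite3; the karst table and columns here are invented for illustration). An update that violates an integrity constraint is rolled back, leaving the database in its previous consistent state:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE karst_features (
    id INTEGER PRIMARY KEY,
    feature_type TEXT NOT NULL,   -- e.g. sinkhole, spring, cave
    county TEXT,
    depth_m REAL CHECK (depth_m >= 0)
)""")
conn.execute("INSERT INTO karst_features VALUES (1, 'sinkhole', 'Fillmore', 12.5)")
conn.commit()

# A transaction that violates the CHECK constraint is rolled back
# by the connection's context manager, preserving consistency.
try:
    with conn:
        conn.execute("UPDATE karst_features SET depth_m = -1 WHERE id = 1")
except sqlite3.IntegrityError:
    pass

depth = conn.execute("SELECT depth_m FROM karst_features WHERE id = 1").fetchone()[0]
print(depth)  # 12.5
```

    Real DBMSs layer the data logs and scheduled backups described above on top of this same transactional guarantee.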

  3. Coordination and standardization of federal sedimentation activities

    USGS Publications Warehouse

    Glysson, G. Douglas; Gray, John R.

    1997-01-01

    - precipitation information critical to water resources management. Memorandum M-92-01 covers primarily freshwater bodies and includes activities, such as "development and distribution of consensus standards, field-data collection and laboratory analytical methods, data processing and interpretation, data-base management, quality control and quality assurance, and water- resources appraisals, assessments, and investigations." Research activities are not included.

  4. Microcomputer Software for Libraries: A Survey.

    ERIC Educational Resources Information Center

    Nolan, Jeanne M.

    1983-01-01

    Reports on findings of research done by Nolan Information Management Services concerning availability of microcomputer software for libraries. Highlights include software categories (specific, generic-database management programs, original); number of programs available in 1982 for 12 applications; projections for 1983; and future software…

  5. Information technologies in public health management: a database on biocides to improve quality of life.

    PubMed

    Roman, C; Scripcariu, L; Diaconescu, Rm; Grigoriu, A

    2012-01-01

    Biocides for prolonging the shelf life of a large variety of materials have been used extensively over the last decades. Worldwide biocide consumption was estimated at about 12.4 billion dollars in 2011 and is expected to increase in 2012. As biocides are substances we come into contact with in our everyday lives, access to this type of information is of paramount importance to ensure an appropriate living environment. Consequently, a database in which information may be quickly processed, sorted, and easily accessed according to different search criteria is the most desirable solution. The main aim of this work was to design and implement a relational database with complete information about biocides used in public health management to improve the quality of life. The database was designed and implemented as a relational database using the software "phpMyAdmin". The result is a database that allows for the efficient collection, storage, and management of information, including the chemical properties and applications of a large number of biocides, as well as its adequate dissemination into the public health environment. The information contained in the database promotes the adequate use of biocides by means of information technologies, which in consequence may help achieve important improvements in our quality of life.
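
    A relational layout supporting the search criteria described above might look like the following sketch. The tables and columns are hypothetical (the abstract does not describe the actual MySQL/phpMyAdmin schema), shown via Python's sqlite3:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE biocides (
    id INTEGER PRIMARY KEY,
    name TEXT,
    chemical_class TEXT
);
CREATE TABLE applications (
    biocide_id INTEGER REFERENCES biocides(id),
    field TEXT              -- e.g. wood preservation, disinfectant
);
INSERT INTO biocides VALUES (1, 'benzalkonium chloride', 'quaternary ammonium');
INSERT INTO biocides VALUES (2, 'copper naphthenate', 'organometallic');
INSERT INTO applications VALUES (1, 'disinfectant'), (2, 'wood preservation');
""")

# One search criterion: all biocides used as disinfectants.
rows = conn.execute("""
    SELECT b.name FROM biocides b
    JOIN applications a ON a.biocide_id = b.id
    WHERE a.field = 'disinfectant'
""").fetchall()
print(rows)  # [('benzalkonium chloride',)]
```

    Splitting applications into their own table is what lets one biocide carry several uses while still being found by any single criterion.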

  6. Challenges and Experiences of Building Multidisciplinary Datasets across Cultures

    NASA Astrophysics Data System (ADS)

    Jamiyansharav, K.; Laituri, M.; Fernandez-Gimenez, M.; Fassnacht, S. R.; Venable, N. B. H.; Allegretti, A. M.; Reid, R.; Baival, B.; Jamsranjav, C.; Ulambayar, T.; Linn, S.; Angerer, J.

    2017-12-01

    Efficient data sharing and management are key challenges in multidisciplinary scientific research, and they are further complicated by a multicultural component. We address the construction of a complex database for social-ecological analysis in Mongolia. Funded by the National Science Foundation (NSF) Dynamics of Coupled Natural and Human (CNH) Systems program, the Mongolian Rangelands and Resilience (MOR2) project focuses on the vulnerability of Mongolian pastoral systems to climate change and their adaptive capacity. The MOR2 study spans three years of fieldwork in 36 paired districts (Soum) from 18 provinces (Aimag) of Mongolia, covering the steppe, mountain forest steppe, desert steppe, and eastern steppe ecological zones. Our project team is composed of hydrologists, social scientists, geographers, and ecologists. The MOR2 database includes multiple ecological, social, meteorological, geospatial, and hydrological datasets, as well as archives of original data and surveys in multiple formats. Managing this complex database requires significant organizational skill, attention to detail, and the ability to communicate with team members from diverse disciplines across multiple institutions in the US and Mongolia. We describe the database's rich content, organization, structure, and complexity. We discuss lessons learned, best practices, and recommendations for complex database management, sharing, and archiving in creating a cross-cultural, multidisciplinary database.

  7. DataSpread: Unifying Databases and Spreadsheets.

    PubMed

    Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin ChenChuan; Parameswaran, Aditya

    2015-08-01

    Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in the ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current "pane" (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first, early prototype of DataSpread, and will give attendees a sense of the enormous data exploration capabilities offered by unifying spreadsheets and databases.
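
    The reconciliation of spreadsheet cell addressing with relational tuples that the abstract alludes to can be sketched as follows. This is a hypothetical illustration, not DataSpread's actual representation: each cell becomes a (row, column, value) tuple, and an A1-style address is translated into a tuple key.

```python
import re

def a1_to_rowcol(addr):
    """Translate an A1-style address ('B3') into zero-based (row, col)."""
    m = re.fullmatch(r"([A-Z]+)(\d+)", addr)
    letters, digits = m.group(1), m.group(2)
    col = 0
    for ch in letters:                      # column letters are base-26
        col = col * 26 + (ord(ch) - ord("A") + 1)
    return int(digits) - 1, col - 1

# Cells stored as relational tuples keyed by (row, col), the way a
# back-end database table of cells might hold them.
cells = {a1_to_rowcol(a): v for a, v in [("A1", "species"), ("B1", "count"),
                                         ("A2", "alpha"), ("B2", 42)]}
print(a1_to_rowcol("B2"), cells[a1_to_rowcol("B2")])  # (1, 1) 42
```

    Storing cells positionally like this is what lets arbitrary SQL over the back-end table surface results at the right place in the spreadsheet front-end.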

  8. DataSpread: Unifying Databases and Spreadsheets

    PubMed Central

    Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin ChenChuan; Parameswaran, Aditya

    2015-01-01

    Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in the ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current “pane” (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first, early prototype of DataSpread, and will give attendees a sense of the enormous data exploration capabilities offered by unifying spreadsheets and databases. PMID:26900487

  9. 23 CFR 971.210 - Federal lands bridge management system (BMS).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Section 971.210 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS... components, as a minimum, as a basic framework for a BMS: (1) A database and an ongoing program for the... BMS. The minimum BMS database shall include: (i) The inventory data required by the NBIS (23 CFR 650...

  10. 23 CFR 971.210 - Federal lands bridge management system (BMS).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Section 971.210 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS... components, as a minimum, as a basic framework for a BMS: (1) A database and an ongoing program for the... BMS. The minimum BMS database shall include: (i) The inventory data required by the NBIS (23 CFR 650...

  11. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    ERIC Educational Resources Information Center

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  12. Managing Data, Provenance and Chaos through Standardization and Automation at the Georgia Coastal Ecosystems LTER Site

    NASA Astrophysics Data System (ADS)

    Sheldon, W.

    2013-12-01

    Managing data for a large, multidisciplinary research program such as a Long Term Ecological Research (LTER) site is a significant challenge, but it also presents unique opportunities for data stewardship. LTER research is conducted within multiple organizational frameworks (i.e. a specific LTER site as well as the broader LTER network) and addresses both specific goals defined in an NSF proposal and broader goals of the network; therefore, every LTER data set can be linked to rich contextual information to guide interpretation and comparison. The challenge is how to link the data to this wealth of contextual metadata. At the Georgia Coastal Ecosystems LTER we developed an integrated information management system (GCE-IMS) to manage, archive, and distribute data, metadata, and other research products, as well as to manage project logistics, administration, and governance (figure 1). This system allows us to store all project information in one place and provide dynamic links through web applications and services to ensure content is always up to date on the web as well as in data set metadata. The database model supports tracking changes over time in personnel roles, projects, and governance decisions, allowing these databases to serve as canonical sources of project history. Storing project information in a central database has also allowed us to standardize both the formatting and content of critical project information, including personnel names, roles, keywords, place names, attribute names, units, and instrumentation, providing consistency and improving data and metadata comparability. Lookup services for these standard terms also simplify data entry in web and database interfaces. We have also coupled the GCE-IMS to our MATLAB- and Python-based data processing tools (i.e. through database connections) to automate metadata generation and packaging of tabular and GIS data products for distribution. 
Data processing history is automatically tracked throughout the data lifecycle, from initial import through quality control, revision and integration by our data processing system (GCE Data Toolbox for MATLAB), and included in metadata for versioned data products. This high level of automation and system integration has proven very effective in managing the chaos and scalability of our information management program.
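
    Term standardization via lookup services, as described above, can be sketched as follows. The vocabulary entries here are hypothetical, not GCE-IMS's actual term lists: raw column headers are mapped through a controlled vocabulary so generated metadata always carries canonical attribute names and units, and unrecognized terms are flagged for review.

```python
# Hypothetical controlled vocabulary: raw header -> (canonical name, unit)
VOCAB = {
    "temp": ("Water_Temperature", "celsius"),
    "sal": ("Salinity", "PSU"),
    "do": ("Dissolved_Oxygen", "mg/L"),
}

def standardize(headers):
    """Resolve raw headers to canonical (name, unit) pairs, collecting
    anything missing from the vocabulary for curator review."""
    resolved, unknown = [], []
    for h in headers:
        key = h.strip().lower()
        if key in VOCAB:
            resolved.append(VOCAB[key])
        else:
            unknown.append(h)
    return resolved, unknown

resolved, unknown = standardize(["Temp", "sal", "turbidity"])
print(resolved)  # [('Water_Temperature', 'celsius'), ('Salinity', 'PSU')]
print(unknown)   # ['turbidity']
```

    Keeping the vocabulary in the central database rather than in code is what makes the same terms appear consistently in web forms, data files, and exported metadata.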

  13. Integration of Web-based and PC-based clinical research databases.

    PubMed

    Brandt, C A; Sun, K; Charpentier, P; Nadkarni, P M

    2004-01-01

    We have created a Web-based repository or data library of information about measurement instruments used in studies of multi-factorial geriatric health conditions (the Geriatrics Research Instrument Library - GRIL) based upon existing features of two separate clinical study data management systems. GRIL allows browsing, searching, and selecting measurement instruments based upon criteria such as keywords and areas of applicability. Measurement instruments selected can be printed and/or included in an automatically generated standalone microcomputer database application, which can be downloaded by investigators for use in data collection and data management. Integration of database applications requires the creation of a common semantic model, and mapping from each system to this model. Various database schema conflicts at the table and attribute level must be identified and resolved prior to integration. Using a conflict taxonomy and a mapping schema facilitates this process. Critical conflicts at the table level that required resolution included name and relationship differences. A major benefit of integration efforts is the sharing of features and cross-fertilization of applications created for similar purposes in different operating environments. Integration of applications mandates some degree of metadata model unification.
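
    The table- and attribute-level mapping the authors describe can be sketched as follows, with hypothetical names rather than the real GRIL mapping schema: each source system's attribute names are mapped onto a common semantic model, and name conflicts are resolved through an explicit mapping table.

```python
# Hypothetical mappings from each source system onto the common model.
MAPPINGS = {
    "system_a": {"Instrument": "measurement_instrument",
                 "Keyword": "keyword"},
    "system_b": {"Tool": "measurement_instrument",   # name conflict: Tool vs Instrument
                 "Subject": "keyword"},
}

def to_common(system, record):
    """Rename one source record's attributes into the common model,
    dropping attributes the mapping does not cover."""
    mapping = MAPPINGS[system]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

a = to_common("system_a", {"Instrument": "MMSE", "Keyword": "cognition"})
b = to_common("system_b", {"Tool": "MMSE", "Subject": "cognition"})
print(a == b)  # True: the naming conflict disappears in the common model
```

    A conflict taxonomy, as the abstract notes, is essentially a catalogue of which entries in such mapping tables required human decisions.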

  14. A compilation of spatial digital databases for selected U.S. Geological Survey nonfuel mineral resource assessments for parts of Idaho and Montana

    USGS Publications Warehouse

    Carlson, Mary H.; Zientek, Michael L.; Causey, J. Douglas; Kayser, Helen Z.; Spanski, Gregory T.; Wilson, Anna B.; Van Gosen, Bradley S.; Trautwein, Charles M.

    2007-01-01

    This report compiles selected results from 13 U.S. Geological Survey (USGS) mineral resource assessment studies conducted in Idaho and Montana into consistent spatial databases that can be used in a geographic information system. The 183 spatial databases represent areas of mineral potential delineated in these studies and include attributes on mineral deposit type, level of mineral potential, certainty, and a reference. The assessments were conducted for five 1° x 2° quadrangles (Butte, Challis, Choteau, Dillon, and Wallace), several U.S. Forest Service (USFS) National Forests (including Challis, Custer, Gallatin, Helena, and Payette), and one Bureau of Land Management (BLM) Resource Area (Dillon). The data contained in the spatial databases are based on published information: no new interpretations are made. This digital compilation is part of an ongoing effort to provide mineral resource information formatted for use in spatial analysis. In particular, this is one of several reports prepared to address USFS needs for science information as forest management plans are revised in the Northern Rocky Mountains.

  15. CycADS: an annotation database system to ease the development and update of BioCyc databases

    PubMed Central

    Vellozo, Augusto F.; Véron, Amélie S.; Baa-Puyoulet, Patrice; Huerta-Cepas, Jaime; Cottret, Ludovic; Febvay, Gérard; Calevro, Federica; Rahbé, Yvan; Douglas, Angela E.; Gabaldón, Toni; Sagot, Marie-France; Charles, Hubert; Colella, Stefano

    2011-01-01

    In recent years, genomes from an increasing number of organisms have been sequenced, but their annotation remains a time-consuming process. The BioCyc databases offer a framework for the integrated analysis of metabolic networks. The Pathway Tools software suite allows the automated construction of a database starting from an annotated genome, but it requires prior integration of all annotations into a specific summary file or into a GenBank file. To allow the easy creation and updating of a BioCyc database starting from the multiple genome annotation resources available over time, we have developed an ad hoc data management system that we call the Cyc Annotation Database System (CycADS). CycADS is centred on a specific database model and on a set of Java programs to import, filter and export relevant information. Data from GenBank and other annotation sources (including, for example, KAAS, PRIAM, Blast2GO and PhylomeDB) are collected into a database, then filtered and extracted to generate a complete annotation file. This file is then used to build an enriched BioCyc database using the PathoLogic program of Pathway Tools. The CycADS pipeline for annotation management was used to build the AcypiCyc database for the pea aphid (Acyrthosiphon pisum), whose genome was recently sequenced. The AcypiCyc database webpage also includes, for comparative analyses, two other metabolic reconstruction BioCyc databases generated using CycADS: TricaCyc for Tribolium castaneum and DromeCyc for Drosophila melanogaster. Thanks to its flexible design, CycADS offers a powerful software tool for the generation and regular updating of enriched BioCyc databases. The CycADS system is particularly suited to metabolic gene annotation and network reconstruction in newly sequenced genomes. Because of the uniform annotation used for metabolic network reconstruction, CycADS is particularly useful for comparative analysis of the metabolism of different organisms. 
Database URL: http://www.cycadsys.org PMID:21474551
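
    The collect-filter-export flow described above can be sketched as follows. This is a hypothetical simplification in Python (CycADS itself is a set of Java programs, and the source names below stand in for its real inputs): annotations from several resources are pooled per gene, then one annotation per gene is exported, preferring trusted sources.

```python
def collect(sources):
    """Pool gene annotations from several resources into one store."""
    store = {}
    for source_name, annotations in sources.items():
        for gene, function in annotations.items():
            store.setdefault(gene, []).append((source_name, function))
    return store

def export(store, preferred=("GenBank", "KAAS")):
    """Keep one annotation per gene, preferring trusted sources first."""
    out = {}
    for gene, entries in store.items():
        entries.sort(key=lambda e: preferred.index(e[0])
                     if e[0] in preferred else len(preferred))
        out[gene] = entries[0][1]
    return out

sources = {
    "GenBank": {"g1": "kinase"},
    "KAAS": {"g1": "protein kinase", "g2": "transporter"},
}
annotations = export(collect(sources))
print(annotations)  # {'g1': 'kinase', 'g2': 'transporter'}
```

    Keeping all collected annotations in the store, rather than only the exported winners, is what makes regular re-export with new filter rules cheap, which is the "regular updating" the abstract emphasizes.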

  16. The Next Step in Educational Program Budgets and Information Resource Management: Integrated Data Structures.

    ERIC Educational Resources Information Center

    Jackowski, Edward M.

    1988-01-01

    Discusses the role that information resource management (IRM) plays in educational program-oriented budgeting (POB), and presents a theoretical IRM model. Highlights include design considerations for integrated data systems; database management systems (DBMS); and how POB data can be integrated to enhance its value and use within an educational…

  17. Information Resources Management. Nordic Conference on Information and Documentation (6th, Helsinki, Finland, August 19-22, 1985).

    ERIC Educational Resources Information Center

    Samfundet for Informationstjanst i Finland, Helsinki.

    The 54 conference papers compiled in this proceedings include plenary addresses; reviews of Nordic databases; and discussions of documents, systems, services, and products as they relate to information resources management (IRM). Almost half of the presentations are in English: (1) "What Is Information Resources Management?" (Forest…

  18. Integrating GIS, Archeology, and the Internet.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sera White; Brenda Ringe Pace; Randy Lee

    2004-08-01

    At the Idaho National Engineering and Environmental Laboratory's (INEEL) Cultural Resource Management Office, a newly developed Data Management Tool (DMT) is improving management and long-term stewardship of cultural resources. The fully integrated system links an archaeological database, a historical database, and a research database to spatial data through a customized user interface using ArcIMS and Active Server Pages. Components of the new DMT are tailored specifically to the INEEL and include automated data entry forms for historic and prehistoric archaeological sites, specialized queries and reports that address both yearly and project-specific documentation requirements, and unique field recording forms. The predictive modeling component increases the DMT's value for land use planning and long-term stewardship. The DMT enhances the efficiency of archive searches, improving customer service, oversight, and management of the large INEEL cultural resource inventory. In the future, the DMT will facilitate data sharing with regulatory agencies, tribal organizations, and the general public.

  19. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  20. Efficacy of Noninvasive Stellate Ganglion Blockade Performed Using Physical Agent Modalities in Patients with Sympathetic Hyperactivity-Associated Disorders: A Systematic Review and Meta-Analysis.

    PubMed

    Liao, Chun-De; Tsauo, Jau-Yih; Liou, Tsan-Hon; Chen, Hung-Chou; Rau, Chi-Lun

    2016-01-01

    Stellate ganglion blockade (SGB) is mainly used to relieve symptoms of neuropathic pain in conditions such as complex regional pain syndrome and has several potential complications. Noninvasive SGB performed using physical agent modalities (PAMs), such as light irradiation and electrical stimulation, can be clinically used as an alternative to conventional invasive SGB. However, its application protocols vary and its clinical efficacy remains controversial. This study investigated the use of noninvasive SGB for managing neuropathic pain or other disorders associated with sympathetic hyperactivity. We performed a comprehensive search of the following online databases: Medline, PubMed, Excerpta Medica Database, Cochrane Library Database, Ovid MEDLINE, Europe PubMed Central, EBSCOhost Research Databases, CINAHL, ProQuest Research Library, Physiotherapy Evidence Database, WorldWideScience, BIOSIS, and Google Scholar. We identified and included quasi-randomized or randomized controlled trials reporting the efficacy of SGB performed using therapeutic ultrasound, transcutaneous electrical nerve stimulation, light irradiation using low-level laser therapy, or xenon light or linearly polarized near-infrared light irradiation near or over the stellate ganglion region in treating complex regional pain syndrome or disorders requiring sympatholytic management. The included articles were subjected to a meta-analysis and risk of bias assessment. Nine randomized and four quasi-randomized controlled trials were included. Eleven trials had good methodological quality with a Physiotherapy Evidence Database (PEDro) score of ≥6, whereas the remaining two trials had a PEDro score of <6. The meta-analysis results revealed that the efficacy of noninvasive SGB on 100-mm visual analog pain score is higher than that of a placebo or active control (weighted mean difference, -21.59 mm; 95% CI, -34.25, -8.94; p = 0.0008). 
Noninvasive SGB performed using PAMs effectively relieves pain of various etiologies, making it a valuable addition to the contemporary pain management armamentarium. However, this evidence is limited by the potential risk of bias.
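
    Pooled estimates like the weighted mean difference reported above are conventionally obtained by inverse-variance weighting. A minimal fixed-effect sketch, using made-up trial values rather than the study's actual data:

```python
import math

def pooled_wmd(trials):
    """Fixed-effect inverse-variance pooling of mean differences.
    Each trial is (mean_difference, standard_error)."""
    weights = [1.0 / se**2 for _, se in trials]
    wmd = sum(w * md for (md, _), w in zip(trials, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return wmd, (wmd - 1.96 * se, wmd + 1.96 * se)

# Hypothetical trials: mean VAS difference (mm) and its standard error.
trials = [(-25.0, 8.0), (-18.0, 6.0), (-20.0, 10.0)]
wmd, ci = pooled_wmd(trials)
print(round(wmd, 2))  # -20.42
```

    Meta-analyses such as this one typically use a random-effects variant when between-trial heterogeneity is high; the weighting principle is the same.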

  1. Microvax-based data management and reduction system for the regional planetary image facilities

    NASA Technical Reports Server (NTRS)

    Arvidson, R.; Guinness, E.; Slavney, S.; Weiss, B.

    1987-01-01

    Presented is a progress report for the Regional Planetary Image Facilities (RPIF) prototype image data management and reduction system being jointly implemented by Washington University and the USGS, Flagstaff. The system will consist of a MicroVAX with a high capacity (approx 300 megabyte) disk drive, a compact disk player, an image display buffer, a videodisk player, USGS image processing software, and SYSTEM 1032 - a commercial relational database management package. The USGS, Flagstaff, will transfer their image processing software including radiometric and geometric calibration routines, to the MicroVAX environment. Washington University will have primary responsibility for developing the database management aspects of the system and for integrating the various aspects into a working system.

  2. 76 FR 59170 - Hartford Financial Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-23

    ... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT; Notice of Negative... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, Connecticut (The Hartford, Corporate/EIT/CTO Database Management Division). The negative determination was issued on August 19, 2011...

  3. An Apple for the Librarian: The OUC Experience.

    ERIC Educational Resources Information Center

    Planton, Stanley; Phillips, Susan

    1986-01-01

    Describes computerization of routine library procedures on Apple microcomputers at a small regional campus of Ohio University. Highlights include use of a database management program--PFS:FILE--for acquisition lists, equipment/supplies inventory, microfilm and periodicals management, and statistical manipulations, and a spreadsheet…

  4. A reservoir morphology database for the conterminous United States

    USGS Publications Warehouse

    Rodgers, Kirk D.

    2017-09-13

    The U.S. Geological Survey, in cooperation with the Reservoir Fisheries Habitat Partnership, combined multiple national databases to create one comprehensive national reservoir database and to calculate new morphological metrics for 3,828 reservoirs. These new metrics include, but are not limited to, shoreline development index, index of basin permanence, development of volume, and other descriptive metrics based on established morphometric formulas. The new database also contains modeled chemical and physical metrics. Because of the nature of the existing databases used to compile the Reservoir Morphology Database and the inherent missing data, some metrics were not populated. One comprehensive database will assist water-resource managers in their understanding of local reservoir morphology and water chemistry characteristics throughout the continental United States.

  5. The USA-NPN Information Management System: A tool in support of phenological assessments

    NASA Astrophysics Data System (ADS)

    Rosemartin, A.; Vazquez, R.; Wilson, B. E.; Denny, E. G.

    2009-12-01

    The USA National Phenology Network (USA-NPN) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and all aspects of environmental change. Data management and information sharing are central to the USA-NPN mission. The USA-NPN develops, implements, and maintains a comprehensive Information Management System (IMS) to serve the needs of the network, including the collection, storage and dissemination of phenology data, access to phenology-related information, tools for data interpretation, and communication among partners of the USA-NPN. The IMS includes components for data storage, such as the National Phenology Database (NPD), and several online user interfaces to accommodate data entry, data download, data visualization and catalog searches for phenology-related information. The IMS is governed by a set of standards to ensure security, privacy, data access, and data quality. The National Phenology Database is designed to efficiently accommodate large quantities of phenology data, to be flexible to the changing needs of the network, and to provide for quality control. The database stores phenology data from multiple sources (e.g., partner organizations, researchers and citizen observers), and provides for integration with legacy datasets. Several services will be created to provide access to the data, including reports, visualization interfaces, and web services. These services will provide integrated access to phenology and related information for scientists, decision-makers and general audiences. Phenological assessments at any scale will rely on secure and flexible information management systems for the organization and analysis of phenology data. The USA-NPN’s IMS can serve phenology assessments directly, through data management and indirectly as a model for large-scale integrated data management.

  6. An online database for informing ecological network models: http://kelpforest.ucsc.edu.

    PubMed

    Beas-Luna, Rodrigo; Novak, Mark; Carr, Mark H; Tinker, Martin T; Black, August; Caselle, Jennifer E; Hoban, Michael; Malone, Dan; Iles, Alison

    2014-01-01

    Ecological network models and analyses are recognized as valuable tools for understanding the dynamics and resiliency of ecosystems, and for informing ecosystem-based approaches to management. However, few databases exist that can provide the life history, demographic and species interaction information necessary to parameterize ecological network models. Faced with the difficulty of synthesizing the information required to construct models for kelp forest ecosystems along the West Coast of North America, we developed an online database (http://kelpforest.ucsc.edu/) to facilitate the collation and dissemination of such information. Many of the database's attributes are novel yet the structure is applicable and adaptable to other ecosystem modeling efforts. Information for each taxonomic unit includes stage-specific life history, demography, and body-size allometries. Species interactions include trophic, competitive, facilitative, and parasitic forms. Each data entry is temporally and spatially explicit. The online data entry interface allows researchers anywhere to contribute and access information. Quality control is facilitated by attributing each entry to unique contributor identities and source citations. The database has proven useful as an archive of species and ecosystem-specific information in the development of several ecological network models, for informing management actions, and for education purposes (e.g., undergraduate and graduate training). To facilitate adaptation of the database by other researchers for other ecosystems, the code and technical details on how to customize this database and apply it to other ecosystems are freely available and located at the following link (https://github.com/kelpforest-cameo/databaseui).
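
    The record structure this abstract describes (taxon-level entries, typed species interactions, spatio-temporal explicitness, and per-entry contributor and citation attribution) can be sketched as a small relational schema. The table and column names below are invented for illustration; the actual kelpforest.ucsc.edu schema is documented at the GitHub link given in the abstract.

```python
import sqlite3

# Toy schema: an interaction entry is typed, dated, georeferenced, and
# attributed to a contributor and a source citation for quality control.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE contributor (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE interaction (
    id INTEGER PRIMARY KEY,
    subject_taxon TEXT, object_taxon TEXT,
    kind TEXT CHECK (kind IN ('trophic','competitive','facilitative','parasitic')),
    observed_on TEXT, latitude REAL, longitude REAL,
    contributor_id INTEGER REFERENCES contributor(id),
    citation TEXT
);
""")
con.execute("INSERT INTO contributor VALUES (1, 'example contributor')")
con.execute(
    "INSERT INTO interaction VALUES (1, 'Enhydra lutris', "
    "'Strongylocentrotus purpuratus', 'trophic', '1999-07-01', "
    "36.6, -121.9, 1, 'example citation')"
)
row = con.execute(
    "SELECT subject_taxon, kind, name FROM interaction"
    " JOIN contributor ON contributor_id = contributor.id"
).fetchone()
print(row)
```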

  7. An Online Database for Informing Ecological Network Models: http://kelpforest.ucsc.edu

    PubMed Central

    Beas-Luna, Rodrigo; Novak, Mark; Carr, Mark H.; Tinker, Martin T.; Black, August; Caselle, Jennifer E.; Hoban, Michael; Malone, Dan; Iles, Alison

    2014-01-01

    Ecological network models and analyses are recognized as valuable tools for understanding the dynamics and resiliency of ecosystems, and for informing ecosystem-based approaches to management. However, few databases exist that can provide the life history, demographic and species interaction information necessary to parameterize ecological network models. Faced with the difficulty of synthesizing the information required to construct models for kelp forest ecosystems along the West Coast of North America, we developed an online database (http://kelpforest.ucsc.edu/) to facilitate the collation and dissemination of such information. Many of the database's attributes are novel yet the structure is applicable and adaptable to other ecosystem modeling efforts. Information for each taxonomic unit includes stage-specific life history, demography, and body-size allometries. Species interactions include trophic, competitive, facilitative, and parasitic forms. Each data entry is temporally and spatially explicit. The online data entry interface allows researchers anywhere to contribute and access information. Quality control is facilitated by attributing each entry to unique contributor identities and source citations. The database has proven useful as an archive of species and ecosystem-specific information in the development of several ecological network models, for informing management actions, and for education purposes (e.g., undergraduate and graduate training). To facilitate adaptation of the database by other researchers for other ecosystems, the code and technical details on how to customize this database and apply it to other ecosystems are freely available and located at the following link (https://github.com/kelpforest-cameo/databaseui). PMID:25343723

  8. An online database for informing ecological network models: http://kelpforest.ucsc.edu

    USGS Publications Warehouse

    Beas-Luna, Rodrigo; Tinker, M. Tim; Novak, Mark; Carr, Mark H.; Black, August; Caselle, Jennifer E.; Hoban, Michael; Malone, Dan; Iles, Alison C.

    2014-01-01

    Ecological network models and analyses are recognized as valuable tools for understanding the dynamics and resiliency of ecosystems, and for informing ecosystem-based approaches to management. However, few databases exist that can provide the life history, demographic and species interaction information necessary to parameterize ecological network models. Faced with the difficulty of synthesizing the information required to construct models for kelp forest ecosystems along the West Coast of North America, we developed an online database (http://kelpforest.ucsc.edu/) to facilitate the collation and dissemination of such information. Many of the database's attributes are novel yet the structure is applicable and adaptable to other ecosystem modeling efforts. Information for each taxonomic unit includes stage-specific life history, demography, and body-size allometries. Species interactions include trophic, competitive, facilitative, and parasitic forms. Each data entry is temporally and spatially explicit. The online data entry interface allows researchers anywhere to contribute and access information. Quality control is facilitated by attributing each entry to unique contributor identities and source citations. The database has proven useful as an archive of species and ecosystem-specific information in the development of several ecological network models, for informing management actions, and for education purposes (e.g., undergraduate and graduate training). To facilitate adaptation of the database by other researchers for other ecosystems, the code and technical details on how to customize this database and apply it to other ecosystems are freely available and located at the following link (https://github.com/kelpforest-cameo/databaseui).

  9. TWRS technical baseline database manager definition document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acree, C.D.

    1997-08-13

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

  10. A Web-Based GIS for Reporting Water Usage in the High Plains Underground Water Conservation District

    NASA Astrophysics Data System (ADS)

    Jia, M.; Deeds, N.; Winckler, M.

    2012-12-01

    The High Plains Underground Water Conservation District (HPWD) is the largest and oldest of the Texas water conservation districts, and oversees approximately 1.7 million irrigated acres. Recent rule changes have motivated HPWD to develop a more automated system to allow owners and operators to report well locations, meter locations, meter readings, the association between meters and wells, and contiguous acres. INTERA, Inc. has developed a web-based interactive system for HPWD water users to report water usage and for the district to better manage its water resources. The HPWD web management system utilizes state-of-the-art GIS techniques, including a cloud-based Amazon EC2 virtual machine, ArcGIS Server, ArcSDE, and ArcGIS Viewer for Flex, to support web-based water use management. The system enables users to navigate to their area of interest using a well-established base-map and perform a variety of operations and inquiries against their spatial features. The application currently has six components: user privilege management, property management, water meter registration, area registration, meter-well association, and water use reporting. The system is composed of two main databases: a spatial database and a non-spatial database. With an Adobe Flex application at the front end and ArcGIS Server as the middleware, updates to spatial feature geometry and attributes are reflected immediately in the back end. As a result, property owners and HPWD staff collaborate to build the spatial database. Interactions between the spatial and non-spatial databases are established by Windows Communication Foundation (WCF) services to record water-use reports, user-property associations, owner-area associations, and meter-well associations. Mobile capabilities will be enabled in the near future for field workers to collect data and synchronize them to the spatial database.
The entire solution is built on a highly scalable cloud server that dynamically allocates computational resources to reduce the cost of security and hardware maintenance. In addition to the default capabilities provided by ESRI, customizations include 1) enabling interactions between spatial and non-spatial databases, 2) providing role-based feature editing, 3) dynamically filtering spatial features on the map based on user accounts, and 4) comprehensive data validation.
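
    One of the listed customizations, filtering spatial features by user account and role, can be illustrated with a toy filter. This is a generic sketch with invented names and roles, not INTERA's implementation.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    feature_id: int
    kind: str       # e.g. 'well' or 'meter'
    owner: str      # account that registered the feature

# Invented example features.
FEATURES = [
    Feature(1, "well", "alice"),
    Feature(2, "meter", "alice"),
    Feature(3, "well", "bob"),
]

def visible_features(user, role):
    """District staff see every feature; owners see only their own."""
    if role == "district_staff":
        return FEATURES
    return [f for f in FEATURES if f.owner == user]

print([f.feature_id for f in visible_features("alice", "owner")])
print([f.feature_id for f in visible_features("hpwd", "district_staff")])
```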

  11. Information integration for a sky survey by data warehousing

    NASA Astrophysics Data System (ADS)

    Luo, A.; Zhang, Y.; Zhao, Y.

    The virtualization service of the data system for the sky survey LAMOST is very important for astronomers. The service needs to integrate information from data collections, catalogs, and references, and to support simple federation of a set of distributed files and associated metadata. Data warehousing has been in existence for several years and has demonstrated superiority over traditional relational database management systems by providing novel indexing schemes that support efficient on-line analytical processing (OLAP) of large databases. Relational database systems such as Oracle now support the warehouse capability, which includes extensions to the SQL language to support OLAP operations, and a number of metadata management tools have been created. The goal of the information integration of LAMOST by applying data warehousing is to effectively provide data and knowledge on-line.
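
    The warehousing idea the abstract sketches, loading observations into a central store and answering analytical queries over it, can be illustrated with a toy aggregation. This uses Python's built-in SQLite rather than Oracle, and invented table and column names; vendor OLAP extensions such as ROLLUP and CUBE are not shown.

```python
import sqlite3

# Toy "fact table" of survey measurements, summarized with an
# OLAP-style roll-up query (mean magnitude per band).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE observation (target TEXT, band TEXT, magnitude REAL)")
con.executemany("INSERT INTO observation VALUES (?, ?, ?)", [
    ("star-a", "g", 14.2), ("star-a", "r", 13.9),
    ("star-b", "g", 15.1), ("star-b", "r", 14.8),
])
for band, avg in con.execute(
        "SELECT band, AVG(magnitude) FROM observation GROUP BY band ORDER BY band"):
    print(band, round(avg, 2))
```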

  12. Enhancements to the Redmine Database Metrics Plug in

    DTIC Science & Technology

    2017-08-01

    management web application has been adopted within the US Army Research Laboratory’s Computational and Information Sciences Directorate as a database...Metrics Plug-in by Terry C Jameson Computational and Information Sciences Directorate, ARL Approved for public...

  13. Guide on Data Models in the Selection and Use of Database Management Systems. Final Report.

    ERIC Educational Resources Information Center

    Gallagher, Leonard J.; Draper, Jesse M.

    A tutorial introduction to data models in general is provided, with particular emphasis on the relational and network models defined by the two proposed ANSI (American National Standards Institute) database language standards. Examples based on the network and relational models include specific syntax and semantics, while examples from the other…
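
    The two data models the guide contrasts can be illustrated with a toy department/employee example (invented here, not taken from the guide): the relational model relates records by matching key values, while the network model links an owner record directly to its member records.

```python
# Relational view: rows are plain tuples; the link between an employee
# and a department is the matching dept_id value, resolved at query time.
departments = [(1, "Research")]
employees = [(10, "Ada", 1), (11, "Grace", 1)]
research_staff = [name for (_, name, dept_id) in employees if dept_id == 1]

# Network view: the owner record holds direct references to its members.
class Dept:
    def __init__(self, name):
        self.name = name
        self.members = []

research = Dept("Research")
research.members = ["Ada", "Grace"]

print(research_staff)
print(research.members)
```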

  14. Knowledge Management, User Education and Librarianship.

    ERIC Educational Resources Information Center

    Koenig, Michael E. D.

    2003-01-01

    Discusses the role of librarians in knowledge management in terms of designing information systems, creating classification systems and taxonomies, and implementing and operating the systems. Suggests the need for librarians to be involved in user education and training, including database searching, using current awareness services, and using…

  15. The Corporate Library and Issues Management.

    ERIC Educational Resources Information Center

    Lancaster, F. W.; Loescher, Jane

    1994-01-01

    Discussion of corporate library services and the role of the librarian focuses on the recognition and tracking of issues of potential significance to the corporation, or issues management. Topics addressed include environmental scanning of relevant literature, and the use of databases to track issues. (16 references) (LRW)

  16. [Quality management and participation into clinical database].

    PubMed

    Okubo, Suguru; Miyata, Hiroaki; Tomotaki, Ai; Motomura, Noboru; Murakami, Arata; Ono, Minoru; Iwanaka, Tadashi

    2013-07-01

    Quality management is necessary for establishing a useful clinical database in cooperation with healthcare professionals and facilities. The main management activities are 1) progress management of data entry, 2) liaison with database participants (healthcare professionals), and 3) modification of the data collection form. In addition, healthcare facilities joining clinical databases are expected to consider ethical issues and information security. Database participants should consult ethical review boards and consultation services for patients.

  17. FishTraits Database

    USGS Publications Warehouse

    Angermeier, Paul L.; Frimpong, Emmanuel A.

    2009-01-01

    The need for integrated and widely accessible sources of species traits data to facilitate studies of ecology, conservation, and management has motivated development of traits databases for various taxa. In spite of the increasing number of traits-based analyses of freshwater fishes in the United States, no consolidated database of traits of this group exists publicly, and much useful information on these species is documented only in obscure sources. The largely inaccessible and unconsolidated traits information makes large-scale analysis involving many fishes and/or traits particularly challenging. FishTraits is a database of >100 traits for 809 (731 native and 78 exotic) fish species found in freshwaters of the conterminous United States, including 37 native families and 145 native genera. The database contains information on four major categories of traits: (1) trophic ecology, (2) body size and reproductive ecology (life history), (3) habitat associations, and (4) salinity and temperature tolerances. Information on geographic distribution and conservation status is also included. Together, we refer to the traits, distribution, and conservation status information as attributes. Descriptions of attributes are available here. Many sources were consulted to compile attributes, including state and regional species accounts and other databases.

  18. Do-It-Yourself: A Special Library's Approach to Creating Dynamic Web Pages Using Commercial Off-The-Shelf Applications

    NASA Technical Reports Server (NTRS)

    Steeman, Gerald; Connell, Christopher

    2000-01-01

    Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and committing resources, from managing time to populate the database to training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals and lessons learned in the Web-to-database process (including setting up Data Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Structured Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.
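
    The core pattern the paper describes, generating page content from a database at request time rather than storing static HTML, can be sketched in a few lines. This uses Python and SQLite as a stand-in for the paper's ASP/Access stack; the table and data are invented.

```python
import sqlite3

# Toy resource table standing in for the library's Access database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE resource (title TEXT, url TEXT)")
con.executemany("INSERT INTO resource VALUES (?, ?)", [
    ("Example report", "https://example.org/report"),
    ("Example dataset", "https://example.org/data"),
])

def render_page():
    """Build the page body from the current database contents."""
    rows = con.execute("SELECT title, url FROM resource ORDER BY title").fetchall()
    items = "\n".join(f'<li><a href="{url}">{title}</a></li>' for title, url in rows)
    return f"<ul>\n{items}\n</ul>"

print(render_page())
```

    Because the HTML is rebuilt on every call, adding a row to the table changes the page with no manual editing, which is the advantage the paper claims over static pages.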

  19. Issues in NASA Program and Project Management. Special Report: 1997 Conference. Project Management Now and in the New Millennium

    NASA Technical Reports Server (NTRS)

    Hoffman, Edward J. (Editor); Lawbaugh, William M. (Editor)

    1997-01-01

    Topics Considered Include: NASA's Shared Experiences Program; Core Issues for the Future of the Agency; National Space Policy Strategic Management; ISO 9000 and NASA; New Acquisition Initiatives; Full Cost Initiative; PM Career Development; PM Project Database; NASA Fast Track Studies; Fast Track Projects; Earned Value Concept; Value-Added Metrics; Saturn Corporation Lessons Learned; Project Manager Credibility.

  20. THE CELL CENTERED DATABASE PROJECT: AN UPDATE ON BUILDING COMMUNITY RESOURCES FOR MANAGING AND SHARING 3D IMAGING DATA

    PubMed Central

    Martone, Maryann E.; Tran, Joshua; Wong, Willy W.; Sargis, Joy; Fong, Lisa; Larson, Stephen; Lamont, Stephan P.; Gupta, Amarnath; Ellisman, Mark H.

    2008-01-01

    Databases have become integral parts of data management, dissemination and mining in biology. At the Second Annual Conference on Electron Tomography, held in Amsterdam in 2001, we proposed that electron tomography data should be shared in a manner analogous to structural data at the protein and sequence scales. At that time, we outlined our progress in creating a database to bring together cell level imaging data across scales, The Cell Centered Database (CCDB). The CCDB was formally launched in 2002 as an on-line repository of high-resolution 3D light and electron microscopic reconstructions of cells and subcellular structures. It contains 2D, 3D and 4D structural and protein distribution information from confocal, multiphoton and electron microscopy, including correlated light and electron microscopy. Many of the data sets are derived from electron tomography of cells and tissues. In the five years since its debut, we have moved the CCDB from a prototype to a stable resource and expanded the scope of the project to include data management and knowledge engineering. Here we provide an update on the CCDB and how it is used by the scientific community. We also describe our work in developing additional knowledge tools, e.g., ontologies, for annotation and query of electron microscopic data. PMID:18054501

  1. National Administrative Databases in Adult Spinal Deformity Surgery: A Cautionary Tale.

    PubMed

    Buckland, Aaron J; Poorman, Gregory; Freitag, Robert; Jalai, Cyrus; Klineberg, Eric O; Kelly, Michael; Passias, Peter G

    2017-08-15

    Comparison between national administrative databases and a prospective multicenter physician-managed database. This study aims to assess the applicability of national administrative databases (NADs) in adult spinal deformity (ASD). Our hypothesis is that NADs do not include patients comparable to those in a physician-managed database (PMD) for surgical outcomes in adult spinal deformity. NADs such as the National Inpatient Sample (NIS) and the National Surgical Quality Improvement Program (NSQIP) generate large numbers of publications owing to ease of data access and the lack of an IRB approval requirement. These databases utilize billing codes, not clinical inclusion criteria, and have not been validated against PMDs in ASD surgery. The NIS was searched for years 2002 to 2012 and NSQIP for years 2006 to 2013 using validated spinal deformity diagnostic codes. Procedural codes (ICD-9 and CPT) were then applied to each database. A multicenter PMD covering years 2008 to 2015 was used for comparison. Databases were assessed for levels fused, osteotomies, decompressed levels, and invasiveness. Database comparisons for surgical details were made for all patients, and also for patients with ≥5-level spinal fusions. In total, 37,368 NIS, 1291 NSQIP, and 737 PMD patients were identified. NADs showed an increased use of deformity billing codes over the study period (NIS doubled, NSQIP increased 68-fold; P < 0.001), but ASD case volume remained stable in the PMD. Surgical invasiveness, levels fused, and use of 3-column osteotomy (3-CO) were significantly lower for all patients in the NIS (11.4-13.7) and NSQIP databases (6.4-12.7) compared with the PMD (27.5-32.3). When limited to patients with ≥5 levels, invasiveness, levels fused, and use of 3-CO remained significantly higher in the PMD compared with NADs (P < 0.001). The national databases NIS and NSQIP do not capture the same patient population as PMDs in ASD. Physicians should remain cautious in interpreting conclusions drawn from these databases. Level of Evidence: 4.

  2. USGS cold-water coral geographic database-Gulf of Mexico and western North Atlantic Ocean, version 1.0

    USGS Publications Warehouse

    Scanlon, Kathryn M.; Waller, Rhian G.; Sirotek, Alexander R.; Knisel, Julia M.; O'Malley, John; Alesandrini, Stian

    2010-01-01

    The USGS Cold-Water Coral Geographic Database (CoWCoG) provides a tool for researchers and managers interested in studying, protecting, and/or utilizing cold-water coral habitats in the Gulf of Mexico and western North Atlantic Ocean. The database makes information about the locations and taxonomy of cold-water corals available to the public in an easy-to-access form while preserving the scientific integrity of the data. The database includes over 1700 entries, mostly from published scientific literature, museum collections, and other databases. The CoWCoG database is easy to search in a variety of ways, and data can be quickly displayed in table form and on a map by using only the software included with this publication. Subsets of the database can be selected on the basis of geographic location, taxonomy, or other criteria and exported to one of several available file formats. Future versions of the database are being planned to cover a larger geographic area and additional taxa.

  3. Acoustic telemetry observation systems: challenges encountered and overcome in the Laurentian Great Lakes

    USGS Publications Warehouse

    Krueger, Charles C.; Holbrook, Christopher; Binder, Thomas R.; Vandergoot, Christopher; Hayden, Todd A.; Hondorp, Darryl W.; Nate, Nancy; Paige, Kelli; Riley, Stephen; Fisk, Aaron T.; Cooke, Steven J.

    2017-01-01

    The Great Lakes Acoustic Telemetry Observation System (GLATOS), organized in 2012, aims to advance and improve conservation and management of Great Lakes fishes by providing information on behavior, habitat use, and population dynamics. GLATOS faced challenges during establishment, including a funding agency-imposed urgency to initiate projects, a lack of telemetry expertise, and managing a flood of data. GLATOS now connects 190+ investigators, provides project consultation, maintains a web-based data portal, contributes data to Ocean Tracking Network’s global database, loans equipment, and promotes science transfer to managers. The GLATOS database currently has 50+ projects, 39 species tagged, 8000+ fish released, and 150+ million tag detections. Lessons learned include (1) seek advice from others experienced in telemetry; (2) organize networks before shared data are urgently needed; (3) establish a data management system so that all receivers can contribute to every project; (4) hold annual meetings to foster relationships; (5) involve fish managers to ensure relevancy; and (6) commit full-time staff to lead and coordinate projects and to analyze data and publish results.

  4. The Just-in-Time Imperative.

    ERIC Educational Resources Information Center

    Weintraub, Robert S.; Martineau, Jennifer W.

    2002-01-01

    Increasingly in demand, just-in-time learning is associated with informal, learner-driven knowledge acquisition. Technologies being used include databases, intranets, portals, and content management systems. (JOW)

  5. NNDC Stand: Activities and Services of the National Nuclear Data Center

    NASA Astrophysics Data System (ADS)

    Pritychenko, B.; Arcilla, R.; Burrows, T. W.; Dunford, C. L.; Herman, M. W.; McLane, V.; Obložinský, P.; Sonzogni, A. A.; Tuli, J. K.; Winchell, D. F.

    2005-05-01

    The National Nuclear Data Center (NNDC) collects, evaluates, and disseminates nuclear physics data for basic nuclear research and for applied nuclear technologies, including energy, shielding, medicine, and homeland security. In 2004, to meet the needs of the nuclear data user community, the NNDC completed a project to modernize the storage and management of its databases and began offering new nuclear data Web services. The principles of database and Web application development, as well as the related nuclear reaction and structure database services, are briefly described.

  6. Estimating the Net Economic Value of National Forest Recreation: An Application of the National Visitor Use Monitoring Database

    Treesearch

    J.M. Bowker; C.M. Starbuck; D.B.K. English; J.C. Bergstrom; R.S. Rosenburger; D.C. McCollum

    2009-01-01

    The USDA Forest Service (FS) manages 193 million acres of public land in the United States. These public lands contain vast natural resources, including timber, wildlife, watersheds, air sheds, and ecosystems. The Forest Service was established in 1905, and the FS has been directed by Congress to manage the National Forests and Grasslands for the...

  7. From Workstation to Teacher Support System: A Tool to Increase Productivity.

    ERIC Educational Resources Information Center

    Chen, J. Wey

    1989-01-01

    Describes a teacher support system which is a computer-based workstation that provides support for teachers and administrators by integrating teacher utility programs, instructional management software, administrative packages, and office automation tools. Hardware is described and software components are explained, including database managers,…

  8. WATERSHED HEALTH ASSESSMENT TOOLS-INVESTIGATING FISHERIES (WHAT-IF): A MODELING TOOLKIT FOR WATERSHED AND FISHERIES MANAGEMENT

    EPA Science Inventory

    The Watershed Health Assessment Tools-Investigating Fisheries (WHAT-IF) is a decision-analysis modeling toolkit for personal computers that supports watershed and fisheries management. The WHAT-IF toolkit includes a relational database, help-system functions and documentation, a...

  9. Short Fiction on Film: A Relational DataBase.

    ERIC Educational Resources Information Center

    May, Charles

    Short Fiction on Film is a database created to run on DataRelator, a relational database manager developed by Bill Finzer for the California State Department of Education in 1986. DataRelator was designed for use in teaching students database management skills and to provide teachers with examples of how a database manager might be…

  10. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...

  11. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...

  12. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...

  13. Implementing the EuroFIR Document and Data Repositories as accessible resources of food composition information.

    PubMed

    Unwin, Ian; Jansen-van der Vliet, Martine; Westenbrink, Susanne; Presser, Karl; Infanger, Esther; Porubska, Janka; Roe, Mark; Finglas, Paul

    2016-02-15

    The EuroFIR Document and Data Repositories are being developed as accessible collections of source documents, including grey literature, and the food composition data reported in them. These Repositories will contain source information available to food composition database compilers when selecting their nutritional data. The Document Repository was implemented as searchable bibliographic records in the Europe PubMed Central database, which links to the documents online. The Data Repository will contain original data from source documents in the Document Repository. Testing confirmed the FoodCASE food database management system as a suitable tool for the input, documentation and quality assessment of Data Repository information. Data management requirements for the input and documentation of reported analytical results were established, including record identification and method documentation specifications. Document access and data preparation using the Repositories will provide information resources for compilers, eliminating duplicated work and supporting unambiguous referencing of data contributing to their compiled data. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. 40 CFR 262.90 - Project XL for Public Utilities in New York State.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... section including, but not limited to, the following: (1) Database management for each remote location as... consolidation of waste for economical shipment (including no longer shipping waste directly to a TSD from remote...

  15. 40 CFR 262.90 - Project XL for Public Utilities in New York State.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... section including, but not limited to, the following: (1) Database management for each remote location as... consolidation of waste for economical shipment (including no longer shipping waste directly to a TSD from remote...

  16. 40 CFR 262.90 - Project XL for Public Utilities in New York State.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... section including, but not limited to, the following: (1) Database management for each remote location as... consolidation of waste for economical shipment (including no longer shipping waste directly to a TSD from remote...

  17. 40 CFR 262.90 - Project XL for Public Utilities in New York State.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... section including, but not limited to, the following: (1) Database management for each remote location as... consolidation of waste for economical shipment (including no longer shipping waste directly to a TSD from remote...

  18. Ten Years Experience In Geo-Databases For Linear Facilities Risk Assessment (Lfra)

    NASA Astrophysics Data System (ADS)

    Oboni, F.

    2003-04-01

    Keywords: geo-environmental, database, ISO14000, management, decision-making, risk, pipelines, roads, railroads, loss control, SAR, hazard identification ABSTRACT: During the past decades, characterized by the development of the Risk Management (RM) culture, a variety of RM models have been proposed by governmental agencies in various parts of the world. The most structured models appear to have originated in the field of environmental RM. These models are briefly reviewed in the first section of the paper, focusing on the difference between Hazard Management and Risk Management and on the need for databases that allow retrieval of specific information and effective updating. The core of the paper reviews a number of RM approaches, based on extensions of geo-databases, specifically developed for linear facilities (LF) in transportation corridors since the early 1990s in Switzerland, Italy, Canada, the US and South America. The applications are compared in terms of methodology, capabilities and the resources necessary for their implementation. The paper then focuses on the level of detail that applications and related data have to attain. Common pitfalls of decision making based on hazards rather than on risks are discussed. The last sections describe the next generation of linear facility RA applications, including examples of results and a discussion of future methodological research. It is shown that geo-databases should be linked to loss control and accident reports in order to maximize their benefits. The links between RA and ISO 14000 (environmental management code) are explicitly considered.

  19. Sankofa pediatric HIV disclosure intervention cyber data management: building capacity in a resource-limited setting and ensuring data quality.

    PubMed

    Catlin, Ann Christine; Fernando, Sumudinie; Gamage, Ruwan; Renner, Lorna; Antwi, Sampson; Tettey, Jonas Kusah; Amisah, Kofi Aikins; Kyriakides, Tassos; Cong, Xiangyu; Reynolds, Nancy R; Paintsil, Elijah

    2015-01-01

    Prevalence of pediatric HIV disclosure is low in resource-limited settings. Innovative, culturally sensitive, and patient-centered disclosure approaches are needed. Conducting such studies in resource-limited settings is not trivial considering the challenges of capturing, cleaning, and storing clinical research data. To overcome some of these challenges, the Sankofa pediatric disclosure intervention adopted an interactive cyber infrastructure for data capture and analysis. The Sankofa Project database system is built on the HUBzero cyber infrastructure (https://hubzero.org), an open source software platform. The hub database components support: (1) data management - the "databases" component creates, configures, and manages database access, backup, repositories, applications, and access control; (2) data collection - the "forms" component is used to build customized web case report forms that incorporate common data elements and include tailored form submit processing to handle error checking, data validation, and data linkage as the data are stored to the database; and (3) data exploration - the "dataviewer" component provides powerful methods for users to view, search, sort, navigate, explore, map, graph, visualize, aggregate, drill-down, compute, and export data from the database. The Sankofa cyber data management tool supports a user-friendly, secure, and systematic collection of all data. We have screened more than 400 child-caregiver dyads and enrolled nearly 300 dyads, with tens of thousands of data elements. The dataviews have successfully supported all data exploration and analysis needs of the Sankofa Project. Moreover, the ability of the sites to query and view data summaries has proven to be an incentive for collecting complete and accurate data. The data system has all the desirable attributes of an electronic data capture tool. It also provides an added advantage of building data management capacity in resource-limited settings due to its innovative data query and summary views and availability of real-time support by the data management team.
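    The submit-time error checking and data validation described for the "forms" component can be sketched as a small routine; the field names, age range, and enrollment check below are hypothetical illustrations, not the Sankofa Project's actual rules:

```python
# Minimal sketch of form-submit processing: required-field checks, range
# validation, and data linkage against an enrollment table. All field
# names and rules here are invented for illustration.

def validate_record(record, enrolled_dyads):
    """Return a list of error messages; an empty list means the form passes."""
    errors = []
    # Required-field check
    for field in ("dyad_id", "visit_date", "child_age"):
        if record.get(field) in (None, ""):
            errors.append(f"missing required field: {field}")
    # Range validation on a numeric field
    age = record.get("child_age")
    if age is not None and not 0 <= age <= 18:
        errors.append("child_age out of range 0-18")
    # Data linkage: the visit form must reference an already-enrolled dyad
    dyad = record.get("dyad_id")
    if dyad and dyad not in enrolled_dyads:
        errors.append(f"unknown dyad_id: {dyad}")
    return errors
```

    In a system of this kind the checks run as the record is stored, so invalid submissions never reach the database.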

  20. The Hawaiian Freshwater Algal Database (HfwADB): a laboratory LIMS and online biodiversity resource

    PubMed Central

    2012-01-01

    Background Biodiversity databases serve the important role of highlighting species-level diversity from defined geographical regions. Databases that are specially designed to accommodate the types of data gathered during regional surveys are valuable in allowing full data access and display to researchers not directly involved with the project, while serving as a Laboratory Information Management System (LIMS). The Hawaiian Freshwater Algal Database, or HfwADB, was modified from the Hawaiian Algal Database to showcase non-marine algal specimens collected from the Hawaiian Archipelago by accommodating the additional level of organization required for samples including multiple species. Description The Hawaiian Freshwater Algal Database is a comprehensive and searchable database containing photographs and micrographs of samples and collection sites, geo-referenced collecting information, taxonomic data and standardized DNA sequence data. All data for individual samples are linked through unique 10-digit accession numbers (“Isolate Accession”), the first five of which correspond to the collection site (“Environmental Accession”). Users can search online for sample information by accession number, various levels of taxonomy, habitat or collection site. HfwADB is hosted at the University of Hawaii, and was made publicly accessible in October 2011. At the present time the database houses data for over 2,825 samples of non-marine algae from 1,786 collection sites from the Hawaiian Archipelago. These samples include cyanobacteria, red and green algae and diatoms, as well as lesser representation from some other algal lineages. Conclusions HfwADB is a digital repository that acts as a Laboratory Information Management System for Hawaiian non-marine algal data. Users can interact with the repository through the web to view relevant habitat data (including geo-referenced collection locations) and download images of collection sites, specimen photographs and micrographs, and DNA sequences. It is publicly available at http://algae.manoa.hawaii.edu/hfwadb/. PMID:23095476
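    The accession scheme described above, a 10-digit Isolate Accession whose first five digits form the site-level Environmental Accession, is easy to mirror in code; the helper names here are mine, not HfwADB's:

```python
def split_accession(isolate_accession):
    """Split a 10-digit Isolate Accession into its Environmental Accession
    (first five digits, identifying the collection site) and the
    isolate-specific suffix (last five digits)."""
    acc = isolate_accession.strip()
    if len(acc) != 10 or not acc.isdigit():
        raise ValueError(f"expected a 10-digit accession, got {acc!r}")
    return acc[:5], acc[5:]

def group_by_site(accessions):
    """Group isolate accessions by their shared collection site."""
    sites = {}
    for acc in accessions:
        env, _ = split_accession(acc)
        sites.setdefault(env, []).append(acc)
    return sites
```

    Grouping on the shared prefix recovers every sample taken at one collection site, which is what lets a user search by site as well as by individual accession.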

  1. Microcomputer Database Management Systems for Bibliographic Data.

    ERIC Educational Resources Information Center

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  2. The Data Base and Decision Making in Public Schools.

    ERIC Educational Resources Information Center

    Hedges, William D.

    1984-01-01

    Describes generic types of databases--file management systems, relational database management systems, and network/hierarchical database management systems--with their respective strengths and weaknesses; discusses factors to be considered in determining whether a database is desirable; and provides evaluative criteria for use in choosing…

  3. Call-Center Based Disease Management of Pediatric Asthmatics

    DTIC Science & Technology

    2006-04-01

    This study will measure the impact of CBDMP, which promotes patient education and empowerment, on multiple factors to include; patient/caregiver quality...Prepare and reproduce patient education materials, and informed consent work sheets. Contract Oracle data base administrator to establish database for... Patient education materials and informed consent documents were reproduced. A web-based Oracle data-base was determined to be both prohibitively

  4. Chesapeake Bay Program Water Quality Database

    EPA Pesticide Factsheets

    The Chesapeake Information Management System (CIMS), designed in 1996, is an integrated, accessible information management system for the Chesapeake Bay Region. CIMS is an organized, distributed library of information and software tools designed to increase basin-wide public access to Chesapeake Bay information. The information delivered by CIMS includes technical and public information, educational material, environmental indicators, policy documents, and scientific data. Through the use of relational databases, web-based programming, and web-based GIS, a large number of Internet resources have been established. These resources include multiple distributed on-line databases, on-demand graphing and mapping of environmental data, and geographic searching tools for environmental information. Also available are baseline monitoring data, summarized data, and environmental indicators that document ecosystem status and trends and confirm linkages between water quality, habitat quality and abundance, and the distribution and integrity of biological populations. One of the major features of the CIMS network is the Chesapeake Bay Program's Data Hub, providing users access to a suite of long-term water quality and living resources databases. Chesapeake Bay mainstem and tidal tributary water quality, benthic macroinvertebrates, toxics, plankton, and fluorescence data can be obtained for a network of over 800 monitoring stations.

  5. A Ranking Analysis of the Management Schools in Greater China (2000-2010): Evidence from the SSCI Database

    ERIC Educational Resources Information Center

    Hou, Mingjun; Fan, Peihua; Liu, Heng

    2014-01-01

    The authors rank the management schools in Greater China (including Mainland China, Hong Kong, Taiwan, and Macau) based on their academic publications in the Social Sciences Citation Index management and business journals from 2000 to 2010. Following K. Ritzberger's (2008) and X. Yu and Z. Gao's (2010) ranking method, the authors develop six…

  6. An annotated bibliography of invasive tree pathogens Sirococcus clavigignenti-juglandacearum, Phytophthora alni, and Phytophthora quercina and a regulatory policy and management practices for invasive species

    Treesearch

    T.M. Seeland; M.E. Ostry; R. Venette; J. Juzwik

    2006-01-01

    Provides a database of selected literature pertaining to the prevention, early detection and rapid response, control and management, and rehabilitation and restoration related to three invasive fungal pathogens of forest trees. Literature addressing regulatory policy and management practices for invasive species is also included.

  7. Applying Data Mining Principles to Library Data Collection.

    ERIC Educational Resources Information Center

    Guenther, Kim

    2000-01-01

    Explains how libraries can use data mining techniques for more effective data collection. Highlights include three phases: data selection and acquisition; data preparation and processing, including a discussion of the use of XML (extensible markup language); and data interpretation and integration, including database management systems. (LRW)

  8. Relational Information Management Data-Base System

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Erickson, W. J.; Gray, F. P.; Comfort, D. L.; Wahlstrom, S. O.; Von Limbach, G.

    1985-01-01

    RIM5 is a DBMS with several features particularly useful to scientists and engineers. It interfaces with any application program written in a language capable of calling FORTRAN routines. Applications include data management for Space Shuttle Columbia tiles, aircraft flight tests, high-pressure piping, atmospheric chemistry, census, university registration, CAD/CAM geometry, and civil-engineering dam construction.

  9. Effects of long-term soil and crop management on soil hydraulic properties for claypan soils

    USDA-ARS?s Scientific Manuscript database

    Regional and national soil maps have been developed along with associated soil property databases to assist users in making land management decisions based on soil characteristics. These soil properties include average values from soil characterization for each soil series. In reality, these propert...

  10. Google Earth for Landowners: Insights from Hands-on Workshops

    ERIC Educational Resources Information Center

    Huff, Tristan

    2014-01-01

    Google Earth is an accessible, user-friendly GIS that can help landowners in their management planning. I offered hands-on Google Earth workshops to landowners to teach skills, including mapmaking, length and area measurement, and database management. Workshop participants were surveyed at least 6 months following workshop completion, and learning…

  11. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  12. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  13. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  14. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  15. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  16. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... individual database managers; and to perform other functions as needed for the administration of the TV bands... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers...

  17. Specialized microbial databases for inductive exploration of microbial genome sequences

    PubMed Central

    Fang, Gang; Ho, Christine; Qiu, Yaowu; Cubas, Virginie; Yu, Zhou; Cabau, Cédric; Cheung, Frankie; Moszer, Ivan; Danchin, Antoine

    2005-01-01

    Background The enormous amount of genome sequence data calls for user-oriented databases to manage sequences and annotations. Queries must include search tools permitting function identification through exploration of related objects. Methods The GenoList package for collecting and mining microbial genome databases has been rewritten using MySQL as the database management system. Functions that were not available in MySQL, such as nested subqueries, have been implemented. Results Inductive reasoning in the study of genomes starts from "islands of knowledge", centered around genes with some known background. With this concept of "neighborhood" in mind, a modified version of the GenoList structure has been used for organizing sequence data from prokaryotic genomes of particular interest in China. GenoChore, a set of 17 specialized end-user-oriented microbial databases (including one instance of Microsporidia, Encephalitozoon cuniculi, a member of Eukarya), has been made publicly available. These databases allow the user to browse genome sequence and annotation data using standard queries. In addition, they provide a weekly update of searches against the worldwide protein sequence data libraries, allowing one to monitor annotation updates on genes of interest. Finally, they allow users to search for patterns in DNA or protein sequences, taking into account a clustering of genes into formal operons, as well as providing extra facilities to query sequences using predefined sequence patterns. Conclusion This growing set of specialized microbial databases organizes data created by the first Chinese bacterial genome programs (ThermaList, Thermoanaerobacter tencongensis; LeptoList, with two different genomes of Leptospira interrogans; and SepiList, Staphylococcus epidermidis) associated with related organisms for comparison. PMID:15698474
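    The kind of nested subquery the rewritten GenoList package had to supply on top of early MySQL can be illustrated with Python's sqlite3 module, which supports subqueries natively; the schema, gene names, and "neighborhood" query below are invented for the example:

```python
import sqlite3

# Toy schema: genes grouped into operons, plus a separate annotation table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE gene (name TEXT, operon_id INTEGER);
    INSERT INTO gene VALUES ('thrA', 1), ('thrB', 1), ('lacZ', 2);
    CREATE TABLE annotation (gene_name TEXT, keyword TEXT);
    INSERT INTO annotation VALUES ('thrA', 'biosynthesis');
""")

# Nested subquery in the "island of knowledge" spirit: fetch every gene
# belonging to an operon that contains at least one annotated gene.
rows = con.execute("""
    SELECT name FROM gene
    WHERE operon_id IN (
        SELECT operon_id FROM gene
        WHERE name IN (SELECT gene_name FROM annotation)
    )
    ORDER BY name
""").fetchall()
neighbors = [r[0] for r in rows]  # genes in the neighborhood of known genes
```

    Here the inner queries locate the annotated "islands", and the outer query expands them to their operon neighbors, which is the exploration-of-related-objects pattern the abstract describes.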

  18. Application of materials database (MAT.DB.) to materials education

    NASA Technical Reports Server (NTRS)

    Liu, Ping; Waskom, Tommy L.

    1994-01-01

    Finding the right material for the job is an important aspect of engineering. Sometimes the choice is as fundamental as selecting between steel and aluminum. Other times, the choice may be between different compositions in an alloy. Discovering and compiling materials data is a demanding task, but it leads to accurate models for analysis and successful materials application. Mat. DB. is a database management system designed for maintaining information on the properties and processing of engineered materials, including metals, plastics, composites, and ceramics. It was developed by the Center for Materials Data of American Society for Metals (ASM) International. The ASM Center for Materials Data collects and reviews material property data for publication in books, reports, and electronic database. Mat. DB was developed to aid the data management and material applications.

  19. Decision Support Systems for Research and Management in Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Rodriquez, Luis F.

    2004-01-01

    Decision support systems have been implemented in many applications including strategic planning for battlefield scenarios, corporate decision making for business planning, production planning and control systems, and recommendation generators like those on Amazon.com®. Such tools are reviewed for developing a similar tool for NASA's ALS Program. DSS are considered concurrently with the development of the OPIS system, a database designed for chronicling research and development in ALS. By utilizing the OPIS database, it is anticipated that decision support can be provided to increase the quality of decisions by ALS managers and researchers.

  20. The NASA ADS Abstract Service and the Distributed Astronomy Digital Library [and] Project Soup: Comparing Evaluations of Digital Collection Efforts [and] Cross-Organizational Access Management: A Digital Library Authentication and Authorization Architecture [and] BibRelEx: Exploring Bibliographic Databases by Visualization of Annotated Content-based Relations [and] Semantics-Sensitive Retrieval for Digital Picture Libraries [and] Encoded Archival Description: An Introduction and Overview.

    ERIC Educational Resources Information Center

    Kurtz, Michael J.; Eichorn, Guenther; Accomazzi, Alberto; Grant, Carolyn S.; Demleitner, Markus; Murray, Stephen S.; Jones, Michael L. W.; Gay, Geri K.; Rieger, Robert H.; Millman, David; Bruggemann-Klein, Anne; Klein, Rolf; Landgraf, Britta; Wang, James Ze; Li, Jia; Chan, Desmond; Wiederhold, Gio; Pitti, Daniel V.

    1999-01-01

    Includes six articles that discuss a digital library for astronomy; comparing evaluations of digital collection efforts; cross-organizational access management of Web-based resources; searching scientific bibliographic databases based on content-based relations between documents; semantics-sensitive retrieval for digital picture libraries; and…

  1. eCOMPAGT – efficient Combination and Management of Phenotypes and Genotypes for Genetic Epidemiology

    PubMed Central

    Schönherr, Sebastian; Weißensteiner, Hansi; Coassin, Stefan; Specht, Günther; Kronenberg, Florian; Brandstätter, Anita

    2009-01-01

    Background High-throughput genotyping and phenotyping projects of large epidemiological study populations require sophisticated laboratory information management systems. Most epidemiological studies include subject-related personal information, which needs to be handled with care by following data privacy protection guidelines. In addition, genotyping core facilities handling cooperative projects require a straightforward solution to monitor the status and financial resources of the different projects. Description We developed a database system for the efficient combination and management of phenotypes and genotypes (eCOMPAGT) deriving from genetic epidemiological studies. eCOMPAGT securely stores and manages genotype and phenotype data and enables different user modes with different rights. Special attention was paid to the import of data deriving from TaqMan and SNPlex genotyping assays. However, the database solution is adjustable to other genotyping systems by programming additional interfaces. Further important features are the scalability of the database and an export interface to statistical software. Conclusion eCOMPAGT can store, administer and connect phenotype data with all kinds of genotype data and is available as a downloadable version at . PMID:19432954
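    The core operation described above, connecting phenotype records with genotype data and exporting the merged set for statistical software, can be sketched as follows; the table layout and column names are invented for illustration and do not reflect eCOMPAGT's actual schema:

```python
import csv
import io

def export_merged(phenotypes, genotypes):
    """Join hypothetical phenotype and genotype records on subject ID and
    emit CSV, the kind of flat file a statistics package can ingest."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["subject", "bmi", "rs1234"])
    # Export only subjects present in both tables
    for subject in sorted(set(phenotypes) & set(genotypes)):
        writer.writerow([subject,
                         phenotypes[subject]["bmi"],
                         genotypes[subject]["rs1234"]])
    return out.getvalue()
```

    Keeping phenotypes and genotypes in separate stores joined only at export time also makes it easier to apply different access rights to each, in line with the privacy concerns the abstract raises.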

  2. Publishing Trends in Educational Computing.

    ERIC Educational Resources Information Center

    O'Hair, Marilyn; Johnson, D. LaMont

    1989-01-01

    Describes results of a survey of secondary school and college teachers that was conducted to determine subject matter that should be included in educational computing journals. Areas of interest included computer applications; artificial intelligence; computer-aided instruction; computer literacy; computer-managed instruction; databases; distance…

  3. Southern African Treatment Resistance Network (SATuRN) RegaDB HIV drug resistance and clinical management database: supporting patient management, surveillance and research in southern Africa

    PubMed Central

    Manasa, Justen; Lessells, Richard; Rossouw, Theresa; Naidu, Kevindra; Van Vuuren, Cloete; Goedhals, Dominique; van Zyl, Gert; Bester, Armand; Skingsley, Andrew; Stott, Katharine; Danaviah, Siva; Chetty, Terusha; Singh, Lavanya; Moodley, Pravi; Iwuji, Collins; McGrath, Nuala; Seebregts, Christopher J.; de Oliveira, Tulio

    2014-01-01

    Abstract Substantial amounts of data have been generated from patient management and academic exercises designed to better understand the human immunodeficiency virus (HIV) epidemic and design interventions to control it. A number of specialized databases have been designed to manage huge data sets from HIV cohort, vaccine, host genomic and drug resistance studies. Besides databases from cohort studies, most of the online databases contain limited curated data and are thus sequence repositories. HIV drug resistance has been shown to have a great potential to derail the progress made thus far through antiretroviral therapy. Thus, a lot of resources have been invested in generating drug resistance data for patient management and surveillance purposes. Unfortunately, most of the data currently available relate to subtype B even though >60% of the epidemic is caused by HIV-1 subtype C. A consortium of clinicians, scientists, public health experts and policy makers working in southern Africa came together and formed a network, the Southern African Treatment and Resistance Network (SATuRN), with the aim of increasing curated HIV-1 subtype C and tuberculosis drug resistance data. This article describes the HIV-1 data curation process using the SATuRN Rega database. The data curation is a manual and time-consuming process done by clinical, laboratory and data curation specialists. Access to the highly curated data sets is through applications that are reviewed by the SATuRN executive committee. Examples of research outputs from the analysis of the curated data include trends in the level of transmitted drug resistance in South Africa, analysis of the levels of acquired resistance among patients failing therapy and factors associated with the absence of genotypic evidence of drug resistance among patients failing therapy. All these studies have been important for informing first- and second-line therapy. This database is a free password-protected open source database available on www.bioafrica.net. Database URL: http://www.bioafrica.net/regadb/ PMID:24504151

  4. Database Management Systems: New Homes for Migrating Bibliographic Records.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Bierbaum, Esther G.

    1987-01-01

    Assesses bibliographic databases as part of visionary text systems such as hypertext and scholars' workstations. Downloading is discussed in terms of the capability to search records and to maintain unique bibliographic descriptions, and relational database management systems, file managers, and text databases are reviewed as possible hosts for…

  5. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  6. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  7. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  8. Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.

    ERIC Educational Resources Information Center

    Pieska, K. A. O.

    1986-01-01

    Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)

  9. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  10. Information flow in the DAMA project beyond database managers: information flow managers

    NASA Astrophysics Data System (ADS)

    Russell, Lucian; Wolfson, Ouri; Yu, Clement

    1996-12-01

    To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, sharing of point of sale information, is being considered in the Demand Activated Manufacturing Project (DAMA) of the American Textile Partnership (AMTEX) project. A scenario is examined in which 100 000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26 000 suppliers through the use of bill of materials explosions at four levels of detail. Enabling this communication requires an approach that shares common features with both workflows and database management. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced so as to keep estimates of demand as current as possible.
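    The bill-of-materials explosion at the heart of the scenario above can be sketched in a few lines; the product structure and quantities here are invented, and the real DAMA chain spans 26 000 suppliers rather than one toy tree:

```python
# Four-level bill of materials: finished sewn product down to raw material.
# Structure and per-unit quantities are invented for illustration.
BOM = {
    "shirt":      [("cut_panels", 1), ("buttons", 7)],   # level 1
    "cut_panels": [("fabric_m2", 1.5)],                  # level 2
    "fabric_m2":  [("yarn_kg", 0.2)],                    # level 3
    "buttons":    [],                                    # purchased part
    "yarn_kg":    [],                                    # raw material, level 4
}

def explode(item, quantity, requirements=None):
    """Propagate demand for `quantity` units of `item` down the BOM,
    accumulating total component requirements at every level."""
    if requirements is None:
        requirements = {}
    for child, per_unit in BOM.get(item, []):
        need = quantity * per_unit
        requirements[child] = requirements.get(child, 0) + need
        explode(child, need, requirements)  # recurse to the next level
    return requirements
```

    Running `explode("shirt", 1000)` converts a retail demand estimate into component requirements for every tier of suppliers, which is the computation the information flow manager would keep current as fresh point-of-sale data arrives.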

  11. Construction of databases: advances and significance in clinical research.

    PubMed

    Long, Erping; Huang, Bingjie; Wang, Liming; Lin, Xiaoyu; Lin, Haotian

    2015-12-01

    Widely used in clinical research, the database is a data-management automation technology and the most efficient tool for data management. In this article, we first explain basic concepts such as the definition, classification, and establishment of databases. We then present the workflow for establishing databases, inputting data, verifying data, and managing databases. By discussing the application of databases in clinical research, we illustrate the important role of databases in clinical research practice. Lastly, we introduce the reanalysis of randomized controlled trials (RCTs) and cloud computing techniques, showing the most recent advances of databases in clinical research.

  12. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  13. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  14. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  15. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  16. Considerations and benefits of implementing an online database tool for business continuity.

    PubMed

    Mackinnon, Susanne; Pinette, Jennifer

    2016-01-01

    In today's challenging climate of ongoing fiscal restraints, limited resources and complex organisational structures, there is an acute need to investigate opportunities to facilitate enhanced delivery of business continuity programmes while maintaining or increasing acceptable levels of service delivery. In 2013, Health Emergency Management British Columbia (HEMBC), responsible for emergency management and business continuity activities across British Columbia's health sector, transitioned its business continuity programme from a manual to an automated process with the development of a customised online database, known as the Health Emergency Management Assessment Tool (HEMAT). Key benefits to date include a more efficient business continuity input process, immediate situational awareness for use in emergency response and/or advanced planning, and streamlined analyses for the generation of reports.

  17. The Muon Conditions Data Management:. Database Architecture and Software Infrastructure

    NASA Astrophysics Data System (ADS)

    Verducci, Monica

    2010-04-01

    The management of the Muon Conditions Database will be one of the most challenging applications for the Muon System, both in terms of data volumes and rates and in terms of the variety of data stored and their analysis. The Muon conditions database is responsible for storing almost all of the 'non-event' data and detector quality flags needed for debugging the detector operations and for performing the reconstruction and the analysis. For the early data in particular, knowledge of the detector performance and of the corrections in terms of efficiency and calibration will be extremely important for the correct reconstruction of the events. In this work, an overview of the entire Muon conditions database architecture is given, covering the different sources of the data and the storage model used, including the associated database technology. Particular emphasis is given to the Data Quality chain: the flow of the data, the analysis and the final results are described. In addition, the software interfaces used to access the conditions data are described, in particular within ATHENA, the ATLAS offline reconstruction framework.

  18. 47 CFR 52.107 - Hoarding.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... provision shall be included in the Service Management System tariff and in the local exchange carriers' toll free database access tariffs: [T]he Federal Communications Commission (“FCC”) has concluded that...

  19. The NSO FTS database program and archive (FTSDBM)

    NASA Technical Reports Server (NTRS)

    Lytle, D. M.

    1992-01-01

    Data from the NSO Fourier transform spectrometer is being re-archived from half-inch tape onto write-once compact disk. In the process, information about each spectrum and a low-resolution copy of each spectrum are being saved into an on-line database. FTSDBM is a simple database management program in the NSO external package for IRAF. A command language allows the FTSDBM user to add entries to the database, delete entries, select subsets from the database based on keyword values (including ranges of values), create new database files based on these subsets, make keyword lists, examine low-resolution spectra graphically, and make disk number/file number lists. Once the archive is complete, FTSDBM will allow the database to be efficiently searched for data of interest to the user, and the compact disk format will allow random access to that data.
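The subset-selection step this record describes (picking entries whose keyword values fall within ranges, then emitting disk number/file number lists) can be sketched in a few lines. The record fields and values below are invented for illustration and are not FTSDBM's actual schema or command language:

```python
# Hypothetical sketch of keyword-range subset selection in the spirit of
# FTSDBM; record fields and values are invented for illustration.
records = [
    {"disk": 1, "file": 12, "wave_start": 400.0},
    {"disk": 1, "file": 13, "wave_start": 900.0},
    {"disk": 2, "file": 4, "wave_start": 450.0},
]

def select(db, **ranges):
    """Return records whose keyword values fall inside the given (lo, hi) ranges."""
    return [rec for rec in db
            if all(lo <= rec[key] <= hi for key, (lo, hi) in ranges.items())]

# Select a subset by a keyword range, then make a disk number/file number list.
subset = select(records, wave_start=(350.0, 500.0))
disk_file_list = [(r["disk"], r["file"]) for r in subset]
print(disk_file_list)  # [(1, 12), (2, 4)]
```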

  20. Information Management Tools for Classrooms: Exploring Database Management Systems. Technical Report No. 28.

    ERIC Educational Resources Information Center

    Freeman, Carla; And Others

    In order to understand how the database software or online database functioned in the overall curricula, the use of database management (DBMs) systems was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…

  1. 78 FR 28756 - Defense Federal Acquisition Regulation Supplement: System for Award Management Name Changes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... Excluded Parties Listing System (EPLS) databases into the System for Award Management (SAM) database. DATES... combined the functional capabilities of the CCR, ORCA, and EPLS procurement systems into the SAM database... identification number and the type of organization from the System for Award Management database. 0 3. Revise the...

  2. An Examination of Selected Software Testing Tools: 1992

    DTIC Science & Technology

    1992-12-01

    Fragmentary excerpts recovered from the report: Figure 27-17, "Metrics Manager Database Full Report"; using a historical test database, the test management and problem reporting tools were examined with the sample test database provided by each supplier; Metrics Manager is supported by an industry database that allows users to track the impact of new methods, organizational structures, and technologies.

  3. The Advent of Portals.

    ERIC Educational Resources Information Center

    Jackson, Mary E.

    2002-01-01

    Explains portals as tools that gather a variety of electronic information resources, including local library resources, into a single Web page. Highlights include cross-database searching; integration with university portals and course management software; the ARL (Association of Research Libraries) Scholars Portal Initiative; and selected vendors…

  4. Schools Inc.: An Administrator's Guide to the Business of Education.

    ERIC Educational Resources Information Center

    McCarthy, Bob; And Others

    1989-01-01

    This theme issue describes ways in which educational administrators are successfully automating many of their administrative tasks. Articles focus on student management; office automation, including word processing, databases, and spreadsheets; human resources; support services, including supplies, textbooks, and learning resources; financial…

  5. Huntington's Disease Research Roster Support with a Microcomputer Database Management System

    PubMed Central

    Gersting, J. M.; Conneally, P. M.; Beidelman, K.

    1983-01-01

    This paper chronicles the MEGADATS (Medical Genetics Acquisition and DAta Transfer System) database development effort in collecting, storing, retrieving, and plotting human family pedigrees. The newest system, MEGADATS-3M, is detailed. Emphasis is on the microcomputer version of MEGADATS-3M and its use to support the Huntington's Disease research roster project. Examples of data input and pedigree plotting are included.

  6. (BARS) -- Bibliographic Retrieval System Sandia Shock Compression (SSC) database Shock Physics Index (SPHINX) database. Volume 1: UNIX version query guide customized application for INGRES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herrmann, W.; von Laven, G.M.; Parker, T.

    1993-09-01

    The Bibliographic Retrieval System (BARS) is a database management system specially designed to retrieve bibliographic references. Two databases are available: (i) the Sandia Shock Compression (SSC) database, which contains over 5700 references to the literature related to stress waves in solids and their applications, and (ii) the Shock Physics Index (SPHINX), which includes over 8000 further references to stress waves in solids, material properties at intermediate and low rates, ballistic and hypervelocity impact, and explosive or shock fabrication methods. There is some overlap in the information in the two databases.

  7. Efficacy of Noninvasive Stellate Ganglion Blockade Performed Using Physical Agent Modalities in Patients with Sympathetic Hyperactivity-Associated Disorders: A Systematic Review and Meta-Analysis

    PubMed Central

    Liao, Chun-De; Tsauo, Jau-Yih; Liou, Tsan-Hon

    2016-01-01

    Background Stellate ganglion blockade (SGB) is mainly used to relieve symptoms of neuropathic pain in conditions such as complex regional pain syndrome and has several potential complications. Noninvasive SGB performed using physical agent modalities (PAMs), such as light irradiation and electrical stimulation, can be clinically used as an alternative to conventional invasive SGB. However, its application protocols vary and its clinical efficacy remains controversial. This study investigated the use of noninvasive SGB for managing neuropathic pain or other disorders associated with sympathetic hyperactivity. Materials and Methods We performed a comprehensive search of the following online databases: Medline, PubMed, Excerpta Medica Database, Cochrane Library Database, Ovid MEDLINE, Europe PubMed Central, EBSCOhost Research Databases, CINAHL, ProQuest Research Library, Physiotherapy Evidence Database, WorldWideScience, BIOSIS, and Google Scholar. We identified and included quasi-randomized or randomized controlled trials reporting the efficacy of SGB performed using therapeutic ultrasound, transcutaneous electrical nerve stimulation, light irradiation using low-level laser therapy, or xenon light or linearly polarized near-infrared light irradiation near or over the stellate ganglion region in treating complex regional pain syndrome or disorders requiring sympatholytic management. The included articles were subjected to a meta-analysis and risk of bias assessment. Results Nine randomized and four quasi-randomized controlled trials were included. Eleven trials had good methodological quality with a Physiotherapy Evidence Database (PEDro) score of ≥6, whereas the remaining two trials had a PEDro score of <6. The meta-analysis results revealed that the efficacy of noninvasive SGB on 100-mm visual analog pain score is higher than that of a placebo or active control (weighted mean difference, −21.59 mm; 95% CI, −34.25, −8.94; p = 0.0008). 
Conclusions Noninvasive SGB performed using PAMs effectively relieves pain of various etiologies, making it a valuable addition to the contemporary pain management armamentarium. However, this evidence is limited by the potential risk of bias. PMID:27911934
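The pooled estimate this record reports is a weighted mean difference of the kind produced by an inverse-variance (fixed-effect) meta-analysis. A minimal sketch follows; the per-trial values are invented for illustration, not the trials analyzed in this review:

```python
import math

# Inverse-variance (fixed-effect) pooled mean difference with a 95% CI.
# Trial data below are invented, not from the review above.
trials = [
    (-25.0, 8.0),   # (mean difference in mm on a 100-mm VAS, standard error)
    (-18.0, 6.5),
    (-30.0, 12.0),
]

weights = [1.0 / se ** 2 for _, se in trials]          # weight = 1 / variance
pooled = sum(w * md for (md, _), w in zip(trials, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
print(f"WMD {pooled:.2f} mm (95% CI {ci[0]:.2f}, {ci[1]:.2f})")
```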

  8. Meta-All: a system for managing metabolic pathway information.

    PubMed

    Weise, Stephan; Grosse, Ivo; Klukas, Christian; Koschützki, Dirk; Scholz, Uwe; Schreiber, Falk; Junker, Björn H

    2006-10-23

    Many attempts are being made to understand biological subjects at a systems level. A major resource for these approaches is biological databases, storing manifold information about DNA, RNA and protein sequences including their functional and structural motifs, molecular markers, mRNA expression levels, metabolite concentrations, protein-protein interactions, phenotypic traits or taxonomic relationships. The use of these databases is often hampered by the fact that they are designed for special application areas and thus lack universality. Databases on metabolic pathways, which provide an increasingly important foundation for many analyses of biochemical processes at a systems level, are no exception to the rule. Data stored in central databases such as KEGG, BRENDA or SABIO-RK is often limited to read-only access. If experimentalists want to store their own data, possibly still under investigation, there are two possibilities. They can either develop their own information system for managing their own data, which is very time-consuming and costly, or they can try to store their data in existing systems, which is often restricted. Hence, an out-of-the-box information system for managing metabolic pathway data is needed. We have designed META-ALL, an information system that allows the management of metabolic pathways, including reaction kinetics, detailed locations, environmental factors and taxonomic information. Data can be stored together with quality tags and in different parallel versions. META-ALL uses Oracle DBMS and Oracle Application Express. We provide the META-ALL information system for download and use. In this paper, we describe the database structure and give information about the tools for submitting and accessing the data. As a first application of META-ALL, we show how the information contained in a detailed kinetic model can be stored and accessed. META-ALL is a system for managing information about metabolic pathways. 
It facilitates the handling of pathway-related data and is designed to help biochemists and molecular biologists in their daily research. It is available on the Web at http://bic-gh.de/meta-all and can be downloaded free of charge and installed locally.
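META-ALL itself uses Oracle DBMS, but the idea the abstract highlights, storing entries with quality tags in parallel versions and retrieving the latest one, can be sketched with SQLite. The table, columns, and values below are hypothetical, not META-ALL's actual schema:

```python
import sqlite3

# Illustrative sketch (not META-ALL's real Oracle schema) of pathway data
# stored with a quality tag and parallel versions.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE reaction (
        name    TEXT,
        version INTEGER,
        quality TEXT,   -- e.g. 'estimated' vs. 'measured'
        km_mM   REAL,   -- hypothetical kinetic parameter
        PRIMARY KEY (name, version)
    )
""")
con.executemany(
    "INSERT INTO reaction VALUES (?, ?, ?, ?)",
    [("hexokinase", 1, "estimated", 0.15),
     ("hexokinase", 2, "measured", 0.10)],  # a newer parallel version
)

# Fetch the latest version of each reaction.
row = con.execute("""
    SELECT r.name, r.version, r.quality, r.km_mM
    FROM reaction r
    JOIN (SELECT name, MAX(version) AS v FROM reaction GROUP BY name) m
      ON r.name = m.name AND r.version = m.v
""").fetchone()
print(row)  # ('hexokinase', 2, 'measured', 0.1)
```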

  9. Meta-All: a system for managing metabolic pathway information

    PubMed Central

    Weise, Stephan; Grosse, Ivo; Klukas, Christian; Koschützki, Dirk; Scholz, Uwe; Schreiber, Falk; Junker, Björn H

    2006-01-01

    Background Many attempts are being made to understand biological subjects at a systems level. A major resource for these approaches is biological databases, storing manifold information about DNA, RNA and protein sequences including their functional and structural motifs, molecular markers, mRNA expression levels, metabolite concentrations, protein-protein interactions, phenotypic traits or taxonomic relationships. The use of these databases is often hampered by the fact that they are designed for special application areas and thus lack universality. Databases on metabolic pathways, which provide an increasingly important foundation for many analyses of biochemical processes at a systems level, are no exception to the rule. Data stored in central databases such as KEGG, BRENDA or SABIO-RK is often limited to read-only access. If experimentalists want to store their own data, possibly still under investigation, there are two possibilities. They can either develop their own information system for managing their own data, which is very time-consuming and costly, or they can try to store their data in existing systems, which is often restricted. Hence, an out-of-the-box information system for managing metabolic pathway data is needed. Results We have designed META-ALL, an information system that allows the management of metabolic pathways, including reaction kinetics, detailed locations, environmental factors and taxonomic information. Data can be stored together with quality tags and in different parallel versions. META-ALL uses Oracle DBMS and Oracle Application Express. We provide the META-ALL information system for download and use. In this paper, we describe the database structure and give information about the tools for submitting and accessing the data. As a first application of META-ALL, we show how the information contained in a detailed kinetic model can be stored and accessed. 
    Conclusion META-ALL is a system for managing information about metabolic pathways. It facilitates the handling of pathway-related data and is designed to help biochemists and molecular biologists in their daily research. It is available on the Web and can be downloaded free of charge and installed locally. PMID:17059592

  10. BioWarehouse: a bioinformatics database warehouse toolkit

    PubMed Central

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David WJ; Tenenbaum, Jessica D; Karp, Peter D

    2006-01-01

    Background This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. 
Conclusion BioWarehouse embodies significant progress on the database integration problem for bioinformatics. PMID:16556315
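The warehouse approach described here places the component databases in a single DBMS so that one SQL statement can span them, as in the enzyme-coverage example. A minimal sketch, with SQLite standing in for MySQL/Oracle and with invented table names and rows (not BioWarehouse's actual schema):

```python
import sqlite3

# Two "component database" tables loaded into one DBMS, so a single SQL
# query can span both. Table names and rows are invented for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE enzyme_activity (ec_number TEXT PRIMARY KEY);
    CREATE TABLE protein_seq (ec_number TEXT, sequence TEXT);
    INSERT INTO enzyme_activity VALUES ('1.1.1.1'), ('2.7.1.1'), ('4.2.1.11'), ('6.3.4.5');
    INSERT INTO protein_seq VALUES ('1.1.1.1', 'MKV...'), ('2.7.1.1', 'MAA...');
""")

# Count activities with no matching sequence in one cross-table query.
missing, total = con.execute("""
    SELECT SUM(p.ec_number IS NULL), COUNT(*)
    FROM enzyme_activity a LEFT JOIN protein_seq p USING (ec_number)
""").fetchone()
print(f"{missing}/{total} enzyme activities lack a sequence")  # 2/4
```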

  11. BioWarehouse: a bioinformatics database warehouse toolkit.

    PubMed

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David W J; Tenenbaum, Jessica D; Karp, Peter D

    2006-03-23

    This article addresses the problem of interoperation of heterogeneous bioinformatics databases. We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. BioWarehouse embodies significant progress on the database integration problem for bioinformatics.

  12. Human health risk assessment database, "the NHSRC toxicity value database": supporting the risk assessment process at US EPA's National Homeland Security Research Center.

    PubMed

    Moudgal, Chandrika J; Garrahan, Kevin; Brady-Roberts, Eletha; Gavrelis, Naida; Arbogast, Michelle; Dun, Sarah

    2008-11-15

    The toxicity value database of the United States Environmental Protection Agency's (EPA) National Homeland Security Research Center has been in development since 2004. The toxicity value database includes a compilation of agent property, toxicity, dose-response, and health effects data for 96 agents: 84 chemical and radiological agents and 12 biotoxins. The database is populated with multiple toxicity benchmark values and agent property information from secondary sources, with web links to the secondary sources, where available. A selected set of primary literature citations and associated dose-response data are also included. The toxicity value database offers a powerful means to quickly and efficiently gather pertinent toxicity and dose-response data for a number of agents that are of concern to the nation's security. This database, in conjunction with other tools, will play an important role in understanding human health risks, and will provide a means for risk assessors and managers to make quick and informed decisions on the potential health risks and determine appropriate responses (e.g., cleanup) to agent release. A final, stand-alone MS Access working version of the toxicity value database was completed in November 2007.

  13. Expert systems identify fossils and manage large paleontological databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beightol, D.S.; Conrad, M.A.

    EXPAL is a computer program permitting creation and maintenance of comprehensive databases in marine paleontology. It is designed to assist specialists and non-specialists. EXPAL includes a powerful expert system based on the morphological descriptors specific to a given group of fossils. The expert system may be used, for example, to describe and automatically identify an unknown specimen. EXPAL was first applied to Dasycladales (Calcareous green algae). Projects are under way for corresponding expert systems and databases on planktonic foraminifers and calpionellids. EXPAL runs on an IBM XT or compatible microcomputer.
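The descriptor-driven identification EXPAL performs can be pictured as matching an unknown specimen's observed characters against stored per-taxon descriptors. A toy sketch, in which the taxa and morphological descriptors are entirely invented:

```python
# Toy sketch of descriptor-based identification in the spirit of EXPAL's
# expert system; taxa and morphological descriptors are invented.
taxa = {
    "Taxon A": {"branching": "whorled", "thallus": "cylindrical"},
    "Taxon B": {"branching": "radial", "thallus": "discoid"},
}

def identify(specimen):
    """Return the taxa whose stored descriptors all match the observed specimen."""
    return [name for name, descriptors in taxa.items()
            if all(specimen.get(k) == v for k, v in descriptors.items())]

print(identify({"branching": "whorled", "thallus": "cylindrical"}))  # ['Taxon A']
```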

  14. Database Searching by Managers.

    ERIC Educational Resources Information Center

    Arnold, Stephen E.

    Managers and executives need the easy and quick access to business and management information that online databases can provide, but many have difficulty articulating their search needs to an intermediary. One possible solution would be to encourage managers and their immediate support staff members to search textual databases directly as they now…

  15. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  16. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  17. Ottawa Panel Evidence-Based Clinical Practice Guidelines for Foot Care in the Management of Juvenile Idiopathic Arthritis.

    PubMed

    Brosseau, Lucie; Toupin-April, Karine; Wells, George; Smith, Christine A; Pugh, Arlanna G; Stinson, Jennifer N; Duffy, Ciarán M; Gifford, Wendy; Moher, David; Sherrington, Catherine; Cavallo, Sabrina; De Angelis, Gino; Loew, Laurianne; Rahman, Prinon; Marcotte, Rachel; Taki, Jade; Bisaillon, Jacinthe; King, Judy; Coda, Andrea; Hendry, Gordon J; Gauvreau, Julie; Hayles, Martin; Hayles, Kay; Feldman, Brian; Kenny, Glen P; Li, Jing Xian; Briggs, Andrew M; Martini, Rose; Feldman, Debbie Ehrmann; Maltais, Désirée B; Tupper, Susan; Bigford, Sarah; Bisch, Marg

    2016-07-01

    To create evidence-based guidelines evaluating foot care interventions for the management of juvenile idiopathic arthritis (JIA). An electronic literature search of the following databases from database inception to May 2015 was conducted: MEDLINE (Ovid), EMBASE (Ovid), Cochrane CENTRAL, and clinicaltrials.gov. The Ottawa Panel selection criteria targeted studies that assessed foot care or foot orthotic interventions for the management of JIA in those aged 0 to ≤18 years. The Physiotherapy Evidence Database scale was used to evaluate study quality, and only high-quality studies (score ≥5) were included. A total of 362 records were screened, resulting in 3 full-text articles and 1 additional citation containing supplementary information being included in the analysis. Two reviewers independently extracted study data (intervention, comparator, outcome, time period, study design) from the included studies by using standardized data extraction forms. Directed by Cochrane Collaboration methodology, the statistical analysis produced figures and graphs representing the strength of intervention outcomes and their corresponding grades (A, B, C+, C, C-, D+, D, D-). Clinical significance was achieved when an improvement of ≥30% between the intervention and control groups was present, whereas P<.05 indicated statistical significance. An expert panel Delphi consensus (≥80%) was required for the endorsement of recommendations. All included studies were of high quality and analyzed the effects of multidisciplinary foot care, customized foot orthotics, and shoe inserts for the management of JIA. Custom-made foot orthotics and prefabricated shoe inserts displayed the greatest improvement in pain intensity, activity limitation, foot pain, and disability reduction (grades A, C+). The use of customized foot orthotics and prefabricated shoe inserts seems to be a good choice for managing foot pain and function in JIA. Copyright © 2016 American Congress of Rehabilitation Medicine. 
Published by Elsevier Inc. All rights reserved.

  18. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Charles G. Crawford

Advanced Technology Systems, Inc. (ATS) was contracted by the U.S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eleventh month of Phase 1 development activities.

  19. Generalized Database Management System Support for Numeric Database Environments.

    ERIC Educational Resources Information Center

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  20. Integrated management of thesis using clustering method

    NASA Astrophysics Data System (ADS)

    Astuti, Indah Fitri; Cahyadi, Dedy

    2017-02-01

The thesis is one of the major requirements for students pursuing their bachelor degree. In fact, finishing the thesis involves a long process including consultation, writing the manuscript, conducting the chosen method, seminar scheduling, searching for references, and the appraisal process by the board of mentors and examiners. Unfortunately, most students find it hard to match all the lecturers' free time so that they can sit together in a seminar room to examine the thesis. Therefore, the seminar scheduling process should be a top priority to solve. A manual mechanism for this task no longer fulfills the need. People on campus, including students, staff, and lecturers, demand a system in which all the stakeholders can interact with each other and manage the thesis process without timetable conflicts. A branch of computer science named Management Information System (MIS) could be a breakthrough in dealing with thesis management. This research applies a method called clustering to distinguish certain categories using mathematical formulas. A system was then developed along with the method to create a well-managed tool providing some main facilities such as seminar scheduling, consultation and review process, thesis approval, assessment process, and also a reliable database of theses. The database plays an important role for present and future purposes.
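The core scheduling difficulty described above, finding a slot that every board member can attend, can be sketched as a set intersection over availability data. A minimal illustration in Python; the names and time slots below are invented, not taken from the paper:

```python
# Hypothetical sketch: find seminar slots shared by all board members.
# Availability data and member names are invented for illustration.

def common_slots(availability):
    """Intersect each person's set of free time slots."""
    slots = None
    for person, free in availability.items():
        slots = set(free) if slots is None else slots & set(free)
    return sorted(slots or [])

availability = {
    "mentor_1":   {"Mon 09:00", "Tue 10:00", "Wed 13:00"},
    "mentor_2":   {"Tue 10:00", "Wed 13:00", "Thu 09:00"},
    "examiner_1": {"Mon 09:00", "Wed 13:00"},
}
print(common_slots(availability))  # ['Wed 13:00']
```

A real scheduler would add clustering over these slots and tie-breaking rules, but the intersection step is the part manual coordination gets wrong.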

  1. Database usage and performance for the Fermilab Run II experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonham, D.; Box, D.; Gallas, E.

    2004-12-01

The Run II experiments at Fermilab, CDF and D0, have extensive database needs covering many areas of their online and offline operations. Delivering data to users and processing farms worldwide has represented major challenges to both experiments. The range of applications employing databases includes calibration (conditions), trigger information, run configuration, run quality, luminosity, data management, and others. Oracle is the primary database product being used for these applications at Fermilab and some of its advanced features have been employed, such as table partitioning and replication. There is also experience with open source database products such as MySQL for secondary databases used, for example, in monitoring. Tools employed for monitoring the operation and diagnosing problems are also described.
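As a rough illustration of what a conditions (calibration) database does, the sketch below keys calibration constants to run-number validity ranges. It uses Python's built-in SQLite purely as a stand-in for Oracle, and the schema and values are invented:

```python
# Illustrative stand-in (SQLite instead of Oracle) for a conditions
# database keyed by run-number validity ranges; schema is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE calibration (
    first_run INTEGER, last_run INTEGER, gain REAL)""")
conn.executemany("INSERT INTO calibration VALUES (?, ?, ?)",
                 [(1000, 1999, 1.02), (2000, 2999, 1.05)])

def gain_for_run(run):
    """Return the calibration constant valid for a given run, if any."""
    row = conn.execute(
        "SELECT gain FROM calibration WHERE ? BETWEEN first_run AND last_run",
        (run,)).fetchone()
    return row[0] if row else None

print(gain_for_run(2500))  # 1.05
```

Validity-interval lookups like this are why such tables benefit from the partitioning and replication features the abstract mentions.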

  2. JAX Colony Management System (JCMS): an extensible colony and phenotype data management system.

    PubMed

    Donnelly, Chuck J; McFarland, Mike; Ames, Abigail; Sundberg, Beth; Springer, Dave; Blauth, Peter; Bult, Carol J

    2010-04-01

    The Jackson Laboratory Colony Management System (JCMS) is a software application for managing data and information related to research mouse colonies, associated biospecimens, and experimental protocols. JCMS runs directly on computers that run one of the PC Windows operating systems, but can be accessed via web browser interfaces from any computer running a Windows, Macintosh, or Linux operating system. JCMS can be configured for a single user or multiple users in small- to medium-size work groups. The target audience for JCMS includes laboratory technicians, animal colony managers, and principal investigators. The application provides operational support for colony management and experimental workflows, sample and data tracking through transaction-based data entry forms, and date-driven work reports. Flexible query forms allow researchers to retrieve database records based on user-defined criteria. Recent advances in handheld computers with integrated barcode readers, middleware technologies, web browsers, and wireless networks add to the utility of JCMS by allowing real-time access to the database from any networked computer.
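The "flexible query forms" idea, retrieving records by whatever criteria a user supplies, can be sketched as dynamic construction of a parameterized WHERE clause. The table and field names below are hypothetical, not JCMS's actual schema:

```python
# Hypothetical sketch of a flexible query form: build a parameterized
# WHERE clause from whichever criteria the user filled in.
# Table and column names are invented, not from JCMS.

def build_query(criteria):
    clauses, params = [], []
    for field, value in criteria.items():
        clauses.append(f"{field} = ?")
        params.append(value)
    where = " AND ".join(clauses) or "1=1"  # no criteria -> match all
    return f"SELECT * FROM mice WHERE {where}", params

sql, params = build_query({"strain": "C57BL/6J", "sex": "F"})
print(sql)     # SELECT * FROM mice WHERE strain = ? AND sex = ?
print(params)  # ['C57BL/6J', 'F']
```

Using placeholders rather than string interpolation of values is what keeps such user-defined criteria safe to pass to the database.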

  3. An Integrated Korean Biodiversity and Genetic Information Retrieval System

    PubMed Central

    Lim, Jeongheui; Bhak, Jong; Oh, Hee-Mock; Kim, Chang-Bae; Park, Yong-Ha; Paek, Woon Kee

    2008-01-01

Background On-line biodiversity information databases are growing quickly and being integrated into general bioinformatics systems due to the advances of fast gene sequencing technologies and the Internet. These can reduce the cost and effort of performing biodiversity surveys and genetic searches, which allows scientists to spend more time researching and less time collecting and maintaining data. This will cause an increased rate of knowledge build-up and improve conservation. The biodiversity databases in Korea have been scattered among several institutes and local natural history museums with incompatible data types. Therefore, a comprehensive database and a nationwide web portal for biodiversity information are necessary in order to integrate diverse information resources, including molecular and genomic databases. Results The Korean Natural History Research Information System (NARIS) was built and serviced as the central biodiversity information system to collect and integrate the biodiversity data of various institutes and natural history museums in Korea. This database aims to be an integrated resource that contains additional biological information, such as genome sequences and molecular level diversity. Currently, twelve institutes and museums in Korea are integrated by the DiGIR (Distributed Generic Information Retrieval) protocol, with Darwin Core 2.0 format as its metadata standard for data exchange. Data quality control and statistical analysis functions have been implemented. In particular, integrating molecular and genetic information from the National Center for Biotechnology Information (NCBI) databases with NARIS was recently accomplished. NARIS can also be extended to accommodate other institutes abroad, and the whole system can be exported to establish local biodiversity management servers. Conclusion A Korean data portal, NARIS, has been developed to efficiently manage and utilize biodiversity data, which includes genetic resources. NARIS aims to be integral in maximizing bio-resource utilization for conservation, management, research, education, industrial applications, and integration with other bioinformation data resources. It can be found at . PMID:19091024
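Data exchange via a shared metadata standard such as Darwin Core amounts to mapping each institute's local field names onto the common terms. A minimal sketch; the local schema and the mapping here are invented, though the target names resemble real Darwin Core terms:

```python
# Sketch of normalizing a local museum record to Darwin Core-style
# field names for exchange; the local schema and mapping are invented.

FIELD_MAP = {
    "species": "scientificName",
    "date":    "eventDate",
    "lat":     "decimalLatitude",
    "lon":     "decimalLongitude",
}

def to_darwin_core(local_record):
    """Keep only mapped fields, renamed to the shared vocabulary."""
    return {FIELD_MAP[k]: v for k, v in local_record.items() if k in FIELD_MAP}

record = {"species": "Hynobius leechii", "date": "2007-05-12",
          "lat": 37.57, "lon": 126.98, "internal_id": 42}
print(to_darwin_core(record))
```

Dropping unmapped internal fields, as above, is what lets twelve institutes with incompatible schemas answer one federated DiGIR query.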

  4. Database documentation of marine mammal stranding and mortality: current status review and future prospects.

    PubMed

    Chan, Derek K P; Tsui, Henry C L; Kot, Brian C W

    2017-11-21

    Databases are systematic tools to archive and manage information related to marine mammal stranding and mortality events. Stranding response networks, governmental authorities and non-governmental organizations have established regional or national stranding networks and have developed unique standard stranding response and necropsy protocols to document and track stranded marine mammal demographics, signalment and health data. The objectives of this study were to (1) describe and review the current status of marine mammal stranding and mortality databases worldwide, including the year established, types of database and their goals; and (2) summarize the geographic range included in the database, the number of cases recorded, accessibility, filter and display methods. Peer-reviewed literature was searched, focussing on published databases of live and dead marine mammal strandings and mortality and information released from stranding response organizations (i.e. online updates, journal articles and annual stranding reports). Databases that were not published in the primary literature or recognized by government agencies were excluded. Based on these criteria, 10 marine mammal stranding and mortality databases were identified, and strandings and necropsy data found in these databases were evaluated. We discuss the results, limitations and future prospects of database development. Future prospects include the development and application of virtopsy, a new necropsy investigation tool. A centralized web-accessed database of all available postmortem multimedia from stranded marine mammals may eventually support marine conservation and policy decisions, which will allow the use of marine animals as sentinels of ecosystem health, working towards a 'One Ocean-One Health' ideal.

  5. CMO: Cruise Metadata Organizer for JAMSTEC Research Cruises

    NASA Astrophysics Data System (ADS)

    Fukuda, K.; Saito, H.; Hanafusa, Y.; Vanroosebeke, A.; Kitayama, T.

    2011-12-01

JAMSTEC's Data Research Center for Marine-Earth Sciences manages and distributes a wide variety of observational data and samples obtained from JAMSTEC research vessels and deep sea submersibles. Generally, metadata are essential to identify how data and samples were obtained. In JAMSTEC, cruise metadata include cruise information such as cruise ID, name of vessel, research theme, and diving information such as dive number, name of submersible and position of diving point. They are submitted by chief scientists of research cruises in the Microsoft Excel® spreadsheet format, and registered into a data management database to confirm receipt of observational data files, cruise summaries, and cruise reports. The cruise metadata are also published via "JAMSTEC Data Site for Research Cruises" within two months after the end of each cruise. Furthermore, these metadata are distributed with observational data, images and samples via several data and sample distribution websites after a publication moratorium period. However, there are two operational issues in the metadata publishing process. One is duplicated effort and asynchronous metadata across multiple distribution websites, because administrators enter metadata into each website manually. The other is that data types and metadata representations differ across websites. To solve those problems, we have developed a cruise metadata organizer (CMO) which allows cruise metadata to be connected from the data management database to several distribution websites. CMO is comprised of three components: an Extensible Markup Language (XML) database, an Enterprise Application Integration (EAI) software, and a web-based interface. The XML database is used because of its flexibility for any change of metadata. Daily differential uptake of metadata from the data management database to the XML database is automatically processed via the EAI software. Some metadata are entered into the XML database using the web-based interface by a metadata editor in CMO as needed. Then daily differential uptake of metadata from the XML database to databases in several distribution websites is automatically processed using a convertor defined by the EAI software. Currently, CMO is available for three distribution websites: "Deep Sea Floor Rock Sample Database GANSEKI", "Marine Biological Sample Database", and "JAMSTEC E-library of Deep-sea Images". CMO is planned to provide "JAMSTEC Data Site for Research Cruises" with metadata in the future.
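The "daily differential uptake" step can be sketched as comparing two metadata snapshots and forwarding only records that were added or changed. The cruise IDs and fields below are invented for illustration:

```python
# Hedged sketch of a daily differential uptake: compare yesterday's and
# today's metadata snapshots and emit only added or changed records.
# Cruise IDs and fields are invented.

def differential(old, new):
    """Records in `new` that are absent from, or differ in, `old`."""
    return {cruise_id: meta for cruise_id, meta in new.items()
            if old.get(cruise_id) != meta}

old = {"YK11-01": {"vessel": "Yokosuka"},
       "MR11-02": {"vessel": "Mirai"}}
new = {"YK11-01": {"vessel": "Yokosuka"},
       "MR11-02": {"vessel": "Mirai", "report": "submitted"},
       "KY11-03": {"vessel": "Kaiyo"}}
print(sorted(differential(old, new)))  # ['KY11-03', 'MR11-02']
```

Only the differential set needs converting and pushing to each distribution website, which is what keeps the daily EAI run cheap.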

  6. The opportunities and obstacles in developing a vascular birthmark database for clinical and research use.

    PubMed

    Sharma, Vishal K; Fraulin, Frankie Og; Harrop, A Robertson; McPhalen, Donald F

    2011-01-01

Databases are useful tools in clinical settings. The authors review the benefits and challenges associated with the development and implementation of an efficient electronic database for the multidisciplinary Vascular Birthmark Clinic at the Alberta Children's Hospital, Calgary, Alberta. The content and structure of the database were designed using the technical expertise of a data analyst from the Calgary Health Region. Relevant clinical and demographic data fields were included with the goal of documenting ongoing care of individual patients, and facilitating future epidemiological studies of this patient population. After completion of this database, 10 challenges encountered during development were retrospectively identified. Practical solutions for these challenges are presented. The challenges identified during the database development process included: identification of relevant data fields; balancing simplicity and user-friendliness with complexity and comprehensive data storage; database expertise versus clinical expertise; software platform selection; linkage of data from the previous spreadsheet to a new data management system; ethics approval for the development of the database and its utilization for research studies; ensuring privacy and limited access to the database; integration of digital photographs into the database; adoption of the database by support staff in the clinic; and maintaining up-to-date entries in the database. There are several challenges involved in the development of a useful and efficient clinical database. Awareness of these potential obstacles, in advance, may simplify the development of clinical databases by others in various surgical settings.

  7. Literature Review and Database of Relations Between Salinity and Aquatic Biota: Applications to Bowdoin National Wildlife Refuge, Montana

    USGS Publications Warehouse

    Gleason, Robert A.; Tangen, Brian A.; Laubhan, Murray K.; Finocchiaro, Raymond G.; Stamm, John F.

    2009-01-01

    Long-term accumulation of salts in wetlands at Bowdoin National Wildlife Refuge (NWR), Mont., has raised concern among wetland managers that increasing salinity may threaten plant and invertebrate communities that provide important habitat and food resources for migratory waterfowl. Currently, the U.S. Fish and Wildlife Service (USFWS) is evaluating various water management strategies to help maintain suitable ranges of salinity to sustain plant and invertebrate resources of importance to wildlife. To support this evaluation, the USFWS requested that the U.S. Geological Survey (USGS) provide information on salinity ranges of water and soil for common plants and invertebrates on Bowdoin NWR lands. To address this need, we conducted a search of the literature on occurrences of plants and invertebrates in relation to salinity and pH of the water and soil. The compiled literature was used to (1) provide a general overview of salinity concepts, (2) document published tolerances and adaptations of biota to salinity, (3) develop databases that the USFWS can use to summarize the range of reported salinity values associated with plant and invertebrate taxa, and (4) perform database summaries that describe reported salinity ranges associated with plants and invertebrates at Bowdoin NWR. The purpose of this report is to synthesize information to facilitate a better understanding of the ecological relations between salinity and flora and fauna when developing wetland management strategies. A primary focus of this report is to provide information to help evaluate and address salinity issues at Bowdoin NWR; however, the accompanying databases, as well as concepts and information discussed, are applicable to other areas or refuges. The accompanying databases include salinity values reported for 411 plant taxa and 330 invertebrate taxa. 
The databases are available in Microsoft Excel version 2007 (http://pubs.usgs.gov/sir/2009/5098/downloads/databases_21april2009.xls) and contain 27 data fields that include variables such as taxonomic identification, values for salinity and pH, wetland classification, location of study, and source of data. The databases are not exhaustive of the literature and are biased toward wetland habitats located in the glaciated North-Central United States; however, the databases do encompass a diversity of biota commonly found in brackish and freshwater inland wetland habitats.
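The kind of range summary the databases support, e.g., which taxa have reported salinity tolerances overlapping a target range, can be sketched as a simple overlap filter. The records below are invented, not taken from the actual databases:

```python
# Hypothetical sketch of a database summary: select taxa whose reported
# salinity range overlaps a target range. Records are invented.

def tolerant_taxa(records, lo, hi):
    """Taxa whose [sal_min, sal_max] interval overlaps [lo, hi]."""
    return [r["taxon"] for r in records
            if r["sal_min"] <= hi and r["sal_max"] >= lo]

records = [
    {"taxon": "Typha latifolia", "sal_min": 0.0, "sal_max": 4.0},
    {"taxon": "Ruppia maritima", "sal_min": 5.0, "sal_max": 45.0},
    {"taxon": "Daphnia pulex",   "sal_min": 0.0, "sal_max": 1.5},
]
print(tolerant_taxa(records, 8.0, 20.0))  # ['Ruppia maritima']
```

Managers could run the same overlap test against projected wetland salinities to flag which plant and invertebrate taxa are at risk.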

  8. Rasdaman for Big Spatial Raster Data

    NASA Astrophysics Data System (ADS)

    Hu, F.; Huang, Q.; Scheele, C. J.; Yang, C. P.; Yu, M.; Liu, K.

    2015-12-01

Spatial raster data have grown exponentially over the past decade. Recent advancements in data acquisition technology, such as remote sensing, have allowed us to collect massive observation data of various spatial resolution and domain coverage. The volume, velocity, and variety of such spatial data, along with the computationally intensive nature of spatial queries, pose a grand challenge to the storage technologies for effective big data management. While high performance computing platforms (e.g., cloud computing) can be used to solve the computing-intensive issues in big data analysis, data have to be managed in a way that is suitable for distributed parallel processing. Recently, rasdaman (raster data manager) has emerged as a scalable and cost-effective database solution to store and retrieve massive multi-dimensional arrays, such as sensor, image, and statistics data. Within this paper, the pros and cons of using rasdaman to manage and query spatial raster data will be examined and compared with other common approaches, including file-based systems, relational databases (e.g., PostgreSQL/PostGIS), and NoSQL databases (e.g., MongoDB and Hive). Earth Observing System (EOS) data collected from NASA's Atmospheric Scientific Data Center (ASDC) will be used and stored in these selected database systems, and a set of spatial and non-spatial queries will be designed to benchmark their performance on retrieving large-scale, multi-dimensional arrays of EOS data. Lessons learnt from using rasdaman will be discussed as well.
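The array-subsetting operation at the heart of rasdaman, extracting a spatial window from a multi-dimensional grid, can be sketched in plain Python; rasdaman itself would express this as a rasql range-subscript query. The grid here is synthetic:

```python
# Minimal sketch of the array-subsetting idea behind rasdaman: pull a
# spatial window out of a (time, y, x) grid. Data are synthetic; in
# rasdaman this would be a rasql query with range subscripts.

def subset(grid, t, y0, y1, x0, x1):
    """One time step, rows y0..y1-1, columns x0..x1-1."""
    return [row[x0:x1] for row in grid[t][y0:y1]]

# 2 time steps of a 4x4 grid; cell value encodes (t, y, x)
grid = [[[t * 100 + y * 10 + x for x in range(4)] for y in range(4)]
        for t in range(2)]
print(subset(grid, 1, 1, 3, 0, 2))  # [[110, 111], [120, 121]]
```

An array database evaluates such subscripts server-side against tiled storage, so only the requested window, not the whole array, is read and shipped.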

  9. Geothopica and the interactive analysis and visualization of the updated Italian National Geothermal Database

    NASA Astrophysics Data System (ADS)

    Trumpy, Eugenio; Manzella, Adele

    2017-02-01

The Italian National Geothermal Database (BDNG) is the largest collection of Italian geothermal data and was set up in the 1980s. It has since been updated both in terms of content and management tools: information on deep wells and thermal springs (with temperature > 30 °C) is currently organized and stored in a PostgreSQL relational database management system, which guarantees high performance, data security and easy access through different client applications. The BDNG is the core of the Geothopica web site, whose webGIS tool allows different types of user to access geothermal data, to visualize multiple types of datasets, and to perform integrated analyses. The webGIS tool has been recently improved by two specially designed, programmed and implemented visualization tools to display data on well lithology and underground temperatures. This paper describes the contents of the database and its software and data update, as well as the webGIS tool, including the new lithology and temperature visualization tools. The geoinformation organized in the database and accessible through Geothopica is of use not only for geothermal purposes, but also for any kind of georesource and CO2 storage project requiring the organization of, and access to, deep underground data. Geothopica also supports project developers, researchers, and decision makers in the assessment, management and sustainable deployment of georesources.

  10. Laboratory Information Systems.

    PubMed

    Henricks, Walter H

    2015-06-01

    Laboratory information systems (LISs) supply mission-critical capabilities for the vast array of information-processing needs of modern laboratories. LIS architectures include mainframe, client-server, and thin client configurations. The LIS database software manages a laboratory's data. LIS dictionaries are database tables that a laboratory uses to tailor an LIS to the unique needs of that laboratory. Anatomic pathology LIS (APLIS) functions play key roles throughout the pathology workflow, and laboratories rely on LIS management reports to monitor operations. This article describes the structure and functions of APLISs, with emphasis on their roles in laboratory operations and their relevance to pathologists. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. The new geographic information system in ETVA VI.PE.

    NASA Astrophysics Data System (ADS)

    Xagoraris, Zafiris; Soulis, George

    2016-08-01

ETVA VI.PE. S.A. is a member of the Piraeus Bank Group of Companies and its activities include designing, developing, exploiting and managing Industrial Areas throughout Greece. Inside ETVA VI.PE.'s thirty-one Industrial Parks there are currently 2,500 manufacturing companies established, with 40,000 employees and € 2.5 billion of invested funds. In each one of the industrial areas ETVA VI.PE guarantees the companies industrial lots of land (sites) with propitious building codes and complete infrastructure networks of water supply, sewerage, paved roads, power supply, communications, cleansing services, etc. The development of the Geographical Information System for ETVA VI.PE.'s Industrial Parks started at the beginning of 1992 and consists of three subsystems: Cadastre, which manages the information for the land acquisition of Industrial Areas; Street Layout - Sites, which manages the sites sold to manufacturing companies; and Networks, which manages the infrastructure networks (roads, water supply, sewerage, etc). The mapping of each Industrial Park is made incorporating state-of-the-art photogrammetric, cartographic and surveying methods and techniques. Passing through the phases of initial design (hybrid GIS) and system upgrade (integrated GIS solution with spatial database), the system is currently operating on a new upgrade (integrated GIS solution with spatial database) that includes redesigning and merging the system's database schemas, along with the creation of central security policies, and the development of a new web GIS application for advanced data entry, highly customisable and standard reports, and dynamic interactive maps. The new GIS brings the company to advanced levels of productivity and introduces a new era of decision making and business management.

  12. Nencki Genomics Database--Ensembl funcgen enhanced with intersections, user data and genome-wide TFBS motifs.

    PubMed

    Krystkowiak, Izabella; Lenart, Jakub; Debski, Konrad; Kuterba, Piotr; Petas, Michal; Kaminska, Bozena; Dabrowski, Michal

    2013-01-01

    We present the Nencki Genomics Database, which extends the functionality of Ensembl Regulatory Build (funcgen) for the three species: human, mouse and rat. The key enhancements over Ensembl funcgen include the following: (i) a user can add private data, analyze them alongside the public data and manage access rights; (ii) inside the database, we provide efficient procedures for computing intersections between regulatory features and for mapping them to the genes. To Ensembl funcgen-derived data, which include data from ENCODE, we add information on conserved non-coding (putative regulatory) sequences, and on genome-wide occurrence of transcription factor binding site motifs from the current versions of two major motif libraries, namely, Jaspar and Transfac. The intersections and mapping to the genes are pre-computed for the public data, and the result of any procedure run on the data added by the users is stored back into the database, thus incrementally increasing the body of pre-computed data. As the Ensembl funcgen schema for the rat is currently not populated, our database is the first database of regulatory features for this frequently used laboratory animal. The database is accessible without registration using the mysql client: mysql -h database.nencki-genomics.org -u public. Registration is required only to add or access private data. A WSDL webservice provides access to the database from any SOAP client, including the Taverna Workbench with a graphical user interface.
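The interval-intersection procedure the database pre-computes can be sketched as an overlap test between regulatory-feature and gene coordinates on one chromosome. A naive pairwise version with invented coordinates (the database's actual procedures run inside MySQL):

```python
# Hedged sketch of the core operation the database pre-computes:
# intersect regulatory-feature intervals with gene intervals on one
# chromosome. Coordinates are invented; a real implementation would
# use sorted coordinates or an interval index, not a pairwise scan.

def intersect(features, genes):
    hits = []
    for f_id, f_start, f_end in features:
        for g_id, g_start, g_end in genes:
            if f_start <= g_end and g_start <= f_end:  # intervals overlap
                hits.append((f_id, g_id))
    return hits

features = [("peak1", 100, 200), ("peak2", 500, 600)]
genes = [("geneA", 150, 400), ("geneB", 700, 900)]
print(intersect(features, genes))  # [('peak1', 'geneA')]
```

Pre-computing and storing these feature-to-gene pairs, as the database does for public data, turns each later lookup into a simple indexed join.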

  13. Nencki Genomics Database—Ensembl funcgen enhanced with intersections, user data and genome-wide TFBS motifs

    PubMed Central

    Krystkowiak, Izabella; Lenart, Jakub; Debski, Konrad; Kuterba, Piotr; Petas, Michal; Kaminska, Bozena; Dabrowski, Michal

    2013-01-01

We present the Nencki Genomics Database, which extends the functionality of Ensembl Regulatory Build (funcgen) for the three species: human, mouse and rat. The key enhancements over Ensembl funcgen include the following: (i) a user can add private data, analyze them alongside the public data and manage access rights; (ii) inside the database, we provide efficient procedures for computing intersections between regulatory features and for mapping them to the genes. To Ensembl funcgen-derived data, which include data from ENCODE, we add information on conserved non-coding (putative regulatory) sequences, and on genome-wide occurrence of transcription factor binding site motifs from the current versions of two major motif libraries, namely, Jaspar and Transfac. The intersections and mapping to the genes are pre-computed for the public data, and the result of any procedure run on the data added by the users is stored back into the database, thus incrementally increasing the body of pre-computed data. As the Ensembl funcgen schema for the rat is currently not populated, our database is the first database of regulatory features for this frequently used laboratory animal. The database is accessible without registration using the mysql client: mysql -h database.nencki-genomics.org -u public. Registration is required only to add or access private data. A WSDL webservice provides access to the database from any SOAP client, including the Taverna Workbench with a graphical user interface. Database URL: http://www.nencki-genomics.org. PMID:24089456

  14. Building Databases for Education. ERIC Digest.

    ERIC Educational Resources Information Center

    Klausmeier, Jane A.

    This digest provides a brief explanation of what a database is; explains how a database can be used; identifies important factors that should be considered when choosing database management system software; and provides citations to sources for finding reviews and evaluations of database management software. The digest is concerned primarily with…

  15. Watershed Data Management (WDM) database for Salt Creek streamflow simulation, DuPage County, Illinois, water years 2005-11

    USGS Publications Warehouse

    Bera, Maitreyee

    2014-01-01

    The U.S. Geological Survey (USGS), in cooperation with DuPage County Stormwater Management Division, maintains a USGS database of hourly meteorologic and hydrologic data for use in a near real-time streamflow simulation system, which assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek watershed in DuPage County, Illinois. Most of the precipitation data are collected from a tipping-bucket rain-gage network located in and near DuPage County. The other meteorologic data (wind speed, solar radiation, air temperature, and dewpoint temperature) are collected at Argonne National Laboratory in Argonne, Ill. Potential evapotranspiration is computed from the meteorologic data. The hydrologic data (discharge and stage) are collected at USGS streamflow-gaging stations in DuPage County. These data are stored in a Watershed Data Management (WDM) database. An earlier report describes in detail the WDM database development including the processing of data from January 1, 1997, through September 30, 2004, in SEP04.WDM database. SEP04.WDM is updated with the appended data from October 1, 2004, through September 30, 2011, water years 2005–11 and renamed as SEP11.WDM. This report details the processing of meteorologic and hydrologic data in SEP11.WDM. This report provides a record of snow affected periods and the data used to fill missing-record periods for each precipitation site during water years 2005–11. The meteorologic data filling methods are described in detail in Over and others (2010), and an update is provided in this report.
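Filling a missing-record period from neighboring gages can be sketched as averaging the concurrent values at nearby sites. This is only an illustration with invented values; the report's actual filling methods follow Over and others (2010):

```python
# Illustrative sketch only: fill a gage's missing hours with the mean
# of neighboring gages for the same hours. Values are invented; the
# report's actual filling methods follow Over and others (2010).

def fill_missing(target, neighbors):
    filled = []
    for i, v in enumerate(target):
        if v is None:
            vals = [n[i] for n in neighbors if n[i] is not None]
            v = sum(vals) / len(vals) if vals else None
        filled.append(v)
    return filled

target    = [0.0, None, 0.3, None]      # hourly precipitation, one gage
neighbors = [[0.0, 0.25, 0.3, 0.5],
             [0.0, 0.75, 0.2, 0.5]]
print(fill_missing(target, neighbors))  # [0.0, 0.5, 0.3, 0.5]
```

Observed hours pass through unchanged, so the filled series can be appended to the WDM record without disturbing measured data.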

  16. Geospatial database of estimates of groundwater discharge to streams in the Upper Colorado River Basin

    USGS Publications Warehouse

    Garcia, Adriana; Masbruch, Melissa D.; Susong, David D.

    2014-01-01

    The U.S. Geological Survey, as part of the Department of the Interior’s WaterSMART (Sustain and Manage America’s Resources for Tomorrow) initiative, compiled published estimates of groundwater discharge to streams in the Upper Colorado River Basin as a geospatial database. For the purpose of this report, groundwater discharge to streams is the baseflow portion of streamflow that includes contributions of groundwater from various flow paths. Reported estimates of groundwater discharge were assigned as attributes to stream reaches derived from the high-resolution National Hydrography Dataset. A total of 235 estimates of groundwater discharge to streams were compiled and included in the dataset. Feature class attributes of the geospatial database include groundwater discharge (acre-feet per year), method of estimation, citation abbreviation, defined reach, and 8-digit hydrologic unit code(s). Baseflow index (BFI) estimates of groundwater discharge were calculated using an existing streamflow characteristics dataset and were included as an attribute in the geospatial database. A comparison of the BFI estimates to the compiled estimates of groundwater discharge found that the BFI estimates were greater than the reported groundwater discharge estimates.
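A baseflow index is the fraction of total streamflow made up of baseflow, so a BFI-style estimate reduces to a ratio of sums over a flow record. A toy sketch with invented daily flows (the report's BFI values come from an existing streamflow-characteristics dataset, not this calculation):

```python
# Hedged sketch: a baseflow index is the baseflow share of total
# streamflow. Daily flow values below are invented.

def baseflow_index(total_flow, baseflow):
    """Ratio of total baseflow volume to total streamflow volume."""
    return sum(baseflow) / sum(total_flow)

total = [10.0, 12.0, 8.0, 10.0]  # daily streamflow, e.g. ft^3/s
base  = [6.0,  6.0,  6.0, 6.0]   # separated baseflow component
print(round(baseflow_index(total, base), 2))  # 0.6
```

Multiplying such an index by annual streamflow volume gives a groundwater-discharge estimate comparable to the compiled published values.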

  17. Mass Storage Performance Information System

    NASA Technical Reports Server (NTRS)

    Scheuermann, Peter

    2000-01-01

    The purpose of this task is to develop a data warehouse to enable system administrators and their managers to gather information by querying the data logs of the MDSDS. Currently, detailed logs capture the activity of the MDSDS internally within its different systems. The elements to be included in the data warehouse effort are requirements analysis, data cleansing, database design, database population, hardware/software acquisition, data transformation, query and report generation, and data mining.

  18. Open Clients for Distributed Databases

    NASA Astrophysics Data System (ADS)

    Chayes, D. N.; Arko, R. A.

    2001-12-01

    We are actively developing a collection of open source example clients that demonstrate use of our "back end" data management infrastructure. The data management system is reported elsewhere at this meeting (Arko and Chayes: A Scaleable Database Infrastructure). In addition to their primary goal of being examples for others to build upon, some of these clients may have limited utility in themselves. More information about the clients and the data infrastructure is available online at http://data.ldeo.columbia.edu. The examples to be demonstrated include several web-based clients: those developed for the Community Review System of the Digital Library for Earth System Education, a real-time watch stander's log book, an offline interface for using log book entries, and a simple client to search multibeam metadata. These are Internet-enabled, generally web-based front ends that support searches against one or more relational databases using industry-standard SQL queries. In addition to the web-based clients, simple SQL searches from within Excel and similar applications will be demonstrated. By defining, documenting, and publishing a clear interface to the fully searchable databases, it becomes relatively easy to construct client interfaces that are optimized for specific applications, in comparison to building a monolithic data and user interface system.
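A hedged sketch of the kind of standard SQL query such a thin client issues. The table name, columns, and survey values are invented for illustration (the actual LDEO schema is not described here), and Python's built-in sqlite3 stands in for whatever relational engine the back end runs:

```python
import sqlite3

# Hypothetical multibeam-metadata table; column names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE multibeam_metadata (
    survey_id TEXT PRIMARY KEY,
    ship      TEXT,
    year      INTEGER,
    west REAL, east REAL, south REAL, north REAL)""")
conn.executemany(
    "INSERT INTO multibeam_metadata VALUES (?, ?, ?, ?, ?, ?, ?)",
    [("EW9602", "Ewing", 1996, -70.0, -60.0, 30.0, 40.0),
     ("KN159",  "Knorr", 1999, -75.0, -65.0, 25.0, 35.0)])

# A client searches with an ordinary SQL query: surveys whose bounding
# box overlaps a requested region (east/west/north/south in degrees).
rows = conn.execute(
    """SELECT survey_id, ship FROM multibeam_metadata
       WHERE west < ? AND east > ? AND south < ? AND north > ?
       ORDER BY year""",
    (-62.0, -68.0, 38.0, 36.0)).fetchall()
```

Because the interface is plain SQL over a documented schema, the same query works from a web front end, an Excel connection, or a script, which is the point the abstract makes.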

  19. Forensic DNA databases in Western Balkan region: retrospectives, perspectives, and initiatives

    PubMed Central

    Marjanović, Damir; Konjhodžić, Rijad; Butorac, Sara Sanela; Drobnič, Katja; Merkaš, Siniša; Lauc, Gordan; Primorac, Damir; Anđelinović, Šimun; Milosavljević, Mladen; Karan, Željko; Vidović, Stojko; Stojković, Oliver; Panić, Bojana; Vučetić Dragović, Anđelka; Kovačević, Sandra; Jakovski, Zlatko; Asplen, Chris; Primorac, Dragan

    2011-01-01

    The European Network of Forensic Science Institutes (ENFSI) recommended the establishment of forensic DNA databases and specific implementation and management legislations for all EU/ENFSI members. Therefore, forensic institutions from Bosnia and Herzegovina, Serbia, Montenegro, and Macedonia launched a wide set of activities to support these recommendations. To assess the current state, a regional expert team completed detailed screening and investigation of the existing forensic DNA data repositories and associated legislation in these countries. The scope also included relevant concurrent projects and a wide spectrum of different activities in relation to forensics DNA use. The state of forensic DNA analysis was also determined in the neighboring Slovenia and Croatia, which already have functional national DNA databases. There is a need for a ‘regional supplement’ to the current documentation and standards pertaining to forensic application of DNA databases, which should include regional-specific preliminary aims and recommendations. PMID:21674821

  20. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Nishikawa, Takaya

    The author describes the progress and present status of the information management system at the research laboratories of an R & D component of the pharmaceutical industry. The system deals with three fundamental types of information: graphic, numeric, and textual, the last of which can embed the former two. The author and others have constructed a system that processes these kinds of information in an integrated way. The system is also notable in that text in its natural form, mixing Japanese (2-byte) and English (1-byte) characters as is customary on personal computers and word processors, can be processed by large computers, because the Japanese language has been made eligible for computer processing. The system is intended primarily for research administrators, but it can also be useful for researchers. At present, 7 databases are available, including external databases, and the system is always ready to accept new databases.

  1. Forensic DNA databases in Western Balkan region: retrospectives, perspectives, and initiatives.

    PubMed

    Marjanović, Damir; Konjhodzić, Rijad; Butorac, Sara Sanela; Drobnic, Katja; Merkas, Sinisa; Lauc, Gordan; Primorac, Damir; Andjelinović, Simun; Milosavljević, Mladen; Karan, Zeljko; Vidović, Stojko; Stojković, Oliver; Panić, Bojana; Vucetić Dragović, Andjelka; Kovacević, Sandra; Jakovski, Zlatko; Asplen, Chris; Primorac, Dragan

    2011-06-01

    The European Network of Forensic Science Institutes (ENFSI) recommended the establishment of forensic DNA databases and specific implementation and management legislations for all EU/ENFSI members. Therefore, forensic institutions from Bosnia and Herzegovina, Serbia, Montenegro, and Macedonia launched a wide set of activities to support these recommendations. To assess the current state, a regional expert team completed detailed screening and investigation of the existing forensic DNA data repositories and associated legislation in these countries. The scope also included relevant concurrent projects and a wide spectrum of different activities in relation to forensics DNA use. The state of forensic DNA analysis was also determined in the neighboring Slovenia and Croatia, which already have functional national DNA databases. There is a need for a 'regional supplement' to the current documentation and standards pertaining to forensic application of DNA databases, which should include regional-specific preliminary aims and recommendations.

  2. De-MA: a web Database for electron Microprobe Analyses to assist EMP lab manager and users

    NASA Astrophysics Data System (ADS)

    Allaz, J. M.

    2012-12-01

    Lab managers and users of electron microprobe (EMP) facilities require comprehensive, yet flexible documentation structures, as well as an efficient scheduling mechanism. A single on-line database system for managing reservations, and providing information on standards, quantitative and qualitative setups (element mapping, etc.), and X-ray data has been developed for this purpose. This system is particularly useful in multi-user facilities where experience ranges from beginners to the highly experienced. New users and occasional facility users will find these tools extremely useful in developing and maintaining high quality, reproducible, and efficient analyses. This user-friendly database is available through the web, and uses MySQL as a database and PHP/HTML as script language (dynamic website). The database includes several tables for standards information, X-ray lines, X-ray element mapping, PHA, element setups, and agenda. It is configurable for up to five different EMPs in a single lab, each of them having up to five spectrometers and as many diffraction crystals as required. The installation should be done on a web server supporting PHP/MySQL, although installation on a personal computer is possible using third-party freeware to create a local Apache server, and to enable PHP/MySQL. Since it is web-based, any user outside the EMP lab can access this database anytime through any web browser and on any operating system. The access can be secured using a general password protection (e.g. htaccess). The web interface consists of 6 main menus. (1) "Standards" lists standards defined in the database, and displays detailed information on each (e.g. material type, name, reference, comments, and analyses). Images such as EDS spectra or BSE can be associated with a standard. 
(2) "Analyses" lists typical setups to use for quantitative analyses, allows calculation of mineral composition based on a mineral formula, or calculation of mineral formula based on a fixed amount of oxygen, or of cation (using an analysis in element or oxide weight-%); this latter includes re-calculation of H2O/CO2 based on stoichiometry, and oxygen correction for F and Cl. Another option offers a list of any available standards and possible peak or background interferences for a series of elements. (3) "X-ray maps" lists the different setups recommended for element mapping using WDS, and a map calculator to facilitate maps setups and to estimate the total mapping time. (4) "X-ray data" lists all x-ray lines for a specific element (K, L, M, absorption edges, and satellite peaks) in term of energy, wavelength and peak position. A check for possible interferences on peak or background is also possible. Theoretical x-ray peak positions for each crystal are calculated based on the 2d spacing of each crystal and the wavelength of each line. (5) "Agenda" menu displays the reservation dates for each month and for each EMP lab defined. It also offers a reservation request option, this request being sent by email to the EMP manager for approval. (6) Finally, "Admin" is password restricted, and contains all necessary options to manage the database through user-friendly forms. The installation of this database is made easy and knowledge of HTML, PHP, or MySQL is unnecessary to install, configure, manage, or use it. A working database is accessible at http://cub.geoloweb.ch.

  3. Early Grades Ideas.

    ERIC Educational Resources Information Center

    Classroom Computer Learning, 1984

    1984-01-01

    Five computer-oriented classroom activities are suggested. They include: Logo programming to help students develop estimation, logic and spatial skills; creating flow charts; inputting data; making snowflakes using Logo; and developing and using a database management program. (JN)

  4. Safe Drinking Water Information System Federal Version (SDWIS/FED)

    EPA Pesticide Factsheets

    SDWIS/FED is EPA’s national database that manages and collects public water system information from states, including reports of drinking water standard violations, reporting and monitoring violations, and other basic information.

  5. Importance of Data Management in a Long-term Biological Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Sigurd W; Brandt, Craig C; McCracken, Kitty

    2011-01-01

    The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of the ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program. An existing relational database was adapted and extended to handle biological data. Data modeling enabled the program's database to process, store, and retrieve its data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was the establishment of standards for sampling site names, taxonomic identification, flagging, and other components. There are limitations. Some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.

  6. Importance of Data Management in a Long-Term Biological Monitoring Program

    NASA Astrophysics Data System (ADS)

    Christensen, Sigurd W.; Brandt, Craig C.; McCracken, Mary K.

    2011-06-01

    The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to meeting this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program when an existing relational database was adapted and extended to handle biological data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was the establishing of standards for sampling site names, taxonomic identification, flagging, and other components. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. We also discuss some limitations to our implementation. Some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.

  7. The Network Configuration of an Object Relational Database Management System

    NASA Technical Reports Server (NTRS)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.

  8. Sankofa pediatric HIV disclosure intervention cyber data management: building capacity in a resource-limited setting and ensuring data quality

    PubMed Central

    Catlin, Ann Christine; Fernando, Sumudinie; Gamage, Ruwan; Renner, Lorna; Antwi, Sampson; Tettey, Jonas Kusah; Amisah, Kofi Aikins; Kyriakides, Tassos; Cong, Xiangyu; Reynolds, Nancy R.; Paintsil, Elijah

    2015-01-01

    Prevalence of pediatric HIV disclosure is low in resource-limited settings. Innovative, culturally sensitive, and patient-centered disclosure approaches are needed. Conducting such studies in resource-limited settings is not trivial considering the challenges of capturing, cleaning, and storing clinical research data. To overcome some of these challenges, the Sankofa pediatric disclosure intervention adopted an interactive cyber infrastructure for data capture and analysis. The Sankofa Project database system is built on the HUBzero cyber infrastructure (https://hubzero.org), an open source software platform. The hub database components support: (1) data management – the “databases” component creates, configures, and manages database access, backup, repositories, applications, and access control; (2) data collection – the “forms” component is used to build customized web case report forms that incorporate common data elements and include tailored form submit processing to handle error checking, data validation, and data linkage as the data are stored to the database; and (3) data exploration – the “dataviewer” component provides powerful methods for users to view, search, sort, navigate, explore, map, graph, visualize, aggregate, drill-down, compute, and export data from the database. The Sankofa cyber data management tool supports a user-friendly, secure, and systematic collection of all data. We have screened more than 400 child–caregiver dyads and enrolled nearly 300 dyads, with tens of thousands of data elements. The dataviews have successfully supported all data exploration and analysis needs of the Sankofa Project. Moreover, the ability of the sites to query and view data summaries has proven to be an incentive for collecting complete and accurate data. The data system has all the desirable attributes of an electronic data capture tool. 
It also provides an added advantage of building data management capacity in resource-limited settings due to its innovative data query and summary views and availability of real-time support by the data management team. PMID:26616131

  9. The Design of a High Performance Earth Imagery and Raster Data Management and Processing Platform

    NASA Astrophysics Data System (ADS)

    Xie, Qingyun

    2016-06-01

    This paper summarizes the general requirements and specific characteristics of both geospatial raster database management system and raster data processing platform from a domain-specific perspective as well as from a computing point of view. It also discusses the need of tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global scale and high performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. Basically, as a database management system, GeoRaster defines an integrated raster data model, supports image compression, data manipulation, general and spatial indices, content and context based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high performance computing (HPC). Specifically, HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.

  10. An agent architecture for an integrated forest ecosystem management decision support system

    Treesearch

    Donald Nute; Walter D. Potter; Mayukh Dass; Astrid Glende; Frederick Maier; Hajime Uchiyama; Jin Wang; Mark Twery; Peter Knopp; Scott Thomasma; H. Michael Rauscher

    2003-01-01

    A wide variety of software tools are available to support decisions in the management of forest ecosystems. These tools include databases, growth and yield models, wildlife models, silvicultural expert systems, financial models, geographic information systems, and visualization tools. Typically, each of these tools has its own complex interface and data format. To...

  11. Ground cover management in walnut and other hardwood plantings

    Treesearch

    J.W. Van Sambeek; H.E. Garrett

    2004-01-01

    Ground cover management in walnut plantings and established stands can include (1) manipulating the resident vegetation, (2) mechanical control, (3) chemical control, (4) mulching, (5) planting cover crops, or (6) interplanting woody nurse crops. Data from over 110 reports were used to compile a database that compared growth of black walnut and other hardwoods under...

  12. Development of an electronic database for Acute Pain Service outcomes

    PubMed Central

    Love, Brandy L; Jensen, Louise A; Schopflocher, Donald; Tsui, Ban CH

    2012-01-01

    BACKGROUND: Quality assurance is increasingly important in the current health care climate. An electronic database can be used for tracking patient information and as a research tool to provide quality assurance for patient care. OBJECTIVE: An electronic database was developed for the Acute Pain Service, University of Alberta Hospital (Edmonton, Alberta) to record patient characteristics, identify at-risk populations, compare treatment efficacies and guide practice decisions. METHOD: Steps in the database development involved identifying the goals for use, relevant variables to include, and a plan for data collection, entry and analysis. Protocols were also created for data cleaning quality control. The database was evaluated with a pilot test using existing data to assess data collection burden, accuracy and functionality of the database. RESULTS: A literature review resulted in an evidence-based list of demographic, clinical and pain management outcome variables to include. Time to assess patients and collect the data was 20 min to 30 min per patient. Limitations were primarily software related, although initial data collection completion was only 65% and accuracy of data entry was 96%. CONCLUSIONS: The electronic database was found to be relevant and functional for the identified goals of data storage and research. PMID:22518364

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bower, J.C.; Burford, M.J.; Downing, T.R.

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Nuclear and Chemical Agency (USANCA). The IBS Data Management Guide provides the background, as well as the operations and procedures needed to generate and maintain a site-specific map database. Data and system managers use this guide to manage the data files and database that support the administrative, user-environment, database management, and operational capabilities of the IBS. This document provides a description of the data files and structures necessary for running the IBS software and using the site map database.

  14. Version 1.00 programmer's tools used in constructing the INEL RML/analytical radiochemistry sample tracking database and its user interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Femec, D.A.

    This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
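A minimal illustration of the CREATE-SCHEMA idea, sketched in Python rather than the report's FORTRAN/SQL tooling: a declarative table specification (the spec format here is invented, not the INEL tool's own) is expanded into the SQL commands that actually create and define the table:

```python
def create_table_sql(table, columns):
    """Generate a CREATE TABLE statement from a simple column spec.

    columns: list of (name, sql_type, nullable) tuples. This mimics the
    template-to-DDL expansion a tool like CREATE-SCHEMA performs.
    """
    defs = ",\n  ".join(
        f"{name} {sql_type}" + ("" if nullable else " NOT NULL")
        for name, sql_type, nullable in columns)
    return f"CREATE TABLE {table} (\n  {defs}\n);"

# Hypothetical sample-tracking table; names are invented for illustration.
ddl = create_table_sql("sample_tracking", [
    ("sample_id", "CHAR(12)",    False),
    ("received",  "DATE",        False),
    ("analysis",  "VARCHAR(40)", True),
])
```

A BUILD-SCREEN-style companion would apply the same pattern to data-entry screen templates, emitting the screen-management calls instead of DDL.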

  15. Specification of parameters for development of a spatial database for drought monitoring and famine early warning in the African Sahel

    NASA Technical Reports Server (NTRS)

    Rochon, Gilbert L.

    1989-01-01

    Parameters are described for a spatial database to facilitate drought monitoring and famine early warning in the African Sahel. The proposed system, referred to as the African Drought and Famine Information System (ADFIS), is ultimately recommended for implementation with the NASA/FEMA Spatial Analysis and Modeling System (SAMS), a GIS/dynamic modeling software package currently under development. SAMS is derived from FEMA's Integrated Emergency Management Information System (IEMIS) and the Pacific Northwest Laboratory's/Engineering Topographic Laboratory's Airland Battlefield Environment (ALBE) GIS. SAMS is primarily intended for disaster planning and resource management applications within developing countries. Sources of data for the system would include the Developing Economics Branch of the U.S. Dept. of Agriculture, the World Bank, Tulane University School of Public Health and Tropical Medicine's Famine Early Warning Systems (FEWS) Project, USAID's Foreign Disaster Assistance Section, the World Resources Institute, the World Meteorological Institute, the USGS, the UNFAO, UNICEF, and the United Nations Disaster Relief Organization (UNDRO). Satellite imagery would include decadal AVHRR imagery and Normalized Difference Vegetation Index (NDVI) values from 1981 to the present for the African continent, and selected Landsat scenes for the Sudan pilot study. The system is initially conceived for the MicroVAX 2/GPX, running VMS. To facilitate comparative analysis, a global time-series database (1950 to 1987) is included for a basic set of 125 socio-economic variables per country per year. A more detailed database for the Sahelian countries includes soil type, water resources, agricultural production, agricultural imports and exports, food aid, and consumption.
A pilot dataset for the Sudan with over 2,500 variables from the World Bank's ANDREX system, also includes epidemiological data on incidence of kwashiorkor, marasmus, other nutritional deficiencies, and synergistically-related infectious diseases.

  16. The Office of Environmental Management technical reports: a bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-07-01

    The Office of Environmental Management's (EM) technical reports bibliography is an annual publication that contains information on scientific and technical reports sponsored by the Office of Environmental Management and added to the Energy Science and Technology Database from July 1, 1995, through September 30, 1996. This information is divided into the following categories: Focus Areas and Crosscutting Programs. Support Programs, Technology Integration, and International Technology Exchange are now included in the General category. EM's Office of Science and Technology sponsors this bibliography.

  17. SORTEZ: a relational translator for NCBI's ASN.1 database.

    PubMed

    Hart, K W; Searls, D B; Overton, G C

    1994-07-01

    The National Center for Biotechnology Information (NCBI) has created a database collection that includes several protein and nucleic acid sequence databases, a biosequence-specific subset of MEDLINE, as well as value-added information such as links between similar sequences. Information in the NCBI database is modeled in Abstract Syntax Notation 1 (ASN.1), an Open Systems Interconnection protocol designed for exchanging structured data between software applications rather than as a data model for database systems. While the NCBI database is distributed with an easy-to-use information retrieval system, ENTREZ, the ASN.1 data model currently lacks an ad hoc query language for general-purpose data access. For that reason, we have developed a software package, SORTEZ, that transforms the ASN.1 database (or other databases with nested data structures) to a relational data model and subsequently to a relational database management system (Sybase), where information can be accessed through the relational query language SQL. Because the need to transform data from one data model and schema to another arises naturally in several important contexts, including efficient execution of specific applications, access to multiple databases, and adaptation to database evolution, this work also serves as a practical study of the issues involved in the various stages of database transformation. We show that transformation from the ASN.1 data model to a relational data model can be largely automated, but that schema transformation and data conversion require considerable domain expertise and would greatly benefit from additional support tools.
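The nested-to-relational mapping that SORTEZ automates can be sketched as follows, using a toy record in place of NCBI's real ASN.1 structures and SQLite in place of Sybase: each nested list becomes a child table linked to its parent by a key, after which the data are reachable through ordinary SQL:

```python
import sqlite3

# Toy stand-in for a nested ASN.1-style sequence entry; field names
# (accession, muid) are illustrative, not NCBI's actual schema.
entry = {"accession": "U00001", "title": "example sequence",
         "citations": [{"muid": 91001}, {"muid": 92002}]}

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE seq_entry (accession TEXT PRIMARY KEY, title TEXT)")
db.execute("""CREATE TABLE citation (
    accession TEXT REFERENCES seq_entry(accession),
    muid      INTEGER)""")

# Parent row, then one child row per element of the nested list.
db.execute("INSERT INTO seq_entry VALUES (?, ?)",
           (entry["accession"], entry["title"]))
db.executemany("INSERT INTO citation VALUES (?, ?)",
               [(entry["accession"], c["muid"]) for c in entry["citations"]])

# The nested structure is now queryable with an ad hoc SQL join.
muids = [r[0] for r in db.execute(
    """SELECT c.muid FROM seq_entry s
       JOIN citation c ON c.accession = s.accession
       WHERE s.accession = 'U00001' ORDER BY c.muid""")]
```

As the abstract notes, this structural flattening is the automatable part; deciding the schema and converting real data is where domain expertise comes in.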

  18. Functionally Graded Materials Database

    NASA Astrophysics Data System (ADS)

    Kisara, Katsuto; Konno, Tomomi; Niino, Masayuki

    2008-02-01

    The Functionally Graded Materials Database (hereinafter referred to as the FGMs Database) was opened to the public via the Internet in October 2002, and since then it has been managed by the Japan Aerospace Exploration Agency (JAXA). As of October 2006, the database includes 1,703 research information entries, with data on 2,429 researchers, 509 institutions, and so on. Reading materials such as "Applicability of FGMs Technology to Space Plane" and "FGMs Application to Space Solar Power System (SSPS)" were prepared in FY 2004 and 2005, respectively. The English version of "FGMs Application to Space Solar Power System (SSPS)" is now under preparation. This paper explains the FGMs Database, describing the research information data, the sitemap, and how to use it. Based on the access analysis, user access results and users' interests are discussed.

  19. Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.

    ERIC Educational Resources Information Center

    Gutmann, Myron P.; And Others

    1989-01-01

    Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programming tasks involved are not for beginners, but the rewards of having data organized are worthwhile. (GG)

  20. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  1. Development of expert systems for analyzing electronic documents

    NASA Astrophysics Data System (ADS)

    Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.

    2018-05-01

    The paper analyses a Database Management System (DBMS). Expert systems, databases, and database technology have become essential components of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS, consisting of programs that enable users to create and maintain a database, is the most significant tool developed to serve multiple users in a database environment. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans, and Alfresco, together with system components previously developed at Tomsk State University at the Faculty of Innovative Technology.

  2. The HyMeX database

    NASA Astrophysics Data System (ADS)

    Brissebrat, Guillaume; Mastrorillo, Laurence; Ramage, Karim; Boichard, Jean-Luc; Cloché, Sophie; Fleury, Laurence; Klenov, Ludmila; Labatut, Laurent; Mière, Arnaud

    2013-04-01

    The international HyMeX (HYdrological cycle in the Mediterranean EXperiment) project aims at a better understanding and quantification of the hydrological cycle and related processes in the Mediterranean, with emphasis on high-impact weather events, inter-annual to decadal variability of the Mediterranean coupled system, and associated trends in the context of global change. The project includes long term monitoring of environmental parameters, intensive field campaigns, use of satellite data, modelling studies, as well as post event field surveys and value-added products processing. Therefore HyMeX database incorporates various dataset types from different disciplines, either operational or research. The database relies on a strong collaboration between OMP and IPSL data centres. Field data, which are 1D time series, maps or pictures, are managed by OMP team while gridded data (satellite products, model outputs, radar data...) are managed by IPSL team. At present, the HyMeX database contains about 150 datasets, including 80 hydrological, meteorological, ocean and soil in situ datasets, 30 radar datasets, 15 satellite products, 15 atmosphere, ocean and land surface model outputs from operational (re-)analysis or forecasts and from research simulations, and 5 post event survey datasets. The data catalogue complies with international standards (ISO 19115; INSPIRE; Directory Interchange Format; Global Change Master Directory Thesaurus). It includes all the datasets stored in the HyMeX database, as well as external datasets relevant for the project. All the data, whatever the type is, are accessible through a single gateway. The database website http://mistrals.sedoo.fr/HyMeX offers different tools: - A registration procedure which enables any scientist to accept the data policy and apply for a user database account. - A search tool to browse the catalogue using thematic, geographic and/or temporal criteria. 
- Sorted lists of the datasets by thematic keywords, measured parameters, instruments or platform type. - Forms to document observations or products that will be provided to the database. - A shopping-cart web interface to order in situ data files. - FTP facilities to access gridded data. The website will soon offer new facilities. Many in situ datasets have already been homogenized and inserted into a relational database, in order to enable more accurate data selection and the download of different datasets in a shared format. Interoperability between the two data centres will be enhanced by the OpenDAP communication protocol associated with the THREDDS catalogue software, which may also be implemented in other data centres that manage data of interest for the HyMeX project. In order to meet the operational needs of the HyMeX 2012 campaigns, a day-to-day quick-look and report display website has also been developed: http://sop.hymex.org. It offers a convenient way to browse meteorological conditions and data during the campaign periods.
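
    The catalogue search described above filters datasets by thematic, geographic and temporal criteria. A minimal sketch of such a filter, with entirely hypothetical dataset names and extents (not drawn from the actual HyMeX catalogue):

```python
from datetime import date

# Hypothetical catalogue records; the real HyMeX catalogue follows ISO 19115.
catalogue = [
    {"name": "rain-gauge-cevennes", "theme": "hydrology",
     "bbox": (3.0, 43.5, 5.0, 45.0),          # lon_min, lat_min, lon_max, lat_max
     "start": date(2012, 9, 1), "end": date(2012, 11, 30)},
    {"name": "sst-analysis", "theme": "ocean",
     "bbox": (-6.0, 30.0, 36.0, 46.0),
     "start": date(2010, 1, 1), "end": date(2013, 12, 31)},
]

def search(theme=None, point=None, day=None):
    """Return names of catalogue entries matching all given criteria."""
    hits = []
    for rec in catalogue:
        if theme and rec["theme"] != theme:
            continue                           # thematic criterion
        if point:
            lon, lat = point
            x0, y0, x1, y1 = rec["bbox"]
            if not (x0 <= lon <= x1 and y0 <= lat <= y1):
                continue                       # geographic criterion
        if day and not (rec["start"] <= day <= rec["end"]):
            continue                           # temporal criterion
        hits.append(rec["name"])
    return hits

print(search(theme="hydrology", point=(4.0, 44.2), day=date(2012, 10, 15)))
```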

  3. The relational database model and multiple multicenter clinical trials.

    PubMed

    Blumenstein, B A

    1989-12-01

    The Southwest Oncology Group (SWOG) chose a relational database management system (RDBMS) for the management of data from multiple clinical trials because of the relational model's inherent flexibility and the natural way it accommodates multiple entity types (patients, studies, and participants). The tradeoffs of using the relational model compared with the hierarchical model include added computing cycles due to deferred data linkages and added procedural complexity due to the need to implement protections against referential integrity violations. The SWOG uses its RDBMS as a platform on which to build data operations software. This software, written in a compiled computer language, allows multiple users to update the database simultaneously and is interactive with respect to detecting conditions that require action and presenting options for dealing with them. The relational model facilitates the development and maintenance of data operations software.
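
    The referential-integrity protections mentioned above are a generic RDBMS concern: a child row must not reference a nonexistent parent. A minimal sketch using Python's sqlite3, with hypothetical table names rather than SWOG's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked

# Two of the entity types the abstract mentions: studies and patients.
conn.execute("CREATE TABLE study (study_id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("""CREATE TABLE patient (
    patient_id INTEGER PRIMARY KEY,
    study_id   INTEGER NOT NULL REFERENCES study(study_id))""")

conn.execute("INSERT INTO study VALUES (1, 'Phase III trial')")
conn.execute("INSERT INTO patient VALUES (100, 1)")       # valid linkage

try:
    conn.execute("INSERT INTO patient VALUES (101, 99)")  # no study 99 exists
except sqlite3.IntegrityError as exc:
    print("rejected:", exc)   # the violation is caught by the database itself
```

With the constraint declared, the engine rejects the dangling reference, so application code does not have to re-implement this check procedurally.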

  4. Resident database interfaces to the DAVID system, a heterogeneous distributed database management system

    NASA Technical Reports Server (NTRS)

    Moroh, Marsha

    1988-01-01

    A methodology for building interfaces of resident database management systems to a heterogeneous distributed database management system under development at NASA, the DAVID system, was developed. The feasibility of that methodology was demonstrated by construction of the software necessary to perform the interface task. The interface terminology developed in the course of this research is presented. The work performed and the results are summarized.

  5. Software Reviews.

    ERIC Educational Resources Information Center

    Science Software Quarterly, 1984

    1984-01-01

    Provides extensive reviews of computer software, examining documentation, ease of use, performance, error handling, special features, and system requirements. Includes statistics, problem-solving (TK Solver), label printing, database management, experimental psychology, Encyclopedia Britannica biology, and DNA-sequencing programs. A program for…

  6. An Initial Design of ISO 19152:2012 LADM Based Valuation and Taxation Data Model

    NASA Astrophysics Data System (ADS)

    Çağdaş, V.; Kara, A.; van Oosterom, P.; Lemmen, C.; Işıkdağ, Ü.; Kathmann, R.; Stubkjær, E.

    2016-10-01

    A fiscal registry or database is supposed to record the geometric, legal, physical, economic, and environmental characteristics of property units that are subject to immovable property valuation and taxation. Apart from procedural standards, there is no internationally accepted data standard that defines the semantics of fiscal databases. The ISO 19152:2012 Land Administration Domain Model (LADM), an international land administration standard, focuses on legal requirements but considers the specifications of external information systems, including valuation and taxation databases, to be out of scope. However, it provides a formalism which allows for an extension that responds to fiscal requirements. This paper introduces an initial version of a LADM Fiscal Extension Module for the specification of databases used in immovable property valuation and taxation. The extension module is designed to facilitate all stages of immovable property taxation, namely the identification of properties and taxpayers, the assessment of properties through single or mass appraisal procedures, the automatic generation of sales statistics, and the management of tax collection, arrears and appeals. It is expected that the initial version will be refined through further activities of a possible joint working group under FIG Commission 7 (Cadastre and Land Management) and FIG Commission 9 (Valuation and the Management of Real Estate) in collaboration with other relevant international bodies.

  7. Data Base Management: Proceedings of a Conference, November 1-2, 1984 Held at Monterey, California.

    DTIC Science & Technology

    1985-07-31

    Dolby (San Jose State University, San Jose, California): Put the Information in the Database, Not the Program. 4:15 Douglas Lenat: Relevance of Machine...network model permits multiple owners for one subsidiary entity. The DAPLEX network model includes the subset connection as well. The SOCRATE system...

  8. Does exercise improve symptoms in fibromyalgia?

    PubMed

    Rain, Carmen; Seguel, Willy; Vergara, Luis

    2015-12-14

    It has been proposed that fibromyalgia could be managed by pharmacological and non-pharmacological interventions. Regular physical exercise is commonly used as a non-pharmacological intervention. Searching in Epistemonikos database, which is maintained by screening 30 databases, we identified 14 systematic reviews including 25 randomized trials. We combined the evidence using meta-analysis and generated a summary of findings table following the GRADE approach. We conclude that regular physical exercise probably reduces pain in patients with fibromyalgia.

  9. Managers' Support for Employee Wellness Programs: An Integrative Review.

    PubMed

    Passey, Deborah G; Brown, Meagan C; Hammerback, Kristen; Harris, Jeffrey R; Hannon, Peggy A

    2018-01-01

    The aim of this integrative literature review is to synthesize the existing evidence regarding managers' support for employee wellness programs. The search utilized multiple electronic databases and libraries. Inclusion criteria comprised peer-reviewed research published in English, between 1990 and 2016, and examining managers' support in the context of a worksite intervention. The final sample included 21 articles for analysis. Two researchers extracted and described results from each of the included articles using a content analysis. Two researchers independently rated the quality of the included articles. Researchers synthesized data into a summary table by study design, sample, data collected, key findings, and quality rating. Factors that may influence managers' support include their organization's management structure, senior leadership support, their expected roles, training on health topics, and their beliefs and attitudes toward wellness programs and employee health. Managers' support may influence the organizational culture, employees' perception of support, and employees' behaviors. When designing interventions, health promotion practitioners and researchers should consider strategies that target senior, middle, and line managers' support. Interventions need to include explicit measures of managers' support as part of the evaluation plan.

  10. Computer Security Products Technology Overview

    DTIC Science & Technology

    1988-10-01

    3. DATABASE MANAGEMENT SYSTEMS. Definition...this paper addresses fall into the areas of multi-user hosts, database management systems (DBMS), workstations, networks, guards and gateways, and...provide a portion of that protection, for example, a password scheme, a file protection mechanism, a secure database management system, or even a

  11. An Introduction to Database Management Systems.

    ERIC Educational Resources Information Center

    Warden, William H., III; Warden, Bette M.

    1984-01-01

    Description of database management systems for microcomputers highlights system features and factors to consider in microcomputer system selection. A method for ranking database management systems is explained and applied to a defined need, i.e., software support for indexing a weekly newspaper. A glossary of terms and 32-item bibliography are…

  12. Data management with a landslide inventory of the Franconian Alb (Germany) using a spatial database and GIS tools

    NASA Astrophysics Data System (ADS)

    Bemm, Stefan; Sandmeier, Christine; Wilde, Martina; Jaeger, Daniel; Schwindt, Daniel; Terhorst, Birgit

    2014-05-01

    The area of the Swabian-Franconian cuesta landscape (southern Germany) is highly prone to landslides. This became apparent in the late spring of 2013, when numerous landslides occurred as a consequence of heavy, long-lasting rainfall. The specific climatic situation caused extensive damage with serious impacts on settlements and infrastructure. Knowledge of the spatial distribution of landslides, their processes and characteristics is important to evaluate the potential risk posed by mass movements in these areas. In the frame of two projects, about 400 landslides were mapped and detailed data sets were compiled between 2011 and 2014 in the Franconian Alb. The studies are related to the project "Slope stability and hazard zones in the northern Bavarian cuesta" (DFG, German Research Foundation) as well as to the LfU (The Bavarian Environment Agency) within the project "Georisks and climate change - hazard indication map Jura". The central goal of the present study is to create a spatial database for landslides. The database should contain all fundamental parameters to characterize the mass movements and should provide secure data storage and data management, as well as statistical evaluations. The spatial database was created with PostgreSQL, an object-relational database management system, and PostGIS, a spatial database extender for PostgreSQL, which makes it possible to store spatial and geographic objects and to connect to several GIS applications, such as GRASS GIS, SAGA GIS, QGIS and GDAL, a geospatial library (Obe & Hsu 2011). Database access for querying, importing, and exporting spatial and non-spatial data is ensured through GUI or non-GUI connections. The database allows the use of procedural languages for writing advanced functions in R, Python or Perl. It is also possible to work directly with the entire (spatial) data holdings of the database in R. 
The inventory of the database includes, amongst others, information on location, landslide types and causes, geomorphological positions, geometries, hazards and damages, as well as assessments of landslide activity. Furthermore, spatial objects are stored which represent the components of a landslide, in particular the scarps and the accumulation areas. In addition, waterways, map sheets, contour lines, detailed infrastructure data, digital elevation models, and aspect and slope data are included. Examples of spatial queries to the database are intersections of raster and vector data for calculating slope gradients or aspects of landslide areas, for creating multiple overlying sections for the comparison of slopes, and for computing distances to infrastructure or to the nearest receiving drainage. Further queries retrieve information on landslide magnitudes, distribution and clustering, as well as potential correlations with geomorphological or geological conditions. The data management concept in this study can be implemented for any academic, public or private use, because it is independent of any obligatory licenses. The created spatial database offers a platform for interdisciplinary research and socio-economic questions, as well as for landslide susceptibility and hazard indication mapping. Obe, R.O., Hsu, L.S. (2011): PostGIS in Action. Manning Publications, Stamford, 492 pp.
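
    One of the spatial queries mentioned above, the distance from a landslide to the nearest infrastructure, would typically be an ST_Distance call in PostGIS. As a server-free illustration of the underlying geometry, a pure-Python planar sketch with made-up projected coordinates:

```python
import math

def point_segment_distance(p, a, b):
    """Planar distance from point p to the segment a-b (all (x, y) tuples)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:                      # degenerate segment: a single point
        return math.hypot(px - ax, py - ay)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))              # clamp the projection onto the segment
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

# Hypothetical coordinates in metres: a landslide scarp and a road polyline.
scarp = (500.0, 120.0)
road = [(0.0, 0.0), (400.0, 0.0), (800.0, 300.0)]

d = min(point_segment_distance(scarp, road[i], road[i + 1])
        for i in range(len(road) - 1))
print(f"distance to nearest road segment: {d:.1f} m")  # prints 36.0 m
```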

  13. IDESSA: An Integrative Decision Support System for Sustainable Rangeland Management in Southern African Savannas

    NASA Astrophysics Data System (ADS)

    Meyer, Hanna; Authmann, Christian; Dreber, Niels; Hess, Bastian; Kellner, Klaus; Morgenthal, Theunis; Nauss, Thomas; Seeger, Bernhard; Tsvuura, Zivanai; Wiegand, Kerstin

    2017-04-01

    Bush encroachment is a syndrome of land degradation that occurs in many savannas, including those of southern Africa. The increase in density, cover or biomass of woody vegetation often has negative effects on a range of ecosystem functions and services, and these effects are hardly reversible. However, despite its importance, neither the causes of bush encroachment nor the consequences of different resource management strategies to combat or mitigate related shifts in savanna states are fully understood. The project "IDESSA" (An Integrative Decision Support System for Sustainable Rangeland Management in Southern African Savannas) aims to improve the understanding of the complex interplay between land use, climate patterns and vegetation dynamics, and to implement an integrative monitoring and decision-support system for the sustainable management of different savanna types. For this purpose, IDESSA follows an innovative approach that integrates local knowledge, botanical surveys, remote-sensing- and machine-learning-based time series of atmospheric and land-cover dynamics, spatially explicit simulation modeling, and analytical database management. The integration of the heterogeneous data will be implemented in a user-oriented database infrastructure and scientific workflow system. Accessible via web-based interfaces, this database and analysis system will allow scientists to manage and analyze monitoring data and scenario computations, and allow stakeholders (e.g. land users, policy makers) to retrieve current ecosystem information and seasonal outlooks. We present the concept of the project and show preliminary results of the realization steps towards the integrative savanna management and decision-support system.

  14. Economic evaluation of manual therapy for musculoskeletal diseases: a protocol for a systematic review and narrative synthesis of evidence.

    PubMed

    Kim, Chang-Gon; Mun, Su-Jeong; Kim, Ka-Na; Shin, Byung-Cheul; Kim, Nam-Kwen; Lee, Dong-Hyo; Lee, Jung-Han

    2016-05-13

    Manual therapy is the non-surgical conservative management of musculoskeletal disorders using the practitioner's hands on the patient's body for diagnosing and treating disease. The aim of this study is to systematically review trial-based economic evaluations of manual therapy relative to other interventions used for the management of musculoskeletal diseases. Randomised clinical trials (RCTs) on the economic evaluation of manual therapy for musculoskeletal diseases will be included in the review. The following databases will be searched from their inception: Medline, Embase, Cochrane Central Register of Controlled Trials (CENTRAL), Cumulative Index to Nursing and Allied Health Literature (CINAHL), Econlit, Mantis, Index to Chiropractic Literature, Science Citation Index, Social Science Citation Index, Allied and Complementary Medicine Database (AMED), Cochrane Database of Systematic Reviews (CDSR), National Health Service Database of Abstracts of Reviews of Effects (NHS DARE), National Health Service Health Technology Assessment Database (NHS HTA), National Health Service Economic Evaluation Database (NHS EED), five Korean medical databases (Oriental Medicine Advanced Searching Integrated System (OASIS), Research Information Service System (RISS), DBPIA, Korean Traditional Knowledge Portal (KTKP) and KoreaMed) and three Chinese databases (China National Knowledge Infrastructure (CNKI), VIP and Wanfang). The evidence for the cost-effectiveness, cost-utility and cost-benefit of manual therapy for musculoskeletal diseases will be assessed as the primary outcome. Health-related quality of life and adverse effects will be assessed as secondary outcomes. We will critically appraise the included studies using the Cochrane risk of bias tool and the Drummond checklist. Results will be summarised using Slavin's qualitative best-evidence synthesis approach. 
The results of the study will be disseminated via a peer-reviewed journal and/or conference presentations. PROSPERO CRD42015026757. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  15. [Role and management of cancer clinical database in the application of gastric cancer precision medicine].

    PubMed

    Li, Yuanfang; Zhou, Zhiwei

    2016-02-01

    Precision medicine is a new medical concept and medical model based on personalized medicine, the rapid progress of genome sequencing technology, and the cross-application of biological information and big-data science. Precision medicine improves the diagnosis and treatment of gastric cancer through deeper analyses of its characteristics, pathogenesis and other core issues. A cancer clinical database is important for promoting the development of precision medicine, so close attention must be paid to its construction and management. The clinical database of Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank and a medical imaging database. To ensure the good quality of the database, its design and management should follow a strict standard operating procedure (SOP) model. Data sharing is an important way to advance medical research in the era of medical big data, and the construction and management of clinical databases must likewise be strengthened and innovated.

  16. Identification of the condition of crops based on geospatial data embedded in graph databases

    NASA Astrophysics Data System (ADS)

    Idziaszek, P.; Mueller, W.; Górna, K.; Okoń, P.; Boniecki, P.; Koszela, K.; Fojud, A.

    2017-07-01

    The Web application presented here supports plant production and works with the Neo4j graph database to support the assessment of the condition of crops on the basis of geospatial data, including raster and vector data. Adopting a graph database as the tool to store and manage the data, including geospatial data, is well justified for agricultural holdings that have a wide range of crop types and sizes. In addition, the authors tested the option of using Microsoft Cognitive Services within the application, which enables image analysis using the services provided. The application was designed using ASP.NET MVC technology and a wide range of leading IT tools.
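
    In a graph database such as Neo4j, fields, crops and observations would be nodes linked by relationships and queried with Cypher. As a server-free sketch of the same shape, the graph can be modelled with a plain adjacency dict (all node names here are invented, not from the paper):

```python
# Hypothetical field/crop/observation graph. In Neo4j this would be nodes with
# relationships, queried via Cypher, e.g.:
#   MATCH (f:Field)-[:HAS_CROP]->(c:Crop {name: 'wheat'}) RETURN f
edges = {
    "field-A":  ["wheat", "rapeseed"],
    "field-B":  ["wheat"],
    "wheat":    ["ndvi-2017-06", "raster-2017-07"],   # crop -> observations
    "rapeseed": ["ndvi-2017-05"],
}

def fields_growing(crop):
    """Graph-style lookup: which field nodes have an edge to the given crop?"""
    return sorted(f for f, targets in edges.items()
                  if crop in targets and f.startswith("field"))

def observations_of(crop):
    """Follow the crop node's outgoing edges to its observation nodes."""
    return edges.get(crop, [])

print(fields_growing("wheat"))      # ['field-A', 'field-B']
print(observations_of("rapeseed"))  # ['ndvi-2017-05']
```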

  17. Ant-App-DB: a smart solution for monitoring arthropods activities, experimental data management and solar calculations without GPS in behavioral field studies.

    PubMed

    Ahmed, Zeeshan; Zeeshan, Saman; Fleischmann, Pauline; Rössler, Wolfgang; Dandekar, Thomas

    2014-01-01

    Field studies on arthropod ecology and behaviour require simple and robust monitoring tools, preferably with direct access to an integrated database. We have developed and here present a database tool allowing smart-phone based monitoring of arthropods. This smart phone application provides an easy solution to collect, manage and process the data in the field which has been a very difficult task for field biologists using traditional methods. To monitor our example species, the desert ant Cataglyphis fortis, we considered behavior, nest search runs, feeding habits and path segmentations including detailed information on solar position and azimuth calculation, ant orientation and time of day. For this we established a user friendly database system integrating the Ant-App-DB with a smart phone and tablet application, combining experimental data manipulation with data management and providing solar position and timing estimations without any GPS or GIS system. Moreover, the new desktop application Dataplus allows efficient data extraction and conversion from smart phone application to personal computers, for further ecological data analysis and sharing. All features, software code and database as well as Dataplus application are made available completely free of charge and sufficiently generic to be easily adapted to other field monitoring studies on arthropods or other migratory organisms. The software applications Ant-App-DB and Dataplus described here are developed using the Android SDK, Java, XML, C# and SQLite Database.
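
    The abstract notes that Ant-App-DB is backed by an SQLite database. A minimal sketch of the kind of observation table such a field-monitoring app might keep; the columns here are hypothetical illustrations, not the published Ant-App-DB schema:

```python
import sqlite3
from datetime import datetime

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE observation (
    obs_id            INTEGER PRIMARY KEY,
    ant_id            TEXT NOT NULL,   -- individually marked ant
    behaviour         TEXT NOT NULL,   -- e.g. 'nest search run', 'feeding'
    heading_deg       REAL,            -- ant orientation
    solar_azimuth_deg REAL,            -- computed from time of day, no GPS
    observed_at       TEXT NOT NULL)""")

db.execute(
    "INSERT INTO observation VALUES (NULL, 'CF-042', 'nest search run', 135.0, 210.4, ?)",
    (datetime(2013, 7, 14, 14, 30).isoformat(),))

for row in db.execute("SELECT ant_id, behaviour FROM observation"):
    print(row)
```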

  18. Ant-App-DB: a smart solution for monitoring arthropods activities, experimental data management and solar calculations without GPS in behavioral field studies

    PubMed Central

    Ahmed, Zeeshan; Zeeshan, Saman; Fleischmann, Pauline; Rössler, Wolfgang; Dandekar, Thomas

    2015-01-01

    Field studies on arthropod ecology and behaviour require simple and robust monitoring tools, preferably with direct access to an integrated database. We have developed and here present a database tool allowing smart-phone based monitoring of arthropods. This smart phone application provides an easy solution to collect, manage and process the data in the field which has been a very difficult task for field biologists using traditional methods. To monitor our example species, the desert ant Cataglyphis fortis, we considered behavior, nest search runs, feeding habits and path segmentations including detailed information on solar position and azimuth calculation, ant orientation and time of day. For this we established a user friendly database system integrating the Ant-App-DB with a smart phone and tablet application, combining experimental data manipulation with data management and providing solar position and timing estimations without any GPS or GIS system. Moreover, the new desktop application Dataplus allows efficient data extraction and conversion from smart phone application to personal computers, for further ecological data analysis and sharing. All features, software code and database as well as Dataplus application are made available completely free of charge and sufficiently generic to be easily adapted to other field monitoring studies on arthropods or other migratory organisms. The software applications Ant-App-DB and Dataplus described here are developed using the Android SDK, Java, XML, C# and SQLite Database. PMID:25977753

  19. A phenome database (NEAUHLFPD) designed and constructed for broiler lines divergently selected for abdominal fat content.

    PubMed

    Li, Min; Dong, Xiang-yu; Liang, Hao; Leng, Li; Zhang, Hui; Wang, Shou-zhi; Li, Hui; Du, Zhi-Qiang

    2017-05-20

    The effective management and analysis of precisely recorded phenotypic traits are important components of the selection and breeding of superior livestock. Over two decades, we divergently selected chicken lines for abdominal fat content at Northeast Agricultural University (Northeast Agricultural University High and Low Fat, NEAUHLF), and collected a large volume of phenotypic data related to the investigation of the molecular genetic basis of adipose tissue deposition in broilers. To effectively and systematically store, manage and analyze these phenotypic data, we built the NEAUHLF Phenome Database (NEAUHLFPD). NEAUHLFPD includes the following phenotypic records: pedigree (generations 1-19) and 29 phenotypes, such as body sizes and weights, carcass traits and their corresponding rates. The design and construction of NEAUHLFPD were executed as follows: (1) Framework design. We used Apache as our web server, MySQL and Navicat as database management tools, and PHP as the HTML-embedded language to create a dynamic interactive website. (2) Structural components. The main interface gives a detailed introduction to the composition, function, and index buttons of the basic structure of the database. The functional modules of NEAUHLFPD have two main components: the first module is the physical storage space for phenotypic data, in which functional manipulations of the data can be carried out, such as indexing, filtering, range-setting and searching; the second module handles the calculation of basic descriptive statistics, where data filtered from the database can be used to compute basic statistical parameters with simultaneous conditional sorting. 
NEAUHLFPD can be used to effectively store and manage not only phenotypic but also genotypic and genomic data, which can facilitate further investigation of the molecular genetic basis of chicken adipose tissue growth and development, and expedite the selection and breeding of broilers with low fat content.
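
    The second module described above computes basic descriptive statistics over records filtered from the database. A minimal sketch of that step with invented phenotype values (not NEAUHLF data):

```python
from statistics import mean, stdev

# Hypothetical phenotype records: (bird_id, selection line, abdominal fat in g)
records = [
    ("b01", "high-fat", 95.2), ("b02", "high-fat", 110.8), ("b03", "high-fat", 101.5),
    ("b04", "low-fat", 32.1),  ("b05", "low-fat", 28.7),   ("b06", "low-fat", 35.4),
]

def describe(line):
    """Filter by selection line, then return basic descriptive statistics."""
    values = [fat for _, l, fat in records if l == line]
    return {"n": len(values), "mean": round(mean(values), 2),
            "sd": round(stdev(values), 2), "min": min(values), "max": max(values)}

print(describe("high-fat"))
print(describe("low-fat"))
```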

  20. Major accident prevention through applying safety knowledge management approach.

    PubMed

    Kalatpour, Omid

    2016-01-01

    Many scattered resources of knowledge are available for chemical accident prevention purposes. The common approach to process safety management, including using databases and referring to the available knowledge, has some drawbacks. The main goal of this article was to devise a new, merged knowledge base (KB) for the chemical accident prevention domain. The scattered sources of safety knowledge were identified and scanned, and the collected knowledge was then formalized through a computerized program. The Protégé software was used to formalize and represent the stored safety knowledge. Domain knowledge was retrieved, as well as data and information. This optimized approach improved the safety and health knowledge management (KM) process and resolved some typical problems in the KM process. Upgrading traditional safety databases into KBs can improve the interaction between users and the knowledge repository.

  1. The effect of cupping therapy for low back pain: A meta-analysis based on existing randomized controlled trials.

    PubMed

    Wang, Yun-Ting; Qi, Yong; Tang, Fu-Yong; Li, Fei-Meng; Li, Qi-Huo; Xu, Chang-Peng; Xie, Guo-Ping; Sun, Hong-Tao

    2017-11-06

    Low back pain (LBP) is one of the most common symptoms, with a high prevalence throughout the world. Conflicting conclusions exist among randomized controlled trials (RCTs) on cupping for LBP. The aim was to assess the effects and safety of cupping for patients with LBP. The PubMed, Cochrane Library and Embase databases were electronically searched. RCTs reporting on cupping for patients with LBP were included. The meta-analysis was conducted using Review Manager software (version 5.3, Nordic Cochrane Centre). The primary outcome was VAS scores. The secondary outcomes included ODI scores, MPPI scores and complications. Six RCTs were included in this synthesized analysis. The results showed that cupping therapy was superior to control management with respect to VAS scores (SMD: -0.73, [95% CI: -1.42 to -0.04]; P = 0.04) and ODI scores (SMD: -3.64, [95% CI: -5.85 to -1.42]; P = 0.001). There was no statistically significant difference with regard to MPPI scores. No serious adverse events were reported in the included studies. Cupping therapy can significantly decrease VAS and ODI scores for patients with LBP compared to control management. The high heterogeneity and risk of bias in the included studies limit the reliability of the findings.
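
    The pooled SMDs above were produced with Review Manager. The basic inverse-variance pooling step underlying such a meta-analysis (shown here as a fixed-effect sketch; RevMan also offers random-effects models) can be illustrated with made-up trial values, not those of the six included RCTs:

```python
import math

# Hypothetical per-trial standardized mean differences and their standard errors.
trials = [(-0.90, 0.25), (-0.40, 0.30), (-0.75, 0.20)]

# Fixed-effect inverse-variance pooling: weight each trial by 1/SE^2.
weights = [1.0 / se ** 2 for _, se in trials]
pooled = sum(w * smd for (smd, _), w in zip(trials, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

print(f"pooled SMD = {pooled:.2f}, 95% CI ({ci[0]:.2f} to {ci[1]:.2f})")
```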

  2. 78 FR 15707 - Fisheries of the Atlantic and Gulf of Mexico; Southeast Data, Assessment, and Review (SEDAR...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-12

    ... Southeast Data, Assessment and Review (SEDAR) process, a multi-step method for determining the status of... Center. Participants include: data collectors and database managers; stock assessment scientists...

  3. Library Automation.

    ERIC Educational Resources Information Center

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  4. Towards evidence-based management: creating an informative database of nursing-sensitive indicators.

    PubMed

    Patrician, Patricia A; Loan, Lori; McCarthy, Mary; Brosch, Laura R; Davey, Kimberly S

    2010-12-01

    The purpose of this paper is to describe the creation, evolution, and implementation of a database of nursing-sensitive and potentially nursing-sensitive indicators, the Military Nursing Outcomes Database (MilNOD). It discusses data quality, utility, and lessons learned. Prospective data collected each shift include direct staff hours by levels (i.e., registered nurse, other licensed and unlicensed providers), staff categories (i.e., military, civilian, contract, and reservist), patient census, acuity, and admissions, discharges, and transfers. Retrospective adverse event data (falls, medication errors, and needle-stick injuries) were collected from existing records. Annual patient satisfaction, nurse work environment, and pressure ulcer and restraint prevalence surveys were conducted. The MilNOD contains shift level data from 56 units in 13 military hospitals and is used to target areas for managerial and clinical performance improvement. This methodology can be modified for use in other healthcare systems. As standard tools for evidence-based management, databases such as MilNOD allow nurse leaders to track the status of nursing and adverse events in their facilities. No claim to original US government works.

  5. Experiment Management System for the SND Detector

    NASA Astrophysics Data System (ADS)

    Pugachev, K.

    2017-10-01

    We present a new experiment management system for the SND detector at the VEPP-2000 collider (Novosibirsk). An important part of the system is access to the experimental databases (configuration, conditions and metadata). The system is designed in a client-server architecture, with user interaction through a web interface. The server side includes several logical layers: user-interface templates; template-variable description and initialization; and implementation details. The templates are meant to require as little IT knowledge as possible. Experiment configuration, conditions and metadata are stored in a database. Node.js, a modern JavaScript runtime, was chosen to implement the server side, and a new template engine with an interesting feature was designed. Part of the system has been put into production, including templates for showing and editing the first-level trigger configuration and the equipment configuration, as well as for showing experiment metadata and the experiment conditions data index.
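
    The template/variable split described above can be illustrated with Python's standard-library string.Template (the template text and values here are invented, not taken from the SND system):

```python
from string import Template

# Layer 1: a user-interface template, writable without programming knowledge.
page = Template("Run $run_id: trigger config '$trigger', recorded $events events")

# Layer 2: template-variable initialization, e.g. values fetched from the
# experiment conditions database.
context = {"run_id": 4711, "trigger": "FLT-v2", "events": 1250000}

# Layer 3: implementation detail - substitution happens on the server side.
print(page.substitute(context))
```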

  6. National launch strategy vehicle data management system

    NASA Technical Reports Server (NTRS)

    Cordes, David

    1990-01-01

    The national launch strategy vehicle data management system (NLS/VDMS) was developed as part of the 1990 NASA Summer Faculty Fellowship Program. The system was developed under the guidance of the Engineering Systems Branch of the Information Systems Office, and is intended for use within the Program Development Branch PD34. The NLS/VDMS is an on-line database system that permits the tracking of various launch vehicle configurations within the program development office. The system is designed to permit the definition of new launch vehicles, as well as the display and editing of existing launch vehicles. Vehicles can be grouped in logical architectures within the system. Reports generated from this package include vehicle data sheets, architecture data sheets, and vehicle flight rate reports. The topics covered include: (1) system overview; (2) initial system development; (3) the SuperCard hypermedia authoring system; (4) the Oracle database; and (5) system evaluation.

  7. Database Systems. Course Three. Information Systems Curriculum.

    ERIC Educational Resources Information Center

    O'Neil, Sharon Lund; Everett, Donna R.

    This course is the third of seven in the Information Systems curriculum. The purpose of the course is to familiarize students with database management concepts and standard database management software. Databases and their roles, advantages, and limitations are explained. An overview of the course sets forth the condition and performance standard…

  8. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  9. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  10. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  11. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  12. Ridge 2000 Data Management System

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.; Carbotte, S. M.; Arko, R. A.; Haxby, W. F.; Ryan, W. B.; Chayes, D. N.; Lehnert, K. A.; Shank, T. M.

    2005-12-01

    Hosted at Lamont by the marine geoscience Data Management group, mgDMS, the NSF-funded Ridge 2000 electronic database, http://www.marine-geo.org/ridge2000/, is a key component of the Ridge 2000 multi-disciplinary program. The database covers each of the three Ridge 2000 Integrated Study Sites: Endeavour Segment, Lau Basin, and 8-11N Segment. It promotes the sharing of information with the broader community, facilitates integration of the suite of information collected at each study site, and enables comparisons between sites. The Ridge 2000 data system provides easy web access to a relational database that is built around a catalogue of cruise metadata. Any web browser can be used to perform a versatile text-based search which returns basic cruise and submersible dive information, sample and data inventories, navigation, and other relevant metadata such as shipboard personnel and links to NSF program awards. In addition, non-proprietary data files, images, and derived products which are hosted locally or in national repositories, as well as science and technical reports, can be freely downloaded. On the Ridge 2000 database page, our Data Link allows users to search the database using a broad range of parameters including data type, cruise ID, chief scientist, and geographical location. The first Ridge 2000 field programs sailed in 2004 and, in addition to numerous data sets collected prior to the Ridge 2000 program, the database currently contains information on fifteen Ridge 2000-funded cruises and almost sixty Alvin dives. Track lines can be viewed using a recently-implemented Web Map Service button labelled Map View. The Ridge 2000 database is fully integrated with databases hosted by the mgDMS group for MARGINS and the Antarctic multibeam and seismic reflection data initiatives. Links are provided to partner databases including PetDB, SIOExplorer, and the ODP Janus system.
Improved inter-operability with existing and new partner repositories continues to be strengthened. One major effort involves the gradual unification of the metadata across these partner databases. Standardised electronic metadata forms that can be filled in at sea are available from our web site. Interactive map-based exploration and visualisation of the Ridge 2000 database is provided by GeoMapApp, a freely-available Java(tm) application being developed within the mgDMS group. GeoMapApp includes high-resolution bathymetric grids for the 8-11N EPR segment and allows customised maps and grids for any of the Ridge 2000 ISS to be created. Vent and instrument locations can be plotted and saved as images, and Alvin dive photos are also available.

  13. Employing the Intelligence Cycle Process Model Within the Homeland Security Enterprise

    DTIC Science & Technology

    2013-12-01

    the Iraq anti-war movement, a former U.S. Congresswoman, the U.S. Treasury Department and hip hop bands to spread Sharia law in the U.S. A Virginia...challenges remain with threat notification, access to information, and database management of information that may have contributed the 2013 Boston...The FBI said it took a number of investigative steps to check on the request, including looking at his travel history, checking databases for

  14. Coordination and Data Management of the International Arctic Buoy Program

    DTIC Science & Technology

    1997-09-30

    which can drive sea ice models, and for input into climate change studies. Recent research using the IABP databases includes back and forward trajectory...present. Figure 2 shows the mean annual field of ice motion and sea level pressure. APPROACH Coordination of the IABP falls into the categories of...products of the IABP are now also available on the World Wide Web. Our recent efforts to improve the database have been directed towards producing a

  15. Fish Karyome: A karyological information network database of Indian Fishes.

    PubMed

    Nagpure, Naresh Sahebrao; Pathak, Ajey Kumar; Pati, Rameshwar; Singh, Shri Prakash; Singh, Mahender; Sarkar, Uttam Kumar; Kushwaha, Basdeo; Kumar, Ravindra

    2012-01-01

    'Fish Karyome', a database of karyological information on Indian fishes, has been developed that serves as a central source for karyotype data about Indian fishes compiled from the published literature. Fish Karyome is intended to serve as a liaison tool for researchers; it contains karyological information on 171 of the 2438 finfish species reported in India and is publicly available via the World Wide Web. The database provides information on chromosome number, morphology, sex chromosomes, karyotype formula, cytogenetic markers, etc. Additionally, it provides phenotypic information that includes species name, classification, locality of sample collection, common name, local name, sex, geographical distribution, and IUCN Red List status. Fish and karyotype images and references for the 171 finfish species have also been included in the database. Fish Karyome was developed using SQL Server 2008, a relational database management system, Microsoft's ASP.NET 2008 and Macromedia's Flash technology under the Windows 7 operating environment. The system also enables users to input new information and images into the database, and to search and view information and images of interest using various search options. Fish Karyome has a wide range of applications in species characterization and identification, sex determination, chromosomal mapping, karyo-evolution and the systematics of fishes.

  16. Software for Managing Inventory of Flight Hardware

    NASA Technical Reports Server (NTRS)

    Salisbury, John; Savage, Scott; Thomas, Shirman

    2003-01-01

    The Flight Hardware Support Request System (FHSRS) is a computer program that relieves engineers at Marshall Space Flight Center (MSFC) of most of the non-engineering administrative burden of managing an inventory of flight hardware. The FHSRS can also be adapted to perform similar functions for other organizations. The FHSRS affords a combination of capabilities, including those formerly provided by three separate programs in purchasing, inventorying, and inspecting hardware. The FHSRS provides a Web-based interface with a server computer that supports a relational database of inventory; electronic routing of requests and approvals; and electronic documentation from initial request through implementation of quality criteria, acquisition, receipt, inspection, storage, and final issue of flight materials and components. The database lists both hardware acquired for current projects and residual hardware from previous projects. The increased visibility of residual flight components provided by the FHSRS has dramatically improved the re-utilization of materials in lieu of new procurements, resulting in a cost savings of over $1.7 million. The FHSRS includes subprograms for manipulating the data in the database, informing of the status of a request or an item of hardware, and searching the database on any physical or other technical characteristic of a component or material. The software structure forces normalization of the data to facilitate inquiries and searches for which users have entered mixed or inconsistent values.
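
The forced normalization mentioned at the end of the abstract can be pictured as reducing every user-entered identifier to one canonical key before storage and search. The rule below (uppercase, alphanumerics only) is an assumed example; the actual FHSRS normalization rules are not described in the source:

```python
def normalize_term(value: str) -> str:
    """Reduce a part/material identifier to a canonical key so that
    inconsistently typed values ('ms90505- 1', 'MS90505-1') resolve to
    the same inventory record."""
    return "".join(ch for ch in value.upper() if ch.isalnum())

print(normalize_term("  ms90505- 1 "))  # MS905051
```

Applying the same rule at data entry and at query time is what makes searches over mixed or inconsistent values reliable.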

  17. The GermOnline cross-species systems browser provides comprehensive information on genes and gene products relevant for sexual reproduction.

    PubMed

    Gattiker, Alexandre; Niederhauser-Wiederkehr, Christa; Moore, James; Hermida, Leandro; Primig, Michael

    2007-01-01

    We report a novel release of the GermOnline knowledgebase covering genes relevant for the cell cycle, gametogenesis and fertility. GermOnline was extended into a cross-species systems browser including information on DNA sequence annotation, gene expression and the function of gene products. The database covers eight model organisms and Homo sapiens, for which complete genome annotation data are available. The database is now built around a sophisticated genome browser (Ensembl), our own microarray information management and annotation system (MIMAS) used to extensively describe experimental data obtained with high-density oligonucleotide microarrays (GeneChips) and a comprehensive system for online editing of database entries (MediaWiki). The RNA data include results from classical microarrays as well as tiling arrays that yield information on RNA expression levels, transcript start sites and lengths as well as exon composition. Members of the research community are solicited to help GermOnline curators keep database entries on genes and gene products complete and accurate. The database is accessible at http://www.germonline.org/.

  18. TRENDS: A flight test relational database user's guide and reference manual

    NASA Technical Reports Server (NTRS)

    Bondi, M. J.; Bjorkman, W. S.; Cross, J. L.

    1994-01-01

    This report is designed to be a user's guide and reference manual for users intending to access rotorcraft test data via TRENDS, the relational database system which was developed as a tool for the aeronautical engineer with no programming background. This report has been written to assist novice and experienced TRENDS users. TRENDS is a complete system for retrieving, searching, and analyzing both numerical and narrative data, and for displaying time history and statistical data in graphical and numerical formats. This manual provides a 'guided tour' and a 'user's guide' for new and intermediate-skilled users. Examples of the use of each menu item within TRENDS are provided in the Menu Reference section of the manual, including full coverage of TIMEHIST, one of the key tools. This manual is written around the XV-15 Tilt Rotor database, but does include an appendix on the UH-60 Blackhawk database. This user's guide and reference manual establishes a referable source for the research community and augments NASA TM-101025, TRENDS: The Aeronautical Post-Test Database Management System, Jan. 1990, written by the same authors.

  19. Migration of legacy mumps applications to relational database servers.

    PubMed

    O'Kane, K C

    2001-07-01

    An extended implementation of the Mumps language is described that facilitates vendor-neutral migration of legacy Mumps applications to SQL-based relational database servers. Implemented as a compiler, this system translates Mumps programs to operating-system-independent, standard C code for subsequent compilation to fully stand-alone, binary executables. Added built-in functions and support modules extend the native hierarchical Mumps database with access to industry-standard, networked, relational database management servers (RDBMS), thus freeing Mumps applications from dependence upon vendor-specific, proprietary, unstandardized database models. Unlike Mumps systems that have added captive, proprietary RDBMS access, the programs generated by this development environment can be used with any RDBMS system that supports common network access protocols. Additional features include a built-in web server interface and the ability to interoperate directly with programs and functions written in other languages.
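
The heart of such a migration is representing hierarchical Mumps globals in relational tables. Below is a minimal sketch of one common flattening, using SQLite as a stand-in for a networked RDBMS; the (name, subscript-path, value) layout is an assumption for illustration, not the compiler's documented scheme:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE globals (name TEXT, subscripts TEXT, value TEXT, "
             "PRIMARY KEY (name, subscripts))")

def set_node(name, subscripts, value):
    """Rough equivalent of Mumps SET ^name(subscripts)=value."""
    path = "\x1f".join(str(s) for s in subscripts)  # \x1f = unit separator
    conn.execute("INSERT OR REPLACE INTO globals VALUES (?, ?, ?)",
                 (name, path, str(value)))

def get_node(name, subscripts):
    """Rough equivalent of $GET(^name(subscripts)): '' when undefined."""
    path = "\x1f".join(str(s) for s in subscripts)
    row = conn.execute(
        "SELECT value FROM globals WHERE name = ? AND subscripts = ?",
        (name, path)).fetchone()
    return row[0] if row else ""

set_node("PATIENT", (42, "NAME"), "DOE,JOHN")
print(get_node("PATIENT", (42, "NAME")))  # DOE,JOHN
```

Because every access goes through parameterized SQL, the same calls work against any server speaking standard network protocols, which is the vendor neutrality the abstract emphasizes.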

  20. The HITRAN 2008 Molecular Spectroscopic Database

    NASA Technical Reports Server (NTRS)

    Rothman, Laurence S.; Gordon, Iouli E.; Barbe, Alain; Benner, D. Chris; Bernath, Peter F.; Birk, Manfred; Boudon, V.; Brown, Linda R.; Campargue, Alain; Champion, J.-P.; hide

    2009-01-01

    This paper describes the status of the 2008 edition of the HITRAN molecular spectroscopic database. The new edition is the first official public release since the 2004 edition, although a number of crucial updates had been made available online since 2004. The HITRAN compilation consists of several components that serve as input for radiative-transfer calculation codes: individual line parameters for the microwave through visible spectra of molecules in the gas phase; absorption cross-sections for molecules having dense spectral features, i.e., spectra in which the individual lines are not resolved; individual line parameters and absorption cross-sections for bands in the ultraviolet; refractive indices of aerosols; tables and files of general properties associated with the database; and database management software. The line-by-line portion of the database contains spectroscopic parameters for forty-two molecules, including many of their isotopologues.

  1. Combining new technologies for effective collection development: a bibliometric study using CD-ROM and a database management program.

    PubMed Central

    Burnham, J F; Shearer, B S; Wall, J C

    1992-01-01

    Librarians have used bibliometrics for many years to assess collections and to provide data for making selection and deselection decisions. With the advent of new technology--specifically, CD-ROM databases and reprint file database management programs--new cost-effective procedures can be developed. This paper describes a recent multidisciplinary study conducted by two library faculty members and one allied health faculty member to test a bibliometric method that used the MEDLINE and CINAHL databases on CD-ROM and the Papyrus database management program to produce a new collection development methodology. PMID:1600424

  2. SU-G-TeP4-06: An Integrated Application for Radiation Therapy Treatment Plan Directives, Management, and Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matuszak, M; Anderson, C; Lee, C

    Purpose: With electronic medical records, patient information for the treatment planning process has become disseminated across multiple applications with limited quality control and many associated failure modes. We present the development of a single application with a centralized database to manage the planning process. Methods: The system was designed to replace current functionalities of (i) static directives representing the physician intent for the prescription and planning goals, localization information for delivery, and other information, (ii) planning objective reports, (iii) localization and image guidance documents, and (iv) the official radiation therapy prescription in the medical record. Using the Eclipse Scripting Application Programming Interface, a plug-in script with an associated domain-specific SQL Server database was created to manage the information in (i)–(iv). The system’s user interface and database were designed by a team of physicians, clinical physicists, database experts, and software engineers to ensure usability and robustness for clinical use. Results: The resulting system has been fully integrated within the TPS via a custom script and database. Planning scenario templates, version control, approvals, and logic-based quality control allow this system to fully track and document the planning process as well as physician approval of tradeoffs while improving the consistency of the data. Multiple plans and prescriptions are supported along with non-traditional dose objectives and evaluation such as biologically corrected models, composite dose limits, and management of localization goals. User-specific custom views were developed for the attending physician review, physicist plan checks, treating therapists, and peer review in chart rounds.
Conclusion: A method was developed to maintain cohesive information throughout the planning process within one integrated system by using a custom treatment planning management application that interfaces directly with the TPS. Future work includes quantifying the improvements in quality, safety and efficiency that are possible with the routine clinical use of this system. Supported in part by NIH-P01-CA-059827.

  3. Nuclear science abstracts (NSA) database 1948--1974 (on the Internet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Nuclear Science Abstracts (NSA) is a comprehensive abstract and index collection of the International Nuclear Science and Technology literature for the period 1948 through 1976. Included are scientific and technical reports of the US Atomic Energy Commission, US Energy Research and Development Administration and its contractors, other agencies, universities, and industrial and research organizations. Coverage of the literature since 1976 is provided by the Energy Science and Technology Database. Approximately 25% of the records in the file contain abstracts. These are from the following volumes of the print Nuclear Science Abstracts: Volumes 12--18, Volume 29, and Volume 33. The database contains over 900,000 bibliographic records. All aspects of nuclear science and technology are covered, including: Biomedical Sciences; Metals, Ceramics, and Other Materials; Chemistry; Nuclear Materials and Waste Management; Environmental and Earth Sciences; Particle Accelerators; Engineering; Physics; Fusion Energy; Radiation Effects; Instrumentation; Reactor Technology; Isotope and Radiation Source Technology. The database includes all records contained in Volume 1 (1948) through Volume 33 (1976) of the printed version of Nuclear Science Abstracts (NSA). This worldwide coverage includes books, conference proceedings, papers, patents, dissertations, engineering drawings, and journal literature. This database is now available for searching through the GOV. Research Center (GRC) service. GRC is a single online web-based search service to well known Government databases. Featuring powerful search and retrieval software, GRC is an important research tool. The GRC web site is at http://grc.ntis.gov.

  4. Scale out databases for CERN use cases

    NASA Astrophysics Data System (ADS)

    Baranowski, Zbigniew; Grzybek, Maciej; Canali, Luca; Lanza Garcia, Daniel; Surdy, Kacper

    2015-12-01

    Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log database.
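
The scale-out model under test rests on spreading one large table over many commodity nodes by hashing a key. A toy sketch of that placement step follows; the row layout and four-node cluster are invented for illustration, not CERN's actual schema:

```python
import hashlib

def partition_for(key, n_partitions):
    """Map a row key to a partition with a stable hash, so every writer
    and reader agrees on placement (Python's built-in hash() is
    randomized per process, hence hashlib)."""
    digest = hashlib.sha1(str(key).encode()).digest()
    return int.from_bytes(digest[:8], "big") % n_partitions

# Hypothetical accelerator-log rows keyed by timestamp:
rows = [("2015-06-01T12:00:00", "dipole_current", 11850.2),
        ("2015-06-01T12:00:01", "dipole_current", 11850.4)]
placement = {row: partition_for(row[0], 4) for row in rows}
```

Since placement depends only on the key, each node can load and query its own slice independently, which is what lets data loading and query work scale with the cluster size.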

  5. Global Play Evaluation TOol (GPETO) assists Mobil explorationists with play evaluation and ranking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Withers, K.D.; Brown, P.J.; Clary, R.C.

    1996-01-01

    GPETO is a relational database and application containing information about over 2500 plays around the world. It also has information about approximately 30,000 fields and the related provinces. The GPETO application has been developed to assist Mobil geoscientists, planners and managers with global play evaluations and portfolio management. The main features of GPETO allow users to: (1) view or modify play and province information, (2) composite user-specified plays in a statistically valid way, (3) view threshold information for plays and provinces, including curves, (4) examine field size data, including discovered, future and ultimate field sizes for provinces and plays, (5) use a database browser to look up and validate data by geographic, volumetric, technical and business criteria, (6) display ranged values and graphical displays of future and ultimate potential for plays, provinces, countries, and continents, (7) run, view and print a number of informative reports containing input and output data from the system. The GPETO application is written in C and Fortran, runs on a UNIX-based system, utilizes an Ingres database, and was implemented using a 3-tiered client/server architecture.

  7. The Admissions Office Goes Scientific.

    ERIC Educational Resources Information Center

    Bryant, Peter; Crockett, Kevin

    1993-01-01

    Data-based planning and management is revolutionizing college student recruitment. Data analysis focuses on historical trends, marketing and recruiting strategies, cost-effectiveness strategy, and markets. Data sources include primary market demographics, geo-demographics, secondary sources, student price response information, and institutional…

  8. Microcomputers in Libraries: The Quiet Revolution.

    ERIC Educational Resources Information Center

    Boss, Richard

    1985-01-01

    This article defines three separate categories of microcomputers--personal, desk-top, multi-user devices--and relates storage capabilities (expandability, floppy disks) to library applications. Highlights include de facto standards, operating systems, database management systems, applications software, circulation control systems, dumb and…

  9. Publish or perish: Scientists must write or How do I climb the paper mountain?

    USDA-ARS?s Scientific Manuscript database

    This will be an interactive workshop for scientists discussing strategies for improving writing efficiency. Topics covered include database search engines, reference managing software, authorship, journal determination, writing tips and good writing habits....

  10. Computer Literacy for Teachers.

    ERIC Educational Resources Information Center

    Sarapin, Marvin I.; Post, Paul E.

    Basic concepts of computer literacy are discussed as they relate to industrial arts/technology education. Computer hardware development is briefly examined, and major software categories are defined, including database management, computer graphics, spreadsheet programs, telecommunications and networking, word processing, and computer assisted and…

  11. Executive Summaries: CIL '90.

    ERIC Educational Resources Information Center

    Elsweiler, John A., Jr.; And Others

    1990-01-01

    Presents summaries of 12 papers presented at the 1990 Computers in Libraries Conference. Topics discussed include online searching; microcomputer-based serials management; microcomputer-based workstations; online public access catalogs (OPACs); multitype library networking; CD-ROM searches; locally mounted online databases; collection evaluation;…

  12. Development of a functional, internet-accessible department of surgery outcomes database.

    PubMed

    Newcomb, William L; Lincourt, Amy E; Gersin, Keith; Kercher, Kent; Iannitti, David; Kuwada, Tim; Lyons, Cynthia; Sing, Ronald F; Hadzikadic, Mirsad; Heniford, B Todd; Rucho, Susan

    2008-06-01

    The need for surgical outcomes data is increasing due to pressure from insurance companies, patients, and the need for surgeons to keep their own "report card". Current data management systems are limited by an inability to stratify outcomes based on patients, surgeons, and differences in surgical technique. Surgeons along with research and informatics personnel from an academic, hospital-based Department of Surgery and a state university's Department of Information Technology formed a partnership to develop a dynamic, internet-based, clinical data warehouse. A five-component model was used: data dictionary development, web application creation, participating center education and management, statistics applications, and data interpretation. A data dictionary was developed from a list of data elements to address needs of research, quality assurance, industry, and centers of excellence. A user-friendly web interface was developed with menu-driven check boxes, multiple electronic data entry points, direct downloads from hospital billing information, and web-based patient portals. Data were collected on a Health Insurance Portability and Accountability Act-compliant server with a secure firewall. Protected health information was de-identified. Data management strategies included automated auditing, on-site training, a trouble-shooting hotline, and Institutional Review Board oversight. Real-time, daily, monthly, and quarterly data reports were generated. Fifty-eight publications and 109 abstracts have been generated from the database during its development and implementation. Seven national academic departments now use the database to track patient outcomes. The development of a robust surgical outcomes database requires a combination of clinical, informatics, and research expertise. 
Benefits of surgeon involvement in outcomes research include: tracking individual performance, patient safety, surgical research, legal defense, and the ability to provide accurate information to patients and payers.
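
The de-identification step mentioned above can be sketched as keyed one-way pseudonymization: identifiers are replaced by stable pseudonyms so records still link across visits while the warehouse never stores the identifier itself. The key handling and truncation length below are assumptions for illustration, not the project's documented method:

```python
import hashlib
import hmac

SECRET_KEY = b"replace-with-site-secret"  # hypothetical; held outside the warehouse

def deidentify(mrn: str) -> str:
    """Replace a medical record number with a keyed pseudonym; the same
    MRN always yields the same pseudonym, enabling longitudinal linkage."""
    return hmac.new(SECRET_KEY, mrn.encode(), hashlib.sha256).hexdigest()[:16]

print(deidentify("MRN-0042") == deidentify("MRN-0042"))  # stable linkage
```

Using a keyed digest rather than a plain hash means a party without the key cannot re-derive pseudonyms from known identifiers.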

  13. A Quality-Control-Oriented Database for a Mesoscale Meteorological Observation Network

    NASA Astrophysics Data System (ADS)

    Lussana, C.; Ranci, M.; Uboldi, F.

    2012-04-01

    In the operational context of a local weather service, data accessibility and quality-related issues must be managed by taking into account a wide set of user needs. This work describes the structure and the operational choices made for the implementation of a database system storing data from highly automated observing stations, metadata, and information on data quality. Lombardy's environmental protection agency, ARPA Lombardia, manages a highly automated mesoscale meteorological network. A Quality Assurance System (QAS) ensures that reliable observational information is collected and disseminated to the users. The weather unit in ARPA Lombardia, at the same time an important QAS component and an intensive data user, has developed a database specifically aimed at: 1) providing quick access to data for operational activities and 2) ensuring data quality for real-time applications, by means of an Automatic Data Quality Control (ADQC) procedure. Quantities stored in the archive include hourly aggregated observations of: precipitation amount, temperature, wind, relative humidity, pressure, global and net solar radiation. The ADQC performs several independent tests on raw data and compares their results in a decision-making procedure. An important ADQC component is the Spatial Consistency Test based on Optimal Interpolation. Interpolated and Cross-Validation analysis values are also stored in the database, providing further information to human operators and useful estimates in case of missing data. The technical solution adopted is based on a LAMP (Linux, Apache, MySQL and PHP) system, constituting an open source environment suitable for both development and operational practice. The ADQC procedure itself is performed by R scripts directly interacting with the MySQL database. Users and network managers can access the database by using a set of web-based PHP applications.
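
The Spatial Consistency Test at the core of the ADQC can be illustrated with a drastically simplified stand-in: estimate each station's value from its neighbours and flag large departures. Real Optimal Interpolation weights by error covariances; the inverse-distance weights and the 5-unit tolerance below are illustrative assumptions only:

```python
import math

def spatial_consistency_flag(station, neighbors, tolerance=5.0):
    """Return (suspect, analysis): 'analysis' is a leave-one-out estimate
    at the station from neighbour observations; the flag is raised when
    the observation departs from it by more than the tolerance."""
    x0, y0, obs = station
    wsum = vsum = 0.0
    for x, y, value in neighbors:
        w = 1.0 / max(math.hypot(x - x0, y - y0), 1e-6)
        wsum += w
        vsum += w * value
    analysis = vsum / wsum
    return abs(obs - analysis) > tolerance, analysis

suspect, est = spatial_consistency_flag(
    (0, 0, 31.0), [(1, 0, 18.2), (0, 2, 17.5), (-1, 1, 18.9)])
print(suspect)  # True: 31.0 is far from the ~18 neighbourhood estimate
```

Storing the interpolated and cross-validation values alongside the flag, as the abstract describes, gives operators a ready substitute when an observation is rejected or missing.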

  14. Design and Development of a Clinical Risk Management Tool Using Radio Frequency Identification (RFID)

    PubMed Central

    Pourasghar, Faramarz; Tabrizi, Jafar Sadegh; Yarifard, Khadijeh

    2016-01-01

    Background: Patient safety is one of the most important elements of quality of healthcare. It means preventing any harm to patients during the medical care process. Objective: This paper introduces a cost-effective tool in which Radio Frequency Identification (RFID) technology is used to identify medical errors in hospital. Methods: The proposed clinical error management system (CEMS) consists of a reader device, a transfer/receiver device, a database and managing software. The reader device is wireless and works using radio waves. The reader sends and receives data to/from the database via the transfer/receiver device, which is connected to the computer via a USB port. The database contains data about patients’ medication orders. Results: The CEMS has the ability to identify clinical errors before they occur and then warns the care-giver with voice and visual messages to prevent the error. This device reduces errors and thus improves patient safety. Conclusion: A new tool including software and hardware was developed in this study. Application of this tool in clinical settings can help nurses prevent medical errors. It can also be a useful tool for clinical risk management. Using this device can improve patient safety to a considerable extent and thus improve the quality of healthcare. PMID:27147802
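
The pre-administration check the CEMS performs can be sketched as a lookup of the scanned tag pair against the stored medication orders; the tag format and the orders table below are invented for illustration and are not the device's actual data model:

```python
# patient wristband tag -> medication codes currently ordered (hypothetical)
ORDERS = {
    "PAT-7F3A": {"amoxicillin-500", "metoprolol-25"},
}

def check_administration(patient_tag, medication_code):
    """Return (ok, message); a False result is what would trigger the
    device's voice and visual warnings before the error occurs."""
    ordered = ORDERS.get(patient_tag)
    if ordered is None:
        return False, "unknown patient tag"
    if medication_code not in ordered:
        return False, medication_code + " not ordered for this patient"
    return True, "match"

print(check_administration("PAT-7F3A", "warfarin-5"))
```

The key property is that the check happens at the point of care, before administration, rather than in a retrospective audit.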

  15. Design and Development of a Clinical Risk Management Tool Using Radio Frequency Identification (RFID).

    PubMed

    Pourasghar, Faramarz; Tabrizi, Jafar Sadegh; Yarifard, Khadijeh

    2016-04-01

    Patient safety is one of the most important elements of quality of healthcare. It means preventing any harm to patients during the medical care process. This paper introduces a cost-effective tool in which Radio Frequency Identification (RFID) technology is used to identify medical errors in hospitals. The proposed clinical error management system (CEMS) consists of a reader device, a transfer/receiver device, a database and managing software. The reader device is wireless and works using radio waves. The reader sends data to, and receives data from, the database via the transfer/receiver device, which is connected to the computer via a USB port. The database contains data about patients' medication orders. The CEMS can identify clinical errors before they occur and then warns the care-giver with voice and visual messages to prevent the error. This device reduces errors and thus improves patient safety. A new tool including software and hardware was developed in this study. Application of this tool in clinical settings can help nurses prevent medical errors. It can also be a useful tool for clinical risk management. Using this device can improve patient safety to a considerable extent and thus improve the quality of healthcare.
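    The core check the CEMS performs, comparing a scanned identity and medication against the order database before administration, can be sketched as follows. The tag codes, drug codes and lookup table are hypothetical; the paper's actual device and software are not public.

```python
# Hypothetical medication-order lookup: patient RFID tag -> ordered drugs.
ORDERS = {
    "PT-0017": {"amoxicillin-500", "metoprolol-25"},
}

def check_administration(patient_tag, drug_code, orders=ORDERS):
    """Return (ok, message); the message stands in for the CEMS's
    voice/visual warning issued before the error can occur."""
    ordered = orders.get(patient_tag, set())
    if drug_code in ordered:
        return True, "OK: order matches"
    return False, f"WARNING: {drug_code} not ordered for {patient_tag}"

ok, msg = check_administration("PT-0017", "warfarin-5")
```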

  16. Graph Databases for Large-Scale Healthcare Systems: A Framework for Efficient Data Management and Data Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Yubin; Shankar, Mallikarjun; Park, Byung H.

    Designing a database system for both efficient data management and data services has been one of the enduring challenges in the healthcare domain. In many healthcare systems, data services and data management are often viewed as two orthogonal tasks; data services refer to retrieval and analytic queries such as search, joins, statistical data extraction, and simple data mining algorithms, while data management refers to building error-tolerant and non-redundant database systems. The gap between service and management has resulted in rigid database systems and schemas that do not support effective analytics. We compose a rich graph structure from an abstracted healthcare RDBMS to illustrate how we can fill this gap in practice. We show how a healthcare graph can be automatically constructed from a normalized relational database using the proposed 3NF Equivalent Graph (3EG) transformation. We discuss a set of real-world graph queries such as finding self-referrals, shared providers, and collaborative filtering, and evaluate their performance over a relational database and its 3EG-transformed graph. Experimental results show that the graph representation serves as multiple de-normalized tables, thus reducing complexity in the database and enhancing data accessibility for users. Based on this finding, we propose an ensemble framework of databases for healthcare applications.
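    The flavor of the "shared providers" graph query mentioned above can be sketched with plain dictionaries: rows of a normalized link table are folded into an adjacency structure, after which the question becomes a set intersection rather than a relational join. This is an illustration of the general idea only, not the paper's 3EG implementation, and the patient/provider rows are invented.

```python
# (patient_id, provider_id) rows, as in a normalized link table.
visits = [
    ("p1", "drA"), ("p1", "drB"),
    ("p2", "drB"), ("p2", "drC"),
]

def build_graph(rows):
    """Fold link-table rows into an undirected adjacency map."""
    graph = {}
    for patient, provider in rows:
        graph.setdefault(patient, set()).add(provider)
        graph.setdefault(provider, set()).add(patient)
    return graph

def shared_providers(graph, patient_a, patient_b):
    """Providers adjacent to both patients: a 2-hop graph query."""
    return graph[patient_a] & graph[patient_b]

g = build_graph(visits)
```

    The de-normalization the abstract reports shows up here directly: the adjacency map already answers queries that would otherwise need a self-join on the link table.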

  17. The Iranian National Geodata Revision Strategy and Realization Based on Geodatabase

    NASA Astrophysics Data System (ADS)

    Haeri, M.; Fasihi, A.; Ayazi, S. M.

    2012-07-01

    In recent years, the use of spatial databases for storing and managing spatial data has become a hot topic in the field of GIS. Accordingly, the National Cartographic Center of Iran (NCC) produces, from time to time, spatial data which is usually included in databases. One of the NCC's major projects was designing the National Topographic Database (NTDB). NCC decided to create a National Topographic Database of the entire country based on 1:25000 coverage maps. The standard of the NTDB was published in 1994 and its database was created at the same time. In the NTDB, geometric data was stored in MicroStation design format (DGN), in which each feature has a link to its attribute data (stored in a Microsoft Access file). NTDB files were also produced in a sheet-wise mode and then stored in a file-based style. Besides map compilation, revision of existing maps has already started. Key problems for NCC are the revision strategy, the NTDB's file-based storage and operator challenges (NCC operators mostly prefer to edit and revise geometry data in CAD environments). A GeoDatabase solution for national geodata, based on NTDB map files and operators' revision preferences, is introduced and released herein. The proposed solution extends the traditional methods to a seamless spatial database that can be revised in CAD and GIS environments simultaneously. The proposed system is the common data framework for creating a central data repository for spatial data storage and management.

  18. A Middle-Range Explanatory Theory of Self-Management Behavior for Collaborative Research and Practice.

    PubMed

    Blok, Amanda C

    2017-04-01

    To report an analysis of the concept of self-management behaviors. Self-management behaviors are typically associated with disease management, with frequent use by nurse researchers related to chronic illness management and by international health organizations for the development of disease management interventions. A concept analysis was conducted within the context of Orem's self-care framework. Walker and Avant's eight-step concept analysis approach guided the analysis. Academic databases were searched for relevant literature, including CINAHL, the Cochrane Database of Systematic Reviews and Register of Controlled Trials, MEDLINE, PsycARTICLES and PsycINFO, and SocINDEX. Literature using the term "self-management behavior" and published between April 2001 and March 2015 was analyzed for attributes, antecedents, and consequences. A total of 189 journal articles were reviewed. Self-management behaviors are defined as proactive actions related to lifestyle, a problem, planning, collaborating, and mental support, as well as reactive actions related to a circumstantial change, to achieve a goal influenced by the antecedents of physical, psychological, socioeconomic, and cultural characteristics, as well as collaborative and received support. The theoretical definition and middle-range explanatory theory of self-management behaviors will guide future collaborative research and clinical practice for disease management. © 2016 Wiley Periodicals, Inc.

  19. Attitudes of people with osteoarthritis towards their conservative management: a systematic review and meta-ethnography.

    PubMed

    Smith, Toby O; Purdy, Rachel; Lister, Sarah; Salter, Charlotte; Fleetcroft, Robert; Conaghan, Philip G

    2014-03-01

    This paper examines the perceptions of people diagnosed with osteoarthritis towards their conservative management strategies. A systematic review was conducted of published (AMED, CINAHL, EMBASE, PsycINFO, SPORTDiscus, MEDLINE, Cochrane Clinical Trials Registry, PubMed) and unpublished/trial registry databases (WHO International Clinical Trials Registry Platform, Current Controlled Trials, the United States National Institutes of Health Trials Registry, NIHR Clinical Research Portfolio Database), searched from their inception to July 2013. Eligible studies were those presenting the attitudes or perceptions of people with osteoarthritis towards non-operative management strategies. Study quality was appraised using the CASP and Gough's weight-of-evidence appraisal tools. Data were analysed through a meta-ethnography approach. Thirty-three studies including 1,314 people with osteoarthritis were sampled, the majority diagnosed with knee osteoarthritis. The overarching themes indicated that people with osteoarthritis delay their diagnosis, opting for self-management and informal information gathering. This informal, rather than health professional-led, guidance is sought, valued and maintained as an important resource throughout the care of this population. Diagnosis is sought at a 'critical point'. The healthcare interventions provided are largely poorly perceived. A period of subsequent self-management is expected before the inevitable requirement for joint replacement; there remains uncertainty regarding when this is required, but the expectation that conservative treatment will fail to manage pain and symptoms is common. In conclusion, patients should be encouraged towards the principles of self-management, and clinicians should not trivialise osteoarthritis. This may foster a more positive perception of non-operative management and promote its adoption and adherence in managing osteoarthritis.

  20. Development of a bird banding recapture database

    USGS Publications Warehouse

    Tautin, J.; Doherty, P.F.; Metras, L.

    2001-01-01

    Recaptures (and resightings) constitute the vast majority of post-release data from banded or otherwise marked nongame birds. A powerful suite of contemporary analytical models is available for using recapture data to estimate population size, survival rates and other parameters, and many banders collect recapture data for their project-specific needs. However, despite widely recognized, broader programmatic needs for more and better data, banders' recapture data are not centrally archived and made available for use by others. To address this need, the US Bird Banding Laboratory, the Canadian Bird Banding Office and the Georgia Cooperative Fish and Wildlife Research Unit are developing a bird banding recapture database. In this poster we discuss the critical steps in developing the database, including: determining exactly which recapture data should be included; developing a standard record format and structure for the database; developing electronic means for collecting, vetting and disseminating the data; and most importantly, developing metadata descriptions and individual data set profiles to facilitate the user's selection of appropriate analytical models. We provide examples of individual data sets to be included in the database, and we assess the feasibility of developing a prescribed program for obtaining recapture data from banders who do not presently collect them. It is expected that the recapture database eventually will contain millions of records made available publicly for a variety of avian research and management purposes.

  1. Coordination and Data Management of the International Arctic Buoy Programme (IABP)

    DTIC Science & Technology

    1998-01-01

    estimate the mean surface wind, which can drive sea ice models, and for input into climate change studies. Recent research using the IABP databases includes...Coordination and Data Management of the International Arctic Buoy Programme (IABP) Ignatius G. Rigor Polar Science Center, Applied Physics Laboratory...the National Center for Environmental Prediction underlaid. APPROACH Coordination of the IABP involves distribution of information, resource

  2. Proceedings of the tenth annual DOE low-level waste management conference: Session 2: Site performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-12-01

    This document contains twelve papers on various aspects of low-level radioactive waste management. Topics of this volume include: performance assessment methodology; remedial action alternatives; site selection and site characterization procedures; intruder scenarios; sensitivity analysis procedures; mathematical models for mixed waste environmental transport; and risk assessment methodology. Individual papers were processed separately for the database. (TEM)

  3. An online spatial database of Australian Indigenous Biocultural Knowledge for contemporary natural and cultural resource management.

    PubMed

    Pert, Petina L; Ens, Emilie J; Locke, John; Clarke, Philip A; Packer, Joanne M; Turpin, Gerry

    2015-11-15

    With growing international calls for the enhanced involvement of Indigenous peoples and their biocultural knowledge in managing conservation and the sustainable use of the physical environment, it is timely to review the available literature and develop cross-cultural approaches to the management of biocultural resources. Online spatial databases are becoming common tools for educating land managers about Indigenous Biocultural Knowledge (IBK), specifically to raise a broad awareness of issues, identify knowledge gaps and opportunities, and promote collaboration. Here we describe a novel application of internet and spatial analysis tools that provides an overview of publicly available documented Australian IBK (AIBK) and outline the processes used to develop the online resource. By funding an AIBK working group, the Australian Centre for Ecological Analysis and Synthesis (ACEAS) provided a unique opportunity to bring together cross-cultural, cross-disciplinary and trans-organizational contributors who developed these resources. Without such an intentionally collaborative process, this unique tool would not have been developed. The tool developed through this process is derived from a spatial and temporal literature review, case studies and a compilation of methods, as well as other relevant AIBK papers. The online resource illustrates the depth and breadth of documented IBK and identifies opportunities for further work, partnerships and investment for the benefit of not only Indigenous Australians, but all Australians. The database currently includes links to over 1500 publicly available IBK documents, of which 568 are geo-referenced and were mapped. It is anticipated that as awareness of the online resource grows, more documents will be provided through the website to build the database. It is envisaged that this will become a well-used tool, integral to future natural and cultural resource management and maintenance. Copyright © 2015. Published by Elsevier B.V.

  4. Analysis of national and regional landslide inventories in Europe

    NASA Astrophysics Data System (ADS)

    Hervás, J.; Van Den Eeckhaut, M.

    2012-04-01

    A landslide inventory can be defined as a detailed register of the distribution and characteristics of past landslides in an area. Today most landslide inventories take the form of digital databases including landslide distribution maps and associated alphanumeric information for each landslide. While landslide inventories are of the utmost importance for land use planning and risk management through the generation of landslide zonation (susceptibility, hazard and risk) maps, landslide databases are thought to differ greatly from one country to another and often also within the same country. This hampers the generation of comparable, harmonised landslide zonation maps at national and continental scales, which is needed for policy and decision making at EU level as envisaged, for instance, in the INSPIRE Directive and the Thematic Strategy for Soil Protection. In order to gain a clear understanding of the landslide inventories available in Europe and their potential to produce landslide zonation maps, as well as to draw recommendations for improving harmonisation and interoperability between landslide databases, we have surveyed 37 countries. In total, information has been collected and analysed for 24 national databases in 22 countries (Albania, Andorra, Austria, Bosnia and Herzegovina, Bulgaria, Czech Republic, Former Yugoslav Republic of Macedonia, France, Greece, Hungary, Iceland, Ireland, Italy, Norway, Poland, Portugal, Slovakia, Slovenia, Spain, Sweden, Switzerland and UK) and 22 regional databases in 10 countries. At the moment, over 633,000 landslides are recorded in national databases, representing on average less than 50% of the landslides estimated to have occurred in these countries. The sample of regional databases included over 103,000 landslides, with an estimated completeness substantially higher than that of national databases, as more attention can be paid to data collection over smaller regions. Yet, for both national and regional coverage, the data collection methods only occasionally included advanced technologies such as remote sensing. With regard to the inventory maps of most databases, the analysis illustrates the high variability of scales (between 1:10 000 and 1:1 M for national inventories, and from 1:10 000 to 1:25 000 for regional inventories), landslide classification systems and representation symbology. It also shows the difficulty of precisely locating landslides referred to only in historical documents. In addition, information on landslide magnitude, geometrical characteristics and age reported in national and regional databases greatly differs, even within the same database, as it strongly depends on the objectives of the database, the data collection methods used, the resources employed and the remaining landslide expression. In particular, landslide initiation and/or reactivation dates are estimated in less than 25% of records, thus making hazard and hence risk assessment difficult. In most databases, scarce information on landslide impact (damage and casualties) further hinders risk assessment at regional and national scales. Estimated landslide activity, which is very relevant to early warning and emergency management, is only included in half of the national databases and is restricted to part of the landslides registered. Moreover, the availability of this information is not substantially higher in regional databases than in national ones. Most landslide databases further include information on geo-environmental characteristics at the landslide site, which is very important for landslide zoning models. Although a number of national and regional agencies provide free web-GIS visualisation services, the potential of existing landslide databases is often not fully exploited as, in many cases, access by the general public and external researchers is restricted. Additionally, the availability of information only in the national or local language is common to most national and regional databases, thus hampering consultation for most foreigners. Finally, some suggestions for a minimum set of attributes to be collected and made available by European countries for building up a continental landslide database in support of EU policies are presented. This study has been conducted in the framework of the EU-FP7 project SafeLand (Grant Agreement 22647).

  5. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS FOREST SERVICE... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases...

  6. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS NATIONAL PARK... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases...

  7. A Data Management System for International Space Station Simulation Tools

    NASA Technical Reports Server (NTRS)

    Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.
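    The last capability mentioned, associating new information with station objects and persisting it, can be sketched as below. The table layout, object identifiers and note text are invented for illustration; the Intelligent Virtual Station's actual schema is not published here.

```python
import sqlite3

# Hypothetical store mapping station objects to user-attached notes.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE object_notes (
    object_id TEXT, author TEXT, note TEXT)""")

def attach_note(object_id, author, note):
    """Associate a piece of user information with a station object."""
    conn.execute("INSERT INTO object_notes VALUES (?, ?, ?)",
                 (object_id, author, note))

def notes_for(object_id):
    """Retrieve everything users have attached to one object."""
    cur = conn.execute(
        "SELECT author, note FROM object_notes WHERE object_id = ?",
        (object_id,))
    return cur.fetchall()

attach_note("truss-S0", "trainer1", "Torque values updated in rev C.")
```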

  8. The NASA Program Management Tool: A New Vision in Business Intelligence

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Swanson, Keith; Putz, Peter; Bell, David G.; Gawdiak, Yuri

    2006-01-01

    This paper describes a novel approach to business intelligence and program management for large technology enterprises like the U.S. National Aeronautics and Space Administration (NASA). Two key distinctions of the approach are that 1) standard business documents are the user interface, and 2) a "schema-less" XML database enables flexible integration of technology information for use by both humans and machines in a highly dynamic environment. The implementation utilizes patent-pending NASA software called the NASA Program Management Tool (PMT) and its underlying "schema-less" XML database called Netmark. Initial benefits of PMT include elimination of discrepancies between business documents that use the same information and "paperwork reduction" for program and project management in the form of reducing the effort required to understand standard reporting requirements and to comply with those reporting requirements. We project that the underlying approach to business intelligence will enable significant benefits in the timeliness, integrity and depth of business information available to decision makers on all organizational levels.
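    The "schema-less" idea the abstract highlights can be illustrated generically: documents are stored whole and their structure is interpreted at query time rather than fixed in a table schema. Netmark itself is NASA software and its API is not shown here; the storage table, documents and `find` helper below are invented for the sketch.

```python
import sqlite3
import xml.etree.ElementTree as ET

# One table, no per-field schema: each row holds a whole XML document.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, xml TEXT)")

def store(xml_text):
    conn.execute("INSERT INTO docs (xml) VALUES (?)", (xml_text,))

def find(tag):
    """Return the text of every `tag` element across all stored docs,
    regardless of each document's structure."""
    hits = []
    for (xml_text,) in conn.execute("SELECT xml FROM docs ORDER BY id"):
        hits += [el.text for el in ET.fromstring(xml_text).iter(tag)]
    return hits

store("<report><project>PMT</project><status>green</status></report>")
store("<memo><status>yellow</status></memo>")
```

    Because no schema is declared up front, differently shaped business documents (a report and a memo here) can coexist and still be queried together, which is the flexibility the abstract attributes to the approach.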

  9. Information systems in food safety management.

    PubMed

    McMeekin, T A; Baranyi, J; Bowman, J; Dalgaard, P; Kirk, M; Ross, T; Schmid, S; Zwietering, M H

    2006-12-01

    Information systems are concerned with data capture, storage, analysis and retrieval. In the context of food safety management they are vital to assist decision making in a short time frame, potentially allowing decisions to be made and practices to be actioned in real time. Databases with information on microorganisms pertinent to the identification of foodborne pathogens, the response of microbial populations to the environment, and the characteristics of foods and processing conditions are the cornerstone of food safety management systems. Such databases find application in: identifying pathogens in food at the genus or species level using applied systematics in automated ways; identifying pathogens below the species level by molecular subtyping, an approach successfully applied in epidemiological investigations of foodborne disease and the basis for national surveillance programs; predictive modelling software, such as the Pathogen Modeling Program and Growth Predictor (which took over the main functions of Food MicroModel), whose raw data were combined as the genesis of an international web-based searchable database (ComBase); expert systems combining databases on microbial characteristics, food composition and processing information, with the resulting "pattern match" indicating problems that may arise from changes in product formulation or processing conditions; and computer software packages to aid the practical application of HACCP and risk assessment, with decision trees to bring logical sequences to establishing and modifying food safety management practices. In addition there are many other uses of information systems that benefit food safety more globally, including: rapid dissemination of information on foodborne disease outbreaks via websites or list servers carrying commentary from many sources, including the press and interest groups, on the reasons for and consequences of foodborne disease incidents; active surveillance networks allowing rapid dissemination of molecular subtyping information between public health agencies to detect foodborne outbreaks and limit the spread of human disease; traceability of individual animals or crops from (or before) conception or germination to the consumer as an integral part of food supply chain management; and provision of high-quality, online educational packages to food industry personnel otherwise precluded from access to such courses.
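    The predictive-modelling tools named in the abstract are built on simple empirical growth models. One hedged illustration is the Ratkowsky "square-root" model relating microbial growth rate to temperature; the coefficients below are invented for the example, not fitted values for any real organism or the Pathogen Modeling Program's actual parameters.

```python
def ratkowsky_rate(temp_c, b=0.02, t_min=2.0):
    """Square-root model: sqrt(mu) = b * (T - Tmin).
    Returns the specific growth rate mu, or 0.0 at or below the
    notional minimum growth temperature Tmin."""
    if temp_c <= t_min:
        return 0.0
    return (b * (temp_c - t_min)) ** 2

# Growth is suppressed under chill and accelerates with temperature.
rates = [ratkowsky_rate(t) for t in (0, 10, 25)]
```

    A database of such fitted coefficients per organism and food matrix is essentially what ComBase-style resources make searchable.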

  10. Health information and communication system for emergency management in a developing country, Iran.

    PubMed

    Seyedin, Seyed Hesam; Jamali, Hamid R

    2011-08-01

    Disasters are fortunately rare occurrences. However, accurate and timely information and communication are vital to adequately prepare individual health organizations for such events. The current article investigates the health-related communication and information systems for emergency management in Iran. A mixed qualitative and quantitative methodology was used in this study. A sample of 230 health service managers was surveyed using a questionnaire, and 65 semi-structured interviews were also conducted with public health and therapeutic affairs managers who were responsible for emergency management. A range of problems were identified, including fragmentation of information, lack of local databases, lack of a clear information strategy and lack of a formal system for logging disaster-related information at regional or local level. Recommendations were made for improving the national emergency management information and communication system. The findings have implications for health organizations in developing and developed countries, especially in the Middle East. Creating disaster-related information databases, creating protocols and standards, setting an information strategy, training staff and hosting a center for information systems in the Ministry of Health to centrally manage and share the data could improve the current information system.

  11. Alternatives to relational databases in precision medicine: Comparison of NoSQL approaches for big data storage using supercomputers

    NASA Astrophysics Data System (ADS)

    Velazquez, Enrique Israel

    Improvements in medical and genomic technologies have dramatically increased the production of electronic data over the last decade. As a result, data management is rapidly becoming a major determinant, and urgent challenge, for the development of Precision Medicine. Although successful data management is achievable using Relational Database Management Systems (RDBMS), exponential data growth is a significant contributor to failure scenarios. Growing amounts of data can also be observed in other sectors, such as economics and business, which, together with the previous facts, suggests that alternate database approaches (NoSQL) may soon be required for efficient storage and management of big databases. However, this hypothesis has been difficult to test in the Precision Medicine field, since alternate database architectures are complex to assess and means to integrate heterogeneous electronic health records (EHR) with dynamic genomic data are not easily available. In this dissertation, we present a novel set of experiments for identifying NoSQL database approaches that enable effective data storage and management in Precision Medicine, using patients' clinical and genomic information from The Cancer Genome Atlas (TCGA). The first experiment assesses performance and scalability using biologically meaningful queries of differing complexity over databases of differing sizes. The second experiment measures performance and scalability in database updates without schema changes. The third experiment assesses performance and scalability in database updates with schema modifications due to dynamic data. We have identified two NoSQL approaches, based on Cassandra and Redis, which seem to be the ideal database management systems for our precision medicine queries in terms of performance and scalability. We present these NoSQL approaches and show how they can be used to manage clinical and genomic big data. Our research is relevant to public health since we are focusing on one of the main challenges to the development of Precision Medicine and, consequently, investigating a potential solution to the progressively increasing demands on health care.
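    The relational-versus-NoSQL contrast the dissertation studies can be sketched with plain dictionaries standing in for the two storage styles (the real systems evaluated were Cassandra and Redis; the patient and variant records below are invented).

```python
# Relational style: normalized tables, answered with a join-like scan.
patients = {"p1": {"name": "A"}, "p2": {"name": "B"}}
variants = [("p1", "BRCA1"), ("p1", "TP53"), ("p2", "KRAS")]

def variants_for(pid):
    """Equivalent of joining the patient and variant tables."""
    return [gene for patient, gene in variants if patient == pid]

# Key-value (NoSQL) style: one denormalized document per patient, so
# the same question is a single lookup with no join at query time.
kv_store = {
    "p1": {"name": "A", "variants": ["BRCA1", "TP53"]},
    "p2": {"name": "B", "variants": ["KRAS"]},
}
```

    The trade-off mirrors the abstract's findings: the key-value layout answers patient-centric queries in one step, at the cost of duplicating data and pushing schema evolution onto the documents themselves.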


  13. Using databases in medical education research: AMEE Guide No. 77.

    PubMed

    Cleland, Jennifer; Scott, Neil; Harrild, Kirsten; Moffat, Mandy

    2013-05-01

    This AMEE Guide offers an introduction to the use of databases in medical education research. It is intended for those who are contemplating conducting research in medical education but are new to the field. The Guide is structured around the process of planning your research so that data collection, management and analysis are appropriate for the research question. Throughout, we consider contextual possibilities and constraints to educational research using databases, such as the resources available, and provide concrete examples of medical education research to illustrate many points. The first section of the Guide explains the different types of data and how to classify them, and addresses the rationale for research using databases in medical education. We explain the difference between qualitative research and qualitative data, the difference between categorical and quantitative data, and the different types of data which fall into these categories. The Guide reviews the strengths and weaknesses of qualitative and quantitative research. The next section is structured around how to work with quantitative and qualitative databases and provides guidance on the many practicalities of setting up a database. This includes how to organise your database, including anonymising data and coding, as well as preparing and describing your data so it is ready for analysis. The critical matter of the ethics of using databases in medical educational research, including using routinely collected data versus data collected for research purposes, and issues of confidentiality, is discussed. Core to the Guide is drawing out the similarities and differences in working with different types of data and different types of databases. Future AMEE Guides in the research series will address statistical analysis of data in more detail.
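    Two of the practicalities the Guide covers, anonymising records and coding categorical data, can be sketched as follows. The salt, field names and grade scheme are illustrative assumptions, not the Guide's own examples.

```python
import hashlib

SALT = "project-secret"  # kept separately from the research database

def anonymise(student_id):
    """One-way keyed pseudonym: rows stay linkable across the
    database, but the original identifier is not recoverable."""
    return hashlib.sha256((SALT + student_id).encode()).hexdigest()[:10]

# Coding: map categorical answers to integers before analysis.
GRADE_CODES = {"fail": 0, "pass": 1, "merit": 2, "distinction": 3}

def code_record(student_id, grade):
    return {"pid": anonymise(student_id), "grade": GRADE_CODES[grade]}

rec = code_record("s123456", "merit")
```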

  14. [Design and application of user managing system of cardiac remote monitoring network].

    PubMed

    Chen, Shouqiang; Zhang, Jianmin; Yuan, Feng; Gao, Haiqing

    2007-12-01

    Based on inpatient records and the data management demands of a cardiac remote monitoring network, this software was designed with the relational database Microsoft Access; its interface, buttons and menus were built with the assistance of VBA. The design emphasized integration, user-friendliness, practicality and compatibility. Its functions consist of registration, querying, statistics and printing, among others. It can be used to manage users effectively and can help the cardiac remote monitoring network further its important role in preventing cardiovascular emergencies.

  15. An Integrated Nursing Management Information System: From Concept to Reality

    PubMed Central

    Pinkley, Connie L.; Sommer, Patricia K.

    1988-01-01

    This paper addresses the transition from the conceptualization of a Nursing Management Information System (NMIS) integrated and interdependent with the Hospital Information System (HIS) to its realization. Concepts of input, throughput, and output are presented to illustrate developmental strategies used to achieve nursing information products. Essential processing capabilities include: 1) ability to interact with multiple data sources; 2) database management, statistical, and graphics software packages; 3) online and batch reporting; and 4) interactive data analysis. Challenges encountered in system construction are examined.

  16. Toward an open-access global database for mapping, control, and surveillance of neglected tropical diseases.

    PubMed

    Hürlimann, Eveline; Schur, Nadine; Boutsika, Konstantina; Stensgaard, Anna-Sofie; Laserna de Himpsl, Maiti; Ziegelbauer, Kathrin; Laizer, Nassor; Camenzind, Lukas; Di Pasquale, Aurelio; Ekpo, Uwem F; Simoonga, Christopher; Mushinge, Gabriel; Saarnak, Christopher F L; Utzinger, Jürg; Kristensen, Thomas K; Vounatsou, Penelope

    2011-12-01

    After many years of general neglect, interest has grown and efforts have gotten under way for the mapping, control, surveillance, and eventual elimination of neglected tropical diseases (NTDs). Disease risk estimates are a key tool for targeting control interventions, and serve as a benchmark for monitoring and evaluation. What is currently missing is a georeferenced global database for NTDs providing open access to the available survey data, constantly updated, that can be utilized by researchers, disease control managers, and other relevant stakeholders. We describe the steps taken toward the development of such a database that can be employed for spatial disease risk modeling and control of NTDs. With an emphasis on schistosomiasis in Africa, we systematically searched the literature (peer-reviewed journals and 'grey literature'), and contacted Ministries of Health and research institutions in schistosomiasis-endemic countries for location-specific prevalence data and survey details (e.g., study population, year of survey and diagnostic techniques). The data were extracted, georeferenced, and stored in a MySQL database with a web interface allowing free database access and data management. At the beginning of 2011, our database contained more than 12,000 georeferenced schistosomiasis survey locations from 35 African countries, available at http://www.gntd.org. Currently, the database is being expanded into a global repository including a host of other NTDs, e.g. soil-transmitted helminthiasis and leishmaniasis. An open-access, spatially explicit NTD database offers unique opportunities for disease risk modeling, targeting control interventions, disease monitoring, and surveillance. Moreover, it allows for detailed geostatistical analyses of disease distribution in space and time.
With an initial focus on schistosomiasis in Africa, we demonstrate the proof-of-concept that the establishment and running of a global NTD database is feasible and should be expanded without delay.
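
    The abstract above describes location-specific prevalence surveys stored in a MySQL database with a web interface. A minimal relational sketch of such a survey table and a country-level prevalence query, using Python's built-in sqlite3 in place of MySQL (column names and all figures are invented, not taken from the GNTD database):

```python
import sqlite3

# Toy georeferenced survey table: one row per survey location.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE survey (
        survey_id INTEGER PRIMARY KEY,
        country   TEXT,
        latitude  REAL,
        longitude REAL,
        year      INTEGER,
        examined  INTEGER,   -- people examined at this location
        positive  INTEGER    -- people found infected
    )""")

rows = [
    ("Kenya",   -0.0236, 37.9062, 2005, 200, 46),
    ("Kenya",   -1.2921, 36.8219, 2008, 150, 12),
    ("Nigeria",  9.0820,  8.6753, 2007, 300, 90),
]
conn.executemany(
    "INSERT INTO survey (country, latitude, longitude, year, examined, positive) "
    "VALUES (?, ?, ?, ?, ?, ?)", rows)

# Country-level prevalence summary, the kind of aggregate a disease control
# manager might request before targeting interventions.
summary = conn.execute("""
    SELECT country,
           COUNT(*) AS sites,
           ROUND(SUM(positive) * 1.0 / SUM(examined), 3) AS prevalence
    FROM survey
    GROUP BY country
    ORDER BY country
""").fetchall()
```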

  17. ANTI-VIRAL EFFECTS OF MEDICINAL PLANTS IN THE MANAGEMENT OF DENGUE: A SYSTEMATIC REVIEW.

    PubMed

    Frederico, Éric Heleno Freira Ferreira; Cardoso, André Luiz Bandeira Dionísio; Moreira-Marconi, Eloá; de Sá-Caputo, Danúbia da Cunha; Guimarães, Carlos Alberto Sampaio; Dionello, Carla da Fontoura; Morel, Danielle Soares; Paineiras-Domingos, Laisa Liane; de Souza, Patricia Lopes; Brandão-Sobrinho-Neto, Samuel; Carvalho-Lima, Rafaelle Pacheco; Guedes-Aguiar, Eliane de Oliveira; Costa-Cavalcanti, Rebeca Graça; Kutter, Cristiane Ribeiro; Bernardo-Filho, Mario

    2017-01-01

    Dengue is considered an important arboviral disease. Safe, low-cost, and effective drugs that possess inhibitory activity against dengue virus (DENV) are urgently needed to combat dengue infection worldwide. Medicinal plants have been considered an important alternative for managing several diseases, including dengue. As authors have demonstrated the antiviral effect of medicinal plants against DENV, the aim of this study was to systematically review the published research concerning the use of medicinal plants in the management of dengue, using the PubMed database. Search and selection of publications were made using the PubMed database following the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Six publications met the inclusion criteria and were included in the final selection after thorough analysis. It is suggested that medicinal plant products could be used as potential anti-DENV agents.

  18. The Houston Academy of Medicine--Texas Medical Center Library management information system.

    PubMed Central

    Camille, D; Chadha, S; Lyders, R A

    1993-01-01

    A management information system (MIS) provides a means for collecting, reporting, and analyzing data from all segments of an organization. Such systems are common in business but rare in libraries. The Houston Academy of Medicine-Texas Medical Center Library developed an MIS that operates on a system of networked IBM PCs and Paradox, a commercial database software package. The data collected in the system include monthly reports, client profile information, and data collected at the time of service requests. The MIS assists with enforcement of library policies, ensures that correct information is recorded, and provides reports for library managers. It also can be used to help answer a variety of ad hoc questions. Future plans call for the development of an MIS that could be adapted to other libraries' needs, and a decision-support interface that would facilitate access to the data contained in the MIS databases. PMID:8251972

  19. Mapping the World's Marine Protected and Managed Areas - Promoting Awareness, Compliance, and Enforcement via Open Data and Tools.

    NASA Astrophysics Data System (ADS)

    Vincent, T.; Zetterlind, V.; Tougher, B.

    2016-12-01

    Marine Protected and Managed Areas (MPAs) are a cornerstone of coastal and ocean conservation efforts and reflect years of dedicated effort to protect species and habitats through science-based regulation. When they are effective, biomass increases dramatically, by up to 14-fold, and MPAs play a significant role in conserving biodiversity. Effective MPAs depend on enforcement, and enforcement cannot occur without awareness of their locations among ocean stakeholders and the general public. The Anthropocene Institute, in partnership with the NOAA Marine Protected Area Center, is creating an actively managed, free and open, worldwide database of MPAs, including normalized metadata and regulation summaries, full GIS boundaries, revision history, and public-facing interactive web maps. The project employs two full-time lawyers, who first comb the relevant regulations, two full-time geographers, and a full-time GIS database/web engineer.

  20. Response to BACT Determinations Inquiry from Alabama Department of Environmental Management

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  1. Federal Land Managers Notification and Visibility Assessment Requirements for PSD Permitting

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  2. The Vendors' Corner: Biblio-Techniques' Library and Information System (BLIS).

    ERIC Educational Resources Information Center

    Library Software Review, 1984

    1984-01-01

    Describes online catalog and integrated library computer system designed to enhance Washington Library Network's software. Highlights include system components; implementation options; system features (integrated library functions, database design, system management facilities); support services (installation and training, software maintenance and…

  3. Mass-storage management for distributed image/video archives

    NASA Astrophysics Data System (ADS)

    Franchi, Santina; Guarda, Roberto; Prampolini, Franco

    1993-04-01

    The realization of an image/video database requires a specific design for both the database structures and the mass-storage management. This issue has been addressed in the project of the digital image/video database system designed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog the image/video coding techniques with their related parameters, and the description of image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server: because of their large size, they are stored outside the database on network devices. The database contains the pointers to the image/video files and the description of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management; they allow cataloging devices and modifying device status and device network location. The medium level manages image/video files on a physical basis; it handles file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move, and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage-management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to fit delivery/visualization requirements and to reduce archiving costs.
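
    The three function levels described above (device cataloguing, physical file migration, and logical query-driven archiving) can be sketched as follows. This is an illustrative toy model, not the IBM system's code; all names, sizes, and the migration policy are assumptions:

```python
# Lower level: device catalogue (name -> properties).
devices = {}
# File catalogue: file name -> holding device, size, access count.
files = {}

def catalog_device(name, fast, capacity):
    """Lower level: register a storage device and its characteristics."""
    devices[name] = {"fast": fast, "capacity": capacity, "used": 0}

def store(file_name, size, device):
    """Place a new file on a device and account for the space used."""
    devices[device]["used"] += size
    files[file_name] = {"device": device, "size": size, "accesses": 0}

def migrate(file_name, target):
    """Medium level: move a file between high-capacity and
    low-access-time media, updating space accounting."""
    f = files[file_name]
    devices[f["device"]]["used"] -= f["size"]
    devices[target]["used"] += f["size"]
    f["device"] = target

def archive_cold_files(max_accesses):
    """Upper level: a logical storage-management policy that migrates
    rarely accessed files to the high-capacity (slow) device."""
    slow = next(d for d, p in devices.items() if not p["fast"])
    for name, f in files.items():
        if f["accesses"] <= max_accesses and devices[f["device"]]["fast"]:
            migrate(name, slow)

catalog_device("magnetic_disk", fast=True, capacity=10_000)
catalog_device("optical_jukebox", fast=False, capacity=1_000_000)
store("video_001.mpg", size=700, device="magnetic_disk")
store("image_042.tif", size=50, device="magnetic_disk")
files["image_042.tif"]["accesses"] = 12   # frequently viewed image
archive_cold_files(max_accesses=0)        # archive never-accessed files
```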

  4. Telecommunications issues of intelligent database management for ground processing systems in the EOS era

    NASA Technical Reports Server (NTRS)

    Touch, Joseph D.

    1994-01-01

    Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise-refinement of the intelligent database management (IDM) of the distributed active archive center (DAAC - one of seven regionally-located EOSDIS archive sites) architecture, to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that network design can accommodate a flexible data ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.

  5. Technology transfer at NASA - A librarian's view

    NASA Technical Reports Server (NTRS)

    Buchan, Ronald L.

    1991-01-01

    The NASA programs, publications, and services promoting the transfer and utilization of aerospace technology developed by and for NASA are briefly surveyed. Topics addressed include the corporate sources of NASA technical information and its interest for corporate users of information services; the IAA and STAR abstract journals; NASA/RECON, NTIS, and the AIAA Aerospace Database; the RECON Space Commercialization file; the Computer Software Management and Information Center file; company information in the RECON database; and services to small businesses. Also discussed are the NASA publications Tech Briefs and Spinoff, the Industrial Applications Centers, NASA continuing bibliographies on management and patent abstracts (indexed using the NASA Thesaurus), the Index to NASA News Releases and Speeches, and the Aerospace Research Information Network (ARIN).

  6. Global Coordination and Standardisation in Marine Biodiversity through the World Register of Marine Species (WoRMS) and Related Databases

    PubMed Central

    Bouchet, Philippe; Boxshall, Geoff; Fauchald, Kristian; Gordon, Dennis; Hoeksema, Bert W.; Poore, Gary C. B.; van Soest, Rob W. M.; Stöhr, Sabine; Walter, T. Chad; Vanhoorne, Bart; Decock, Wim

    2013-01-01

    The World Register of Marine Species is an over 90% complete open-access inventory of all marine species names. Here we illustrate the scale of the problems with species names, synonyms, and their classification, and describe how WoRMS publishes online quality assured information on marine species. Within WoRMS, over 100 global, 12 regional and 4 thematic species databases are integrated with a common taxonomy. Over 240 editors from 133 institutions and 31 countries manage the content. To avoid duplication of effort, content is exchanged with 10 external databases. At present WoRMS contains 460,000 taxonomic names (from Kingdom to subspecies), 368,000 species level combinations of which 215,000 are currently accepted marine species names, and 26,000 related but non-marine species. Associated information includes 150,000 literature sources, 20,000 images, and locations of 44,000 specimens. Usage has grown linearly since its launch in 2007, with about 600,000 unique visitors to the website in 2011, and at least 90 organisations from 12 countries using WoRMS for their data management. By providing easy access to expert-validated content, WoRMS improves quality control in the use of species names, with consequent benefits to taxonomy, ecology, conservation and marine biodiversity research and management. The service manages information on species names that would otherwise be overly costly for individuals, and thus minimises errors in the application of nomenclature standards. WoRMS' content is expanding to include host-parasite relationships, additional literature sources, locations of specimens, images, distribution range, ecological, and biological data. Species are being categorised as introduced (alien, invasive), of conservation importance, and on other attributes. These developments have a multiplier effect on its potential as a resource for biodiversity research and management. 
As a consequence of WoRMS, we are witnessing improved communication within the scientific community, and anticipate increased taxonomic efficiency and quality control in marine biodiversity research and management. PMID:23505408

  7. Global coordination and standardisation in marine biodiversity through the World Register of Marine Species (WoRMS) and related databases.

    PubMed

    Costello, Mark J; Bouchet, Philippe; Boxshall, Geoff; Fauchald, Kristian; Gordon, Dennis; Hoeksema, Bert W; Poore, Gary C B; van Soest, Rob W M; Stöhr, Sabine; Walter, T Chad; Vanhoorne, Bart; Decock, Wim; Appeltans, Ward

    2013-01-01

    The World Register of Marine Species is an over 90% complete open-access inventory of all marine species names. Here we illustrate the scale of the problems with species names, synonyms, and their classification, and describe how WoRMS publishes online quality assured information on marine species. Within WoRMS, over 100 global, 12 regional and 4 thematic species databases are integrated with a common taxonomy. Over 240 editors from 133 institutions and 31 countries manage the content. To avoid duplication of effort, content is exchanged with 10 external databases. At present WoRMS contains 460,000 taxonomic names (from Kingdom to subspecies), 368,000 species level combinations of which 215,000 are currently accepted marine species names, and 26,000 related but non-marine species. Associated information includes 150,000 literature sources, 20,000 images, and locations of 44,000 specimens. Usage has grown linearly since its launch in 2007, with about 600,000 unique visitors to the website in 2011, and at least 90 organisations from 12 countries using WoRMS for their data management. By providing easy access to expert-validated content, WoRMS improves quality control in the use of species names, with consequent benefits to taxonomy, ecology, conservation and marine biodiversity research and management. The service manages information on species names that would otherwise be overly costly for individuals, and thus minimises errors in the application of nomenclature standards. WoRMS' content is expanding to include host-parasite relationships, additional literature sources, locations of specimens, images, distribution range, ecological, and biological data. Species are being categorised as introduced (alien, invasive), of conservation importance, and on other attributes. These developments have a multiplier effect on its potential as a resource for biodiversity research and management. 
As a consequence of WoRMS, we are witnessing improved communication within the scientific community, and anticipate increased taxonomic efficiency and quality control in marine biodiversity research and management.

  8. A spatial-temporal system for dynamic cadastral management.

    PubMed

    Nan, Liu; Renyi, Liu; Guangliang, Zhu; Jiong, Xie

    2006-03-01

    A practical spatio-temporal database (STDB) technique for dynamic urban land management is presented. One of the STDB models, the expanded model of Base State with Amendments (BSA), is selected as the basis for developing the dynamic cadastral management technique. Two approaches, Section Fast Indexing (SFI) and Storage Factors of Variable Granularity (SFVG), are used to improve the efficiency of the BSA model. Both spatial graphic data and attribute data are stored, through a succinct engine, in standard relational database management systems (RDBMS) for the actual implementation of the BSA model. The spatio-temporal database is divided into three interdependent sub-databases: the present DB, the history DB and the procedures-tracing DB. The efficiency of database operation is improved by the database connection in the bottom layer of Microsoft SQL Server. The spatio-temporal system can be provided at low cost while satisfying the basic needs of urban land management in China. The approaches presented in this paper may also be of significance to countries where land patterns change frequently or to agencies where financial resources are limited.
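
    The Base State with Amendments (BSA) model mentioned above stores a base snapshot plus time-stamped changes and reconstructs any historical state by replaying the changes up to a query date. A minimal sketch under that assumption (the parcel fields and replay strategy are illustrative, not the paper's implementation):

```python
# Base state: the cadastral snapshot at the starting epoch.
base_state = {"parcel_1": {"owner": "A", "area": 120},
              "parcel_2": {"owner": "B", "area": 300}}

# Amendments: time-stamped changes, kept in date order.
amendments = [  # (date, parcel, field, new_value)
    ("2004-06-01", "parcel_1", "owner", "C"),
    ("2005-02-10", "parcel_2", "area", 280),
    ("2005-09-30", "parcel_1", "owner", "D"),
]

def state_at(date):
    """Rebuild the cadastre at `date` by replaying amendments with
    date <= the query date onto a copy of the base state."""
    state = {pid: dict(attrs) for pid, attrs in base_state.items()}
    for when, pid, field, value in amendments:
        if when <= date:          # ISO dates compare correctly as strings
            state[pid][field] = value
    return state

snapshot = state_at("2005-03-01")
```

    Indexing schemes such as the SFI and SFVG approaches described in the abstract exist precisely to avoid replaying long amendment chains from the base state on every query.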

  9. Enhanced project management tool

    NASA Technical Reports Server (NTRS)

    Hsu, Chen-Jung (Inventor); Patel, Hemil N. (Inventor); Maluf, David A. (Inventor); Moh Hashim, Jairon C. (Inventor); Tran, Khai Peter B. (Inventor)

    2012-01-01

    A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as one or more of a monthly report, a task plan report, a schedule report, a budget report and a risk management report, are generated and made available for display or further analysis or collection into a customized report template. An extensible database allows searching for information based upon context and upon content. Seven different types of project risks are addressed, including non-availability of required skill mix of workers. The system can be configured to exchange data and results with corresponding portions of similar project analyses, and to provide user-specific access to specified information.

  10. Strategies to promote adherence to treatment by pulmonary tuberculosis patients: a systematic review.

    PubMed

    Suwankeeree, Wongduan; Picheansathian, Wilawan

    2014-03-01

    The objective of this study is to review and synthesise the best available research evidence that investigates the effectiveness of strategies to promote adherence to treatment by patients with newly diagnosed pulmonary tuberculosis (TB). The search sought to find published and unpublished studies. The search covered articles published from 1990 to 2010 in English and Thai. The database search included Cumulative Index to Nursing and Allied Health Literature (CINAHL), EMBASE, Cochrane Library, PubMed, Science Direct, Current Content Connect, Thai Nursing Research Database, Thai thesis database, Digital Library of Thailand Research Fund, Research of National Research Council of Thailand and Database of Office of Higher Education Commission. Studies were additionally identified from reference lists of all studies retrieved. Eligible studies were randomised controlled trials that explored different strategies to promote adherence to TB treatment of patients with newly diagnosed pulmonary TB and also included quasiexperimental studies. Two of the investigators independently assessed the studies and then extracted and summarised data from eligible studies. Extracted data were entered into Review Manager software and analysed. A total of 7972 newly diagnosed pulmonary TB patients participated in 10 randomised controlled trials and eight quasiexperimental studies. The studies reported on the effectiveness of a number of specific interventions to improve adherence to TB treatment among newly diagnosed pulmonary TB patients. These interventions included directly observed treatment (DOT) coupled with alternative patient supervision options, case management with DOT, short-course directly observed treatment, the intensive triad-model programme and an intervention package aimed at improved counselling and communication, decentralisation of treatment, patient choice of a DOT supporter and reinforcement of supervision activities. 
This review found evidence of beneficial effects from the DOT with regard to the medication adherence among TB patients in terms of cure rate and success rate. However, no beneficial effect was found from DOT intervention with increasing completion rate. In addition, the combined interventions to improve adherence to tuberculosis treatment included case management with directly observed treatment short-course program, the intensive triad-model programme and intervention package. These interventions should be implemented by healthcare providers and tailored to local contexts and circumstances, wherever appropriate.

  11. An Extensible Information Grid for Risk Management

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David G.

    2003-01-01

    This paper describes recent work on developing an extensible information grid for risk management at NASA - a RISK INFORMATION GRID. This grid is being developed by integrating information grid technology with risk management processes for a variety of risk related applications. To date, RISK GRID applications are being developed for three main NASA processes: risk management - a closed-loop iterative process for explicit risk management, program/project management - a proactive process that includes risk management, and mishap management - a feedback loop for learning from historical risks that escaped other processes. This is enabled through an architecture involving an extensible database, structuring information with XML, schemaless mapping of XML, and secure server-mediated communication using standard protocols.
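
    The "schemaless mapping of XML" mentioned above can be illustrated by mapping arbitrary XML records into nested dictionaries without requiring a fixed schema; the risk-record structure below is invented for illustration and is not the RISK GRID's actual format:

```python
import xml.etree.ElementTree as ET

def xml_to_dict(elem):
    """Generic, schema-free mapping: a leaf element becomes its text,
    an element with children becomes a dict of its children.
    (Repeated sibling tags would overwrite each other in this sketch.)"""
    children = list(elem)
    if not children:
        return (elem.text or "").strip()
    return {child.tag: xml_to_dict(child) for child in children}

doc = ET.fromstring("""
<risk>
  <process>mishap</process>
  <severity>high</severity>
  <context><project>demo</project></context>
</risk>""")

# Any well-formed record maps to a nested structure a database can store
# and search by context (tag path) and content (text values).
record = {doc.tag: xml_to_dict(doc)}
```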

  12. NED-IIS: An Intelligent Information System for Forest Ecosystem Management

    Treesearch

    W.D. Potter; S. Somasekar; R. Kommineni; H.M. Rauscher

    1999-01-01

    We view an Intelligent Information System (IIS) as composed of a unified knowledge base, database, and model base. The model base includes decision support models, forecasting models, and visualization models, for example. In addition, we feel that the model base should include domain-specific problem-solving modules as well as decision support models. This, then,...

  13. VIEWCACHE: An incremental pointer-base access method for distributed databases. Part 1: The universal index system design document. Part 2: The universal index system low-level design document. Part 3: User's guide. Part 4: Reference manual. Part 5: UIMS test suite

    NASA Technical Reports Server (NTRS)

    Kelley, Steve; Roussopoulos, Nick; Sellis, Timos

    1992-01-01

    The goal of the Universal Index System (UIS), is to provide an easy-to-use and reliable interface to many different kinds of database systems. The impetus for this system was to simplify database index management for users, thus encouraging the use of indexes. As the idea grew into an actual system design, the concept of increasing database performance by facilitating the use of time-saving techniques at the user level became a theme for the project. This Final Report describes the Design, the Implementation of UIS, and its Language Interfaces. It also includes the User's Guide and the Reference Manual.

  14. Impact of data base structure in a successful in vitro-in vivo correlation for pharmaceutical products.

    PubMed

    Roudier, B; Davit, B; Schütz, H; Cardot, J-M

    2015-01-01

    The in vitro-in vivo correlation (IVIVC) (Food and Drug Administration 1997) aims to predict the in vivo performance of a pharmaceutical formulation from its in vitro characteristics. It is a complex process that (i) incorporates, in a gradual and incremental way, a large amount of information and (ii) requires information on different properties (formulation, analytical, clinical) and the associated dedicated treatments (statistics, modeling, simulation). This results in many studies that are initiated and integrated into the specifications (quality target product profile, QTPP). The latter defines the appropriate experimental designs (quality by design, QbD) (Food and Drug Administration 2011, 2012), whose main objectives are the determination (i) of the key factors of development and manufacturing (critical process parameters, CPPs) and (ii) of the critical physicochemical attributes of the active pharmaceutical ingredients (APIs), the critical quality attributes (CQAs), whose omission may have implications for efficacy and patient safety. These processes generate a very large amount of data that must be structured. In this context, the storage of information in a database (DB) and the management of that database (database management system, DBMS) become an important issue for the management of IVIVC projects and, more generally, for the development of new pharmaceutical forms. This article describes the implementation of a prototype object-oriented database (OODB), conceived as a decision-support tool that responds in a structured and consistent way to the project-management issues of IVIVC (including bioequivalence and bioavailability) (Food and Drug Administration 2003) necessary for the implementation of the QTPP.

  15. PhamDB: a web-based application for building Phamerator databases.

    PubMed

    Lamine, James G; DeJong, Randall J; Nelesen, Serita M

    2016-07-01

    PhamDB is a web application which creates databases of bacteriophage genes, grouped by gene similarity. It is backwards compatible with the existing Phamerator desktop software while providing an improved database creation workflow. Key features include a graphical user interface, validation of uploaded GenBank files, and the ability to import phages from existing databases, modify existing databases and queue multiple jobs. Source code and installation instructions for Linux, Windows and Mac OSX are freely available at https://github.com/jglamine/phage. PhamDB is also distributed as a docker image which can be managed via Kitematic. This docker image contains the application and all third-party software dependencies as a pre-configured system, and is freely available via the installation instructions provided. Contact: snelesen@calvin.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
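
    Phamerator-style databases group phage genes into "phams" by sequence similarity. A minimal sketch of that grouping step using union-find over pairwise similarity scores (the gene names, scores, and threshold are invented; real pipelines derive similarities from alignment/clustering tools, not hand-entered values):

```python
# Union-find over gene identifiers: genes connected by a similarity edge
# above the threshold end up in the same pham.
parent = {}

def find(x):
    """Find the representative of x's pham, with path halving."""
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(a, b):
    """Merge the phams containing a and b."""
    parent[find(a)] = find(b)

similarities = [            # (gene, gene, similarity score in [0, 1])
    ("geneA", "geneB", 0.92),
    ("geneB", "geneC", 0.88),
    ("geneD", "geneE", 0.95),
    ("geneA", "geneD", 0.20),   # below threshold: no merge
]

THRESHOLD = 0.5
for a, b, score in similarities:
    find(a); find(b)            # register every gene, even unmerged ones
    if score >= THRESHOLD:
        union(a, b)

# Collect phams: genes grouped by their representative.
phams = {}
for gene in parent:
    phams.setdefault(find(gene), set()).add(gene)
pham_sets = sorted(map(sorted, phams.values()))
```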

  16. Computer Science and Technology: Modeling and Measurement Techniques for Evaluation of Design Alternatives in the Implementation of Database Management Software. Final Report.

    ERIC Educational Resources Information Center

    Deutsch, Donald R.

    This report describes a research effort that was carried out over a period of several years to develop and demonstrate a methodology for evaluating proposed Database Management System designs. The major proposition addressed by this study is embodied in the thesis statement: Proposed database management system designs can be evaluated best through…

  17. Database Management System

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In 1981 Wayne Erickson founded Microrim, Inc, a company originally focused on marketing a microcomputer version of RIM (Relational Information Manager). Dennis Comfort joined the firm and is now vice president, development. The team developed an advanced spinoff from the NASA system they had originally created, a microcomputer database management system known as R:BASE 4000. Microrim added many enhancements and developed a series of R:BASE products for various environments. R:BASE is now the second largest selling line of microcomputer database management software in the world.

  18. Addition of a breeding database in the Genome Database for Rosaceae

    PubMed Central


  19. Addition of a breeding database in the Genome Database for Rosaceae.

    PubMed

    Evans, Kate; Jung, Sook; Lee, Taein; Brutcher, Lisa; Cho, Ilhyung; Peace, Cameron; Main, Dorrie

    2013-01-01

    Breeding programs produce large datasets that require efficient management systems to keep track of performance, pedigree, geographical and image-based data. With the development of DNA-based screening technologies, more breeding programs perform genotyping in addition to phenotyping for performance evaluation. The integration of breeding data with other genomic and genetic data is instrumental for the refinement of marker-assisted breeding tools, enhances genetic understanding of important crop traits and maximizes access and utility by crop breeders and allied scientists. Development of new infrastructure in the Genome Database for Rosaceae (GDR) was designed and implemented to enable secure and efficient storage, management and analysis of large datasets from the Washington State University apple breeding program and subsequently expanded to fit datasets from other Rosaceae breeders. The infrastructure was built using the software Chado and Drupal, making use of the Natural Diversity module to accommodate large-scale phenotypic and genotypic data. Breeders can search accessions within the GDR to identify individuals with specific trait combinations. Results from Search by Parentage lists individuals with parents in common and results from Individual Variety pages link to all data available on each chosen individual including pedigree, phenotypic and genotypic information. Genotypic data are searchable by markers and alleles; results are linked to other pages in the GDR to enable the user to access tools such as GBrowse and CMap. This breeding database provides users with the opportunity to search datasets in a fully targeted manner and retrieve and compare performance data from multiple selections, years and sites, and to output the data needed for variety release publications and patent applications. The breeding database facilitates efficient program management. 
Storing publicly available breeding data in a database together with genomic and genetic data will further accelerate the cross-utilization of diverse data types by researchers from various disciplines. Database URL: http://www.rosaceae.org/breeders_toolbox PMID:24247530

  20. NBIC: National Ballast Information Clearinghouse

    Science.gov Websites

    Smithsonian Environmental Research Center / US Coast Guard. NBIC Database Manager: Tami Huber; Senior Analyst/Ecologist: Mark Minton; Data Managers: Ashley Arnwine, Jessica Hardee, Amanda Reynolds; Database Design and Application Programming: Paul Winterbauer

  1. Knowledge management for efficient quantitative analyses during regulatory reviews.

    PubMed

    Krudys, Kevin; Li, Fang; Florian, Jeffry; Tornoe, Christoffer; Chen, Ying; Bhattaram, Atul; Jadhav, Pravin; Neal, Lauren; Wang, Yaning; Gobburu, Joga; Lee, Peter I D

    2011-11-01

    Knowledge management comprises the strategies and methods employed to generate and leverage knowledge within an organization. This report outlines the activities within the Division of Pharmacometrics at the US FDA to effectively manage knowledge with the ultimate goal of improving drug development and advancing public health. The infrastructure required for pharmacometric knowledge management includes provisions for data standards, queryable databases, libraries of modeling tools, archiving of analysis results and reporting templates for effective communication. Two examples of knowledge management systems developed within the Division of Pharmacometrics are used to illustrate these principles. The benefits of sound knowledge management include increased productivity, allowing reviewers to focus on research questions spanning new drug applications, such as improved trial design and biomarker development. The future of knowledge management depends on the collaboration between the FDA and industry to implement data and model standards to enhance sharing and dissemination of knowledge.

  2. Perceived Self-Efficacy: A Concept Analysis for Symptom Management in Patients With Cancer.

    PubMed

    White, Lynn L; Cohen, Marlene Z; Berger, Ann M; Kupzyk, Kevin A; Swore-Fletcher, Barbara A; Bierman, Philip J

    2017-12-01

    Perceived self-efficacy (PSE) for symptom management plays a key role in outcomes for patients with cancer, such as quality of life, functional status, symptom distress, and healthcare use. Definition of the concept is necessary for use in research and to guide the development of interventions to facilitate PSE for symptom management in patients with cancer. This analysis will describe the concept of PSE for symptom management in patients with cancer. A database search was performed for related publications from 2006-2016. Landmark publications published prior to 2006 that informed the concept analysis were included. Greater PSE for symptom management predicts improved performance outcomes, including functional health status, cognitive function, and disease status. Clarification of the concept of PSE for symptom management will accelerate the progress of self-management research and allow for comparison of research data and intervention development.

  3. AGRICULTURAL BEST MANAGEMENT PRACTICE EFFECTIVENESS DATABASE

    EPA Science Inventory

    Resource Purpose:The Agricultural Best Management Practice Effectiveness Database contains the results of research projects which have collected water quality data for the purpose of determining the effectiveness of agricultural management practices in reducing pollutants ...

  4. Software Tools Streamline Project Management

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard, Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. 
The third software invention, Query-Based Document Management (QBDM), is a tool that enables content or context searches, either simple or hierarchical, across a variety of databases. The system enables users to specify notification subscriptions where they associate "contexts of interest" and "events of interest" to one or more documents or collection(s) of documents. Based on these subscriptions, users receive notification when the events of interest occur within the contexts of interest for associated document or collection(s) of documents. Users can also associate at least one notification time as part of the notification subscription, with at least one option for the time period of notifications.
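
    The subscription model described above (contexts of interest, events of interest, associated documents) can be sketched as a simple matcher. All names and data below are hypothetical illustrations, not taken from QBDM itself.

```python
# Hypothetical sketch of QBDM-style notification subscriptions: a user is
# notified when an "event of interest" occurs within a "context of interest"
# for a document the subscription is associated with.
subscriptions = [
    {"user": "alice", "documents": {"doc-1"},
     "contexts": {"budget"}, "events": {"updated"}},
    {"user": "bob", "documents": {"doc-1", "doc-2"},
     "contexts": {"budget", "schedule"}, "events": {"updated", "approved"}},
]

def matching_users(subs, document, context, event):
    """Users whose subscription covers this (document, context, event) triple."""
    return sorted(
        s["user"] for s in subs
        if document in s["documents"]
        and context in s["contexts"]
        and event in s["events"]
    )

print(matching_users(subscriptions, "doc-1", "budget", "updated"))
```

    A real system would persist these subscriptions in the database and evaluate the match on each document event, but the filtering logic is essentially this triple test.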

  5. The Watershed and River Systems Management Program: Decision Support for Water- and Environmental-Resource Management

    NASA Astrophysics Data System (ADS)

    Leavesley, G.; Markstrom, S.; Frevert, D.; Fulp, T.; Zagona, E.; Viger, R.

    2004-12-01

    Increasing demands for limited fresh-water supplies, and increasing complexity of water-management issues, present the water-resource manager with the difficult task of achieving an equitable balance of water allocation among a diverse group of water users. The Watershed and River System Management Program (WARSMP) is a cooperative effort between the U.S. Geological Survey (USGS) and the Bureau of Reclamation (BOR) to develop and deploy a database-centered, decision-support system (DSS) to address these multi-objective, resource-management problems. The decision-support system couples the USGS Modular Modeling System (MMS) with the BOR RiverWare tools using a shared relational database. MMS is an integrated system of computer software that provides a research and operational framework to support the development and integration of a wide variety of hydrologic and ecosystem models, and their application to water- and ecosystem-resource management. RiverWare is an object-oriented reservoir and river-system modeling framework developed to provide tools for evaluating and applying water-allocation and management strategies. The modeling capabilities of MMS and RiverWare include simulating watershed runoff, reservoir inflows, and the impacts of resource-management decisions on municipal, agricultural, and industrial water users, environmental concerns, power generation, and recreational interests. Forecasts of future climatic conditions are a key component in the application of MMS models to resource-management decisions. Forecast methods applied in MMS include a modified version of the National Weather Service's Extended Streamflow Prediction Program (ESP) and statistical downscaling from atmospheric models. The WARSMP DSS is currently operational in the Gunnison River Basin, Colorado; Yakima River Basin, Washington; Rio Grande Basin in Colorado and New Mexico; and Truckee River Basin in California and Nevada.

  6. Evaluation of relational and NoSQL database architectures to manage genomic annotations.

    PubMed

    Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard

    2016-12-01

    While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
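
    As a rough illustration of the storage/index/query timings the study compared, the sketch below uses SQLite (from Python's standard library) as a stand-in; the paper itself benchmarked MySQL, MongoDB and PostgreSQL, and the annotation schema here is a hypothetical simplification of a dbSNP-style table.

```python
import sqlite3
import time

# Stand-in experiment: time bulk load, index creation, and a range query
# over synthetic SNP annotations (rsid, chromosome, position, allele).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE snp (rsid TEXT, chrom TEXT, pos INTEGER, allele TEXT)")

rows = [(f"rs{i}", f"chr{i % 22 + 1}", (i * 13) % 1_000_000, "A/G")
        for i in range(50_000)]

t0 = time.perf_counter()
conn.executemany("INSERT INTO snp VALUES (?, ?, ?, ?)", rows)
load_s = time.perf_counter() - t0

t0 = time.perf_counter()
conn.execute("CREATE INDEX idx_pos ON snp (chrom, pos)")
index_s = time.perf_counter() - t0

t0 = time.perf_counter()
hits = conn.execute(
    "SELECT COUNT(*) FROM snp WHERE chrom = 'chr1' AND pos BETWEEN 0 AND 100000"
).fetchone()[0]
query_s = time.perf_counter() - t0

print(f"load={load_s:.3f}s index={index_s:.3f}s query={query_s:.4f}s hits={hits}")
```

    Repeating the same load/index/query cycle against each candidate engine, with identical data, is the essence of the comparison the abstract summarizes.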

  7. 'Isotopo' a database application for facile analysis and management of mass isotopomer data.

    PubMed

    Ahmed, Zeeshan; Zeeshan, Saman; Huber, Claudia; Hensel, Michael; Schomburg, Dietmar; Münch, Richard; Eylert, Eva; Eisenreich, Wolfgang; Dandekar, Thomas

    2014-01-01

    The composition of stable-isotope labelled isotopologues/isotopomers in metabolic products can be measured by mass spectrometry and supports the analysis of pathways and fluxes. As a prerequisite, the original mass spectra have to be processed, managed and stored to rapidly calculate, analyse and compare isotopomer enrichments to study, for instance, bacterial metabolism in infection. For such applications, we provide here the database application 'Isotopo'. This software package includes (i) a database to store and process isotopomer data, (ii) a parser to upload and translate different data formats for such data and (iii) an improved application to process and convert signal intensities from mass spectra of 13C-labelled metabolites such as tert-butyldimethylsilyl derivatives of amino acids. Relative mass intensities and isotopomer distributions are calculated applying a partial least squares method with iterative refinement for high-precision data. The data output includes formats such as graphs for overall enrichments in amino acids. The package is user-friendly for easy and robust data management of multiple experiments. The 'Isotopo' software is available at the following web link (section Download): http://spp1316.uni-wuerzburg.de/bioinformatics/isotopo/. The package contains three additional files: software executable setup (installer), one data set file (discussed in this article) and one Excel file (which can be used to convert data from Excel to '.iso' format). The 'Isotopo' software is compatible only with the Microsoft Windows operating system. © The Author(s) 2014. Published by Oxford University Press.
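
    A minimal sketch of the two calculations named in the abstract: normalising raw peak intensities to an isotopomer distribution, and deriving an overall enrichment from it. The intensities below are invented, and the sketch deliberately omits the natural-abundance correction and the partial-least-squares refinement that Isotopo itself performs.

```python
def isotopomer_distribution(intensities):
    """Normalise raw m+0..m+n peak intensities to fractional abundances."""
    total = sum(intensities)
    return [x / total for x in intensities]

def overall_enrichment(fractions):
    """Mean fraction of labelled carbon: sum of i * f_i over n, for m+0..m+n."""
    n = len(fractions) - 1
    return sum(i * f for i, f in enumerate(fractions)) / n

# Hypothetical m+0..m+3 intensities for a 3-carbon fragment.
fracs = isotopomer_distribution([700.0, 150.0, 100.0, 50.0])
print([round(f, 3) for f in fracs])         # [0.7, 0.15, 0.1, 0.05]
print(round(overall_enrichment(fracs), 4))  # 0.1667
```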

  8. A Database of Historical Information on Landslides and Floods in Italy

    NASA Astrophysics Data System (ADS)

    Guzzetti, F.; Tonelli, G.

    2003-04-01

    For the past 12 years we have maintained and updated a database of historical information on landslides and floods in Italy, known as the National Research Council's AVI (Damaged Urban Areas) Project archive. The database was originally designed to respond to a specific request of the Minister of Civil Protection, and was aimed at helping the regional assessment of landslide and flood risk in Italy. The database was first constructed in 1991-92 to cover the period 1917 to 1990. Information on damaging landslide and flood events was collected by searching archives, by screening thousands of newspaper issues, by reviewing the existing technical and scientific literature on landslides and floods in Italy, and by interviewing landslide and flood experts. The database was then updated chiefly through the analysis of hundreds of newspaper articles, and it now covers systematically the period 1900 to 1998, and non-systematically the periods 1900 to 1916 and 1999 to 2002. Non-systematic information on landslide and flood events older than the 20th century is also present in the database. The database currently contains information on more than 32,000 landslide events that occurred at more than 25,700 sites, and on more than 28,800 flood events that occurred at more than 15,600 sites. After a brief outline of the history and evolution of the AVI Project archive, we present and discuss: (a) the present structure of the database, including the hardware and software solutions adopted to maintain, manage, use and disseminate the information stored in the database, (b) the type and amount of information stored in the database, including an estimate of its completeness, and (c) examples of recent applications of the database, including a web-based GIS system to show the location of sites historically affected by landslides and floods, and an estimate of geo-hydrological (i.e., landslide and flood) risk in Italy based on the available historical information.

  9. Geodata Modeling and Query in Geographic Information Systems

    NASA Technical Reports Server (NTRS)

    Adam, Nabil

    1996-01-01

    Geographic information systems (GIS) deal with collecting, modeling, managing, analyzing, and integrating spatial (locational) and non-spatial (attribute) data required for geographic applications. Examples of spatial data are digital maps, administrative boundaries, road networks, and those of non-spatial data are census counts, land elevations and soil characteristics. GIS shares common areas with a number of other disciplines such as computer-aided design, computer cartography, database management, and remote sensing. None of these disciplines, however, can by itself fully meet the requirements of a GIS application. Examples of such requirements include: the ability to use locational data to produce high-quality plots, perform complex operations such as network analysis, enable spatial searching and overlay operations, support spatial analysis and modeling, and provide data management functions such as efficient storage, retrieval, and modification of large datasets; independence, integrity, and security of data; and concurrent access by multiple users. It is on the data management issues that we devote our discussions in this monograph. Traditionally, database management technology has been developed for business applications. Such applications require, among other things, capturing the data requirements of high-level business functions and developing machine-level implementations; supporting multiple views of data and yet providing integration that would minimize redundancy and maintain data integrity and security; providing a high-level language for data definition and manipulation; allowing concurrent access to multiple users; and processing user transactions in an efficient manner. The demands on database management systems have been for speed, reliability, efficiency, cost effectiveness, and user-friendliness. 
Significant progress has been made in all of these areas over the last two decades, to the point that many generalized database platforms are now available for developing data-intensive applications that run in real time. While continuous improvement is still being made at a very fast and competitive pace, new application areas such as computer-aided design, image processing, VLSI design, and GIS have been identified by many as the next generation of database applications. These new application areas pose serious challenges to the currently available database technology. At the core of these challenges is the nature of the data that is manipulated. In traditional database applications, the database objects do not have any spatial dimension, and as such, can be thought of as point data in a multi-dimensional space. For example, each instance of an entity EMPLOYEE will have a unique value corresponding to every attribute such as employee id, employee name, employee address and so on. Thus, every EMPLOYEE instance can be thought of as a point in a multi-dimensional space where each dimension is represented by an attribute. Furthermore, all operations on such data are one-dimensional. Thus, users may retrieve all entities satisfying one or more constraints. Examples of such constraints include employees with addresses in a certain area code, or salaries within a certain range. Even though constraints can be specified on multiple attributes (dimensions), the search for such data is essentially orthogonal across these dimensions.
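
    The "point in attribute space" view described above can be illustrated with a toy orthogonal query: one independent test per attribute dimension. The records and constraints below are hypothetical.

```python
# Each EMPLOYEE instance is a "point" in attribute space; a conventional
# query is an orthogonal (per-attribute) equality or range filter.
employees = [
    {"id": 1, "name": "Ada",  "area_code": "201", "salary": 62000},
    {"id": 2, "name": "Ben",  "area_code": "305", "salary": 48000},
    {"id": 3, "name": "Cleo", "area_code": "201", "salary": 71000},
]

def orthogonal_query(records, **constraints):
    """Filter records attribute by attribute; each constraint is independent."""
    def matches(rec):
        return all(test(rec[attr]) for attr, test in constraints.items())
    return [r for r in records if matches(r)]

found = orthogonal_query(
    employees,
    area_code=lambda v: v == "201",
    salary=lambda v: 50000 <= v <= 80000,
)
print([r["name"] for r in found])  # ['Ada', 'Cleo']
```

    Spatial data breaks this model because an object's extent (a road, a boundary polygon) cannot be reduced to a single coordinate per dimension, which is why GIS needs dedicated spatial indexing and search operations.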

  10. Copyright, Licensing Agreements and Gateways.

    ERIC Educational Resources Information Center

    Elias, Arthur W.

    1990-01-01

    Discusses technological developments in information distribution and management in relation to concepts of ownership. A historical overview of the concept of copyright is presented; licensing elements for databases are examined; and implications for gateway systems are explored, including ownership, identification of users, and allowable uses of…

  11. UNIX: A Tool for Information Management.

    ERIC Educational Resources Information Center

    Frey, Dean

    1989-01-01

    Describes UNIX, a computer operating system that supports multi-task and multi-user operations. Characteristics that make it especially suitable for library applications are discussed, including a hierarchical file structure and utilities for text processing, database activities, and bibliographic work. Sources of information on hardware…

  12. Ureteral endometriosis: A systematic literature review

    PubMed Central

    Palla, Viktoria-Varvara; Karaolanis, Georgios; Katafigiotis, Ioannis; Anastasiou, Ioannis

    2017-01-01

    Introduction: Ureteral endometriosis is a rare disease affecting women of childbearing age which presents with nonspecific symptoms and may result in severe morbidity. The aim of this study was to review evidence about the incidence, pathogenesis, clinical presentation, diagnosis, and management of ureteral endometriosis. Materials and Methods: The PubMed Central database was searched to identify studies reporting cases of ureteral endometriosis. “Ureter” or “Ureteral” and “Endometriosis” were used as keywords. The database was searched for articles published since 1996, in English, without restrictions regarding the study design. Results: From 420 studies obtained through the database search, 104 articles were finally included in this review, comprising a total of 1384 patients with ureteral endometriosis. Data regarding age, location, pathological findings, and interventions were extracted. The mean patient age was 38.6 years, whereas the therapeutic arsenal included hormonal, endoscopic, and/or surgical treatment. Conclusions: Ureteral endometriosis represents a diagnostic and therapeutic challenge for clinicians, and high clinical suspicion is needed to identify it. PMID:29021650

  13. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS FISH AND... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  14. A web based relational database management system for filariasis control

    PubMed Central

    Murty, Upadhyayula Suryanarayana; Kumar, Duvvuri Venkata Rama Satya; Sriram, Kumaraswamy; Rao, Kadiri Madhusudhan; Bhattacharyulu, Chakravarthula Hayageeva Narasimha Venakata; Praveen, Bhoopathi; Krishna, Amirapu Radha

    2005-01-01

    The present study describes an RDBMS (relational database management system) for the effective management of filariasis, a vector-borne disease. Filariasis infects 120 million people from 83 countries. The possible re-emergence of the disease and the complexity of existing control programs warrant the development of new strategies. A database containing comprehensive data associated with filariasis finds utility in disease control. We have developed a database containing information on the socio-economic status of patients, mosquito collection procedures, mosquito dissection data, filariasis survey reports and mass blood data. The database can be searched using a user-friendly web interface. Availability: http://www.webfil.org (login and password can be obtained from the authors) PMID:17597846

  15. An integrative review on conflict management styles among nursing students: Implications for nurse education.

    PubMed

    Labrague, Leodoro J; McEnroe-Petitte, Denise M

    2017-12-01

    Nurse education plays a critical role in the achievement of conflict management skills in nursing students. However, a wider perspective on this concept has not been explored. This paper is a report of a review appraising and synthesizing existing empirical studies describing conflict management styles among nursing students. An integrative review method guided this review. Five (5) bibliographic databases (CINAHL, Medline, Psych Info, Embase and SCOPUS) were searched to locate relevant articles. An electronic database search was performed in December 2016 to locate studies published from 2007 onwards. The search words included: 'conflict', 'management resolution', 'management style', 'management strategy', 'nursing', 'student'. Thirteen (13) articles met the inclusion criteria. Nursing students preferred 'constructive/positive conflict management styles' when handling conflicts. However, more studies are needed to identify factors that may affect their choice of styles. Further, this review emphasizes the need for empirical studies using rigorous methods to identify appropriate interventions that would effectively enhance nursing students' skills in managing conflicts. Nursing faculty play a critical role in teaching, training, and modeling constructive conflict resolution styles in nursing students. Simulation scenarios, reflective exercises, and role playing may be useful to facilitate such learning in choosing constructive conflict management styles. A structured training programme on conflict management will assist nursing students in developing positive conflict management styles. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. A database system for characterization of munitions items in conventional ammunition demilitarization stockpiles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chun, K.C.; Chiu, S.Y.; Ditmars, J.D.

    1994-05-01

    The MIDAS (Munition Items Disposition Action System) database system is an electronic data management system capable of storage and retrieval of information on the detailed structures and material compositions of munitions items designated for demilitarization. The types of such munitions range from bulk propellants and small arms to projectiles and cluster bombs. The database system is also capable of processing data on the quantities of inert, PEP (propellant, explosives and pyrotechnics) and packaging materials associated with munitions, components, or parts, and the quantities of chemical compounds associated with parts made of PEP materials. Development of the MIDAS database system has been undertaken by the US Army to support disposition of unwanted ammunition stockpiles. The inventory of such stockpiles currently includes several thousand items, which total tens of thousands of tons, and is still growing. Providing systematic procedures for disposing of all unwanted conventional munitions is the mission of the MIDAS Demilitarization Program. To carry out this mission, all munitions listed in the Single Manager for Conventional Ammunition inventory must be characterized, and alternatives for resource recovery and recycling and/or disposal of munitions in the demilitarization inventory must be identified.

  17. A radiology department intranet: development and applications.

    PubMed

    Willing, S J; Berland, L L

    1999-01-01

    An intranet is a "private Internet" that uses the protocols of the World Wide Web to share information resources within a company or with the company's business partners and clients. The hardware requirements for an intranet begin with a dedicated Web server permanently connected to the departmental network. The heart of a Web server is the hypertext transfer protocol (HTTP) service, which receives a page request from a client's browser and transmits the page back to the client. Although knowledge of hypertext markup language (HTML) is not essential for authoring a Web page, a working familiarity with HTML is useful, as is knowledge of programming and database management. Security can be ensured by using scripts to write information in hidden fields or by means of "cookies." Interfacing databases and database management systems with the Web server and conforming the user interface to HTML syntax can be achieved by means of the common gateway interface (CGI), Active Server Pages (ASP), or other methods. An intranet in a radiology department could include the following types of content: on-call schedules, work schedules and a calendar, a personnel directory, resident resources, memorandums and discussion groups, software for a radiology information system, and databases.
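
    A minimal sketch of the database-to-HTML step that such a CGI script performs: query the departmental database, then emit an escaped HTML page for the browser. SQLite stands in for the department's database system, and the table and staff names are hypothetical.

```python
import sqlite3
from html import escape

# Hypothetical on-call table, standing in for the departmental database
# that a CGI handler would query before returning HTML to the client.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE oncall (day TEXT, radiologist TEXT)")
conn.executemany("INSERT INTO oncall VALUES (?, ?)",
                 [("Mon", "Dr. Adams"), ("Tue", "Dr. Baker")])

def oncall_page(conn):
    """Render the on-call schedule as the HTML a CGI script would emit."""
    rows = conn.execute(
        "SELECT day, radiologist FROM oncall ORDER BY day").fetchall()
    body = "".join(
        f"<tr><td>{escape(d)}</td><td>{escape(r)}</td></tr>" for d, r in rows
    )
    return f"<html><body><table>{body}</table></body></html>"

page = oncall_page(conn)
print("Dr. Adams" in page)  # True
```

    Escaping every value pulled from the database is the small but essential security habit the article's discussion of scripts and hidden fields points toward.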

  18. Recurrent urinary tract infections in women.

    PubMed

    Aydin, Abdullatif; Ahmed, Kamran; Zaman, Iftikhar; Khan, Muhammad Shamim; Dasgupta, Prokar

    2015-06-01

    Recurrent urinary tract infections (UTIs) are more common in women and are frequently defined as ≥2 episodes in the last 6 months or ≥3 episodes in the last 12 months. In a primary care setting, 53 % of women above the age of 55 years and 36 % of younger women report a recurrence within 1 year. Thus, management and prevention of recurrent UTI is of utmost significance. This review aims to highlight the latest research in prevention strategies and suggest a management pathway. A search was conducted on the MEDLINE, Embase and Cochrane Database of Systematic Reviews databases for the latest systematic reviews and high-quality randomized controlled trials. Special emphasis was placed on the term "recurrent", which was strictly adhered to. Furthermore, a Google search was conducted for current guidelines on the management of UTIs. Current prevention strategies include eliminating risk factors that increase the risk of acquiring recurrent UTI, as well as continuous, post-coital and self-initiated antimicrobial prophylaxis. Other prospective preventative strategies, currently under trial, include the use of vaccinations, D-mannose and lactobacillus (probiotics). Although risk factors should be identified and addressed accordingly, individualized antibiotic prophylaxis remains the most effective method of management. Non-antibiotic prevention strategies such as cranberry, vitamin C and methenamine salts lack strong evidence to be introduced as routine management options and as alternatives to antibiotics. Based on current evidence and guidelines, a management pathway is recommended. Emerging therapies require further evaluation before they can be recommended.

  19. The design and implementation of hydrographical information management system (HIMS)

    NASA Astrophysics Data System (ADS)

    Sui, Haigang; Hua, Li; Wang, Qi; Zhang, Anming

    2005-10-01

    With the development of hydrographical work and information techniques, a large variety of hydrographical information, including electronic charts, documents and other materials, is widely used, and the traditional management mode and techniques are unsuitable for the development of the Chinese Marine Safety Administration Bureau (CMSAB). How to manage all kinds of hydrographical information has become an important and urgent problem. A number of advanced techniques, including GIS, RS, spatial database management and VR, are introduced to solve these problems. Some design principles and key techniques of the HIMS, including a mixed mode based on B/S, C/S and stand-alone architectures, multi-source and multi-scale data organization and management, multi-source data integration, diverse visualization of digital charts, and efficient security control strategies, are illustrated in detail. Based on the above ideas and strategies, an integrated system named the Hydrographical Information Management System (HIMS) was developed. The HIMS has been applied in the Shanghai Marine Safety Administration Bureau, where it has received positive evaluations.

  20. Alternatives to relational database: comparison of NoSQL and XML approaches for clinical data storage.

    PubMed

    Lee, Ken Ka-Yin; Tang, Wai-Choi; Choi, Kup-Sze

    2013-04-01

    Clinical data are dynamic in nature, often arranged hierarchically and stored as free text and numbers. Effective management of clinical data, and the transformation of the data into a structured format for data analysis, are therefore challenging issues in electronic health records development. Despite the popularity of relational databases, the scalability of the NoSQL database model and the document-centric data structure of XML databases appear to be promising features for effective clinical data management. In this paper, three database approaches--NoSQL, XML-enabled and native XML--are investigated to evaluate their suitability for structured clinical data. The database query performance is reported, together with our experience in developing the databases. The results show that the NoSQL database is the best choice for query speed, whereas XML databases are advantageous in terms of scalability, flexibility and extensibility, which are essential to cope with the characteristics of clinical data. While NoSQL and XML technologies are relatively new compared to the conventional relational database, both demonstrate potential to become a key database technology for clinical data management as the technology further advances. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
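    The document-centric structures the paper compares can be sketched with standard-library tools; the record fields below are hypothetical illustrations, not the study's actual clinical schema:

```python
import json
import xml.etree.ElementTree as ET

# A hierarchical clinical record as a schema-free JSON document
# (the style a NoSQL document store would hold).
record = {
    "patient_id": "P001",
    "encounters": [
        {"date": "2012-05-01",
         "vitals": {"pulse": 72, "bp": "120/80"},
         "notes": "free text observation"}
    ],
}
doc = json.dumps(record)

# The same record as XML, the form a native XML database would index.
root = ET.Element("patient", id="P001")
enc = ET.SubElement(root, "encounter", date="2012-05-01")
ET.SubElement(enc, "vitals", pulse="72", bp="120/80")
ET.SubElement(enc, "notes").text = "free text observation"

# Both forms preserve the nesting; only the retrieval idiom differs.
pulse_json = json.loads(doc)["encounters"][0]["vitals"]["pulse"]
pulse_xml = int(root.find("./encounter/vitals").get("pulse"))
```

Either representation keeps the hierarchy intact without forcing the free-text and numeric fields into a fixed relational layout.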

  1. Management Guidelines for Database Developers' Teams in Software Development Projects

    NASA Astrophysics Data System (ADS)

    Rusu, Lazar; Lin, Yifeng; Hodosi, Georg

    The worldwide job market for database developers (DBDs) has been growing continually over the last several years. In some companies, DBDs are organized as a special team (DBDs team) to support other projects and roles. As a new role, the DBDs team faces a major problem: no management guidelines exist for it. The team manager does not know which kinds of tasks should be assigned to this team or what practices should be used during the DBDs' work. In this paper we therefore develop a set of management guidelines, comprising 8 fundamental tasks and 17 practices from the software development process, using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve the DBDs team's work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could also be very useful for other companies that use a DBDs team, and could contribute towards increasing the efficiency of these teams in their work on software development projects.

  2. The measurement of quality of care in the Veterans Health Administration.

    PubMed

    Halpern, J

    1996-03-01

    The Veterans Health Administration (VHA) is committed to continual refinement of its system of quality measurement. The VHA organizational structure for quality measurement has three levels. At the national level, the Associate Chief Medical Director for Quality Management provides leadership, sets policy, furnishes measurement tools, develops and distributes measures of quality, and delivers educational programs. At the intermediate level, VHA has four regional offices with staff responsible for reviewing risk management data, investigating quality problems, and ensuring compliance with accreditation requirements. At the hospital level, staff reporting directly to the chief of staff or the hospital director are responsible for implementing VHA quality management policy. The Veterans Health Administration's philosophy of quality measurement recognizes the agency's moral imperative to provide America's veterans with care that meets accepted standards. Because the repair of faulty systems is more efficient than the identification of poor performers, VHA has integrated the techniques of total quality into a multifaceted improvement program that also includes the accreditation program and traditional quality assurance activities. VHA monitors its performance by maintaining adverse incident databases, conducting patient satisfaction surveys, contracting for external peer review of 50,000 records per year, and comparing process and outcome rates internally and, when possible, with external benchmarks. The near-term objectives of VHA include providing medical centers with a quality matrix that will permit local development of quality indicators, constructing a report card for VHA's customers, and implementing the Malcolm W. Baldrige system for quality improvement as the road map for systemwide continuous improvement. Other goals include providing greater access to data, creating a patient-centered database, providing real-time clinical decision support, and expanding the databases.

  3. Analysis and preliminary design of Kunming land use and planning management information system

    NASA Astrophysics Data System (ADS)

    Li, Li; Chen, Zhenjie

    2007-06-01

    This article analyzes the Kunming land use planning and management information system from the perspectives of system building objectives and requirements, and identifies the system's users, functional requirements and construction requirements. On this basis, a three-tier system architecture based on C/S and B/S is defined: the user interface layer, the business logic layer and the data services layer. According to the requirements for the construction of a land use planning and management information database, derived from standards of the Ministry of Land and Resources and the construction program of the Golden Land Project, this paper divides the system databases into a planning document database, a planning implementation database, a working map database and a system maintenance database. In the design of the system interface, various methods and data formats are used for data transmission and sharing between upper and lower levels. According to the system analysis results, the main modules of the system are designed as follows: planning data management; planning and annual plan preparation and control; day-to-day planning management; planning revision management; decision-making support; thematic inquiry statistics; planning public participation; and so on. In addition, the system realization technologies are discussed from the perspectives of the system operation mode, the development platform and other aspects.

  4. A personal digital assistant application (MobilDent) for dental fieldwork data collection, information management and database handling.

    PubMed

    Forsell, M; Häggström, M; Johansson, O; Sjögren, P

    2008-11-08

    To develop a personal digital assistant (PDA) application for oral health assessment fieldwork, including back-office and database systems (MobilDent). System design, construction and implementation of PDA, back-office and database systems. System requirements for MobilDent were collected, analysed and translated into system functions. User interfaces were implemented and the system architecture was outlined. MobilDent was based on a platform with .NET (Microsoft) components, using an SQL Server 2005 (Microsoft) database for data storage and the Windows Mobile (Microsoft) operating system. The PDA devices were Dell Axim. System functions and user interfaces were specified for MobilDent. User interfaces for the PDA, back-office and database systems were based on .NET programming. The PDA user interface was based on Windows Mobile, suited to a PDA display, whereas the back-office interface was designed for a normal-sized computer screen. A synchronisation module (MS ActiveSync, Microsoft) was used to enable download of field data from the PDA to the database. MobilDent is a feasible application for oral health assessment fieldwork, and the oral health assessment database may prove a valuable source for care planning, educational and research purposes. Further development of the MobilDent system will include wireless connectivity with download-on-demand technology.

  5. Semantic-JSON: a lightweight web service interface for Semantic Web contents integrating multiple life science databases.

    PubMed

    Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro

    2011-07-01

    As global cloud frameworks for bioinformatics research databases become huge and heterogeneous, solutions face competing challenges of cross-integration, retrieval, security and openness. To address this, as of March 2011 organizations including RIKEN had published 192 mammalian, plant and protein life sciences databases holding 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data this database integration framework covers is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools such as SPARQL. Actual research also requires specialized tools for data analysis using raw original data. To address these challenges, in December 2009 we developed the lightweight Semantic-JSON interface to access each fragment of linked and raw life sciences data securely, under the control of programming languages popular among bioinformaticians such as Perl and Ruby. Researchers have successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing and human-readable contents such as ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org.

  6. Serials Management by Microcomputer: The Potential of DBMS.

    ERIC Educational Resources Information Center

    Vogel, J. Thomas; Burns, Lynn W.

    1984-01-01

    Describes serials management at the Philadelphia College of Textiles and Science library via a microcomputer, a file manager called PFS, and a relational database management system called dBase II. Check-in procedures, programming with dBase II, "static" and "active" databases, and claim procedures are discussed. Check-in forms are…

  7. A Multimodal Database for a Home Remote Medical Care Application

    NASA Astrophysics Data System (ADS)

    Medjahed, Hamid; Istrate, Dan; Boudy, Jerome; Steenkeste, François; Baldinger, Jean-Louis; Dorizzi, Bernadette

    Home remote monitoring systems aim to make a protective contribution to the well-being of individuals (patients, elderly persons) who require moderate amounts of support for independent living, and to improve their everyday lives. Existing research on these systems suffers from a lack of experimental data and of a standard medical database for their validation and improvement. This paper presents a multi-sensor environment for acquiring and recording a multimodal medical database, which includes physiological data (cardiac frequency, activity or agitation, posture, fall), environmental sounds and localization data. It provides graphical interface functions to manage, process and index these data. The paper focuses on the system implementation and its usage, and points out possibilities for future work.

  8. Some Reliability Issues in Very Large Databases.

    ERIC Educational Resources Information Center

    Lynch, Clifford A.

    1988-01-01

    Describes the unique reliability problems of very large databases that necessitate specialized techniques for hardware problem management. The discussion covers the use of controlled partial redundancy to improve reliability, issues in operating systems and database management systems design, and the impact of disk technology on very large…

  9. Tufts Health Sciences Database: Lessons, Issues, and Opportunities.

    ERIC Educational Resources Information Center

    Lee, Mary Y.; Albright, Susan A.; Alkasab, Tarik; Damassa, David A.; Wang, Paul J.; Eaton, Elizabeth K.

    2003-01-01

    Describes a seven-year experience with developing the Tufts Health Sciences Database, a database-driven information management system that combines the strengths of a digital library, content delivery tools, and curriculum management. Identifies major effects on teaching and learning. Also addresses issues of faculty development, copyright and…

  10. ExtraTrain: a database of Extragenic regions and Transcriptional information in prokaryotic organisms

    PubMed Central

    Pareja, Eduardo; Pareja-Tobes, Pablo; Manrique, Marina; Pareja-Tobes, Eduardo; Bonal, Javier; Tobes, Raquel

    2006-01-01

    Background Transcriptional regulation processes are the principal mechanisms of adaptation in prokaryotes. In these processes, the key elements involved are the regulatory proteins and the regulatory DNA signals located in extragenic regions. As all extragenic spaces are putative regulatory regions, ExtraTrain covers all extragenic regions of available genomes and regulatory proteins from bacteria and archaea included in the UniProt database. Description ExtraTrain provides integrated and easily manageable information for 679816 extragenic regions and for the genes delimiting each of them. In addition, ExtraTrain supplies a tool to explore extragenic regions, named Palinsight, oriented to detecting and searching for palindromic patterns. This interactive visual tool is fully integrated in the database, allowing the search for regulatory signals in user-defined sets of extragenic regions. The 26046 regulatory proteins included in ExtraTrain belong to the families AraC/XylS, ArsR, AsnC, Cold shock domain, CRP-FNR, DeoR, GntR, IclR, LacI, LuxR, LysR, MarR, MerR, NtrC/Fis, OmpR and TetR. The database follows the InterPro criteria to define these families. The information about regulators includes manually curated sets of references specifically associated with regulator entries. In order to achieve a sustainable and maintainable knowledge database, ExtraTrain is a platform open to the contribution of knowledge by the scientific community, providing a system for the incorporation of textual knowledge. Conclusion ExtraTrain is a new database for exploring Extragenic regions and Transcriptional information in bacteria and archaea. The ExtraTrain database is available at . PMID:16539733
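    The palindromic patterns a tool like Palinsight targets are, in DNA, subsequences equal to their own reverse complement; the scanning sketch below is only an illustration of that idea, not the tool's actual algorithm:

```python
# Translation table for complementing DNA bases.
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq: str) -> str:
    """Reverse complement of a DNA sequence."""
    return seq.translate(COMP)[::-1]

def find_palindromes(seq: str, length: int = 6):
    """Yield (position, subsequence) for every window of the given length
    that equals its own reverse complement (a common regulatory signal)."""
    for i in range(len(seq) - length + 1):
        window = seq[i:i + length]
        if window == revcomp(window):
            yield i, window

# GAATTC (the EcoRI site) is a classic reverse-complement palindrome.
hits = list(find_palindromes("TTGAATTCAA"))
```

Scanning a user-defined set of extragenic regions this way reduces the search for candidate regulatory signals to a simple window comparison per position.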

  11. Agroclimate.Org: Tools and Information for a Climate Resilient Agriculture in the Southeast USA

    NASA Astrophysics Data System (ADS)

    Fraisse, C.

    2014-12-01

    AgroClimate (http://agroclimate.org) is a web-based system developed to help the agricultural industry in the southeastern USA reduce risks associated with climate variability and change. It includes climate-related information and dynamic application tools that interact with a climate and crop database system. Information available includes climate monitoring and forecasts, combined with information about crop management practices that help increase the resiliency of the agricultural industry in the region. Recently we have added smartphone apps to the AgroClimate suite of tools, including irrigation management and crop disease alert systems. Decision support tools available in AgroClimate include: (a) Climate risk: expected (probabilistic) and historical climate information and freeze risk; (b) Crop yield risk: expected yield based on soil type, planting date, and basic management practices for selected commodities, plus historical county yield databases; (c) Crop diseases: disease risk monitoring and forecasting for strawberry and citrus; (d) Crop development: monitoring and forecasting of growing degree-days and chill accumulation; (e) Drought: monitoring and forecasting of selected drought indices; (f) Footprints: carbon and water footprint calculators. The system also provides background information about the main drivers of climate variability and basic information about climate change in the Southeast USA. AgroClimate has been widely used as an educational tool by the Cooperative Extension Services in the region and also by producers. It is now being replicated internationally, with versions implemented in Mozambique and Paraguay.
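    As a small illustration of the crop-development tools mentioned above, growing degree-days are conventionally computed by the averaging method; the sketch below uses that standard formula with an assumed base temperature, not AgroClimate's actual implementation:

```python
def growing_degree_days(tmax, tmin, t_base=10.0):
    """Averaging method: GDD = max(0, (Tmax + Tmin)/2 - Tbase), degrees C.
    The base temperature is crop-specific; 10 C is a common default."""
    return max(0.0, (tmax + tmin) / 2.0 - t_base)

# Three illustrative days of daily max/min temperatures (degrees C).
daily = [(28.0, 16.0), (30.0, 18.0), (25.0, 12.0)]
season_gdd = sum(growing_degree_days(hi, lo) for hi, lo in daily)  # 34.5
```

Accumulating this quantity over the season is what lets such tools forecast crop development stages from temperature records alone.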

  12. Database on Demand: insight how to build your own DBaaS

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, Ruben; Coterillo Coz, Ignacio

    2015-12-01

    At CERN, a number of key database applications run on user-managed MySQL, PostgreSQL and Oracle database services. The Database on Demand (DBoD) project was born out of an idea to provide the CERN user community with an environment in which to develop and run database services, as a complement to the central Oracle-based database service. Database on Demand empowers users to perform certain actions that had traditionally been done by database administrators, providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines; at present, offerings from three major RDBMS (relational database management system) vendors are available. In this article we show the status of the service after almost three years of operations, give some insight into our software redesign, and outline the near-future evolution.

  13. Development of a Comprehensive Database System for Safety Analyst

    PubMed Central

    Paz, Alexander; Veeramisti, Naveen; Khanal, Indira; Baker, Justin

    2015-01-01

    This study addressed barriers associated with the use of Safety Analyst, a state-of-the-art tool that has been developed to assist during the entire Traffic Safety Management process but that is not widely used due to a number of challenges as described in this paper. As part of this study, a comprehensive database system and tools to provide data to multiple traffic safety applications, with a focus on Safety Analyst, were developed. A number of data management tools were developed to extract, collect, transform, integrate, and load the data. The system includes consistency-checking capabilities to ensure the adequate insertion and update of data into the database. This system focused on data from roadways, ramps, intersections, and traffic characteristics for Safety Analyst. To test the proposed system and tools, data from Clark County, which is the largest county in Nevada and includes the cities of Las Vegas, Henderson, Boulder City, and North Las Vegas, was used. The database and Safety Analyst together help identify the sites with the potential for safety improvements. Specifically, this study examined the results from two case studies. The first case study, which identified sites having a potential for safety improvements with respect to fatal and all injury crashes, included all roadway elements and used default and calibrated Safety Performance Functions (SPFs). The second case study identified sites having a potential for safety improvements with respect to fatal and all injury crashes, specifically regarding intersections; it used default and calibrated SPFs as well. Conclusions were developed for the calibration of safety performance functions and the classification of site subtypes. Guidelines were provided about the selection of a particular network screening type or performance measure for network screening. PMID:26167531
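    The consistency-checking capabilities described above amount to a validation pass before data are inserted into the database; the field names and plausibility rules in this sketch are hypothetical, not the system's actual schema:

```python
def validate_segment(seg):
    """Return a list of consistency errors for a roadway-segment record;
    an empty list means the record may be loaded. Rules are illustrative."""
    errors = []
    if seg.get("length_mi", 0) <= 0:
        errors.append("non-positive segment length")
    if not 0 <= seg.get("aadt", -1) <= 500_000:
        errors.append("AADT outside plausible range")
    if seg.get("end_mp", 0) <= seg.get("begin_mp", 0):
        errors.append("end milepost not after begin milepost")
    return errors

segments = [
    {"id": 1, "length_mi": 0.5, "aadt": 42000, "begin_mp": 0.0, "end_mp": 0.5},
    {"id": 2, "length_mi": -1.0, "aadt": 9999999, "begin_mp": 2.0, "end_mp": 1.0},
]
clean = [s for s in segments if not validate_segment(s)]
rejected = {s["id"]: validate_segment(s) for s in segments if validate_segment(s)}
```

Rejecting records with a reason attached, rather than loading them silently, is what keeps downstream safety analyses (such as SPF calibration) trustworthy.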

  14. Insertion algorithms for network model database management systems

    NASA Astrophysics Data System (ADS)

    Mamadolimov, Abdurashid; Khikmat, Saburov

    2017-12-01

    The network model is a database model conceived as a flexible way of representing objects and their relationships. Its distinguishing feature is that the schema, viewed as a graph in which object types are nodes and relationship types are arcs, forms a partial order. When a database is large and query comparisons are expensive, the efficiency requirement for management algorithms is to minimize the number of query comparisons. We consider the updating operation for network model database management systems. We develop a new sequential algorithm for the updating operation, and we also suggest a distributed version of the algorithm.
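    The paper's algorithm itself is not reproduced in the abstract; the following is only a minimal sketch of the network-model idea it builds on, in which owner records are linked to member records through named sets (all record and set names here are invented):

```python
# Minimal network-model storage sketch (CODASYL style): owner records are
# linked to member records through named sets, and insertion means linking
# a stored member record into an owner's set.
class NetworkDB:
    def __init__(self):
        self.records = {}   # record id -> payload
        self.sets = {}      # (owner id, set name) -> list of member ids

    def store(self, rid, payload):
        self.records[rid] = payload

    def insert(self, owner, set_name, member):
        """Link an already-stored member record into an owner's set."""
        if owner not in self.records or member not in self.records:
            raise KeyError("both records must be stored first")
        self.sets.setdefault((owner, set_name), []).append(member)

    def members(self, owner, set_name):
        return [self.records[m] for m in self.sets.get((owner, set_name), [])]

db = NetworkDB()
db.store("dept1", {"name": "Survey"})
db.store("emp1", {"name": "Alice"})
db.insert("dept1", "EMPLOYS", "emp1")
```

In a real network DBMS the set arcs form the partially ordered schema the paper describes, and the cost of an insertion is dominated by the record comparisons needed to locate the right place in those sets.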

  15. A PATO-compliant zebrafish screening database (MODB): management of morpholino knockdown screen information.

    PubMed

    Knowlton, Michelle N; Li, Tongbin; Ren, Yongliang; Bill, Brent R; Ellis, Lynda Bm; Ekker, Stephen C

    2008-01-07

    The zebrafish is a powerful model vertebrate amenable to high-throughput in vivo genetic analyses. Examples include reverse genetic screens using morpholino knockdown, expression-based screening using enhancer trapping, and forward genetic screening using transposon insertional mutagenesis. We have created a database to facilitate web-based distribution of data from such genetic studies. The MOrpholino DataBase is a MySQL relational database with an online PHP interface. Multiple quality control levels allow differential access to data in raw and finished formats. MODBv1 includes sequence information relating to almost 800 morpholinos and their targets, together with phenotypic data regarding the dose effect of each morpholino (mortality, toxicity and defects). To improve the searchability of this database, we have incorporated a fixed-vocabulary defect ontology that allows morpholino effects to be organized by the anatomical structure affected and the defect produced. This also allows comparison between species using Phenotypic Attribute Trait Ontology (PATO) designated terminology. MODB is also cross-linked with ZFIN, allowing full searches between the two databases. MODB offers users the ability to retrieve morpholino data by the sequence of the morpholino or its target, the name of the target, the anatomical structure affected, or the defect produced. MODB data can be used for functional genomic analysis of morpholino design to maximize efficacy and minimize toxicity. MODB also serves as a template for future sequence-based functional genetic screen databases, and it is currently being used as a model for the creation of a mutagenic insertional transposon database.
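    A dose-effect retrieval of the kind the database supports can be sketched as follows; the table layout and values are simplified illustrations (shown here in SQLite), not MODB's actual MySQL schema:

```python
import sqlite3

# Hypothetical, simplified slice of a morpholino screen table.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE morpholino (
    mo_id INTEGER PRIMARY KEY,
    target_gene TEXT,
    dose_ng REAL,
    mortality_pct REAL,
    defect TEXT,
    structure TEXT)""")
rows = [
    (1, "ntl", 2.0, 5.0, "curved body axis", "notochord"),
    (2, "ntl", 8.0, 40.0, "curved body axis", "notochord"),
    (3, "shha", 4.0, 10.0, "u-shaped somites", "somite"),
]
con.executemany("INSERT INTO morpholino VALUES (?, ?, ?, ?, ?, ?)", rows)

# Retrieve the dose-effect series for one target gene, as a search by
# target name in the web interface would.
cur = con.execute(
    "SELECT dose_ng, mortality_pct FROM morpholino "
    "WHERE target_gene = ? ORDER BY dose_ng", ("ntl",))
dose_effect = cur.fetchall()
```

Indexing the defect and structure columns against a fixed vocabulary is what makes cross-species PATO comparisons possible on top of such a table.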

  16. NETMARK

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Koga, Dennis (Technical Monitor)

    2002-01-01

    This presentation discusses NASA's proposed NETMARK knowledge management tool, which aims 'to control and interoperate with every block in a document, email, spreadsheet, power point, database, etc. across the lifecycle'. Topics covered include: system software and hardware requirements, seamless information systems, computer architecture issues, and potential benefits to NETMARK users.

  17. An Online Library Catalogue.

    ERIC Educational Resources Information Center

    Alloro, Giovanna; Ugolini, Donatella

    1992-01-01

    Describes the implementation of an online catalog in the library of the National Institute for Cancer Research and the Clinical and Experimental Oncology Institute of the University of Genoa. Topics addressed include automation of various library functions, software features, database management, training, and user response. (10 references) (MES)

  18. Assistive Technology and Adults with Learning Disabilities: A Blueprint for Exploration and Advancement.

    ERIC Educational Resources Information Center

    Raskind, Marshall

    1993-01-01

    This article describes assistive technologies for persons with learning disabilities, including word processing, spell checking, proofreading programs, outlining/"brainstorming" programs, abbreviation expanders, speech recognition, speech synthesis/screen review, optical character recognition systems, personal data managers, free-form databases,…

  19. Leveraging Information Technology. Track VII: Outstanding Applications.

    ERIC Educational Resources Information Center

    CAUSE, Boulder, CO.

    Eight papers from the 1987 CAUSE conference's Track VII, Outstanding Applications, are presented. They include: "Image Databases in the University" (Reid Kaplan and Gordon Mathieson); "Using Information Technology for Travel Management at the University of Michigan" (Robert E. Russell and John C. Hufziger); "On-Line Access…

  20. Managing troubled data: Coastal data partnerships smooth data integration

    USGS Publications Warehouse

    Hale, S.S.; Hale, Miglarese A.; Bradley, M.P.; Belton, T.J.; Cooper, L.D.; Frame, M.T.; Friel, C.A.; Harwell, L.M.; King, R.E.; Michener, W.K.; Nicolson, D.T.; Peterjohn, B.G.

    2003-01-01

    Understanding the ecology, condition, and changes of coastal areas requires data from many sources. Broad-scale and long-term ecological questions, such as global climate change, biodiversity, and cumulative impacts of human activities, must be addressed with databases that integrate data from several different research and monitoring programs. Various barriers, including widely differing data formats, codes, directories, systems, and metadata used by individual programs, make such integration troublesome. Coastal data partnerships, by helping overcome technical, social, and organizational barriers, can lead to a better understanding of environmental issues, and may enable better management decisions. Characteristics of successful data partnerships include a common need for shared data, strong collaborative leadership, committed partners willing to invest in the partnership, and clear agreements on data standards and data policy. Emerging data and metadata standards that become widely accepted are crucial. New information technology is making it easier to exchange and integrate data. Data partnerships allow us to create broader databases than would be possible for any one organization to create by itself.

  1. Managing troubled data: coastal data partnerships smooth data integration.

    PubMed

    Hale, Stephen S; Miglarese, Anne Hale; Bradley, M Patricia; Belton, Thomas J; Cooper, Larry D; Frame, Michael T; Friel, Christopher A; Harwell, Linda M; King, Robert E; Michener, William K; Nicolson, David T; Peterjohn, Bruce G

    2003-01-01

    Understanding the ecology, condition, and changes of coastal areas requires data from many sources. Broad-scale and long-term ecological questions, such as global climate change, biodiversity, and cumulative impacts of human activities, must be addressed with databases that integrate data from several different research and monitoring programs. Various barriers, including widely differing data formats, codes, directories, systems, and metadata used by individual programs, make such integration troublesome. Coastal data partnerships, by helping overcome technical, social, and organizational barriers, can lead to a better understanding of environmental issues, and may enable better management decisions. Characteristics of successful data partnerships include a common need for shared data, strong collaborative leadership, committed partners willing to invest in the partnership, and clear agreements on data standards and data policy. Emerging data and metadata standards that become widely accepted are crucial. New information technology is making it easier to exchange and integrate data. Data partnerships allow us to create broader databases than would be possible for any one organization to create by itself.

  2. Using Statistics for Database Management in an Academic Library.

    ERIC Educational Resources Information Center

    Hyland, Peter; Wright, Lynne

    1996-01-01

    Collecting statistical data about database usage by library patrons aids in the management of CD-ROM and database offerings, collection development, and evaluation of training programs. Two approaches to data collection are presented which should be used together: an automated or nonintrusive method which monitors search sessions while the…

  3. Database Software Selection for the Egyptian National STI Network.

    ERIC Educational Resources Information Center

    Slamecka, Vladimir

    The evaluation and selection of information/data management system software for the Egyptian National Scientific and Technical (STI) Network are described. An overview of the state-of-the-art of database technology elaborates on the differences between information retrieval and database management systems (DBMS). The desirable characteristics of…

  4. Teaching Database Management System Use in a Library School Curriculum.

    ERIC Educational Resources Information Center

    Cooper, Michael D.

    1985-01-01

    Description of database management systems course being taught to students at School of Library and Information Studies, University of California, Berkeley, notes course structure, assignments, and course evaluation. Approaches to teaching concepts of three types of database systems are discussed and systems used by students in the course are…

  5. A Web application for the management of clinical workflow in image-guided and adaptive proton therapy for prostate cancer treatments.

    PubMed

    Yeung, Daniel; Boes, Peter; Ho, Meng Wei; Li, Zuofeng

    2015-05-08

    Image-guided radiotherapy (IGRT), based on radiopaque markers placed in the prostate gland, was used for proton therapy of prostate patients. Orthogonal X-rays and the IBA Digital Image Positioning System (DIPS) were used for setup correction prior to treatment and were repeated after treatment delivery. Following a rationale for margin estimates similar to that of van Herk,(1) the daily post-treatment DIPS data were analyzed to determine whether an adaptive radiotherapy plan was necessary. A Web application using ASP.NET MVC5, Entity Framework, and an SQL database was designed to automate this process. The designed features included state-of-the-art Web technologies, a domain model closely matching the workflow, a database supporting concurrency and data mining, access to the DIPS database, secured user access and roles management, and graphing and analysis tools. The Model-View-Controller (MVC) paradigm allowed clean domain logic, unit testing, and extensibility. Client-side technologies, such as jQuery, jQuery plug-ins, and Ajax, were adopted to achieve a rich user environment and fast response. Data models included patients, staff, treatment fields and records, correction vectors, DIPS images, and association logics. Data entry, analysis, workflow logics, and notifications were implemented. The system effectively modeled the clinical workflow and IGRT process.
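    A van Herk-style margin rationale combines the spread of per-patient mean setup errors (systematic, Sigma) with the pooled day-to-day spread (random, sigma) via the published recipe M = 2.5*Sigma + 0.7*sigma. The sketch below applies that recipe to invented displacement data and is not the paper's actual analysis:

```python
from statistics import mean, pstdev

# Invented daily post-treatment displacements along one axis (mm),
# keyed by patient.
shifts_mm = {
    "pt1": [1.0, 2.0, 1.5, 0.5],
    "pt2": [-0.5, 0.0, 0.5, -1.0],
    "pt3": [2.5, 3.0, 2.0, 2.5],
}

patient_means = [mean(v) for v in shifts_mm.values()]   # per-patient bias
patient_sds = [pstdev(v) for v in shifts_mm.values()]   # per-patient scatter

Sigma = pstdev(patient_means)                        # systematic error spread
sigma = mean(sd ** 2 for sd in patient_sds) ** 0.5   # pooled random error
margin_mm = 2.5 * Sigma + 0.7 * sigma                # van Herk recipe
```

Recomputing such a margin from accumulating daily vectors is the kind of per-patient trigger a workflow application can use to flag when an adaptive replan is warranted.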

  6. The land management and operations database (LMOD)

    USDA-ARS?s Scientific Manuscript database

    This paper presents the design, implementation, deployment, and application of the Land Management and Operations Database (LMOD). LMOD is the single authoritative source for reference land management and operation reference data within the USDA enterprise data warehouse. LMOD supports modeling appl...

  7. Development of bilateral data transferability in the Virginia Department of Transportation's Geotechnical Database Management System Framework.

    DOT National Transportation Integrated Search

    2006-01-01

    An Internet-based, spatiotemporal Geotechnical Database Management System (GDBMS) Framework was designed, developed, and implemented at the Virginia Department of Transportation (VDOT) in 2002 to retrieve, manage, archive, and analyze geotechnical da...

  8. Clinical study of the Erlanger silver catheter--data management and biometry.

    PubMed

    Martus, P; Geis, C; Lugauer, S; Böswald, M; Guggenbichler, J P

    1999-01-01

    The clinical evaluation of venous catheters for catheter-induced infections must conform to a strict biometric methodology. The statistical planning of the study (target population, design, degree of blinding), data management (database design, definition of variables, coding), quality assurance (data inspection at several levels) and the biometric evaluation of the Erlanger silver catheter project are described. The three-step data flow included: 1) primary data from the hospital, 2) a relational database, 3) files accessible for statistical evaluation. Two different statistical models were compared: analyzing only the first catheter of each patient (independent data), and analyzing several catheters from the same patient (dependent data) by means of the generalized estimating equations (GEE) method. The main result of the study was based on the comparison of both statistical models.
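    The difference between the two models can be illustrated with a toy data set (invented here): restricting to each patient's first catheter yields independent observations, while pooling all catheters mixes in within-patient correlation, which is what the GEE method is designed to account for:

```python
# (patient_id, catheter_order, infected) -- entirely made-up records.
catheters = [
    ("p1", 1, False), ("p1", 2, True),
    ("p2", 1, False),
    ("p3", 1, True), ("p3", 2, True), ("p3", 3, True),
]

# Model (a): one observation per patient -> independent data.
first_only = [inf for pid, order, inf in catheters if order == 1]
rate_independent = sum(first_only) / len(first_only)

# Naive pooling of all catheters ignores that catheters from the same
# patient are correlated; here one high-risk patient inflates the rate.
all_caths = [inf for _, _, inf in catheters]
rate_naive_pooled = sum(all_caths) / len(all_caths)
```

The gap between the two rates in this toy example shows why dependent data need a correlation-aware estimator such as GEE rather than naive pooling.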

  9. Geospatial Database for Strata Objects Based on Land Administration Domain Model (ladm)

    NASA Astrophysics Data System (ADS)

    Nasorudin, N. N.; Hassan, M. I.; Zulkifli, N. A.; Rahman, A. Abdul

    2016-09-01

    Recently in our country, building construction has become more complex, and a strata objects database has become more important for registering the real world, as people now own and use multiple levels of space. Strata titles are increasingly important and need to be well managed. LADM, also known as ISO 19152, is a standard model for land administration that allows integrated 2D and 3D representation of spatial units. The aim of this paper is to develop a strata objects database using LADM. The paper discusses the current 2D geospatial database and the need for a 3D geospatial database in the future, develops a strata objects database using the LADM standard data model, and analyzes the resulting database against that model. The current cadastre system in Malaysia, including strata titles, is also discussed, along with the problems of the existing 2D geospatial database. The database was designed through the conceptual, logical, and physical database design stages. The strata objects database allows users to find both non-spatial and spatial strata title information, thus showing the location of each strata unit. This development may help in handling strata titles and related information.
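
    As a rough illustration of the logical design stage, the sketch below maps three core LADM (ISO 19152) class names to relational tables and joins non-spatial (owner, title label) and spatial (storey level, coordinates) strata information in one query. The schema, columns, and data are assumptions for illustration, not the paper's actual design.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE la_party      (pid INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE la_spatialunit(sid INTEGER PRIMARY KEY, label TEXT,
                            level INTEGER, x REAL, y REAL);
CREATE TABLE la_rrr        (pid INTEGER REFERENCES la_party,
                            sid INTEGER REFERENCES la_spatialunit,
                            rtype TEXT);
""")
con.execute("INSERT INTO la_party VALUES (1, 'Aminah')")
con.execute("INSERT INTO la_spatialunit VALUES (10, 'Unit A-3-2', 3, 101.7, 3.15)")
con.execute("INSERT INTO la_rrr VALUES (1, 10, 'ownership')")

# Non-spatial and spatial information retrieved together, as the abstract
# describes: who holds which strata unit, and where that unit sits.
row = con.execute("""
    SELECT p.name, s.label, s.level, s.x, s.y
    FROM la_rrr r JOIN la_party p USING (pid)
                  JOIN la_spatialunit s USING (sid)
    WHERE r.rtype = 'ownership'
""").fetchone()
```

    In a full LADM implementation the spatial unit would carry real 2D/3D geometry rather than a point, but the party-right-unit join pattern is the same.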

  10. Development of Human Face Literature Database Using Text Mining Approach: Phase I.

    PubMed

    Kaur, Paramjit; Krishan, Kewal; Sharma, Suresh K

    2018-06-01

    The face is an important part of the human body through which an individual communicates in society. Its importance is highlighted by the fact that a person deprived of a face cannot sustain themselves in the living world. The number of experiments being performed and research papers being published in the domain of the human face has surged in the past few decades. Several scientific disciplines conduct research on the human face, including Medical Science, Anthropology, Information Technology (Biometrics, Robotics, Artificial Intelligence, etc.), Psychology, Forensic Science, and Neuroscience. This highlights the need to collect and manage data concerning the human face so that free public access to it can be provided to the scientific community. This can be attained by developing databases and tools on the human face using a bioinformatics approach. The current research emphasizes creating a database of the literature on the human face. The database can be accessed on the basis of specific keywords, journal name, date of publication, author's name, etc. The collected research papers are stored in the database, which will therefore benefit the research community by gathering comprehensive information dedicated to the human face in one place. Information related to facial morphologic features, facial disorders, facial asymmetry, facial abnormalities, and many other parameters can be extracted from this database. The front end has been developed using Hypertext Markup Language (HTML) and Cascading Style Sheets (CSS), the back end using the hypertext preprocessor (PHP), and JavaScript is used as the scripting language. MySQL is used for database development, as it is the most widely used relational database management system, and the open source XAMPP (cross-platform Apache, MySQL, PHP, Perl) web application package has been used as the server. The database is still in the developmental phase; the current paper describes the initial steps of its creation and the work done to date.
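
    The access paths the abstract lists (keyword, author, journal, date) amount to filtered queries over a papers table. The sketch below is illustrative only: the actual system uses PHP and MySQL, whereas this uses Python with SQLite, and every table name, column, and record is an assumption.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE papers (
    id INTEGER PRIMARY KEY, title TEXT, authors TEXT,
    journal TEXT, pub_date TEXT, keywords TEXT)""")
con.executemany("INSERT INTO papers VALUES (?,?,?,?,?,?)", [
    (1, "Facial asymmetry in identification", "Kaur P; Krishan K",
     "Forensic Sci Int", "2017-03-01", "facial asymmetry; identification"),
    (2, "Morphology of the human face", "Sharma SK",
     "J Anat", "2015-06-01", "facial morphology"),
])

def search(keyword=None, author=None):
    """Filter by keyword and/or author, mirroring the access paths the
    abstract lists (keywords, journal, date, author)."""
    sql, args = "SELECT title FROM papers WHERE 1=1", []
    if keyword:
        sql += " AND keywords LIKE ?"
        args.append(f"%{keyword}%")
    if author:
        sql += " AND authors LIKE ?"
        args.append(f"%{author}%")
    return [row[0] for row in con.execute(sql, args)]
```

    Parameterized `LIKE` filters are the simplest approach; a production literature database would typically add a full-text index over titles and keywords.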

  11. Local Community Verification of Coastal Erosion Risks in the Arctic: Insights from Alaska's North Slope

    NASA Astrophysics Data System (ADS)

    Brady, M.

    2016-12-01

    During his historic trip to Alaska in 2015, U.S. President Barack Obama announced a collaborative effort to update maps of the Arctic region in anticipation of increased maritime access and resource development and to support climate resilience. Included in this effort is development of an Arctic-wide satellite-based digital elevation model (DEM) to provide a baseline for monitoring landscape change such as coastal erosion. Focusing on Alaska's North Slope, an objective of this study is to transform emerging Arctic environmental spatial data products, including the new DEM, into information that can support local-level planning and decision-making in the face of extreme coastal erosion and related environmental threats. To this end, four workshops were held in 2016 in three North Slope villages highly exposed to coastal erosion. The first workshop, with approximately 10 managers in Barrow, solicited feedback on an erosion risk database developed in a previous research stage and installed on the North Slope's planning Web portal. The database includes a physical risk indicator based on factors such as historical erosion and the effects of sea ice loss, summarized at asset locations. After a demonstration of the database, participants discussed usability aspects such as data reliability. The focus of the mapping workshops in Barrow and the two smaller villages of Wainwright and Kaktovik was to verify and expand the risk database by interactively mapping erosion observations and community asset impacts. Using coded stickers and paper maps of the shoreline showing USGS erosion rates, a total of 50 participants provided feedback on erosion data accuracy. Approximately 25 of the 50 participants were elders and hunters, who also provided in-depth community risk information.
    The workshop with managers confirmed the physical risk factors used in the risk database and revealed that the information may be relied upon to support some development decisions and to better engage developers about erosion risks. Results from the three mapping workshops revealed that most participants agree that the USGS data are consistent with their observations. In-depth contributions from elders and hunters also confirmed the need to monitor the loss of specific assets, including hunting grounds and historic places, and the associated community impacts.

  12. A web-based, relational database for studying glaciers in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Nigrelli, G.; Chiarle, M.; Nuzzi, A.; Perotti, L.; Torta, G.; Giardino, M.

    2013-02-01

    Glaciers are among the best terrestrial indicators of climate change, and glacier inventories have thus attracted growing, worldwide interest in recent years. In Italy, the first official glacier inventory was completed in 1925, identifying 774 glacial bodies. As the amount of data continues to increase and new techniques become available, there is a growing demand for computer tools that can efficiently manage the collected data. The Research Institute for Geo-hydrological Protection of the National Research Council, in cooperation with the Departments of Computer Science and Earth Sciences of the University of Turin, created a database that provides a modern tool for storing, processing and sharing glaciological data. The database was developed to store heterogeneous information, which can be retrieved through a set of web search queries. The database's architecture is server-side and was designed using open source software. The website interface, simple and intuitive, was intended to meet the needs of a distributed public: through this interface, any type of glaciological data can be managed, specific queries can be performed, and the results can be exported in a standard format. The use of a relational database to store and organize a large variety of information about Italian glaciers collected over the last hundred years constitutes a significant step forward in ensuring the safety and accessibility of such data. The same benefits also apply to handling information in the future, including new and emerging data formats, such as geographic and multimedia files. Future developments include the integration of cartographic data, such as base maps, satellite images and vector data.
    The relational database described in this paper will be the heart of a new geographic system that will merge data, data attributes and maps, leading to a complete description of Italian glacial environments.
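
    The query-and-export workflow the abstract describes can be sketched in a few lines. The schema and data here are hypothetical, and SQLite stands in for whatever server-side store the project actually uses; the point is the pattern of a parameterized search whose results are exported in a standard format (CSV).

```python
import csv
import io
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE glacier_obs (
    glacier TEXT, year INTEGER, terminus_change_m REAL)""")
con.executemany("INSERT INTO glacier_obs VALUES (?,?,?)", [
    ("Ciardoney", 2010, -12.5), ("Ciardoney", 2011, -9.0),
    ("Lys", 2010, -20.0),
])

def export_csv(glacier):
    """Run a search query for one glacier and return the results as CSV,
    mirroring the interface's query-then-export workflow."""
    rows = con.execute(
        "SELECT year, terminus_change_m FROM glacier_obs "
        "WHERE glacier = ? ORDER BY year", (glacier,))
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["year", "terminus_change_m"])
    writer.writerows(rows)
    return buf.getvalue()

report = export_csv("Ciardoney")
```

    The same query layer could later feed the planned geographic system by joining observation rows to cartographic layers instead of serializing them to CSV.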

  13. Kristin Munch | NREL

    Science.gov Websites

    Information Management System, Materials Research Society Fall Meeting (2013) Photovoltaics Informatics scientific data management, database and data systems design, database clusters, storage systems integration, and distributed data analytics. She has used her experience in laboratory data management systems, lab

  14. Development of the interconnectivity and enhancement (ICE) module in the Virginia Department of Transportation's Geotechnical Database Management System Framework.

    DOT National Transportation Integrated Search

    2007-01-01

    An Internet-based, spatiotemporal Geotechnical Database Management System (GDBMS) Framework was implemented at the Virginia Department of Transportation (VDOT) in 2002 to manage geotechnical data using a distributed Geographical Information System (G...

  15. 23 CFR 973.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS MANAGEMENT... system; (2) A process to operate and maintain the management systems and their associated databases; (3... systems shall use databases with a common or coordinated reference system that can be used to geolocate...

  16. 23 CFR 973.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS MANAGEMENT... system; (2) A process to operate and maintain the management systems and their associated databases; (3... systems shall use databases with a common or coordinated reference system that can be used to geolocate...

  17. Comments on the Indiana Department of Environmental Management Proposed Permit for NIPSCO Bailly Generating Station

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  18. Notification to Federal Land Managers Under Section 165(d) of the Clean Air Act

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  19. Notification to Federal Land Manager Under Section 165(d) of the Clean Air Act

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  20. Soil and Land Resources Information System (SLISYS-Tarim) for Sustainable Management of River Oases along the Tarim River, China

    NASA Astrophysics Data System (ADS)

    Othmanli, Hussein; Zhao, Chengyi; Stahr, Karl

    2017-04-01

    The Tarim River Basin is the largest continental basin in China. The region has an extremely continental desert climate characterized by little rainfall (<50 mm/a) and high potential evaporation (>3000 mm/a). Climate change is severely affecting the basin, causing soil salinization, water shortage, and regression in crop production. Therefore, a Soil and Land Resources Information System (SLISYS-Tarim) for the regional simulation of crop yield production in the basin was developed. SLISYS-Tarim consists of a database and an agro-ecological simulation model, EPIC (Environmental Policy Integrated Climate). The database comprises relational tables with information about soils, terrain conditions, land use, and climate. The soil data include 50 soil profiles that were dug, analyzed, described and classified in order to characterize the soils in the region. DEM data were integrated with geological maps to build a digital terrain structure. Remote sensing data from Landsat images were applied for soil mapping and for land use and land cover classification. An additional database for climate data, land management and crop information was linked to the system as well. Construction of the SLISYS-Tarim database was accomplished by integrating and overlaying the recommended thematic maps within a geographic information system (GIS) environment to meet the data standard of the global and national SOTER digital database. This database provides appropriate input and output data for crop modelling with the EPIC model at various scales in the Tarim Basin. The EPIC model was run to simulate cotton production under a constructed scenario characterizing current management practices, soil properties and climate conditions. For model calibration, some parameters were adjusted so that the modeled cotton yield matched the measured yield at the field scale.
    The modeling results were validated in a later step using remote sensing data. The simulated cotton yield varied according to field management, soil type and salinity level, with soil salinity as the main limiting factor. Furthermore, the calibrated and validated EPIC model was run under several scenarios of climate conditions and land management practices to estimate the effect of climate change on cotton production and the sustainability of agricultural systems in the basin. The application of SLISYS-Tarim showed that this database can be a suitable framework for the storage and retrieval of soil and terrain data at various scales, and that simulation with the EPIC model can assess the impact of climate change and management strategies. SLISYS-Tarim can therefore be a good tool for regional planning and can serve decision support systems at regional and national scales.

  1. Crisis management aspects of bam catastrophic earthquake: review article.

    PubMed

    Sadeghi-Bazargani, Homayoun; Azami-Aghdash, Saber; Kazemi, Abdolhassan; Ziapour, Behrad

    2015-01-01

    The Bam earthquake was one of the most catastrophic natural disasters of recent years. The aim of this study was to review different aspects of crisis management during and after the catastrophic earthquake in Bam City, Iran. Data for this systematic review were collected by searching the PubMed, EMBASE and SID databases for the period from 2003 to 2011. Keywords included earthquake, Iran and Bam earthquake. The data were summarized and analyzed using content analysis. Out of 422 articles, 25 were included in the study. Crisis management aspects and existing pitfalls were classified into seven categories: planning and organization, human resource management, management of logistics, international humanitarian aid, field performance of the military and security forces, health and medical service provision, and information management. Positive aspects and major pitfalls of crisis management are described for all of these categories. The available evidence indicated poor crisis management during the Bam earthquake, which aggravated the losses and diminished the effect of interventions. Thus, given the importance of the different aspects of crisis management and the high prevalence of disasters in Iran, the observed vulnerability in the disaster management process should be addressed.

  2. Nuclear Science References (NSR)

    Science.gov Websites

    be included. For more information, see the help page. The NSR database schema and Web applications have undergone some recent changes. This is a revised version of the NSR Web Interface. Manager: Boris Pritychenko, NNDC, Brookhaven National Laboratory. Web Programming: Boris Pritychenko, NNDC

  3. Creating an Online Library To Support a Virtual Learning Community.

    ERIC Educational Resources Information Center

    Sandelands, Eric

    1998-01-01

    International Management Centres (IMC), an independent business school, and Anbar Electronic Intelligence (AEI), a database publisher, have created a virtual library for IMC's virtual business school. Topics discussed include action learning; IMC's partnership with AEI; the virtual university model; designing virtual library resources; and…

  4. How to Program the Principal's Office for the Computer Age.

    ERIC Educational Resources Information Center

    Frankel, Steven

    1983-01-01

    Explains why principals' offices need computers and discusses the characteristics of inexpensive personal business computers, including their operating systems, disk drives, memory, and compactness. Reviews software available for word processing, accounting, database management, and communications, and compares the Kaypro II, Morrow, and Osborne I…

  5. Landslide hazard rating matrix and database : vol. 2 of 2, a manual for landslide inventory.

    DOT National Transportation Integrated Search

    2008-12-01

    The rehabilitation decision for highway slope failure is one of the many important tasks to be tackled by the Ohio Department of Transportation (ODOT). A rational approach to manage the unsafe or failed slopes/embankments should ideally include a sys...

  6. Computerizing Your Program.

    ERIC Educational Resources Information Center

    Curtis, Rick

    This paper summarizes information about using computer hardware and software to aid in making purchase decisions that are based on user needs. The two major options in hardware are IBM-compatible machines and the Apple Macintosh line. The three basic software applications include word processing, database management, and spreadsheet applications.…

  7. Using Internet Technologies To Enhance Training.

    ERIC Educational Resources Information Center

    Pollock, Carl; Masters, Robert

    1997-01-01

    Describes how to use Internet technologies to create an intranet, or an online training database system, for improving company communications, effectiveness, and job performance. Topics include technology and performance; educating managers and key decision makers; creating a graphic model of the training system; and fitting into the existing…

  8. 40 CFR 70.8 - Permit review by EPA and affected States.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... compatible with EPA's national database management system. (2) The Administrator may waive the requirements...) Transmission of information to the Administrator. (1) The permit program shall require that the permitting authority provide to the Administrator a copy of each permit application (including any application for...

  9. 40 CFR 70.8 - Permit review by EPA and affected States.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... compatible with EPA's national database management system. (2) The Administrator may waive the requirements...) Transmission of information to the Administrator. (1) The permit program shall require that the permitting authority provide to the Administrator a copy of each permit application (including any application for...

  10. 40 CFR 70.8 - Permit review by EPA and affected States.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... compatible with EPA's national database management system. (2) The Administrator may waive the requirements...) Transmission of information to the Administrator. (1) The permit program shall require that the permitting authority provide to the Administrator a copy of each permit application (including any application for...

  11. 40 CFR 70.8 - Permit review by EPA and affected States.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... compatible with EPA's national database management system. (2) The Administrator may waive the requirements...) Transmission of information to the Administrator. (1) The permit program shall require that the permitting authority provide to the Administrator a copy of each permit application (including any application for...

  12. 40 CFR 70.8 - Permit review by EPA and affected States.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... compatible with EPA's national database management system. (2) The Administrator may waive the requirements...) Transmission of information to the Administrator. (1) The permit program shall require that the permitting authority provide to the Administrator a copy of each permit application (including any application for...

  13. 78 FR 59344 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-26

    ... includes the Navy Access Control Management System (NACMS) and the U.S. Marine Corps Biometric and... identifying or verifying an individual through the use of biometric databases and associated data processing... appear in person, record their personal identifiable information on the SECNAV 5512/1 Department of the...

  14. Factors contributing to managerial competence of first-line nurse managers: A systematic review.

    PubMed

    Gunawan, Joko; Aungsuroch, Yupin; Fisher, Mary L

    2018-02-01

    To determine the factors contributing to the managerial competence of first-line nurse managers. Understanding the factors affecting the managerial competence of nurse managers remains important for increasing the performance of organizations; however, there is sparse research examining the factors that influence the managerial competence of first-line nurse managers. Systematic review. The search, conducted from April to July 2017, covered 6 electronic databases: Science Direct, PROQUEST Dissertations and Theses, MEDLINE, CINAHL, EMBASE, and Google Scholar, for the years 2000 to 2017 with full text in English. Quantitative and qualitative research papers that examined relationships between managerial competence and antecedent factors were included. Quality assessment, data extraction, and analysis were completed on all included studies. Content analysis was used to categorize factors into themes. Eighteen influencing factors were examined and categorized into 3 themes: organizational factors, characteristics and personality traits of individual managers, and role factors. Findings suggest that the managerial competence of first-line nurse managers is multifactorial. Further research is needed to develop strategies for strengthening the managerial competence of first-line nurse managers. © 2017 John Wiley & Sons Australia, Ltd.

  15. 48 CFR 52.232-33 - Payment by Electronic Funds Transfer-System for Award Management.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... contained in the System for Award Management (SAM) database. In the event that the EFT information changes, the Contractor shall be responsible for providing the updated information to the SAM database. (c... 210. (d) Suspension of payment. If the Contractor's EFT information in the SAM database is incorrect...

  16. 48 CFR 52.232-33 - Payment by Electronic Funds Transfer-System for Award Management.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... contained in the System for Award Management (SAM) database. In the event that the EFT information changes, the Contractor shall be responsible for providing the updated information to the SAM database. (c... 210. (d) Suspension of payment. If the Contractor's EFT information in the SAM database is incorrect...

  17. Microcomputer-Based Access to Machine-Readable Numeric Databases.

    ERIC Educational Resources Information Center

    Wenzel, Patrick

    1988-01-01

    Describes the use of microcomputers and relational database management systems to improve access to numeric databases by the Data and Program Library Service at the University of Wisconsin. The internal records management system, in-house reference tools, and plans to extend these tools to the entire campus are discussed. (3 references) (CLB)

  18. How Database Management Systems Can Be Used To Evaluate Program Effectiveness in Small School Districts.

    ERIC Educational Resources Information Center

    Hoffman, Tony

    Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…

  19. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  20. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  1. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  2. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  3. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  4. A Database Design and Development Case: NanoTEK Networks

    ERIC Educational Resources Information Center

    Ballenger, Robert M.

    2010-01-01

    This case provides a real-world project-oriented case study for students enrolled in a management information systems, database management, or systems analysis and design course in which database design and development are taught. The case consists of a business scenario to provide background information and details of the unique operating…

  5. BoreholeAR: A mobile tablet application for effective borehole database visualization using an augmented reality technology

    NASA Astrophysics Data System (ADS)

    Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong

    2015-03-01

    Boring logs are widely used in geological field studies, since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application that enables its user to search boring logs rapidly and visualize them using augmented reality (AR). For the development of the application, a standard borehole database appropriate for a mobile borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of the corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions over large borehole databases for the other modules. A field survey was also carried out using a database of more than 100,000 borehole records.
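
    A core operation shared by the map and AR modules is a proximity lookup: given the device position, find which boreholes to display. A minimal sketch with invented coordinates and field names (the app's actual schema is not shown in the abstract):

```python
import math

# Hypothetical borehole records as the database module might serve them.
boreholes = [
    {"id": "BH-001", "lat": 37.5665, "lon": 126.9780, "depth_m": 30.0},
    {"id": "BH-002", "lat": 37.5651, "lon": 126.9895, "depth_m": 22.5},
    {"id": "BH-003", "lat": 37.5796, "lon": 126.9770, "depth_m": 45.0},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby(lat, lon, radius_m=1500):
    """Return borehole ids within radius_m of the device, nearest first."""
    hits = [(haversine_m(lat, lon, b["lat"], b["lon"]), b["id"])
            for b in boreholes]
    return [bid for dist, bid in sorted(hits) if dist <= radius_m]
```

    Sorting by distance keeps the closest borehole first, which suits an AR overlay that labels the nearest markers most prominently.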

  6. Towards eliminating malaria in high endemic countries: the roles of community health workers and related cadres and their challenges in integrated community case management for malaria: a systematic review.

    PubMed

    Sunguya, Bruno F; Mlunde, Linda B; Ayer, Rakesh; Jimba, Masamine

    2017-01-03

    The human resources for health crisis has impaired global efforts against malaria in highly endemic countries. To address this, the World Health Organization (WHO) recommended scaling up community health workers (CHWs) and related cadres, owing to their documented success in malaria and other disease prevention and management. Evidence is inconsistent on the roles they play and the challenges they encounter in malaria interventions. This systematic review summarizes evidence on the roles and challenges of CHWs and related cadres in integrated community case management for malaria (iCCM). The review retrieved evidence from PubMed, CINAHL, ISI Web of Knowledge, and WHO regional databases; terms extracted from the Boolean phrase used for PubMed were also used in the other databases. The review included randomized controlled trial, quasi-experimental, pre-post interventional, longitudinal and cohort, cross-sectional, case study, and secondary data analysis designs. Because of heterogeneity, only a narrative synthesis was conducted. A total of 66 articles were eligible for analysis out of 1380 studies retrieved. The roles of CHWs and related cadres in malaria interventions included malaria case management and prevention, including health surveillance and health promotion specific to malaria. Despite their documented success, CHWs and related cadres face health system challenges: poor and unsustainable financing for iCCM, workforce-related challenges, lack of a sustainable supply of medicines and diagnostics, lack of information and research, and service delivery and leadership challenges. Community health workers and related cadres had important preventive, case management and promotive roles in malaria interventions. To enable their effective integration into health systems, the identified challenges should be addressed.
    These include introducing sustainable financing for iCCM programmes, tailoring training to address the identified gaps, improving sustainable supply chain management of malaria drugs and diagnostics, and addressing regulatory challenges in the local contexts.

  7. Network Configuration of Oracle and Database Programming Using SQL

    NASA Technical Reports Server (NTRS)

    Davis, Melton; Abdurrashid, Jibril; Diaz, Philip; Harris, W. C.

    2000-01-01

    A database can be defined as a collection of information organized in such a way that it can be retrieved and used. A database management system (DBMS) can further be defined as the tool that enables us to manage and interact with the database. The Oracle 8 Server is a state-of-the-art information management environment. It is a repository for very large amounts of data, and gives users rapid access to that data. The Oracle 8 Server allows for sharing of data between applications; the information is stored in one place and used by many systems. My research will focus primarily on SQL (Structured Query Language) programming. SQL is the way you define and manipulate data in Oracle's relational database. SQL is the industry standard adopted by all database vendors. When programming with SQL, you work on sets of data (i.e., information is not processed one record at a time).
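    The set-at-a-time style of SQL described above can be sketched briefly. The table, the data, and the use of SQLite in place of Oracle are illustrative assumptions, not from the original work:

```python
import sqlite3

# Hypothetical employee table; SQLite stands in for the Oracle 8 Server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (name TEXT, dept TEXT, salary REAL)")
conn.executemany("INSERT INTO emp VALUES (?, ?, ?)",
                 [("Ada", "ENG", 1000.0), ("Ben", "ENG", 900.0),
                  ("Cho", "OPS", 800.0)])

# One set-based statement updates every matching row at once --
# no record-at-a-time loop in the client program.
conn.execute(
    "UPDATE emp SET salary = ROUND(salary * 1.10, 2) WHERE dept = 'ENG'")

rows = conn.execute(
    "SELECT name, salary FROM emp ORDER BY name").fetchall()
print(rows)  # prints [('Ada', 1100.0), ('Ben', 990.0), ('Cho', 800.0)]
```

    The same raise applied row by row would need a cursor loop in the client; the set-based form pushes the work into the database engine, which is the point the abstract makes.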

  8. Oracle Applications Patch Administration Tool (PAT) Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2002-01-04

    PAT is a Patch Administration Tool that provides analysis, tracking, and management of Oracle Application patches. Its capabilities include:

    Patch Analysis & Management
    - Patch data maintenance: track which Oracle Application patches have been applied to which database instance and machine
    - Patch analysis: capture text files (readme.txt and driver files); comparison detail reports for forms, PL/SQL packages, SQL scripts, and JSP modules
    - Parse and load the current applptch.txt (10.7), or load patch data from the Oracle Application database patch tables (11i)
    - Display analysis: compare the patch to be applied with the code versions currently installed in the Oracle Application appl_top
    - Patch detail: module comparison detail; analyze and display a single Oracle Application module patch
    - Patch management: automatic queuing and execution of patches

    Administration
    - Parameter maintenance: settings for the directory structure of the Oracle Application appl_top
    - Validation data maintenance: machine names and instances to patch

    Operation
    - Patch data maintenance: schedule a patch (queue for later execution) or run a patch (queue for immediate execution); review the patch logs
    - Patch management reports

  9. A proposed group management scheme for XTP multicast

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

    The purpose of a group management scheme is to enable its associated transfer layer protocol to be responsive to user determined reliability requirements for multicasting. Group management (GM) must assist the client process in coordinating multicast group membership, allow the user to express the subset of the multicast group that a particular multicast distribution must reach in order to be successful (reliable), and provide the transfer layer protocol with the group membership information necessary to guarantee delivery to this subset. GM provides services and mechanisms that respond to the need of the client process or process level management protocols to coordinate, modify, and determine attributes of the multicast group, especially membership. XTP GM provides a link between process groups and their multicast groups by maintaining a group membership database that identifies members in a name space understood by the underlying transfer layer protocol. Other attributes of the multicast group useful to both the client process and the data transfer protocol may be stored in the database. Examples include the relative dispersion, most recent update, and default delivery parameters of a group.

  10. A geospatial database model for the management of remote sensing datasets at multiple spectral, spatial, and temporal scales

    NASA Astrophysics Data System (ADS)

    Ifimov, Gabriela; Pigeau, Grace; Arroyo-Mora, J. Pablo; Soffer, Raymond; Leblanc, George

    2017-10-01

    In this study the development and implementation of a geospatial database model for the management of multiscale datasets encompassing airborne imagery and associated metadata is presented. To develop the multi-source geospatial database we have used a Relational Database Management System (RDBMS) on a Structured Query Language (SQL) server which was then integrated into ArcGIS and implemented as a geodatabase. The acquired datasets were compiled, standardized, and integrated into the RDBMS, where logical associations between different types of information were linked (e.g. location, date, and instrument). Airborne data, at different processing levels (digital numbers through geocorrected reflectance), were implemented in the geospatial database where the datasets are linked spatially and temporally. An example dataset consisting of airborne hyperspectral imagery, collected for inter- and intra-annual vegetation characterization and detection of potential hydrocarbon seepage events over pipeline areas, is presented. Our work provides a model for the management of airborne imagery, which is a challenging aspect of data management in remote sensing, especially when large volumes of data are collected.
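    The relational linking the abstract describes (instrument, date, and processing level recovered through joins rather than duplicated per record) can be sketched as follows. The table names, columns, and instrument are hypothetical examples, not the authors' actual schema:

```python
import sqlite3

# Toy version of the geodatabase model: flights reference instruments,
# images reference flights, so metadata is stored once and joined out.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE instrument (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE flight (id INTEGER PRIMARY KEY, flight_date TEXT,
                     instrument_id INTEGER REFERENCES instrument(id));
CREATE TABLE image (id INTEGER PRIMARY KEY, level TEXT,
                    flight_id INTEGER REFERENCES flight(id));
""")
db.execute("INSERT INTO instrument VALUES (1, 'CASI-1500')")
db.execute("INSERT INTO flight VALUES (1, '2016-06-21', 1)")
db.executemany("INSERT INTO image VALUES (?, ?, 1)",
               [(1, 'L0 digital numbers'), (2, 'L2 reflectance')])

# Joins recover the logical associations (level, date, instrument).
rows = db.execute("""
    SELECT i.level, f.flight_date, s.name
    FROM image i JOIN flight f ON i.flight_id = f.id
                 JOIN instrument s ON f.instrument_id = s.id
    ORDER BY i.id
""").fetchall()
print(rows)
```

    Each processing level of the same acquisition stays linked to one flight row, which is how the model ties datasets together spatially and temporally without repeating metadata.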

  11. Extending GIS Technology to Study Karst Features of Southeastern Minnesota

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Tipping, R. G.; Alexander, E. C.; Alexander, S. C.

    2001-12-01

    This paper summarizes ongoing research on karst feature distribution in southeastern Minnesota. The main goals of this interdisciplinary research are: 1) to look for large-scale patterns in the rate and distribution of sinkhole development; 2) to conduct statistical tests of hypotheses about the formation of sinkholes; 3) to create management tools for land-use managers and planners; and 4) to deliver geomorphic and hydrogeologic criteria for making scientifically valid land-use policies and ethical decisions in karst areas of southeastern Minnesota. Existing county and sub-county karst feature datasets of southeastern Minnesota have been assembled into a large GIS-based database capable of analyzing the entire data set. The central database management system (DBMS) is a relational GIS-based system interacting with three modules: GIS, statistical, and hydrogeologic. ArcInfo and ArcView were used to generate a series of 2D and 3D maps depicting karst feature distributions in southeastern Minnesota. IRIS Explorer was used to produce detailed 3D maps and animations from data exported from the GIS-based database. Nearest-neighbor analysis has been used to test sinkhole distributions in different topographic and geologic settings. All current nearest-neighbor analyses indicate that sinkholes in southeastern Minnesota are not evenly distributed (i.e., they tend to be clustered). More detailed statistical methods such as cluster analysis, histograms, probability estimation, correlation, and regression have been used to study the spatial distributions of some mapped karst features. A sinkhole probability map for Goodhue County has been constructed based on sinkhole distribution, bedrock geology, depth to bedrock, GIS buffer analysis, and nearest-neighbor analysis. 
    A series of karst features for Winona County, including sinkholes, springs, seeps, stream sinks, and outcrops, has been mapped and entered into the Karst Feature Database of Southeastern Minnesota. The Karst Feature Database of Winona County is being expanded to include all the mapped karst features of southeastern Minnesota. Air photos from the 1930s to the 1990s of the Spring Valley Cavern Area in Fillmore County were scanned and geo-referenced into the GIS system. This technology has proven very useful for identifying sinkholes and studying the rate of sinkhole development.
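    The nearest-neighbor test mentioned above compares the mean observed nearest-neighbor distance against the value expected for a random (Poisson) point pattern; a ratio well below 1 indicates clustering. A minimal sketch, with hypothetical sinkhole coordinates and edge effects ignored:

```python
import math

def nearest_neighbor_ratio(points, area):
    """Clark-Evans nearest-neighbor ratio R.

    R ~ 1 for a random (Poisson) pattern, R < 1 for clustering,
    R > 1 for dispersion. Edge corrections are omitted in this sketch.
    """
    n = len(points)
    # Mean observed distance from each point to its nearest neighbor.
    d_obs = sum(
        min(math.dist(points[i], points[j]) for j in range(n) if j != i)
        for i in range(n)) / n
    # Expected mean nearest-neighbor distance for a random pattern.
    d_exp = 0.5 / math.sqrt(n / area)
    return d_obs / d_exp

# Six hypothetical sinkholes in two tight clusters within a
# 100 x 100 study area: the ratio comes out well below 1.
clustered = [(10, 10), (11, 10), (10, 11), (90, 90), (91, 90), (90, 91)]
print(round(nearest_neighbor_ratio(clustered, area=100 * 100), 3))
```

    On real data one would also test the ratio's significance (the Clark-Evans z-test) rather than read the raw number alone.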

  12. Universal Index System

    NASA Technical Reports Server (NTRS)

    Kelley, Steve; Roussopoulos, Nick; Sellis, Timos; Wallace, Sarah

    1993-01-01

    The Universal Index System (UIS) is an index management system that uses a uniform interface to address the heterogeneity problem among database management systems. UIS not only provides an easy-to-use common interface for accessing all underlying data, but also accommodates different underlying database management systems, storage representations, and access methods.
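    The idea of one interface over different storage representations can be illustrated with a small adapter sketch. The class and method names here are invented for illustration and are not from the UIS paper:

```python
from abc import ABC, abstractmethod
import bisect

class Index(ABC):
    """Uniform index interface; backends differ in representation."""
    @abstractmethod
    def insert(self, key, value): ...
    @abstractmethod
    def lookup(self, key): ...

class HashIndex(Index):
    """Hash-based access method."""
    def __init__(self):
        self._table = {}
    def insert(self, key, value):
        self._table[key] = value
    def lookup(self, key):
        return self._table.get(key)

class SortedIndex(Index):
    """Ordered access method backed by sorted parallel lists."""
    def __init__(self):
        self._keys, self._vals = [], []
    def insert(self, key, value):
        i = bisect.bisect_left(self._keys, key)
        self._keys.insert(i, key)
        self._vals.insert(i, value)
    def lookup(self, key):
        i = bisect.bisect_left(self._keys, key)
        if i < len(self._keys) and self._keys[i] == key:
            return self._vals[i]
        return None

# Client code is identical regardless of the underlying access method.
for idx in (HashIndex(), SortedIndex()):
    idx.insert(42, "record-42")
    print(idx.lookup(42))
```

    The client never learns which representation it is talking to, which is the heterogeneity-hiding property the abstract claims for UIS.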

  13. Maintaining Research Documents with Database Management Software.

    ERIC Educational Resources Information Center

    Harrington, Stuart A.

    1999-01-01

    Discusses taking notes for research projects and organizing them into card files; reviews the literature on personal filing systems; introduces the basic process of database management; and offers a plan for managing research notes. Describes field groups and field definitions, data entry, and creating reports. (LRW)

  14. 76 FR 15953 - Agency Information Collection Activities; Announcement of Office of Management and Budget...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-22

    ... CONSUMER PRODUCT SAFETY COMMISSION Agency Information Collection Activities; Announcement of Office of Management and Budget Approval; Publicly Available Consumer Product Safety Information Database... Product Safety Information Database has been approved by the Office of Management and Budget (OMB) under...

  15. Buprenorphine: revisiting the efficacy of transdermal delivery system.

    PubMed

    Kitzmiller, Joseph P; Barnett, Christopher J; Steiner, Nathan S; Stoicea, Nicoleta; Kamar, Nawal; Luzum, Jasmine A; Mikulik, Eduard; Bergese, Sergio D

    2015-01-01

    Buprenorphine is a lipid-soluble pharmaceutical used in the management of chronic pain. It is a partial agonist at μ-opioid receptors, an antagonist at κ-opioid receptors, an agonist at δ-opioid receptors, and a partial agonist at ORL-1 (nociceptin) receptors. An extensive literature search, including the Google Scholar and PubMed databases, was conducted. Terms related to 'efficacy of transdermal buprenorphine' were used to retrieve contemporary research articles in order to evaluate and compare the transdermal buprenorphine patch with commonly used traditional pain management medications. Transdermal buprenorphine has demonstrated better efficacy than conventional pain management pharmacotherapies. Side effects were similar to those associated with other opioids and included headache, dizziness, somnolence, constipation, dry mouth, nausea, vomiting, pruritus and erythema. As with transdermal delivery systems used for other medications, transdermal buprenorphine was associated with application-site pruritus and application-site reactions. Transdermal buprenorphine has significant potential for managing chronic pain. In addition to increased convenience and efficacy, its advantages include decreased tolerance and decreased withdrawal.

  16. Design of Student Information Management Database Application System for Office and Departmental Target Responsibility System

    NASA Astrophysics Data System (ADS)

    Zhou, Hui

    Implementing an office and departmental target responsibility system is an inevitable outcome of higher education reform, and statistical processing of student information is an important part of student performance review under such a system. On the basis of an analysis of student evaluation, this paper designs a student information management database application system using relational database management system software. To implement the student information management functions, the functional requirements, overall structure, data sheets and fields, data sheet associations, and software code are designed in detail.
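    The kind of design the abstract describes (data sheets, fields, and associations feeding statistical processing for performance review) can be sketched in miniature. The tables, fields, and scores below are hypothetical examples, not the paper's actual schema:

```python
import sqlite3

# Toy relational design: one student table, one evaluation table,
# associated through the student id.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE student (sid INTEGER PRIMARY KEY, name TEXT, dept TEXT);
CREATE TABLE evaluation (sid INTEGER REFERENCES student(sid),
                         item TEXT, score REAL);
""")
db.execute("INSERT INTO student VALUES (1, 'Li Wei', 'CS')")
db.executemany("INSERT INTO evaluation VALUES (1, ?, ?)",
               [("coursework", 88.0), ("conduct", 92.0)])

# Statistical processing for the performance review: one aggregate
# query per student replaces manual tallying of evaluation sheets.
avg = db.execute("""
    SELECT s.name, AVG(e.score) FROM student s
    JOIN evaluation e ON e.sid = s.sid GROUP BY s.sid
""").fetchone()
print(avg)  # prints ('Li Wei', 90.0)
```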

  17. Prognosis and management of myocardial infarction: Comparisons between the French FAST-MI 2010 registry and the French public health database.

    PubMed

    Massoullié, Grégoire; Wintzer-Wehekind, Jérome; Chenaf, Chouki; Mulliez, Aurélien; Pereira, Bruno; Authier, Nicolas; Eschalier, Alain; Clerfond, Guillaume; Souteyrand, Géraud; Tabassome, Simon; Danchin, Nicolas; Citron, Bernard; Lusson, Jean-René; Puymirat, Étienne; Motreff, Pascal; Eschalier, Romain

    2016-05-01

    Multicentre registries of myocardial infarction management show a steady improvement in prognosis and greater access to myocardial revascularization in a more timely manner. While French registries are the standard references, the question arises: are data stemming solely from the activity of French cardiac intensive care units (ICUs) a true reflection of the entire French population with ST-segment elevation myocardial infarction (STEMI)? To compare data on patients hospitalized for STEMI from two French registries: the French registry of acute ST-elevation or non-ST-elevation myocardial infarction (FAST-MI) and the Échantillon généraliste des bénéficiaires (EGB) database. We compared patients treated for STEMI listed in the FAST-MI 2010 registry (n=1716) with those listed in the EGB database, which comprises a sample of 1/97th of the French population, also from 2010 (n=403). Compared with the FAST-MI 2010 registry, the EGB database population was older (67.2±15.3 vs 63.3±14.5 years; P<0.001), had a higher percentage of women (36.0% vs 24.7%; P<0.001), was less likely to undergo emergency coronary angiography (75.2% vs 96.3%; P<0.001) and was less often treated in university hospitals (27.1% vs 37.0%; P=0.001). There were no significant differences between the two registries in terms of cardiovascular risk factors, comorbidities and drug treatment at admission. Thirty-day mortality was higher in the EGB database (10.2% vs 4.4%; P<0.001). Registries such as FAST-MI are indispensable, not only for assessing epidemiological changes over time, but also for evaluating the prognostic effect of modern STEMI management. Meanwhile, exploitation of data from general databases, such as EGB, provides additional relevant information, as they include a broader population not routinely admitted to cardiac ICUs. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  18. Bradycardia as an early warning sign for cardiac arrest during routine laparoscopic surgery.

    PubMed

    Yong, Jonathan; Hibbert, Peter; Runciman, William B; Coventry, Brendon J

    2015-12-01

    The aim of this study was to identify clinical patterns of occurrence, management and outcomes surrounding cardiac arrest during laparoscopic surgery using the Australian Incident Monitoring Study (AIMS) database to guide possible prevention and treatment. The AIMS database includes incident reports from participating clinicians from secondary and tertiary healthcare centres across Australia and New Zealand. The AIMS database holds over 11 000 peri- and intraoperative incidents. The primary outcome was to characterize the pattern of events surrounding cardiac arrest. The secondary outcome was to identify successful management strategies in the possible prevention and treatment of cardiac arrest during laparoscopic surgery. Fourteen cases of cardiac arrest during laparoscopic surgery were identified. The majority of cases occurred in 'fit and healthy' patients during elective gynaecological and general surgical procedures. Twelve cases of cardiac arrest were directly associated with pneumoperitoneum with bradycardia preceding cardiac arrest in 75% of these. Management included deflation of pneumoperitoneum, atropine administration and cardiopulmonary resuscitation with circulatory restoration in all cases. The results imply vagal mechanisms associated with peritoneal distension as the predominant contributor to bradycardia and subsequent cardiac arrest during laparoscopy. Bradycardia during gas insufflation is not necessarily a benign event and appears to be a critical early warning sign for possible impending and unexpected cardiac arrest. Immediate deflation of pneumoperitoneum and atropine administration are effective measures that may alleviate bradycardia and possibly avert progression to cardiac arrest. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  19. 'The surface management system' (SuMS) database: a surface-based database to aid cortical surface reconstruction, visualization and analysis

    NASA Technical Reports Server (NTRS)

    Dickson, J.; Drury, H.; Van Essen, D. C.

    2001-01-01

    Surface reconstructions of the cerebral cortex are increasingly widely used in the analysis and visualization of cortical structure, function and connectivity. From a neuroinformatics perspective, dealing with surface-related data poses a number of challenges. These include the multiplicity of configurations in which surfaces are routinely viewed (e.g. inflated maps, spheres and flat maps), plus the diversity of experimental data that can be represented on any given surface. To address these challenges, we have developed a surface management system (SuMS) that allows automated storage and retrieval of complex surface-related datasets. SuMS provides a systematic framework for the classification, storage and retrieval of many types of surface-related data and associated volume data. Within this classification framework, it serves as a version-control system capable of handling large numbers of surface and volume datasets. With built-in database management system support, SuMS provides rapid search and retrieval capabilities across all the datasets, while also incorporating multiple security levels to regulate access. SuMS is implemented in Java and can be accessed via a Web interface (WebSuMS) or using downloaded client software. Thus, SuMS is well positioned to act as a multiplatform, multi-user 'surface request broker' for the neuroscience community.

  20. Understanding interprofessional collaboration in the context of chronic disease management for older adults living in communities: a concept analysis.

    PubMed

    Bookey-Bassett, Sue; Markle-Reid, Maureen; Mckey, Colleen A; Akhtar-Danesh, Noori

    2017-01-01

    To report a concept analysis of interprofessional collaboration in the context of chronic disease management, for older adults living in communities. Increasing prevalence of chronic disease among older adults is creating significant burden for patients, families and healthcare systems. Managing chronic disease for older adults living in the community requires interprofessional collaboration across different health and other care providers, organizations and sectors. However, there is a lack of consensus about the definition and use of interprofessional collaboration for community-based chronic disease management. Concept analysis. Electronic databases CINAHL, Medline, HealthStar, EMBASE, PsychINFO, Ageline and Cochrane Database were searched from 2000 - 2013. Rodgers' evolutionary method for concept analysis. The most common surrogate term was interdisciplinary collaboration. Related terms were interprofessional team, multidisciplinary team and teamwork. Attributes included: an evolving interpersonal process; shared goals, decision-making and care planning; interdependence; effective and frequent communication; evaluation of team processes; involving older adults and family members in the team; and diverse and flexible team membership. Antecedents comprised: role awareness; interprofessional education; trust between team members; belief that interprofessional collaboration improves care; and organizational support. Consequences included impacts on team composition and function, care planning processes and providers' knowledge, confidence and job satisfaction. Interprofessional collaboration is a complex evolving concept. Key components of interprofessional collaboration in chronic disease management for community-living older adults are identified. Implications for nursing practice, education and research are proposed. © 2016 John Wiley & Sons Ltd.

  1. Taking Risk Assessment and Management to the Next Level: Program-Level Risk Analysis to Enable Solid Decision-Making on Priorities and Funding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, J. G.; Morton, R. L.; Castillo, C.

    2011-02-01

    A multi-level (facility and programmatic) risk assessment was conducted for the facilities in the Nevada National Security Site (NNSS) Readiness in Technical Base and Facilities (RTBF) Program, and the results were included in a new Risk Management Plan (RMP), which was incorporated into the fiscal year (FY) 2010 Integrated Plans. Risks, risk events, probability, consequence(s), and mitigation strategies were identified and captured for most scope areas (i.e., risk categories) during the facilitated risk workshops. Risk mitigations (i.e., efforts in addition to existing controls) were identified during the workshops when each risk event was identified. Risk mitigation strategies fell into two broad categories: threats or opportunities. Improvement projects were identified and linked to the specific risks they mitigate, making the connection between risk reduction and investments for the annual Site Execution Plan. Because of the amount of data collected, the analysis to be performed, and the reports to be generated, a Risk Assessment/Management Tool (RAMtool) database was developed to analyze the risks in real time, at multiple levels, which reinforced the site-level risk management process and procedures. The RAMtool database was designed to assist in capturing and analyzing the key elements of risk: probability, consequence, and impact. RAMtool calculates the facility-level and programmatic-level risk factors to enable a side-by-side comparison, showing facility managers and program managers where to focus their risk reduction efforts and funding. This enables them to make solid decisions on priorities and funding to maximize risk reduction. A more active risk management process was developed in which risks and opportunities are actively managed, monitored, and controlled by each facility more aggressively and frequently, and risk owners have the responsibility and accountability to manage their assigned risks in real time, using the RAMtool database.
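    The abstract says RAMtool combines probability and consequence into comparable risk factors at the facility and program levels, but does not give the formula; the simple probability times consequence product below is an assumption for illustration, as are the example risks:

```python
# Hedged sketch of a risk-factor ranking across levels. The scoring
# scheme (probability 0-1, consequence 1-5, factor = p * c) is an
# assumed convention, not RAMtool's documented method.
def risk_factor(probability, consequence):
    return probability * consequence

risks = [
    # (risk event, level, probability, consequence)
    ("crane outage",  "facility", 0.30, 4),
    ("funding delay", "program",  0.60, 3),
]

# Side-by-side comparison: highest risk factor first, so managers can
# see where to focus risk reduction efforts and funding.
ranked = sorted(risks, key=lambda r: risk_factor(r[2], r[3]),
                reverse=True)
for name, level, p, c in ranked:
    print(f"{level:8} {name:14} risk factor {risk_factor(p, c):.2f}")
```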

  2. [Data supporting quality circle management of inpatient depression treatment].

    PubMed

    Brand, S; Härter, M; Sitta, P; van Calker, D; Menke, R; Heindl, A; Herold, K; Kudling, R; Luckhaus, C; Rupprecht, U; Sanner, Dirk; Schmitz, D; Schramm, E; Berger, M; Gaebel, W; Schneider, F

    2005-07-01

    Several quality assurance initiatives in health care have been undertaken during the past years. The next step is to systematically combine single initiatives in order to build up strategic quality management. In a German multicenter study, the quality of inpatient depression treatment was measured in ten psychiatric hospitals. Half of the hospitals received comparative feedback on their individual results relative to the other hospitals (benchmarking). These benchmarks were used by each hospital as a statistical basis for in-house quality work to improve the quality of depression treatment. According to differences between hospitals in procedure and outcome, different goals were chosen. There were also differences with respect to structural characteristics, strategies, and outcome. The feedback from participants about data-based quality circles in general, and the availability of benchmarking data in particular, was positive. The necessity of carefully choosing quality circle members and of professional moderation became obvious. Data-based quality circles including benchmarking have proven to be useful for quality management in inpatient depression care.

  3. ANTI-VIRAL EFFECTS OF MEDICINAL PLANTS IN THE MANAGEMENT OF DENGUE: A SYSTEMATIC REVIEW

    PubMed Central

    Frederico, Éric Heleno Freira Ferreira; Cardoso, André Luiz Bandeira Dionísio; Moreira-Marconi, Eloá; de Sá-Caputo, Danúbia da Cunha; Guimarães, Carlos Alberto Sampaio; Dionello, Carla da Fontoura; Morel, Danielle Soares; Paineiras-Domingos, Laisa Liane; de Souza, Patricia Lopes; Brandão-Sobrinho-Neto, Samuel; Carvalho-Lima, Rafaelle Pacheco; Guedes-Aguiar, Eliane de Oliveira; Costa-Cavalcanti, Rebeca Graça; Kutter, Cristiane Ribeiro; Bernardo-Filho, Mario

    2017-01-01

    Background: Dengue is considered as an important arboviral disease. Safe, low-cost, and effective drugs that possess inhibitory activity against dengue virus (DENV) are mostly needed to try to combat the dengue infection worldwide. Medicinal plants have been considered as an important alternative to manage several diseases, such as dengue. As authors have demonstrated the antiviral effect of medicinal plants against DENV, the aim of this study was to review systematically the published research concerning the use of medicinal plants in the management of dengue using the PubMed database. Materials and Methods: Search and selection of publications were made using the PubMed database following the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA statement). Results: Six publications met the inclusion criteria and were included in the final selection after thorough analysis. Conclusion: It is suggested that medicinal plants’ products could be used as potential anti-DENV agents. PMID:28740942

  4. Development of the ageing management database of PUSPATI TRIGA reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramli, Nurhayati, E-mail: nurhayati@nm.gov.my; Tom, Phongsakorn Prak; Husain, Nurfazila

    Since its first criticality in 1982, the PUSPATI TRIGA Reactor (RTP) has been operated for more than 30 years. As RTP grows older, ageing has emerged as a prominent issue. To address it, an Ageing Management (AgeM) database for managing related ageing matters was systematically developed. This paper presents the development of the AgeM database, taking into account all major RTP Systems, Structures and Components (SSCs) and the ageing mechanisms of these SSCs as tracked through the system surveillance program.

  5. The Design and Implementation of a Relational to Network Query Translator for a Distributed Database Management System.

    DTIC Science & Technology

    1985-12-01

    THE DESIGN AND IMPLEMENTATION OF A RELATIONAL TO NETWORK QUERY TRANSLATOR FOR A DISTRIBUTED DATABASE MANAGEMENT SYSTEM. Thesis presented to the Faculty of the School of Engineering of the Air Force Institute of Technology, Air University, in partial fulfillment of the requirements for the degree of Master of Science in Computer Systems. Kevin H. Mahoney, Captain, USAF (AFIT/GCS/ENG/85D-7).

  6. Aero/fluids database system

    NASA Technical Reports Server (NTRS)

    Reardon, John E.; Violett, Duane L., Jr.

    1991-01-01

    The AFAS Database System was developed to provide the basic structure of a comprehensive database system for the Marshall Space Flight Center (MSFC) Structures and Dynamics Laboratory Aerophysics Division. The system is intended to handle all of the Aerophysics Division Test Facilities as well as data from other sources. The system was written for the DEC VAX family of computers in FORTRAN-77 and utilizes the VMS indexed file system and screen management routines. Various aspects of the system are covered, including a description of the user interface, lists of all code structure elements, descriptions of the file structures, a description of the security system operation, a detailed description of the data retrieval tasks, a description of the session log, and a description of the archival system.

  7. Insect infestations crop development and evolving management approaches on a northeast Arkansas cotton farm

    USDA-ARS?s Scientific Manuscript database

    COTMAN information, cotton production records and insect scouting reports for Wildy Farms in Mississippi County, Arkansas were organized into large databases and studied for variability among years and fields in a wide range of crop and insect indices. The study included records from 126 individual...

  8. A Virtual "Hello": A Web-Based Orientation to the Library.

    ERIC Educational Resources Information Center

    Borah, Eloisa Gomez

    1997-01-01

    Describes the development of Web-based library services and resources available at the Rosenfeld Library of the Anderson Graduate School of Management at University of California at Los Angeles. Highlights include library orientation sessions; virtual tours of the library; a database of basic business sources; and research strategies, including…

  9. Converting the H. W. Wilson Company Indexes to an Automated System: A Functional Analysis.

    ERIC Educational Resources Information Center

    Regazzi, John J.

    1984-01-01

    Description of the computerized information system that supports the editorial and manufacturing processes involved in creation of Wilson's subject indexes and catalogs includes the major subsystems--online data entry, batch input processing, validation and release, file generation and database management, online and offline retrieval, publication…

  10. Common Sense Planning for a Computer, or, What's It Worth to You?

    ERIC Educational Resources Information Center

    Crawford, Walt

    1984-01-01

    Suggests factors to be considered in planning for the purchase of a microcomputer, including budgets, benefits, costs, and decisions. Major uses of a personal computer are described--word processing, financial analysis, file and database management, programming and computer literacy, education, entertainment, and thrill of high technology. (EJS)

  11. 1995 AAAS annual meeting and science innovation exposition: Unity in diversity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strauss, M.S.; Heasley, C.

    1995-12-31

    Abstracts are presented from the 161st National Meeting of the American Association for the advancement of Science. Topics include environmental technologies, genetics, physical science research, information management, nuclear weapon issues, and education. Individual topics have been processed separately for the United States Department of Energy databases.

  12. Tank 241-AY-102 Leak Assessment Supporting Documentation: Miscellaneous Reports, Letters, Memoranda, And Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engeman, J. K.; Girardot, C. L.; Harlow, D. G.

    2012-12-20

    This report contains reference materials cited in RPP-ASMT -53793, Tank 241-AY-102 Leak Assessment Report, that were obtained from the National Archives Federal Records Repository in Seattle, Washington, or from other sources including the Hanford Site's Integrated Data Management System database (IDMS).

  13. Consortial IT Services: Collaborating To Reduce the Pain.

    ERIC Educational Resources Information Center

    Klonoski, Ed

    The Connecticut Distance Learning Consortium (CTDLC) provides its 32 members with Information Technologies (IT) services including a portal Web site, course management software, course hosting and development, faculty training, a help desk, online assessment, and a student financial aid database. These services are supplied to two- and four-year…

  14. Using dBASE II for Bibliographic Files.

    ERIC Educational Resources Information Center

    Sullivan, Jeanette

    1985-01-01

    Describes use of a database management system (dBASE II, produced by Ashton-Tate), noting best features and disadvantages. Highlights include data entry, multiple access points available, training requirements, use of dBASE for a bibliographic application, auxiliary software, and dBASE updates. Sample searches, auxiliary programs, and requirements…

  15. Networking the Light Fantastic--CD-ROMs on LANs.

    ERIC Educational Resources Information Center

    Kittle, Paul W.

    1992-01-01

    Describes the development of a local area network (LAN) at Loma Linda University that allows remote access for both IBM and Macintosh microcomputers to CD-ROMs. Topics discussed include types of networks; fiber optic technology; networking CD-ROM drives; remote access; modems; CD-ROM databases; memory management; interface software; and future…

  16. A review of the use of bromelain in cardiovascular diseases.

    PubMed

    Ley, Chit Moy; Tsiami, Amalia; Ni, Qing; Robinson, Nicola

    2011-07-01

    In 2004 an estimated 17.1 million people died from cardiovascular diseases (CVDs) worldwide, representing 29% of all global deaths. According to the American Heart Association, heart disease and stroke are the main cause of death and disability among people with type 2 diabetes. Additional safe and effective approaches are needed for the prevention and management of CVDs, which may include nutritional supplements. To identify the potential of bromelain (a food supplement) on the risk factors associated with CVDs. An electronic and manual search was conducted during November 2009 to March 2010. The databases searched included: Ovid MEDLINE; All EBM Reviews-Cochrane Database of Systematic Reviews (Cochrane DSR), American College of Physicians (ACP) Journal Club, Database of Abstracts of Reviews of Effects (DARE), Cochrane Central Register of Controlled Trials (CCTR), Cochrane Methodology Register (CMR), Health Technology Assessment (HTA) and National Health Service Economic Evaluation Database (NHSEED); Allied and Complementary Medicine (AMED); British Nursing Index and Archive; EMBASE; Health Management Information Consortium (HMIC); Science Direct and Electronic Thesis Online Services (ETHOS). Only papers in the English language were included. Randomised controlled trials (RCTs), human studies, animal studies and experimental studies related to bromelain for CVDs. The quality assessment of all the selected studies was conducted by the authors. Data from 3 animal trials and 3 human trials were included in the review. Data collected included: type of trial, drug dosage, duration, outcome measures, characteristics of bromelain used, significance of results and conclusion. Out of 223 papers retrieved, 6 papers met the inclusion criteria and could be included in the review. These comprised 3 animal and 3 human trials, each of which investigated the use of bromelain for CVDs.
Results suggested that bromelain could be used for treating acute thrombophlebitis, as it decreases aggregation of blood platelets, has a cardio-protective effect, ameliorates rejection-induced arterial wall remodelling, and prevents thrombin-induced human platelet aggregation as well as reducing thrombus formation. No substantive study of bromelain and clinical CVDs has been carried out in human populations. Only a few studies on bromelain and CVDs were published from 1948 to 2010. This may be an area worth exploring in future CVD research.

  17. National Readmission Patterns of Isolated Splenic Injuries Based on Initial Management Strategy.

    PubMed

    Rosenberg, Graeme M; Knowlton, Lisa; Rajasingh, Charlotte; Weng, Yingjie; Maggio, Paul M; Spain, David A; Staudenmayer, Kristan L

    2017-12-01

    Options for managing splenic injuries have evolved with a focus on nonoperative management. Long-term outcomes, such as readmissions and delayed splenectomy rate, are not well understood. To describe the natural history of isolated splenic injuries in the United States and determine whether patterns of readmission were influenced by management strategy. The Healthcare Cost and Utilization Project's Nationwide Readmission Database is an all-payer, all-ages, longitudinal administrative database that provides data on more than 35 million weighted US discharges yearly. The database was used to identify patients with isolated splenic injuries and the procedures that they received. Adult patients with isolated splenic injuries admitted from January 1 through June 30, 2013, and from January 1 through June 30, 2014, were included. Those who died during the index hospitalization or who had an additional nonsplenic injury with an Abbreviated Injury Score of 2 or greater were excluded. Univariate and mixed-effects logistic regression analyses controlling for center effect were used. Weighted numbers are reported. Initial management strategy at the time of index hospitalization, including nonprocedural management, angioembolization, and splenectomy. All-cause 6-month readmission rate. Secondary outcome was delayed splenectomy rate. A weighted sample of 3792 patients (2146 men [56.6%] and 1646 women [43.4%]; mean [SE] age, 48.5 [0.7] years) with 5155 admission events was included. During the index hospitalization, 825 (21.8%) underwent splenectomy, 293 (7.7%) underwent angioembolization, and 2673 (70.5%) had no procedure. The overall readmission rate was 21.1% (799 patients). Readmission rates did not differ based on initial management strategy (195 patients undergoing splenectomy [23.6%], 70 undergoing angioembolization [23.9%], and 534 undergoing no procedure [20.0%]; P = .33).
Splenectomy was performed in 36 of 799 readmitted patients (4.5%) who did not have a splenectomy at their index hospitalization, leading to an overall delayed splenectomy rate of 1.2% (36 of 2967 patients). In mixed-effects logistic regression analysis controlling for patient, injury, clinical, and hospital characteristics, the choice of splenectomy (odds ratio, 0.93; 95% CI, 0.66-1.31) vs angioembolization (odds ratio, 1.19; 95% CI, 0.72-1.97) as initial management strategy was not associated with readmission. This national evaluation of the natural history of isolated splenic injuries from index admission through 6 months found that approximately 1 in 5 patients are readmitted within 6 months of discharge after an isolated splenic injury. However, the chance of readmission for splenectomy after initial nonoperative management was 1.2%. This finding suggests that the current management strategies used for isolated splenic injuries in the United States are well matched to patient need.

  18. Cost-effectiveness of pharmaceutical management for osteoarthritis pain: a systematic review of the literature and recommendations for future economic evaluation.

    PubMed

    Xie, Feng; Tanvejsilp, Pimwara; Campbell, Kaitryn; Gaebel, Kathryn

    2013-05-01

    Osteoarthritis (OA) is a highly prevalent and chronic condition characterized by pain and physical disability. Currently, many treatments are available, and they primarily target pain relief. The objectives of this study were to systematically review economic evaluations for pharmaceutical management of OA pain and to provide methodological recommendations for future economic evaluation. Published literature was identified by searching the following bibliographic databases: MEDLINE (1948-16 November 2011) with In-Process records and EMBASE (1980-2011 Week 47) via Ovid; The Cochrane Library (Issue 4 of 4, 2011) and the Health Economic Evaluations Database (HEED) via Wiley; and PubMed (for non-MEDLINE records). The main search terms were OA and economic evaluations. Two reviewers independently screened all identified articles and extracted the data from those included in the final review. Twelve articles reporting the cost-effectiveness of various pharmaceuticals were included, with five being trial-based and seven being model-based economic evaluations. The mean health economics quality score of the included articles was 84 (minimum-maximum: 63-99). These evaluations varied in study design, treatments compared, and outcomes measured. The existing economic evaluations on pharmaceutical management of OA pain were of acceptable quality. Comparability of economic evaluations could be improved by selecting standard comparators, adopting a longer time horizon, and directly measuring health utilities.

  19. Phynx: an open source software solution supporting data management and web-based patient-level data review for drug safety studies in the general practice research database and other health care databases.

    PubMed

    Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan

    2010-01-01

    To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, Web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. This system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common Web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases considering the individual clinical context. It can therefore make an important contribution to an efficient validation of outcome assessment in drug safety database studies.

  20. A hierarchical spatial framework and database for the national river fish habitat condition assessment

    USGS Publications Warehouse

    Wang, L.; Infante, D.; Esselman, P.; Cooper, A.; Wu, D.; Taylor, W.; Beard, D.; Whelan, G.; Ostroff, A.

    2011-01-01

    Fisheries management programs, such as the National Fish Habitat Action Plan (NFHAP), urgently need a nationwide spatial framework and database for health assessment and policy development to protect and improve riverine systems. To meet this need, we developed a spatial framework and database using the National Hydrography Dataset Plus (1:100,000 scale; http://www.horizon-systems.com/nhdplus). This framework uses interconfluence river reaches and their local and network catchments as fundamental spatial river units and a series of ecological and political spatial descriptors as hierarchy structures to allow users to extract or analyze information at spatial scales that they define. This database consists of variables describing channel characteristics, network position/connectivity, climate, elevation, gradient, and size. It contains a series of catchment natural and human-induced factors that are known to influence river characteristics. Our framework and database assemble all river reaches and their descriptors in one place for the first time for the conterminous United States. This framework and database provides users with the capability of adding data, conducting analyses, developing management scenarios and regulation, and tracking management progress at a variety of spatial scales. This database provides the essential data needs for achieving the objectives of NFHAP and other management programs. The downloadable beta version database is available at http://ec2-184-73-40-15.compute-1.amazonaws.com/nfhap/main/.
