Sample records for central database management

  1. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  2. Centralized Data Management in a Multicountry, Multisite Population-based Study.

    PubMed

    Rahman, Qazi Sadeq-ur; Islam, Mohammad Shahidul; Hossain, Belal; Hossain, Tanvir; Connor, Nicholas E; Jaman, Md Jahiduj; Rahman, Md Mahmudur; Ahmed, A S M Nawshad Uddin; Ahmed, Imran; Ali, Murtaza; Moin, Syed Mamun Ibne; Mullany, Luke; Saha, Samir K; El Arifeen, Shams

    2016-05-01

    A centralized data management system was developed for data collection and processing for the Aetiology of Neonatal Infection in South Asia (ANISA) study. ANISA is a longitudinal cohort study involving neonatal infection surveillance and etiology detection at multiple sites in South Asia. The primary goal of designing such a system was to collect and store data from different sites in a standardized way so the data can be pooled for analysis. We designed the data management system centrally and implemented it to enable data entry at individual sites. The system uses validation rules and audits that reduce errors. The study sites employ a dual data entry method to minimize keystroke errors. They upload collected data weekly to a central server via the Internet to create a pooled central database. Any inconsistent data identified in the central database are flagged and corrected after discussion with the relevant site. The ANISA Data Coordination Centre in Dhaka provides technical support for operating, maintaining, and updating the data management system centrally. Password-protected login identifications and audit trails are maintained to ensure the integrity and safety of stored data. Centralized management of the ANISA database makes it possible to use common data capture forms (DCFs) adapted to site-specific contextual requirements. DCFs and data entry interfaces allow on-site data entry, which reduces workload because DCFs do not need to be shipped to a single location for entry. It also improves data quality, as all data collected for ANISA go through the same quality check and cleaning process.
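
    A minimal illustrative sketch of the dual data entry check described above (not the actual ANISA software; field names and values are invented): the two independent entries of the same form are compared field by field, and any disagreement is flagged for resolution before the weekly upload.

    ```python
    # Hypothetical sketch of dual data entry comparison; not ANISA code.
    def compare_dual_entries(first_pass: dict, second_pass: dict) -> list:
        """Return (field, value_1, value_2) tuples where the two entries disagree."""
        mismatches = []
        for field in sorted(set(first_pass) | set(second_pass)):
            v1, v2 = first_pass.get(field), second_pass.get(field)
            if v1 != v2:
                mismatches.append((field, v1, v2))
        return mismatches

    # Example: the second operator mistyped the weight.
    entry_a = {"participant_id": "0042", "weight_kg": "3.2", "sex": "F"}
    entry_b = {"participant_id": "0042", "weight_kg": "2.3", "sex": "F"}
    for field, v1, v2 in compare_dual_entries(entry_a, entry_b):
        print(f"Mismatch in {field}: {v1!r} vs {v2!r} -- resolve before upload")
    ```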

  3. Data bases for forest inventory in the North-Central Region.

    Treesearch

    Jerold T. Hahn; Mark H. Hansen

    1985-01-01

    Describes the data collected by the Forest Inventory and Analysis (FIA) Research Work Unit at the North Central Forest Experiment Station. Explains how interested parties may obtain information from the databases either through direct access or by special requests to the FIA database manager.

  4. Adopting a corporate perspective on databases. Improving support for research and decision making.

    PubMed

    Meistrell, M; Schlehuber, C

    1996-03-01

    The Veterans Health Administration (VHA) is at the forefront of designing and managing health care information systems that accommodate the needs of clinicians, researchers, and administrators at all levels. Rather than using one single-site, centralized corporate database, VHA has constructed several large databases with different configurations to meet the needs of users with different perspectives. The largest VHA database is the Decentralized Hospital Computer Program (DHCP), a multisite, distributed data system that uses decoupled hospital databases. The centralization of DHCP policy has promoted data coherence, whereas the decentralization of DHCP management has permitted system development to be done with maximum relevance to the users' local practices. A more recently developed VHA data system, the Event Driven Reporting system (EDR), uses multiple, highly coupled databases to provide workload data at facility, regional, and national levels. The EDR automatically posts a subset of DHCP data to local and national VHA management. The development of the EDR illustrates how adoption of a corporate perspective can offer significant database improvements at reasonable cost and with modest impact on the legacy system.

  5. Small Business Innovations (Integrated Database)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Because of the diversity of NASA's information systems, it was necessary to develop DAVID as a central database management system. Under a Small Business Innovation Research (SBIR) grant, Ken Wanderman and Associates, Inc. designed software tools enabling scientists to interface with DAVID and commercial database management systems, as well as artificial intelligence programs. The software has been installed at a number of data centers and is commercially available.

  6. [Research on Zhejiang blood information network and management system].

    PubMed

    Yan, Li-Xing; Xu, Yan; Meng, Zhong-Hua; Kong, Chang-Hong; Wang, Jian-Min; Jin, Zhen-Liang; Wu, Shi-Ding; Chen, Chang-Shui; Luo, Ling-Fei

    2007-02-01

    This research aimed to develop the first province-wide centralized blood information database and real-time communication network in China. Multiple technologies were used, including separate operation of local area network databases, a real-time data concentration and distribution mechanism, off-site (allopatric) backup, and an optical fiber virtual private network (VPN). As a result, the centralized blood information database and management system were successfully constructed, covering all of Zhejiang province, and real-time exchange of blood data was realised. In conclusion, its implementation promotes volunteer blood donation and ensures blood safety in Zhejiang, and in particular strengthens the quick response to public health emergencies. This project lays the foundation for centralized testing and allotment among blood banks in Zhejiang, and can serve as a reference for contemporary blood bank information systems in China.

  7. MST radar data-base management

    NASA Technical Reports Server (NTRS)

    Wickwar, V. B.

    1983-01-01

    Data management for Mesospheric-Stratospheric-Tropospheric (MST) radars is addressed. An incoherent-scatter radar database is discussed in terms of purpose, centralization, scope, and the nature of the database management system.

  8. The North Central Forest Inventory and Analysis timber product output database--a regional composite approach.

    Treesearch

    Dennis M. May

    1998-01-01

    Discusses a regional composite approach to managing timber product output data in a relational database. Describes the development and structure of the regional composite database and demonstrates its use in addressing everyday timber product output information needs.

  9. A centralized informatics infrastructure for the National Institute on Drug Abuse Clinical Trials Network.

    PubMed

    Pan, Jeng-Jong; Nahm, Meredith; Wakim, Paul; Cushing, Carol; Poole, Lori; Tai, Betty; Pieper, Carl F

    2009-02-01

    Clinical trial networks (CTNs) were created to provide a sustaining infrastructure for the conduct of multisite clinical trials. As such, they must withstand changes in membership. Centralization of infrastructure, including knowledge management, portfolio management, information management, process automation, work policies, and procedures, in clinical research networks facilitates consistency and ultimately the research itself. In 2005, the National Institute on Drug Abuse (NIDA) CTN transitioned from a distributed data management model to a centralized informatics infrastructure to support the network's trial activities and administration. We describe the centralized informatics infrastructure and discuss our challenges to inform others considering such an endeavor. During the migration of a clinical trial network from a decentralized to a centralized data center model, descriptive data were captured and are presented here to assess the impact of centralization. We present the framework for the informatics infrastructure and evaluative metrics. The network has decreased the time from last patient-last visit to database lock from an average of 7.6 months to 2.8 months. The average database error rate decreased from 0.8% to 0.2%, with a corresponding decrease in the interquartile range from 0.04%-1.0% before centralization to 0.01%-0.27% after centralization. Centralization has provided the CTN with integrated trial status reporting and the first standards-based public data share. A preliminary cost-benefit analysis showed a 50% reduction in data management cost per study participant over the life of a trial. A single clinical trial network comprising addiction researchers and community treatment programs was assessed, so the findings may not be applicable to other research settings. The identified informatics components provide the information and infrastructure needed for our clinical trial network. Post-centralization, data management operations are more efficient and less costly, with higher data quality.
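
    As a rough illustration of the metrics reported above, the sketch below computes a database error rate (errors per fields audited, as a percentage) and the interquartile range across studies; all numbers are invented, not the NIDA CTN's data.

    ```python
    # Hedged sketch: error-rate and IQR arithmetic with invented inputs.
    def error_rate_pct(errors: int, fields_audited: int) -> float:
        return 100.0 * errors / fields_audited

    def interquartile_range(values: list) -> tuple:
        s = sorted(values)
        def percentile(p):  # linear interpolation between closest ranks
            k = (len(s) - 1) * p
            lo, hi = int(k), min(int(k) + 1, len(s) - 1)
            return s[lo] + (s[hi] - s[lo]) * (k - lo)
        return percentile(0.25), percentile(0.75)

    audits = [(12, 9000), (4, 8000), (20, 7500)]  # (errors, fields) per study
    rates = [error_rate_pct(e, n) for e, n in audits]
    q1, q3 = interquartile_range(rates)
    print(f"error rates: {[f'{r:.2f}%' for r in rates]}, IQR {q1:.2f}%-{q3:.2f}%")
    ```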

  10. The relational clinical database: a possible solution to the star wars in registry systems.

    PubMed

    Michels, D K; Zamieroski, M

    1990-12-01

    In summary, having data from other service areas available in a relational clinical database could resolve many of the problems existing in today's registry systems. Uniting sophisticated information systems into a centralized database system could definitely be a corporate asset in managing the bottom line.

  11. Economic analysis of centralized vs. decentralized electronic data capture in multi-center clinical studies.

    PubMed

    Walden, Anita; Nahm, Meredith; Barnett, M Edwina; Conde, Jose G; Dent, Andrew; Fadiel, Ahmed; Perry, Theresa; Tolk, Chris; Tcheng, James E; Eisenstein, Eric L

    2011-01-01

    New data management models are emerging in multi-center clinical studies. We evaluated the incremental costs associated with decentralized vs. centralized models. We developed clinical research network economic models to evaluate three data management models: centralized, decentralized with local software, and decentralized with shared database. Descriptive information from three clinical research studies served as inputs for these models. The primary outcome was total data management costs. Secondary outcomes included: data management costs for sites, local data centers, and central coordinating centers. Both decentralized models were more costly than the centralized model for each clinical research study: the decentralized with local software model was the most expensive. Decreasing the number of local data centers and case book pages reduced cost differentials between models. Decentralized vs. centralized data management in multi-center clinical research studies is associated with increases in data management costs.
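
    The comparison lends itself to a simple cost model. The sketch below is a toy version under invented figures (not the study's actual inputs): each model's total cost is a coordinating-center fixed cost plus per-site costs, with the decentralized model adding per-local-data-center costs.

    ```python
    # Toy cost model; all dollar figures are invented for illustration.
    def total_cost(central_fixed: float, site_cost: float, n_sites: int,
                   local_dc_cost: float = 0.0, n_local_dcs: int = 0) -> float:
        return central_fixed + site_cost * n_sites + local_dc_cost * n_local_dcs

    centralized = total_cost(500_000, 20_000, 30)
    decentralized = total_cost(350_000, 20_000, 30,
                               local_dc_cost=90_000, n_local_dcs=5)
    print(f"centralized:   ${centralized:,.0f}")
    print(f"decentralized: ${decentralized:,.0f} "
          f"(difference ${decentralized - centralized:,.0f})")
    ```

    Shrinking n_local_dcs narrows the gap, mirroring the finding that reducing the number of local data centers reduces the cost differential between models.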

  12. Economic Analysis of Centralized vs. Decentralized Electronic Data Capture in Multi-Center Clinical Studies

    PubMed Central

    Walden, Anita; Nahm, Meredith; Barnett, M. Edwina; Conde, Jose G.; Dent, Andrew; Fadiel, Ahmed; Perry, Theresa; Tolk, Chris; Tcheng, James E.; Eisenstein, Eric L.

    2012-01-01

    Background: New data management models are emerging in multi-center clinical studies. We evaluated the incremental costs associated with decentralized vs. centralized models. Methods: We developed clinical research network economic models to evaluate three data management models: centralized, decentralized with local software, and decentralized with shared database. Descriptive information from three clinical research studies served as inputs for these models. Main Outcome Measures: The primary outcome was total data management costs. Secondary outcomes included data management costs for sites, local data centers, and central coordinating centers. Results: Both decentralized models were more costly than the centralized model for each clinical research study: the decentralized with local software model was the most expensive. Decreasing the number of local data centers and case book pages reduced cost differentials between models. Conclusion: Decentralized vs. centralized data management in multi-center clinical research studies is associated with increases in data management costs. PMID:21335692

  13. Databases in the Central Government : State-of-the-art and the Future

    NASA Astrophysics Data System (ADS)

    Ohashi, Tomohiro

    The Management and Coordination Agency, Prime Minister's Office, conducted a questionnaire survey of all Japanese Ministries and Agencies in November 1985 on the present status of databases produced, or planned to be produced, by the central government. According to the results, 132 databases had been produced in 19 Ministries and Agencies. Many of these databases are held by the Defence Agency, the Ministry of Construction, the Ministry of Agriculture, Forestry & Fisheries, and the Ministry of International Trade & Industry, and are in the fields of architecture & civil engineering, science & technology, R&D, agriculture, forestry, and fishery. However, only 39 percent of the produced databases are available to other Ministries and Agencies, while 60 percent are unavailable to them, largely because they are in-house databases. The outline of the survey results is reported, and the databases produced by the central government are introduced under the items of (1) databases commonly used by all Ministries and Agencies, (2) integrated databases, (3) statistical databases, and (4) bibliographic databases. Future problems are also described from the viewpoints of technology development and mutual use of databases.

  14. Database on Demand: insight how to build your own DBaaS

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, Ruben; Coterillo Coz, Ignacio

    2015-12-01

    At CERN, a number of key database applications run on user-managed MySQL, PostgreSQL and Oracle database services. The Database on Demand (DBoD) project was born out of an idea to provide the CERN user community with an environment in which to develop and run database services, as a complement to the central Oracle-based database service. Database on Demand empowers users to perform certain actions that had traditionally been done by database administrators, providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines; at present, offerings cover three major RDBMS (relational database management system) vendors. In this article we show the current status of the service after almost three years of operation, give some insight into our software engineering redesign, and outline its near-future evolution.

  15. Six steps to an effective denials management program.

    PubMed

    Robertson, Brian; Doré, Alexander

    2005-09-01

    The following six steps can help you manage denials in your organization: Create standard definitions of denial types. Establish a denial hierarchy. Establish a centralized denial database. Develop key performance indicators. Build responsibility matrices. Measure, monitor, and take action.

  16. Computerizing Maintenance Management Improves School Processes.

    ERIC Educational Resources Information Center

    Conroy, Pat

    2002-01-01

    Describes how a Computerized Maintenance Management System (CMMS), a centralized maintenance operations database that facilitates work order procedures and staff directives, can help individual school campuses and school districts to manage maintenance. Presents the benefits of CMMS and things to consider in CMMS selection. (EV)

  17. NCO Production Management Branch

    Science.gov Websites

    About the NCO Production Management Branch: NCO Request for Change (RFC) database access and the NCO RFC archive [for internal use only].

  18. 7 CFR 274.3 - Retailer management.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... retailer, and it must include acceptable privacy and security features. Such systems shall only be... terminals that are capable of relaying electronic transactions to a central database computer for... specifications prior to implementation of the EBT system to enable third party processors to access the database...

  19. BADERI: an online database to coordinate handsearching activities of controlled clinical trials for their potential inclusion in systematic reviews.

    PubMed

    Pardo-Hernandez, Hector; Urrútia, Gerard; Barajas-Nava, Leticia A; Buitrago-Garcia, Diana; Garzón, Julieth Vanessa; Martínez-Zapata, María José; Bonfill, Xavier

    2017-06-13

    Systematic reviews provide the best evidence on the effect of health care interventions. They rely on comprehensive access to the available scientific literature. Electronic search strategies alone may not suffice, requiring the implementation of a handsearching approach. We have developed a database to provide an Internet-based platform from which handsearching activities can be coordinated, including a procedure to streamline the submission of these references into CENTRAL, the Cochrane Collaboration Central Register of Controlled Trials. We developed a database and performed a descriptive analysis. Through brainstorming and discussion among stakeholders involved in handsearching projects, we designed a database that met the needs identified as necessary to ensure the viability of handsearching activities. Three handsearching teams pilot tested the proposed database. Once the final version of the database was approved, we proceeded to train the staff involved in handsearching. The proposed database is called BADERI (Database of Iberoamerican Clinical Trials and Journals, by its initials in Spanish). BADERI was officially launched in October 2015, and it can be accessed at www.baderi.com/login.php free of cost. BADERI has an administration subsection, from which the roles of users are managed; a references subsection, where information associated with identified controlled clinical trials (CCTs) can be entered; a reports subsection, from which reports can be generated to track and analyse the results of handsearching activities; and a built-in free text search engine. BADERI allows all references to be exported in ProCite files that can be directly uploaded into CENTRAL. To date, 6284 references to CCTs have been uploaded to BADERI and sent to CENTRAL. The identified CCTs were published in a total of 420 journals related to 46 medical specialties. The year of publication ranged from 1957 to 2016. BADERI allows the efficient management of handsearching activities across different countries and institutions. References to all CCTs available in BADERI can be readily submitted to CENTRAL for their potential inclusion in systematic reviews.

  20. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Each frame transform is updated by its owner: FM updates the site and saved frames for the surface tree, and as the rover drives to a new area, a new site frame with an incremented site index can be created. Several client modules, including ARM and RSM (Remote Sensing Mast), update the rover frames they own. Through the onboard centralized FM frame tree database, client modules can query the transform between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of the frame tree eliminates cumbersome, error-prone calculations of coordinate entries for commands and thus simplifies flight operations significantly.
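
    The core idea, querying the transform between any two frames from a tree in which each frame stores only its transform to its parent, can be sketched as follows. This is a conceptual illustration, not MSL flight code; the frame names and offsets are invented.

    ```python
    # Conceptual frame-tree sketch (not flight software).
    import numpy as np

    class Frame:
        def __init__(self, name, parent=None, xform=None):
            self.name, self.parent = name, parent
            # 4x4 homogeneous transform taking this frame's coordinates to its parent's
            self.xform = np.eye(4) if xform is None else xform

        def to_root(self):
            """Compose transforms up the tree to the root frame."""
            m, f = np.eye(4), self
            while f is not None:
                m = f.xform @ m
                f = f.parent
            return m

    def transform_between(src, dst):
        """Transform taking coordinates in src to coordinates in dst."""
        return np.linalg.inv(dst.to_root()) @ src.to_root()

    def translate(x, y, z):
        m = np.eye(4); m[:3, 3] = [x, y, z]; return m

    site = Frame("site")
    rover = Frame("rover", site, translate(10.0, 5.0, 0.0))
    rsm = Frame("rsm", rover, translate(0.7, 0.0, 1.1))
    print(transform_between(rsm, site)[:3, 3])  # RSM origin in site coordinates
    ```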

  21. Importance of Data Management in a Long-term Biological Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Sigurd W; Brandt, Craig C; McCracken, Kitty

    2011-01-01

    The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program. An existing relational database was adapted and extended to handle biological data. Data modeling enabled the program's database to process, store, and retrieve its data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was the establishment of standards for sampling site names, taxonomic identification, flagging, and other components. There are limitations: some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.

  22. Importance of Data Management in a Long-Term Biological Monitoring Program

    NASA Astrophysics Data System (ADS)

    Christensen, Sigurd W.; Brandt, Craig C.; McCracken, Mary K.

    2011-06-01

    The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to meeting this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program when an existing relational database was adapted and extended to handle biological data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was the establishment of standards for sampling site names, taxonomic identification, flagging, and other components. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. We also discuss some limitations to our implementation: some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.
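
    As a small illustration of the standardization both abstracts emphasize, the sketch below validates an incoming sampling record against reference tables of approved site names and taxonomic codes; the table contents are invented examples, not actual BMAP reference data.

    ```python
    # Hypothetical reference-table validation; site and taxon values are invented.
    VALID_SITES = {"EFK 13.8", "EFK 23.4", "BFK 7.6"}
    VALID_TAXA = {"SALTR": "Salmo trutta", "LEPMA": "Lepomis macrochirus"}

    def validate_record(record: dict) -> list:
        """Return a list of flag strings; an empty list means the record is clean."""
        flags = []
        if record.get("site") not in VALID_SITES:
            flags.append(f"unknown site: {record.get('site')!r}")
        if record.get("taxon_code") not in VALID_TAXA:
            flags.append(f"unknown taxon code: {record.get('taxon_code')!r}")
        return flags

    rec = {"site": "EFK 13.8", "taxon_code": "SALTR", "length_mm": 212}
    print(validate_record(rec) or "record passes reference-table checks")
    ```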

  23. Facility Registry Service (FRS)

    EPA Pesticide Factsheets

    This is a centrally managed database that identifies facilities either subject to environmental regulations or of environmental interest, providing an integrated source of air, water, and waste environmental data.

  24. The Microcomputer in the Administrative Office.

    ERIC Educational Resources Information Center

    Huntington, Fred

    1983-01-01

    Discusses microcomputer uses for administrative computing in education at site level and central office and recommends that administrators start with a word processing program for time management, an electronic spreadsheet for financial accounting, a database management system for inventories, and self-written programs to alleviate paper…

  25. SSCR Automated Manager (SAM) release 1.1 reference manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-10-01

    This manual provides instructions for using the SSCR Automated Manager (SAM) to manage System Software Change Records (SSCRs) online. SSCRs are forms required to document all system software changes for the Martin Marietta Energy Systems, Inc., Central computer systems. SAM, a program developed at Energy Systems, is accessed through IDMS/R (Integrated Database Management System) on an IBM system.

  26. RISK MANAGEMENT USING PROJECT RECON

    DTIC Science & Technology

    2016-11-28

    Project Recon (formerly Risk Recon) is a web-based GOTS tool designed to capture, manage, and link Risks, Issues, and Opportunities in a centralized database. It is designed to be used by all Program Management Offices, Integrated Project Teams and any… (Bonnie Leece, Project Recon Lead. UNCLASSIFIED: Distribution Statement A. Approved for public release; distribution is unlimited.)

  27. Economic evaluation of manual therapy for musculoskeletal diseases: a protocol for a systematic review and narrative synthesis of evidence.

    PubMed

    Kim, Chang-Gon; Mun, Su-Jeong; Kim, Ka-Na; Shin, Byung-Cheul; Kim, Nam-Kwen; Lee, Dong-Hyo; Lee, Jung-Han

    2016-05-13

    Manual therapy is the non-surgical conservative management of musculoskeletal disorders using the practitioner's hands on the patient's body for diagnosing and treating disease. The aim of this study is to systematically review trial-based economic evaluations of manual therapy relative to other interventions used for the management of musculoskeletal diseases. Randomised clinical trials (RCTs) on the economic evaluation of manual therapy for musculoskeletal diseases will be included in the review. The following databases will be searched from their inception: Medline, Embase, the Cochrane Central Register of Controlled Trials (CENTRAL), Cumulative Index to Nursing and Allied Health Literature (CINAHL), Econlit, Mantis, Index to Chiropractic Literature, Science Citation Index, Social Science Citation Index, Allied and Complementary Medicine Database (AMED), Cochrane Database of Systematic Reviews (CDSR), National Health Service Database of Abstracts of Reviews of Effects (NHS DARE), National Health Service Health Technology Assessment Database (NHS HTA), National Health Service Economic Evaluation Database (NHS EED), five Korean medical databases (Oriental Medicine Advanced Searching Integrated System (OASIS), Research Information Service System (RISS), DBPIA, Korean Traditional Knowledge Portal (KTKP) and KoreaMed) and three Chinese databases (China National Knowledge Infrastructure (CNKI), VIP and Wanfang). The evidence for the cost-effectiveness, cost-utility and cost-benefit of manual therapy for musculoskeletal diseases will be assessed as the primary outcome. Health-related quality of life and adverse effects will be assessed as secondary outcomes. We will critically appraise the included studies using the Cochrane risk of bias tool and the Drummond checklist. Results will be summarised using Slavin's qualitative best-evidence synthesis approach. The results of the study will be disseminated via a peer-reviewed journal and/or conference presentations. PROSPERO CRD42015026757.

  28. Strategies for managing impressions of racial identity in the workplace.

    PubMed

    Roberts, Laura Morgan; Cha, Sandra E; Kim, Sung Soo

    2014-10-01

    This article deepens understanding of the workplace experiences of racial minorities by investigating racial identity-based impression management (RIM) by Asian American journalists. Racial centrality, directly or indirectly, predicted the use of 4 RIM strategies (avoidance, enhancement, affiliation, and racial humor). Professional centrality also predicted strategy use, which was related to life satisfaction and perceived career success. By shedding light on proactive strategies that individuals use to influence colleagues' impressions of their racial identity, we contribute to research on diversity in organizations, impression management, and racial identity.

  29. The representation of manipulable solid objects in a relational database

    NASA Technical Reports Server (NTRS)

    Bahler, D.

    1984-01-01

    This project is concerned with the interface between database management and solid geometric modeling. The desirability of integrating computer-aided design, manufacture, testing, and management into a coherent system is by now well recognized. One proposed configuration for such a system uses a relational database management system as the central focus; the various other functions are linked through their use of a common data representation in the data manager, rather than communicating pairwise. The goal was to integrate a geometric modeling capability with a generic relational data management system in such a way that well-formed questions can be posed and answered about the performance of the system as a whole. One necessary feature of any such system is simplification for purposes of analysis; this and system performance considerations meant that a paramount goal was unity and simplicity of the data structures used.

  30. Integrative medicine for managing the symptoms of lupus nephritis: A protocol for systematic review and meta-analysis.

    PubMed

    Choi, Tae-Young; Jun, Ji Hee; Lee, Myeong Soo

    2018-03-01

    Integrative medicine is claimed to improve symptoms of lupus nephritis. No systematic reviews have assessed the application of integrative medicine for lupus nephritis in patients with systemic lupus erythematosus (SLE). Thus, this review will aim to evaluate the current evidence on the efficacy of integrative medicine for the management of lupus nephritis in patients with SLE. The following electronic databases will be searched for studies published from their dates of inception to February 2018: Medline, EMBASE and the Cochrane Central Register of Controlled Trials (CENTRAL), as well as 6 Korean medical databases (Korea Med, the Oriental Medicine Advanced Search Integrated System [OASIS], DBpia, the Korean Medical Database [KM base], the Research Information Service System [RISS], and the Korean Studies Information Services System [KISS]), and 1 Chinese medical database (the China National Knowledge Infrastructure [CNKI]). Study selection, data extraction, and assessment will be performed independently by 2 researchers. The risk of bias (ROB) will be assessed using the Cochrane ROB tool. This systematic review will be published in a peer-reviewed journal and disseminated both electronically and in print. The review will be updated to inform and guide healthcare practice and policy. PROSPERO 2018 CRD42018085205.

  31. Development of a standardized Intranet database of formulation records for nonsterile compounding, Part 2.

    PubMed

    Haile, Michael; Anderson, Kim; Evans, Alex; Crawford, Angela

    2012-01-01

    In part 1 of this series, we outlined the rationale behind the development of a centralized electronic database used to maintain nonsterile compounding formulation records in the Mission Health System, which is a union of several independent hospitals and satellite and regional pharmacies that form the cornerstone of advanced medical care in several areas of western North Carolina. Hospital providers in many healthcare systems require compounded formulations to meet the needs of their patients (in particular, pediatric patients). Before a centralized electronic compounding database was implemented in the Mission Health System, each satellite or regional pharmacy affiliated with that system had a specific set of formulation records, but no standardized format for those records existed. In this article, we describe the quality control, database platform selection, description, implementation, and execution of our intranet database system, which is designed to maintain, manage, and disseminate nonsterile compounding formulation records in the hospitals and affiliated pharmacies of the Mission Health System. The objectives of that project were to standardize nonsterile compounding formulation records, create a centralized computerized database that would increase healthcare staff members' access to formulation records, establish beyond-use dates based on published stability studies, improve quality control, reduce the potential for medication errors related to compounding medications, and (ultimately) improve patient safety.

  32. Assessment & Commitment Tracking System (ACTS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryant, Robert A.; Childs, Teresa A.; Miller, Michael A.

    2004-12-20

    The ACTS computer code provides a centralized tool for planning and scheduling assessments, tracking and managing actions associated with assessments or resulting from an event or condition, and "mining" data for reporting and analyzing information to improve performance. The ACTS application is designed to work with the MS SQL database management system, and all database interfaces are written in SQL. The following software is used to develop and support the ACTS application: Cold Fusion, HTML, JavaScript, Quest TOAD, Microsoft Visual Source Safe (VSS), HTML Mailer for sending email, Microsoft SQL, and Microsoft Internet Information Server.

  33. R2 REGULATED FACILITIES

    EPA Science Inventory

    The Facility Registry System (FRS) is a centrally managed database that identifies facilities, sites or places subject to environmental regulations or of environmental interest. FRS creates high-quality, accurate, and authoritative facility identification records through rigorous...

  34. A DICOM based radiotherapy plan database for research collaboration and reporting

    NASA Astrophysics Data System (ADS)

    Westberg, J.; Krogh, S.; Brink, C.; Vogelius, I. R.

    2014-03-01

    Purpose: To create a central radiotherapy (RT) plan database for dose analysis and reporting, capable of calculating and presenting statistics on user defined patient groups. The goal is to facilitate multi-center research studies with easy and secure access to RT plans and statistics on protocol compliance. Methods: RT institutions are able to send data to the central database using DICOM communications on a secure computer network. The central system is composed of a number of DICOM servers, an SQL database and in-house developed software services to process the incoming data. A web site within the secure network allows the user to manage their submitted data. Results: The RT plan database has been developed in Microsoft .NET and users are able to send DICOM data between RT centers in Denmark. Dose-volume histogram (DVH) calculations performed by the system are comparable to those of conventional RT software. A permission system was implemented to ensure access control and easy, yet secure, data sharing across centers. The reports contain DVH statistics for structures in user defined patient groups. The system currently contains over 2200 patients in 14 collaborations. Conclusions: A central RT plan repository for use in multi-center trials and quality assurance was created. The system provides an attractive alternative to dummy runs by enabling continuous monitoring of protocol conformity and plan metrics in a trial.
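
    A minimal sketch of the DVH statistic such a system reports: the cumulative dose-volume histogram gives, for each dose level, the fraction of a structure's volume receiving at least that dose. The dose values below are randomly generated toy data, not output of the actual system.

    ```python
    # Toy cumulative DVH computation; input doses are invented sample data.
    import numpy as np

    def cumulative_dvh(voxel_doses_gy, bin_width_gy=0.5):
        """Return (dose bins, fraction of volume receiving >= each dose)."""
        doses = np.asarray(voxel_doses_gy, dtype=float)
        bins = np.arange(0.0, doses.max() + bin_width_gy, bin_width_gy)
        frac = np.array([(doses >= d).mean() for d in bins])
        return bins, frac

    doses = np.random.default_rng(0).normal(60.0, 3.0, size=10_000)  # toy dose grid
    bins, frac = cumulative_dvh(doses)
    d50 = bins[np.searchsorted(-frac, -0.5)]  # first bin where coverage falls to 50%
    print(f"D50 is approximately {d50:.1f} Gy")
    ```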

  35. Turning Access into a web-enabled secure information system for clinical trials.

    PubMed

    Chen, Dongquan; Chen, Wei-Bang; Soong, Mayhue; Soong, Seng-Jaw; Orthner, Helmuth F

    2009-08-01

    Organizations that have limited resources need to conduct clinical studies in a cost-effective, but secure, way. Clinical data residing in various individual databases need to be easily accessed and secured. Although widely available, digital certification, encryption, and secure web servers have not been implemented as widely, partly due to a lack of understanding of needs and concerns over issues such as cost and difficulty of implementation. The objective of this study was to test the possibility of centralizing various databases and to demonstrate ways of offering an alternative to a large-scale, comprehensive, and costly commercial product, especially for simple phase I and II trials, with reasonable convenience and security. We report a working procedure to transform a standalone Access database into a secure, Web-based information system. For data collection and reporting purposes, we centralized several individual databases, and developed and tested a web-based secure server using self-issued digital certificates. The system lacks audit trails, and the cost of development and maintenance may hinder its wide application. The clinical trial databases scattered in various departments of an institution could be centralized into a web-enabled secure information system. Limitations such as the lack of a calendar and audit trail can be partially addressed with additional programming. The centralized Web system may provide an alternative to a comprehensive clinical trial management system.

  36. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open, commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers that perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed, and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built; retrofit is extremely difficult and costly.

  37. Distributing stand inventory data and maps over a wide area network

    Treesearch

    Thomas E. Burk

    2000-01-01

    High-speed networks connecting multiple levels of management are becoming commonplace among forest resources organizations. Such networks can be used to deliver timely spatial and aspatial data relevant to the management of stands to field personnel. A network infrastructure allows maintenance of cost-effective, centralized databases with the potential for updating by...

  38. Rainfall Induced Natural Disaster in Central America, a challenge for Regional Risk Management

    NASA Astrophysics Data System (ADS)

    Estuardo Guinea Barrientos, Héctor; Swain, Ashok

    2013-04-01

    Rainfall-induced natural disasters rank first among all natural disasters in Central America. According to the records of the EM-DAT international database, 248 out of 486 disasters registered in Central America were triggered by rainfall events; in countries like Belize and Honduras, rainfall-induced natural disasters, mainly floods and landslides, account for more than 90% of the total number of casualties as well as the economic damage of all disasters. Due to the natural conditions of the Central American Isthmus, precipitation events often strike more than one country at a time; for example, Hurricane Mitch in 1998 affected the entire Central American region, causing more than 18,000 casualties. In this context, the Central American countries have been working on joint programs and policies aimed at transboundary cooperation and management of natural disasters. A clear example of this effort is CEPREDENAC, the intergovernmental body with the mandate of promoting activities, projects and programs for reducing disaster risk in order to avoid loss of life and economic assets in Central America. However, transnational management faces several challenges that fall mostly in the political, economic and technical areas. In this paper we describe and analyze rainfall-induced natural disasters, their impacts and the inherent management challenges in the Central American context. Key words: Central America, Natural Disasters, Risk Management, International Cooperation

  39. Interactive access to forest inventory data for the South Central United States

    Treesearch

    William H. McWilliams

    1990-01-01

    On-line access to USDA, Forest Service successive forest inventory data for the South Central United States is provided by two computer systems. The Easy Access to Forest Inventory and Analysis Tables program (EZTAB) produces a set of tables for specific geographic areas. The Interactive Graphics and Retrieval System (INGRES) is a database management system that...

  40. MouseNet database: digital management of a large-scale mutagenesis project.

    PubMed

    Pargent, W; Heffner, S; Schäble, K F; Soewarto, D; Fuchs, H; Hrabé de Angelis, M

    2000-07-01

    The Munich ENU Mouse Mutagenesis Screen is a large-scale mutant production, phenotyping, and mapping project. It encompasses two animal breeding facilities and a number of screening groups located in the general area of Munich. A central database is required to manage and process the immense amount of data generated by the mutagenesis project. This database, which we named MouseNet(c), runs on a Sybase platform and will finally store and process all data from the entire project. In addition, the system comprises a portfolio of functions needed to support the workflow management of the core facility and the screening groups. MouseNet(c) will make all of the data available to the participating screening groups, and later to the international scientific community. MouseNet(c) will consist of three major software components: the Animal Management System (AMS), the Sample Tracking System (STS), and the Result Documentation System (RDS). MouseNet(c) provides the following major advantages: it is accessible from different client platforms via the Internet; it is a full-featured multi-user system (including access restriction and data locking mechanisms); it relies on a professional RDBMS (relational database management system) running on a UNIX server platform; and it supplies workflow functions and a variety of plausibility checks.

  41. Integrative medicine for managing the symptoms of lupus nephritis

    PubMed Central

    Choi, Tae-Young; Jun, Ji Hee; Lee, Myeong Soo

    2018-01-01

    Background: Integrative medicine is claimed to improve symptoms of lupus nephritis. No systematic reviews have assessed the application of integrative medicine for lupus nephritis in patients with systemic lupus erythematosus (SLE). Thus, this review will aim to evaluate the current evidence on the efficacy of integrative medicine for the management of lupus nephritis in patients with SLE. Methods and analyses: The following electronic databases will be searched for studies published from their dates of inception to February 2018: Medline, EMBASE and the Cochrane Central Register of Controlled Trials (CENTRAL), as well as 6 Korean medical databases (Korea Med, the Oriental Medicine Advanced Search Integrated System [OASIS], DBpia, the Korean Medical Database [KM base], the Research Information Service System [RISS], and the Korean Studies Information Services System [KISS]), and 1 Chinese medical database (the China National Knowledge Infrastructure [CNKI]). Study selection, data extraction, and assessment will be performed independently by 2 researchers. The risk of bias (ROB) will be assessed using the Cochrane ROB tool. Dissemination: This systematic review will be published in a peer-reviewed journal and disseminated both electronically and in print. The review will be updated to inform and guide healthcare practice and policy. Trial registration number: PROSPERO 2018 CRD42018085205 PMID:29595669

  42. A Data Analysis Expert System For Large Established Distributed Databases

    NASA Astrophysics Data System (ADS)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-05-01

    The purpose of this work is to analyze the applicability of artificial intelligence techniques for developing a user-friendly, parallel interface to large, isolated, incompatible NASA databases for the purpose of assisting the management decision process. To carry out this work, a survey was conducted to establish the data access requirements of several key NASA user groups, and current NASA database access methods were evaluated. The results of this work are presented in the form of a design for a natural language database interface system, called the Deductively Augmented NASA Management Decision Support System (DANMDS). This design is feasible principally because of recently announced commercial hardware and software product developments that allow cross-vendor compatibility. The goal of the DANMDS system addresses the central dilemma confronting most large companies and institutions in America: the retrieval of information from large, established, incompatible database systems. The DANMDS system implementation would represent a significant first step toward this problem's resolution.

  43. Supplier Management System

    NASA Technical Reports Server (NTRS)

    Ramirez, Eric; Gutheinz, Sandy; Brison, James; Ho, Anita; Allen, James; Ceritelli, Olga; Tobar, Claudia; Nguyen, Thuykien; Crenshaw, Harrel; Santos, Roxann

    2008-01-01

    Supplier Management System (SMS) allows for a consistent, agency-wide performance rating system for suppliers used by NASA. This version (2.0) combines separate databases into one central database that allows for the sharing of supplier data. Information extracted from the NBS/Oracle database can be used to generate ratings. Also, supplier ratings can now be generated in the areas of cost, product quality, delivery, and audit data. Supplier data can be charted based on real-time user input. Based on these individual ratings, an overall rating can be generated. Data that normally would be stored in multiple databases, each requiring its own log-in, is now readily available and easily accessible with only one log-in required. Additionally, the database can accommodate the storage and display of quality-related data that can be analyzed and used in the supplier procurement decision-making process. Moreover, the software allows for a Closed-Loop System (supplier feedback), as well as the capability to communicate with other federal agencies.

  44. The INFN-CNAF Tier-1 GEMSS Mass Storage System and database facility activity

    NASA Astrophysics Data System (ADS)

    Ricci, Pier Paolo; Cavalli, Alessandro; Dell'Agnello, Luca; Favaro, Matteo; Gregori, Daniele; Prosperini, Andrea; Pezzi, Michele; Sapunenko, Vladimir; Zizzi, Giovanni; Vagnoni, Vincenzo

    2015-05-01

    The consolidation of Mass Storage services at the INFN-CNAF Tier1 Storage department over the last 5 years has resulted in a reliable, high-performance and moderately easy-to-manage facility that provides data access, archive, backup and database services to several different use cases. At present, the GEMSS Mass Storage System, developed and installed at CNAF and based on an integration between the IBM GPFS parallel filesystem and the Tivoli Storage Manager (TSM) tape management software, is one of the largest hierarchical storage sites in Europe. It provides storage resources for about 12% of LHC data, as well as for data of other non-LHC experiments. Files are accessed using standard SRM Grid services provided by the Storage Resource Manager (StoRM), also developed at CNAF. Data access is also provided by XRootD and HTTP/WebDAV endpoints. Besides these services, an Oracle database facility is in production, characterized by an effective level of parallelism, redundancy and availability. This facility runs databases for storing and accessing relational data objects and for providing database services to the currently active use cases. It takes advantage of several Oracle technologies, like Real Application Cluster (RAC), Automatic Storage Manager (ASM) and Enterprise Manager centralized management tools, together with other technologies for performance optimization, ease of management and downtime reduction. The aim of the present paper is to illustrate the state of the art of the INFN-CNAF Tier1 Storage department infrastructures and software services, and to give a brief outlook on forthcoming projects. A description of the administrative, monitoring and problem-tracking tools that play a primary role in managing the whole storage framework is also given.

  45. The UNIX/XENIX Advantage: Applications in Libraries.

    ERIC Educational Resources Information Center

    Gordon, Kelly L.

    1988-01-01

    Discusses the application of the UNIX/XENIX operating system to support administrative office automation functions--word processing, spreadsheets, database management systems, electronic mail, and communications--at the Central Michigan University Libraries. Advantages and disadvantages of the XENIX operating system and system configuration are…

  46. A uniform database of teleseismic shear wave splitting measurements for the western and central United States

    NASA Astrophysics Data System (ADS)

    Liu, Kelly H.; Elsheikh, Ahmed; Lemnifi, Awad; Purevsuren, Uranbaigal; Ray, Melissa; Refayee, Hesham; Yang, Bin B.; Yu, Youqiang; Gao, Stephen S.

    2014-05-01

    We present a shear wave splitting (SWS) database for the western and central United States as part of a lasting effort to build a uniform SWS database for all of North America. The SWS measurements were obtained by minimizing the energy on the transverse component of the PKS, SKKS, and SKS phases, and each individual measurement was visually checked to ensure quality. This version of the database contains 16,105 pairs of splitting parameters. The data used to generate the parameters were recorded by 1774 digital broadband seismic stations over the period 1989-2012, representing all the available data from both permanent and portable seismic networks archived at the Incorporated Research Institutions for Seismology Data Management Center in the area of 26.00°N to 50.00°N and 125.00°W to 90.00°W. About 10,000 pairs of the measurements were from the 1092 USArray Transportable Array stations. The results show that approximately 2/3 of the fast orientations are within 30° of the absolute plate motion (APM) direction of the North American plate, and most of the largest departures from the APM are located along the eastern boundary of the western US orogenic zone and in the central Great Basins. The splitting times observed in the western US are larger than the global average of 1.0 s, while those in the central US are comparable with it. The uniform database has an unprecedented spatial coverage and can be used for various investigations of the structure and dynamics of the Earth.
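
    The "within 30° of APM" statistic rests on comparing axial orientations, which wrap at 180°. A small sketch of that comparison follows, with invented orientations rather than values from the actual database.

    ```python
    # Axial-orientation misfit sketch; measurement values are invented.
    def angular_misfit(fast_deg: float, apm_deg: float) -> float:
        """Smallest angle between two axial orientations (period 180 degrees)."""
        d = abs(fast_deg - apm_deg) % 180.0
        return min(d, 180.0 - d)

    pairs = [(68.0, 65.0), (20.0, 62.0), (179.0, 5.0)]  # (fast, APM) in degrees
    within = sum(1 for f, a in pairs if angular_misfit(f, a) <= 30.0)
    print(f"{within}/{len(pairs)} fast orientations within 30 degrees of APM")
    ```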

  47. The accuracy of real-time procedure coding by theatre nurses: a comparison with the central national system.

    PubMed

    Maclean, Donald; Younes, Hakim Ben; Forrest, Margaret; Towers, Hazel K

    2012-03-01

    Accurate and timely clinical data are required for clinical and organisational purposes and are especially important for patient management, audit of surgical performance and the electronic health record. The recent introduction of computerised theatre management systems has enabled real-time (point-of-care) operative procedure coding by clinical staff; however, the accuracy of these data was unknown. The aim of this Scottish study was to compare the accuracy of theatre nurses' real-time coding on the local theatre management system with the central Scottish Morbidity Record (SMR01). Paired procedural codes were recorded, qualitatively graded for precision and compared (n = 1038). In this study, real-time, point-of-care coding by theatre nurses resulted in significant coding errors compared with the central SMR01 database. Improved collaboration between full-time coders and clinical staff using computerised decision support systems is suggested.

  48. An image database management system for conducting CAD research

    NASA Astrophysics Data System (ADS)

    Gruszauskas, Nicholas; Drukker, Karen; Giger, Maryellen L.

    2007-03-01

    The development of image databases for CAD research is not a trivial task. The collection and management of images and their related metadata from multiple sources is a time-consuming but necessary process. By standardizing and centralizing the methods by which these data are maintained, one can generate subsets of a larger database that match the specific criteria needed for a particular research project in a quick and efficient manner. A research-oriented management system of this type is highly desirable in a multi-modality CAD research environment. An online, web-based database system for the storage and management of research-specific medical image metadata was designed for use with four modalities of breast imaging: screen-film mammography, full-field digital mammography, breast ultrasound and breast MRI. The system was designed to consolidate data from multiple clinical sources and provide the user with the ability to anonymize the data. Input concerning the type of data to be stored, as well as desired searchable parameters, was solicited from researchers in each modality. The backbone of the database was created using MySQL, and a robust and easy-to-use interface for entering, removing, modifying and searching information in the database was created using HTML and PHP. This standardized system can be accessed using any modern web-browsing software and is fundamental for our various research projects on computer-aided detection, diagnosis, cancer risk assessment, multimodality lesion assessment, and prognosis. Our CAD database system stores large amounts of research-related metadata and successfully generates subsets of cases that match the user's desired search criteria.
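
    The pattern described, a central metadata table from which study-specific subsets are pulled, can be sketched as below. The sketch uses SQLite for self-containment, whereas the paper's system used MySQL with an HTML/PHP front end; the column names and criteria are invented.

    ```python
    # Hypothetical central image-metadata table and a subset query (SQLite stand-in).
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE image_meta (
        image_id INTEGER PRIMARY KEY,
        modality TEXT,          -- e.g. 'SFM', 'FFDM', 'US', 'MRI'
        lesion_type TEXT,       -- e.g. 'mass', 'calcification'
        biopsy_proven INTEGER   -- 1 = pathology-confirmed
    )""")
    con.executemany("INSERT INTO image_meta VALUES (?,?,?,?)",
                    [(1, "US", "mass", 1), (2, "MRI", "mass", 0),
                     (3, "US", "calcification", 1)])

    # Pull the subset matching one project's inclusion criteria:
    rows = con.execute("""SELECT image_id FROM image_meta
                          WHERE modality = ? AND biopsy_proven = 1""",
                       ("US",)).fetchall()
    print("ultrasound subset:", [r[0] for r in rows])
    ```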

  49. Impact of quality management monitoring and intervention on central venous catheter dysfunction in the outpatient chemotherapy infusion setting.

    PubMed

    Bansal, Anu; Binkert, Christoph A; Robinson, Malcolm K; Shulman, Lawrence N; Pellerin, Linda; Davison, Brian

    2008-08-01

    To assess the utility of maintaining and analyzing a quality-management database while investigating a subjectively perceived increase in the incidence of tunneled catheter and port dysfunction in a cohort of oncology outpatients. All 152 patients undergoing lytic therapy (2-4 mg alteplase) of a malfunctioning indwelling central venous catheter (CVC) from January through June 2004 at a single cancer center in the United States were included in a quality-management database. Patients were categorized by time to device failure and the initial method of catheter placement (surgery vs interventional radiology). Data were analyzed after 3 months, and areas of possible improvement were identified and acted upon. Three months of follow-up data were then collected and similarly analyzed. In a 6-month period, 152 patients treated for catheter malfunction received a total of 276 doses of lytic therapy. A 3-month interim analysis revealed a disproportionately high rate (34%) of early catheter malfunction (ECM; <30 days from placement). Postplacement radiographs demonstrated suboptimal catheter positioning in 67% of these patients, all of whom had surgical catheter placement. There was a 50% absolute decrease in the number of patients presenting with catheter malfunction in the period from April through June (P < .001). Evaluation of postplacement radiographs in these patients demonstrated a 50% decrease in the incidence of suboptimal positioning (P < .05). Suboptimal positioning was likely responsible for some, but not all, cases of ECM. Maintenance of a quality-management database is a relatively simple intervention that can have a clear and important impact on the quality and cost of patient care.

  10. Collecting and Using Student Information for School Improvement.

    ERIC Educational Resources Information Center

    Riegel, N. Blyth

    This paper suggests methods for collecting and using student information for school improvement by describing how the Richardson Independent School District (RISD), Texas, determines data for effective school management decisionmaking. RISD readily accesses student information via a networked database on line with the central office's IBM…

  11. ms_lims, a simple yet powerful open source laboratory information management system for MS-driven proteomics.

    PubMed

    Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart

    2010-03-01

    MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.

  12. 78 FR 69097 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ... (CRM) system to improve the response to correspondences from individuals seeking information from a... and private sector sources." The SalesForce CRM provides a centralized portal to manage frequently... topics. Depending on the topic searched, the CRM queries the database of pre-approved questions and...

  13. Improving CD-ROM Management through Networking.

    ERIC Educational Resources Information Center

    Rutherford, John

    1990-01-01

    Summarizes advantages, components, and manufacturers of CD-ROM networks based on experiences at the Central Connecticut State University library. Three configurations are described, and the steps in installing a network where a large number of databases are shared by a number of microcomputers are detailed. Licensing and network performance issues…

  14. The Cronus Distributed DBMS (Database Management System) Project

    DTIC Science & Technology

    1989-10-01

    projects, e.g., HiPAC [Dayal 88] and Postgres [Stonebraker 86]. Although we expect to use these techniques, they have been developed for centralized...Computing Systems, June 1989. (To appear). [Stonebraker 86] Stonebraker, M. and Rowe, L. A., "The Design of POSTGRES," Proceedings ACM SIGMOD Annual

  15. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  16. ASEAN Mineral Database and Information System (AMDIS)

    NASA Astrophysics Data System (ADS)

    Okubo, Y.; Ohno, T.; Bandibas, J. C.; Wakita, K.; Oki, Y.; Takahashi, Y.

    2014-12-01

    AMDIS has been officially operational since the Fourth ASEAN Ministerial Meeting on Minerals on 28 November 2013. In cooperation with the Geological Survey of Japan, the web-based GIS was developed using Free and Open Source Software (FOSS) and Open Geospatial Consortium (OGC) standards. The system is composed of local databases and a centralized GIS. The local databases, created and updated through the centralized GIS, are accessible from the portal site. The system introduces distinct advantages over traditional GIS: global reach, a large number of users, better cross-platform capability, no charge for users or providers, ease of use, and unified updates. By raising the transparency of mineral information for mining companies and the public, AMDIS shows that mineral resources are abundant throughout the ASEAN region; however, there are many data gaps. We understand that such problems occur because of insufficient governance of mineral resources. Mineral governance, as we use the term, is a concept that strengthens and maximizes the capacity and systems of the government institutions that manage the minerals sector. The elements of mineral governance include a) strengthening of information infrastructure facilities, b) technological and legal capacities of state-owned mining companies to fully engage with mining sponsors, c) government-led management of mining projects by supporting the project implementation units, d) government capacity in mineral management, such as the control and monitoring of mining operations, and e) facilitation of regional and local development plans and their implementation with the private sector.

  17. Database citation in supplementary data linked to Europe PubMed Central full text biomedical articles.

    PubMed

    Kafkas, Şenay; Kim, Jee-Hyub; Pi, Xingjun; McEntyre, Johanna R

    2015-01-01

    In this study, we present an analysis of data citation practices in full text research articles and their corresponding supplementary data files, made available in the Open Access set of articles from Europe PubMed Central. Our aim is to investigate whether supplementary data files should be considered as a source of information for integrating the literature with biomolecular databases. Using text-mining methods to identify and extract a variety of core biological database accession numbers, we found that the supplemental data files contain many more database citations than the body of the article, and that those citations often take the form of a relatively small number of articles citing large collections of accession numbers in text-based files. Moreover, citation of value-added databases derived from submission databases (such as Pfam, UniProt or Ensembl) is common, demonstrating the reuse of these resources as datasets in themselves. All the database accession numbers extracted from the supplementary data are publicly accessible from http://dx.doi.org/10.5281/zenodo.11771. Our study suggests that supplementary data should be considered when linking articles with data, in curation pipelines, and in information retrieval tasks in order to make full use of the entire research article. These observations highlight the need to improve the management of supplemental data in general, in order to make this information more discoverable and useful.
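
    The accession-number extraction step lends itself to a short illustration. Below is a hedged sketch in Python; the regular expressions are simplified approximations of a few accession formats (real formats are more varied) and are not the authors' actual text-mining rules:

        import re

        PATTERNS = {
            "UniProt": r"\b[OPQ][0-9][A-Z0-9]{3}[0-9]\b",   # e.g. P12345
            "Pfam":    r"\bPF\d{5}\b",                      # e.g. PF00069
            "Ensembl": r"\bENS[A-Z]*[GTP]\d{11}\b",         # e.g. ENSG00000139618
        }

        def find_accessions(text):
            """Return {database: [accession, ...]} for every pattern that matches."""
            hits = {}
            for db, pattern in PATTERNS.items():
                found = re.findall(pattern, text)
                if found:
                    hits[db] = found
            return hits

        sample = "Domains were mapped with Pfam PF00069; see also UniProt P12345."
        print(find_accessions(sample))  # {'UniProt': ['P12345'], 'Pfam': ['PF00069']}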

  18. National information network and database system of hazardous waste management in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma Hongchang

    1996-12-31

    Industries in China generate large volumes of hazardous waste, which makes it essential for the nation to pay more attention to hazardous waste management. National laws and regulations, waste surveys, and manifest tracking and permission systems have been initiated. Some centralized hazardous waste disposal facilities are under construction. China's National Environmental Protection Agency (NEPA) has also obtained valuable information on hazardous waste management from developed countries. To effectively share this information with local environmental protection bureaus, NEPA developed a national information network and database system for hazardous waste management. This information network will have such functions as information collection, inquiry, and connection. The long-term objective is to establish and develop a national and local hazardous waste management information network. This network will significantly help decision makers and researchers because it will be easy to obtain information (e.g., experiences of developed countries in hazardous waste management) to enhance hazardous waste management in China. The information network consists of five parts: technology consulting, import-export management, regulation inquiry, waste survey, and literature inquiry.

  19. 78 FR 55689 - Applications for Fiscal Year 2014 Awards; Impact Aid Section 8002 Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-11

    ... System (DUNS) number and a Taxpayer Identification Number (TIN); b. Register both your DUNS number and TIN with the System for Award Management (SAM) (formerly the Central Contractor Registry (CCR)), the Government's primary registrant database; c. Provide your DUNS number and TIN on your application; and d...

  20. 78 FR 29349 - Applications for New Awards; National Institute on Disability and Rehabilitation Research...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ... System (DUNS) number and a Taxpayer Identification Number (TIN); b. Register both your DUNS number and TIN with the Central Contractor Registry (CCR)--and, after July 24, 2012, with the System for Award Management (SAM), the Government's primary registrant database; c. Provide your DUNS number and TIN on your...

  1. 77 FR 187 - Federal Acquisition Regulation; Transition to the System for Award Management (SAM)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-03

    ... architecture. Deletes reference to ``business partner network'' at 4.1100, Scope, which is no longer necessary...) architecture has begun. This effort will transition the Central Contractor Registration (CCR) database, the...) to the new architecture. This case provides the first step in updating the FAR for these changes, and...

  2. Techniques for Efficiently Managing Large Geosciences Data Sets

    NASA Astrophysics Data System (ADS)

    Kruger, A.; Krajewski, W. F.; Bradley, A. A.; Smith, J. A.; Baeck, M. L.; Steiner, M.; Lawrence, R. E.; Ramamurthy, M. K.; Weber, J.; Delgreco, S. A.; Domaszczynski, P.; Seo, B.; Gunyon, C. A.

    2007-12-01

    We have developed techniques and software tools for efficiently managing large geosciences data sets. While the techniques were developed as part of an NSF-funded ITR project that focuses on making NEXRAD weather data and rainfall products available to hydrologists and other scientists, they are relevant to other geosciences disciplines that deal with large data sets. Metadata, relational databases, data compression, and networking are central to our methodology. Data and derived products are stored on file servers in a compressed format. URLs to, and metadata about, the data and derived products are managed in a PostgreSQL database. Virtually all access to the data and products is through this database. Geosciences data normally require a number of processing steps to transform the raw data into useful products: data quality assurance, coordinate transformations and georeferencing, applying calibration information, and many more. We have developed the concept of crawlers that manage this scientific workflow. Crawlers are unattended processes that run indefinitely and at set intervals query the database for their next assignment. A database table functions as a roster for the crawlers. Crawlers perform well-defined tasks that are, except perhaps for sequencing, largely independent of other crawlers. Once a crawler is done with its current assignment, it updates the database roster table and gets its next assignment by querying the database. We have developed a library that enables one to quickly add crawlers. The library provides hooks to external (i.e., C-language) compiled codes, so that developers can work and contribute independently. Processes called ingesters inject data into the system. The bulk of the data are from a real-time feed using UCAR/Unidata's IDD/LDM software. An exciting recent development is the establishment of a Unidata HYDRO feed that carries value-added metadata over the IDD/LDM. Ingesters grab the metadata and populate the PostgreSQL tables. These and other concepts we have developed have enabled us to efficiently manage a 70 TB (and growing) weather radar data set.
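
    A minimal sketch of the crawler pattern described above: an unattended process that periodically asks a roster table for its next assignment, performs it, and reports back. The roster schema and column names are hypothetical stand-ins for the project's PostgreSQL tables, and sqlite3 is used only to keep the sketch self-contained:

        import time
        import sqlite3

        def run_crawler(conn, crawler_name, do_task, poll_seconds=60):
            """Poll the roster table indefinitely, executing pending tasks."""
            while True:
                row = conn.execute(
                    "SELECT task_id, payload FROM roster "
                    "WHERE crawler = ? AND status = 'pending' LIMIT 1",
                    (crawler_name,),
                ).fetchone()
                if row is not None:
                    task_id, payload = row
                    do_task(payload)      # e.g. QC, georeferencing, calibration
                    conn.execute(
                        "UPDATE roster SET status = 'done' WHERE task_id = ?",
                        (task_id,),
                    )
                    conn.commit()
                time.sleep(poll_seconds)  # idle until the next polling interval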

  3. The spectral database Specchio: Data management, data sharing and initial processing of field spectrometer data within the Dimensions of Biodiversity project

    NASA Astrophysics Data System (ADS)

    Hueni, A.; Schweiger, A. K.

    2015-12-01

    Field spectrometry has substantially gained importance in vegetation ecology due to the increasing knowledge about causal ties between vegetation spectra and biochemical and structural plant traits. Additionally, worldwide databases enable the exchange of spectral and plant trait data and promote global research cooperation. This can be expected to further enhance the use of field spectrometers in ecological studies. However, the large amount of data collected during spectral field campaigns poses major challenges regarding data management, archiving and processing. The spectral database Specchio is designed to organize, manage, process and share spectral data and metadata. We provide an example for using Specchio based on leaf level spectra of prairie plant species collected during the 2015 field campaign of the Dimensions of Biodiversity research project, conducted at the Cedar Creek Long-Term Ecological Research site, in central Minnesota. We show how spectral data collections can be efficiently administered, organized and shared between distinct research groups and explore the capabilities of Specchio for data quality checks and initial processing steps.

  4. Knowledge-Based Vision Techniques for the Autonomous Land Vehicle Program

    DTIC Science & Technology

    1991-10-01

    Knowledge System The CKS is an object-oriented knowledge database that was originally designed to serve as the central information manager for a..."Representation Space: An Approach to the Integration of Visual Information," Proc. of DARPA Image Understanding Workshop, Palo Alto, CA, pp. 263-272, May 1989...Strat, "Information Management in a Sensor-Based Autonomous System," Proc. DARPA Image Understanding Workshop, University of Southern CA, Vol. 1, pp

  5. Development of the Lymphoma Enterprise Architecture Database: A caBIG™ Silver level compliant System

    PubMed Central

    Huang, Taoying; Shenoy, Pareen J.; Sinha, Rajni; Graiser, Michael; Bumpers, Kevin W.; Flowers, Christopher R.

    2009-01-01

    Lymphomas are the fifth most common cancer in the United States with numerous histological subtypes. Integrating existing clinical information on lymphoma patients provides a platform for understanding biological variability in presentation and treatment response and aids development of novel therapies. We developed a cancer Biomedical Informatics Grid™ (caBIG™) Silver level compliant lymphoma database, called the Lymphoma Enterprise Architecture Data-system™ (LEAD™), which integrates the pathology, pharmacy, laboratory, cancer registry, clinical trials, and clinical data from institutional databases. We utilized the Cancer Common Ontological Representation Environment Software Development Kit (caCORE SDK) provided by National Cancer Institute’s Center for Bioinformatics to establish the LEAD™ platform for data management. The caCORE SDK generated system utilizes an n-tier architecture with open Application Programming Interfaces, controlled vocabularies, and registered metadata to achieve semantic integration across multiple cancer databases. We demonstrated that the data elements and structures within LEAD™ could be used to manage clinical research data from phase 1 clinical trials, cohort studies, and registry data from the Surveillance Epidemiology and End Results database. This work provides a clear example of how semantic technologies from caBIG™ can be applied to support a wide range of clinical and research tasks, and integrate data from disparate systems into a single architecture. This illustrates the central importance of caBIG™ to the management of clinical and biological data. PMID:19492074

  6. Development of the Lymphoma Enterprise Architecture Database: a caBIG Silver level compliant system.

    PubMed

    Huang, Taoying; Shenoy, Pareen J; Sinha, Rajni; Graiser, Michael; Bumpers, Kevin W; Flowers, Christopher R

    2009-04-03

    Lymphomas are the fifth most common cancer in the United States with numerous histological subtypes. Integrating existing clinical information on lymphoma patients provides a platform for understanding biological variability in presentation and treatment response and aids development of novel therapies. We developed a cancer Biomedical Informatics Grid (caBIG) Silver level compliant lymphoma database, called the Lymphoma Enterprise Architecture Data-system (LEAD), which integrates the pathology, pharmacy, laboratory, cancer registry, clinical trials, and clinical data from institutional databases. We utilized the Cancer Common Ontological Representation Environment Software Development Kit (caCORE SDK) provided by National Cancer Institute's Center for Bioinformatics to establish the LEAD platform for data management. The caCORE SDK generated system utilizes an n-tier architecture with open Application Programming Interfaces, controlled vocabularies, and registered metadata to achieve semantic integration across multiple cancer databases. We demonstrated that the data elements and structures within LEAD could be used to manage clinical research data from phase 1 clinical trials, cohort studies, and registry data from the Surveillance Epidemiology and End Results database. This work provides a clear example of how semantic technologies from caBIG can be applied to support a wide range of clinical and research tasks, and integrate data from disparate systems into a single architecture. This illustrates the central importance of caBIG to the management of clinical and biological data.

  7. Software support for Huntington's disease research.

    PubMed

    Conneally, P M; Gersting, J M; Gray, J M; Beidleman, K; Wexler, N S; Smith, C L

    1991-01-01

    Huntington's disease (HD) is a hereditary disorder involving the central nervous system. Its effects are devastating, to the affected person as well as his family. The Department of Medical and Molecular Genetics at Indiana University (IU) plays an integral part in Huntington's research by providing computerized repositories of HD family information for researchers and families. The National Huntington's Disease Research Roster, founded in 1979 at IU, and the Huntington's Disease in Venezuela Project database contain information that has proven to be invaluable in the worldwide field of HD research. This paper addresses the types of information stored in each database, the pedigree database program (MEGADATS) used to manage the data, and significant findings that have resulted from access to the data.

  8. Federated or cached searches: Providing expected performance from multiple invasive species databases

    NASA Astrophysics Data System (ADS)

    Graham, Jim; Jarnevich, Catherine S.; Simpson, Annie; Newman, Gregory J.; Stohlgren, Thomas J.

    2011-06-01

    Invasive species are a universal global problem, but the information to identify them, manage them, and prevent invasions is stored around the globe in a variety of formats. The Global Invasive Species Information Network is a consortium of organizations working toward providing seamless access to these disparate databases via the Internet. A distributed network of databases can be created using the Internet and a standard web service protocol. There are two options to provide this integration. First, federated searches are being proposed to allow users to search "deep" web documents such as databases for invasive species. A second method is to create a cache of data from the databases for searching. We compare these two methods and show that federated searches will not provide the performance and flexibility required by users, and that a central cache of the data is required to improve performance.
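
    The performance argument can be illustrated with a toy timing sketch: a federated search is bounded by its slowest provider, while a cached search is a single local lookup. Provider names and latencies below are invented for illustration:

        import concurrent.futures
        import time

        PROVIDERS = {"db_a": 0.2, "db_b": 1.5, "db_c": 4.0}  # simulated seconds per query

        def query_provider(name, delay, term):
            time.sleep(delay)            # stand-in for a slow remote web-service call
            return f"{name}: results for {term!r}"

        def federated_search(term):
            # Fan out to every provider in parallel; total time ~= max(delays).
            with concurrent.futures.ThreadPoolExecutor() as pool:
                futures = [pool.submit(query_provider, n, d, term)
                           for n, d in PROVIDERS.items()]
                return [f.result() for f in futures]

        def cached_search(cache, term):
            # One lookup against a locally maintained copy of all providers' data.
            return cache.get(term, [])

        cache = {"kudzu": ["db_a: record 17", "db_c: record 3"]}
        start = time.time()
        federated_search("kudzu")
        print(f"federated: {time.time() - start:.1f} s")   # ~4.0 s, the slowest provider
        start = time.time()
        cached_search(cache, "kudzu")
        print(f"cached:    {time.time() - start:.4f} s")   # effectively instant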

  9. Federated or cached searches: providing expected performance from multiple invasive species databases

    USGS Publications Warehouse

    Graham, Jim; Jarnevich, Catherine S.; Simpson, Annie; Newman, Gregory J.; Stohlgren, Thomas J.

    2011-01-01

    Invasive species are a universal global problem, but the information to identify them, manage them, and prevent invasions is stored around the globe in a variety of formats. The Global Invasive Species Information Network is a consortium of organizations working toward providing seamless access to these disparate databases via the Internet. A distributed network of databases can be created using the Internet and a standard web service protocol. There are two options to provide this integration. First, federated searches are being proposed to allow users to search “deep” web documents such as databases for invasive species. A second method is to create a cache of data from the databases for searching. We compare these two methods and show that federated searches will not provide the performance and flexibility required by users, and that a central cache of the data is required to improve performance.

  10. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

    Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four-month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.

  11. Spatial configuration and distribution of forest patches in Champaign County, Illinois: 1940 to 1993

    Treesearch

    J. Danilo Chinea

    1997-01-01

    Spatial configuration and distribution of landscape elements have implications for the dynamics of forest ecosystems, and, therefore, for the management of these resources. The forest cover of Champaign County, in east-central Illinois, was mapped from 1940 and 1993 aerial photography and entered in a geographical information system database. In 1940, 208 forest...

  12. Information Technology and the Evolution of the Library

    DTIC Science & Technology

    2009-03-01

    Resource Commons/Repository/Federated Search ILS (GLADIS/Pathfinder - Millenium)/Catalog/Circulation/Acquisitions/Digital Object Content...content management services to help centralize and distribute digital content from across the institution, software to allow for seamless federated searching across multiple databases, and imaging software to allow for daily reimaging of terminals to reduce security concerns that otherwise

  13. Implementing the EuroFIR Document and Data Repositories as accessible resources of food composition information.

    PubMed

    Unwin, Ian; Jansen-van der Vliet, Martine; Westenbrink, Susanne; Presser, Karl; Infanger, Esther; Porubska, Janka; Roe, Mark; Finglas, Paul

    2016-02-15

    The EuroFIR Document and Data Repositories are being developed as accessible collections of source documents, including grey literature, and the food composition data reported in them. These Repositories will contain source information available to food composition database compilers when selecting their nutritional data. The Document Repository was implemented as searchable bibliographic records in the Europe PubMed Central database, which links to the documents online. The Data Repository will contain original data from source documents in the Document Repository. Testing confirmed the FoodCASE food database management system as a suitable tool for the input, documentation and quality assessment of Data Repository information. Data management requirements for the input and documentation of reported analytical results were established, including record identification and method documentation specifications. Document access and data preparation using the Repositories will provide information resources for compilers, eliminating duplicated work and supporting unambiguous referencing of data contributing to their compiled data. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. EPA Facility Registry Service (FRS): OIL

    EPA Pesticide Factsheets

    This dataset contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Oil database. The Oil database contains information on Spill Prevention, Control, and Countermeasure (SPCC) and Facility Response Plan (FRP) subject facilities to prevent and respond to oil spills. FRP facilities are referred to as substantial harm facilities due to the quantities of oil stored and facility characteristics. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to Oil facilities once the Oil data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.

  15. Characterising droughts in Central America with uncertain hydro-meteorological data

    NASA Astrophysics Data System (ADS)

    Quesada Montano, B.; Westerberg, I.; Wetterhall, F.; Hidalgo, H. G.; Halldin, S.

    2015-12-01

    Drought studies are scarce in Central America, a region frequently affected by droughts that cause significant socio-economic and environmental problems. Drought characterisation is important for water management and planning and can be done with the help of drought indices. Many indices have been developed in recent decades, but their ability to suitably characterise droughts depends on the region of application. In Central America, comprehensive and high-quality observational networks of meteorological and hydrological data are not available. This limits the choice of drought indices and makes it necessary to evaluate the quality of the data used in their calculation. This paper aimed to find which combination(s) of drought index and meteorological database are most suitable for characterising droughts in Central America. The drought indices evaluated were the standardised precipitation index (SPI), deciles (DI), the standardised precipitation evapotranspiration index (SPEI) and the effective drought index (EDI). These were calculated using precipitation data from the Climate Hazards Group Infra-Red Precipitation with station (CHIRPS), CRN073, the Climate Research Unit (CRU), ERA-Interim and station databases, and temperature data from the CRU database. All the indices were calculated at 1-, 3-, 6-, 9- and 12-month accumulation times. As a first step, the large-scale meteorological precipitation datasets were compared to get an overview of the level of agreement between them and to find possible quality problems. Then, the performance of all combinations of drought indices and meteorological datasets was evaluated against independent river discharge data, in the form of the standardised streamflow index (SSI). Results revealed large disagreement between the precipitation datasets; we found the selection of database to be more important than the selection of drought index. The best combinations of meteorological drought index and database were obtained using the SPI and DI, calculated with CHIRPS and station data.
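
    As a rough sketch of how a standardised index such as the SPI is computed: fit a distribution to the accumulated precipitation series, then map each value's cumulative probability onto the standard normal. The gamma fit below is the classical choice, but actual implementations differ in the fitting method and in the handling of zero-precipitation periods, and this is not the authors' code:

        import numpy as np
        from scipy import stats

        def spi(precip_acc):
            """precip_acc: 1-D array of k-month accumulated precipitation (> 0)."""
            shape, loc, scale = stats.gamma.fit(precip_acc, floc=0)  # fix location at 0
            cdf = stats.gamma.cdf(precip_acc, shape, loc=loc, scale=scale)
            return stats.norm.ppf(cdf)   # z-scores: negative values = drier than usual

        rng = np.random.default_rng(1)
        sample = rng.gamma(shape=2.0, scale=50.0, size=120)  # ten years of monthly totals
        print(spi(sample)[:5])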

  16. Space Station Freedom environmental database system (FEDS) for MSFC testing

    NASA Technical Reports Server (NTRS)

    Story, Gail S.; Williams, Wendy; Chiu, Charles

    1991-01-01

    The Water Recovery Test (WRT) at Marshall Space Flight Center (MSFC) is the first demonstration of integrated water recovery systems for potable and hygiene water reuse as envisioned for Space Station Freedom (SSF). In order to satisfy the safety and health requirements placed on the SSF program and facilitate test data assessment, an extensive laboratory analysis database was established to provide a central archive and data retrieval function. The database is required to store analysis results for physical, chemical, and microbial parameters measured from water, air and surface samples collected at various locations throughout the test facility. The Oracle Relational Database Management System (RDBMS) was utilized to implement a secured on-line information system with the ECLSS WRT program as the foundation for this system. The database is supported on a VAX/VMS 8810 series mainframe and is accessible from the Marshall Information Network System (MINS). This paper summarizes the database requirements, system design, interfaces, and future enhancements.

  17. FDDI information management system for centralizing interactive, computerized multimedia clinical experiences in pediatric rheumatology/immunology.

    PubMed

    Rouhani, R; Cronenberger, H; Stein, L; Hannum, W; Reed, A M; Wilhelm, C; Hsiao, H

    1995-01-01

    This paper describes the design, authoring, and development of interactive, computerized, multimedia clinical simulations in pediatric rheumatology/immunology and related musculoskeletal diseases, the development and implementation of a high speed information management system for their centralized storage and distribution, and analytical methods for evaluating the total system's educational impact on medical students and pediatric residents. An FDDI fiber optic network with client/server/host architecture is the core. The server houses digitized audio, still-image video clips and text files. A host station houses the DB2/2 database containing case-associated labels and information. Cases can be accessed from any workstation via a customized interface in AVA/2 written specifically for this application. OS/2 Presentation Manager controls, written in C, are incorporated into the interface. This interface allows SQL searches and retrievals of cases and case materials. In addition to providing user-directed clinical experiences, this centralized information management system provides designated faculty with the ability to add audio notes and visual pointers to image files. Users may browse through case materials, mark selected ones and download them for utilization in lectures or for editing and converting into 35mm slides.

  18. Central venous catheter-related infections in hematology and oncology: 2012 updated guidelines on diagnosis, management and prevention by the Infectious Diseases Working Party of the German Society of Hematology and Medical Oncology.

    PubMed

    Hentrich, M; Schalk, E; Schmidt-Hieber, M; Chaberny, I; Mousset, S; Buchheidt, D; Ruhnke, M; Penack, O; Salwender, H; Wolf, H-H; Christopeit, M; Neumann, S; Maschmeyer, G; Karthaus, M

    2014-05-01

    Cancer patients are at increased risk for central venous catheter-related infections (CRIs). Thus, a comprehensive, practical and evidence-based guideline on CRI in patients with malignancies is warranted. A panel of experts from the Infectious Diseases Working Party (AGIHO) of the German Society of Hematology and Medical Oncology (DGHO) has developed a guideline on CRI in cancer patients. Literature searches of the PubMed, Medline and Cochrane databases were carried out and consensus discussions were held. Recommendations on the diagnosis, management and prevention of CRI in cancer patients are made, and the strength of each recommendation and the level of evidence are presented. This guideline is an evidence-based approach to the diagnosis, management and prevention of CRI in cancer patients.

  19. Health information and communication system for emergency management in a developing country, Iran.

    PubMed

    Seyedin, Seyed Hesam; Jamali, Hamid R

    2011-08-01

    Disasters are fortunately rare occurrences. However, accurate and timely information and communication are vital to adequately prepare individual health organizations for such events. The current article investigates the health-related communication and information systems for emergency management in Iran. A mixed qualitative and quantitative methodology was used in this study. A sample of 230 health service managers was surveyed using a questionnaire, and 65 semi-structured interviews were also conducted with public health and therapeutic affairs managers who were responsible for emergency management. A range of problems were identified, including fragmentation of information, lack of local databases, lack of a clear information strategy and lack of a formal system for logging disaster-related information at the regional or local level. Recommendations were made for improving the national emergency management information and communication system. The findings have implications for health organizations in developing and developed countries, especially in the Middle East. Creating disaster-related information databases, creating protocols and standards, setting an information strategy, training staff and hosting a center for the information system in the Ministry of Health to centrally manage and share the data could improve the current information system.

  20. Software support for Huntington's disease research.

    PubMed Central

    Conneally, P. M.; Gersting, J. M.; Gray, J. M.; Beidleman, K.; Wexler, N. S.; Smith, C. L.

    1991-01-01

    Huntington's disease (HD) is a hereditary disorder involving the central nervous system. Its effects are devastating, to the affected person as well as his family. The Department of Medical and Molecular Genetics at Indiana University (IU) plays an integral part in Huntington's research by providing computerized repositories of HD family information for researchers and families. The National Huntington's Disease Research Roster, founded in 1979 at IU, and the Huntington's Disease in Venezuela Project database contain information that has proven to be invaluable in the worldwide field of HD research. This paper addresses the types of information stored in each database, the pedigree database program (MEGADATS) used to manage the data, and significant findings that have resulted from access to the data. PMID:1839672

  1. The Iranian National Geodata Revision Strategy and Realization Based on Geodatabase

    NASA Astrophysics Data System (ADS)

    Haeri, M.; Fasihi, A.; Ayazi, S. M.

    2012-07-01

    In recent years, the use of spatial databases for storing and managing spatial data has become a hot topic in the field of GIS. Accordingly, the National Cartographic Center of Iran (NCC) produces, from time to time, spatial data that are usually held in databases. One of NCC's major projects was designing the National Topographic Database (NTDB). NCC decided to create a National Topographic Database of the entire country based on 1:25000 coverage maps. The standard of NTDB was published in 1994 and its database was created at the same time. In NTDB, geometric data were stored in MicroStation design format (DGN), in which each feature has a link to its attribute data (stored in a Microsoft Access file). NTDB files were produced in a sheet-wise mode and then stored in a file-based style. Besides map compilation, revision of existing maps has already started. Key problems for NCC are the revision strategy, NTDB's file-based storage style and operator challenges (NCC operators mostly prefer to edit and revise geometric data in CAD environments). A GeoDatabase solution for national Geodata, based on NTDB map files and operators' revision preferences, is introduced and released herein. The proposed solution extends the traditional methods to a seamless spatial database that can be revised in CAD and GIS environments simultaneously. The proposed system is a common data framework creating a central data repository for spatial data storage and management.

  2. MyMolDB: a micromolecular database solution with open source and free components.

    PubMed

    Xia, Bing; Tai, Zheng-Fu; Gu, Yu-Cheng; Li, Bang-Jing; Ding, Li-Sheng; Zhou, Yan

    2011-10-01

    Managing chemical structures is one of the important daily tasks in small laboratories. Few solutions are available on the internet, and most of them are closed-source applications. The open-source applications typically have limited capability and only basic cheminformatics functionality. In this article, we describe an open-source solution for managing chemicals in research groups, built from open-source and free components. It has a user-friendly interface with functions for chemical handling and intensive searching. MyMolDB is a micromolecular database solution that supports exact, substructure, similarity, and combined searching. This solution is mainly implemented in the scripting language Python with a web-based interface for compound management and searching. Almost all searches are in essence done with pure SQL on the database, exploiting the high performance of the database engine. Thus, impressive searching speed has been achieved on large data sets, because no external Central Processing Unit (CPU)-consuming languages are involved in the key search procedure. MyMolDB is open-source software and can be modified and/or redistributed under the GNU General Public License version 3 published by the Free Software Foundation (Free Software Foundation Inc. The GNU General Public License, Version 3, 2007. Available at: http://www.gnu.org/licenses/gpl.html). The software itself can be found at http://code.google.com/p/mymoldb/. Copyright © 2011 Wiley Periodicals, Inc.
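
    The "pure SQL" search idea can be sketched with a fingerprint pre-screen: store a per-molecule binary fingerprint and let the database engine filter substructure candidates with a bitwise test. The 8-bit fingerprints below are toy values and the schema is hypothetical; MyMolDB's actual fingerprints and tables differ:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE molecules (id INTEGER PRIMARY KEY, name TEXT, fp INTEGER)")
        conn.executemany(
            "INSERT INTO molecules (name, fp) VALUES (?, ?)",
            [("benzene", 0b00010110), ("toluene", 0b00010111), ("ethanol", 0b01001000)],
        )

        query_fp = 0b00010110  # fingerprint of the query substructure
        # A molecule can contain the substructure only if every bit set in the
        # query fingerprint is also set in the molecule's: (fp & query) == query.
        candidates = conn.execute(
            "SELECT name FROM molecules WHERE (fp & ?) = ?",
            (query_fp, query_fp),
        ).fetchall()
        print(candidates)  # [('benzene',), ('toluene',)] -- verify with a full match next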

  3. Using SIR (Scientific Information Retrieval System) for data management during a field program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tichler, J.L.

    As part of the US Department of Energy's program, PRocessing of Emissions by Clouds and Precipitation (PRECP), a team of scientists from four laboratories conducted a study in north central New York State to characterize the chemical and physical processes occurring in winter storms. Sampling took place from three aircraft, two instrumented motor homes and a network of 26 surface precipitation sampling sites. Data management personnel were part of the field program, using a portable IBM PC-AT computer to enter information as it became available during the field study. Having the same database software on the field computer and on the cluster of VAX 11/785 computers in use aided database development and the transfer of data between machines. 2 refs., 3 figs., 5 tabs.

  4. A Comparison of Different Database Technologies for the CMS AsyncStageOut Transfer Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ciangottini, D.; Balcas, J.; Mascheroni, M.

    AsyncStageOut (ASO) is the component of the CMS distributed data analysis system (CRAB) that manages users' transfers in a centrally controlled way using the File Transfer System (FTS3) at CERN. It addresses a major weakness of the previous, decentralized model, namely that the transfer of the user's output data to a single remote site was part of the job execution, resulting in inefficient use of job slots and an unacceptable failure rate. Currently ASO manages up to 600k files of various sizes per day from more than 500 users per month, spread over more than 100 sites. ASO uses a NoSQL database (CouchDB) for internal bookkeeping and as a way to communicate with other CRAB components. Since ASO/CRAB were put in production in 2014, the number of transfers constantly increased up to a point where the pressure on the central CouchDB instance became critical, creating new challenges for the system's scalability, performance, and monitoring. This forced a re-engineering of the ASO application to increase its scalability and lower its operational effort. In this contribution we present a comparison of the performance of the current NoSQL implementation and a new SQL implementation, and how their different strengths and features influenced the design choices and operational experience. We also discuss other architectural changes introduced in the system to handle the increasing load and latency in delivering output to the user.

  5. A comparison of different database technologies for the CMS AsyncStageOut transfer database

    NASA Astrophysics Data System (ADS)

    Ciangottini, D.; Balcas, J.; Mascheroni, M.; Rupeika, E. A.; Vaandering, E.; Riahi, H.; Silva, J. M. D.; Hernandez, J. M.; Belforte, S.; Ivanov, T. T.

    2017-10-01

    AsyncStageOut (ASO) is the component of the CMS distributed data analysis system (CRAB) that manages users' transfers in a centrally controlled way using the File Transfer System (FTS3) at CERN. It addresses a major weakness of the previous, decentralized model, namely that the transfer of the user's output data to a single remote site was part of the job execution, resulting in inefficient use of job slots and an unacceptable failure rate. Currently ASO manages up to 600k files of various sizes per day from more than 500 users per month, spread over more than 100 sites. ASO uses a NoSQL database (CouchDB) for internal bookkeeping and as a way to communicate with other CRAB components. Since ASO/CRAB were put in production in 2014, the number of transfers constantly increased up to a point where the pressure on the central CouchDB instance became critical, creating new challenges for the system's scalability, performance, and monitoring. This forced a re-engineering of the ASO application to increase its scalability and lower its operational effort. In this contribution we present a comparison of the performance of the current NoSQL implementation and a new SQL implementation, and how their different strengths and features influenced the design choices and operational experience. We also discuss other architectural changes introduced in the system to handle the increasing load and latency in delivering output to the user.
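
    An illustrative sketch (not the ASO schema; table, column and site names are invented) of the bookkeeping such a transfer system needs: one row per user file, with a status that moves from 'new' through 'submitted' to 'done' or 'failed'. In a relational implementation these state transitions become simple indexed UPDATEs, which is part of the scalability argument above; the FTS3 submission is replaced by a placeholder:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE transfers (
            file_id   INTEGER PRIMARY KEY,
            user      TEXT,
            dest_site TEXT,
            status    TEXT DEFAULT 'new',   -- new -> submitted -> done | failed
            fts_job   TEXT)""")
        conn.execute("CREATE INDEX idx_status ON transfers (status)")
        conn.execute("INSERT INTO transfers (user, dest_site) VALUES ('alice', 'T2_IT_Rome')")

        # Submission pass: pick up new files, hand them to the transfer service,
        # and record the returned job id.
        rows = conn.execute("SELECT file_id FROM transfers WHERE status = 'new'").fetchall()
        for (file_id,) in rows:
            fts_job = f"job-{file_id}"      # placeholder for a real FTS3 submission
            conn.execute(
                "UPDATE transfers SET status = 'submitted', fts_job = ? WHERE file_id = ?",
                (fts_job, file_id),
            )
        conn.commit()
        print(conn.execute("SELECT user, status, fts_job FROM transfers").fetchall())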

  6. Concepts and data model for a co-operative neurovascular database.

    PubMed

    Mansmann, U; Taylor, W; Porter, P; Bernarding, J; Jäger, H R; Lasjaunias, P; Terbrugge, K; Meisel, J

    2001-08-01

    Problems in the clinical management of neurovascular diseases are very complex. This is caused by the chronic character of the diseases, a long history of symptoms and diverse treatments. If patients are to benefit from treatment, then treatment decisions have to rely on reliable and accurate knowledge of the natural history of the disease and the various treatments. Recent developments in statistical methodology and experience from electronic patient records are used to establish an information infrastructure based on a centralized register. A protocol for collecting data on neurovascular diseases is described, together with the technical and logistical aspects of implementing the database. The database is designed as a co-operative tool for audit and research available to co-operating centres. When a database is linked to systematic patient follow-up, it can be used to study prognosis. Careful analysis of patient outcome is valuable for decision-making.

  7. Cutaneous lichen planus: A systematic review of treatments.

    PubMed

    Fazel, Nasim

    2015-06-01

    Various treatment modalities are available for cutaneous lichen planus. PubMed, EMBASE, the Cochrane Database of Systematic Reviews, the Cochrane Central Register of Controlled Trials, the Database of Abstracts of Reviews of Effects, and the Health Technology Assessment Database were searched for all systematic reviews and randomized controlled trials related to cutaneous lichen planus. Two systematic reviews and nine relevant randomized controlled trials were identified. Acitretin, griseofulvin, hydroxychloroquine and narrow-band ultraviolet B have been demonstrated to be effective in the treatment of cutaneous lichen planus. Sulfasalazine is effective, but has an unfavorable safety profile. KH1060, a vitamin D analogue, is not beneficial in the management of cutaneous lichen planus. Evidence from large-scale randomized trials demonstrating the safety and efficacy of many other treatment modalities used to treat cutaneous lichen planus is simply not available.

  8. BioMart Central Portal: an open database network for the biological community

    PubMed Central

    Guberman, Jonathan M.; Ai, J.; Arnaiz, O.; Baran, Joachim; Blake, Andrew; Baldock, Richard; Chelala, Claude; Croft, David; Cros, Anthony; Cutts, Rosalind J.; Di Génova, A.; Forbes, Simon; Fujisawa, T.; Gadaleta, E.; Goodstein, D. M.; Gundem, Gunes; Haggarty, Bernard; Haider, Syed; Hall, Matthew; Harris, Todd; Haw, Robin; Hu, S.; Hubbard, Simon; Hsu, Jack; Iyer, Vivek; Jones, Philip; Katayama, Toshiaki; Kinsella, R.; Kong, Lei; Lawson, Daniel; Liang, Yong; Lopez-Bigas, Nuria; Luo, J.; Lush, Michael; Mason, Jeremy; Moreews, Francois; Ndegwa, Nelson; Oakley, Darren; Perez-Llamas, Christian; Primig, Michael; Rivkin, Elena; Rosanoff, S.; Shepherd, Rebecca; Simon, Reinhard; Skarnes, B.; Smedley, Damian; Sperling, Linda; Spooner, William; Stevenson, Peter; Stone, Kevin; Teague, J.; Wang, Jun; Wang, Jianxin; Whitty, Brett; Wong, D. T.; Wong-Erasmus, Marie; Yao, L.; Youens-Clark, Ken; Yung, Christina; Zhang, Junjun; Kasprzyk, Arek

    2011-01-01

    BioMart Central Portal is a first of its kind, community-driven effort to provide unified access to dozens of biological databases spanning genomics, proteomics, model organisms, cancer data, ontology information and more. Anybody can contribute an independently maintained resource to the Central Portal, allowing it to be exposed to and shared with the research community, and linking it with the other resources in the portal. Users can take advantage of the common interface to quickly utilize different sources without learning a new system for each. The system also simplifies cross-database searches that might otherwise require several complicated steps. Several integrated tools streamline common tasks, such as converting between ID formats and retrieving sequences. The combination of a wide variety of databases, an easy-to-use interface, robust programmatic access and the array of tools make Central Portal a one-stop shop for biological data querying. Here, we describe the structure of Central Portal and show example queries to demonstrate its capabilities. Database URL: http://central.biomart.org. PMID:21930507

  9. Did States Use Implementation Discretion to Reduce the Stringency of NCLB? Evidence from a Database of State Regulations

    ERIC Educational Resources Information Center

    Wong, Vivian C.; Wing, Coady; Martin, David; Krishnamachari, Anandita

    2018-01-01

    When No Child Left Behind (NCLB) became law in 2002, it was viewed as an effort to create uniform standards for students and schools across the country. More than a decade later, we know surprisingly little about how states actually implemented NCLB and the extent to which state implementation decisions managed to undo the centralizing objectives…

  10. An architecture for a brain-image database

    NASA Technical Reports Server (NTRS)

    Herskovits, E. H.

    2000-01-01

    The widespread availability of methods for noninvasive assessment of brain structure has enabled researchers to investigate neuroimaging correlates of normal aging, cerebrovascular disease, and other processes; we designate such studies as image-based clinical trials (IBCTs). We propose an architecture for a brain-image database, which integrates image processing and statistical operators, and thus supports the implementation and analysis of IBCTs. The implementation of this architecture is described and results from the analysis of image and clinical data from two IBCTs are presented. We expect that systems such as this will play a central role in the management and analysis of complex research data sets.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The system is developed to collect, process, store and present the information provided by radio frequency identification (RFID) devices. The system contains three parts: the application software, the database and the web page. The application software manages multiple RFID devices, such as readers and portals, simultaneously. It communicates with the devices through an application programming interface (API) provided by the device vendor. The application software converts data collected by the RFID readers and portals into readable information. It is capable of encrypting data using the 256-bit advanced encryption standard (AES). The application software has a graphical user interface (GUI). The GUI mimics the configurations of the nuclear material storage sites or transport vehicles. The GUI gives the user and system administrator an intuitive way to read the information and/or configure the devices. The application software is capable of sending the information to a remote, dedicated and secured web and database server. Two captured screen samples, one for storage and one for transport, are attached. The database is constructed to handle a large number of RFID tag readers and portals. A SQL server is employed for this purpose. An XML script is used to update the database once the information is sent from the application software. The design of the web page imitates the design of the application software. The web page retrieves data from the database and presents it in different panels. The user needs a user name combined with a password to access the web page. The web page is capable of sending e-mail and text messages based on preset criteria, such as when alarm thresholds are exceeded. A captured screen sample is attached. The application software is designed to be installed on a local computer. The local computer is directly connected to the RFID devices and can be controlled locally or remotely. There are multiple local computers managing different sites or transport vehicles. Control from remote sites and transmission of information to the central database server is through the secured internet. The information stored in the central database server is shown on the web page. Users can view the web page on the internet. A dedicated and secured web and database server (https) is used to provide information security.
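
    The record above mentions 256-bit AES protection of the collected data. A minimal sketch of that step using the third-party Python cryptography package (an assumption on our part; the record does not specify the actual crypto stack beyond "256-bit AES", and the tag payload below is invented):

        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        key = AESGCM.generate_key(bit_length=256)   # 32-byte key -> AES-256
        aead = AESGCM(key)

        def encrypt_reading(plaintext: bytes) -> bytes:
            nonce = os.urandom(12)                  # unique nonce per message
            return nonce + aead.encrypt(nonce, plaintext, None)

        def decrypt_reading(blob: bytes) -> bytes:
            nonce, ciphertext = blob[:12], blob[12:]
            return aead.decrypt(nonce, ciphertext, None)

        reading = b"tag=4F2A;portal=3;ts=2010-06-01T12:00:00Z"
        blob = encrypt_reading(reading)
        assert decrypt_reading(blob) == reading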

  12. SU-G-TeP4-06: An Integrated Application for Radiation Therapy Treatment Plan Directives, Management, and Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matuszak, M; Anderson, C; Lee, C

    Purpose: With electronic medical records, patient information for the treatment planning process has become disseminated across multiple applications with limited quality control and many associated failure modes. We present the development of a single application with a centralized database to manage the planning process. Methods: The system was designed to replace current functionalities of (i) static directives representing the physician intent for the prescription and planning goals, localization information for delivery, and other information, (ii) planning objective reports, (iii) localization and image guidance documents and (iv) the official radiation therapy prescription in the medical record. Using the Eclipse Scripting Application Programming Interface, a plug-in script with an associated domain-specific SQL Server database was created to manage the information in (i)–(iv). The system's user interface and database were designed by a team of physicians, clinical physicists, database experts, and software engineers to ensure usability and robustness for clinical use. Results: The resulting system has been fully integrated within the TPS via a custom script and database. Planning scenario templates, version control, approvals, and logic-based quality control allow this system to fully track and document the planning process as well as physician approval of tradeoffs while improving the consistency of the data. Multiple plans and prescriptions are supported along with non-traditional dose objectives and evaluation such as biologically corrected models, composite dose limits, and management of localization goals. User-specific custom views were developed for the attending physician review, physicist plan checks, treating therapists, and peer review in chart rounds. Conclusion: A method was developed to maintain cohesive information throughout the planning process within one integrated system by using a custom treatment planning management application that interfaces directly with the TPS. Future work includes quantifying the improvements in quality, safety and efficiency that are possible with the routine clinical use of this system. Supported in part by NIH-P01-CA-059827.

  13. Managing Data, Provenance and Chaos through Standardization and Automation at the Georgia Coastal Ecosystems LTER Site

    NASA Astrophysics Data System (ADS)

    Sheldon, W.

    2013-12-01

    Managing data for a large, multidisciplinary research program such as a Long Term Ecological Research (LTER) site is a significant challenge, but also presents unique opportunities for data stewardship. LTER research is conducted within multiple organizational frameworks (i.e. a specific LTER site as well as the broader LTER network), and addresses both specific goals defined in an NSF proposal and broader goals of the network; therefore, all LTER data can be linked to rich contextual information to guide interpretation and comparison. The challenge is how to link the data to this wealth of contextual metadata. At the Georgia Coastal Ecosystems LTER we developed an integrated information management system (GCE-IMS) to manage, archive and distribute data, metadata and other research products as well as manage project logistics, administration and governance (figure 1). This system allows us to store all project information in one place and provide dynamic links through web applications and services to ensure content is always up to date on the web as well as in data set metadata. The database model supports tracking changes over time in personnel roles, projects and governance decisions, allowing these databases to serve as canonical sources of project history. Storing project information in a central database has also allowed us to standardize both the formatting and content of critical project information, including personnel names, roles, keywords, place names, attribute names, units, and instrumentation, providing consistency and improving data and metadata comparability. Lookup services for these standard terms also simplify data entry in web and database interfaces. We have also coupled the GCE-IMS to our MATLAB- and Python-based data processing tools (i.e. through database connections) to automate metadata generation and the packaging of tabular and GIS data products for distribution. Data processing history is automatically tracked throughout the data lifecycle, from initial import through quality control, revision and integration by our data processing system (GCE Data Toolbox for MATLAB), and included in the metadata for versioned data products. This high level of automation and system integration has proven very effective in managing the chaos and scalability of our information management program.
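
    A hedged sketch of the controlled-vocabulary idea described above: data entry validates terms against canonical lookup tables in the central database, so units, names and keywords stay consistent across data sets. The table name and terms below are hypothetical, not the GCE schema:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE units (name TEXT PRIMARY KEY)")
        conn.executemany("INSERT INTO units VALUES (?)", [("mg/L",), ("degC",), ("PSU",)])

        def validate_unit(unit):
            """Accept only units already registered in the canonical lookup table."""
            row = conn.execute("SELECT 1 FROM units WHERE name = ?", (unit,)).fetchone()
            if row is None:
                raise ValueError(f"unit {unit!r} is not in the controlled vocabulary")
            return unit

        validate_unit("mg/L")         # passes
        # validate_unit("mgs/liter")  # would raise: not a canonical term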

  14. Simulation and management games for training command and control in emergencies.

    PubMed

    Levi, Leon; Bregman, David

    2003-01-01

    The aim of our project was to introduce and implement simulation techniques in a problematic field: increasing health care system preparedness for disasters. This field was chosen because the relevant knowledge is held by a few experienced staff members, who must disseminate it to others amid the busy routine work of the system's personnel. Knowledge management techniques were used throughout the project, from classifying the current data and storing organizational knowledge centrally to using it for decision making and dispersing it through the organization. In the first stage we analyzed the current system for building a preparedness protocol (a set of orders) and identified the pitfalls of changing personnel and losing knowledge gained through lessons from local and national experience. For this stage we developed a database of resources and objects (casualties) that can be used in the simulation in different modes; one of these was the distinction between drills led by a trainer and self-paced exercises at a computer, in which trainees work out the required solution themselves. The model rules for different scenarios of multi-casualty incidents, from conventional warfare trauma to combined chemical/toxicological events, as well as levels of care before and inside hospitals, were incorporated into the database management system (we used Microsoft Access as the DBMS). The hardware for the management game comprised networked computers with the ability to project scenes; for the prehospital phase, portable PCs connected to a central server were used to assess the bidirectional flow of information. The simulation software (ARENA) and the graphical user interface (Visual Basic GUI) are shown in the attached figure. We conclude that our system provides solutions that are in use at different levels of the health care system to assess and improve command and control for different scenarios of multi-casualty incidents.

  15. Outcome of Pediatric Gastroenterology Outpatients With Fever and Central Line.

    PubMed

    Alexander, Thomas; Blatt, Julie; Skinner, Asheley Cockrell; Jhaveri, Ravi; Jobson, Meghan; Freeman, Katherine

    2016-11-01

    Although management algorithms for fever and central venous catheters (CVCs) have been implemented for pediatric oncology (PO) patients, management of pediatric outpatients with noncancer diagnoses and CVCs lacks clear protocols. The aim of the study was to assess outcomes for pediatric outpatients with gastrointestinal disorders presenting with fever and a CVC. Using a microbiology database and emergency department records, we created a database of pediatric gastroenterology (PGI) and PO outpatients with fever and a CVC who presented to our emergency department or clinics from January 2010 through December 2012. We excluded patients who had severe neutropenia (absolute neutrophil count, <500/mm³). We performed chart reviews to assess demographic and clinical characteristics. A total of 334 episodes in 144 patients were evaluated. Fifty-three percent (95% confidence interval, 38%-68%) of PGI patients had a bloodstream infection, whereas only 9% (95% confidence interval, 5%-14%) of PO patients had a bloodstream infection (P < 0.001). Among patients with a bloodstream infection, the PGI patients were more likely than the PO patients to have polymicrobial infections (46% vs 15%), gram-negative infections (57% vs 27%), and/or infection with enteric organisms (61% vs 23%). The PGI patients had higher rates of CVC removal (19% vs 4%) but no statistical difference in intensive care unit needs (11% vs 4%). Pediatric gastroenterology outpatients with fever and a CVC have a high prevalence of bloodstream infection. Algorithms for management need to be subspecialty specific. Pediatric gastroenterology patients presenting to emergency departments or clinics with fever and a CVC require admission for monitoring and management.

  16. DISTRIBUTED STRUCTURE-SEARCHABLE TOXICITY ...

    EPA Pesticide Factsheets

    The ability to assess the potential genotoxicity, carcinogenicity, or other toxicity of pharmaceutical or industrial chemicals based on chemical structure information is a highly coveted and shared goal of varied academic, commercial, and government regulatory groups. These diverse interests often employ different approaches and have different criteria and uses for toxicity assessments, but they share a need for unrestricted access to existing public toxicity data linked with chemical structure information. Currently, there exists no central repository of toxicity information, commercial or public, that adequately meets the data requirements for flexible analogue searching, SAR model development, or building of chemical relational databases (CRD). The Distributed Structure-Searchable Toxicity (DSSTox) Public Database Network is being proposed as a community-supported, web-based effort to address these shared needs of the SAR and toxicology communities. The DSSTox project has the following major elements: 1) to adopt and encourage the use of a common standard file format (SDF) for public toxicity databases that includes chemical structure, text and property information, and that can easily be imported into available CRD applications; 2) to implement a distributed source approach, managed by a DSSTox Central Website, that will enable decentralized, free public access to structure-toxicity data files, and that will effectively link knowledgeable toxicity data s
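
    Since the proposal centres on the SD file (SDF) standard, a small sketch of reading such records may be useful. This is a simplified, hypothetical parser: real SD files carry a molfile connection table before the data items, which this sketch skips, extracting only the record name line and the "> <Field>" data items.

```python
# Minimal sketch of reading records from an SD (SDF) file. Each record
# ends with a "$$$$" line; data items are introduced by "> <FieldName>".
def read_sdf_records(path):
    with open(path) as fh:
        record_lines = []
        for line in fh:
            if line.strip() == "$$$$":           # end-of-record delimiter
                yield parse_record(record_lines)
                record_lines = []
            else:
                record_lines.append(line.rstrip("\n"))

def parse_record(lines):
    record = {"name": lines[0] if lines else "", "fields": {}}
    i = 0
    while i < len(lines):
        if lines[i].startswith("> <"):           # data header, e.g. > <LC50>
            field = lines[i][3:].split(">")[0]
            value_lines = []
            i += 1
            while i < len(lines) and lines[i].strip():
                value_lines.append(lines[i])
                i += 1
            record["fields"][field] = "\n".join(value_lines)
        else:
            i += 1
    return record

# Usage: for rec in read_sdf_records("toxicity.sdf"): print(rec["name"])
```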

  17. Active in-database processing to support ambient assisted living systems.

    PubMed

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
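
    To make the active-database idea concrete, here is a self-contained sketch using SQLite as a stand-in DBMS (the paper does not specify SQLite; the table names, event vocabulary, and night-time rule are invented for illustration). A trigger reacts to events inside the database itself, so raw sensor data never has to leave the DBMS.

```python
# Sketch of "active in-database processing": a trigger detects an event
# and writes an alert entirely inside the database engine.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sensor_events (ts TEXT, room TEXT, event_type TEXT);
CREATE TABLE alerts (ts TEXT, message TEXT);

-- Active rule: a bed exit recorded between midnight and 05:00 raises an alert.
CREATE TRIGGER bed_exit_alert AFTER INSERT ON sensor_events
WHEN NEW.event_type = 'bed_exit'
     AND CAST(strftime('%H', NEW.ts) AS INTEGER) < 5
BEGIN
    INSERT INTO alerts VALUES (NEW.ts, 'night-time bed exit in ' || NEW.room);
END;
""")

con.execute("INSERT INTO sensor_events VALUES ('2014-08-12 03:12:00', 'bedroom', 'bed_exit')")
print(con.execute("SELECT * FROM alerts").fetchall())
# [('2014-08-12 03:12:00', 'night-time bed exit in bedroom')]
```

    Because the rule runs inside the engine, application code only inserts events; detection and alerting happen where the data already lives, which is the privacy and performance argument the paper makes.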

  18. Active In-Database Processing to Support Ambient Assisted Living Systems

    PubMed Central

    de Morais, Wagner O.; Lundström, Jens; Wickström, Nicholas

    2014-01-01

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare. PMID:25120164

  19. Geoinformatics in the public service: building a cyberinfrastructure across the geological surveys

    USGS Publications Warehouse

    Allison, M. Lee; Gundersen, Linda C.; Richard, Stephen M.; Keller, G. Randy; Baru, Chaitanya

    2011-01-01

    Advanced information technology infrastructure is increasingly being employed in the Earth sciences to provide researchers with efficient access to massive central databases and to integrate diversely formatted information from a variety of sources. These geoinformatics initiatives enable manipulation, modeling and visualization of data in a consistent way, and are helping to develop integrated Earth models at various scales, and from the near surface to the deep interior. This book uses a series of case studies to demonstrate computer and database use across the geosciences. Chapters are thematically grouped into sections that cover data collection and management; modeling and community computational codes; visualization and data representation; knowledge management and data integration; and web services and scientific workflows. Geoinformatics is a fascinating and accessible introduction to this emerging field for readers across the solid Earth sciences and an invaluable reference for researchers interested in initiating new cyberinfrastructure projects of their own.

  20. Indexing of randomised controlled trials of physiotherapy interventions: a comparison of AMED, CENTRAL, CINAHL, EMBASE, hooked on evidence, PEDro, PsycINFO and PubMed.

    PubMed

    Moseley, Anne M; Sherrington, Catherine; Elkins, Mark R; Herbert, Robert D; Maher, Christopher G

    2009-09-01

    To compare the comprehensiveness of indexing the reports of randomised controlled trials of physiotherapy interventions by eight bibliographic databases (AMED, CENTRAL, CINAHL, EMBASE, Hooked on Evidence, PEDro, PsycINFO and PubMed). Audit of bibliographic databases. Two hundred and eighty-one reports of randomised controlled trials of physiotherapy interventions were identified by screening the reference lists of 30 relevant systematic reviews published in four consecutive issues of the Cochrane Database of Systematic Reviews (Issue 3, 2007 to Issue 2, 2008). AMED, CENTRAL, CINAHL, EMBASE, Hooked on Evidence, PEDro, PsycINFO and PubMed were used to search for the trial reports. The number of trial reports indexed in each database was calculated. PEDro indexed 99% of the trial reports, CENTRAL indexed 98%, PubMed indexed 91%, EMBASE indexed 82%, CINAHL indexed 61%, Hooked on Evidence indexed 40%, AMED indexed 36% and PsycINFO indexed 17%. Most trial reports (92%) were indexed on four or more of the databases. One trial report was indexed on a single database (PEDro). Of the eight bibliographic databases examined, PEDro and CENTRAL provide the most comprehensive indexing of reports of randomised trials of physiotherapy interventions.

  1. Design and implementation of a fault-tolerant and dynamic metadata database for clinical trials

    NASA Astrophysics Data System (ADS)

    Lee, J.; Zhou, Z.; Talini, E.; Documet, J.; Liu, B.

    2007-03-01

    In recent imaging-based clinical trials, quantitative image analysis (QIA) and computer-aided diagnosis (CAD) methods are increasing in productivity due to higher resolution imaging capabilities. A radiology core doing clinical trials has been analyzing more treatment methods, and there is a growing quantity of metadata that needs to be stored and managed. These radiology centers are also collaborating with many off-site imaging field sites and need a way to communicate metadata between one another in a secure infrastructure. Our solution is to implement a data storage grid with a fault-tolerant and dynamic metadata database design to unify metadata from different clinical trial experiments and field sites. Although metadata from images follow the DICOM standard, clinical trials also produce metadata specific to regions-of-interest and quantitative image analysis. We have implemented a data access and integration (DAI) server layer where multiple field sites can access multiple metadata databases in the data grid through a single web-based grid service. The centralization of metadata database management simplifies the task of adding new databases into the grid and also decreases the risk of configuration errors seen in peer-to-peer grids. In this paper, we address the design and implementation of a data grid metadata storage system that provides fault tolerance and dynamic integration for imaging-based clinical trials.
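
    The single-entry-point idea behind a DAI layer reduces to fanning one query out over several site databases while tolerating individual failures. The sketch below is hypothetical; SQLite files stand in for the grid's metadata databases, and the site registry is invented.

```python
# Hypothetical sketch of a data access and integration (DAI) layer: one
# entry point queries every registered site database and keeps going when
# an individual site is unreachable.
import sqlite3

SITE_DATABASES = {"site_a": "site_a.db", "site_b": "site_b.db"}

def query_all_sites(sql, params=()):
    """Run one query against every site DB; skip (and report) failing sites."""
    results, failures = [], []
    for site, path in SITE_DATABASES.items():
        try:
            with sqlite3.connect(path) as con:
                for row in con.execute(sql, params):
                    results.append((site, row))
        except sqlite3.Error as exc:
            failures.append((site, str(exc)))   # fault tolerance: keep going
    return results, failures

# Usage: rows, down = query_all_sites("SELECT trial_id, roi_count FROM metadata")
```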

  2. The GTN-P Data Management System: A central database for permafrost monitoring parameters of the Global Terrestrial Network for Permafrost (GTN-P) and beyond

    NASA Astrophysics Data System (ADS)

    Lanckman, Jean-Pierre; Elger, Kirsten; Karlsson, Ævar Karl; Johannsson, Halldór; Lantuit, Hugues

    2013-04-01

    Permafrost is a direct indicator of climate change and has been identified as an Essential Climate Variable (ECV) by the global observing community. The monitoring of permafrost temperatures, active-layer thicknesses and other parameters has been performed for several decades already, but it was brought together within the Global Terrestrial Network for Permafrost (GTN-P) only in the 1990s, including the development of measurement protocols to provide standardized data. GTN-P is the primary international observing network for permafrost, sponsored by the Global Climate Observing System (GCOS) and the Global Terrestrial Observing System (GTOS) and managed by the International Permafrost Association (IPA). All GTN-P data is covered by an "open data policy" with free data access via the World Wide Web. The existing data, however, is far from being homogeneous: it is not yet optimized for databases, there is no framework for data reporting or archival, and data documentation is incomplete. As a result, and despite the utmost relevance of permafrost in the Earth's climate system, the data has not been used by as many researchers as intended by the initiators of the programs. While the monitoring of many other ECVs has been tackled by organized international networks (e.g. FLUXNET), there is still no central database for all permafrost-related parameters. The European Union project PAGE21 created opportunities to develop this central database for permafrost monitoring parameters of GTN-P during the duration of the project and beyond. The database aims to be the one location where the researcher can find data, metadata, and information on all relevant parameters for a specific site. Each component of the Data Management System (DMS), including parameters, data levels and metadata formats, was developed in cooperation with the GTN-P and the IPA. The general framework of the GTN-P DMS is based on an object oriented model (OOM), open for as many parameters as possible, and implemented in a spatial database. To ensure interoperability and enable potential inter-database search, field names follow international metadata standards and are based on a controlled vocabulary registry. Tools are being developed to provide data processing, analysis capabilities, and quality control. Our system aims to be a reference model, improvable and reusable. It allows a maximum of top-down and bottom-up data flow, giving scientists one globally searchable data and metadata repository, the public full access to scientific data, and policy makers a powerful cartographic and statistical tool. To engage the international community in GTN-P, it was essential to develop an online interface for data upload; the aim was to make it easy to use and to allow data input with a minimum of technical and personnel effort. In addition, substantial effort will be required to query, visualize and retrieve information across many platforms and types of measurement. Ultimately, it is not each information layer in itself that matters, but the relationships that these layers maintain with each other.

  3. Time-critical Database Condition Data Handling in the CMS Experiment During the First Data Taking Period

    NASA Astrophysics Data System (ADS)

    Cavallari, Francesca; de Gruttola, Michele; Di Guida, Salvatore; Govi, Giacomo; Innocente, Vincenzo; Pfeiffer, Andreas; Pierro, Antonio

    2011-12-01

    Automatic, synchronous and reliable population of the condition databases is critical for the correct operation of the online selection as well as of the offline reconstruction and analysis of data. In this complex infrastructure, monitoring and fast detection of errors is a very challenging task. In this paper, we describe the CMS experiment system to process and populate the Condition Databases and make condition data promptly available both online for the high-level trigger and offline for reconstruction. The data are automatically collected using centralized jobs or are "dropped" by the users in dedicated services (offline and online drop-box), which synchronize them and take care of writing them into the online database. Then they are automatically streamed to the offline database, and thus are immediately accessible offline worldwide. The condition data are managed by different users using a wide range of applications. In normal operation the database monitor is used to provide simple timing information and the history of all transactions for all database accounts, and in the case of faults it is used to return simple error messages and more complete debugging information.

  4. CENTRAL, PEDro, PubMed, and EMBASE are the most comprehensive databases indexing randomized controlled trials of physical therapy interventions.

    PubMed

    Michaleff, Zoe A; Costa, Leonardo O P; Moseley, Anne M; Maher, Christopher G; Elkins, Mark R; Herbert, Robert D; Sherrington, Catherine

    2011-02-01

    Many bibliographic databases index research studies evaluating the effects of health care interventions. One study has concluded that the Physiotherapy Evidence Database (PEDro) has the most complete indexing of reports of randomized controlled trials of physical therapy interventions, but the design of that study may have exaggerated estimates of the completeness of indexing by PEDro. The purpose of this study was to compare the completeness of indexing of reports of randomized controlled trials of physical therapy interventions by 8 bibliographic databases. This study was an audit of bibliographic databases. Prespecified criteria were used to identify 400 reports of randomized controlled trials from the reference lists of systematic reviews published in 2008 that evaluated physical therapy interventions. Eight databases (AMED, CENTRAL, CINAHL, EMBASE, Hooked on Evidence, PEDro, PsycINFO, and PubMed) were searched for each trial report. The proportion of the 400 trial reports indexed by each database was calculated. The proportions of the 400 trial reports indexed by the databases were as follows: CENTRAL, 95%; PEDro, 92%; PubMed, 89%; EMBASE, 88%; CINAHL, 53%; AMED, 50%; Hooked on Evidence, 45%; and PsycINFO, 6%. Almost all of the trial reports (99%) were found in at least 1 database, and 88% were indexed by 4 or more databases. Four trial reports were uniquely indexed by a single database only (2 in CENTRAL and 1 each in PEDro and PubMed). The results are only applicable to searching for English-language published reports of randomized controlled trials evaluating physical therapy interventions. The 4 most comprehensive databases of trial reports evaluating physical therapy interventions were CENTRAL, PEDro, PubMed, and EMBASE. Clinicians seeking quick answers to clinical questions could search any of these databases knowing that all are reasonably comprehensive. PEDro, unlike the other 3 most complete databases, is specific to physical therapy, so studies not relevant to physical therapy are less likely to be retrieved. Researchers could use CENTRAL, PEDro, PubMed, and EMBASE in combination to conduct exhaustive searches for randomized trials in physical therapy.

  5. Informatics and data quality at collaborative multicenter Breast and Colon Cancer Family Registries.

    PubMed

    McGarvey, Peter B; Ladwa, Sweta; Oberti, Mauricio; Dragomir, Anca Dana; Hedlund, Erin K; Tanenbaum, David Michael; Suzek, Baris E; Madhavan, Subha

    2012-06-01

    Quality control and harmonization of data is a vital and challenging undertaking for any successful data coordination center and a responsibility shared between the multiple sites that produce, integrate, and utilize the data. Here we describe a coordinated effort between scientists and data managers in the Cancer Family Registries to implement a data governance infrastructure consisting of both organizational and technical solutions. The technical solution uses a rule-based validation system that facilitates error detection and correction for data centers submitting data to a central informatics database. Validation rules comprise both standard checks on allowable values and a crosscheck of related database elements for logical and scientific consistency. Evaluation over a 2-year timeframe showed a significant decrease in the number of errors in the database and a concurrent increase in data consistency and accuracy.
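
    A minimal sketch of such a rule-based validator follows. The two rules and the field names are invented examples of the "standard checks" and "crosschecks" described, not the registries' actual rules.

```python
# Hypothetical rule-based validation: a standard range check plus a
# crosscheck of two related fields for logical consistency.
def rule_age_range(rec):
    return 0 <= rec["age_at_diagnosis"] <= 120

def rule_diagnosis_before_death(rec):
    # crosscheck: related fields must be logically consistent
    return rec["year_of_death"] is None or rec["year_of_death"] >= rec["year_of_diagnosis"]

RULES = [("age in 0-120", rule_age_range),
         ("death not before diagnosis", rule_diagnosis_before_death)]

def validate(record):
    """Return the names of the rules a submitted record violates."""
    return [name for name, rule in RULES if not rule(record)]

print(validate({"age_at_diagnosis": 54, "year_of_diagnosis": 2009, "year_of_death": 2007}))
# ['death not before diagnosis']
```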

  6. Informatics and data quality at collaborative multicenter Breast and Colon Cancer Family Registries

    PubMed Central

    McGarvey, Peter B; Ladwa, Sweta; Oberti, Mauricio; Dragomir, Anca Dana; Hedlund, Erin K; Tanenbaum, David Michael; Suzek, Baris E

    2012-01-01

    Quality control and harmonization of data is a vital and challenging undertaking for any successful data coordination center and a responsibility shared between the multiple sites that produce, integrate, and utilize the data. Here we describe a coordinated effort between scientists and data managers in the Cancer Family Registries to implement a data governance infrastructure consisting of both organizational and technical solutions. The technical solution uses a rule-based validation system that facilitates error detection and correction for data centers submitting data to a central informatics database. Validation rules comprise both standard checks on allowable values and a crosscheck of related database elements for logical and scientific consistency. Evaluation over a 2-year timeframe showed a significant decrease in the number of errors in the database and a concurrent increase in data consistency and accuracy. PMID:22323393

  7. Software architecture of the Magdalena Ridge Observatory Interferometer

    NASA Astrophysics Data System (ADS)

    Farris, Allen; Klinglesmith, Dan; Seamons, John; Torres, Nicolas; Buscher, David; Young, John

    2010-07-01

    Merging software from 36 independent work packages into a coherent, unified software system with a lifespan of twenty years is the challenge faced by the Magdalena Ridge Observatory Interferometer (MROI). We solve this problem by using standardized interface software automatically generated from simple high-level descriptions of these systems, relying only on Linux, GNU, and POSIX without complex software such as CORBA. This approach, based on gigabit Ethernet with a TCP/IP protocol, provides the flexibility to integrate and manage diverse, independent systems using a centralized supervisory system that provides a database manager, data collectors, fault handling, and an operator interface.
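
    The idea of generating standardized interface code from simple high-level descriptions can be shown with a toy stand-in. The description format and names below are invented, and Python stands in for whatever language the MROI generators actually emit; only the technique (code generated from a specification) is the point.

```python
# Toy sketch of interface-code generation: a message specification is
# turned into class definitions at run time.
MESSAGE_SPEC = {"SetTarget": [("ra_deg", float), ("dec_deg", float)],
                "Status":    [("state", str), ("uptime_s", int)]}

def generate_message_class(name, fields):
    lines = [f"class {name}:",
             "    def __init__(self, " + ", ".join(f for f, _ in fields) + "):"]
    for field, ftype in fields:
        lines.append(f"        self.{field} = {ftype.__name__}({field})")
    return "\n".join(lines)

namespace = {}
for name, fields in MESSAGE_SPEC.items():
    exec(generate_message_class(name, fields), namespace)   # compile generated code

msg = namespace["SetTarget"]("183.2", "-22.5")   # strings coerced to float
print(msg.ra_deg, msg.dec_deg)                    # 183.2 -22.5
```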

  8. Development of a forestry government agency enterprise GIS system: a disconnected editing approach

    NASA Astrophysics Data System (ADS)

    Zhu, Jin; Barber, Brad L.

    2008-10-01

    The Texas Forest Service (TFS) has developed a geographic information system (GIS) for use by agency personnel in central Texas for managing oak wilt suppression and other landowner assistance programs. This Enterprise GIS system was designed to support multiple concurrent users accessing shared information resources. The disconnected editing approach was adopted in this system to avoid the overhead of maintaining an active connection between TFS central Texas field offices and headquarters, since most field offices are operating with commercially provided Internet service. The GIS system entails maintaining a personal geodatabase on each local field office computer. Spatial data from the field is periodically uploaded into a central master geodatabase stored in a Microsoft SQL Server at the TFS headquarters in College Station through the ESRI Spatial Database Engine (SDE). This GIS allows users to work off-line when editing data and requires connecting to the central geodatabase only when needed.
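
    At its core, a disconnected-editing workflow queues edits in the local geodatabase and replays them against the central master on reconnection. The sketch below is a hypothetical illustration; SQLite stands in for the personal geodatabase and the SQL Server master, and the table names are invented.

```python
# Hypothetical sketch of disconnected editing: edits accumulate locally
# and are pushed to the central database only when a connection exists.
import sqlite3

def record_edit_locally(local_db, feature_id, attribute, value):
    local_db.execute(
        "INSERT INTO pending_edits (feature_id, attribute, value) VALUES (?, ?, ?)",
        (feature_id, attribute, value))

def sync_to_central(local_db, central_db):
    """Replay queued edits against the central DB, then clear the queue."""
    edits = local_db.execute(
        "SELECT feature_id, attribute, value FROM pending_edits").fetchall()
    for feature_id, attribute, value in edits:
        central_db.execute(
            "INSERT INTO master_edits (feature_id, attribute, value) VALUES (?, ?, ?)",
            (feature_id, attribute, value))
    local_db.execute("DELETE FROM pending_edits")

local = sqlite3.connect(":memory:")
central = sqlite3.connect(":memory:")
local.execute("CREATE TABLE pending_edits (feature_id, attribute, value)")
central.execute("CREATE TABLE master_edits (feature_id, attribute, value)")
record_edit_locally(local, 17, "oak_wilt_status", "suppressed")
sync_to_central(local, central)   # call only when the field office is online
print(central.execute("SELECT * FROM master_edits").fetchall())
```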

  9. Methods and implementation of a central biosample and data management in a three-centre clinical study.

    PubMed

    Angelow, Aniela; Schmidt, Matthias; Weitmann, Kerstin; Schwedler, Susanne; Vogt, Hannes; Havemann, Christoph; Hoffmann, Wolfgang

    2008-07-01

    In our report we describe the concept, strategies and implementation of a central biosample and data management (CSDM) system in the three-centre clinical study of the Transregional Collaborative Research Centre "Inflammatory Cardiomyopathy - Molecular Pathogenesis and Therapy" SFB/TR 19, Germany. Following the requirements of high system resource availability, data security, privacy protection and quality assurance, a web-based CSDM was developed based on Java 2 Enterprise Edition using an Oracle database. An efficient and reliable sample documentation system using bar code labelling, a partitioning storage algorithm and online documentation software was implemented. An online electronic case report form is used to acquire patient-related data. Strict rules for access to the online applications and secure connections are used to account for privacy protection and data security. Challenges for the implementation of the CSDM resided at the project, technical and organisational levels as well as at the staff level.
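
    Two of the ingredients named above, bar-code labelling and a partitioning storage algorithm, can be sketched in a few lines. The identifier scheme and box geometry below are invented for illustration and are not the SFB/TR 19 conventions.

```python
# Hypothetical sketch: each sample gets a unique ID (printed as a bar-code
# label) and is assigned the next free slot in a partitioned freezer layout.
import itertools

_serial = itertools.count(1)

def new_sample_id(centre):                        # e.g. "GW" for a study centre
    return f"SFB19-{centre}-{next(_serial):06d}"  # invented ID scheme

def assign_position(sample_count, boxes=100, rows=9, cols=9):
    """Map the n-th sample to (box, row, col) by filling boxes in order."""
    per_box = rows * cols
    box, slot = divmod(sample_count, per_box)
    row, col = divmod(slot, cols)
    if box >= boxes:
        raise ValueError("storage full")
    return box + 1, row + 1, col + 1              # 1-based positions

sid = new_sample_id("GW")
print(sid, assign_position(0))   # first sample goes to box 1, row 1, col 1
```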

  10. EPA Facility Registry Service (FRS): TRI

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Toxic Release Inventory (TRI) System. TRI is a publicly available EPA database reported annually by certain covered industry groups, as well as federal facilities. It contains information about more than 650 toxic chemicals that are being used, manufactured, treated, transported, or released into the environment, and includes information about waste management and pollution prevention activities. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to TRI facilities once the TRI data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.

  11. MSeqDR: A Centralized Knowledge Repository and Bioinformatics Web Resource to Facilitate Genomic Investigations in Mitochondrial Disease.

    PubMed

    Shen, Lishuang; Diroma, Maria Angela; Gonzalez, Michael; Navarro-Gomez, Daniel; Leipzig, Jeremy; Lott, Marie T; van Oven, Mannis; Wallace, Douglas C; Muraresku, Colleen Clarke; Zolkipli-Cunningham, Zarazuela; Chinnery, Patrick F; Attimonelli, Marcella; Zuchner, Stephan; Falk, Marni J; Gai, Xiaowu

    2016-06-01

    MSeqDR is the Mitochondrial Disease Sequence Data Resource, a centralized and comprehensive genome and phenome bioinformatics resource built by the mitochondrial disease community to facilitate clinical diagnosis and research investigations of individual patient phenotypes, genomes, genes, and variants. A central Web portal (https://mseqdr.org) integrates community knowledge from expert-curated databases with genomic and phenotype data shared by clinicians and researchers. MSeqDR also functions as a centralized application server for Web-based tools to analyze data across both mitochondrial and nuclear DNA, including investigator-driven whole exome or genome dataset analyses through MSeqDR-Genesis. MSeqDR-GBrowse genome browser supports interactive genomic data exploration and visualization with custom tracks relevant to mtDNA variation and mitochondrial disease. MSeqDR-LSDB is a locus-specific database that currently manages 178 mitochondrial diseases, 1,363 genes associated with mitochondrial biology or disease, and 3,711 pathogenic variants in those genes. MSeqDR Disease Portal allows hierarchical tree-style disease exploration to evaluate their unique descriptions, phenotypes, and causative variants. Automated genomic data submission tools are provided that capture ClinVar compliant variant annotations. PhenoTips will be used for phenotypic data submission on deidentified patients using human phenotype ontology terminology. The development of a dynamic informed patient consent process to guide data access is underway to realize the full potential of these resources. © 2016 WILEY PERIODICALS, INC.

  12. MSeqDR: A Centralized Knowledge Repository and Bioinformatics Web Resource to Facilitate Genomic Investigations in Mitochondrial Disease

    PubMed Central

    Shen, Lishuang; Diroma, Maria Angela; Gonzalez, Michael; Navarro-Gomez, Daniel; Leipzig, Jeremy; Lott, Marie T.; van Oven, Mannis; Wallace, Douglas C.; Muraresku, Colleen Clarke; Zolkipli-Cunningham, Zarazuela; Chinnery, Patrick F.; Attimonelli, Marcella; Zuchner, Stephan

    2016-01-01

    MSeqDR is the Mitochondrial Disease Sequence Data Resource, a centralized and comprehensive genome and phenome bioinformatics resource built by the mitochondrial disease community to facilitate clinical diagnosis and research investigations of individual patient phenotypes, genomes, genes, and variants. A central Web portal (https://mseqdr.org) integrates community knowledge from expert-curated databases with genomic and phenotype data shared by clinicians and researchers. MSeqDR also functions as a centralized application server for Web-based tools to analyze data across both mitochondrial and nuclear DNA, including investigator-driven whole exome or genome dataset analyses through MSeqDR-Genesis. MSeqDR-GBrowse supports interactive genomic data exploration and visualization with custom tracks relevant to mtDNA variation and disease. MSeqDR-LSDB is a locus specific database that currently manages 178 mitochondrial diseases, 1,363 genes associated with mitochondrial biology or disease, and 3,711 pathogenic variants in those genes. MSeqDR Disease Portal allows hierarchical tree-style disease exploration to evaluate their unique descriptions, phenotypes, and causative variants. Automated genomic data submission tools are provided that capture ClinVar-compliant variant annotations. PhenoTips is used for phenotypic data submission on de-identified patients using human phenotype ontology terminology. Development of a dynamic informed patient consent process to guide data access is underway to realize the full potential of these resources. PMID:26919060

  13. The new geographic information system in ETVA VI.PE.

    NASA Astrophysics Data System (ADS)

    Xagoraris, Zafiris; Soulis, George

    2016-08-01

    ETVA VI.PE. S.A. is a member of the Piraeus Bank Group of Companies and its activities include designing, developing, exploiting and managing Industrial Areas throughout Greece. Inside ETVA VI.PE.'s thirty-one Industrial Parks there are currently 2,500 manufacturing companies established, with 40,000 employees and € 2.5 billion of invested funds. In each of the industrial areas ETVA VI.PE provides the companies with industrial lots of land (sites) with propitious building codes and complete infrastructure networks of water supply, sewerage, paved roads, power supply, communications, cleansing services, etc. The development of the Geographical Information System for ETVA VI.PE.'s Industrial Parks started at the beginning of 1992 and consists of three subsystems: Cadastre, which manages the information for the land acquisition of Industrial Areas; Street Layout - Sites, which manages the sites sold to manufacturing companies; and Networks, which manages the infrastructure networks (roads, water supply, sewerage, etc.). The mapping of each Industrial Park is made incorporating state-of-the-art photogrammetric, cartographic and surveying methods and techniques. Passing through the phases of initial design (hybrid GIS) and system upgrade (integrated GIS solution with spatial database), the system is currently operating on a new upgrade (integrated GIS solution with spatial database) that includes redesigning and merging the system's database schemas, along with the creation of central security policies, and the development of a new web GIS application for advanced data entry, highly customisable and standard reports, and dynamic interactive maps. The new GIS brings the company to advanced levels of productivity and introduces a new era for decision making and business management.

  14. The Role of IMAT Solutions for Training Development at the Royal Netherlands Air Force. IMAT Follow-up Research Part 1

    DTIC Science & Technology

    2005-09-01

    e.g. the transformation of a fragment to an instructional fragment. * IMAT Database: A Jasmine® database is used as the central database in IMAT for the...storage of fragments. This is an object-oriented relational database. Jasmine® was, amongst other factors, chosen for its ability to handle multimedia...to the Jasmine® database, which is used in IMAT as the central database. 3.1.1.1 Ontologies In IMAT, the proposed solution on problems with information

  15. Use of a secure Internet Web site for collaborative medical research.

    PubMed

    Marshall, W W; Haley, R W

    2000-10-11

    Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.
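
    Two of the listed measures, user-name/password authentication and audit trails, can be sketched using only the Python standard library. This is an illustrative fragment under invented schema and parameter choices, not the authors' implementation.

```python
# Sketch of salted password verification plus an audit trail of access
# attempts, stored in the same database the Web site would use.
import hashlib, os, sqlite3, time

def hash_password(password, salt):
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (name TEXT PRIMARY KEY, salt BLOB, pw_hash BLOB)")
con.execute("CREATE TABLE audit_trail (ts REAL, user TEXT, action TEXT)")

salt = os.urandom(16)
con.execute("INSERT INTO users VALUES (?, ?, ?)",
            ("collaborator1", salt, hash_password("correct horse", salt)))

def login(name, password):
    row = con.execute("SELECT salt, pw_hash FROM users WHERE name = ?", (name,)).fetchone()
    ok = row is not None and hash_password(password, row[0]) == row[1]
    con.execute("INSERT INTO audit_trail VALUES (?, ?, ?)",   # record every attempt
                (time.time(), name, "login ok" if ok else "login FAILED"))
    return ok

print(login("collaborator1", "correct horse"))   # True, and logged in audit_trail
```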

  16. Fish Karyome: A karyological information network database of Indian Fishes.

    PubMed

    Nagpure, Naresh Sahebrao; Pathak, Ajey Kumar; Pati, Rameshwar; Singh, Shri Prakash; Singh, Mahender; Sarkar, Uttam Kumar; Kushwaha, Basdeo; Kumar, Ravindra

    2012-01-01

    'Fish Karyome', a database of karyological information on Indian fishes, has been developed that serves as a central source for karyotype data about Indian fishes compiled from the published literature. Fish Karyome is intended to serve as a liaison tool for researchers and contains karyological information about 171 of the 2,438 finfish species reported in India; it is publicly available via the World Wide Web. The database provides information on chromosome number, morphology, sex chromosomes, karyotype formula and cytogenetic markers, etc. Additionally, it provides phenotypic information that includes species name, classification, locality of sample collection, common name, local name, sex, geographical distribution, and IUCN Red List status. Fish and karyotype images and references for the 171 finfish species have also been included in the database. Fish Karyome has been developed using SQL Server 2008, a relational database management system, Microsoft's ASP.NET-2008 and Macromedia's FLASH Technology under the Windows 7 operating environment. The system also enables users to input new information and images into the database, and to search and view the information and images of interest using various search options. Fish Karyome has a wide range of applications in species characterization and identification, sex determination, chromosomal mapping, karyo-evolution and systematics of fishes.

  17. FishTraits: a database of ecological and life-history traits of freshwater fishes of the United States

    USGS Publications Warehouse

    Angermeier, Paul L.; Frimpong, Emmanuel A.

    2011-01-01

    The need for integrated and widely accessible sources of species traits data to facilitate studies of ecology, conservation, and management has motivated development of traits databases for various taxa. In spite of the increasing number of traits-based analyses of freshwater fishes in the United States, no consolidated database of traits of this group exists publicly, and much useful information on these species is documented only in obscure sources. The largely inaccessible and unconsolidated traits information makes large-scale analysis involving many fishes and/or traits particularly challenging. We have compiled a database of > 100 traits for 809 (731 native and 78 nonnative) fish species found in freshwaters of the conterminous United States, including 37 native families and 145 native genera. The database, named FishTraits, contains information on four major categories of traits: (1) trophic ecology; (2) body size, reproductive ecology, and life history; (3) habitat preferences; and (4) salinity and temperature tolerances. Information on geographic distribution and conservation status was also compiled. The database enhances many opportunities for conducting research on fish species traits and constitutes the first step toward establishing a central repository for a continually expanding set of traits of North American fishes.

  18. Role of surgery in the management of patients with supratentorial spontaneous intracerebral hematoma: Critical appraisal of evidence.

    PubMed

    Akhigbe, Taiwo; Zolnourian, Ardalan

    2017-05-01

    Whether surgery improves the outcome more than medical management alone continues to be a subject of intense debate and controversy. However, there is optimism that the management of spontaneous supratentorial intracerebral haemorrhage will change in the future, based on new insight and better understanding of the acute pathophysiology of hematomas and their dynamics. Craniotomy as a surgical approach has been the most studied intervention for spontaneous supratentorial intracerebral haemorrhage, but with no significant benefit when compared to best medical management. A literature search was conducted using electronic databases including the Cochrane Central Register of Controlled Trials (CENTRAL) on the Cochrane Library, MEDLINE and EMBASE. In addition, critical appraisal of the most current evidence was carried out. About 1,387 articles were identified through the database search over a 10-year period, of which one systematic review and two randomised controlled trials most relevant to this review were critically appraised. The role of surgery in the management of spontaneous intracerebral haemorrhage still remains a matter of debate. There is insufficient evidence to justify a general policy of early surgery for patients with spontaneous intracerebral haemorrhage compared to initial medical management, but STICH did demonstrate that patients with superficial hematoma might benefit from craniotomy. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  19. The USA-NPN Information Management System: A tool in support of phenological assessments

    NASA Astrophysics Data System (ADS)

    Rosemartin, A.; Vazquez, R.; Wilson, B. E.; Denny, E. G.

    2009-12-01

    The USA National Phenology Network (USA-NPN) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and all aspects of environmental change. Data management and information sharing are central to the USA-NPN mission. The USA-NPN develops, implements, and maintains a comprehensive Information Management System (IMS) to serve the needs of the network, including the collection, storage and dissemination of phenology data, access to phenology-related information, tools for data interpretation, and communication among partners of the USA-NPN. The IMS includes components for data storage, such as the National Phenology Database (NPD), and several online user interfaces to accommodate data entry, data download, data visualization and catalog searches for phenology-related information. The IMS is governed by a set of standards to ensure security, privacy, data access, and data quality. The National Phenology Database is designed to efficiently accommodate large quantities of phenology data, to be flexible to the changing needs of the network, and to provide for quality control. The database stores phenology data from multiple sources (e.g., partner organizations, researchers and citizen observers), and provides for integration with legacy datasets. Several services will be created to provide access to the data, including reports, visualization interfaces, and web services. These services will provide integrated access to phenology and related information for scientists, decision-makers and general audiences. Phenological assessments at any scale will rely on secure and flexible information management systems for the organization and analysis of phenology data. The USA-NPN’s IMS can serve phenology assessments directly, through data management, and indirectly, as a model for large-scale integrated data management.

  20. Fifteen hundred guidelines and growing: the UK database of clinical guidelines.

    PubMed

    van Loo, John; Leonard, Niamh

    2006-06-01

    The National Library for Health offers a comprehensive searchable database of nationally approved clinical guidelines, called the Guidelines Finder. This resource, commissioned in 2002, is managed and developed by the University of Sheffield Health Sciences Library. The authors introduce the historical and political dimension of guidelines and the nature of guidelines as a mechanism to ensure clinical effectiveness in practice. The article then outlines the maintenance and organisation of the Guidelines Finder database itself, the criteria for selection, who publishes guidelines and guideline formats, usage of the Guidelines Finder service and finally looks at some lessons learnt from a local library offering a national service. Clinical guidelines are central to effective clinical practice at the national, organisational and individual level. The Guidelines Finder is one of the most visited resources within the National Library for Health and is successful in answering information needs related to specific patient care, clinical research, guideline development and education.

  1. Corporate governance and the adoption of health information technology within integrated delivery systems.

    PubMed

    Baird, Aaron; Furukawa, Michael F; Rahman, Bushra; Schneller, Eugene S

    2014-01-01

    Although several previous studies have found "system affiliation" to be a significant and positive predictor of health information technology (IT) adoption, little is known about the association between corporate governance practices and adoption of IT within U.S. integrated delivery systems (IDSs). Rooted in agency theory and corporate governance research, this study examines the association between corporate governance practices (centralization of IT decision rights and strategic alignment between business and IT strategy) and IT adoption, standardization, and innovation within IDSs. Cross-sectional, retrospective analyses using data from the 2011 Healthcare Information and Management Systems Society (HIMSS) Analytics Database on adoption within IDSs (N = 485) are used to analyze the correlation between two corporate governance constructs (centralization of IT decision rights and strategic alignment) and three IT constructs (adoption, standardization, and innovation) for clinical and supply chain IT. Multivariate fractional logit, probit, and negative binomial regressions are applied. Multivariate regressions controlling for IDS and market characteristics find that measures of IT adoption, IT standardization, and innovative IT adoption are significantly associated with centralization of IT decision rights and strategic alignment. Specifically, centralization of IT decision rights is associated with 22% higher adoption of Bar Coding for Materials Management and 30%-35% fewer IT vendors for Clinical Data Repositories and Materials Management Information Systems. A combination of centralization and clinical IT strategic alignment is associated with 50% higher Computerized Physician Order Entry adoption, and centralization along with supply chain IT strategic alignment is significantly negatively correlated with Radio Frequency Identification adoption. Although IT adoption and standardization are likely to benefit from corporate governance practices within IDSs, innovation is likely to be delayed. In addition, corporate governance is not one-size-fits-all, and contingencies are important considerations.

  2. Analyzing legacy U.S. Geological Survey geochemical databases using GIS: applications for a national mineral resource assessment

    USGS Publications Warehouse

    Yager, Douglas B.; Hofstra, Albert H.; Granitto, Matthew

    2012-01-01

    This report emphasizes geographic information system analysis and the display of data stored in the legacy U.S. Geological Survey National Geochemical Database for use in mineral resource investigations. Geochemical analyses of soils, stream sediments, and rocks that are archived in the National Geochemical Database provide an extensive data source for investigating geochemical anomalies. A study area in the Egan Range of east-central Nevada was used to develop a geographic information system analysis methodology for two different geochemical datasets involving detailed (Bureau of Land Management Wilderness) and reconnaissance-scale (National Uranium Resource Evaluation) investigations. ArcGIS was used to analyze and thematically map geochemical information at point locations. Watershed-boundary datasets served as a geographic reference to relate potentially anomalous sample sites with hydrologic unit codes at varying scales. The National Hydrography Dataset was analyzed with Hydrography Event Management and ArcGIS Utility Network Analyst tools to delineate potential sediment-sample provenance along a stream network. These tools can be used to track potential upstream-sediment-contributing areas to a sample site. This methodology identifies geochemically anomalous sample sites, watersheds, and streams that could help focus mineral resource investigations in the field.

  3. United States Army Medical Materiel Development Activity: 1997 Annual Report.

    DTIC Science & Technology

    1997-01-01

    business planning and execution information management system (Project Management Division Database (PMDD) and Product Management Database System (PMDS...MANAGEMENT • Project Management Division Database (PMDD), Product Management Database System (PMDS), and Special Users Database System: The existing...System (FMS), were investigated. New Product Managers and Project Managers were added into PMDS and PMDD. A separate division, Support, was

  4. An Integrated Korean Biodiversity and Genetic Information Retrieval System

    PubMed Central

    Lim, Jeongheui; Bhak, Jong; Oh, Hee-Mock; Kim, Chang-Bae; Park, Yong-Ha; Paek, Woon Kee

    2008-01-01

    Background On-line biodiversity information databases are growing quickly and being integrated into general bioinformatics systems due to the advances of fast gene sequencing technologies and the Internet. These can reduce the cost and effort of performing biodiversity surveys and genetic searches, which allows scientists to spend more time researching and less time collecting and maintaining data. This will cause an increased rate of knowledge build-up and improve conservation. The biodiversity databases in Korea have been scattered among several institutes and local natural history museums with incompatible data types. Therefore, a comprehensive database and a nationwide web portal for biodiversity information are necessary in order to integrate diverse information resources, including molecular and genomic databases. Results The Korean Natural History Research Information System (NARIS) was built and is operated as the central biodiversity information system to collect and integrate the biodiversity data of various institutes and natural history museums in Korea. This database aims to be an integrated resource that contains additional biological information, such as genome sequences and molecular level diversity. Currently, twelve institutes and museums in Korea are integrated via the DiGIR (Distributed Generic Information Retrieval) protocol, with the Darwin Core 2.0 format as its metadata standard for data exchange. Data quality control and statistical analysis functions have been implemented. In particular, integrating molecular and genetic information from the National Center for Biotechnology Information (NCBI) databases with NARIS was recently accomplished. NARIS can also be extended to accommodate other institutes abroad, and the whole system can be exported to establish local biodiversity management servers. Conclusion A Korean data portal, NARIS, has been developed to efficiently manage and utilize biodiversity data, which includes genetic resources. NARIS aims to be integral in maximizing bio-resource utilization for conservation, management, research, education, industrial applications, and integration with other bioinformation data resources. It can be found at . PMID:19091024

  5. Efficacy of Noninvasive Stellate Ganglion Blockade Performed Using Physical Agent Modalities in Patients with Sympathetic Hyperactivity-Associated Disorders: A Systematic Review and Meta-Analysis.

    PubMed

    Liao, Chun-De; Tsauo, Jau-Yih; Liou, Tsan-Hon; Chen, Hung-Chou; Rau, Chi-Lun

    2016-01-01

    Stellate ganglion blockade (SGB) is mainly used to relieve symptoms of neuropathic pain in conditions such as complex regional pain syndrome and has several potential complications. Noninvasive SGB performed using physical agent modalities (PAMs), such as light irradiation and electrical stimulation, can be clinically used as an alternative to conventional invasive SGB. However, its application protocols vary and its clinical efficacy remains controversial. This study investigated the use of noninvasive SGB for managing neuropathic pain or other disorders associated with sympathetic hyperactivity. We performed a comprehensive search of the following online databases: Medline, PubMed, Excerpta Medica Database, Cochrane Library Database, Ovid MEDLINE, Europe PubMed Central, EBSCOhost Research Databases, CINAHL, ProQuest Research Library, Physiotherapy Evidence Database, WorldWideScience, BIOSIS, and Google Scholar. We identified and included quasi-randomized or randomized controlled trials reporting the efficacy of SGB performed using therapeutic ultrasound, transcutaneous electrical nerve stimulation, light irradiation using low-level laser therapy, or xenon light or linearly polarized near-infrared light irradiation near or over the stellate ganglion region in treating complex regional pain syndrome or disorders requiring sympatholytic management. The included articles were subjected to a meta-analysis and risk of bias assessment. Nine randomized and four quasi-randomized controlled trials were included. Eleven trials had good methodological quality with a Physiotherapy Evidence Database (PEDro) score of ≥6, whereas the remaining two trials had a PEDro score of <6. The meta-analysis results revealed that the efficacy of noninvasive SGB on 100-mm visual analog pain score is higher than that of a placebo or active control (weighted mean difference, -21.59 mm; 95% CI, -34.25, -8.94; p = 0.0008). Noninvasive SGB performed using PAMs effectively relieves pain of various etiologies, making it a valuable addition to the contemporary pain management armamentarium. However, this evidence is limited by the potential risk of bias.
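
    For readers unfamiliar with the pooling arithmetic behind a figure like "-21.59 mm (95% CI, -34.25, -8.94)", the sketch below shows the standard fixed-effect inverse-variance method for a weighted mean difference. The review does not state its exact pooling model, and the trial values here are invented for illustration.

```python
# Inverse-variance (fixed-effect) pooling of a weighted mean difference.
import math

def pooled_wmd(trials):
    """trials: list of (mean_difference, standard_error) per study."""
    weights = [1.0 / se**2 for _, se in trials]
    wmd = sum(w * d for (d, _), w in zip(trials, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return wmd, (wmd - 1.96 * se, wmd + 1.96 * se)   # 95% CI

trials = [(-25.0, 8.0), (-18.0, 6.0), (-22.0, 10.0)]  # invented study results
wmd, ci = pooled_wmd(trials)
print(f"WMD = {wmd:.2f} mm, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```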

  6. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly with limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy to perform changes, and a centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the "middleware" in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.

  7. 48 CFR 52.232-33 - Payment by Electronic Funds Transfer-Central Contractor Registration.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... contained in the Central Contractor Registration (CCR) database. In the event that the EFT information changes, the Contractor shall be responsible for providing the updated information to the CCR database. (c... 210. (d) Suspension of payment. If the Contractor's EFT information in the CCR database is incorrect...

  8. Meta-All: a system for managing metabolic pathway information.

    PubMed

    Weise, Stephan; Grosse, Ivo; Klukas, Christian; Koschützki, Dirk; Scholz, Uwe; Schreiber, Falk; Junker, Björn H

    2006-10-23

    Many attempts are being made to understand biological subjects at a systems level. A major resource for these approaches are biological databases, storing manifold information about DNA, RNA and protein sequences including their functional and structural motifs, molecular markers, mRNA expression levels, metabolite concentrations, protein-protein interactions, phenotypic traits or taxonomic relationships. The use of these databases is often hampered by the fact that they are designed for special application areas and thus lack universality. Databases on metabolic pathways, which provide an increasingly important foundation for many analyses of biochemical processes at a systems level, are no exception to the rule. Data stored in central databases such as KEGG, BRENDA or SABIO-RK is often limited to read-only access. If experimentalists want to store their own data, possibly still under investigation, there are two possibilities. They can either develop their own information system for managing that data, which is very time-consuming and costly, or they can try to store their data in existing systems, which is often restricted. Hence, an out-of-the-box information system for managing metabolic pathway data is needed. We have designed META-ALL, an information system that allows the management of metabolic pathways, including reaction kinetics, detailed locations, environmental factors and taxonomic information. Data can be stored together with quality tags and in different parallel versions. META-ALL uses Oracle DBMS and Oracle Application Express. We provide the META-ALL information system for download and use. In this paper, we describe the database structure and give information about the tools for submitting and accessing the data. As a first application of META-ALL, we show how the information contained in a detailed kinetic model can be stored and accessed. META-ALL is a system for managing information about metabolic pathways. It facilitates the handling of pathway-related data and is designed to help biochemists and molecular biologists in their daily research. It is available on the Web at http://bic-gh.de/meta-all and can be downloaded free of charge and installed locally.
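
    As a reading aid, the following Python/SQLite sketch shows one way pathway records could carry the quality tags and parallel versions the abstract describes. META-ALL itself is built on Oracle DBMS and Oracle Application Express; the table and column names here are assumptions made for illustration.

    ```python
    import sqlite3

    # Minimal SQLite stand-in for a versioned, quality-tagged pathway store.
    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE reaction (
        reaction_id TEXT,
        version     INTEGER,
        equation    TEXT,
        km_mM       REAL,       -- example kinetic parameter
        quality_tag TEXT,       -- e.g. 'measured', 'estimated', 'literature'
        PRIMARY KEY (reaction_id, version)
    );
    """)
    con.executemany(
        "INSERT INTO reaction VALUES (?, ?, ?, ?, ?)",
        [
            ("R00299", 1, "glucose + ATP -> G6P + ADP", 0.10, "literature"),
            ("R00299", 2, "glucose + ATP -> G6P + ADP", 0.12, "measured"),
        ],
    )
    # Retrieve the latest version of each reaction together with its quality tag
    # (SQLite returns the row matching MAX(version) for the bare columns).
    for row in con.execute("""
        SELECT reaction_id, MAX(version), equation, km_mM, quality_tag
        FROM reaction GROUP BY reaction_id
    """):
        print(row)
    ```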

  9. Meta-All: a system for managing metabolic pathway information

    PubMed Central

    Weise, Stephan; Grosse, Ivo; Klukas, Christian; Koschützki, Dirk; Scholz, Uwe; Schreiber, Falk; Junker, Björn H

    2006-01-01

    Background Many attempts are being made to understand biological subjects at a systems level. A major resource for these approaches are biological databases, storing manifold information about DNA, RNA and protein sequences including their functional and structural motifs, molecular markers, mRNA expression levels, metabolite concentrations, protein-protein interactions, phenotypic traits or taxonomic relationships. The use of these databases is often hampered by the fact that they are designed for special application areas and thus lack universality. Databases on metabolic pathways, which provide an increasingly important foundation for many analyses of biochemical processes at a systems level, are no exception to the rule. Data stored in central databases such as KEGG, BRENDA or SABIO-RK is often limited to read-only access. If experimentalists want to store their own data, possibly still under investigation, there are two possibilities. They can either develop their own information system for managing that data, which is very time-consuming and costly, or they can try to store their data in existing systems, which is often restricted. Hence, an out-of-the-box information system for managing metabolic pathway data is needed. Results We have designed META-ALL, an information system that allows the management of metabolic pathways, including reaction kinetics, detailed locations, environmental factors and taxonomic information. Data can be stored together with quality tags and in different parallel versions. META-ALL uses Oracle DBMS and Oracle Application Express. We provide the META-ALL information system for download and use. In this paper, we describe the database structure and give information about the tools for submitting and accessing the data. As a first application of META-ALL, we show how the information contained in a detailed kinetic model can be stored and accessed. Conclusion META-ALL is a system for managing information about metabolic pathways. It facilitates the handling of pathway-related data and is designed to help biochemists and molecular biologists in their daily research. It is available on the Web at http://bic-gh.de/meta-all and can be downloaded free of charge and installed locally. PMID:17059592

  10. Lessons Learned Implementing DOORS in a Citrix Environment

    NASA Technical Reports Server (NTRS)

    Bussman, Marie

    2005-01-01

    NASA's James Webb Space Telescope (JWST) Project is a large multi-national project with geographically dispersed contractors that all need access to the Project's requirements database. Initially, the project utilized multiple DOORS databases with the built-in partitions feature to exchange modules amongst the various contractor sites. As the requirements databases matured, the use of partitions became extremely difficult. There have been many issues, such as incompatible versions of DOORS, an inefficient mechanism for sharing modules, security concerns, performance issues, and inconsistent document import and export formats. Deployment of the client software with limited IT resources available was also an issue. The solution chosen by JWST was to integrate the use of a Citrix environment with the DOORS database to address most of the project's concerns. The Citrix solution allowed a single requirements database to be accessed in a secure environment via a web interface. The Citrix environment allows JWST to upgrade to the most current version of DOORS without having to coordinate multiple sites and user upgrades. The single requirements database eliminated a multitude of configuration management concerns and facilitated the standardization of documentation formats. This paper discusses the obstacles and the lessons learned throughout the installation, implementation, usage and deployment process of a centralized DOORS database solution.

  11. Managing hybrid marketing systems.

    PubMed

    Moriarty, R T; Moran, U

    1990-01-01

    As competition increases and costs become critical, companies that once went to market only one way are adding new channels and using new methods - creating hybrid marketing systems. These hybrid marketing systems hold the promise of greater coverage and reduced costs. But they are also hard to manage; they inevitably raise questions of conflict and control: conflict because marketing units compete for customers; control because new indirect channels are less subject to management authority. Hard as they are to manage, however, hybrid marketing systems promise to become the dominant design, replacing the "purebred" channel strategy in all kinds of businesses. The trick to managing the hybrid is to analyze tasks and channels within and across a marketing system. A map - the hybrid grid - can help managers make sense of their hybrid system. What the chart reveals is that channels are not the basic building blocks of a marketing system; marketing tasks are. The hybrid grid forces managers to consider various combinations of channels and tasks that will optimize both cost and coverage. Managing conflict is also an important element of a successful hybrid system. Managers should first acknowledge the inevitability of conflict. Then they should move to bound it by creating guidelines that spell out which customers to serve through which methods. Finally, a marketing and sales productivity (MSP) system, consisting of a central marketing database, can act as the central nervous system of a hybrid marketing system, helping managers create customized channels and service for specific customer segments.

  12. The MycoBrowser portal: a comprehensive and manually annotated resource for mycobacterial genomes.

    PubMed

    Kapopoulou, Adamandia; Lew, Jocelyne M; Cole, Stewart T

    2011-01-01

    In this paper, we present the MycoBrowser portal (http://mycobrowser.epfl.ch/), a resource that provides both in silico generated and manually reviewed information within databases dedicated to the complete genomes of Mycobacterium tuberculosis, Mycobacterium leprae, Mycobacterium marinum and Mycobacterium smegmatis. A central component of MycoBrowser is TubercuList (http://tuberculist.epfl.ch), which has recently benefited from a new data management system and web interface. These improvements were extended to all MycoBrowser databases. We provide an overview of the functionalities available and the different ways of interrogating the data, then discuss how both the new information and the latest features are helping the mycobacterial research communities.

  13. The Data Acquisition System of the Stockholm Educational Air Shower Array

    NASA Astrophysics Data System (ADS)

    Hofverberg, P.; Johansson, H.; Pearce, M.; Rydstrom, S.; Wikstrom, C.

    2005-12-01

    The Stockholm Educational Air Shower Array (SEASA) project is deploying an array of plastic scintillator detector stations on school roofs in the Stockholm area. Signals from GPS satellites are used to time-synchronise signals from the widely separated detector stations, allowing cosmic ray air showers to be identified and studied. A low-cost and highly scalable data acquisition system has been produced using embedded Linux processors which communicate station data to a central server running a MySQL database. Air shower data can be visualised in real-time using a Java-applet client. It is also possible to query the database and manage detector stations from the client. In this paper, the design and performance of the system are described.
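
    A minimal sketch of the central-server side of such a system: stations upload GPS-timestamped hits, and shower candidates are picked out by time coincidence across stations. SEASA uses MySQL; SQLite stands in here, and the station IDs, timestamps and coincidence window are invented for illustration.

    ```python
    import sqlite3

    # Central store of GPS-timestamped hits uploaded by detector stations.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE hit (station_id INTEGER, gps_time_ns INTEGER)")
    con.executemany("INSERT INTO hit VALUES (?, ?)", [
        (1, 1_000_000_000),
        (2, 1_000_000_400),   # 400 ns after station 1 -> coincident
        (3, 5_000_000_000),   # isolated hit, no partner
    ])

    WINDOW_NS = 1_000  # assumed coincidence window between separated stations
    # Air-shower candidates: pairs of hits at different stations within the window.
    pairs = con.execute("""
        SELECT a.station_id, b.station_id, b.gps_time_ns - a.gps_time_ns
        FROM hit a JOIN hit b
          ON a.station_id < b.station_id
         AND ABS(b.gps_time_ns - a.gps_time_ns) <= ?
    """, (WINDOW_NS,)).fetchall()
    print("coincident station pairs:", pairs)
    ```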

  14. EPA Facility Registry System (FRS): NCES

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry System (FRS) for the subset of facilities that link to the National Center for Education Statistics (NCES). The primary federal database for collecting and analyzing data related to education in the United States and other nations, NCES is located in the U.S. Department of Education, within the Institute of Education Sciences. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to NCES school facilities once the NCES data has been integrated into the FRS database. Additional information on FRS is available at the EPA website http://www.epa.gov/enviro/html/fii/index.html.

  15. BioMart Central Portal: an open database network for the biological community.

    PubMed

    Guberman, Jonathan M; Ai, J; Arnaiz, O; Baran, Joachim; Blake, Andrew; Baldock, Richard; Chelala, Claude; Croft, David; Cros, Anthony; Cutts, Rosalind J; Di Génova, A; Forbes, Simon; Fujisawa, T; Gadaleta, E; Goodstein, D M; Gundem, Gunes; Haggarty, Bernard; Haider, Syed; Hall, Matthew; Harris, Todd; Haw, Robin; Hu, S; Hubbard, Simon; Hsu, Jack; Iyer, Vivek; Jones, Philip; Katayama, Toshiaki; Kinsella, R; Kong, Lei; Lawson, Daniel; Liang, Yong; Lopez-Bigas, Nuria; Luo, J; Lush, Michael; Mason, Jeremy; Moreews, Francois; Ndegwa, Nelson; Oakley, Darren; Perez-Llamas, Christian; Primig, Michael; Rivkin, Elena; Rosanoff, S; Shepherd, Rebecca; Simon, Reinhard; Skarnes, B; Smedley, Damian; Sperling, Linda; Spooner, William; Stevenson, Peter; Stone, Kevin; Teague, J; Wang, Jun; Wang, Jianxin; Whitty, Brett; Wong, D T; Wong-Erasmus, Marie; Yao, L; Youens-Clark, Ken; Yung, Christina; Zhang, Junjun; Kasprzyk, Arek

    2011-01-01

    BioMart Central Portal is a first-of-its-kind, community-driven effort to provide unified access to dozens of biological databases spanning genomics, proteomics, model organisms, cancer data, ontology information and more. Anybody can contribute an independently maintained resource to the Central Portal, allowing it to be exposed to and shared with the research community, and linking it with the other resources in the portal. Users can take advantage of the common interface to quickly utilize different sources without learning a new system for each. The system also simplifies cross-database searches that might otherwise require several complicated steps. Several integrated tools streamline common tasks, such as converting between ID formats and retrieving sequences. The combination of a wide variety of databases, an easy-to-use interface, robust programmatic access and the array of tools make Central Portal a one-stop shop for biological data querying. Here, we describe the structure of Central Portal and show example queries to demonstrate its capabilities.

  16. Literature Review and Database of Relations Between Salinity and Aquatic Biota: Applications to Bowdoin National Wildlife Refuge, Montana

    USGS Publications Warehouse

    Gleason, Robert A.; Tangen, Brian A.; Laubhan, Murray K.; Finocchiaro, Raymond G.; Stamm, John F.

    2009-01-01

    Long-term accumulation of salts in wetlands at Bowdoin National Wildlife Refuge (NWR), Mont., has raised concern among wetland managers that increasing salinity may threaten plant and invertebrate communities that provide important habitat and food resources for migratory waterfowl. Currently, the U.S. Fish and Wildlife Service (USFWS) is evaluating various water management strategies to help maintain suitable ranges of salinity to sustain plant and invertebrate resources of importance to wildlife. To support this evaluation, the USFWS requested that the U.S. Geological Survey (USGS) provide information on salinity ranges of water and soil for common plants and invertebrates on Bowdoin NWR lands. To address this need, we conducted a search of the literature on occurrences of plants and invertebrates in relation to salinity and pH of the water and soil. The compiled literature was used to (1) provide a general overview of salinity concepts, (2) document published tolerances and adaptations of biota to salinity, (3) develop databases that the USFWS can use to summarize the range of reported salinity values associated with plant and invertebrate taxa, and (4) perform database summaries that describe reported salinity ranges associated with plants and invertebrates at Bowdoin NWR. The purpose of this report is to synthesize information to facilitate a better understanding of the ecological relations between salinity and flora and fauna when developing wetland management strategies. A primary focus of this report is to provide information to help evaluate and address salinity issues at Bowdoin NWR; however, the accompanying databases, as well as concepts and information discussed, are applicable to other areas or refuges. The accompanying databases include salinity values reported for 411 plant taxa and 330 invertebrate taxa. The databases are available in Microsoft Excel version 2007 (http://pubs.usgs.gov/sir/2009/5098/downloads/databases_21april2009.xls) and contain 27 data fields that include variables such as taxonomic identification, values for salinity and pH, wetland classification, location of study, and source of data. The databases are not exhaustive of the literature and are biased toward wetland habitats located in the glaciated North-Central United States; however, the databases do encompass a diversity of biota commonly found in brackish and freshwater inland wetland habitats.

  17. Chinese patent medicine Fei-Liu-Ping ointment as an adjunctive treatment for non-small cell lung cancer: protocol for a systematic review.

    PubMed

    Zheng, Honggang; He, Shulin; Liu, Rui; Xu, Xinyao; Xu, Tao; Chen, Shuntai; Guo, Qiujun; Gao, Yebo; Hua, Baojin

    2017-01-16

    Fei-Liu-Ping ointment has been widely applied as an adjunctive drug in the treatment of non-small cell lung cancer (NSCLC). However, there has been no systematic review of research findings regarding the efficacy of this treatment. Here, we provide a protocol for assessing the effectiveness and safety of Fei-Liu-Ping ointment in the treatment of NSCLC. The electronic databases to be searched will include MEDLINE (PubMed), the Cochrane Central Register of Controlled Trials (CENTRAL) in the Cochrane Library, Excerpta Medica Database (EMBASE), China National Knowledge Infrastructure (CNKI), China Scientific Journal Database (VIP), Wanfang Database and the Chinese Biomedical Literature Database (CBM). Papers in English or Chinese published from inception to 2016 will be included without any restrictions. We will conduct a meta-analysis of randomised controlled trials if possible. The therapeutic effects according to the WHO standard for the treatment of solid tumours, and quality of life as evaluated by Karnofsky score and weight, will be the primary outcomes. We will also evaluate the data synthesis and risk of bias using Review Manager 5.3 software. The results of this review will offer implications for the use of Fei-Liu-Ping ointment as an adjunctive treatment for NSCLC. This knowledge will inform recommendations by surgeons and researchers who are interested in the treatment of NSCLC. The results of this systematic review will be disseminated through presentation at a conference and publication of the data in a peer-reviewed journal. PROSPERO CRD42016036911.

  18. A strong-motion database from the Central American subduction zone

    NASA Astrophysics Data System (ADS)

    Arango, Maria Cristina; Strasser, Fleur O.; Bommer, Julian J.; Hernández, Douglas A.; Cepeda, Jose M.

    2011-04-01

    Subduction earthquakes along the Pacific Coast of Central America generate considerable seismic risk in the region. The quantification of the hazard due to these events requires the development of appropriate ground-motion prediction equations, for which purpose a database of recordings from subduction events in the region is indispensable. This paper describes the compilation of a comprehensive database of strong ground-motion recordings obtained during subduction-zone events in Central America, focusing on the region from 8 to 14° N and 83 to 92° W, including Guatemala, El Salvador, Nicaragua and Costa Rica. More than 400 accelerograms recorded by the networks operating across Central America during the last decades have been added to data collected by NORSAR in two regional projects for the reduction of natural disasters. The final database consists of 554 triaxial ground-motion recordings from events of moment magnitudes between 5.0 and 7.7, including 22 interface and 58 intraslab-type events for the time period 1976-2006. Although the database presented in this study is not sufficiently complete in terms of magnitude-distance distribution to serve as a basis for the derivation of predictive equations for interface and intraslab events in Central America, it considerably expands the Central American subduction data compiled in previous studies and used in early ground-motion modelling studies for subduction events in this region. Additionally, the compiled database will allow the assessment of the existing predictive models for subduction-type events in terms of their applicability for the Central American region, which is essential for an adequate estimation of the hazard due to subduction earthquakes in this region.

  19. From metaphor to practices: The introduction of "information engineers" into the first DNA sequence database.

    PubMed

    García-Sancho, Miguel

    2011-01-01

    This paper explores the introduction of professional systems engineers and information management practices into the first centralized DNA sequence database, developed at the European Molecular Biology Laboratory (EMBL) during the 1980s. In so doing, it complements the literature on the emergence of an information discourse after World War II and its subsequent influence in biological research. By analyzing the careers of the database creators and the computer algorithms they designed, I show that from the mid-1960s onwards information in biology gradually shifted from a pervasive metaphor to being embodied in practices and professionals such as those incorporated at the EMBL. I then investigate the reception of these database professionals by the EMBL biological staff, which evolved from initial disregard to necessary collaboration as the relationship between DNA, genes, and proteins turned out to be more complex than expected. The trajectories of the database professionals at the EMBL suggest that the initial subject matter of the historiography of genomics should be the long-standing practices that emerged after World War II and to a large extent originated outside biomedicine and academia. Only after addressing these practices may historians turn to their further disciplinary assemblage in fields such as bioinformatics or biotechnology.

  20. A web-based system architecture for ontology-based data integration in the domain of IT benchmarking

    NASA Astrophysics Data System (ADS)

    Pfaff, Matthias; Krcmar, Helmut

    2018-03-01

    In the domain of IT benchmarking (ITBM), a variety of data and information are collected. Although these data serve as the basis for business analyses, no unified semantic representation of such data yet exists. Consequently, data analysis across different distributed data sets and different benchmarks is almost impossible. This paper presents a system architecture and prototypical implementation for integrated data management of distributed databases based on a domain-specific ontology. To preserve the semantic meaning of the data, the ITBM ontology is linked to data sources and functions as the central concept for database access. Thus, additional databases can be integrated by linking them to this domain-specific ontology and are directly available for further business analyses. Moreover, the web-based system supports the process of mapping ontology concepts to external databases by introducing a semi-automatic mapping recommender and by visualizing possible mapping candidates. The system also provides a natural language interface to easily query linked databases. The expected result of this ontology-based approach of knowledge representation and data access is an increase in knowledge and data sharing in this domain, which will enhance existing business analysis methods.
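
    The semi-automatic mapping recommender can be illustrated in a few lines of Python: rank candidate database columns for each ontology concept by lexical similarity, leaving the final choice to the user. The concept and column names below are invented; the actual ITBM ontology and benchmark schemas differ.

    ```python
    import difflib

    # Invented ontology concepts and external-database columns for illustration.
    ontology_concepts = ["annual_it_budget", "number_of_servers", "helpdesk_tickets"]
    db_columns = ["it_budget_eur", "server_count", "tickets_per_year", "company_name"]

    def recommend(concept, columns, n=2, cutoff=0.3):
        """Return up to n candidate columns ranked by string similarity."""
        scored = [(difflib.SequenceMatcher(None, concept, c).ratio(), c)
                  for c in columns]
        return [c for score, c in sorted(scored, reverse=True)[:n]
                if score >= cutoff]

    # Each concept gets a ranked shortlist; a human confirms or rejects it.
    for concept in ontology_concepts:
        print(concept, "->", recommend(concept, db_columns))
    ```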

  1. Evaluation of wildlife-habitat relationships data base for predicting bird community composition in central California chaparral and blue oak woodlands

    USGS Publications Warehouse

    Avery, M.L.; van Riper, Charles

    1990-01-01

    The California Wildlife-Habitat Relationships (WHR) database can be used to assist resource managers to evaluate effects of habitat manipulations on wildlife. The accuracy of predictions from WHR was evaluated using data from bird surveys conducted during winter and spring 1984 and 1985 in chamise (Adenostoma fasciculatum) chaparral, mixed chaparral and blue oak (Quercus douglasii) woodland. Considerable variability between habitat types was found for errors both of commission and of omission.

  2. An Analysis of the United States Naval Aviation Schedule Removal Component (SRC) Card Process

    DTIC Science & Technology

    2009-12-01

    JSF has the ability to communicate in flight with its maintenance system, ALIS. Its Prognostic Health Management (PHM) System abilities allow it to...end-users. PLCS allows users of the system, through a central database, visibility of a component's history and lifecycle data. Since both OOMA...used in PLM systems. This research recommends a PLM system that is Web-based and uses DoD-mandated UID technology as the future for data

  3. Flow experience in teams: The role of shared leadership.

    PubMed

    Aubé, Caroline; Rousseau, Vincent; Brunelle, Eric

    2018-04-01

    The present study tests a multilevel mediation model concerning the effect of shared leadership on team members' flow experience. Specifically, we investigate the mediating role of teamwork behaviors in the relationships between 2 complementary indicators of shared leadership (i.e., density and centralization) and flow. Based on a multisource approach, we collected data through observation and survey of 111 project teams (521 individuals) made up of university students participating in a project management simulation. The results show that density and centralization have both an additive effect and an interaction effect on teamwork behaviors, such that the relationship between density and teamwork behaviors is stronger when centralization is low. In addition, teamwork behaviors play a mediating role in the relationship between shared leadership and flow. Overall, the findings highlight the importance of promoting team-based shared leadership in organizations to favor the flow experience.

  4. Exploring the potential offered by legacy soil databases for ecosystem services mapping of Central African soils

    NASA Astrophysics Data System (ADS)

    Verdoodt, Ann; Baert, Geert; Van Ranst, Eric

    2014-05-01

    Central African soil resources are characterised by a large variability, ranging from stony, shallow or sandy soils with poor life-sustaining capabilities to highly weathered soils that recycle and support large amounts of biomass. Socio-economic drivers within this largely rural region foster inappropriate land use and management, threaten soil quality and finally culminate in declining soil productivity and increasing food insecurity. For the development of sustainable land use strategies targeting development planning and natural hazard mitigation, decision makers often rely on legacy soil maps and soil profile databases. Recent projects financed through development cooperation led to the design of soil information systems for Rwanda, D.R. Congo, and (ongoing) Burundi. A major challenge is to exploit these existing soil databases and convert them into soil inference systems through an optimal combination of digital soil mapping techniques, land evaluation tools, and biogeochemical models. This presentation aims at (1) highlighting some key characteristics of typical Central African soils, (2) assessing the positional, geographic and semantic quality of the soil information systems, and (3) revealing the potential impact of this quality on the use of these datasets for thematic mapping of soil ecosystem services (e.g. organic carbon storage, pH buffering capacity). Soil map quality is assessed considering positional and semantic quality, as well as geographic completeness. Descriptive statistics, decision tree classification and linear regression techniques are used to mine the soil profile databases. Geo-matching as well as class-matching approaches are considered when developing thematic maps. Variability in inherent as well as dynamic soil properties within the soil taxonomic units is highlighted. It is hypothesized that within-unit variation in soil properties highly affects the use and interpretation of thematic maps for ecosystem services mapping. Results will mainly be based on analyses done in Rwanda, but can be complemented with ongoing research results or prospects for Burundi.

  5. Optimizing literature search in systematic reviews - are MEDLINE, EMBASE and CENTRAL enough for identifying effect studies within the area of musculoskeletal disorders?

    PubMed

    Aagaard, Thomas; Lund, Hans; Juhl, Carsten

    2016-11-22

    When conducting systematic reviews, it is essential to perform a comprehensive literature search to identify all published studies relevant to the specific research question. The Cochrane Collaboration's Methodological Expectations of Cochrane Intervention Reviews (MECIR) guidelines state that searching MEDLINE, EMBASE and CENTRAL should be considered mandatory. The aim of this study was to evaluate the MECIR recommendation to use MEDLINE, EMBASE and CENTRAL combined, and examine the yield of using these to find randomized controlled trials (RCTs) within the area of musculoskeletal disorders. Data sources were systematic reviews published by the Cochrane Musculoskeletal Review Group that included at least five RCTs, reported a search history, searched MEDLINE, EMBASE and CENTRAL, and added reference- and hand-searching. Additional databases were deemed eligible if they indexed RCTs, were in English and were used in more than three of the systematic reviews. Relative recall was calculated as the number of studies identified by the literature search divided by the number of eligible studies, i.e. the studies included in the individual systematic reviews. Finally, cumulative median recall was calculated for MEDLINE, EMBASE and CENTRAL combined, followed by the databases yielding additional studies. Twenty-three systematic reviews were deemed eligible, and the databases included other than MEDLINE, EMBASE and CENTRAL were AMED, CINAHL, HealthSTAR, MANTIS, OT-Seeker, PEDro, PsychINFO, SCOPUS, SportDISCUS and Web of Science. Cumulative median recall for combined searching in MEDLINE, EMBASE and CENTRAL was 88.9% and increased to 90.9% when adding 10 additional databases. Searching MEDLINE, EMBASE and CENTRAL was not sufficient for identifying all effect studies on musculoskeletal disorders, but the ten additional databases increased the median recall by only 2%. It is possible that searching databases is not sufficient to identify all relevant references, and that reviewers must rely upon additional sources in their literature search. However, further research is needed.
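
    A small Python sketch of the recall arithmetic used here, with invented per-review counts: relative recall is the number of studies a search strategy identifies divided by the number of eligible (included) studies, and the median is taken across reviews.

    ```python
    from statistics import median

    # Invented per-review counts to illustrate the calculation; the study's
    # actual data come from 23 Cochrane Musculoskeletal reviews.
    reviews = [
        {"identified_big3": 9, "identified_all": 10, "eligible": 10},
        {"identified_big3": 7, "identified_all": 7,  "eligible": 8},
        {"identified_big3": 5, "identified_all": 6,  "eligible": 6},
    ]

    # Relative recall = identified / eligible, per review.
    recall_big3 = [r["identified_big3"] / r["eligible"] for r in reviews]
    recall_all = [r["identified_all"] / r["eligible"] for r in reviews]

    print(f"median recall, MEDLINE+EMBASE+CENTRAL: {median(recall_big3):.1%}")
    print(f"median recall, all databases:          {median(recall_all):.1%}")
    ```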

  6. Dynamic biosignal management and transmission during telemedicine incidents handled by Mobile Units over diverse network types.

    PubMed

    Mandellos, George J; Koutelakis, George V; Panagiotakopoulos, Theodor C; Koukias, Andreas M; Koukias, Mixalis N; Lymberopoulos, Dimitrios K

    2008-01-01

    Early and specialized pre-hospital patient treatment improves outcomes in terms of mortality and morbidity in emergency cases. This paper focuses on the design and implementation of a telemedicine system that supports diverse types of endpoints, including moving transports (MT) (ambulances, ships, planes, etc.), handheld devices and fixed units, using diverse communication networks. The target of this telemedicine system is pre-hospital patient treatment. While vital-sign transmission takes priority over the other services provided by the telemedicine system (videoconference, remote management, voice calls, etc.), a predefined algorithm controls the provision and quality of those services. A distributed database system controlled by a central server aims to manage patient attributes, exams and incidents handled by different Telemedicine Coordination Centers (TCC).
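
    A toy Python sketch of such a priority rule, under the assumption (consistent with the abstract but not taken from it) that vital-sign transmission is admitted first and lower-priority services only receive whatever bandwidth remains; the service names and bandwidth figures are invented.

    ```python
    # Services in priority order with assumed bandwidth needs in kbit/s.
    SERVICES = [
        ("vital_signs", 32),        # always served first
        ("voice_call", 64),
        ("remote_management", 128),
        ("videoconference", 384),
    ]

    def admit_services(available_kbps):
        """Grant services in priority order until the link budget runs out."""
        admitted, budget = [], available_kbps
        for name, need in SERVICES:
            if need <= budget:
                admitted.append(name)
                budget -= need
        return admitted

    print(admit_services(available_kbps=100))   # narrowband mobile link
    print(admit_services(available_kbps=1000))  # broadband/satellite link
    ```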

  7. Database documentation of marine mammal stranding and mortality: current status review and future prospects.

    PubMed

    Chan, Derek K P; Tsui, Henry C L; Kot, Brian C W

    2017-11-21

    Databases are systematic tools to archive and manage information related to marine mammal stranding and mortality events. Stranding response networks, governmental authorities and non-governmental organizations have established regional or national stranding networks and have developed unique standard stranding response and necropsy protocols to document and track stranded marine mammal demographics, signalment and health data. The objectives of this study were to (1) describe and review the current status of marine mammal stranding and mortality databases worldwide, including the year established, types of database and their goals; and (2) summarize the geographic range included in the database, the number of cases recorded, accessibility, filter and display methods. Peer-reviewed literature was searched, focussing on published databases of live and dead marine mammal strandings and mortality and information released from stranding response organizations (i.e. online updates, journal articles and annual stranding reports). Databases that were not published in the primary literature or recognized by government agencies were excluded. Based on these criteria, 10 marine mammal stranding and mortality databases were identified, and strandings and necropsy data found in these databases were evaluated. We discuss the results, limitations and future prospects of database development. Future prospects include the development and application of virtopsy, a new necropsy investigation tool. A centralized web-accessed database of all available postmortem multimedia from stranded marine mammals may eventually support marine conservation and policy decisions, which will allow the use of marine animals as sentinels of ecosystem health, working towards a 'One Ocean-One Health' ideal.

  8. ACToR: Aggregated Computational Toxicology Resource (T) ...

    EPA Pesticide Factsheets

    The EPA Aggregated Computational Toxicology Resource (ACToR) is a set of databases compiling information on chemicals in the environment from a large number of public and in-house EPA sources. ACToR has 3 main goals: (1) To serve as a repository of public toxicology information on chemicals of interest to the EPA, and in particular to be a central source for the testing data on all chemicals regulated by all EPA programs; (2) To be a source of in vivo training data sets for building in vitro to in vivo computational models; (3) To serve as a central source of chemical structure and identity information for the ToxCastTM and Tox21 programs. There are 4 main databases, all linked through a common set of chemical information and a common structure linking chemicals to assay data: the public ACToR system (available at http://actor.epa.gov), the ToxMiner database holding ToxCast and Tox21 data, along with results from statistical analyses on these data; the Tox21 chemical repository which is managing the ordering and sample tracking process for the larger Tox21 project; and the public version of ToxRefDB. The public ACToR system contains information on ~500K compounds with toxicology, exposure and chemical property information from >400 public sources. The web site is visited by ~1,000 unique users per month and generates ~1,000 page requests per day on average. The databases are built on open source technology, which has allowed us to export them to a number of col

  9. Applications of GIS and database technologies to manage a Karst Feature Database

    USGS Publications Warehouse

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to developing GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
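
    The transaction-plus-log pattern described here can be sketched in a few lines of Python, with SQLite standing in for the production DBMS; the schema, field names and audit policy below are illustrative assumptions, not the KFD's actual design.

    ```python
    import sqlite3

    # Feature table plus a data log; an insert and its audit entry either
    # commit together or roll back together.
    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE karst_feature (feature_id TEXT PRIMARY KEY, type TEXT,
                                lat REAL, lon REAL);
    CREATE TABLE data_log (ts TEXT DEFAULT CURRENT_TIMESTAMP,
                           action TEXT, feature_id TEXT);
    """)

    def insert_feature(feature_id, ftype, lat, lon):
        try:
            with con:  # one transaction: feature row + log row
                con.execute("INSERT INTO karst_feature VALUES (?, ?, ?, ?)",
                            (feature_id, ftype, lat, lon))
                con.execute("INSERT INTO data_log (action, feature_id) "
                            "VALUES (?, ?)", ("INSERT", feature_id))
        except sqlite3.IntegrityError:
            print(f"rolled back duplicate feature {feature_id}")

    insert_feature("SK-001", "sinkhole", 43.98, -92.46)
    insert_feature("SK-001", "sinkhole", 43.98, -92.46)  # rejected, not logged
    print(con.execute("SELECT action, feature_id FROM data_log").fetchall())
    ```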

  10. ACToR-Aggregated Computational Resource | Science ...

    EPA Pesticide Factsheets

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food & Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high throughput environmental chemical screening and prioritization program called ToxCast(TM).

  11. ACToR - Aggregated Computational Toxicology Resource

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judson, Richard; Richard, Ann; Dix, David

    2008-11-15

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food and Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high-throughput environmental chemical screening and prioritization program called ToxCast™.

  12. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    ERIC Educational Resources Information Center

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  13. 75 FR 60415 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-30

    ... computer systems and networks. This information collection is required to obtain the necessary data... card reflecting those benefits and privileges, and to maintain a centralized database of the eligible...

  14. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal

    PubMed Central

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M. Juliana; Hural, John

    2014-01-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control, and sample quality data is uploaded directly to the database by the central laboratory. Four-year cumulative data covering 23,477 blood draws reveal an average fresh PBMC yield of 1.45×10⁶ ± 0.48×10⁶ cells per milliliter of usable whole blood; 95% of samples were within the acceptable range for fresh cell yield of 0.8–3.2×10⁶ cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day-2 thawed viability of 83.1% and recovery of 67.5%. Since then, four-year cumulative data covering 3338 specimens used in immunologic assays show that 99.88% had acceptable viabilities (>66%) for use in cellular assays (mean, 91.46% ±4.5%), and 96.2% had acceptable recoveries (50%–130%) with a mean recovery of 85.8% ±19.12% of the originally cryopreserved cells. EQC testing revealed that since August 2009, failed recoveries dropped from 4.1% to 1.6% and failed viabilities dropped from 1.0% to 0.3%. The HVTN PBMC quality program provides for laboratory assessment, training and tools for identifying problems, implementing corrective action and monitoring for improvements. These data support the benefits of implementing a comprehensive, web-based PBMC quality program for large clinical trials networks. PMID:24709391
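
    The acceptance thresholds quoted in this abstract (viability >66%, recovery 50–130%, fresh yield 0.8–3.2×10⁶ cells/ml) translate directly into a simple QC check; the Python below is an illustrative re-implementation of those rules, not the HVTN portal's code.

    ```python
    # QC acceptance checks mirroring the thresholds quoted in the abstract.
    # Function and parameter names are invented for illustration.

    def pbmc_qc(viability_pct, recovery_pct, yield_per_ml):
        checks = {
            "viability": viability_pct > 66.0,          # >66% viable
            "recovery": 50.0 <= recovery_pct <= 130.0,  # 50-130% recovered
            "fresh_yield": 0.8e6 <= yield_per_ml <= 3.2e6,
        }
        return checks, all(checks.values())

    # Example using the cohort means reported in the abstract.
    checks, ok = pbmc_qc(viability_pct=91.46, recovery_pct=85.8,
                         yield_per_ml=1.45e6)
    print(checks, "-> pass" if ok else "-> fail")
    ```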

  15. Ureteral endometriosis: A systematic literature review

    PubMed Central

    Palla, Viktoria-Varvara; Karaolanis, Georgios; Katafigiotis, Ioannis; Anastasiou, Ioannis

    2017-01-01

    Introduction: Ureteral endometriosis is a rare disease affecting women of childbearing age which presents with nonspecific symptoms and it may result in severe morbidity. The aim of this study was to review evidence about incidence, pathogenesis, clinical presentation, diagnosis, and management of ureteral endometriosis. Materials and Methods: PubMed Central database was searched to identify studies reporting cases of ureteral endometriosis. “Ureter” or “Ureteral” and “Endometriosis” were used as key words. Database was searched for articles published since 1996, in English without restrictions regarding the study design. Results: From 420 studies obtained through database search, 104 articles were finally included in this review, including a total of 1384 patients with ureteral endometriosis. Data regarding age, location, pathological findings, and interventions were extracted. Mean patients' age was 38.6 years, whereas the therapeutic arsenal included hormonal, endoscopic, and/or surgical treatment. Conclusions: Ureteral endometriosis represents a diagnostic and therapeutic challenge for the clinicians and high clinical suspicion is needed to identify it. PMID:29021650

  16. Group updates Gravity Database for central Andes

    NASA Astrophysics Data System (ADS)

    MIGRA Group; Götze, H.-J.

    Between 1993 and 1995 a group of scientists from Chile, Argentina, and Germany incorporated some 2000 new gravity observations into a database that covers a remote region of the Central Andes in northern Chile and northwestern Argentina (between 64°-71°W and 20°-29°S). The database can be used to study the structure and evolution of the Andes. About 14,000 gravity values are included in the database, including older, reprocessed data. Researchers at universities or governmental agencies are welcome to use the data for noncommercial purposes.

  17. Adult HIV care resources, management practices and patient characteristics in the Phase 1 IeDEA Central Africa cohort

    PubMed Central

    Divaris, Kimon; Newman, Jamie; Hemingway-Foday, Jennifer; Akam, Wilfred; Balimba, Ashu; Dusengamungu, Cyrille; Kalenga, Lucien; Mbaya, Marcel; Molu, Brigitte Mfangam; Mugisha, Veronicah; Mukumbi, Henri; Mushingantahe, Jules; Nash, Denis; Niyongabo, Théodore; Atibu, Joseph; Azinyue, Innocent; Kiumbu, Modeste; Woelk, Godfrey

    2012-01-01

    Introduction Despite recent advances in the management of HIV infection and increased access to treatment, prevention, care and support, the HIV/AIDS epidemic continues to be a major global health problem, with sub-Saharan Africa suffering by far the greatest humanitarian, demographic and socio-economic burden of the epidemic. Information on HIV/AIDS clinical care and established cohorts’ characteristics in the Central Africa region are sparse. Methods A survey of clinical care resources, management practices and patient characteristics was undertaken among 12 adult HIV care sites in four countries of the International Epidemiologic Databases to Evaluate AIDS Central Africa (IeDEA-CA) Phase 1 regional network in October 2009. These facilities served predominantly urban populations and offered primary care in the Democratic Republic of Congo (DRC; six sites), secondary care in Rwanda (two sites) and tertiary care in Cameroon (three sites) and Burundi (one site). Results Despite some variation in facility characteristics, sites reported high levels of monitoring resources, including electronic databases, as well as linkages to prevention of mother-to-child HIV transmission programs. At the time of the survey, there were 21,599 HIV-positive adults (median age=37 years) enrolled in the clinical cohort. Though two-thirds were women, few adults (6.5%) entered HIV care through prevention of mother-to-child transmission services, whereas 55% of the cohort entered care through voluntary counselling and testing. Two-thirds of patients at sites in Cameroon and DRC were in WHO Stage III and IV at baseline, whereas nearly all patients in the Rwanda facilities with clinical stage information available were in Stage I and II. WHO criteria were used for antiretroviral therapy initiation. The most common treatment regimen was stavudine/lamivudine/nevirapine (64%), followed by zidovudine/lamivudine/nevirapine (19%). Conclusions Our findings demonstrate the feasibility of establishing large clinical cohorts of HIV-positive individuals in a relatively short amount of time in spite of challenges experienced by clinics in resource-limited settings such as those in this region. Country differences in the cohort's site and patient characteristics were noted. This information sets the stage for the development of research initiatives and additional programs to enhance adult HIV care and treatment in Central Africa. PMID:23199800

  18. Data entry module and manuals for the Land Treatment Digital Library

    USGS Publications Warehouse

    Welty, Justin L.; Pilliod, David S.

    2013-01-01

    Across the country, public land managers make decisions each year that influence landscapes and ecosystems within their jurisdictions. Many of these decisions involve vegetation manipulations, which often are referred to as land treatments. These treatments include removal or alteration of plant biomass, seeding of burned areas, application of herbicides, and other activities. Data documenting these land treatments usually are stored at local management offices in various formats. Therefore, anyone interested in the types and effects of land treatments across multiple jurisdictions must first assemble the information, which can be difficult if data discovery and organization involve multiple local offices. A centralized system for storing and accessing the data helps inform land managers when making policy and management considerations and assists scientists in developing sampling designs and studies. The Land Treatment Digital Library (LTDL) was created by the U.S. Geological Survey (USGS) as a comprehensive database incorporating tabular data, documentation, photographs, and spatial data about land treatments in a single system. It was developed over a period of several years and refined based on feedback from partner agencies and stakeholders. Currently, Bureau of Land Management (BLM) land treatment data are being entered by USGS personnel as part of a memorandum of understanding between the USGS and BLM. The LTDL has a website maintained by the USGS Forest and Rangeland Ecosystem Science Center where LTDL data can be viewed http://ltdl.wr.usgs.gov/. The resources and information provided in this data series allow other agencies, organizations, and individuals to download an empty, stand-alone LTDL database to individual or networked computers. Data entered in these databases may be submitted to the USGS for possible inclusion in the online LTDL. Multiple computer programs are used to accomplish the objective of the LTDL. The support of an information-technology specialist or professionals familiar with Microsoft Access™, ESRI’s ArcGIS™, Python, Adobe Acrobat Professional™, and computer settings is essential when installing and operating the LTDL. After the program is operational, a critical element for successful data entry is an understanding of the difference between database tables and forms, and how to edit data in both formats. Complete instructions accompany the program, and they should be followed carefully to ensure the setup and operation of the database goes smoothly.

  19. 76 FR 59170 - Hartford Financial Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-23

    ... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT; Notice of Negative... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, Connecticut (The Hartford, Corporate/EIT/CTO Database Management Division). The negative determination was issued on August 19, 2011...

  20. Efficacy of Noninvasive Stellate Ganglion Blockade Performed Using Physical Agent Modalities in Patients with Sympathetic Hyperactivity-Associated Disorders: A Systematic Review and Meta-Analysis

    PubMed Central

    Liao, Chun-De; Tsauo, Jau-Yih; Liou, Tsan-Hon

    2016-01-01

    Background Stellate ganglion blockade (SGB) is mainly used to relieve symptoms of neuropathic pain in conditions such as complex regional pain syndrome and has several potential complications. Noninvasive SGB performed using physical agent modalities (PAMs), such as light irradiation and electrical stimulation, can be clinically used as an alternative to conventional invasive SGB. However, its application protocols vary and its clinical efficacy remains controversial. This study investigated the use of noninvasive SGB for managing neuropathic pain or other disorders associated with sympathetic hyperactivity. Materials and Methods We performed a comprehensive search of the following online databases: Medline, PubMed, Excerpta Medica Database, Cochrane Library Database, Ovid MEDLINE, Europe PubMed Central, EBSCOhost Research Databases, CINAHL, ProQuest Research Library, Physiotherapy Evidence Database, WorldWideScience, BIOSIS, and Google Scholar. We identified and included quasi-randomized or randomized controlled trials reporting the efficacy of SGB performed using therapeutic ultrasound, transcutaneous electrical nerve stimulation, light irradiation using low-level laser therapy, or xenon light or linearly polarized near-infrared light irradiation near or over the stellate ganglion region in treating complex regional pain syndrome or disorders requiring sympatholytic management. The included articles were subjected to a meta-analysis and risk of bias assessment. Results Nine randomized and four quasi-randomized controlled trials were included. Eleven trials had good methodological quality with a Physiotherapy Evidence Database (PEDro) score of ≥6, whereas the remaining two trials had a PEDro score of <6. The meta-analysis results revealed that the efficacy of noninvasive SGB on 100-mm visual analog pain score is higher than that of a placebo or active control (weighted mean difference, −21.59 mm; 95% CI, −34.25, −8.94; p = 0.0008). Conclusions Noninvasive SGB performed using PAMs effectively relieves pain of various etiologies, making it a valuable addition to the contemporary pain management armamentarium. However, this evidence is limited by the potential risk of bias. PMID:27911934

  1. Medicinal plants used in the traditional management of diabetes and its sequelae in Central America: A review.

    PubMed

    Giovannini, Peter; Howes, Melanie-Jayne R; Edwards, Sarah E

    2016-05-26

    Globally 387 million people currently have diabetes and it is projected that this condition will be the 7th leading cause of death worldwide by 2030. As of 2012, its total prevalence in Central America (8.5%) was greater than the prevalence in most Latin American countries, and the population of this region widely uses herbal medicine. The aim of this study is to review the medicinal plants used to treat diabetes and its sequelae in seven Central American countries: Belize, Costa Rica, El Salvador, Guatemala, Honduras, Nicaragua and Panama. We conducted a literature review and extracted from primary sources the plant use reports in traditional remedies that matched one of the following disease categories: diabetes mellitus, kidney disease, urinary problems, skin diseases and infections, cardiovascular disease, sexual dysfunctions, visual loss, and nerve damage. Use reports were entered in a database and data were analysed in terms of the highest number of use reports for diabetes management and for the different sequelae. We also examined the scientific evidence that might support the local uses of the most reported species. Out of 535 identified species used to manage diabetes and its sequelae, 104 species are used to manage diabetes, and we found in vitro and in vivo preclinical experimental evidence of a hypoglycaemic effect for 16 of the 20 species reported by at least two sources. However, only seven of these species are reported in more than 3 studies: Momordica charantia L., Neurolaena lobata (L.) R. Br. ex Cass., Tecoma stans (L.) Juss. ex Kunth, Persea americana Mill., Psidium guajava L., Anacardium occidentale L. and Hamelia patens Jacq. Several of the species that are used to manage diabetes in Central America are also used to treat conditions that may arise as its consequence, such as kidney disease, urinary problems and skin conditions. This review provides an overview of the medicinal plants used to manage diabetes and its sequelae in Central America and of the current scientific knowledge that might explain their traditional use. In Central America a large number of medicinal plants are used to treat this condition and its sequelae, although relatively few species are widely used across the region. For the species used to manage diabetes, there is variation in the availability and quality of pharmacological, chemical and clinical studies to explain traditional use.

  2. Data Sharing in Astrobiology: the Astrobiology Habitable Environments Database (AHED)

    NASA Astrophysics Data System (ADS)

    Bristow, T.; Lafuente Valverde, B.; Keller, R.; Stone, N.; Downs, R. T.; Blake, D. F.; Fonda, M.; Pires, A.

    2016-12-01

    Astrobiology is a multidisciplinary area of scientific research focused on studying the origins of life on Earth and the conditions under which life might have emerged elsewhere in the universe. The understanding of complex questions in astrobiology requires integration and analysis of data spanning a range of disciplines including biology, chemistry, geology, astronomy and planetary science. However, the lack of a centralized repository makes it difficult for astrobiology teams to share data and benefit from resultant synergies. Moreover, in recent years, federal agencies have begun to require that the results of any federally funded scientific research be available and useful to the public and the science community. Astrobiology, like any other scientific discipline, needs to respond to these mandates. The Astrobiology Habitable Environments Database (AHED) is a central, high-quality, long-term searchable repository designed to help the community by promoting the integration and sharing of all the data generated by these diverse disciplines. AHED provides public, open access to astrobiology-related research data through a user-managed web portal implemented using the open-source software The Open Data Repository's (ODR) Data Publisher [1]. ODR-DP provides a user-friendly interface that research teams or individual scientists can use to design, populate and manage their own databases or laboratory notebooks according to the characteristics of their data. AHED is thus a collection of databases housed in the ODR framework that store information about samples, along with associated measurements, analyses, and contextual information about field sites where samples were collected, the instruments or equipment used for analysis, and the people and institutions involved in their collection. Advanced graphics are implemented together with online tools for data analysis (e.g., R, MATLAB, Project Jupyter - http://jupyter.org). A permissions system will be put in place so that, as data are being actively collected and interpreted, they will remain proprietary. A citation system will allow research data to be used and appropriately referenced by other researchers after the data are made public. This project is supported by SERA and NASA NNX11AP82A, MSL. [1] Stone et al. (2016) AGU, submitted.

  3. Influence of national centralization of oesophagogastric cancer on management and clinical outcome from emergency upper gastrointestinal conditions.

    PubMed

    Markar, S R; Mackenzie, H; Wiggins, T; Askari, A; Karthikesalingam, A; Faiz, O; Griffin, S M; Birkmeyer, J D; Hanna, G B

    2018-01-01

    In England in 2001 oesophagogastric cancer surgery was centralized. The aim of this study was to evaluate whether centralization of oesophagogastric cancer to high-volume centres has had an effect on mortality from different emergency upper gastrointestinal conditions. The Hospital Episode Statistics database was used to identify patients admitted to hospitals in England (1997-2012). The influence of oesophagogastric high-volume cancer centre status (20 or more resections per year) on 30- and 90-day mortality from oesophageal perforation, paraoesophageal hernia and perforated peptic ulcer was analysed. Over the study interval, 3707, 12 441 and 56 822 patients with oesophageal perforation, paraoesophageal hernia and perforated peptic ulcer respectively were included. There was a passive centralization to high-volume cancer centres for oesophageal perforation (26·9 per cent increase), paraoesophageal hernia (19·5 per cent increase) and perforated peptic ulcer (23·0 per cent increase). Management of oesophageal perforation in high-volume centres was associated with a reduction in 30-day (HR 0·58, 95 per cent c.i. 0·45 to 0·74) and 90-day (HR 0·62, 0·49 to 0·77) mortality. High-volume cancer centre status did not affect mortality from paraoesophageal hernia or perforated peptic ulcer. Annual emergency admission volume thresholds at which mortality improved were observed for oesophageal perforation (5 patients) and paraoesophageal hernia (11). Following centralization, the proportion of patients managed in high-volume cancer centres that reached this volume threshold was 88·0 per cent for oesophageal perforation, but only 30·3 per cent for paraoesophageal hernia. Centralization of low incidence conditions such as oesophageal perforation to high-volume cancer centres provides a greater level of expertise and ultimately reduces mortality. © 2017 BJS Society Ltd Published by John Wiley & Sons Ltd.

  4. WebEQ: a web-GIS System to collect, display and query data for the management of the earthquake emergency in Central Italy

    NASA Astrophysics Data System (ADS)

    Carbone, Gianluca; Cosentino, Giuseppe; Pennica, Francesco; Moscatelli, Massimiliano; Stigliano, Francesco

    2017-04-01

    After the strong earthquakes that hit central Italy in recent months, the Center for Seismic Microzonation and its applications (CentroMS) was commissioned by the Italian Department of Civil Protection to conduct the study of seismic microzonation of the territories affected by the earthquake of August 24, 2016. As part of the microzonation activities, IGAG CNR created WebEQ, a tool for managing the data acquired by all participants (i.e., more than twenty research institutes and university departments). The data collection was organized and divided into sub-areas, assigned to working groups with multidisciplinary expertise in geology, geophysics and engineering. WebEQ is a web-GIS system that supports all the subjects involved in the data collection activities, through tools aimed at data uploading and validation, and with a simple GIS interface to display, query and download geographic data. WebEQ is contributing to the creation of a large database containing geographical data, both vector and raster, from various sources and of various types: the Regional Technical Map; geological and geomorphological maps; data location maps; maps of microzones homogeneous in seismic perspective and seismic microzonation maps; and national strong motion network locations. Data loading is done through simple input masks that ensure consistency with the database structure, avoiding possible errors and helping users to interact with the map through user-friendly tools. All the data are thematized through standardized symbologies and colors (Gruppo di lavoro MS 2008), in order to allow easy interpretation by all users. The data download tools allow data exchange between working groups and let the scientific community benefit from the activities. The seismic microzonation activities are still ongoing. WebEQ is enabling easy management of large amounts of data and will form a basis for the development of tools for the management of upcoming seismic emergencies.

  5. 75 FR 57437 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-21

    ... a Food Safety Education and Training Materials Database. The Database is a centralized gateway to... creating previously available education materials) (2) provide a central gateway to access the education materials (3) create a systematic and efficient method of collecting data from USDA grantees and (4) promote...

  6. 78 FR 69040 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ... a Food Safety Education and Training Materials Database. The Database is a centralized gateway to... creating previously available education materials), (2) provide a central gateway to access the education materials, (3) create a systematic and efficient method of collecting data from USDA grantees, and (4...

  7. TWRS technical baseline database manager definition document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acree, C.D.

    1997-08-13

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

  8. Hold your horses: A comparison of human laryngomalacia with analogous equine airway pathology.

    PubMed

    Lawrence, Rachael J; Butterell, Matthew J; Constable, James D; Daniel, Matija

    2018-02-01

    Laryngomalacia is the most common cause of stridor in infants. Dynamic airway collapse is also a well-recognised entity in horses and an important cause of surgical veterinary intervention. We compare the aetiology, clinical features and management of human laryngomalacia with equine dynamic airway collapse. A structured review of the PubMed, Ovid MEDLINE and Cochrane Collaboration databases (Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews) was performed. There are numerous equine conditions that cause dynamic airway collapse, defined specifically by the anatomical structures involved. Axial Deviation of the Aryepiglottic Folds (ADAF) is the condition most clinically analogous to laryngomalacia in humans, and is likewise most prevalent in the immature equine airway. Both conditions are managed either conservatively or, if symptoms require it, with surgical intervention. The operative procedures performed for ADAF and laryngomalacia are technically comparable. Dynamic collapse of the equine larynx, especially ADAF, is clinically similar to human laryngomalacia, and both are treated in a similar fashion. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Towards a Global Service Registry for the World-Wide LHC Computing Grid

    NASA Astrophysics Data System (ADS)

    Field, Laurence; Alandes Pradillo, Maria; Di Girolamo, Alessandro

    2014-06-01

    The World-Wide LHC Computing Grid encompasses a set of heterogeneous information systems; from central portals such as the Open Science Grid's Information Management System and the Grid Operations Centre Database, to the WLCG information system, where the information sources are the Grid services themselves. Providing a consistent view of the information, which involves synchronising all these information systems, is a challenging activity that has led the LHC virtual organisations to create their own configuration databases. This experience, whereby each virtual organisation's configuration database interfaces with multiple information systems, has resulted in the duplication of effort, especially relating to the use of manual checks for the handling of inconsistencies. The Global Service Registry aims to address this issue by providing a centralised service that aggregates information from multiple information systems. It shows both information on registered resources (i.e. what should be there) and on available resources (i.e. what is there). The main purpose is to simplify the synchronisation of the virtual organisations' own configuration databases, which are used for job submission and data management, through the provision of a single interface for obtaining all the information. By centralising the information, automated consistency and validation checks can be performed to improve the overall quality of the information provided. Although internally the GLUE 2.0 information model is used for the purpose of integration, the Global Service Registry is not dependent on any particular information model for ingestion or dissemination. The intention is to allow the virtual organisations' configuration databases to be decoupled from the underlying information systems in a transparent way and hence simplify any possible future migration due to the evolution of those systems. This paper presents the Global Service Registry architecture, its advantages compared to the current situation and how it can support the evolution of information systems.
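
    The consistency checking described above amounts to comparing the set of registered resources against the set of actually available ones. A minimal sketch of that idea, with hypothetical endpoint names (real GSR records follow the GLUE 2.0 schema and carry far more attributes):

        # Hypothetical service endpoints; real records follow GLUE 2.0.
        registered = {"ce01.example.org", "se02.example.org", "ce03.example.org"}
        available = {"ce01.example.org", "se02.example.org", "se04.example.org"}

        missing = registered - available      # registered but not responding
        unexpected = available - registered   # running but never registered

        for host in sorted(missing):
            print(f"flag: {host} is registered but not available")
        for host in sorted(unexpected):
            print(f"flag: {host} is available but not registered")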

  10. [Quality management and participation into clinical database].

    PubMed

    Okubo, Suguru; Miyata, Hiroaki; Tomotaki, Ai; Motomura, Noboru; Murakami, Arata; Ono, Minoru; Iwanaka, Tadashi

    2013-07-01

    Quality management is necessary for establishing a useful clinical database in cooperation with healthcare professionals and facilities. The main management tasks are 1) progress management of data entry, 2) liaison with database participants (healthcare professionals), and 3) modification of the data collection form. In addition, healthcare facilities are expected to consider ethical issues and information security when joining clinical databases. Database participants should check with ethical review boards and consultation services for patients.

  11. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document-oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.

  12. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE PAGES

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    2018-03-19

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document-oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.
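
    The aggregation pipeline described in this record rolls unstructured job reports up into per-site performance metrics. A minimal sketch of the roll-up idea, using plain Python dictionaries instead of the real document database and Hadoop/Spark stack (field names are hypothetical):

        from collections import defaultdict

        # Hypothetical framework job reports; the real system stores JSON
        # documents and aggregates them on the Hadoop/Spark cluster.
        reports = [
            {"site": "T2_US_MIT", "status": "success", "cpu_hours": 3.2},
            {"site": "T2_US_MIT", "status": "failure", "cpu_hours": 0.4},
            {"site": "T1_DE_KIT", "status": "success", "cpu_hours": 5.1},
        ]

        summary = defaultdict(lambda: {"jobs": 0, "failures": 0, "cpu_hours": 0.0})
        for r in reports:
            s = summary[r["site"]]
            s["jobs"] += 1
            s["failures"] += r["status"] == "failure"
            s["cpu_hours"] += r["cpu_hours"]

        for site, s in sorted(summary.items()):
            print(site, s)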

  13. Data management with a landslide inventory of the Franconian Alb (Germany) using a spatial database and GIS tools

    NASA Astrophysics Data System (ADS)

    Bemm, Stefan; Sandmeier, Christine; Wilde, Martina; Jaeger, Daniel; Schwindt, Daniel; Terhorst, Birgit

    2014-05-01

    The area of the Swabian-Franconian cuesta landscape (Southern Germany) is highly prone to landslides. This became apparent in the late spring of 2013, when numerous landslides occurred as a consequence of heavy and long-lasting rainfall. The specific climatic situation caused numerous damages with serious impact on settlements and infrastructure. Knowledge of the spatial distribution of landslides, their processes and characteristics is important to evaluate the potential risk that can arise from mass movements in those areas. Within the framework of two projects, about 400 landslides at the Franconian Alb were mapped and detailed data sets compiled between 2011 and 2014. The studies are related to the project "Slope stability and hazard zones in the northern Bavarian cuesta" (DFG, German Research Foundation) as well as to the LfU (The Bavarian Environment Agency) within the project "Georisks and climate change - hazard indication map Jura". The central goal of the present study is to create a spatial database for landslides. The database should contain all fundamental parameters to characterize the mass movements and should provide the potential for secure data storage and data management, as well as statistical evaluations. The spatial database was created with PostgreSQL, an object-relational database management system, and PostGIS, a spatial database extender for PostgreSQL, which provides the possibility to store spatial and geographic objects and to connect to several GIS applications, like GRASS GIS, SAGA GIS, QGIS and GDAL, a geospatial library (Obe and Hsu 2011). Database access for querying, importing, and exporting spatial and non-spatial data is ensured by using GUI or non-GUI connections. The database allows the use of procedural languages for writing advanced functions in the R, Python or Perl programming languages. It is possible to work directly with the entire (spatial) data content of the database in R. The inventory of the database includes, amongst others, information on location, landslide types and causes, geomorphological positions, geometries, hazards and damages, as well as assessments related to the activity of landslides. Furthermore, spatial objects are stored that represent the components of a landslide, in particular the scarps and the accumulation areas. In addition, waterways, map sheets, contour lines, detailed infrastructure data, digital elevation models, and aspect and slope data are included. Examples of spatial queries to the database are intersections of raster and vector data for calculating slope gradients or aspects of landslide areas, the creation of multiple overlaying sections for the comparison of slopes, the computation of distances to infrastructure or to the nearest receiving drainage, and the derivation of information on landslide magnitudes, distribution and clustering, as well as potential correlations with geomorphological or geological conditions. The data management concept in this study can be implemented for any academic, public or private use, because it is independent of any obligatory licenses. The created spatial database offers a platform for interdisciplinary research and socio-economic questions, as well as for landslide susceptibility and hazard indication mapping. Obe, R.O., Hsu, L.S. (2011): PostGIS in Action. Manning Publications, Stamford, 492 pp.
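
    One of the query types mentioned above, intersecting a slope raster with landslide geometries, can be expressed directly in PostGIS. A minimal sketch via psycopg2, assuming hypothetical table and column names (landslide polygons in landslide.geom, a slope raster in slope.rast):

        import psycopg2  # PostgreSQL driver; connection details are placeholders

        conn = psycopg2.connect(dbname="landslides", user="gis")
        with conn, conn.cursor() as cur:
            # Mean slope per landslide polygon: clip the slope raster with
            # each geometry and summarise the remaining cells.
            cur.execute("""
                SELECT l.id,
                       (ST_SummaryStats(ST_Clip(s.rast, l.geom))).mean AS mean_slope
                FROM   landslide AS l
                JOIN   slope     AS s ON ST_Intersects(s.rast, l.geom);
            """)
            for landslide_id, mean_slope in cur.fetchall():
                print(landslide_id, mean_slope)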

  14. Enhancing Chemical Inventory Management in Laboratory through a Mobile-Based QR Code Tag

    NASA Astrophysics Data System (ADS)

    Shukran, M. A. M.; Ishak, M. S.; Abdullah, M. N.

    2017-08-01

    The demand for an inventory management system that can provide rich information from a single scan has outgrown laboratory inventory management based on barcode technology. Since barcode technology cannot provide the information needed to manage the chemicals in the laboratory, employing QR code technology is the better solution. In this research, the main idea is to develop a standalone application running with its own database that is periodically synchronized with the inventory software hosted on a host computer and connected to a specialized network. The first process required to establish this centralized system is to determine all inventory available in the chemical laboratory by referring to the documented data in order to develop the database. Several customizations and enhancements were made to the open-source QR code technology to ensure the developed application is dedicated to its main purposes. At the end of the research, it was shown that the system is able to track the position of all inventory items and to show real-time information for scanned chemical labels. This paper gives an overview of the QR tag inventory system that was developed and its implementation at the National Defence University of Malaysia's (NDUM) chemical laboratory.
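
    The advantage claimed for QR tags is that one scan can carry a whole record rather than a bare identifier. A minimal sketch using the third-party Python qrcode package; the record fields are hypothetical, since the NDUM system's actual schema is not described:

        import json
        import qrcode  # third-party package: pip install qrcode[pil]

        # Hypothetical chemical record; a 1-D barcode would typically hold
        # only the "id" field and require a database lookup for the rest.
        record = {
            "id": "CHEM-0042",
            "name": "Acetonitrile",
            "cas": "75-05-8",
            "location": "Cabinet B3",
            "expiry": "2026-01-31",
        }

        img = qrcode.make(json.dumps(record))
        img.save("CHEM-0042.png")  # printable label for the container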

  15. Landscape features, standards, and semantics in U.S. national topographic mapping databases

    USGS Publications Warehouse

    Varanka, Dalia

    2009-01-01

    The objective of this paper is to examine the contrast between local, field-surveyed topographical representation and feature representation in digital, centralized databases and to clarify their ontological implications. The semantics of these two approaches are contrasted by examining the categorization of features by subject domains inherent to national topographic mapping. When comparing five USGS topographic mapping domain and feature lists, results indicate that multiple semantic meanings and ontology rules were applied to the initial digital database, but were lost as databases became more centralized at national scales, and common semantics were replaced by technological terms.

  16. Development of a bird banding recapture database

    USGS Publications Warehouse

    Tautin, J.; Doherty, P.F.; Metras, L.

    2001-01-01

    Recaptures (and resightings) constitute the vast majority of post-release data from banded or otherwise marked nongame birds. A powerful suite of contemporary analytical models is available for using recapture data to estimate population size, survival rates and other parameters, and many banders collect recapture data for their project-specific needs. However, despite widely recognized, broader programmatic needs for more and better data, banders' recapture data are not deposited in a central repository and made available for use by others. To address this need, the US Bird Banding Laboratory, the Canadian Bird Banding Office and the Georgia Cooperative Fish and Wildlife Research Unit are developing a bird banding recapture database. In this poster we discuss the critical steps in developing the database, including: determining exactly which recapture data should be included; developing a standard record format and structure for the database; developing electronic means for collecting, vetting and disseminating the data; and most importantly, developing metadata descriptions and individual data set profiles to facilitate the user's selection of appropriate analytical models. We provide examples of individual data sets to be included in the database, and we assess the feasibility of developing a prescribed program for obtaining recapture data from banders who do not presently collect them. It is expected that the recapture database eventually will contain millions of records made available publicly for a variety of avian research and management purposes.

  17. The region makes the difference: disparities in management of acute myocardial infarction within Switzerland.

    PubMed

    Insam, Charlène; Paccaud, Fred; Marques-Vidal, Pedro

    2014-05-01

    In Switzerland, health policies are decided at the local level, but little is known regarding their impact on the management of acute myocardial infarction (AMI). In this study, we assessed geographical differences within Switzerland regarding the management of AMI. Cross-sectional study. Swiss hospital discharge database for the period 2007-2008 (26,204 discharges from AMI). Seven Swiss regions (Leman, Mittelland, Northwest, Zurich, Central, Eastern, and Ticino) were analysed. Overall, 53.7% of discharges from AMI were managed in a single hospital, ranging from 62.1% (Leman) to 31.6% (Ticino). The highest intensive care unit admission rate was in Leman (69.4%), the lowest (16.9%) in Ticino (Swiss average: 36.0%). Intracoronary revascularization rates were highest in Leman (51.1%) and lowest (30.9%) in Central Switzerland (average: 41.0%). Bare (non-drug-eluting) stent use was highest in Leman (61.4%) and lowest (16.9%) in Ticino (average: 42.1%), while drug-eluting stent use was highest (83.2%) in Ticino and lowest (38.6%) in Leman (average: 57.9%). Coronary artery bypass graft rates were highest (4.8%) in Ticino and lowest (0.5%) in Eastern Switzerland (average: 2.8%). Mechanical circulatory assistance rates were highest (4.2%) in Zurich and lowest (0.5%) in Ticino (average: 1.8%). The differences remained after adjusting for age, single or multiple hospital management, and gender. In Switzerland, significant geographical differences in management and revascularization procedures for AMI were found.

  18. [The RUTA project (Registro UTIC Triveneto ANMCO). An e-network for the coronary care units for acute myocardial infarction].

    PubMed

    Di Chiara, Antonio; Zonzin, Pietro; Pavoni, Daisy; Fioretti, Paolo Maria

    2003-06-01

    In the era of evidence-based medicine, the monitoring of adherence to guidelines is fundamental in order to verify diagnostic and therapeutic processes. Electronic paperless databases allow higher data quality, lower costs and timely analyses, with overall advantages over traditional surveys. The RUTA project (acronym of Triveneto Registry of ANMCO CCUs) was designed in 1999, aiming to create a computer network among the coronary care units of a large Italian region for a permanent survey of patients admitted for acute myocardial infarction. Information ranges from the pre-hospital phase to discharge, including all relevant clinical and management variables. The database uses the DBMS Personal Oracle, with PowerBuilder as the user interface, on a Windows platform. Anonymous data are sent to a central server.

  19. A method to manage and share anti-retroviral (ARV) therapy information of human immunodeficiency virus (HIV) patients in Vietnam.

    PubMed

    Nguyen, Phung Anh; Syed-Abdul, Shabbir; Minamareddy, Priti; Lee, Peisan; Ngo, Thuy Dieu; Iqbal, Usman; Nguyen, Phuong Hoang; Jian, Wen-Shan; Li, Yu-Chuan Jack

    2013-08-01

    Management of antiretroviral (ARV) drug and HIV patient data is an important task for the Vietnam Administration of HIV/AIDS Control (VAAC) and for hospitals and health care units. Because people often travel to other parts of Vietnam, unshared treatment records can lead to a number of medical errors, and patients may fail to adhere to ARV therapy. In this paper, we describe a system that manages and shares antiretroviral therapy information for 4438 HIV patients in three healthcare centers in Hanoi, the capital of Vietnam. The overall design considerations, the architecture and the integration of a centralized database with decentralized management for the system are also presented. The findings from this study can serve as a guide for implementing health care models that manage and share patient information, not only for HIV infection but also for other chronic and non-communicable diseases. Copyright © 2013. Published by Elsevier Ireland Ltd.

  20. Improving Care And Research Electronic Data Trust Antwerp (iCAREdata): a research database of linked data on out-of-hours primary care.

    PubMed

    Colliers, Annelies; Bartholomeeusen, Stefaan; Remmen, Roy; Coenen, Samuel; Michiels, Barbara; Bastiaens, Hilde; Van Royen, Paul; Verhoeven, Veronique; Holmgren, Philip; De Ruyck, Bernard; Philips, Hilde

    2016-05-04

    Primary out-of-hours care is developing throughout Europe. High-quality databases with linked data from primary health services can help to improve research and future health services. In 2014, a central clinical research database infrastructure was established (iCAREdata: Improving Care And Research Electronic Data Trust Antwerp, www.icaredata.eu ) for primary and interdisciplinary health care at the University of Antwerp, linking data from General Practice Cooperatives, Emergency Departments and Pharmacies during out-of-hours care. Medical data are pseudonymised using the services of a Trusted Third Party, which encodes private information about patients and physicians before data is sent to iCAREdata. iCAREdata provides many new research opportunities in the fields of clinical epidemiology, health care management and quality of care. A key aspect will be to ensure the quality of data registration by all health care providers. This article describes the establishment of a research database and the possibilities of linking data from different primary out-of-hours care providers, with the potential to help to improve research and the quality of health care services.
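
    The record states that a Trusted Third Party encodes patient and physician identifiers before data reach iCAREdata. One common technique for such pseudonymisation is a keyed hash, sketched below; this illustrates the general approach, not necessarily the scheme iCAREdata actually uses:

        import hashlib
        import hmac

        SECRET_KEY = b"held-only-by-the-trusted-third-party"  # illustrative

        def pseudonymise(identifier: str) -> str:
            # Same input always yields the same pseudonym, so records can be
            # linked across providers, but the mapping cannot be reversed
            # without the key.
            return hmac.new(SECRET_KEY, identifier.encode(),
                            hashlib.sha256).hexdigest()[:16]

        record = {"patient_id": "BE-1234567", "diagnosis": "R05"}
        record["patient_id"] = pseudonymise(record["patient_id"])
        print(record)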

  1. Distributed structure-searchable toxicity (DSSTox) public database network: a proposal.

    PubMed

    Richard, Ann M; Williams, ClarLynda R

    2002-01-29

    The ability to assess the potential genotoxicity, carcinogenicity, or other toxicity of pharmaceutical or industrial chemicals based on chemical structure information is a highly coveted and shared goal of varied academic, commercial, and government regulatory groups. These diverse interests often employ different approaches and have different criteria and use for toxicity assessments, but they share a need for unrestricted access to existing public toxicity data linked with chemical structure information. Currently, there exists no central repository of toxicity information, commercial or public, that adequately meets the data requirements for flexible analogue searching, Structure-Activity Relationship (SAR) model development, or building of chemical relational databases (CRD). The distributed structure-searchable toxicity (DSSTox) public database network is being proposed as a community-supported, web-based effort to address these shared needs of the SAR and toxicology communities. The DSSTox project has the following major elements: (1) to adopt and encourage the use of a common standard file format (structure data file (SDF)) for public toxicity databases that includes chemical structure, text and property information, and that can easily be imported into available CRD applications; (2) to implement a distributed source approach, managed by a DSSTox Central Website, that will enable decentralized, free public access to structure-toxicity data files, and that will effectively link knowledgeable toxicity data sources with potential users of these data from other disciplines (such as chemistry, modeling, and computer science); and (3) to engage public/commercial/academic/industry groups in contributing to and expanding this community-wide, public data sharing and distribution effort. The DSSTox project's overall aims are to effect the closer association of chemical structure information with existing toxicity data, and to promote and facilitate structure-based exploration of these data within a common chemistry-based framework that spans toxicological disciplines.
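
    The SDF format at the heart of the DSSTox proposal pairs each chemical structure with named text and property fields in the same record. A minimal sketch of reading such a file with the open-source RDKit toolkit; the file name and property names are hypothetical, not actual DSSTox field names:

        from rdkit import Chem

        # Each SDF record holds a structure plus data fields.
        for mol in Chem.SDMolSupplier("dsstox_sample.sdf"):
            if mol is None:  # skip unparsable records
                continue
            name = mol.GetProp("ChemicalName") if mol.HasProp("ChemicalName") else "?"
            tox = mol.GetProp("Carcinogenicity") if mol.HasProp("Carcinogenicity") else "?"
            print(Chem.MolToSmiles(mol), name, tox)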

  2. Short Fiction on Film: A Relational DataBase.

    ERIC Educational Resources Information Center

    May, Charles

    Short Fiction on Film is a database that was created and will run on DataRelator, a relational database manager created by Bill Finzer for the California State Department of Education in 1986. DataRelator was designed for use in teaching students database management skills and to provide teachers with examples of how a database manager might be…

  3. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...

  4. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...

  5. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...

  6. Map and digital database of sedimentary basins and indications of petroleum in the Central Alaska Province

    USGS Publications Warehouse

    Troutman, Sandra M.; Stanley, Richard G.

    2003-01-01

    This database and accompanying text depict historical and modern reported occurrences of petroleum both in wells and at the surface within the boundaries of the Central Alaska Province. These data were compiled from previously published and unpublished sources and were prepared for use in the 2002 U.S. Geological Survey petroleum assessment of Central Alaska, Yukon Flats region. Indications of petroleum are described as oil or gas shows in wells, oil or gas seeps, or outcrops of oil shale or oil-bearing rock and include confirmed and unconfirmed reports. The scale of the source map limits the spatial resolution (scale) of the database to 1:2,500,000 or smaller.

  7. Contraception supply chain challenges: a review of evidence from low- and middle-income countries.

    PubMed

    Mukasa, Bakali; Ali, Moazzam; Farron, Madeline; Van de Weerdt, Renee

    2017-10-01

    To identify and assess factors determining the functioning of supply chain systems for modern contraception in low- and middle-income countries (LMICs), and to identify challenges contributing to contraception stockouts that may lead to unmet need. Scientific databases and grey literature were searched, including the Database of Abstracts of Reviews of Effectiveness (DARE), PubMed, MEDLINE, POPLINE, CINAHL, Academic Search Complete, Science Direct, Web of Science, Cochrane Central, Google Scholar, WHO databases and websites of key international organisations. Studies indicated that supply chain system inefficiencies significantly affect the availability of modern FP and contraception commodities in LMICs, especially in rural public facilities where distribution barriers may be acute. Supply chain failures or bottlenecks may be attributed to: weak and poorly institutionalized logistic management information systems (LMIS), poor physical infrastructure in LMICs, lack of trained and dedicated staff for supply chain management, inadequate funding, and rigid government policies on task sharing. However, there is evidence that implementing effective LMISs and involving public and private providers in distribution channels resulted in reductions in stockout rates for medical commodities. Supply chain bottlenecks contribute significantly to persistent high stockout rates for modern contraceptives in LMICs. Interventions aimed at enhancing uptake of contraceptives to reduce the problem of unmet need in LMICs should make strong commitments towards strengthening these countries' health commodity supply chain management systems. Current evidence is limited, and additional well-designed implementation research on contraception supply chain systems is warranted to gain further understanding of and insights into the determinants of supply chain bottlenecks and their impact on stockouts of contraception commodities.

  8. Assessing animal welfare in sow herds using data on meat inspection, medication and mortality.

    PubMed

    Knage-Rasmussen, K M; Rousing, T; Sørensen, J T; Houe, H

    2015-03-01

    This paper aims to contribute to the development of a cost-effective alternative to expensive on-farm animal-based welfare assessment systems. The objective of the study was to design an animal welfare index based on central database information (DBWI), and to validate it against an animal welfare index based on on-farm animal-based measurements (AWI). Data on 63 Danish sow herds with herd sizes of 80 to 2500 sows and an average herd size of 501 were collected from three central databases containing: meat inspection data collected at animal level at the abattoir, mortality data at herd level from the rendering plants of DAKA, and medicine records at both herd and animal group level (sows with piglets, weaners or finishers) from the central database Vetstat. Selected measurements taken from these central databases were used to construct the DBWI. The relative welfare impacts of both individual database measurements and the databases overall were assigned in consultation with a panel of 12 experts. The experts were drawn from production advisory activities, animal science and, in one case, an animal welfare organization. The expert panel weighted each measurement on a scale from 1 (not important) to 5 (very important). The experts also gave opinions on the relative weightings of the measurements for each of the three databases by stating a relative weight of each database in the DBWI. On the basis of this, the aggregated DBWI was normalized. The aggregation of AWI was based on a weighted summary of herd prevalences of 20 clinical and behavioural measurements originating from a one-day data collection. AWI did not show a linear dependency on DBWI. This suggests that DBWI is not suited to replace an animal welfare index using on-farm animal-based measurements.
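
    The DBWI aggregation described above is a two-level weighted sum: expert weights on measurements within each database, then expert weights across the three databases. A minimal sketch with invented numbers (the study's actual measurements and weights differ):

        # (expert weight 1-5, herd value) per measurement; database-level
        # weights sum to 1. All figures are illustrative only.
        databases = {
            "meat_inspection": {"weight": 0.40, "measures": {"lesions": (5, 0.12)}},
            "mortality":       {"weight": 0.35, "measures": {"dead_sows": (4, 0.05)}},
            "medicine":        {"weight": 0.25, "measures": {"antibiotics": (3, 0.30)}},
        }

        dbwi = sum(
            db["weight"] * sum(w * value for w, value in db["measures"].values())
            for db in databases.values()
        )
        print(f"raw DBWI = {dbwi:.3f}")  # would then be normalised across herds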

  9. Establishment and maintenance of a standardized glioma tissue bank: Huashan experience.

    PubMed

    Aibaidula, Abudumijiti; Lu, Jun-feng; Wu, Jin-song; Zou, He-jian; Chen, Hong; Wang, Yu-qian; Qin, Zhi-yong; Yao, Yu; Gong, Ye; Che, Xiao-ming; Zhong, Ping; Li, Shi-qi; Bao, Wei-min; Mao, Ying; Zhou, Liang-fu

    2015-06-01

    Cerebral glioma is the most common brain tumor as well as one of the top ten malignant tumors in human beings. In spite of the great progress in chemotherapy, radiotherapy and surgical strategies during the past decades, mortality and morbidity are still high. One of the major challenges is to explore the pathogenesis and invasion of glioma at various "omics" levels (such as proteomics or genomics) and the clinical implications of biomarkers for diagnosis, prognosis or treatment of glioma patients. Establishment of a standardized tissue bank with high-quality biospecimens annotated with clinical information is pivotal to the solution of these questions as well as to the drug development process and translational research on glioma. Therefore, based on previous experience of tissue banks, standardized protocols for sample collection and storage were developed. We also developed two systems for glioma patient and sample management, a local database for medical records and a local image database for medical images. For the future set-up of a regional biobank network in Shanghai, we also founded a centralized database for medical records. Hence we established a standardized glioma tissue bank with sufficient clinical data and medical images at Huashan Hospital. By September 2013, tissue samples from 1,326 cases had been collected. Histological diagnosis revealed that 73 % were astrocytic tumors, 17 % were oligodendroglial tumors, 2 % were oligoastrocytic tumors, 4 % were ependymal tumors and 4 % were other central nervous system neoplasms.

  10. Microcomputer Database Management Systems for Bibliographic Data.

    ERIC Educational Resources Information Center

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  11. The Data Base and Decision Making in Public Schools.

    ERIC Educational Resources Information Center

    Hedges, William D.

    1984-01-01

    Describes generic types of databases--file management systems, relational database management systems, and network/hierarchical database management systems--with their respective strengths and weaknesses; discusses factors to be considered in determining whether a database is desirable; and provides evaluative criteria for use in choosing…

  12. Architecture design of a generic centralized adjudication module integrated in a web-based clinical trial management system.

    PubMed

    Zhao, Wenle; Pauls, Keith

    2016-04-01

    Centralized outcome adjudication has been used widely in multicenter clinical trials in order to prevent potential biases and to reduce variations in important safety and efficacy outcome assessments. Adjudication procedures can vary significantly among studies. In practice, the coordination of outcome adjudication procedures in many multicenter clinical trials remains a manual process with low efficiency and high risk of delay. Motivated by the demands of two large clinical trial networks, a generic outcome adjudication module was developed by the networks' data management center within a homegrown clinical trial management system. In this article, the system design strategy and database structure are presented. A generic database model was created to transfer different adjudication procedures into a unified set of sequential adjudication steps. Each adjudication step was defined by one activate condition, one lock condition, one to five categorical data items to capture adjudication results, and one free-text field for general comments. Based on this model, a generic outcome adjudication user interface and a generic data processing program were developed within the clinical trial management system to provide automated coordination of outcome adjudication. By the end of 2014, this generic outcome adjudication module had been implemented in 10 multicenter trials. A total of 29 adjudication procedures were defined, with the number of adjudication steps varying from 1 to 7. The implementation of a new adjudication procedure in this generic module took an experienced programmer 1 or 2 days. A total of 7336 outcome events had been adjudicated and 16,235 adjudication step activities had been recorded. In one multicenter trial, 1144 safety outcome event submissions went through a three-step adjudication procedure and reported a median of 3.95 days from safety event case report form submission to adjudication completion. In another trial, 277 clinical outcome events were adjudicated by a six-step procedure and took a median of 23.84 days from outcome event case report form submission to adjudication procedure completion. A generic outcome adjudication module integrated in the clinical trial management system made the automated coordination of efficacy and safety outcome adjudication a reality. © The Author(s) 2015.
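
    The generic model described in this record reduces every adjudication procedure to sequential steps, each with an activate condition, a lock condition, one to five categorical result items and a free-text comment. A minimal sketch of that data model, with hypothetical names and a toy three-step procedure:

        from dataclasses import dataclass, field
        from typing import Callable, Optional

        @dataclass
        class AdjudicationStep:
            name: str
            activate_when: Callable[[dict], bool]  # state -> step is active
            lock_when: Callable[[dict], bool]      # state -> step is read-only
            result_items: list                     # 1-5 categorical field names
            comment: Optional[str] = None
            results: dict = field(default_factory=dict)

        procedure = [
            AdjudicationStep("triage", lambda s: True, lambda s: "triage" in s,
                             ["event_type"]),
            AdjudicationStep("review", lambda s: "triage" in s, lambda s: "review" in s,
                             ["relatedness", "severity"]),
            AdjudicationStep("final", lambda s: "review" in s, lambda s: "final" in s,
                             ["confirmed"]),
        ]

        state = {"triage": {"event_type": "safety"}}  # steps completed so far
        active = [s.name for s in procedure
                  if s.activate_when(state) and not s.lock_when(state)]
        print(active)  # -> ['review']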

  13. Distributed cyberinfrastructure tools for automated data processing of structural monitoring data

    NASA Astrophysics Data System (ADS)

    Zhang, Yilan; Kurata, Masahiro; Lynch, Jerome P.; van der Linden, Gwendolyn; Sederat, Hassan; Prakash, Atul

    2012-04-01

    The emergence of cost-effective sensing technologies has now enabled the use of dense arrays of sensors to monitor the behavior and condition of large-scale bridges. The continuous operation of dense networks of sensors presents a number of new challenges, including how to manage the massive amounts of data created by the system. This paper reports on the progress of the creation of cyberinfrastructure tools that hierarchically control networks of wireless sensors deployed on a long-span bridge. The internet-enabled cyberinfrastructure is centrally managed by a powerful database which controls the flow of data in the entire monitoring system architecture. A client-server model built upon the database provides both data providers and system end-users with secured access to various levels of information about a bridge. In the system, information on bridge behavior (e.g., acceleration, strain, displacement) and environmental conditions (e.g., wind speed, wind direction, temperature, humidity) is uploaded to the database from sensor networks installed in the bridge. Data interrogation services then interface with the database via client APIs to autonomously process data. The current research effort focuses on an assessment of the scalability and long-term robustness of the proposed cyberinfrastructure framework, which has been implemented along with a permanent wireless monitoring system on the New Carquinez (Alfred Zampa Memorial) Suspension Bridge in Vallejo, CA. Many data interrogation tools are under development using sensor data and bridge metadata (e.g., geometric details, material properties). Sample data interrogation clients include those for the detection of faulty sensors and for automated modal parameter extraction.
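
    One of the interrogation clients mentioned above screens for faulty sensors. A toy sketch of such a screen, flagging channels that are stuck (near-zero variance) or implausibly large; thresholds and channel names are invented, not taken from the deployed system:

        import statistics

        readings = {
            "accel_01": [0.012, 0.015, 0.011, 0.013],
            "accel_02": [0.0, 0.0, 0.0, 0.0],        # stuck at zero
            "accel_03": [0.014, 9.8, 0.012, 0.013],  # implausible spike
        }

        for channel, xs in readings.items():
            stuck = statistics.pstdev(xs) < 1e-6
            spike = max(abs(x) for x in xs) > 1.0
            if stuck or spike:
                print(f"flag {channel}: stuck={stuck}, spike={spike}")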

  14. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  15. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  16. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  17. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  18. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  19. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... individual database managers; and to perform other functions as needed for the administration of the TV bands... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers...

  20. How to maintain blood supply during computer network breakdown: a manual backup system.

    PubMed

    Zeiler, T; Slonka, J; Bürgi, H R; Kretschmer, V

    2000-12-01

    Electronic data management systems using computer network systems and client/server architecture are increasingly used in laboratories and transfusion services. Severe problems arise if there is no network access to the database server and critical functions are not available. We describe a manual backup system (MBS) developed to maintain the delivery of blood products to patients in a hospital transfusion service in case of a computer network breakdown. All data are kept on a central SQL database connected to peripheral workstations in a local area network (LAN). Request entry from wards is performed via machine-readable request forms containing self-adhesive specimen labels with barcodes for test tubes. Data entry occurs online by bidirectional automated systems or offline manually. One of the workstations in the laboratory contains a second SQL database which is frequently and incrementally updated. This workstation is run as a stand-alone, read-only database if the central SQL database is not available. In case of a network breakdown, the time-graded MBS is launched. Patient data, the requesting ward and the ordered tests/requests are photocopied through a template from the request forms onto special MBS worksheets, which serve as the laboratory journal for manual processing and result reporting (a copy is left in the laboratory). As soon as the network is running again, the data from the offline period are entered into the primary SQL server. The MBS was successfully used on several occasions. The documentation of a 90-min breakdown period is presented in detail. Additional work resulted from the copying and the belated manual data entry after restoration of the system. There was no delay in the issue of blood products or result reporting. The backup system described has proven simple, quick and safe for maintaining urgent blood supply and the distribution of laboratory results in case of an unexpected network breakdown.
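
    The core technical idea in this record is a frequently, incrementally refreshed local copy of the central database that can be served read-only during an outage. A minimal sketch of the incremental refresh with SQLite; table and column names are hypothetical, and the table is assumed to have an "id" primary key and an "updated_at" timestamp:

        import sqlite3

        central = sqlite3.connect("central.db")
        replica = sqlite3.connect("replica.db")

        # Copy only rows changed since the replica was last refreshed.
        last_sync = replica.execute(
            "SELECT COALESCE(MAX(updated_at), '') FROM results").fetchone()[0]
        rows = central.execute(
            "SELECT id, patient, test, value, updated_at FROM results "
            "WHERE updated_at > ?", (last_sync,)).fetchall()

        replica.executemany(
            "INSERT OR REPLACE INTO results VALUES (?, ?, ?, ?, ?)", rows)
        replica.commit()  # replica is then served read-only during breakdowns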

  1. Ottawa Panel Evidence-Based Clinical Practice Guidelines for Foot Care in the Management of Juvenile Idiopathic Arthritis.

    PubMed

    Brosseau, Lucie; Toupin-April, Karine; Wells, George; Smith, Christine A; Pugh, Arlanna G; Stinson, Jennifer N; Duffy, Ciarán M; Gifford, Wendy; Moher, David; Sherrington, Catherine; Cavallo, Sabrina; De Angelis, Gino; Loew, Laurianne; Rahman, Prinon; Marcotte, Rachel; Taki, Jade; Bisaillon, Jacinthe; King, Judy; Coda, Andrea; Hendry, Gordon J; Gauvreau, Julie; Hayles, Martin; Hayles, Kay; Feldman, Brian; Kenny, Glen P; Li, Jing Xian; Briggs, Andrew M; Martini, Rose; Feldman, Debbie Ehrmann; Maltais, Désirée B; Tupper, Susan; Bigford, Sarah; Bisch, Marg

    2016-07-01

    To create evidence-based guidelines evaluating foot care interventions for the management of juvenile idiopathic arthritis (JIA). An electronic literature search of the following databases from database inception to May 2015 was conducted: MEDLINE (Ovid), EMBASE (Ovid), Cochrane CENTRAL, and clinicaltrials.gov. The Ottawa Panel selection criteria targeted studies that assessed foot care or foot orthotic interventions for the management of JIA in those aged 0 to ≤18 years. The Physiotherapy Evidence Database scale was used to evaluate study quality; only high-quality studies (score ≥5) were included. A total of 362 records were screened, resulting in 3 full-text articles and 1 additional citation containing supplementary information included in the analysis. Two reviewers independently extracted study data (intervention, comparator, outcome, time period, study design) from the included studies by using standardized data extraction forms. Directed by Cochrane Collaboration methodology, the statistical analysis produced figures and graphs representing the strength of intervention outcomes and their corresponding grades (A, B, C+, C, C-, D+, D, D-). Clinical significance was achieved when an improvement of ≥30% between the intervention and control groups was present, whereas P<.05 indicated statistical significance. An expert panel Delphi consensus (≥80%) was required for the endorsement of recommendations. All included studies were of high quality and analyzed the effects of multidisciplinary foot care, customized foot orthotics, and shoe inserts for the management of JIA. Custom-made foot orthotics and prefabricated shoe inserts displayed the greatest improvement in pain intensity, activity limitation, foot pain, and disability reduction (grades A, C+). The use of customized foot orthotics and prefabricated shoe inserts seems to be a good choice for managing foot pain and function in JIA. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  2. DbMap: improving database interoperability issues in medical software using a simple, Java-Xml based solution.

    PubMed Central

    Karadimas, H.; Hemery, F.; Roland, P.; Lepage, E.

    2000-01-01

    In medical software development, the use of databases plays a central role. However, most of the databases have heterogeneous encoding and data models. To deal with these variations in the application code directly is error-prone and reduces the potential reuse of the produced software. Several approaches to overcome these limitations have been proposed in the medical database literature, which will be presented. We present a simple solution, based on a Java library, and a central Metadata description file in XML. This development approach presents several benefits in software design and development cycles, the main one being the simplicity in maintenance. PMID:11079915
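
    The DbMap idea is to concentrate all knowledge about each source database's encodings and column names in one central XML metadata file that an application library reads at run time. The original work used a Java library; the sketch below mirrors the idea in Python, with hypothetical field and database names:

        import xml.etree.ElementTree as ET

        # Central metadata: one logical field mapped to each database's
        # physical column and character encoding (names are invented).
        META = """
        <metadata>
          <field logical="patient_id" db="lab" column="PAT_NO" encoding="latin-1"/>
          <field logical="patient_id" db="ward" column="PatientId" encoding="utf-8"/>
        </metadata>
        """

        mapping = {
            (f.get("db"), f.get("logical")): (f.get("column"), f.get("encoding"))
            for f in ET.fromstring(META)
        }

        column, enc = mapping[("lab", "patient_id")]
        print(f"SELECT {column} FROM patients  -- decode results as {enc}")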

  3. Centralized database for interconnection system design. [for spacecraft

    NASA Technical Reports Server (NTRS)

    Billitti, Joseph W.

    1989-01-01

    A database application called DFACS (Database, Forms and Applications for Cabling and Systems) is described. The objective of DFACS is to improve the speed and accuracy of interconnection system information flow during the design and fabrication stages of a project, while simultaneously supporting both the horizontal (end-to-end wiring) and the vertical (wiring by connector) design stratagems used by the Jet Propulsion Laboratory (JPL) project engineering community. The DFACS architecture is centered around a centralized database and program methodology which emulates the manual design process hitherto used at JPL. DFACS has been tested and successfully applied to existing JPL hardware tasks with a resulting reduction in schedule time and costs.

  4. Database Management Systems: New Homes for Migrating Bibliographic Records.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Bierbaum, Esther G.

    1987-01-01

    Assesses bibliographic databases as part of visionary text systems such as hypertext and scholars' workstations. Downloading is discussed in terms of the capability to search records and to maintain unique bibliographic descriptions, and relational database management systems, file managers, and text databases are reviewed as possible hosts for…

  5. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  6. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  7. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  8. Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.

    ERIC Educational Resources Information Center

    Pieska, K. A. O.

    1986-01-01

    Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)

  9. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  10. Construction of databases: advances and significance in clinical research.

    PubMed

    Long, Erping; Huang, Bingjie; Wang, Liming; Lin, Xiaoyu; Lin, Haotian

    2015-12-01

    Widely used in clinical research, the database is a new type of data management automation technology and the most efficient tool for data management. In this article, we first explain some basic concepts, such as the definition, classification, and establishment of databases. Afterward, the workflow for establishing databases, inputting data, verifying data, and managing databases is presented. Meanwhile, by discussing the application of databases in clinical research, we illuminate the important role of databases in clinical research practice. Lastly, we introduce the reanalysis of randomized controlled trials (RCTs) and cloud computing techniques, showing the most recent advancements of databases in clinical research.

  11. [The future of clinical laboratory database management system].

    PubMed

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of clinical laboratory database management systems, this study explains the difference between the Clinical Laboratory Information System and the Clinical Laboratory System. Although three kinds of database management systems (DBMS) are described, namely the relational, tree and network models, the relational model was found to be the best DBMS for the clinical laboratory database, based on our experience developing several clinical laboratory expert systems. As future clinical laboratory database management systems, an IC card system connected to an automatic chemical analyzer is proposed for personal health data management, and a microscope/video system is proposed for dynamic data management of leukocytes and bacteria.

  12. Peer-to-peer architecture for multi-departmental distributed PACS

    NASA Astrophysics Data System (ADS)

    Rosset, Antoine; Heuberger, Joris; Pysher, Lance; Ratib, Osman

    2006-03-01

    We have elected to explore peer-to-peer technology as an alternative to centralized PACS architecture to meet the increasing requirements for wide access to images inside and outside a radiology department. The goal is to allow users across the enterprise to access any study at any time without the need for prefetching or routing of images from a central archive. Images can be accessed between different workstations and local storage nodes. We implemented "Bonjour," a remote file access technology developed by Apple that allows applications to share data and files remotely with optimized data access and transfer. Our open-source image display platform, OsiriX, was adapted to share local DICOM images by making each workstation's local SQL database directly accessible to any other OsiriX workstation over the network. A server version of the OsiriX Core Data database likewise provides access to distributed archive servers. The infrastructure allows fast and efficient access to any image, anywhere, at any time, independently of the actual physical location of the data. It also takes advantage of the performance of distributed low-cost, high-capacity storage servers, which can provide efficient caching of PACS data and were found to be 10 to 20 times faster than accessing the same data from the central PACS archive. The approach is particularly suitable for large hospitals and academic environments where clinical conferences, interdisciplinary discussions and successive sessions of image processing are often part of complex workflows for patient management and decision making.
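
    The service-discovery half of such a peer-to-peer setup can be sketched with the python-zeroconf package, which implements the same protocol Bonjour uses (the authors worked natively inside OsiriX; the service type "_dicomdb._tcp.local." and all other names below are made up): each workstation advertises its shareable local database, and peers browse for the advertised type.

        # Hypothetical sketch of advertising a shareable local DICOM database
        # via zeroconf/Bonjour-style service discovery.
        import socket
        from zeroconf import Zeroconf, ServiceInfo

        info = ServiceInfo(
            type_="_dicomdb._tcp.local.",
            name="Workstation-42._dicomdb._tcp.local.",
            addresses=[socket.inet_aton("192.168.1.42")],
            port=5555,
            properties={"studies": "local-sqlite"},  # advertise what this node shares
        )

        zc = Zeroconf()
        zc.register_service(info)   # other workstations can now discover this node...
        # ...browse with ServiceBrowser, fetch images over the advertised port, then:
        zc.unregister_service(info)
        zc.close()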

  13. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  14. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  15. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  16. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  17. Online Bibliographic Databases in South Central Pennsylvania: Current Status and Training Needs.

    ERIC Educational Resources Information Center

    Townley, Charles

    A survey of libraries in south central Pennsylvania was designed to identify those that are using or planning to use databases and assess their perceived training needs. This report describes the methodology and analyzes the responses received from the 57 libraries that completed the questionnaire. Data presented in eight tables are concerned with…

  18. Mobile Location-Based Services for Trusted Information in Disaster Management

    NASA Astrophysics Data System (ADS)

    Ragia, Lemonia; Deriaz, Michel; Seigneur, Jean-Marc

    The goal of the present chapter is to provide location-based services for disaster management. The application involves services related to the safety of people in an unexpected event. The current prototype is implemented for a specific disaster-management issue, road traffic control. Users can submit requests from cell phones or via the Internet and receive an answer on screen or in textual form. The data reside in a central database, and every user can input data via virtual tags. The system is based on spatial messages, which can be sent from any user to any other user within a certain distance. In this way all of the users, rather than a separate source, provide the necessary information about a dangerous situation. To avoid contamination problems, we use trust-based security to check input to the system and a trust engine model to provide information with considerable reliability.
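
    The delivery rule for spatial messages - a message reaches every user within a certain distance - reduces to a great-circle distance test. A minimal sketch, with illustrative names and coordinates:

        # Deliver a location-tagged message to every user within radius_km.
        # The haversine formula gives great-circle distance on the Earth.
        from math import radians, sin, cos, asin, sqrt

        EARTH_RADIUS_KM = 6371.0

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance between two (lat, lon) points, in kilometres."""
            dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
            a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
            return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

        def recipients(message_pos, users, radius_km):
            """Users close enough to a spatial message to receive it."""
            return [u for u, pos in users.items()
                    if haversine_km(*message_pos, *pos) <= radius_km]

        users = {"anna": (46.20, 6.14), "marc": (46.21, 6.15), "lena": (47.37, 8.54)}
        print(recipients((46.204, 6.143), users, radius_km=5))  # -> ['anna', 'marc']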

  19. Central Appalachian basin natural gas database: distribution, composition, and origin of natural gases

    USGS Publications Warehouse

    Román Colón, Yomayra A.; Ruppert, Leslie F.

    2015-01-01

    The U.S. Geological Survey (USGS) has compiled a database consisting of three worksheets of central Appalachian basin natural gas analyses and isotopic compositions from published and unpublished sources of 1,282 gas samples from Kentucky, Maryland, New York, Ohio, Pennsylvania, Tennessee, Virginia, and West Virginia. The database includes field and reservoir names, well and State identification number, selected geologic reservoir properties, and the composition of natural gases (methane; ethane; propane; iso-butane [i-butane]; normal butane [n-butane]; iso-pentane [i-pentane]; normal pentane [n-pentane]; cyclohexane; and hexanes). In the first worksheet, location and American Petroleum Institute (API) numbers from public or published sources are provided for 1,231 of the 1,282 gas samples. A second worksheet of 186 gas samples was compiled from published sources and augmented with public location information and contains carbon, hydrogen, and nitrogen isotopic measurements of natural gas. The third worksheet is a key for all abbreviations in the database. The database can be used to better constrain the stratigraphic distribution, composition, and origin of natural gas in the central Appalachian basin.

  20. MiRNA-TF-gene network analysis through ranking of biomolecules for multi-informative uterine leiomyoma dataset.

    PubMed

    Mallik, Saurav; Maulik, Ujjwal

    2015-10-01

    Gene ranking is an important problem in bioinformatics. Here, we propose a new framework for ranking biomolecules (viz., miRNAs, transcription factors/TFs and genes) in a multi-informative uterine leiomyoma dataset having both gene expression and methylation data, using a (statistical) eigenvector-centrality-based approach. First, genes that are both differentially expressed and methylated are identified using the Limma statistical test. A network comprising these genes, corresponding TFs from the TRANSFAC and ITFP databases, and targeting miRNAs from the miRWalk database is then built. The biomolecules are then ranked based on eigenvector centrality. Our proposed method provides better average accuracy in hub-gene and non-hub-gene classification than other methods. Furthermore, pre-ranked gene set enrichment analysis is applied to the pathway and GO-term databases of the Molecular Signatures Database, providing pre-ranked gene lists based on different centrality values to compare the ranking methods. Finally, top novel potential gene markers for uterine leiomyoma are provided. Copyright © 2015 Elsevier Inc. All rights reserved.
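
    A toy version of the ranking step, using networkx on an invented miniature miRNA-TF-gene network (the real network is built from TRANSFAC, ITFP and miRWalk data; all node names and edges below are illustrative):

        # Rank nodes of a small interaction network by eigenvector centrality.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("miR-21", "GENE_A"), ("miR-21", "GENE_B"),   # miRNA -> target genes
            ("TF_E2F1", "GENE_A"), ("TF_E2F1", "GENE_C"), # TF -> regulated genes
            ("GENE_A", "GENE_B"), ("GENE_B", "GENE_C"),
        ])

        centrality = nx.eigenvector_centrality(G, max_iter=1000)
        for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
            print(f"{node:10s} {score:.3f}")   # highest-scoring nodes are hub candidates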

  1. Implementation of the Clinical Encounters Tracking system at the Indiana University School of Medicine.

    PubMed

    Hatfield, Amy J; Bangert, Michael P

    2005-01-01

    The Indiana University School of Medicine (IUSM) Office of Medical Education & Student Services directed the IUSM Educational Technology Unit to develop a Clinical Encounters Tracking system in response to the Liaison Committee on Medical Education's (LCME) updated accreditation standards. A personal digital assistant (PDA) and centralized database server solution was implemented. Third-year medical students are required to carry a PDA on which they record clinical encounter experiences during all clerkship clinical rotations. Clinical encounter data collected on the PDAs are routinely uploaded to the central server via the PDA HotSync process. Real-time clinical encounter summary reports are accessed in the school's online curriculum management system, ANGEL. The resulting IUSM Clinical Encounters Tracking program addresses the LCME accreditation standard that mandates the tracking of medical students' required clinical curriculum experiences.

  2. Information Management Tools for Classrooms: Exploring Database Management Systems. Technical Report No. 28.

    ERIC Educational Resources Information Center

    Freeman, Carla; And Others

    In order to understand how database software or online databases functioned in the overall curricula, the use of database management systems (DBMS) was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…

  3. 78 FR 28756 - Defense Federal Acquisition Regulation Supplement: System for Award Management Name Changes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... Excluded Parties Listing System (EPLS) databases into the System for Award Management (SAM) database. DATES... combined the functional capabilities of the CCR, ORCA, and EPLS procurement systems into the SAM database... identification number and the type of organization from the System for Award Management database. 0 3. Revise the...

  4. An Examination of Selected Software Testing Tools: 1992

    DTIC Science & Technology

    1992-12-01

    …historical test database; the test management and problem reporting tools were examined using the sample test database provided by each supplier. …track the impact of new methods, organizational structures, and technologies. Metrics Manager is supported by an industry database that allows…

  5. The Zebrafish Model Organism Database: new support for human disease models, mutation details, gene expression phenotypes and searching

    PubMed Central

    Howe, Douglas G.; Bradford, Yvonne M.; Eagle, Anne; Fashena, David; Frazer, Ken; Kalita, Patrick; Mani, Prita; Martin, Ryan; Moxon, Sierra Taylor; Paddock, Holly; Pich, Christian; Ramachandran, Sridhar; Ruzicka, Leyla; Schaper, Kevin; Shao, Xiang; Singer, Amy; Toro, Sabrina; Van Slyke, Ceri; Westerfield, Monte

    2017-01-01

    The Zebrafish Model Organism Database (ZFIN; http://zfin.org) is the central resource for zebrafish (Danio rerio) genetic, genomic, phenotypic and developmental data. ZFIN curators provide expert manual curation and integration of comprehensive data involving zebrafish genes, mutants, transgenic constructs and lines, phenotypes, genotypes, gene expression, morpholinos, TALENs, CRISPRs, antibodies, anatomical structures, models of human disease and publications. We integrate curated, directly submitted, and collaboratively generated data, making these data available to the zebrafish research community. Among the vertebrate model organisms, zebrafish are superbly suited for rapid generation of sequence-targeted mutant lines, characterization of phenotypes including gene expression patterns, and generation of human disease models. The recent rapid adoption of zebrafish as human disease models makes management of these data particularly important to both the research and clinical communities. Here, we describe recent enhancements to ZFIN, including use of the zebrafish experimental conditions ontology, ‘Fish’ records in the ZFIN database, support for gene expression phenotypes, models of human disease, mutation details at the DNA, RNA and protein levels, and updates to the ZFIN single box search. PMID:27899582

  6. SuperPain—a resource on pain-relieving compounds targeting ion channels

    PubMed Central

    Gohlke, Björn O.; Preissner, Robert; Preissner, Saskia

    2014-01-01

    Pain is more than an unpleasant sensory experience associated with actual or potential tissue damage: it is the most common reason for physician consultation and often dramatically affects quality of life. The management of pain is often difficult, and new targets are required for more effective and specific treatment. SuperPain (http://bioinformatics.charite.de/superpain/) is a freely available database of pain-stimulating and pain-relieving compounds, which bind or potentially bind to ion channels that are involved in the transmission of pain signals to the central nervous system, such as TRPV1, TRPM8, TRPA1, TREK1, TRESK, hERG, ASIC, P2X and voltage-gated sodium channels. The database consists of ∼8700 ligands, which are characterized by experimentally measured binding affinities. Additionally, 100 000 putative ligands are included. Moreover, the database provides 3D structures of receptors and predicted ligand-binding poses. These binding poses and a structural classification scheme provide hints for the design of new analgesic compounds. A user-friendly graphical interface allows similarity searching, visualization of ligands docked into the receptor, etc. PMID:24271391

  7. SuperPain--a resource on pain-relieving compounds targeting ion channels.

    PubMed

    Gohlke, Björn O; Preissner, Robert; Preissner, Saskia

    2014-01-01

    Pain is more than an unpleasant sensory experience associated with actual or potential tissue damage: it is the most common reason for physician consultation and often dramatically affects quality of life. The management of pain is often difficult, and new targets are required for more effective and specific treatment. SuperPain (http://bioinformatics.charite.de/superpain/) is a freely available database of pain-stimulating and pain-relieving compounds, which bind or potentially bind to ion channels that are involved in the transmission of pain signals to the central nervous system, such as TRPV1, TRPM8, TRPA1, TREK1, TRESK, hERG, ASIC, P2X and voltage-gated sodium channels. The database consists of ∼8700 ligands, which are characterized by experimentally measured binding affinities. Additionally, 100 000 putative ligands are included. Moreover, the database provides 3D structures of receptors and predicted ligand-binding poses. These binding poses and a structural classification scheme provide hints for the design of new analgesic compounds. A user-friendly graphical interface allows similarity searching, visualization of ligands docked into the receptor, etc.

  8. Component Database for the APS Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veseli, S.; Arnold, N. D.; Jarosz, D. P.

    The Advanced Photon Source Upgrade (APS-U) project will replace the existing APS storage ring with a multi-bend achromat (MBA) lattice to provide extreme transverse coherence and extreme brightness x-rays to its users. As the time to replace the existing storage ring accelerator is of critical concern, an aggressive one-year removal/installation/testing period is being planned. To aid in the management of the thousands of components to be installed in such a short time, the Component Database (CDB) application is being developed to identify, document, track, locate, and organize components in a central database. Three major domains are being addressed: component definitions (which together make up an exhaustive "Component Catalog"), designs (groupings of components to create subsystems), and component instances (“Inventory”). Relationships between the major domains allow additional "system knowledge" to be captured that will be leveraged by future tools and applications. It is imperative to provide sub-system engineers with a functional application early in the machine design cycle. Topics discussed in this paper include the initial design and deployment of CDB, as well as future development plans.
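
    The three domains and their relationships can be pictured as related tables; the schema below is guessed from the description above, not taken from the APS-U CDB itself:

        # Hypothetical sketch of the three CDB domains as related tables.
        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
        CREATE TABLE component_type (          -- the Component Catalog
            type_id   INTEGER PRIMARY KEY,
            name      TEXT NOT NULL,
            model     TEXT
        );
        CREATE TABLE design (                  -- groupings of components into subsystems
            design_id INTEGER PRIMARY KEY,
            name      TEXT NOT NULL
        );
        CREATE TABLE design_component (        -- which catalog items a design uses
            design_id INTEGER REFERENCES design(design_id),
            type_id   INTEGER REFERENCES component_type(type_id)
        );
        CREATE TABLE component_instance (      -- the Inventory: physical serialized units
            instance_id INTEGER PRIMARY KEY,
            type_id     INTEGER REFERENCES component_type(type_id),
            serial_no   TEXT,
            location    TEXT
        );
        """)
        db.execute("INSERT INTO component_type VALUES (1, 'Quadrupole magnet', 'Q5')")
        db.execute("INSERT INTO component_instance VALUES (10, 1, 'SN-0042', 'Sector 27')")

        # "System knowledge" from relationships: where are instances of each catalog item?
        for row in db.execute("""
            SELECT t.name, i.serial_no, i.location
            FROM component_instance i JOIN component_type t USING (type_id)"""):
            print(row)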

  9. Designing Reliable Cohorts of Cardiac Patients across MIMIC and eICU

    PubMed Central

    Chronaki, Catherine; Shahin, Abdullah; Mark, Roger

    2016-01-01

    The design of the patient cohort is an essential and fundamental part of any clinical patient study. Knowledge of the electronic health records, the underlying database management system, and the relevant clinical workflows is central to an effective cohort design. However, with technical, semantic, and organizational interoperability limitations, the database queries associated with a patient cohort may need to be reconfigured at every participating site. i2b2 and SHRINE advance the notion of patient cohorts as first-class objects to be shared, aggregated, and recruited for research purposes across clinical sites. This paper reports on initial efforts to assess the integration of the Medical Information Mart for Intensive Care (MIMIC) and Philips eICU, two large-scale anonymized intensive care unit (ICU) databases, using standard terminologies, i.e. LOINC, ICD9-CM and SNOMED-CT. The focus of this work is on lab and microbiology observations and key demographics for patients with a primary cardiovascular ICD9-CM diagnosis. Results and discussion, reflecting on reference core terminology standards, offer insights on efforts to combine detailed intensive care data from multiple ICUs worldwide. PMID:27774488

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enders, Alexander L.; Lousteau, Angela L.

    The Desktop Analysis Reporting Tool (DART) is a software package that allows users to easily view and analyze daily files that span long periods. DART gives users the capability to quickly determine the state of health of a radiation portal monitor (RPM), troubleshoot and diagnose problems, and view data in various time frames to perform trend analysis. In short, it converts the data strings written in the daily files into meaningful tables and plots. The standalone version of DART (“soloDART”) utilizes a database engine that is included with the application; no additional installations are necessary. There is also a networked version of DART (“polyDART”) that is designed to maximize the benefit of a centralized data repository while distributing the workload to individual desktop machines. This networked approach requires a more complex database manager, Structured Query Language (SQL) Server; however, SQL Server is not currently provided with DART. Regardless of which version is used, DART will import daily files from RPMs, store the relevant data in its database, and produce reports for status, trend analysis, and reporting purposes.

  11. The Fleet Application for Scheduling and Tracking (FAST) Management Website

    NASA Technical Reports Server (NTRS)

    Marrero-Perez, Radames J.

    2014-01-01

    The FAST application was designed to replace the paper-and-pen method of checking GSA vehicles in and out at KSC. By moving from a paper-based checkout system to a fully digital one, the resources wasted on printing checkout forms have been reduced, the time users and fleet managers spend interacting with the system has been cut significantly, and record accuracy for each vehicle has improved. Vehicle information is pulled from a centralized database server in the SPSDL. To add a new feature to the FAST application, the author of this report (alongside the FAST developers) has been designing and developing the FAST Management Website. Previously, GSA fleet managers had to rely on the FAST developers to add new vehicles, edit vehicles and previous transactions, or generate vehicle reports. With an easy-to-use FAST Management Website portal, GSA fleet managers are now able to easily move vehicles, edit records, and print reports.

  12. WOVOdat: A New Tool for Managing and Accessing Data of Worldwide Volcanic Unrest

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Malone, S. D.; Newhall, C. G.

    2002-12-01

    WOVOdat (World Organization of Volcano Observatories database of volcanic unrest) will for the first time bring together data of worldwide volcanic seismicity, ground deformation, fumarolic activity, and other changes within or adjacent to a volcanic system. Although a large body of data and experience has been built over the past century, we currently have no means of accessing that collective experience for use during crises and for research. WOVOdat will be the central resource of a data management system; other components will include utilities for data input and archiving, structured data retrieval, and data mining; educational modules; and links to institutional databases such as IRIS (global seismicity), UNAVCO (global GPS coordinates and strain vectors), and Smithsonian's Global Volcanism Program (historical eruptions). Data will be geospatially and time-referenced, to provide four-dimensional images of how volcanic systems respond to magma intrusion, regional strain, and other disturbances prior to and during eruption. As part of the design phase, a small WOVOdat team is currently collecting information from observatories about their data types, formats, and local data management. The database schema is being designed such that responses to common, yet complex, queries are rapid (e.g., where else has similar unrest occurred and what was the outcome?) while also allowing for more detailed research analysis of relationships between various parameters (e.g., what do temporal relations between long-period earthquakes, transient deformation, and spikes in gas emission tell us about the geometry and physical properties of magma and a volcanic edifice?). We are excited by the potential of WOVOdat, and we invite participation in its design and development. Next steps involve formalizing and testing the design, and developing utilities for translating data of various formats into common formats. The large job of populating the database will follow, and eventually we will have a great new tool for eruption forecasting and research.
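
    The example queries above suggest what the schema must support. The sketch below is purely hypothetical (the real WOVOdat schema was still under design when this abstract was written), but it shows the flavor of a "where else has similar unrest occurred, and what was the outcome?" lookup:

        # Invented schema and data, for illustration only.
        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
        CREATE TABLE unrest_episode (
            volcano   TEXT, start_date TEXT,
            lp_events_per_day REAL,      -- long-period seismicity rate
            uplift_mm_per_mo  REAL,      -- deformation rate
            outcome   TEXT               -- e.g. 'eruption', 'intrusion only'
        );
        INSERT INTO unrest_episode VALUES
            ('Pinatubo',    '1991-04-02', 120, 15, 'eruption'),
            ('Long Valley', '1997-06-01',  40, 10, 'intrusion only'),
            ('Rabaul',      '1994-09-17', 150, 20, 'eruption');
        """)

        # Find past episodes whose unrest rates bracket a current observation.
        current = {"lp": 100, "uplift": 12}
        for row in db.execute("""
            SELECT volcano, start_date, outcome FROM unrest_episode
            WHERE lp_events_per_day BETWEEN :lp * 0.5 AND :lp * 2
              AND uplift_mm_per_mo  BETWEEN :uplift * 0.5 AND :uplift * 2""", current):
            print(row)   # -> Pinatubo and Rabaul, both 'eruption'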

  13. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal.

    PubMed

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M Juliana; Hural, John

    2014-07-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure that viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control, and sample quality data is uploaded directly to the database by the central laboratory. Four-year cumulative data covering 23,477 blood draws reveals an average fresh PBMC yield of 1.45 × 10^6 ± 0.48 × 10^6 cells per milliliter of usable whole blood. 95% of samples were within the acceptable range for fresh cell yield of 0.8-3.2 × 10^6 cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day 2 thawed viability of 83.1% and a recovery of 67.5%. Since then, four-year cumulative data covering 3338 specimens used in immunologic assays shows that 99.88% had acceptable viabilities (>66%) for use in cellular assays (mean, 91.46% ± 4.5%), and 96.2% had acceptable recoveries (50%-130%), with a mean recovery of 85.8% ± 19.12% of the originally cryopreserved cells. EQC testing revealed that since August 2009, failed recoveries dropped from 4.1% to 1.6% and failed viabilities dropped from 1.0% to 0.3%. The HVTN PBMC quality program provides for laboratory assessment, training and tools for identifying problems, implementing corrective action and monitoring for improvements. These data support the benefits of implementing a comprehensive, web-based PBMC quality program for large clinical trials networks. Copyright © 2014 Elsevier B.V. All rights reserved.
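
    The acceptance criteria quoted above (fresh yield 0.8-3.2 × 10^6 cells/ml, thawed viability > 66%, recovery 50-130%) translate directly into a record-level check of the kind a central database might apply to uploaded site data. This is an illustration using only the thresholds stated in the abstract, not the HVTN's actual code:

        # Flag a specimen record against the stated PBMC acceptance ranges.
        def qc_flags(fresh_yield_per_ml, thawed_viability_pct, recovery_pct):
            """Return a list of QC failures for one specimen record (empty = pass)."""
            flags = []
            if not 0.8e6 <= fresh_yield_per_ml <= 3.2e6:
                flags.append("fresh yield out of range")
            if thawed_viability_pct <= 66:
                flags.append("viability too low for cellular assays")
            if not 50 <= recovery_pct <= 130:
                flags.append("recovery out of range")
            return flags

        print(qc_flags(1.45e6, 91.5, 85.8))   # typical values from the report -> []
        print(qc_flags(0.5e6, 60.0, 140.0))   # -> three flags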

  14. Extending GIS Technology to Study Karst Features of Southeastern Minnesota

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Tipping, R. G.; Alexander, E. C.; Alexander, S. C.

    2001-12-01

    This paper summarizes ongoing research on karst feature distribution of southeastern Minnesota. The main goals of this interdisciplinary research are: 1) to look for large-scale patterns in the rate and distribution of sinkhole development; 2) to conduct statistical tests of hypotheses about the formation of sinkholes; 3) to create management tools for land-use managers and planners; and 4) to deliver geomorphic and hydrogeologic criteria for making scientifically valid land-use policies and ethical decisions in karst areas of southeastern Minnesota. Existing county and sub-county karst feature datasets of southeastern Minnesota have been assembled into a large GIS-based database capable of analyzing the entire data set. The central database management system (DBMS) is a relational GIS-based system interacting with three modules: GIS, statistical, and hydrogeologic. ArcInfo and ArcView were used to generate a series of 2D and 3D maps depicting karst feature distributions in southeastern Minnesota. IRIS Explorer™ was used to produce 3D maps and animations from data exported from the GIS-based database. Nearest-neighbor analysis has been used to test sinkhole distributions in different topographic and geologic settings. All nearest-neighbor analyses to date indicate that sinkholes in southeastern Minnesota are not evenly distributed (i.e., they tend to be clustered). More detailed statistical methods such as cluster analysis, histograms, probability estimation, correlation and regression have been used to study the spatial distributions of some mapped karst features of southeastern Minnesota. A sinkhole probability map for Goodhue County has been constructed based on sinkhole distribution, bedrock geology, depth to bedrock, GIS buffer analysis and nearest-neighbor analysis. A series of karst features for Winona County, including sinkholes, springs, seeps, stream sinks and outcrops, has been mapped and entered into the Karst Feature Database of Southeastern Minnesota. The Karst Feature Database of Winona County is being expanded to include all the mapped karst features of southeastern Minnesota. Air photos from the 1930s to the 1990s of the Spring Valley Cavern area in Fillmore County were scanned and geo-referenced into our GIS system. This approach has proved very useful for identifying sinkholes and studying the rate of sinkhole development.

  15. Distributed computing for macromolecular crystallography

    PubMed Central

    Krissinel, Evgeny; Uski, Ville; Lebedev, Andrey; Ballard, Charles

    2018-01-01

    Modern crystallographic computing is characterized by the growing role of automated structure-solution pipelines, which represent complex expert systems utilizing a number of program components, decision makers and databases. They also require considerable computational resources and regular database maintenance, which is increasingly more difficult to provide at the level of individual desktop-based CCP4 setups. On the other hand, there is a significant growth in data processed in the field, which brings up the issue of centralized facilities for keeping both the data collected and structure-solution projects. The paradigm of distributed computing and data management offers a convenient approach to tackling these problems, which has become more attractive in recent years owing to the popularity of mobile devices such as tablets and ultra-portable laptops. In this article, an overview is given of developments by CCP4 aimed at bringing distributed crystallographic computations to a wide crystallographic community. PMID:29533240

  16. Distributed computing for macromolecular crystallography.

    PubMed

    Krissinel, Evgeny; Uski, Ville; Lebedev, Andrey; Winn, Martyn; Ballard, Charles

    2018-02-01

    Modern crystallographic computing is characterized by the growing role of automated structure-solution pipelines, which represent complex expert systems utilizing a number of program components, decision makers and databases. They also require considerable computational resources and regular database maintenance, which is increasingly more difficult to provide at the level of individual desktop-based CCP4 setups. On the other hand, there is a significant growth in data processed in the field, which brings up the issue of centralized facilities for keeping both the data collected and structure-solution projects. The paradigm of distributed computing and data management offers a convenient approach to tackling these problems, which has become more attractive in recent years owing to the popularity of mobile devices such as tablets and ultra-portable laptops. In this article, an overview is given of developments by CCP4 aimed at bringing distributed crystallographic computations to a wide crystallographic community.

  17. Database Searching by Managers.

    ERIC Educational Resources Information Center

    Arnold, Stephen E.

    Managers and executives need the easy and quick access to business and management information that online databases can provide, but many have difficulty articulating their search needs to an intermediary. One possible solution would be to encourage managers and their immediate support staff members to search textual databases directly as they now…

  18. 32 CFR 105.15 - Defense Sexual Assault Incident Database (DSAID).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 1 2013-07-01 2013-07-01 false Defense Sexual Assault Incident Database (DSAID... Sexual Assault Incident Database (DSAID). (a) Purpose. (1) In accordance with section 563 of Public Law... activities. It shall serve as a centralized, case-level database for the collection and maintenance of...

  19. 40 CFR 1400.13 - Read-only database.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Read-only database. 1400.13 Section... INFORMATION Other Provisions § 1400.13 Read-only database. The Administrator is authorized to establish... public off-site consequence analysis information by means of a central database under the control of the...

  20. 32 CFR 105.15 - Defense Sexual Assault Incident Database (DSAID).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 1 2014-07-01 2014-07-01 false Defense Sexual Assault Incident Database (DSAID... Sexual Assault Incident Database (DSAID). (a) Purpose. (1) In accordance with section 563 of Public Law... activities. It shall serve as a centralized, case-level database for the collection and maintenance of...

  1. 40 CFR 1400.13 - Read-only database.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Read-only database. 1400.13 Section... INFORMATION Other Provisions § 1400.13 Read-only database. The Administrator is authorized to establish... public off-site consequence analysis information by means of a central database under the control of the...

  2. 40 CFR 1400.13 - Read-only database.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Read-only database. 1400.13 Section... INFORMATION Other Provisions § 1400.13 Read-only database. The Administrator is authorized to establish... public off-site consequence analysis information by means of a central database under the control of the...

  3. 40 CFR 1400.13 - Read-only database.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Read-only database. 1400.13 Section... INFORMATION Other Provisions § 1400.13 Read-only database. The Administrator is authorized to establish... public off-site consequence analysis information by means of a central database under the control of the...

  4. Mission and Assets Database

    NASA Technical Reports Server (NTRS)

    Baldwin, John; Zendejas, Silvino; Gutheinz, Sandy; Borden, Chester; Wang, Yeou-Fang

    2009-01-01

    Mission and Assets Database (MADB) Version 1.0 is an SQL database system with a Web user interface to centralize information. The database stores flight project support resource requirements, view periods, antenna information, schedule, and forecast results for use in mid-range and long-term planning of Deep Space Network (DSN) assets.

  5. Management information system of medical equipment using mobile devices

    NASA Astrophysics Data System (ADS)

    Núñez, C.; Castro, D.

    2011-09-01

    The large number of technologies currently incorporated into mobile devices makes them excellent tools for capturing and managing information, thanks to increasing computing power and storage that allow many different applications to be added. To exploit these technologies in the biomedical engineering field, we developed a mobile information system for medical equipment management. The system's central platform is a mobile phone that, through a connection to a web server, can send and receive information about any piece of medical equipment. By decoding a type of barcode known as QR codes, the management process is simplified and improved. These barcodes identify each piece of medical equipment in a database; when a code is photographed and decoded with the mobile device, relevant information about the equipment in question can be accessed. In its current state, the project is a basic support tool for the maintenance of medical equipment. It is also a modern, competitive, and economical alternative in today's market.
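
    The lookup flow described above - photograph a QR code, decode it, query the server - can be sketched as follows, assuming the decoded payload is simply an equipment identifier (the schema and all names here are illustrative, not the authors'):

        # Hypothetical server-side lookup keyed by a decoded QR payload.
        import sqlite3

        db = sqlite3.connect(":memory:")
        db.execute("""CREATE TABLE equipment (
            equipment_id TEXT PRIMARY KEY, name TEXT, location TEXT, next_service TEXT)""")
        db.execute("INSERT INTO equipment VALUES "
                   "('EQ-0917', 'Infusion pump', 'ICU bed 4', '2011-12-01')")

        def lookup(decoded_qr_payload):
            """What the web server would return after the mobile device decodes a code."""
            row = db.execute("SELECT * FROM equipment WHERE equipment_id = ?",
                             (decoded_qr_payload,)).fetchone()
            return row or "unknown equipment"

        print(lookup("EQ-0917"))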

  6. Generalized Database Management System Support for Numeric Database Environments.

    ERIC Educational Resources Information Center

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  7. EPA Facility Registry Service (FRS): CERCLIS

    EPA Pesticide Factsheets

    This data provides location and attribute information on facilities regulated under the Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) for an intranet web feature service. The data provided in this service are obtained from EPA's Facility Registry Service (FRS). The FRS is an integrated source of comprehensive (air, water, and waste) environmental information about facilities, sites or places. This service connects directly to the FRS database to provide this data as a feature service. FRS creates high-quality, accurate, and authoritative facility identification records through rigorous verification and management procedures that incorporate information from program national systems, state master facility records, data collected from EPA's Central Data Exchange registrations and data management personnel. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.

  8. Hand-held computer operating system program for collection of resident experience data.

    PubMed

    Malan, T K; Haffner, W H; Armstrong, A Y; Satin, A J

    2000-11-01

    To describe a system for recording resident experience involving hand-held computers with the Palm Operating System (3Com, Inc., Santa Clara, CA). Hand-held personal computers (PCs) are popular, easy to use, inexpensive, portable, and can share data with other operating systems. Residents in our program carry individual hand-held database computers to record Residency Review Committee (RRC) reportable patient encounters. Each resident's data is transferred to a single central relational database compatible with Microsoft Access (Microsoft Corporation, Redmond, WA). Patient data entry and subsequent transfer to a central database are accomplished with commercially available software that requires minimal computer expertise to implement and maintain. The central database can then be used for statistical analysis or to create required RRC resident experience reports. As a result, the data collection and transfer process takes less time for residents and program director alike than paper-based or central computer-based systems. The system of collecting resident encounter data using hand-held computers with the Palm Operating System is easy to use, relatively inexpensive, accurate, and secure. The user-friendly system provides prompt, complete, and accurate data, enhancing the education of residents while facilitating the job of the program director.

  9. Orthopaedic Footwear Design

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Although the need for orthopaedic shoes is increasing, the number of skilled shoemakers has declined. This has led to the development of a CAD/CAM system to design and fabricate orthopaedic footwear. The NASA-developed RIM database management system is the central repository for CUSTOMLAST's information storage. Several other modules also comprise the system. The project was initiated by Langley Research Center and Research Triangle Institute in cooperation with the Veterans Administration and the National Institute for Disability and Rehabilitation Research. Later development was done by North Carolina State University and the University of Missouri-Columbia. The software is licensed by both universities.

  10. 77 FR 43078 - Federal Acquisition Regulation; Information Collection; Central Contractor Registration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-23

    ...; Information Collection; Central Contractor Registration AGENCY: Department of Defense (DOD), General Services... requirement concerning the Central Contractor Registration database. Public comments are particularly invited... Information Collection 9000- 0159, Central Contractor Registration, by any of the following methods...

  11. 78 FR 12316 - Federal Acquisition Regulation; Information Collection; Central Contractor Registration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    ...; Information Collection; Central Contractor Registration AGENCIES: Department of Defense (DOD), General... collection requirement concerning the Central Contractor Registration database. A notice was published in the... Information Collection 9000- 0159, Central Contractor Registration, by any of the following methods...

  12. EPA Facility Registry Service (FRS): ICIS

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Integrated Compliance Information System (ICIS). When complete, ICIS will provide a database that will contain integrated enforcement and compliance information across most of EPA's programs. The vision for ICIS is to replace EPA's independent databases that contain enforcement data with a single repository for that information. Currently, ICIS contains all Federal Administrative and Judicial enforcement actions and a subset of the Permit Compliance System (PCS), which supports the National Pollutant Discharge Elimination System (NPDES). ICIS exchanges non-sensitive enforcement/compliance activities, non-sensitive formal enforcement actions and NPDES information with FRS. This web feature service contains the enforcement/compliance activities and formal enforcement action related facilities; the NPDES facilities are contained in the PCS_NPDES web feature service. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using rigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records, and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities.

  13. Electronic data collection for clinical trials using tablet and handheld PCs

    NASA Astrophysics Data System (ADS)

    Alaoui, Adil; Vo, Minh; Patel, Nikunj; McCall, Keith; Lindisch, David; Watson, Vance; Cleary, Kevin

    2005-04-01

    This paper describes a system that uses electronic forms to collect patient and procedure data for clinical trials. During clinical trials, patients are typically required to provide background information such as demographics and medical history, as well as review and complete any consent forms. Physicians or their assistants then usually have additional forms for recording technical data from the procedure and for gathering follow-up information from patients after completion of the procedure. This approach can lead to substantial amounts of paperwork to collect and manage over the course of a clinical trial with a large patient base. By using e-forms instead, data can be transmitted to a single, centralized database, reducing the problem of managing paper forms. Additionally, the system can provide a means for relaying information from the database to the physician on his/her portable wireless device, such as to alert the physician when a patient has completed the pre-procedure forms and is ready to begin the procedure. This feature could improve the workflow in busy clinical practices. In the future, the system could be expanded so physicians could use their portable wireless device to pull up entire hospital records and view other pre-procedure data and patient images.

  14. New approach to managing genital warts.

    PubMed

    Lopaschuk, Catharine C

    2013-07-01

    To summarize and determine the appropriate use for the new and old management tools for genital warts. The following databases were searched: MEDLINE, PubMed, EMBASE, Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, ACP Journal Club, and Trip. The bibliographies of retrieved papers were also reviewed. Clinical trials, qualitative review articles, consensus reports, and clinical practice guidelines were retrieved. Symptomatic warts are prevalent in at least 1% of the population between the ages of 15 and 49, with estimates of up to 50% of the population being infected with human papillomavirus at some point in their lifetime. Imiquimod and podophyllotoxin are 2 new treatments for external genital warts that are less painful and can be applied by patients at home. In addition, the quadrivalent human papillomavirus vaccine has been shown to be efficacious in preventing genital warts and cervical cancer. There is still a role for the older treatment methods in certain situations, such as intravaginal, urethral, anal, or recalcitrant warts; or for pregnant patients. The new treatments of external genital warts can reduce the pain of treatment and the number of office visits. Other treatment methods are still useful in certain situations.

  15. New approach to managing genital warts

    PubMed Central

    Lopaschuk, Catharine C.

    2013-01-01

    Abstract Objective To summarize and determine the appropriate use for the new and old management tools for genital warts. Sources of information The following databases were searched: MEDLINE, PubMed, EMBASE, Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, ACP Journal Club, and Trip. The bibliographies of retrieved papers were also reviewed. Clinical trials, qualitative review articles, consensus reports, and clinical practice guidelines were retrieved. Main message Symptomatic warts are prevalent in at least 1% of the population between the ages of 15 and 49, with estimates of up to 50% of the population being infected with human papillomavirus at some point in their lifetime. Imiquimod and podophyllotoxin are 2 new treatments for external genital warts that are less painful and can be applied by patients at home. In addition, the quadrivalent human papillomavirus vaccine has been shown to be efficacious in preventing genital warts and cervical cancer. There is still a role for the older treatment methods in certain situations, such as intravaginal, urethral, anal, or recalcitrant warts; or for pregnant patients. Conclusion The new treatments of external genital warts can reduce the pain of treatment and the number of office visits. Other treatment methods are still useful in certain situations. PMID:23851535

  16. Structure and content components of self-management interventions that improve health-related quality of life in people with inflammatory bowel disease: a systematic review, meta-analysis and meta-regression.

    PubMed

    Tu, Wenjing; Xu, Guihua; Du, Shizheng

    2015-10-01

    The purpose of this review was to identify and categorise the components of the content and structure of effective self-management interventions for patients with inflammatory bowel disease. Inflammatory bowel diseases are chronic gastrointestinal disorders impacting health-related quality of life. Although the efficacy of self-management interventions has been demonstrated in previous studies, the most effective components of the content and structure of these interventions remain unknown. A systematic review, meta-analysis and meta-regression of randomised controlled trials was used. A systematic search of six electronic databases, including PubMed, Embase, Cochrane Central Register of Controlled Trials, Web of Science, Cumulative Index of Nursing and Allied Health Literature and the Chinese Biomedical Literature Database, was conducted. Content analysis was used to categorise the components of the content and structure of effective self-management interventions for inflammatory bowel disease. Clinically important and statistically significant beneficial effects on health-related quality of life were explored by comparing the association between effect sizes and various components of self-management interventions, such as the presence or absence of specific content and different delivery methods. Fifteen randomised controlled trials were included in this review. Distance or remote self-management interventions demonstrated a larger effect size. However, there is no evidence for a positive effect associated with any specific content component of self-management interventions in adult patients with inflammatory bowel disease in general. The results showed that self-management interventions have positive effects on health-related quality of life in patients with inflammatory bowel disease, and distance or remote self-management programmes had better outcomes than other types of interventions. This review provides useful information to clinicians and researchers when determining components of effective self-management programmes for patients with inflammatory bowel disease. More high-quality randomised controlled trials are needed to test the results. © 2015 John Wiley & Sons Ltd.

  17. A central database for the Global Terrestrial Network for Permafrost (GTN-P)

    NASA Astrophysics Data System (ADS)

    Elger, Kirsten; Lanckman, Jean-Pierre; Lantuit, Hugues; Karlsson, Ævar Karl; Johannsson, Halldór

    2013-04-01

    The Global Terrestrial Network for Permafrost (GTN-P) is the primary international observing network for permafrost, sponsored by the Global Climate Observing System (GCOS) and the Global Terrestrial Observing System (GTOS) and managed by the International Permafrost Association (IPA). It monitors the Essential Climate Variable (ECV) permafrost, which consists of permafrost temperature and active-layer thickness, with the long-term goal of obtaining a comprehensive view of the spatial structure, trends, and variability of changes in the active layer and permafrost. The network's two international monitoring components are (1) CALM (Circumpolar Active Layer Monitoring) and (2) the Thermal State of Permafrost (TSP), which is made up of an extensive borehole network covering all permafrost regions. Both programs were thoroughly overhauled during the International Polar Year 2007-2008 and extended their coverage to provide a true circumpolar network stretching over both hemispheres. GTN-P has gained considerable visibility in the science community by providing the baseline against which models are globally validated and incorporated in climate assessments. Yet until now it has been operated on a voluntary basis, and it is being redesigned to meet the increasing expectations of the science community. To update the network's objectives and deliver the best possible products to the community, the IPA organized a workshop to define users' needs and requirements for the production, archival, storage and dissemination of the permafrost data products it manages. From the beginning, GTN-P data came with an open data policy and free data access via the World Wide Web. The existing data, however, are far from homogeneous: they are not yet optimized for databases, there is no framework for data reporting or archival, and data documentation is incomplete. As a result, and despite the utmost relevance of permafrost in the Earth's climate system, the data have not been used by as many researchers as the initiators of these global programs intended. The European Union project PAGE21 created opportunities to develop this central database for GTN-P data for the duration of the project and beyond. The database aims to be the one location where researchers can find data, metadata and information for all relevant parameters of a specific site. Each component of the Data Management System (DMS), including parameters, data levels and metadata formats, was developed in cooperation with GTN-P and the IPA. The general framework of the GTN-P DMS is based on an object-oriented model (OOM) and implemented in a spatial database. To ensure interoperability and enable potential inter-database search, field names follow international metadata standards. The outputs of the DMS will be tailored to the needs of the modeling community as well as those of other stakeholders. In particular, new products will be developed in partnership with the IPA and other relevant international organizations to raise awareness of permafrost in the policy-making arena. The DMS will be released to a broader public in May 2013, and we expect the first active data uploads - via an online interface - after the 2013 summer field season.

  18. Ottawa Panel Evidence-Based Clinical Practice Guidelines for Structured Physical Activity in the Management of Juvenile Idiopathic Arthritis.

    PubMed

    Cavallo, Sabrina; Brosseau, Lucie; Toupin-April, Karine; Wells, George A; Smith, Christine A; Pugh, Arlanna G; Stinson, Jennifer; Thomas, Roanne; Ahmed, Sara; Duffy, Ciarán M; Rahman, Prinon; Àlvarez-Gallardo, Inmaculada C; Loew, Laurianne; De Angelis, Gino; Feldman, Debbie Ehrmann; Majnemer, Annette; Gagnon, Isabelle J; Maltais, Désirée; Mathieu, Marie-Ève; Kenny, Glen P; Tupper, Susan; Whitney-Mahoney, Kristi; Bigford, Sarah

    2017-05-01

    To create guidelines focused on the use of structured physical activity (PA) in the management of juvenile idiopathic arthritis (JIA). A systematic literature search was conducted using the electronic databases Cochrane Central Register of Controlled Trials, MEDLINE (Ovid), EMBASE (Ovid), and the Physiotherapy Evidence Database for all studies related to PA programs for JIA from January 1966 until December 2014, and was updated in May 2015. Study selection was completed independently by 2 reviewers. Studies were included if they involved individuals aged ≤21 years diagnosed with JIA who were taking part in therapeutic exercise or other PA interventions whose effects on various disease-related outcomes were compared with those of a control group (eg, no PA program or activity of lower intensity). Two reviewers independently extracted information on interventions, comparators, outcomes, time period, and study design. The statistical analysis was reported using the Cochrane Collaboration methods. The quality of the included studies was assessed according to the Physiotherapy Evidence Database Scale. Five randomized controlled trials (RCTs) fit the selection criteria; of these, 4 were high-quality RCTs. The following recommendations were developed: (1) Pilates for improving quality of life, pain, functional ability, and range of motion (ROM) (grade A); (2) home exercise program for improving quality of life and functional ability (grade A); (3) aquatic aerobic fitness for decreasing the number of active joints (grade A); and (4) cardio-karate aerobic exercise for improving ROM and number of active joints (grade C+). The Ottawa Panel recommends the following structured exercises and physical activities for the management of JIA: Pilates, cardio-karate, and home and aquatic exercises. Pilates showed improvement in a higher number of outcomes. Copyright © 2017. Published by Elsevier Inc.

  19. ViPAR: a software platform for the Virtual Pooling and Analysis of Research Data.

    PubMed

    Carter, Kim W; Francis, Richard W; Carter, K W; Francis, R W; Bresnahan, M; Gissler, M; Grønborg, T K; Gross, R; Gunnes, N; Hammond, G; Hornig, M; Hultman, C M; Huttunen, J; Langridge, A; Leonard, H; Newman, S; Parner, E T; Petersson, G; Reichenberg, A; Sandin, S; Schendel, D E; Schalkwyk, L; Sourander, A; Steadman, C; Stoltenberg, C; Suominen, A; Surén, P; Susser, E; Sylvester Vethanayagam, A; Yusof, Z

    2016-04-01

    Research studies exploring the determinants of disease require sufficient statistical power to detect meaningful effects. Sample size is often increased through centralized pooling of disparately located datasets, though ethical, privacy and data ownership issues can often hamper this process. Methods that facilitate the sharing of research data, are sympathetic to these issues, and allow flexible and detailed statistical analyses are therefore critically needed. We have created a software platform for the Virtual Pooling and Analysis of Research data (ViPAR), which employs free and open source methods to provide researchers with a web-based platform to analyse datasets housed in disparate locations. Database federation permits controlled access to remotely located datasets from a central location. The Secure Shell protocol allows data to be securely exchanged between devices over an insecure network. ViPAR combines these free technologies into a solution that facilitates 'virtual pooling', where data can be temporarily pooled into computer memory and made available for analysis without the need for permanent central storage. Within the ViPAR infrastructure, remote sites manage their own harmonized research dataset in a database hosted at their site, while a central server hosts the data federation component and a secure analysis portal. When an analysis is initiated, requested data are retrieved from each remote site and virtually pooled at the central site. The data are then analysed by statistical software and, on completion, results of the analysis are returned to the user and the virtually pooled data are removed from memory. ViPAR is a secure, flexible and powerful analysis platform built on open source technology that is currently in use by large international consortia, and is made publicly available at [http://bioinformatics.childhealthresearch.org.au/software/vipar/]. © The Author 2015. Published by Oxford University Press on behalf of the International Epidemiological Association.
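
    The "virtual pooling" workflow lends itself to a short illustration. The sketch below assumes hypothetical hostnames, file paths, and column names (the real platform adds database federation and a secure analysis portal on top of this idea): each site's harmonized dataset is fetched over SSH/SFTP, the frames are pooled in memory, an example analysis runs, and the pooled data are discarded.

        # Minimal virtual-pooling sketch; hosts, paths, credentials and
        # column names are hypothetical.
        import io
        import pandas as pd
        import paramiko

        SITES = [
            ("site-a.example.org", "/data/harmonized.csv"),
            ("site-b.example.org", "/data/harmonized.csv"),
        ]

        frames = []
        for host, path in SITES:
            ssh = paramiko.SSHClient()
            ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
            ssh.connect(host, username="vipar")   # key-based auth assumed
            sftp = ssh.open_sftp()
            fh = sftp.open(path)
            frames.append(pd.read_csv(io.BytesIO(fh.read())))
            fh.close()
            sftp.close()
            ssh.close()

        pooled = pd.concat(frames, ignore_index=True)        # pooled only in memory
        print(pooled.groupby("exposure")["outcome"].mean())  # example analysis
        del pooled, frames                                   # nothing persists centrally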

  20. Building Databases for Education. ERIC Digest.

    ERIC Educational Resources Information Center

    Klausmeier, Jane A.

    This digest provides a brief explanation of what a database is; explains how a database can be used; identifies important factors that should be considered when choosing database management system software; and provides citations to sources for finding reviews and evaluations of database management software. The digest is concerned primarily with…

  1. Review of parental activation interventions for parents of children with special health care needs.

    PubMed

    Mirza, M; Krischer, A; Stolley, M; Magaña, S; Martin, M

    2018-05-01

    A large number of U.S. children are identified as having special health care needs (CSHCN). Despite parents' central role in managing their child's needs, many parents report difficulties in navigating service systems, finding information about their child's condition, and accessing health care and community resources. Therefore, there is a need for interventions that "activate" parents of children with special health care needs to increase their knowledge, skills, and confidence in managing, coordinating, and advocating for their child's needs. This study sought to review the existing literature and examine the effects of parent support interventions that focus on parental activation either in part or whole, on child, parent, or family outcomes. Specific aims included (a) summarizing the nature and content of interventions; (b) describing changes in relevant outcomes; (c) identifying limitations and making recommendations for future research. The following electronic databases were searched: MEDLINE, EMBASE, PsycINFO via ProQuest, PubMed, Cumulative Index to Nursing and Allied Health via EBSCO, Education Resources Information Center (ERIC) via ProQuest, The Cochrane Library (Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, Cochrane Methodology Register), and Google Scholar. Twenty-two studies were selected, data were extracted, and quality was assessed using standardized procedures. Five intervention categories were identified: parent-to-parent supports, psycho-educational groups, content-specific groups, community health worker model, and self-management-based interventions. Although most studies showed positive effects of the intervention, evidence was inconsistent for parental outcomes such as self-efficacy, confidence, strain, depression, and perceived social support. Evidence was more consistent in showing improvement in parent coping and in use of community-based services and resources. There is a need to boost active ingredients of interventions that specifically target enhancing parent skill sets relevant to areas of self-efficacy, confidence, and empowerment. Future studies must also adapt intervention and study design to recruit socioeconomically vulnerable families. © 2018 John Wiley & Sons Ltd.

  2. A public HTLV-1 molecular epidemiology database for sequence management and data mining.

    PubMed

    Araujo, Thessika Hialla Almeida; Souza-Brito, Leandro Inacio; Libin, Pieter; Deforche, Koen; Edwards, Dustin; de Albuquerque-Junior, Antonio Eduardo; Vandamme, Anne-Mieke; Galvao-Castro, Bernardo; Alcantara, Luiz Carlos Junior

    2012-01-01

    It is estimated that 15 to 20 million people are infected with the human T-cell lymphotropic virus type 1 (HTLV-1). At present, there are more than 2,000 unique HTLV-1 isolate sequences published. A central database to aggregate sequence information from a range of epidemiological aspects including HTLV-1 infections, pathogenesis, origins, and evolutionary dynamics would be useful to scientists and physicians worldwide. Here, we describe a database we developed that collects and annotates sequence data and can be accessed through a user-friendly search interface. The HTLV-1 Molecular Epidemiology Database website is available at http://htlv1db.bahia.fiocruz.br/. All data were obtained from publications available at GenBank or through contact with the authors. The database was developed using Apache Webserver 2.1.6 and the MySQL database management system. The webpage interfaces were developed in HTML, with server-side scripting written in PHP. The HTLV-1 Molecular Epidemiology Database is hosted on the Gonçalo Moniz/FIOCRUZ Research Center server. There are currently 2,457 registered sequences with 2,024 (82.37%) of those sequences representing unique isolates. Of these sequences, 803 (39.67%) contain information about clinical status (TSP/HAM, 17.19%; ATL, 7.41%; asymptomatic, 12.89%; other diseases, 2.17%; and no information, 60.32%). Further, 7.26% of sequences contain information on patient gender while 5.23% of sequences provide the age of the patient. The HTLV-1 Molecular Epidemiology Database retrieves and stores annotated HTLV-1 proviral sequences from clinical, epidemiological, and geographical studies. The collected sequences and related information are now accessible on a publicly available and user-friendly website. This open-access database will support clinical research and vaccine development related to viral genotype.
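
    The annotation model implied by the abstract (isolates carrying clinical status, sex, age, and geography, linked to sequences) can be sketched as a small relational schema. The sketch below uses SQLite for self-containment and invented table and column names; the production system runs MySQL behind PHP.

        # Hypothetical schema sketch for annotated HTLV-1 sequences.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE isolate (
            isolate_id      INTEGER PRIMARY KEY,
            accession       TEXT UNIQUE,   -- GenBank accession
            country         TEXT,          -- geographic origin
            clinical_status TEXT,          -- TSP/HAM, ATL, asymptomatic, ...
            sex             TEXT,
            age             INTEGER
        );
        CREATE TABLE sequence (
            sequence_id INTEGER PRIMARY KEY,
            isolate_id  INTEGER REFERENCES isolate(isolate_id),
            region      TEXT,              -- e.g. LTR, env, tax
            bases       TEXT               -- the nucleotide sequence
        );
        """)

        # Example query behind a search interface: sequences by clinical status.
        rows = con.execute(
            "SELECT accession, region FROM sequence "
            "JOIN isolate USING (isolate_id) WHERE clinical_status = ?",
            ("TSP/HAM",),
        ).fetchall()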

  3. Database Management: Building, Changing and Using Databases. Collected Papers and Abstracts of the Mid-Year Meeting of the American Society for Information Science (15th, Portland, Oregon, May 1986).

    ERIC Educational Resources Information Center

    American Society for Information Science, Washington, DC.

    This document contains abstracts of papers on database design and management which were presented at the 1986 mid-year meeting of the American Society for Information Science (ASIS). Topics considered include: knowledge representation in a bilingual art history database; proprietary database design; relational database design; in-house databases;…

  4. The Network Configuration of an Object Relational Database Management System

    NASA Technical Reports Server (NTRS)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) require developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object-relational database management system (DBMS). Distributed processing splits work between the database server and client application programs. The DBMS handles all the responsibilities of the server, while the workstations running the database application concentrate on the interpretation and display of data.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bower, J.C.; Burford, M.J.; Downing, T.R.

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Nuclear and Chemical Agency (USANCA). The IBS Data Management Guide provides the background, as well as the operations and procedures needed to generate and maintain a site-specific map database. Data and system managers use this guide to manage the data files and database that support the administrative, user-environment, database management, and operational capabilities of the IBS. This document provides a description of the data files and structures necessary for running the IBS software and using the site map database.

  6. Building an R&D chemical registration system.

    PubMed

    Martin, Elyette; Monge, Aurélien; Duret, Jacques-Antoine; Gualandi, Federico; Peitsch, Manuel C; Pospisil, Pavel

    2012-05-31

    Small molecule chemistry is of central importance to a number of R&D companies in diverse areas such as the pharmaceutical, nutraceutical, food flavoring, and cosmeceutical industries. In order to store and manage thousands of chemical compounds in such an environment, we have built a state-of-the-art master chemical database with unique structure identifiers. Here, we present the concept and methodology we used to build the system that we call the Unique Compound Database (UCD). In the UCD, each molecule is registered only once (uniqueness), structures with alternative representations are entered in a uniform way (normalization), and the chemical structure drawings are recognizable to chemists and to a cartridge. In brief, structural molecules are entered as neutral entities which can be associated with a salt. The salts are listed in a dictionary and bound to the molecule with the appropriate stoichiometric coefficient in an entity called "substance". The substances are associated with batches. Once a molecule is registered, some properties (e.g., ADMET prediction, IUPAC name, chemical properties) are calculated automatically. The UCD has both automated and manual data controls. Moreover, the UCD concept enables the management of user errors in the structure entry by reassigning or archiving the batches. It also allows updating of the records to include newly discovered properties of individual structures. As our research spans a wide variety of scientific fields, the database enables registration of mixtures of compounds, enantiomers, tautomers, and compounds with unknown stereochemistries.
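
    The registration model described above (a unique neutral molecule, a salt dictionary, a "substance" binding the two with a stoichiometric coefficient, and batches attached to substances) translates directly into a relational sketch. The table and column names below are illustrative, and SQLite stands in for the real system's database and chemistry cartridge.

        # Hypothetical sketch of the UCD-style registration hierarchy.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE molecule (
            molecule_id INTEGER PRIMARY KEY,
            structure   TEXT UNIQUE NOT NULL  -- normalized structure, e.g. canonical SMILES
        );
        CREATE TABLE salt (
            salt_id INTEGER PRIMARY KEY,
            name    TEXT UNIQUE NOT NULL      -- entry in the salt dictionary
        );
        CREATE TABLE substance (
            substance_id INTEGER PRIMARY KEY,
            molecule_id  INTEGER NOT NULL REFERENCES molecule(molecule_id),
            salt_id      INTEGER REFERENCES salt(salt_id),  -- NULL for the free form
            salt_coeff   REAL DEFAULT 0,      -- stoichiometric coefficient
            UNIQUE (molecule_id, salt_id, salt_coeff)
        );
        CREATE TABLE batch (
            batch_id     INTEGER PRIMARY KEY,
            substance_id INTEGER NOT NULL REFERENCES substance(substance_id),
            supplier     TEXT
        );
        """)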

  7. [Selected aspects of computer-assisted literature management].

    PubMed

    Reiss, M; Reiss, G

    1998-01-01

    We report our experiences with a bibliographic database manager. Bibliography database managers are used to manage information resources: specifically, to maintain a database of references and to create bibliographies and reference lists for written works. A database manager allows the user to enter summary information (a record) for articles, book sections, books, dissertations, conference proceedings, and so on. Other features may include the ability to import references from different sources, such as MEDLINE. The word processing components generate reference lists and bibliographies in a variety of styles, and can build a reference list directly from a word processor manuscript. The function and use of the software package EndNote 2 for Windows are described. Its advantages in fulfilling different requirements for citation style and the sort order of reference lists are emphasized.

  8. Establishment of an international database for genetic variants in esophageal cancer.

    PubMed

    Vihinen, Mauno

    2016-10-01

    The establishment of a database has been suggested in order to collect, organize, and distribute genetic information about esophageal cancer. The World Organization for Specialized Studies on Diseases of the Esophagus and the Human Variome Project will be in charge of a central database of information about esophageal cancer-related variations from publications, databases, and laboratories; in addition to genetic details, clinical parameters will also be included. The aim will be to get all the central players in research, clinical, and commercial laboratories to contribute. The database will follow established recommendations and guidelines. The database will require a team of dedicated curators with different backgrounds. Numerous layers of systematics will be applied to facilitate computational analyses. The data items will be extensively integrated with other information sources. The database will be distributed as open access to ensure exchange of the data with other databases. Variations will be reported in relation to reference sequences on three levels (DNA, RNA, and protein) whenever applicable. In the first phase, the database will concentrate on genetic variations including both somatic and germline variations for susceptibility genes. Additional types of information can be integrated at a later stage. © 2016 New York Academy of Sciences.

  9. Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.

    ERIC Educational Resources Information Center

    Gutmann, Myron P.; And Others

    1989-01-01

    Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programing tasks involved are not for beginners, but the rewards of having data organized are worthwhile. (GG)

  10. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  11. Effect of self-acupressure for symptom management: a systematic review.

    PubMed

    Song, Hyun Jin; Seo, Hyun-Ju; Lee, Heeyoung; Son, Heejeong; Choi, Sun Mi; Lee, Sanghun

    2015-02-01

    To assess the efficacy and safety of self-administered acupressure to alleviate symptoms of various health problems, including allergic disease, cancer, respiratory disease, dysmenorrhea, perceived stress, insomnia, and sleep disturbances. We searched core, Korean, Chinese, and Japanese databases, including Ovid-MEDLINE, Ovid-EMBASE, the Cochrane Central Register of Controlled Trials (CENTRAL), the Cumulative Index to Nursing and Allied Health Literature (CINAHL), six representative electronic Korean medical databases, China National Knowledge Infrastructure (CNKI), and Japan Science and Technology Information Aggregator (J-STAGE). We included randomized controlled trials (RCTs) and quasi-RCTs that examined disease-specific effects or symptom relief, adverse reactions, and quality-of-life (QOL) for self-administered acupressure. Data collection and assessment of the methodological quality of the included studies were conducted by two independent reviewers. Eight RCTs and two quasi-RCTs showed positive effects and safety of self-acupressure therapy in clinically diverse populations. Quality assessment revealed moderate quality for the RCTs, with 50% or more of the trials assessed as presenting a low risk of bias in seven domains. All of the selected 10 studies reported positive effects for primary outcomes of self-acupressure therapy for symptom management, including significant improvements in symptom scores in allergic disease, nausea and vomiting in cancer, symptom scores in respiratory disease, pain symptoms in dysmenorrhea, and stress/fatigue scores and sleep disturbances in healthy people. Our findings suggest that self-administered acupressure shows promise to alleviate the symptoms of various health problems. Therefore, further research with larger samples and methodologically well-designed RCTs is required to establish the efficacy of self-administered acupressure. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Understanding and Analyzing Latency of Near Real-time Satellite Data

    NASA Astrophysics Data System (ADS)

    Han, W.; Jochum, M.; Brust, J.

    2016-12-01

    Acquiring and disseminating time-sensitive satellite data in a timely manner is a major concern for researchers and decision makers in weather forecasting, severe weather warning, disaster and emergency response, environmental monitoring, and related fields. Understanding and analyzing the latency of near real-time satellite data is useful for exploring the whole data transmission flow, identifying possible issues, and better connecting data providers and users. The STAR (Center for Satellite Applications and Research of NOAA) Central Data Repository (SCDR) is a central repository that acquires, manipulates, and disseminates various types of near real-time satellite datasets to internal and external users. In this system, important timestamps, including observation beginning/end, processing, uploading, downloading, and ingestion, are retrieved and organized in the database, so the length of each transmission phase can be computed easily. The open-source NoSQL database MongoDB was selected to manage the timestamp information because of its dynamic schema and its aggregation and data processing features. A user-friendly interface was developed to visualize and characterize the latency interactively. Taking the Himawari-8 HSD (Himawari Standard Data) file as an example, the data transmission phases, including creating the HSD file from satellite observation, uploading the file to HimawariCloud, updating the file link in the webpage, and downloading and ingesting the file to SCDR, are worked out from the above-mentioned timestamps. The latencies can be observed by time period, day of week, or hour of day in chart or table format, and anomalous latencies can be detected and reported through the user interface. Latency analysis gives data providers and users actionable insight into how to improve the transmission of near real-time satellite data and enhance its acquisition and management.
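
    The per-phase latency computation described above maps naturally onto MongoDB's aggregation pipeline. The sketch below is illustrative only; the database, collection, and field names are hypothetical, not SCDR's actual schema. It subtracts consecutive timestamps to get phase lengths and averages them per dataset.

        # Per-phase latency via MongoDB aggregation (hypothetical schema).
        from pymongo import MongoClient

        client = MongoClient("mongodb://localhost:27017")  # server URL is an assumption
        files = client["scdr"]["files"]                    # names are hypothetical

        pipeline = [
            # Date subtraction yields milliseconds; divide for seconds.
            {"$project": {
                "dataset": 1,
                "create_s":   {"$divide": [{"$subtract": ["$created",    "$obs_end"]},   1000]},
                "upload_s":   {"$divide": [{"$subtract": ["$uploaded",   "$created"]},   1000]},
                "download_s": {"$divide": [{"$subtract": ["$downloaded", "$uploaded"]},  1000]},
                "ingest_s":   {"$divide": [{"$subtract": ["$ingested",   "$downloaded"]},1000]},
            }},
            # Average each phase per dataset to see where latency accumulates.
            {"$group": {
                "_id": "$dataset",
                "avg_create":   {"$avg": "$create_s"},
                "avg_upload":   {"$avg": "$upload_s"},
                "avg_download": {"$avg": "$download_s"},
                "avg_ingest":   {"$avg": "$ingest_s"},
            }},
        ]
        for row in files.aggregate(pipeline):
            print(row)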

  13. Online molecular image repository and analysis system: A multicenter collaborative open-source infrastructure for molecular imaging research and application.

    PubMed

    Rahman, Mahabubur; Watabe, Hiroshi

    2018-05-01

    Molecular imaging serves as an important tool for researchers and clinicians to visualize and investigate complex biochemical phenomena using specialized instruments; these instruments are either used individually or in combination with targeted imaging agents to obtain images related to specific diseases with high sensitivity, specificity, and signal-to-noise ratios. However, molecular imaging, which is a multidisciplinary research field, faces several challenges, including the integration of imaging informatics with bioinformatics and medical informatics, the need for reliable and robust image analysis algorithms, effective quality control of imaging facilities, and challenges related to individualized disease mapping, data sharing, software architecture, and knowledge management. As a cost-effective and open-source approach to addressing these challenges, we developed a flexible, transparent, and secure infrastructure, named MIRA (Molecular Imaging Repository and Analysis), built primarily with the Python programming language and a MySQL relational database system deployed on a Linux server. MIRA is designed with a centralized image archiving infrastructure and information database so that a multicenter collaborative informatics platform can be built. Its ability to handle metadata, normalize image file formats, and store and view different types of documents and multimedia files makes MIRA considerably flexible. With features like logging, auditing, commenting, sharing, and searching, MIRA is useful as an electronic laboratory notebook for effective knowledge management. In addition, the centralized approach provides on-the-fly remote access to all of MIRA's features through any web browser. Furthermore, the open-source approach provides the opportunity for sustainable continued development. MIRA offers an infrastructure that can be used as a cross-boundary collaborative molecular imaging research platform for rapid advances in cancer diagnosis and therapeutics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Development of expert systems for analyzing electronic documents

    NASA Astrophysics Data System (ADS)

    Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.

    2018-05-01

    The paper analyses a database management system (DBMS). Expert systems, databases, and database technology have become essential components of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS, the most significant tool developed to serve multiple users in a database environment, consists of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans and Alfresco, together with system components previously developed at Tomsk State University in the Faculty of Innovative Technology.

  15. Preparing College Students To Search Full-Text Databases: Is Instruction Necessary?

    ERIC Educational Resources Information Center

    Riley, Cheryl; Wales, Barbara

    Full-text databases allow Central Missouri State University's clients to access some of the serials that libraries have had to cancel due to escalating subscription costs; EbscoHost, the subject of this study, is one such database. The database is available free to all Missouri residents. A survey was designed consisting of 21 questions intended…

  16. Resident database interfaces to the DAVID system, a heterogeneous distributed database management system

    NASA Technical Reports Server (NTRS)

    Moroh, Marsha

    1988-01-01

    A methodology was developed for building interfaces from resident database management systems to the DAVID system, a heterogeneous distributed database management system under development at NASA. The feasibility of the methodology was demonstrated by constructing the software necessary to perform the interface task. The interface terminology developed in the course of this research is presented, and the work performed and the results are summarized.

  17. An ethnobotanical analysis of parasitic plants (Parijibi) in the Nepal Himalaya.

    PubMed

    O'Neill, Alexander Robert; Rana, Santosh Kumar

    2016-02-24

    Indigenous biocultural knowledge is a vital part of Nepalese environmental management strategies; however, much of it may soon be lost given Nepal's rapidly changing socio-ecological climate. This is particularly true for knowledge surrounding parasitic and mycoheterotrophic plant species, which are well represented throughout the Central-Eastern Himalayas but lack a collated record. Our study addresses this disparity by analyzing parasitic and mycoheterotrophic plant species diversity in Nepal as well as the ethnobotanical knowledge that surrounds them. Botanical texts, online databases, and herbarium records were reviewed to create an authoritative compendium of parasitic and mycoheterotrophic plant species native or naturalized to the Nepal Central-Eastern Himalaya. Semi-structured interviews were then conducted with 141 informants to better understand the biocultural context of these species, emphasizing ethnobotanical uses, in 12 districts of Central-Eastern Nepal. Nepal is a hotspot of botanical diversity, housing 15 families and 29 genera of plants that exhibit parasitic or mycoheterotrophic habit. Over 150 of the known 4500 parasitic plant species (~3 %) and 28 of the 160 mycoheterotrophic species (~18 %) are native or naturalized to Nepal; 13 of our surveyed parasitic species are endemic. Of all species documented, approximately 17 % of parasitic and 7 % of mycoheterotrophic plants have ethnobotanical uses as medicine (41 %), fodder (23 %), food (17 %), ritual objects (11 %), or material (8 %). Parasitic and mycoheterotrophic plant species exhibit high diversity in the Nepal Central-Eastern Himalaya and are the fodder for biocultural relationships that may help inform future environmental management projects in the region.

  18. An adaptable XML based approach for scientific data management and integration

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Thiel, Florian; Furrer, Daniel; Vergara-Niedermayr, Cristobal; Qin, Chen; Hackenberg, Georg; Bourgue, Pierre-Emmanuel; Kaltschmidt, David; Wang, Mo

    2008-03-01

    Increased complexity of scientific research poses new challenges to scientific data management. Meanwhile, scientific collaboration is becoming increasingly important, and it relies on integrating and sharing data from distributed institutions. We developed SciPort, a Web-based platform for scientific data management and integration, built on a distributed architecture with a central server, where researchers can easily collect, publish, and share their complex scientific data across multiple institutions. SciPort provides an XML-based general approach to modeling complex scientific data by representing them as XML documents. The documents capture not only hierarchically structured data but also images and raw data through references. In addition, SciPort provides an XML-based hierarchical organization of the overall data space to make quick browsing convenient. To provide generality, schemas and hierarchies are customizable through XML-based definitions, so the system can be quickly adapted to different applications. While each institution can manage documents independently on a Local SciPort Server, selected documents can be published to a Central Server to form a global view of shared data across all sites. By storing documents in a native XML database, SciPort provides high schema extensibility and supports comprehensive queries through XQuery. By providing a unified and effective means for data modeling, data access and customization with XML, SciPort offers a flexible and powerful platform for sharing scientific data among research communities, and it has been used successfully in both biomedical research and clinical trials.

  19. An Adaptable XML Based Approach for Scientific Data Management and Integration.

    PubMed

    Wang, Fusheng; Thiel, Florian; Furrer, Daniel; Vergara-Niedermayr, Cristobal; Qin, Chen; Hackenberg, Georg; Bourgue, Pierre-Emmanuel; Kaltschmidt, David; Wang, Mo

    2008-02-20

    Increased complexity of scientific research poses new challenges to scientific data management. Meanwhile, scientific collaboration is becoming increasingly important, and it relies on integrating and sharing data from distributed institutions. We developed SciPort, a Web-based platform for scientific data management and integration, built on a distributed architecture with a central server, where researchers can easily collect, publish, and share their complex scientific data across multiple institutions. SciPort provides an XML-based general approach to modeling complex scientific data by representing them as XML documents. The documents capture not only hierarchically structured data but also images and raw data through references. In addition, SciPort provides an XML-based hierarchical organization of the overall data space to make quick browsing convenient. To provide generality, schemas and hierarchies are customizable through XML-based definitions, so the system can be quickly adapted to different applications. While each institution can manage documents independently on a Local SciPort Server, selected documents can be published to a Central Server to form a global view of shared data across all sites. By storing documents in a native XML database, SciPort provides high schema extensibility and supports comprehensive queries through XQuery. By providing a unified and effective means for data modeling, data access and customization with XML, SciPort offers a flexible and powerful platform for sharing scientific data among research communities, and it has been used successfully in both biomedical research and clinical trials.
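
    The XML document model is easy to illustrate: structured data are captured hierarchically, while images and raw data enter by reference. The element names below are hypothetical, and Python's ElementTree with XPath-style paths stands in for the native XML database and XQuery that SciPort actually uses.

        # Illustrative XML document with data by value and images by reference.
        import xml.etree.ElementTree as ET

        doc = ET.Element("document", {"site": "local-server-1"})
        subject = ET.SubElement(doc, "subject")
        ET.SubElement(subject, "age").text = "54"
        ET.SubElement(subject, "diagnosis").text = "glioma"
        # Images and raw data are captured by reference, not embedded.
        ET.SubElement(doc, "imageRef", {"href": "file:///archive/scan_001.dcm"})

        # XPath-style retrieval of all image references in the document.
        for ref in doc.findall(".//imageRef"):
            print(ref.get("href"))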

  20. Computer Security Products Technology Overview

    DTIC Science & Technology

    1988-10-01

    …this paper addresses fall into the areas of multi-user hosts, database management systems (DBMS), workstations, networks, guards and gateways, and…provide a portion of that protection, for example, a password scheme, a file protection mechanism, a secure database management system, or even a…

  1. An Introduction to Database Management Systems.

    ERIC Educational Resources Information Center

    Warden, William H., III; Warden, Bette M.

    1984-01-01

    Description of database management systems for microcomputers highlights system features and factors to consider in microcomputer system selection. A method for ranking database management systems is explained and applied to a defined need, i.e., software support for indexing a weekly newspaper. A glossary of terms and 32-item bibliography are…

  2. Medical Management of Oral Lichen Planus: A Systematic Review

    PubMed Central

    Chokshi, Krunal; Desai, Sachin; Malu, Rahul; Chokshi, Achala

    2016-01-01

    Introduction Oral Lichen Planus (OLP) is a chronic inflammatory, T-cell-mediated autoimmune oral mucosal disease with unclear aetiology. The clinical management of OLP poses considerable difficulties to the oral physician. Aim The aim was to assess the efficacy of any form of intervention used to medically manage OLP. Materials and Methods We searched and analysed the following databases (from January 1990 to December 2014): Cochrane Oral Health Group Trials Register, Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE and EMBASE. All randomised controlled trials (RCTs) for the medical management of OLP which compared active treatment with placebo or between active treatments were considered in this systematic review. Participants of any age, gender or race having symptomatic OLP (including mixed forms), unconnected to any identifiable cause (e.g., lichenoid drug reactions) and confirmed by histopathology were included. Interventions of all types, including topical treatments or systemic drugs of variable dosage, duration and frequency of delivery, were considered. All the trials identified were appraised by five review authors, and the data for all the trials were synthesised using a specifically designed data extraction form. Binary data are presented as risk ratios (RR) with 95% confidence intervals (CI) and continuous data as mean differences (MD) with 95% CIs. Results A total of 35 RCTs were included in this systematic review on the medical management of OLP. The included RCTs showed no strong evidence suggesting the superiority of any specific intervention in reducing pain and clinical signs of OLP. Conclusion Future RCTs on a larger scale, adopting standardized outcome assessment parameters, should be considered. PMID:27042598

  3. Multicriteria analysis using open-source data and software for the implementation of a centralized biomedical waste management system in a developing country (Guinea, Conakry).

    NASA Astrophysics Data System (ADS)

    Pérez Peña, José Vicente; Baldó, Mane; Acosta, Yarci; Verschueren, Laurent; Thibaud, Kenmognie; Bilivogui, Pépé; Jean-Paul Ngandu, Alain; Beavogui, Maoro

    2017-04-01

    In the last decade the increasing interest in public health has promoted specific regulations for the transport, storage, transformation and/or elimination of potentially toxic waste. A special concern is the effective management of biomedical waste, due to the environmental and health risks associated with it. The first stage of effectively managing this waste is selecting the best sites for facilities for its storage and/or elimination. Best-site selection is accomplished by means of multi-criteria decision analyses (MCDA) that aim to minimize the social and environmental impact and to maximize management efficiency. In this work we present a methodology that uses open-source software and data to analyze the best locations for implementing a centralized waste management system in a developing country (Guinea, Conakry). We applied an analytic hierarchy process (AHP) using different thematic layers such as land use, soil type, distance to and type of roads, hydrography, and distance to densely populated areas. Land-use data were derived from up-to-date Sentinel 2 remote sensing images, whereas roads and hydrography were obtained from the OpenStreetMap database and later validated against administrative data. We performed the AHP analysis with the aid of the open-source QGIS geographic information system. This methodology is very effective for developing countries, as it uses open-source software and data for the MCDA analysis, thus reducing costs in these first stages of the integrated analysis.
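
    The AHP step at the heart of this methodology reduces a pairwise comparison matrix over criteria to priority weights via its principal eigenvector. A minimal sketch follows; the criteria and comparison values are invented for illustration and are not the study's actual judgments.

        # AHP priority weights from a pairwise comparison matrix.
        import numpy as np

        criteria = ["land use", "distance to roads", "distance to population"]
        # pairwise[i, j] = how much more important criterion i is than j
        pairwise = np.array([
            [1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(pairwise)
        idx = np.argmax(eigvals.real)
        principal = np.abs(eigvecs[:, idx].real)
        weights = principal / principal.sum()

        for name, w in zip(criteria, weights):
            print(f"{name}: {w:.3f}")

        # Consistency index (low values indicate coherent judgments).
        n = pairwise.shape[0]
        ci = (eigvals.real.max() - n) / (n - 1)
        print("CI:", round(ci, 4))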

  4. Geochronology Database for Central Colorado

    USGS Publications Warehouse

    Klein, T.L.; Evans, K.V.; deWitt, E.H.

    2010-01-01

    This database is a compilation of published and some unpublished isotopic and fission track age determinations in central Colorado. The compiled area extends from the southern Wyoming border to the northern New Mexico border and from approximately the longitude of Denver on the east to Gunnison on the west. Data for the tephrochronology of Pleistocene volcanic ash, carbon-14, Pb-alpha, common-lead, and U-Pb determinations on uranium ore minerals have been excluded.

  5. Virtual Queue in a Centralized Database Environment

    NASA Astrophysics Data System (ADS)

    Kar, Amitava; Pal, Dibyendu Kumar

    2010-10-01

    Today is the era of the Internet. Whether it is gathering knowledge, planning a holiday, or booking a ticket, almost everything can be done over the Internet. This paper calculates different queuing measures when a booking or purchase is made through the Internet, subject to limits on the number of tickets or seats. Such transactions involve many database activities, such as reads and writes. The paper treats the time involved in requesting a service as the arrival process and the time involved in providing the required information as the service process, and from these it estimates the arrival and service distributions and the various queuing measures. For the sake of simplicity the database is considered a centralized database, as the alternative concept of a distributed database would complicate the calculation.
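
    For readers who want concrete numbers for the queuing measures the paper refers to, the textbook M/M/1 formulas can be computed directly. The sketch below assumes an M/M/1 model (Poisson arrivals, exponential service, single server); the distributions the paper actually fits may differ.

        # Standard M/M/1 queuing measures.
        def mm1_measures(lam: float, mu: float) -> dict:
            """lam = arrival rate, mu = service rate (requests per unit time)."""
            assert lam < mu, "queue is unstable unless lam < mu"
            rho = lam / mu                    # server utilization
            return {
                "utilization": rho,
                "L":  rho / (1 - rho),        # mean number in system
                "Lq": rho**2 / (1 - rho),     # mean number waiting
                "W":  1 / (mu - lam),         # mean time in system
                "Wq": rho / (mu - lam),       # mean waiting time
            }

        # e.g. 8 booking requests/min against a server completing 10/min:
        print(mm1_measures(8.0, 10.0))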

  6. The use of gabapentin in the management of postoperative pain after total hip arthroplasty: a meta-analysis of randomised controlled trials.

    PubMed

    Han, Chao; Li, Xiao-Dan; Jiang, Hong-Qiang; Ma, Jian-Xiong; Ma, Xin-Long

    2016-07-12

    Pain management after total hip arthroplasty (THA) varies and has been widely studied in recent years. Gabapentin, a third-generation antiepileptic drug that selectively affects the nociceptive process, has been used for pain relief after THA. This meta-analysis was conducted to examine the efficacy of gabapentin in THA. An electronic search was conducted using the following databases: PubMed, EMBASE, Ovid MEDLINE, ClinicalTrials.gov, and the Cochrane Central Register of Controlled Trials (CENTRAL). Randomised controlled trials (RCTs) comparing gabapentin with a placebo for THA were included. The meta-analysis was performed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Five trials met the inclusion criteria. Cumulative narcotic consumption and visual analogue scale (VAS) scores at 24 and 48 h postoperatively were used for postoperative pain assessment. There was a significant decrease in morphine consumption at 24 h (P = 0.00). Compared with the control group, the VAS score (at rest) at 48 h was lower in the gabapentin group (P = 0.00). The administration of gabapentin is effective in decreasing postoperative narcotic consumption and the VAS score.

  7. Biocontainment, biosecurity, and security practices in beef feedyards.

    PubMed

    Brandt, Aric W; Sanderson, Michael W; DeGroot, Brad D; Thomson, Dan U; Hollis, Larry C

    2008-01-15

    To determine the biocontainment, biosecurity, and security practices at beef feedyards in the Central Plains of the United States. Survey. Managers of feedyards in Colorado, Kansas, Nebraska, Oklahoma, and Texas that feed beef cattle for finish before slaughter; feedyards had to have an active concentrated animal feeding operation permit with a 1-time capacity of ≥ 1,000 cattle. A voluntary survey of feedyard personnel was conducted. Identified feedyard personnel were interviewed and responses regarding facility design, security, employees, disease preparedness, feedstuffs, hospital or treatment systems, sanitation, cattle sources, handling of sick cattle, and disposal of carcasses were collected in a database questionnaire. The survey was conducted for 106 feedyards with a 1-time capacity that ranged from 1,300 to 125,000 cattle. Feedyards in general did not have high implementation of biocontainment, biosecurity, or security practices. Smaller feedyards were, in general, less likely to use good practices than were larger feedyards. Results of the survey provided standard practices for biocontainment, biosecurity, and security in feedyards located in Central Plains states. Information gained from the survey results can be used by consulting veterinarians and feedyard managers as a basis for discussion and to target training efforts.

  8. [Role and management of cancer clinical database in the application of gastric cancer precision medicine].

    PubMed

    Li, Yuanfang; Zhou, Zhiwei

    2016-02-01

    Precision medicine is a new medical concept and medical model based on personalized medicine, the rapid progress of genome sequencing technology, and the cross-application of bioinformatics and big data science. Precision medicine improves the diagnosis and treatment of gastric cancer through deeper analyses of its characteristics, pathogenesis, and other core issues. A cancer clinical database is important for promoting the development of precision medicine, so it is necessary to pay close attention to its construction and management. The clinical database of Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank and a medical imaging database. To ensure the good quality of the database, its design and management follow a strict standard operating procedure (SOP) model. Data sharing is an important way to advance medical research in the era of medical big data; the construction and management of clinical databases must therefore also be strengthened and innovated.

  9. Completeness of the disease recording systems for dairy cows in Denmark, Finland, Norway and Sweden with special reference to clinical mastitis

    PubMed Central

    2012-01-01

    Background In the Nordic countries Denmark, Finland, Norway and Sweden, the majority of dairy herds are covered by disease recording systems, in general based on veterinary registration of diagnoses and treatments. Disease data are submitted to the national cattle databases where they are combined with, e.g., production data at cow level, and used for breeding programmes, advisory work and herd health management. Previous studies have raised questions about the quality of the disease data. The main aim of this study was to examine the country-specific completeness of the disease data, regarding clinical mastitis (CM) diagnosis, in each of the national cattle databases. A second aim was to estimate country-specific CM incidence rates (IRs). Results Over 4 months in 2008, farmers in the four Nordic countries recorded clinical diseases in their dairy cows. Their registrations were matched to registrations in the central cattle databases. The country-specific completeness of disease registrations was calculated as the proportion of farmer-recorded cases that could be found in the central database. The completeness (95% confidence interval) for veterinary-supervised cases of CM was 0.94 (0.92, 0.97), 0.56 (0.48, 0.64), 0.82 (0.75, 0.90) and 0.78 (0.70, 0.85) in Denmark, Finland, Norway and Sweden, respectively. The completeness of registration of all CM cases, which includes all cases noted by farmers, regardless of whether the cows were seen or treated by a veterinarian or not, was 0.90 (0.87, 0.93), 0.51 (0.43, 0.59), 0.75 (0.67, 0.83) and 0.67 (0.60, 0.75), respectively, in the same countries. The IRs, estimated by Poisson regression in cases per 100 cow-years, based on the farmers’ recordings, were 46.9 (41.7, 52.7), 38.6 (34.2, 43.5), 31.3 (27.2, 35.9) and 26.2 (23.2, 26.9), respectively, which was between 20% (DK) and 100% (FI) higher than the IRs based on recordings in the central cattle databases. Conclusions The completeness for veterinary-supervised cases of CM was considerably less than 100% in all four Nordic countries and differed between countries. Hence, the number of CM cases in dairy cows is underestimated. This has an impact on all areas where the disease data are used. PMID:22866606
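
    The completeness measure used here, the proportion of farmer-recorded cases that can be matched in the central cattle database, is straightforward to compute. The sketch below uses invented counts (not the study's raw data) and a normal-approximation 95% confidence interval.

        # Completeness of disease registration with an approximate 95% CI.
        import math

        def completeness(matched: int, farmer_recorded: int):
            p = matched / farmer_recorded
            se = math.sqrt(p * (1 - p) / farmer_recorded)
            return p, (p - 1.96 * se, p + 1.96 * se)

        # Illustrative numbers only:
        p, ci = completeness(matched=188, farmer_recorded=200)
        print(f"completeness = {p:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")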

  10. 77 FR 39687 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-05

    ..., Form, and OMB Number: Defense Sexual Assault Incident Database (DSAID); OMB Control Number 0704-0482... sexual assault data collected by the Military Services. This database shall be a centralized, case-level database for the uniform collection of data regarding incidence of sexual assaults involving persons...

  11. 48 CFR 404.1103 - Procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... employees shall not enter information into the Central Contractor Registration (CCR) database on behalf of... be advised to submit a written application to CCR for registration into the CCR database. USDA... registered in the CCR database shall be done via the CCR Internet Web site http://www.ccr.gov. This...

  12. 48 CFR 404.1103 - Procedures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... employees shall not enter information into the Central Contractor Registration (CCR) database on behalf of... be advised to submit a written application to CCR for registration into the CCR database. USDA... registered in the CCR database shall be done via the CCR Internet Web site http://www.ccr.gov. This...

  13. 48 CFR 404.1103 - Procedures.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... employees shall not enter information into the Central Contractor Registration (CCR) database on behalf of... be advised to submit a written application to CCR for registration into the CCR database. USDA... registered in the CCR database shall be done via the CCR Internet Web site http://www.ccr.gov. This...

  14. 48 CFR 204.1103 - Procedures.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... information as recorded in the Central Contractor Registration (CCR) database at the time of award. (2) When... record is active in the CCR database; and (ii) The contractor's Data Universal Numbering System (DUNS... database, the contracting officer shall process a novation or change-of-name agreement, or an address...

  15. 48 CFR 404.1103 - Procedures.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... employees shall not enter information into the Central Contractor Registration (CCR) database on behalf of... be advised to submit a written application to CCR for registration into the CCR database. USDA... registered in the CCR database shall be done via the CCR Internet Web site http://www.ccr.gov. This...

  16. 48 CFR 52.204-7 - Central Contractor Registration.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... (CCR) database means the primary Government repository for Contractor information required for the...) for the same concern. Registered in the CCR database means that— (1) The Contractor has entered all... Federal Funding Accountability and Transparency Act of 2006 (see subpart 4.14), into the CCR database; and...

  17. 48 CFR 204.1103 - Procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... information as recorded in the Central Contractor Registration (CCR) database at the time of award. (2) When... record is active in the CCR database; and (ii) The contractor's Data Universal Numbering System (DUNS... database, the contracting officer shall process a novation or change-of-name agreement, or an address...

  18. 48 CFR 404.1103 - Procedures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... employees shall not enter information into the Central Contractor Registration (CCR) database on behalf of... be advised to submit a written application to CCR for registration into the CCR database. USDA... registered in the CCR database shall be done via the CCR Internet Web site http://www.ccr.gov. This...

  19. 48 CFR 204.1103 - Procedures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... information as recorded in the Central Contractor Registration (CCR) database at the time of award. (2) When... record is active in the CCR database; and (ii) The contractor's Data Universal Numbering System (DUNS... database, the contracting officer shall process a novation or change-of-name agreement, or an address...

  20. Incorporating Aquatic Interspecies Toxicity Estimates into Large Databases: Model Evaluations and Data Gains

    EPA Science Inventory

    The Chemical Aquatic Fate and Effects (CAFE) database, developed by NOAA’s Emergency Response Division (ERD), is a centralized data repository that allows for unrestricted access to fate and effects data. While this database was originally designed to help support decisions...

  1. Architecture design of a generic centralized adjudication module integrated in a web-based clinical trial management system

    PubMed Central

    Zhao, Wenle; Pauls, Keith

    2015-01-01

    Background Centralized outcome adjudication has been used widely in multi-center clinical trials in order to prevent potential biases and to reduce variations in important safety and efficacy outcome assessments. Adjudication procedures could vary significantly among different studies. In practice, the coordination of outcome adjudication procedures in many multicenter clinical trials remains as a manual process with low efficiency and high risk of delay. Motivated by the demands from two large clinical trial networks, a generic outcome adjudication module has been developed by the network’s data management center within a homegrown clinical trial management system. In this paper, the system design strategy and database structure are presented. Methods A generic database model was created to transfer different adjudication procedures into a unified set of sequential adjudication steps. Each adjudication step was defined by one activate condition, one lock condition, one to five categorical data items to capture adjudication results, and one free text field for general comments. Based on this model, a generic outcome adjudication user interface and a generic data processing program were developed within a homegrown clinical trial management system to provide automated coordination of outcome adjudication. Results By the end of 2014, this generic outcome adjudication module had been implemented in 10 multicenter trials. A total of 29 adjudication procedures were defined with the number of adjudication steps varying from 1 to 7. The implementation of a new adjudication procedure in this generic module took an experienced programmer one or two days. A total of 7,336 outcome events had been adjudicated and 16,235 adjudication step activities had been recorded. In a multicenter trial, 1144 safety outcome event submissions went through a three-step adjudication procedure and reported a median of 3.95 days from safety event case report form submission to adjudication completion. In another trial, 277 clinical outcome events were adjudicated by a six-step procedure and took a median of 23.84 days from outcome event case report form submission to adjudication procedure completion. Conclusions A generic outcome adjudication module integrated in the clinical trial management system made the automated coordination of efficacy and safety outcome adjudication a reality. PMID:26464429
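
    The generic step model described in the Methods can be sketched as two tables: one defining each step's activate condition, lock condition, and up to five categorical items, and one recording step activities per outcome event. Table and column names below are illustrative, not the system's actual schema.

        # Hypothetical sketch of a generic adjudication step model.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE adjudication_step (
            procedure_id  INTEGER NOT NULL,
            step_no       INTEGER NOT NULL,  -- 1..7 in the deployed trials
            activate_cond TEXT NOT NULL,     -- when the step becomes available
            lock_cond     TEXT NOT NULL,     -- when the step becomes read-only
            item1 TEXT, item2 TEXT, item3 TEXT,  -- categorical result items
            item4 TEXT, item5 TEXT,
            PRIMARY KEY (procedure_id, step_no)
        );
        CREATE TABLE step_activity (
            event_id     INTEGER NOT NULL,   -- the outcome event under review
            procedure_id INTEGER NOT NULL,
            step_no      INTEGER NOT NULL,
            result1 TEXT, result2 TEXT, result3 TEXT,
            result4 TEXT, result5 TEXT,
            comments     TEXT,               -- free-text general comments
            completed_at TEXT
        );
        """)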

  2. The future application of GML database in GIS

    NASA Astrophysics Data System (ADS)

    Deng, Yuejin; Cheng, Yushu; Jing, Lianwen

    2006-10-01

    In 2004, the Geography Markup Language (GML) Implementation Specification (version 3.1.1) was published by the Open Geospatial Consortium, Inc. More and more applications in geospatial data sharing and interoperability now depend on GML. The primary purpose of GML is the exchange and transport of geo-information through standard modeling and encoding of geographic phenomena. However, problems arise in applications over how to organize and access large volumes of GML data effectively, and research on GML databases focuses on these problems. The effective storage of GML data is a hot topic in the GIS community today. A GML Database Management System (GDBMS) mainly deals with the storage and management of GML data. Two types of XML database are distinguished, namely native XML databases and XML-enabled databases. Since GML is an application of the XML standard to geographic data, XML database systems can also be used for the management of GML. In this paper, we review the state of the art of XML databases, including storage, indexing, query languages, and management systems, and then move on to the GML database. Finally, the future prospects of GML databases in GIS applications are presented.
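
    As a small illustration of what a GDBMS must store, here is a GML 3-style point geometry embedded in a feature document and parsed with namespace-aware paths. The feature elements are invented for this example, while gml:Point and gml:pos follow the GML schema.

        # Parse a GML point from a feature document (feature names invented).
        import xml.etree.ElementTree as ET

        GML = "http://www.opengis.net/gml"
        doc = f"""
        <city xmlns:gml="{GML}">
          <name>Beijing</name>
          <location>
            <gml:Point srsName="EPSG:4326"><gml:pos>39.9 116.4</gml:pos></gml:Point>
          </location>
        </city>
        """
        root = ET.fromstring(doc)
        pos = root.find(f".//{{{GML}}}Point/{{{GML}}}pos").text
        lat, lon = map(float, pos.split())
        print(lat, lon)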

  3. Development of a website and biobank database for the Nanosized Cancer Polymarker Biochip Project: a Multicenter Italian Experience.

    PubMed

    Leon, Antonette E; Fabricio, Aline S C; Benvegnù, Fabio; Michilin, Silvia; Secco, Annamaria; Spangaro, Omar; Meo, Sabrina; Gion, Massimo

    2011-01-01

    The Nanosized Cancer Polymarker Biochip Project (RBLA03S4SP) funded by an Italian MIUR-FIRB grant (Italian Ministry of University and Research - Investment Funds for Basic Research) has led to the creation of a free-access dynamic website, available at the web address https://serviziweb.ulss12.ve.it/firbabo, and of a centralized database with password-restricted access. The project network is composed of 9 research units (RUs) and has been active since 2005. The aim of the FIRB project was the design, production and validation of optoelectronic and chemoelectronic biosensors for the simultaneous detection of a novel class of cancer biomarkers associated with immunoglobulins of the M class (IgM) for early diagnosis of cancer. Biomarker immune complexes (BM-ICs) were assessed on samples of clinical cases and matched controls for breast, colorectal, liver, ovarian and prostate malignancies. This article describes in detail the architecture of the project website, the central database application, and the biobank developed for the FIRB Nanosized Cancer Polymarker Biochip Project. The article also illustrates many unique aspects that should be considered when developing a database within a multidisciplinary scenario. The main deliverables of the project were numerous, including the development of an online database which archived 1400 case report forms (700 cases and 700 matched controls) and more than 2700 experimental results relative to the BM-ICs assayed. The database also allowed for the traceability and retrieval of 21,000 aliquots archived in the centralized bank and stored as backup in the RUs, and for the development of a centralized biological bank in the coordinating unit with 6300 aliquots of serum. The constitution of the website and biobank database enabled optimal coordination of the RUs involved, highlighting the importance of sharing samples and scientific data in a multicenter setting for the achievement of the project goals.

  4. Database Systems. Course Three. Information Systems Curriculum.

    ERIC Educational Resources Information Center

    O'Neil, Sharon Lund; Everett, Donna R.

    This course is the third of seven in the Information Systems curriculum. The purpose of the course is to familiarize students with database management concepts and standard database management software. Databases and their roles, advantages, and limitations are explained. An overview of the course sets forth the condition and performance standard…

  5. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  6. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  7. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  8. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  9. Controlling air pollution in a city: A perspective from SOAR-PESTLE analysis.

    PubMed

    Gheibi, Mohammad; Karrabi, Mohsen; Mohammadi, Ali; Dadvar, Azin

    2018-04-16

    Strengths, opportunities, aspirations, and results (SOAR) analysis is a strategic planning framework that helps organizations focus on their current strengths and opportunities to create a vision of future aspirations and the results they will bring. PESTLE is an analytical framework for understanding external influences on a business. This research paper describes a field study and interviews of city hall managers from the city of Mashhad, Iran, conducted to investigate the application of SOAR and PESTLE frameworks for managing Mashhad's air pollution. Strategies are prioritized by the technique for order of preference by similarity to ideal solution (TOPSIS), Shannon entropy (SE), and analytic network process (ANP) multicriteria decision-making (MCDM) methods, considering economic conditions, managers' opinions, consensus, city council approvals, and national documents. The results of this research study show that creating centralized databases, supporting local governments, and developing smart city infrastructure, with weights of 0.194, 0.182, and 0.161, respectively, are the highest ranked strategies for managing air pollution in Mashhad. It can also be concluded that citizen involvement is key to achieving success in the employment of any management strategy. Integr Environ Assess Manag 2018;00:000-000. © 2018 SETAC.
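
    The TOPSIS ranking used to order the strategies can be sketched in a few lines: normalize the decision matrix, weight it, and score each alternative by closeness to the ideal solution. The matrix and weights below are invented for illustration and do not reproduce the study's data.

        # TOPSIS closeness scores (illustrative decision matrix and weights).
        import numpy as np

        strategies = ["centralized databases", "support local government", "smart city"]
        # rows = strategies, columns = criteria scores (all benefit criteria here)
        X = np.array([[7.0, 8.0, 6.0],
                      [6.0, 7.0, 7.0],
                      [5.0, 7.0, 8.0]])
        w = np.array([0.5, 0.3, 0.2])            # criteria weights (e.g. from SE/ANP)

        V = X / np.linalg.norm(X, axis=0) * w    # weighted normalized matrix
        ideal, anti = V.max(axis=0), V.min(axis=0)
        d_pos = np.linalg.norm(V - ideal, axis=1)
        d_neg = np.linalg.norm(V - anti,  axis=1)
        closeness = d_neg / (d_pos + d_neg)      # higher = closer to ideal

        for s, c in sorted(zip(strategies, closeness), key=lambda t: -t[1]):
            print(f"{s}: {c:.3f}")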

  10. Economic Evaluations in the Diagnosis and Management of Traumatic Brain Injury: A Systematic Review and Analysis of Quality.

    PubMed

    Alali, Aziz S; Burton, Kirsteen; Fowler, Robert A; Naimark, David M J; Scales, Damon C; Mainprize, Todd G; Nathens, Avery B

    2015-07-01

    Economic evaluations provide a unique opportunity to identify the optimal strategies for the diagnosis and management of traumatic brain injury (TBI), for which uncertainty is common and the economic burden is substantial. The objective of this study was to systematically review and examine the quality of contemporary economic evaluations in the diagnosis and management of TBI. Two reviewers independently searched MEDLINE, EMBASE, Cochrane Central Register of Controlled Trials, NHS Economic Evaluation Database, Health Technology Assessment Database, EconLit, and the Tufts CEA Registry for comparative economic evaluations published from 2000 onward (last updated on August 30, 2013). Data on methods, results, and quality were abstracted in duplicate. The results were summarized quantitatively and qualitatively. Of 3539 citations, 24 economic evaluations met our inclusion criteria. Nine were cost-utility, five were cost-effectiveness, three were cost-minimization, and seven were cost-consequences analyses. Only six studies were of high quality. Current evidence from high-quality studies suggests the economic attractiveness of the following strategies: a low medical threshold for computed tomography (CT) scanning of asymptomatic infants with possible inflicted TBI, selective CT scanning of adults with mild TBI as per the Canadian CT Head Rule, management of severe TBI according to the Brain Trauma Foundation guidelines, management of TBI in dedicated neurocritical care units, and early transfer of patients with TBI with nonsurgical lesions to neuroscience centers. Threshold-guided CT scanning, adherence to Brain Trauma Foundation guidelines, and care for patients with TBI, including those with nonsurgical lesions, in specialized settings appear to be economically attractive strategies. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  11. Type 2 diabetes–related foot care knowledge and foot self-care practice interventions in the United States: a systematic review of the literature

    PubMed Central

    Bonner, Timethia; Foster, Margaret; Spears-Lanoix, Erica

    2016-01-01

    Introduction The purpose of this systematic literature review is to review published studies on foot care knowledge and foot care practice interventions as part of diabetic foot care self-management interventions. Methods Medline, CINAHL, CENTRAL, and Cochrane Central Register of Controlled Trials databases were searched. References from the included studies were reviewed to identify any missing studies that could be included. Only foot care knowledge and foot care practice intervention studies that focused on the person living with type 2 diabetes were included in this review. Author, study design, sample, intervention, and results were extracted. Results Thirty studies met the inclusion criteria and were classified according to randomized controlled trial (n=9), survey design (n=13), cohort studies (n=4), cross-sectional studies (n=2), qualitative studies (n=2), and case series (n=1). Improving lower extremity complications associated with type 2 diabetes can be done through effective foot care interventions that include foot care knowledge and foot care practices. Conclusion Preventing these complications, understanding the risk factors, and having the ability to manage complications outside of the clinical encounter is an important part of a diabetes foot self-care management program. Interventions and research studies that aim to reduce lower extremity complications are still lacking. Further research is needed to test foot care interventions across multiple populations and geographic locations. PMID:26899439

  12. Type 2 diabetes-related foot care knowledge and foot self-care practice interventions in the United States: a systematic review of the literature.

    PubMed

    Bonner, Timethia; Foster, Margaret; Spears-Lanoix, Erica

    2016-01-01

    The purpose of this systematic literature review is to review published studies on foot care knowledge and foot care practice interventions as part of diabetic foot care self-management interventions. Medline, CINAHL, CENTRAL, and Cochrane Central Register of Controlled Trials databases were searched. References from the included studies were reviewed to identify any missing studies that could be included. Only foot care knowledge and foot care practice intervention studies that focused on the person living with type 2 diabetes were included in this review. Author, study design, sample, intervention, and results were extracted. Thirty studies met the inclusion criteria and were classified according to randomized controlled trial (n=9), survey design (n=13), cohort studies (n=4), cross-sectional studies (n=2), qualitative studies (n=2), and case series (n=1). Improving lower extremity complications associated with type 2 diabetes can be done through effective foot care interventions that include foot care knowledge and foot care practices. Preventing these complications, understanding the risk factors, and having the ability to manage complications outside of the clinical encounter is an important part of a diabetes foot self-care management program. Interventions and research studies that aim to reduce lower extremity complications are still lacking. Further research is needed to test foot care interventions across multiple populations and geographic locations.

  13. National Levee Database: monitoring, vulnerability assessment and management in Italy

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Camici, Stefania; Maccioni, Pamela; Moramarco, Tommaso

    2015-04-01

    A properly designed and constructed levee system can often be an effective means of repelling floodwaters and providing barriers against inundation to protect urbanized and industrial areas. However, the delineation of flood-prone areas and the related hydraulic hazard mapping that takes account of uncertainty (Apel et al., 2008) are usually developed with scarce consideration of the possible occurrence of levee failures along river channels (Mazzoleni et al., 2014). Indeed, it is well known that flooding is frequently the result of levee failures, which can be triggered by several factors: (1) overtopping, (2) scouring of the foundation, (3) seepage/piping through the levee body/foundation, and (4) sliding of the foundation. Among these failure mechanisms, which are influenced by the levee's geometrical configuration, hydraulic conditions (e.g. river level and seepage), and material properties (e.g. permeability, cohesion, porosity, compaction), piping caused by seepage (ICOLD, http://www.icold-cigb.org) is considered one of the most dominant (Colleselli, 1994; HRWC, 2003). The difficulty of estimating the hydraulic parameters needed to properly describe the seepage line within the body and foundation of the levee means that the study of critical flood wave routing is typically carried out by assuming that the levee system remains undamaged during the flood event. In this context, implementing and making operational a National Levee Database (NLD), effectively structured and continuously updated, becomes fundamental to providing a searchable inventory of levee information as a key resource supporting decisions and actions affecting levee safety. The ItaliaN LEvee Database (INLED) has recently been developed by the Research Institute for Geo-Hydrological Protection (IRPI) for the Civil Protection Department of the Presidency of the Council of Ministers. INLED focuses on collecting comprehensive information about Italian levees and historical breach failures, to be exploited in the framework of an operational procedure for the seepage vulnerability assessment of river reaches where the levee system is an important structural measure against flooding. INLED is a dynamic geospatial database, with ongoing efforts to add levee data from the authorities charged with hydraulic risk mitigation. In particular, the database aims to provide the available information about: i) location and condition of levees; ii) morphological and geometrical properties; iii) photographic documentation; iv) historical levee failures; v) assessment of vulnerability to overtopping and seepage, carried out through a procedure based on simple vulnerability indexes (Camici et al., 2014); vi) management, control and maintenance; vii) flood hazard maps developed by assuming the levee system undamaged/damaged during the flood event. Currently, INLED contains data on levees that are mostly located in the Tiber basin, Central Italy. References: Apel H., Merz B. & Thieken A.H. Quantification of uncertainties in flood risk assessments. Int J River Basin Manag 2008, 6(2), 149-162. Camici S., Barbetta S. & Moramarco T. Levee body vulnerability to seepage: the case study of the levee failure along the Foenna stream on 1st January 2006 (central Italy). Journal of Flood Risk Management, in press. Colleselli F. Geotechnical problems related to river and channel embankments. Rotterdam, the Netherlands: Springer, 1994. H. R. Wallingford Consultants (HRWC). Risk assessment for flood and coastal defence for strategic planning: high level methodology technical report, London, 2003. Mazzoleni M., Bacchi B., Barontini S., Di Baldassarre G., Pilotti M. & Ranzi R. Flooding hazard mapping in floodplain areas affected by piping breaches in the Po River, Italy. J Hydrol Eng 2014, 19(4), 717-731.

  14. Systematic review of the links between human resource management practices and performance.

    PubMed

    Patterson, M; Rick, J; Wood, S; Carroll, C; Balain, S; Booth, A

    2010-10-01

    In recent years human resource management (HRM) has been seen as an important factor in the successful realisation of organisational change programmes. The UK NHS is undergoing substantial organisational change and there is a need to establish which human resource (HR) initiatives may be most effective. To assess the results from a wide-ranging series of systematic reviews of the evidence on HRM and performance. The first part assesses evidence on use of HRM in the UK and fidelity of practice implemented. The second part considers evidence for the impact of HRM practices on intermediate outcomes, which can impact on final outcomes, such as organisational performance or patient care. The following databases were searched: Applied Social Sciences Index and Abstracts (ASSIA), British Nursing Index (BNI), Business Source Premier, Campbell Collaboration, Cochrane Central Register of Controlled Trials (CENTRAL), Cochrane Database of Systematic Reviews (CDSR), Cumulative Index to Nursing and Allied Health Literature (CINAHL), Database of Abstracts of Reviews of Effectiveness (DARE), DH-Data, EMBASE, Health Management Information Consortium (HMIC), International Bibliography of the Social Sciences (IBSS), King's Fund database, MEDLINE, NHS Economic Evaluation Database (NHS EED), National Research Register (NRR), PREMEDLINE, PsycINFO, ReFeR, Social Sciences Citation Index (SSCI) and Science Citation Index (SCI). The searches were conducted in May/June 2006. Broad categories of HRM interventions and intermediate outcomes were generated: 10 HRM categories and 12 intermediate outcome categories. Seven patient final outcomes were derived from the NHS Performance Indicators and the NHS Improvement Plan. The quality criteria used to select papers incorporated a longitudinal study design filter to provide evidence of the causal direction of relationships between HRM and relevant outcomes. Single HRM practices were considered. Within the health-specific literature, focus was on the impact of HRM on patient outcomes. Information is presented on the reliability of measures in each of the intermediate outcome areas. Work design practices that enhance employee autonomy and control influenced a number of outcomes and there was consistent evidence for the positive impact of increased job control on employee outcomes, such as job satisfaction, absence and health. For employee participation, the small number of studies reviewed supported the involvement of employees in design/implementation of changes that affect their work. In health literature in particular, employee involvement through quality improvement teams resulted in improved patient outcomes. Findings were positive for the impact of training on the intended outcomes of the initiatives. Support for the impact of performance management practices was apparent, in particular feedback on performance outcomes and the use of participative goal setting. Strong associations were found among all intermediate outcomes, and the relationships between most intermediate behaviours and outcomes were significant. Limited evidence was available on the use of HRM and on the implementation of policy. Also, the specific practices studied within each HRM category differed, so there was little evidence to show whether similar practices have the same effects in health and non-health settings. Some potentially effective practices for both health and non-health areas were identified, and HRM methods could be used to support change processes within the NHS; the findings relating to work organisation are particularly promising with regard to changes in methods of service delivery. Using training to support the implementation of change is highlighted. However, future multilevel studies that embrace the individual, team and organisational level are needed. Studies should look into interventions aimed at improving HR outcomes and performance, and allow for pre- and post-intervention measurement of practices and outcomes.

  15. Evolution of grid-wide access to database resident information in ATLAS using Frontier

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Bujor, F.; de Stefano, J.; Dewhurst, A. L.; Dykstra, D.; Front, D.; Gallas, E.; Gamboa, C. F.; Luehring, F.; Walker, R.

    2012-12-01

    The ATLAS experiment deployed Frontier technology worldwide during the initial year of LHC collision data taking to enable user analysis jobs running on the Worldwide LHC Computing Grid to access database resident data. Since that time, the deployment model has evolved to optimize resources, improve performance, and streamline maintenance of Frontier and related infrastructure. In this presentation we focus on the specific changes in the deployment and improvements undertaken, such as the optimization of cache and launchpad location, the use of RPMs for more uniform deployment of underlying Frontier-related components, improvements in monitoring, optimization of fail-over, and an increasing use of a centrally managed database containing site-specific information (for configuration of services and monitoring). In addition, analysis of Frontier logs has given us a deeper understanding of problematic queries and of use cases. Use of the system has grown beyond user analysis and subsystem-specific tasks such as calibration and alignment, extending into production processing areas, such as initial reconstruction and trigger reprocessing. With a more robust and tuned system, we are better equipped to satisfy the still growing number of diverse clients and the demands of increasingly sophisticated processing and analysis.
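
    The core idea behind Frontier-style access (serving database query results over HTTP so that proxy caches absorb the many identical requests issued by grid jobs) can be illustrated with a generic read-through cache. The sketch below is a schematic stand-in under our own naming, not the actual Frontier protocol or its launchpad interface; in a real deployment the caching role is played by squid proxies.

```python
import hashlib

class ReadThroughCache:
    """Schematic stand-in for a Frontier-style caching layer:
    identical queries are served from cache; misses go to the database."""

    def __init__(self, backend):
        self.backend = backend   # callable that really executes a query
        self.store = {}          # the role played by proxy caches in a real system

    def query(self, sql):
        key = hashlib.sha256(sql.encode()).hexdigest()  # cache key from query text
        if key not in self.store:
            self.store[key] = self.backend(sql)         # cache miss: hit the database
        return self.store[key]

calls = []
def fake_db(sql):
    calls.append(sql)            # count real database hits
    return [("conditions", 42)]

cache = ReadThroughCache(fake_db)
cache.query("SELECT * FROM alignment WHERE run = 1")
cache.query("SELECT * FROM alignment WHERE run = 1")  # served from cache
print(len(calls))  # -> 1: the database saw the query only once
```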

  16. PDS: A Performance Database Server

    DOE PAGES

    Berry, Michael W.; Dongarra, Jack J.; Larose, Brian H.; ...

    1994-01-01

    The process of gathering, archiving, and distributing computer benchmark data is a cumbersome task usually performed by computer users and vendors with little coordination. Most important, there is no publicly available central depository of performance data for all ranges of machines from personal computers to supercomputers. We present an Internet-accessible performance database server (PDS) that can be used to extract current benchmark data and literature. As an extension to the X-Windows-based user interface (Xnetlib) to the Netlib archival system, PDS provides an on-line catalog of public domain computer benchmarks such as the LINPACK benchmark, Perfect benchmarks, and the NAS parallel benchmarks. PDS does not reformat or present the benchmark data in any way that conflicts with the original methodology of any particular benchmark; it is thereby devoid of any subjective interpretations of machine performance. We believe that all branches (research laboratories, academia, and industry) of the general computing community can use this facility to archive performance metrics and make them readily available to the public. PDS can provide a more manageable approach to the development and support of a large dynamic database of published performance metrics.

  17. NEIS (NASA Environmental Information System)

    NASA Technical Reports Server (NTRS)

    Cook, Beth

    1995-01-01

    The NASA Environmental Information System (NEIS) is a tool to support the functions of the NASA Operational Environment Team (NOET). The NEIS is designed to provide a central environmental technology resource drawing on all NASA centers' capabilities, and to support program managers who must ultimately deliver hardware compliant with performance specifications and environmental requirements. The NEIS also tracks environmental regulations, usages of materials and processes, and new technology developments. It has proven to be a useful instrument for channeling information throughout the aerospace community, NASA, other federal agencies, educational institutions, and contractors. The associated paper will discuss the dynamic databases within the NEIS, and the usefulness it provides for environmental compliance efforts.

  18. High rate information systems - Architectural trends in support of the interdisciplinary investigator

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Preheim, Larry E.

    1990-01-01

    Data systems requirements in the Earth Observing System (EOS) and Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived to be a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory and collaboration services.

  19. 48 CFR 52.204-7 - Central Contractor Registration.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... (CCR) database means the primary Government repository for Contractor information required for the...) for the same concern. Registered in the CCR database means that— (1) The Contractor has entered all mandatory information, including the DUNS number or the DUNS+4 number, into the CCR database; and (2) The...

  20. Combining new technologies for effective collection development: a bibliometric study using CD-ROM and a database management program.

    PubMed Central

    Burnham, J F; Shearer, B S; Wall, J C

    1992-01-01

    Librarians have used bibliometrics for many years to assess collections and to provide data for making selection and deselection decisions. With the advent of new technology--specifically, CD-ROM databases and reprint file database management programs--new cost-effective procedures can be developed. This paper describes a recent multidisciplinary study conducted by two library faculty members and one allied health faculty member to test a bibliometric method that used the MEDLINE and CINAHL databases on CD-ROM and the Papyrus database management program to produce a new collection development methodology. PMID:1600424

  1. Endovascular intervention for central venous cannulation in patients with vascular occlusion after previous catheterization.

    PubMed

    Pikwer, Andreas; Acosta, Stefan; Kölbel, Tilo; Åkeson, Jonas

    2010-01-01

    This study was designed to assess endovascular intervention for central venous cannulation in patients with vascular occlusion after previous catheterization. Patients referred for endovascular management of central venous occlusion during a 42-month period were identified from a regional endovascular database, providing prospective information on techniques and clinical outcome. Corresponding patient records, angiograms, and radiographic reports were analyzed retrospectively. Sixteen patients aged 48 years (range 0.5-76), including 11 females, were included. All patients but 1 had had multiple central venous catheters with a median total indwelling time of 37 months. Eleven patients cannulated for hemodialysis had had significantly fewer individual catheters inserted compared with 5 patients cannulated for nutritional support (mean 3.6 vs. 10.2, p<0.001) before endovascular intervention. Preoperative imaging by magnetic resonance tomography (MRT) in 8 patients, computed tomography (CT) venography in 3, conventional angiography in 6, and/or ultrasonography in 8, verified 15 brachiocephalic, 13 internal jugular, 3 superior caval, and/or 3 subclavian venous occlusions. Patients were subjected to recanalization (n=2), recanalization and percutaneous transluminal angioplasty (n=5), or stenting for vena cava superior syndrome (n=1) prior to catheter insertion. The remaining 8 patients were cannulated by avoiding the occluded route. Central venous occlusion occurs particularly in patients under hemodialysis and with a history of multiple central venous catheterizations with large-diameter catheters and/or long total indwelling time periods. Patients with central venous occlusion verified by CT or MRT venography and need for central venous access should be referred for endovascular intervention.

  2. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2013-06-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, relational databases, and NoSQL databases. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system. Copyright 2013 by John Wiley & Sons, Inc.
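
    The unit's central guideline (move beyond flat files once lookups and cross-references dominate) can be made concrete with Python's built-in csv and sqlite3 modules. In the sketch below the strain records are invented; the point is that the flat file forces a linear scan on every lookup, while the relational table answers the same query through its primary-key index.

```python
import csv, io, sqlite3

records = [("strain-%04d" % i, "chr%d" % (i % 5 + 1), i * 13) for i in range(1000)]

# Flat-file approach: every lookup is a linear scan over the whole file.
flat = io.StringIO()
csv.writer(flat, delimiter="\t").writerows(records)
flat.seek(0)
hit = next(r for r in csv.reader(flat, delimiter="\t") if r[0] == "strain-0500")

# Relational approach: the primary-key index turns the lookup into a tree search.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE strain (name TEXT PRIMARY KEY, chrom TEXT, position INT)")
db.executemany("INSERT INTO strain VALUES (?, ?, ?)", records)
row = db.execute("SELECT * FROM strain WHERE name = ?", ("strain-0500",)).fetchone()

print(hit, row)
```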

  3. Screening_mgmt: a Python module for managing screening data.

    PubMed

    Helfenstein, Andreas; Tammela, Päivi

    2015-02-01

    High-throughput screening is an established technique in drug discovery and, as such, has also found its way into academia. High-throughput screening generates a considerable amount of data, which is why specific software is used for its analysis and management. The commercially available software packages are often beyond the financial limits of small-scale academic laboratories and, furthermore, lack the flexibility to fulfill certain user-specific requirements. We have developed a Python module, screening_mgmt, which is a lightweight tool for flexible data retrieval, analysis, and storage for different screening assays in one central database. The module reads custom-made analysis scripts and plotting instructions, and it offers a graphical user interface to import, modify, and display the data in a uniform manner. During the test phase, we used this module for the management of 10,000 data points of various origins. It has provided a practical, user-friendly tool for sharing and exchanging information between researchers. © 2014 Society for Laboratory Automation and Screening.
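
    The pattern the module implements (user-supplied analysis code feeding one central store) can be sketched in a few lines. The registry, function names, and schema below are hypothetical illustrations of the idea, not the actual screening_mgmt API.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE measurement (assay TEXT, well TEXT, value REAL)")

# Hypothetical plugin registry: each assay type supplies its own analysis function.
ANALYSES = {}
def register(assay):
    def wrap(fn):
        ANALYSES[assay] = fn
        return fn
    return wrap

@register("viability")
def percent_of_control(raw, control):
    return {well: 100.0 * v / control for well, v in raw.items()}

def store(assay, raw, **kwargs):
    """Run the analysis registered for this assay, then persist to the central DB."""
    analysed = ANALYSES[assay](raw, **kwargs)
    db.executemany("INSERT INTO measurement VALUES (?, ?, ?)",
                   [(assay, well, v) for well, v in analysed.items()])

store("viability", {"A1": 950.0, "A2": 120.0}, control=1000.0)
print(db.execute("SELECT * FROM measurement").fetchall())
```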

  4. Conceptual Modeling in the Time of the Revolution: Part II

    NASA Astrophysics Data System (ADS)

    Mylopoulos, John

    Conceptual Modeling was a marginal research topic at the very fringes of Computer Science in the 60s and 70s, when the discipline was dominated by topics focusing on programs, systems and hardware architectures. Over the years, however, the field has moved to centre stage and has come to claim a central role both in Computer Science research and practice in diverse areas, such as Software Engineering, Databases, Information Systems, the Semantic Web, Business Process Management, Service-Oriented Computing, Multi-Agent Systems, Knowledge Management, and more. The transformation was greatly aided by the adoption of standards in modeling languages (e.g., UML), and model-based methodologies (e.g., Model-Driven Architectures) by the Object Management Group (OMG) and other standards organizations. We briefly review the history of the field over the past 40 years, focusing on the evolution of key ideas. We then note some open challenges and report on-going research, covering topics such as the representation of variability in conceptual models, capturing model intentions, and models of laws.

  5. Standard Energy Efficiency Data Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheifetz, D. Magnus

    2014-07-15

    The SEED platform is expected to be a building energy performance data management tool that provides federal, state and local governments, building owners and operators with an easy, flexible and cost-effective method to collect information about groups of buildings, oversee compliance with energy disclosure laws and demonstrate the economic and environmental benefits of energy efficiency. It will allow users to leverage a local application to manage data disclosure and large data sets without the IT investment of developing custom applications. The first users of SEED will be agencies that need to collect, store, and report/share large data sets generated by benchmarking, energy auditing, retro-commissioning or retrofitting of many buildings. Similarly, building owners and operators will use SEED to manage their own energy data in a common format and centralized location. SEED users will also control the disclosure of their information for compliance requirements, recognition programs such as ENERGY STAR, or data sharing with the Buildings Performance Database and/or other third parties at their discretion.

  6. DEMS - a second generation diabetes electronic management system.

    PubMed

    Gorman, C A; Zimmerman, B R; Smith, S A; Dinneen, S F; Knudsen, J B; Holm, D; Jorgensen, B; Bjornsen, S; Planet, K; Hanson, P; Rizza, R A

    2000-06-01

    Diabetes electronic management system (DEMS) is a component-based client/server application, written in Visual C++ and Visual Basic, with the database server running Sybase System 11. DEMS is built entirely with a combination of dynamic link libraries (DLLs) and ActiveX components - the only exception is the DEMS.exe. DEMS is a chronic disease management system for patients with diabetes. It is used at the point of care by all members of the diabetes team including physicians, nurses, dieticians, clinical assistants and educators. The system is designed for maximum clinical efficiency and facilitates appropriately supervised delegation of care. Dispersed clinical sites may be supervised from a central location. The system is designed for ease of navigation; immediate provision of many types of automatically generated reports; quality audits; aids to compliance with good care guidelines; and alerts, advisories, prompts, and warnings that guide the care provider. The system now contains data on over 34000 patients and is in daily use at multiple sites.

  7. PubDNA Finder: a web database linking full-text articles to sequences of nucleic acids.

    PubMed

    García-Remesal, Miguel; Cuevas, Alejandro; Pérez-Rey, David; Martín, Luis; Anguita, Alberto; de la Iglesia, Diana; de la Calle, Guillermo; Crespo, José; Maojo, Víctor

    2010-11-01

    PubDNA Finder is an online repository that we have created to link PubMed Central manuscripts to the sequences of nucleic acids appearing in them. It extends the search capabilities provided by PubMed Central by enabling researchers to perform advanced searches involving sequences of nucleic acids. This includes, among other features, (i) searching for papers mentioning one or more specific sequences of nucleic acids and (ii) retrieving the genetic sequences appearing in different articles. These additional query capabilities are provided by a searchable index that we created by using the full text of the 176,672 papers available at PubMed Central at the time of writing and the sequences of nucleic acids appearing in them. To automatically extract the genetic sequences occurring in each paper, we used an original method we have developed. The database is updated monthly by automatically connecting to the PubMed Central FTP site to retrieve and index new manuscripts. Users can query the database via the web interface provided. PubDNA Finder can be freely accessed at http://servet.dia.fi.upm.es:8080/pubdnafinder
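
    The abstract does not detail the sequence-extraction method, so as a first approximation one can imagine a pattern match for long runs of nucleotide letters, as sketched below. A real extractor must handle line wrapping, residue numbering, and ambiguity codes; treat this purely as an illustration, with the length threshold chosen arbitrarily.

```python
import re

# Naive rule: an unbroken run of at least 15 nucleotide letters counts as a sequence.
SEQ = re.compile(r"\b[ACGTU]{15,}\b")

text = ("The forward primer GATTACAGATTACAGATTACA was used, "
        "whereas CAT and ACT are ordinary words.")

print(SEQ.findall(text))  # -> ['GATTACAGATTACAGATTACA']
```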

  8. [Plug-in Based Centralized Control System in Operating Rooms].

    PubMed

    Wang, Yunlong

    2017-05-30

    Centralized equipment control in an operating room (OR) is crucial to an efficient workflow in the OR. To achieve centralized control, an integrative OR needs to focus on designing a control panel that can appropriately incorporate equipment from different manufacturers with various connecting ports and controls. Here we propose to achieve equipment integration using plug-in modules. Each OR will be equipped with a dynamic plug-in control panel containing physically removable connecting ports. Matching outlets will be installed onto the control panels of each piece of equipment used at any given time. This dynamic control panel will be backed by a database containing plug-in modules that can connect any two types of connecting ports common among medical equipment manufacturers. The correct connecting ports will be called using reflection dynamics. This database will be updated regularly to include new connecting ports on the market, making it easy to maintain, update, and expand, and keeping it relevant as new equipment is developed. Together, the physical panel and the database will achieve centralized equipment control in the OR that can be easily adapted to any equipment in the OR.
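
    The "reflection dynamics" dispatch the authors describe (looking up the right connector module at run time from a database of port pairs) corresponds to dynamic name resolution. The class and port names below are hypothetical; the sketch shows only the dispatch pattern, not any real OR equipment interface.

```python
# Hypothetical registry mapping (equipment port, panel port) to an adapter class,
# standing in for the plug-in database described in the article.
class SerialToUsbAdapter:
    def connect(self):
        return "serial->usb link established"

class HdmiToSdiAdapter:
    def connect(self):
        return "hdmi->sdi link established"

REGISTRY = {
    ("serial", "usb"): "SerialToUsbAdapter",
    ("hdmi", "sdi"): "HdmiToSdiAdapter",
}

def plug_in(equipment_port, panel_port):
    """Resolve the adapter class by name at run time (reflection-style dispatch)."""
    class_name = REGISTRY[(equipment_port, panel_port)]
    adapter_cls = globals()[class_name]   # reflective lookup instead of hard-coding
    return adapter_cls().connect()

print(plug_in("hdmi", "sdi"))
```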

  9. Federated Web-accessible Clinical Data Management within an Extensible NeuroImaging Database

    PubMed Central

    Keator, David B.; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R.; Bockholt, Jeremy; Grethe, Jeffrey S.

    2010-01-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site. PMID:20567938

  10. Federated web-accessible clinical data management within an extensible neuroimaging database.

    PubMed

    Ozyurt, I Burak; Keator, David B; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R; Bockholt, Jeremy; Grethe, Jeffrey S

    2010-12-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site.
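
    One common way to store new data types without changing the underlying schema, which is what HID claims for its design, is an entity-attribute-value (EAV) layout. The abstracts do not spell out HID's internal tables, so the sketch below shows only the generic pattern, with invented assessment names.

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Generic EAV layout: new assessments need new rows, never new columns.
db.executescript("""
CREATE TABLE subject (subject_id INTEGER PRIMARY KEY, label TEXT);
CREATE TABLE datum   (subject_id INTEGER REFERENCES subject(subject_id),
                      attribute  TEXT,
                      value      TEXT);
""")

db.execute("INSERT INTO subject VALUES (1, 'sub-001')")
db.executemany("INSERT INTO datum VALUES (?, ?, ?)", [
    (1, "demographics.age", "34"),
    (1, "task.nback.accuracy", "0.91"),
    (1, "assessment.mood_score", "17"),   # a brand-new assessment: just another row
])

print(db.execute(
    "SELECT attribute, value FROM datum WHERE subject_id = 1").fetchall())
```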

  11. Implementation of a data management software system for SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, Kenneth

    1986-01-01

    The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing some routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could use INSIGHT2 as well and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.

  12. Semantic World Modelling and Data Management in a 4d Forest Simulation and Information System

    NASA Astrophysics Data System (ADS)

    Roßmann, J.; Hoppen, M.; Bücken, A.

    2013-08-01

    Various types of 3D simulation applications benefit from realistic forest models. They range from flight simulators for entertainment to harvester simulators for training and tree growth simulations for research and planning. Our 4D forest simulation and information system integrates the necessary methods for data extraction, modelling and management. Using modern methods of semantic world modelling, tree data can efficiently be extracted from remote sensing data. The derived forest models contain position, height, crown volume, type and diameter of each tree. This data is modelled using GML-based data models to assure compatibility and exchangeability. A flexible approach for database synchronization is used to manage the data and provide caching, persistence, a central communication hub for change distribution, and a versioning mechanism. Combining various simulation techniques and data versioning, the 4D forest simulation and information system can provide applications with "both directions" of the fourth dimension. Our paper outlines the current state, new developments, and integration of tree extraction, data modelling, and data management. It also shows several applications realized with the system.
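
    The synchronization approach outlined above (a central communication hub distributing changes while keeping versions) can be reduced to a toy model in which every write appends a new version and registered listeners are notified. This is a schematic of the pattern only, with invented names; none of the GML tooling or the actual system's API is shown.

```python
from collections import defaultdict

class VersionedStore:
    """Toy change hub: each key keeps its full version history,
    and subscribers are notified of every change (change distribution)."""

    def __init__(self):
        self.history = defaultdict(list)   # key -> [v1, v2, ...]
        self.subscribers = []

    def put(self, key, value):
        self.history[key].append(value)
        for notify in self.subscribers:
            notify(key, len(self.history[key]), value)

    def get(self, key, version=-1):
        return self.history[key][version]  # default: latest version

store = VersionedStore()
store.subscribers.append(lambda k, ver, v: print(f"change: {k} v{ver} = {v}"))

store.put("tree:42", {"height_m": 17.1})
store.put("tree:42", {"height_m": 17.6})   # a growth-simulation update
print(store.get("tree:42", version=0))     # the "fourth dimension": older state
```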

  13. Towards building high performance medical image management system for clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Lee, Rubao; Zhang, Xiaodong; Saltz, Joel

    2011-03-01

    Medical image-based biomarkers are being established for therapeutic cancer clinical trials, where image assessment is among the essential tasks. Large-scale image assessment is often performed by a large group of experts by retrieving images from a centralized image repository to workstations to mark up and annotate images. In such an environment, it is critical to provide a high performance image management system that supports efficient concurrent image retrievals in a distributed environment. There are several major challenges: high throughput of large-scale image data over the Internet from the server for multiple concurrent client users, efficient communication protocols for transporting data, and effective management of versioning of data for audit trails. We study the major bottlenecks for such a system, propose and evaluate a solution by using a hybrid image storage with solid state drives and hard disk drives, RESTful Web Services-based protocols for exchanging image data, and a database-based versioning scheme for efficient archive of image revision history. Our experiments show promising results of our methods, and our work provides a guideline for building enterprise-level high performance medical image management systems.
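
    The hybrid SSD/HDD storage idea can be illustrated with a simple read path: serve an image from the fast tier when present, otherwise fetch it from the slow tier and promote it, evicting the least recently used item. The capacity and identifiers below are invented; the paper's actual placement policy is more involved.

```python
from collections import OrderedDict

class TieredImageStore:
    """Toy two-tier read path: a small LRU 'SSD' cache in front of a big 'HDD' store."""

    def __init__(self, hdd, ssd_capacity=2):
        self.hdd = hdd                         # slow tier: all images
        self.ssd = OrderedDict()               # fast tier: recently used images
        self.capacity = ssd_capacity

    def read(self, image_id):
        if image_id in self.ssd:
            self.ssd.move_to_end(image_id)     # LRU bookkeeping on a hit
            return self.ssd[image_id]
        data = self.hdd[image_id]              # slow fetch on a miss
        self.ssd[image_id] = data              # promote to the fast tier
        if len(self.ssd) > self.capacity:
            self.ssd.popitem(last=False)       # evict the least recently used
        return data

store = TieredImageStore({"scan-1": b"...", "scan-2": b"...", "scan-3": b"..."})
store.read("scan-1"); store.read("scan-2"); store.read("scan-1"); store.read("scan-3")
print(list(store.ssd))  # -> ['scan-1', 'scan-3']: scan-2 was evicted
```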

  14. Invasiveness is associated with metastasis and decreased survival in hemangiopericytoma of the central nervous system.

    PubMed

    Kinslow, Connor J; Rajpara, Raj S; Wu, Cheng-Chia; Bruce, Samuel S; Canoll, Peter D; Wang, Shih-Hsiu; Sonabend, Adam M; Sheth, Sameer A; McKhann, Guy M; Sisti, Michael B; Bruce, Jeffrey N; Wang, Tony J C

    2017-06-01

    Meningeal hemangiopericytoma (m-HPC) is a rare tumor of the central nervous system (CNS), which is distinguished clinically from meningioma by its tendency to recur and metastasize. The histological classification and grading scheme for m-HPC is still evolving and few studies have identified tumor features that are associated with metastasis. All patients at our institution with m-HPC were assessed for patient, tumor, and treatment characteristics associated with survival, recurrence, and metastasis. New findings were validated using the SEER database. Twenty-seven patients were identified in our institutional records with m-HPC with a median follow-up time of 85 months. Invasiveness was the strongest predictor of decreased overall survival (OS) and decreased metastasis-free survival (MFS) (p = 0.004 and 0.001). On subgroup analysis, bone invasion trended towards decreased OS (p = 0.056). Bone invasion and soft tissue invasion were significantly associated with decreased MFS (p = 0.001 and 0.012). An additional 315 patients with m-HPC were identified in the SEER database that had information on tumor invasion and 263 with information on distant metastasis. Invasion was significantly associated with decreased survival (HR = 5.769, p = 0.007) and metastasis (OR 134, p = 0.000) in the SEER data. In this study, the authors identified a previously unreported tumor characteristic, invasiveness, as the strongest factor associated with decreased survival and metastasis. The association of invasion with decreased survival and metastasis was confirmed in a separate, larger, publicly available database. Invasion may be a useful parameter in the histological grading and clinical management of hemangiopericytoma of the CNS.

  15. WE-D-9A-06: Open Source Monitor Calibration and Quality Control Software for Enterprise Display Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bevins, N; Vanderhoek, M; Lang, S

    2014-06-15

    Purpose: Medical display monitor calibration and quality control present challenges to medical physicists. The purpose of this work is to demonstrate and share experiences with an open source package that allows for both initial monitor setup and routine performance evaluation. Methods: A software package, pacsDisplay, has been developed over the last decade to aid in the calibration of all monitors within the radiology group in our health system. The software is used to calibrate monitors to follow the DICOM Grayscale Standard Display Function (GSDF) via lookup tables installed on the workstation. Additional functionality facilitates periodic evaluations of both primary and secondary medical monitors to ensure satisfactory performance. This software is installed on all radiology workstations, and can also be run as a stand-alone tool from a USB disk. Recently, a database has been developed to store and centralize the monitor performance data and to provide long-term trends for compliance with internal standards and various accrediting organizations. Results: Implementation and utilization of pacsDisplay has resulted in improved monitor performance across the health system. Monitor testing is now performed at regular intervals and the software is being used across multiple imaging modalities. Monitor performance characteristics such as maximum and minimum luminance, ambient luminance and illuminance, color tracking, and GSDF conformity are loaded into a centralized database for system performance comparisons. Compliance reports for organizations such as MQSA, ACR, and TJC are generated automatically and stored in the same database. Conclusion: An open source software solution has simplified and improved the standardization of displays within our health system. This work serves as an example method for calibrating and testing monitors within an enterprise health system.
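
    The centralized performance database described in the Results can be approximated by a single measurements table plus a tolerance check on the latest entry per monitor. The field names and the 400 cd/m^2 floor below are illustrative assumptions, not values taken from pacsDisplay or any accrediting body.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE qc (
    monitor   TEXT,
    test_date TEXT,
    lum_max   REAL,   -- cd/m^2
    lum_min   REAL)""")

db.executemany("INSERT INTO qc VALUES (?, ?, ?, ?)", [
    ("reading-room-1", "2014-01-15", 420.0, 1.0),
    ("reading-room-1", "2014-04-15", 350.0, 1.1),   # luminance decaying over time
])

# Illustrative compliance rule: flag the monitor if its latest maximum
# luminance dropped below an assumed 400 cd/m^2 floor.
FLOOR = 400.0
latest = db.execute(
    "SELECT monitor, test_date, lum_max FROM qc ORDER BY test_date DESC LIMIT 1"
).fetchone()
if latest[2] < FLOOR:
    print(f"{latest[0]} out of tolerance on {latest[1]}: {latest[2]} cd/m^2")
```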

  16. Root resorption during orthodontic treatment.

    PubMed

    Walker, Sally

    2010-01-01

    Medline, Embase, LILACS, The Cochrane Library (Cochrane Database of Systematic Reviews, CENTRAL, and Cochrane Oral Health Group Trials Register), Web of Science, EBM Reviews, Computer Retrieval of Information on Scientific Project (CRISP, www.crisp.cit.nih.gov), On-Line Computer Library Center (www.oclc.org), Google, Index to Scientific and Technical Proceedings, PAHO (www.paho.org), WHOLis (www.who.int/library/databases/en), BBO (Brazilian Bibliography of Dentistry), CEPS (Chinese Electronic Periodical Services), Conference materials (www.bl.uk/services/bsds/dsc/conference.html), ProQuest Dissertation Abstracts and Thesis database, TrialCentral (www.trialscentral.org), National Research Register (www.controlled-trials.com), www.Clinicaltrials.gov and SIGLE (System for Information on Grey Literature in Europe). Randomised controlled trials, including split-mouth designs, recording the presence or absence of external apical root resorption (EARR) by treatment group at the end of the treatment period. Data were extracted independently by two reviewers using specially designed and piloted forms. Quality was also assessed independently by the same reviewers. After evaluating titles and abstracts, 144 full articles were obtained, of which 13 articles, describing 11 trials, fulfilled the criteria for inclusion. Differences in the methodological approaches and reporting results made quantitative statistical comparisons impossible. Evidence suggests that comprehensive orthodontic treatment causes increased incidence and severity of root resorption, and heavy forces might be particularly harmful. Orthodontically induced inflammatory root resorption is unaffected by archwire sequencing, bracket prescription, and self-ligation. Previous trauma and tooth morphology are unlikely causative factors. There is some evidence that a two- to three-month pause in treatment decreases total root resorption. The results were inconclusive in the clinical management of root resorption, but there is evidence to support the use of light forces, especially with incisor intrusion.

  17. Creating of Central Geospatial Database of the Slovak Republic and Procedures of its Revision

    NASA Astrophysics Data System (ADS)

    Miškolci, M.; Šafář, V.; Šrámková, R.

    2016-06-01

    The article describes the creation of an initial three-dimensional geodatabase, from planning and design through the determination of technological and manufacturing processes to the practical use of the Central Geospatial Database (CGD - the official name in Slovak is Centrálna Priestorová Databáza - CPD), and briefly describes the procedures for its revision. CGD ensures the proper collection, processing, storage, transfer and display of digital geospatial information. CGD is used by the Ministry of Defense (MoD) for defense and crisis-management tasks and by the Integrated Rescue System. For military personnel, CGD runs on the MoD intranet; for other users outside the MoD, it is transformed into ZbGIS (the Primary Geodatabase of the Slovak Republic) and runs on a public web site. CGD is a global set of geospatial information: a vector computer model that completely covers the entire territory of Slovakia. The seamless CGD is created by digitizing the real world using photogrammetric stereoscopic methods and by measuring object properties. The basic vector model of CGD (from photogrammetric processing) is then taken into the field for inspection and for additional gathering of object properties across the whole mapped area. Finally, real-world objects are spatially modeled as entities of a three-dimensional database. CGD makes it possible to know the territory comprehensively in all three spatial dimensions. Every entity in CGD records its time of collection, which allows users to assess the timeliness of the information. CGD can be utilized for geographical analysis, geo-referencing, and cartographic purposes, as well as for various special-purpose mapping, and it has the ambition not only to cover the needs of the MoD but to become a reference model for the national geographical infrastructure.

  18. Graph Databases for Large-Scale Healthcare Systems: A Framework for Efficient Data Management and Data Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Yubin; Shankar, Mallikarjun; Park, Byung H.

    Designing a database system for both efficient data management and data services has been one of the enduring challenges in the healthcare domain. In many healthcare systems, data services and data management are often viewed as two orthogonal tasks; data services refer to retrieval and analytic queries such as search, joins, statistical data extraction, and simple data mining algorithms, while data management refers to building error-tolerant and non-redundant database systems. The gap between service and management has resulted in rigid database systems and schemas that do not support effective analytics. We compose a rich graph structure from an abstracted healthcare RDBMS to illustrate how we can fill this gap in practice. We show how a healthcare graph can be automatically constructed from a normalized relational database using the proposed 3NF Equivalent Graph (3EG) transformation. We discuss a set of real world graph queries such as finding self-referrals, shared providers, and collaborative filtering, and evaluate their performance over a relational database and its 3EG-transformed graph. Experimental results show that the graph representation serves as multiple de-normalized tables, thus reducing complexity in a database and enhancing data accessibility of users. Based on this finding, we propose an ensemble framework of databases for healthcare applications.
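
    A "shared providers" query of the kind mentioned becomes a short traversal once patients and providers are vertices of the graph. The sketch below uses plain dictionaries as a stand-in for a 3EG-transformed graph, with invented names; in the relational layout the same question requires a self-join on the visits table.

```python
from collections import defaultdict

# Bipartite toy graph standing in for the 3EG output: patient -> providers seen.
visits = {
    "patient-a": {"dr-smith", "dr-jones"},
    "patient-b": {"dr-jones", "dr-wu"},
    "patient-c": {"dr-wu"},
}

def shared_providers(visits):
    """Which provider pairs share at least one patient, and via whom?"""
    pairs = defaultdict(set)
    for patient, providers in visits.items():
        for p in providers:
            for q in providers:
                if p < q:                    # each unordered pair counted once
                    pairs[(p, q)].add(patient)
    return dict(pairs)

print(shared_providers(visits))
# e.g. {('dr-jones', 'dr-smith'): {'patient-a'}, ('dr-jones', 'dr-wu'): {'patient-b'}}
```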

  19. Materials And Processes Technical Information System (MAPTIS) LDEF materials database

    NASA Technical Reports Server (NTRS)

    Davis, John M.; Strickland, John W.

    1992-01-01

    The Materials and Processes Technical Information System (MAPTIS) is a collection of computerized materials data available to engineers in the aerospace community involved in the design and development of spacecraft and related hardware. Consisting of various database segments, MAPTIS provides the user with information such as material properties, test data derived from tests specifically conducted for qualification of materials for use in space, verification and control, project management, material information, and various administrative requirements. A recent addition to the project management segment consists of materials data derived from the LDEF flight. This tremendous quantity of data consists of both pre-flight and post-flight data in such diverse areas as optical/thermal, mechanical and electrical properties and atomic concentration surface analysis data, as well as general data such as sample placement on the satellite, A-O flux, equivalent sun hours, etc. Each data point is referenced to the primary investigator(s) and the published paper from which the data was taken. The MAPTIS system is envisioned to become the central location for all LDEF materials data. This paper comprises a general overview of the MAPTIS system and the types of data contained within, together with the specific LDEF data element and the data contained in that segment.

  20. Transforming data into action: the Sonoma County Human Services Department.

    PubMed

    Harrison, Lindsay

    2012-01-01

    In order to centralize data-based initiatives, the Director of the Department worked with the Board of Supervisors and the executive team to develop a new Planning, Research, and Evaluation (PRE) division. PRE is establishing rules for data-based decision making and consolidating data collection to ensure quality and consistency. It aims to target resources toward visionary, pro-active program planning and implementation, and inform the public about the role of Human Services in creating a healthy, safe and productive environment. PRE staff spent several months studying the job functions of staff, to determine how they use information to inform practice, consulting other counties about their experiences. The PRE team developed Datascript, outlining two agency aims: (a) foster a decision-making environment that values and successfully uses empirical evidence for strategic change, and (b) manage the role and image of the Human Services Department in the external environment. The case study describes action steps developed to achieve each aim. Copyright © Taylor & Francis Group, LLC

  1. Effect of health information technology interventions on lipid management in clinical practice: a systematic review of randomized controlled trials.

    PubMed

    Aspry, Karen E; Furman, Roy; Karalis, Dean G; Jacobson, Terry A; Zhang, Audrey M; Liptak, Gregory S; Cohen, Jerome D

    2013-01-01

    Large gaps in lipid treatment and medication adherence persist in high-risk outpatients in the United States. Health information technology (HIT) is being applied to close quality gaps in chronic illness care, but its utility for lipid management has not been widely studied. To perform a qualitative review of the impact of HIT interventions on lipid management processes of care (screening or testing; drug initiation, titration or adherence; or referrals) or clinical outcomes (percent at low density lipoprotein cholesterol goal; absolute lipid levels; absolute risk scores; or cardiac hospitalizations) in outpatients with coronary heart disease or at increased risk. PubMed and Google Scholar databases were searched using Medical Subject Headings related to clinical informatics and cholesterol or lipid management. English language articles that described a randomized controlled design, tested at least one HIT tool in high risk outpatients, and reported at least 1 lipid management process measure or clinical outcome, were included. Thirty-four studies that enrolled 87,874 persons were identified. Study ratings, outcomes, and magnitude of effects varied widely. Twenty-three trials reported a significant positive effect from a HIT tool on lipid management, but only 14 showed evidence that HIT interventions improve clinical outcomes. There was mixed evidence that provider-level computerized decision support improves outcomes. There was more evidence in support of patient-level tools that provide connectivity to the healthcare system, as well as system-level interventions that involve database monitoring and outreach by centralized care teams. Randomized controlled trials show wide variability in the effects of HIT on lipid management outcomes. Evidence suggests that multilevel HIT approaches that target not only providers but include patients and systems approaches will be needed to improve lipid treatment, adherence and quality. Copyright © 2013 National Lipid Association. Published by Elsevier Inc. All rights reserved.

  2. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS FOREST SERVICE... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases...

  3. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS NATIONAL PARK... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases...

  4. Enabling heterogenous multi-scale database for emergency service functions through geoinformation technologies

    NASA Astrophysics Data System (ADS)

    Bhanumurthy, V.; Venugopala Rao, K.; Srinivasa Rao, S.; Ram Mohan Rao, K.; Chandra, P. Satya; Vidhyasagar, J.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Geographical Information Science (GIS) has now graduated from traditional desktop systems to Internet systems, and Internet GIS is emerging as one of the most promising technologies for addressing Emergency Management. Web services with different privileges play an important role in disseminating emergency services to decision makers. A spatial database is one of the most important components in the successful implementation of Emergency Management: it contains spatial data in the form of raster and vector layers linked with non-spatial information. Comprehensive data are required to handle emergency situations in their different phases. These database elements comprise core data, hazard-specific data, corresponding attribute data, and live data coming from remote locations. Core datasets are the minimum data required to handle disasters, including base, thematic, and infrastructure layers. Disaster-specific information is required to handle a particular disaster situation such as a flood, cyclone, forest fire, earthquake, landslide, or drought. In addition, Emergency Management requires many types of data with spatial and temporal attributes that should be made available to the key players in the right format at the right time. The vector database needs to be complemented with satellite imagery of the required resolution for visualisation and analysis in disaster management. The database must therefore be interconnected and comprehensive to meet the requirements of Emergency Management. Such an integrated, comprehensive, and structured database with appropriate information is required to deliver the right information at the right time to the right people. However, building a spatial database for Emergency Management is a challenging task because of key issues such as data availability, sharing policies, compatible geospatial standards, and data interoperability. Therefore, to facilitate the use, sharing, and integration of spatial data, standards need to be defined for building emergency database systems. These include aspects such as: i) data integration procedures, namely a standard coding scheme, schema, metadata format, and spatial format; ii) a database organisation mechanism covering data management, catalogues, and data models; and iii) database dissemination through a suitable environment, as a standard service for effective service dissemination. The National Database for Emergency Management (NDEM) is such a comprehensive database for addressing disasters in India at the national level. This paper explains standards for integrating and organising the multi-scale, multi-source data for effective emergency response using customized user interfaces for NDEM, and presents a standard procedure for building comprehensive emergency information systems that enable emergency-specific functions through geospatial technologies.
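
    The standards discussion above (a common coding scheme, schema, and metadata format for every contributed layer) implies a validation gate at ingest time. The mandatory fields in the sketch below are invented for illustration; the abstract does not publish NDEM's actual metadata profile.

```python
# Illustrative metadata profile for registering a layer in an emergency geodatabase.
REQUIRED = {"layer_name", "hazard_type", "spatial_format", "crs", "source_agency"}

def validate_layer(meta):
    """Reject a contributed layer unless every mandatory metadata field is present."""
    missing = REQUIRED - meta.keys()
    if missing:
        raise ValueError(f"layer rejected, missing fields: {sorted(missing)}")
    return True

layer = {
    "layer_name": "flood_extent_2014",
    "hazard_type": "flood",
    "spatial_format": "GML",
    "crs": "EPSG:4326",
    "source_agency": "state water board",
}
print(validate_layer(layer))  # -> True
```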

  5. Wireless LAN security management with location detection capability in hospitals.

    PubMed

    Tanaka, K; Atarashi, H; Yamaguchi, I; Watanabe, H; Yamamoto, R; Ohe, K

    2012-01-01

    In medical institutions, unauthorized access points and terminals obstruct the stable operation of a large-scale wireless local area network (LAN) system. By establishing a real-time monitoring method to detect such unauthorized wireless devices, we can improve the efficiency of security management. We detected unauthorized wireless devices by using a centralized wireless LAN system and a location detection system at 370 access points at the University of Tokyo Hospital. By storing the detected radio signal strength and location information in a database, we evaluated the risk level from the detection history. We also evaluated the location detection performance in our hospital ward using Wi-Fi tags. Radio signals originating outside the hospital, as well as signals emitted from portable game machines with wireless communication capability, were confirmed in the detection results. The location detection performance showed an error of approximately 4 m in detection accuracy and a false detection rate of approximately 5%. Therefore, it was effective to consider the radio signal strength both as an index of likelihood at the detection location and as an index of the level of risk. We determined the location of wireless devices with high accuracy by filtering the detection results on the basis of radio signal strength and detection history. Results of this study showed that the developed location database containing radio signal strength and detection history would be effective for security management of wireless LAN systems and for more general-purpose location detection applications.
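    As a minimal sketch of the kind of risk-level filtering the study describes (thresholds and field names are invented; the abstract reports its criteria only qualitatively):

    ```python
    # Flag a detected device as higher risk when its radio signal strength
    # and detection history exceed simple, assumed thresholds.
    from statistics import mean

    RSSI_ONSITE_DBM = -70   # hypothetical cutoff for "likely inside the ward"
    MIN_DETECTIONS = 5      # repeated sightings raise the risk level

    def risk_level(history):
        """history: list of (rssi_dbm, access_point_id) sightings of one MAC."""
        if len(history) < MIN_DETECTIONS:
            return "low"
        avg_rssi = mean(rssi for rssi, _ in history)
        return "high" if avg_rssi >= RSSI_ONSITE_DBM else "medium"

    print(risk_level([(-60, "ap-3F-01")] * 6))   # strong, repeated -> high
    print(risk_level([(-85, "ap-1F-02")] * 6))   # weak, repeated   -> medium
    ```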

  6. The Best Anticoagulation Therapy in Multiple-Trauma Patients with Mechanical Heart Valves: Evaluation of Latest Guidelines and Studies.

    PubMed

    Moeinipour, Aliasghar; Zarifian, Ahmadreza; Sheikh Andalibi, Mohammad Sobhan; Shamloo, Alireza Sepehri; Ahmadabadi, Ali; Amouzeshi, Ahmad; Hoseinikhah, Hamid

    2015-12-22

    It is common practice for patients with prosthetic cardiac devices, especially heart valve prostheses, arterial stents, defibrillators, and pacemakers, to use anticoagulation treatment. When these patients suffer multiple trauma after motor vehicle accidents, the best medical management of this challenging situation is mandatory. This strategy should include a rapid diagnosis of all possible multiple organ injuries, with special attention to anticoagulation therapy so as to minimize the risk of thromboembolic complications in prosthetic devices. In this review, we describe the best medical management for patients with multiple trauma who use anticoagulants after heart valve replacement. We searched the electronic databases PubMed/Medline, Scopus, Embase, and Google Scholar using the following terms: anticoagulant, warfarin, heparin, and multiple trauma. Similar studies suggested by the databases were also included; non-English articles were excluded from the review. For patients who use anticoagulation therapy, teamwork between cardiac surgeons, general surgeons, anesthesiologists, and cardiologists is essential. For optimal medical management, multiple consultations between members of this team are mandatory for rapid diagnosis of all possibly damaged organs, with special attention to central nervous system, chest, and abdominal traumas. With this strategy, it is important to take note of anticoagulation drugs to minimize the risk of thromboembolic complications in cardiac devices. The best anticoagulant agents for emergency operations in patients with multiple trauma who are using an anticoagulant after heart valve replacement are fresh frozen plasma (FFP) and prothrombin complex concentrates (PCC).

  7. Karlsruhe Database for Radioactive Wastes (KADABRA) - Accounting and Management System for Radioactive Waste Treatment - 12275

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Himmerkus, Felix; Rittmeyer, Cornelia

    2012-07-01

    The data management system KADABRA was designed according to the purposes of the Central Decontamination Department (HDB) of the Wiederaufarbeitungsanlage Karlsruhe Rueckbau- und Entsorgungs-GmbH (WAK GmbH), which is specialized in the treatment and conditioning of radioactive waste. The layout considers the major treatment processes of the HDB as well as regulatory and legal requirements. KADABRA is designed as a SAG ADABAS application on an IBM System z mainframe. The main function of the system is the data management of all processes related to treatment, transfer and storage of radioactive material within HDB. KADABRA records the relevant data concerning radioactive residues, interim products and waste products, as well as the production parameters relevant for final disposal. Analytical data from the laboratory and from non-destructive assay systems, which describe the chemical and radiological properties of residues, production batches, interim products and final waste products, can be linked to the respective dataset for documentation and declaration. The system enables the operator to trace the radioactive material through processing and storage. Information on the actual status of the material, as well as radiological data and storage position, can be obtained immediately on request. A variety of programs with access to the database allow the generation of individual reports on periodic or special request. KADABRA offers a high security standard and is constantly adapted to the current requirements of the organization. (authors)

  8. Alternatives to relational databases in precision medicine: Comparison of NoSQL approaches for big data storage using supercomputers

    NASA Astrophysics Data System (ADS)

    Velazquez, Enrique Israel

    Improvements in medical and genomic technologies have dramatically increased the production of electronic data over the last decade. As a result, data management is rapidly becoming a major determinant, and an urgent challenge, for the development of Precision Medicine. Although successful data management is achievable using Relational Database Management Systems (RDBMS), exponential data growth is a significant contributor to failure scenarios. Growing amounts of data can also be observed in other sectors, such as economics and business, which, together with the previous facts, suggests that alternative database approaches (NoSQL) may soon be required for the efficient storage and management of big databases. However, this hypothesis has been difficult to test in the Precision Medicine field, since alternative database architectures are complex to assess and means to integrate heterogeneous electronic health records (EHR) with dynamic genomic data are not easily available. In this dissertation, we present a novel set of experiments for identifying NoSQL database approaches that enable effective data storage and management in Precision Medicine, using patients' clinical and genomic information from The Cancer Genome Atlas (TCGA). The first experiment draws on performance and scalability from biologically meaningful queries with differing complexity and database sizes. The second experiment measures performance and scalability in database updates without schema changes. The third experiment assesses performance and scalability in database updates with schema modifications due to dynamic data. We have identified two NoSQL approaches, based on Cassandra and Redis, which seem to be the ideal database management systems for our precision medicine queries in terms of performance and scalability. We present NoSQL approaches and show how they can be used to manage clinical and genomic big data. Our research is relevant to public health, since we focus on one of the main challenges to the development of Precision Medicine and, consequently, investigate a potential solution to the progressively increasing demands on health care.
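    As a hedged illustration of the key-value style such systems offer (not the dissertation's schema; assumes a local Redis server and the redis-py client):

    ```python
    import redis

    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    # One hash per patient record, one hash per observed variant (names invented).
    r.hset("patient:TCGA-0001", mapping={"diagnosis": "BRCA", "age": "54"})
    r.hset("variant:rs121913529", mapping={"gene": "KRAS", "impact": "missense"})
    r.sadd("patient:TCGA-0001:variants", "rs121913529")   # link set

    for rsid in r.smembers("patient:TCGA-0001:variants"):
        print(rsid, r.hgetall(f"variant:{rsid}"))
    ```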

  9. Implementation of the CUAHSI information system for regional hydrological research and workflow

    NASA Astrophysics Data System (ADS)

    Bugaets, Andrey; Gartsman, Boris; Bugaets, Nadezhda; Krasnopeyev, Sergey; Krasnopeyeva, Tatyana; Sokolov, Oleg; Gonchukov, Leonid

    2013-04-01

    Environmental research and education have become increasingly data-intensive as a result of the proliferation of digital technologies, instrumentation, and pervasive networks through which data are collected, generated, shared, and analyzed. Over the next decade, it is likely that science and engineering research will produce more scientific data than has been created over the whole of human history (Cox et al., 2006). Successful use of these data to achieve new scientific breakthroughs depends on the ability to access, organize, integrate, and analyze these large datasets. The new project of PGI FEB RAS (http://tig.dvo.ru), FERHRI (www.ferhri.org) and Primgidromet (www.primgidromet.ru) is focused on the creation of an open, unified hydrological information system conforming to international standards, to support hydrological investigations, water management and forecast systems. Within the hydrologic science community, the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (http://his.cuahsi.org) has been developing a distributed network of data sources and functions that are integrated using web services and that provide access to data, tools, and models that enable synthesis, visualization, and evaluation of hydrologic system behavior. On top of CUAHSI technologies, the first two template databases were developed for primary datasets of special observations on experimental basins in the Far East region of Russia. The first database contains data from special observations performed at the former (1957-1994) Primorskaya Water-Balance Station (1500 km2). Measurements were carried out at 20 hydrological and 40 rain gauging stations and were published as special series, but only as hardcopy books. The database provides raw data from loggers with hourly and daily time support. The second database, called «FarEastHydro», provides published standard daily measurements performed at the Roshydromet observation network (200 hydrological and meteorological stations) from 1930 through 1990. Both data resources are maintained in a test mode at the project site http://gis.dvo.ru:81/, which is permanently updated. After this first success, the decision was made to use the CUAHSI technology as a basis for the development of a hydrological information system to support data publishing and the workflow of Primgidromet, the regional office of the Federal State Hydrometeorological Agency. At the moment, the Primgidromet observation network is equipped with 34 automatic SEBA hydrological pressure-sensor pneumatic gauges PS-Light-2 and 36 automatic SEBA weather stations. Large datasets generated by sensor networks are organized and stored within a central ODM database, which allows the data to be unambiguously interpreted with sufficient metadata and provides a traceable heritage from raw measurements to usable information. Organization of the data within a central CUAHSI ODM database was the most critical step, with several important implications. This technology is widespread and well documented, and it ensures that all datasets are publicly available and readily usable by other investigators and developers to support additional analyses and hydrological modeling. Implementation of ODM within a Relational Database Management System eliminates potential data manipulation errors and intermediate data processing steps. Wrapping the CUAHSI WaterOneFlow web service in an OpenMI 2.0 linkable component (www.openmi.org) allows seamless integration with well-known hydrological modeling systems.
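    As a simplified sketch of an ODM-style observations table, reduced to a handful of columns for illustration (the real CUAHSI ODM schema is considerably richer):

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE DataValues (
        SiteCode TEXT, VariableCode TEXT, LocalDateTime TEXT,
        DataValue REAL, QualityControlLevel INTEGER, MethodCode TEXT)""")
    con.execute("INSERT INTO DataValues VALUES (?,?,?,?,?,?)",
                ("PWBS-01", "WaterLevel", "1965-07-01T06:00", 1.42, 1, "logger"))
    # Traceability: raw (level 0) and checked (level 1) series stay distinguishable.
    for row in con.execute("SELECT * FROM DataValues WHERE QualityControlLevel = 1"):
        print(row)
    ```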

  10. Reading Gate Positions with a Smartphone

    NASA Astrophysics Data System (ADS)

    van Overloop, Peter-Jules; Hut, Rolf

    2015-04-01

    Worldwide, many flow gates are built into water networks in order to direct water to appropriate locations. Most of these gates are adjusted manually by field operators of water management organizations, and the new position of a gate is often not known centrally. This makes centralized management of the entire water network difficult. One of the reasons why the gate position is usually not measured is that for certain gates such a reading is not easy to take. Tilting weirs and radial gates are examples where operators need special equipment (a measuring rod and a long level) to determine the position, and it can even be a risky procedure. Another issue is that once the measurement is done, the value is jotted down in a notebook and only later, at the office, entered into a computer system. So the entire monitoring procedure is not real-time and is prone to human error. A new way of monitoring gate positions is introduced. It consists of a level attached to the gate and an app with which a picture of the level can be taken. Using dedicated pattern recognition algorithms, the gate position can be read from the angle of the level relative to reference points on the gate, the radius of the gate and the absolute level of the joint around which the gate turns. The method uses GPS localization of the smartphone to store the gate position at the right location in the central database.
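    As a back-of-envelope sketch of the geometry involved (the sign convention and function name are assumptions for illustration, not the authors' algorithm):

    ```python
    # Estimate the elevation of a radial gate's lip from the tilt angle read
    # off the attached level, the gate radius, and the absolute elevation of
    # the joint the gate rotates around (tilt measured from straight down).
    from math import cos, radians

    def gate_lip_elevation(joint_elev_m, radius_m, tilt_deg):
        return joint_elev_m - radius_m * cos(radians(tilt_deg))

    # Gate hinged 3.0 m above datum, 2.5 m radius, level reads 30 degrees:
    print(round(gate_lip_elevation(3.0, 2.5, 30.0), 2))   # -> 0.83
    ```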

  11. Cell Phone-Based System (Chaak) for Surveillance of Immatures of Dengue Virus Mosquito Vectors

    PubMed Central

    LOZANO–FUENTES, SAUL; WEDYAN, FADI; HERNANDEZ–GARCIA, EDGAR; SADHU, DEVADATTA; GHOSH, SUDIPTO; BIEMAN, JAMES M.; TEP-CHEL, DIANA; GARCÍA–REJÓN, JULIÁN E.; EISEN, LARS

    2014-01-01

    Capture of surveillance data on mobile devices and rapid transfer of such data from these devices into an electronic database or data management and decision support systems promote timely data analyses and public health response during disease outbreaks. Mobile data capture is used increasingly for malaria surveillance and holds great promise for surveillance of other neglected tropical diseases. We focused on mosquito-borne dengue, with the primary aims of: 1) developing and field-testing a cell phone-based system (called Chaak) for capture of data relating to the surveillance of the mosquito immature stages, and 2) assessing, in the dengue endemic setting of Mérida, México, the cost-effectiveness of this new technology versus paper-based data collection. Chaak includes a desktop component, where a manager selects premises to be surveyed for mosquito immatures, and a cell phone component, where the surveyor receives the assigned tasks and captures the data. Data collected on the cell phone can be transferred to a central database through different modes of transmission, including near-real time where data are transferred immediately (e.g., over the Internet) or by first storing data on the cell phone for future transmission. Spatial data are handled in a novel, semantically driven, geographic information system. Compared with a pen-and-paper-based method, use of Chaak improved the accuracy and increased the speed of data transcription into an electronic database. The cost-effectiveness of using the Chaak system will depend largely on the up-front cost of purchasing cell phones and the recurring cost of data transfer over a cellular network. PMID:23926788
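    As a toy sketch of the store-and-forward transfer mode described above (function and queue names are hypothetical, not Chaak's API):

    ```python
    import json, queue

    outbox = queue.Queue()   # records held on the phone until they can be sent

    def record_survey(premise_id, containers_positive):
        outbox.put(json.dumps({"premise": premise_id,
                               "positive": containers_positive}))

    def flush(connected: bool):
        """Send queued records when online; otherwise keep them on the phone."""
        sent = 0
        while connected and not outbox.empty():
            payload = outbox.get()   # in practice: HTTP POST to the central server
            sent += 1
        return sent

    record_survey("MER-0042", 3)
    print(flush(connected=False))   # -> 0 (stored for later transmission)
    print(flush(connected=True))    # -> 1 (near-real time when online)
    ```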

  12. The effect of care pathways for hip fractures: a systematic review.

    PubMed

    Leigheb, Fabrizio; Vanhaecht, Kris; Sermeus, Walter; Lodewijckx, Cathy; Deneckere, Svin; Boonen, Steven; Boto, Paulo Alexandre Faria; Mendes, Rita Veloso; Panella, Massimiliano

    2012-07-01

    We performed a systematic review for primary studies on care pathways (CPs) for hip fracture (HF). The online databases MEDLINE-PubMed, Ovid-EMBASE, CINAHL-EBSCO host, and The Cochrane Library (Cochrane Central Register of Clinical Trials, Health Technology Assessment Database, NHS Economic Evaluation Database) were searched. Two researchers reviewed the literature independently. Primary studies that met predefined inclusion criteria were assessed for their methodological quality. A total of 15 publications were included: 15 primary studies corresponding with 12 main investigations. Primary studies were evaluated for clinical outcomes, process outcomes, and economic outcomes. The studies assessed a wide range of outcome measures. While a number of divergent clinical outcomes were reported, most studies showed positive results of process management and health-services utilization. In terms of mortality, the results provided evidence for a positive impact of CPs on in-hospital mortality. Most studies also showed a significantly reduced risk of complications, including medical complications, wound infections, and pressure sores. Moreover, time-span process measures showed that an improvement in the organization of care was achieved through the use of CPs. Conflicting results were observed with regard to functional recovery and mobility between patients treated with CPs compared to usual care. Although our review suggests that CPs can have positive effects in patients with HF, the available evidence is insufficient for formal recommendations. There is a need for more research on CPs with selected process and outcome indicators, for in-hospital and postdischarge management of HF, with an emphasis on well-designed randomized trials.

  14. 76 FR 73564 - Federal Acquisition Regulation; Updates to Contract Reporting and Central Contractor Registration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-29

    ... Federal Acquisition Regulation; Updates to Contract Reporting and Central Contractor Registration AGENCIES... Procurement Data System (FPDS). Additionally, changes are proposed for the clauses requiring contractor registration in the Central Contractor Registration (CCR) database and DUNS number reporting. DATES: Interested...

  15. Databases for multilevel biophysiology research available at Physiome.jp.

    PubMed

    Asai, Yoshiyuki; Abe, Takeshi; Li, Li; Oka, Hideki; Nomura, Taishin; Kitano, Hiroaki

    2015-01-01

    Physiome.jp (http://physiome.jp) is a portal site inaugurated in 2007 to support model-based research in physiome and systems biology. At Physiome.jp, several tools and databases are available to support construction of physiological, multi-hierarchical, large-scale models. There are three databases in Physiome.jp, housing mathematical models, morphological data, and time-series data. In late 2013, the site was fully renovated, and in May 2015, new functions were implemented to provide information infrastructure to support collaborative activities for developing models and performing simulations within the database framework. This article describes updates to the databases implemented since 2013, including cooperation among the three databases, interactive model browsing, user management, version management of models, management of parameter sets, and interoperability with applications.

  16. Blending Education and Polymer Science: Semiautomated Creation of a Thermodynamic Property Database

    ERIC Educational Resources Information Center

    Tchoua, Roselyne B.; Qin, Jian; Audus, Debra J.; Chard, Kyle; Foster, Ian T.; de Pablo, Juan

    2016-01-01

    Structured databases of chemical and physical properties play a central role in the everyday research activities of scientists and engineers. In materials science, researchers and engineers turn to these databases to quickly query, compare, and aggregate various properties, thereby allowing for the development or application of new materials. The…

  17. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2002-08-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, and relational databases, as well as ACeDB. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system.
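    A toy comparison of the storage options the unit reviews, using only the Python standard library (file names are arbitrary):

    ```python
    import csv, dbm, sqlite3

    record = {"strain": "CB4856", "phenotype": "wild isolate"}

    # Flat file: simplest to write, but every lookup is a linear scan.
    with open("strains.csv", "w", newline="") as f:
        csv.writer(f).writerow([record["strain"], record["phenotype"]])

    # Indexed file: fast lookup by a single key, but no ad hoc queries.
    with dbm.open("strains_idx", "c") as db:
        db[record["strain"]] = record["phenotype"]

    # Relational: declarative queries across many attributes.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE strains (name TEXT PRIMARY KEY, phenotype TEXT)")
    con.execute("INSERT INTO strains VALUES (?, ?)",
                (record["strain"], record["phenotype"]))
    print(con.execute("SELECT phenotype FROM strains WHERE name = 'CB4856'").fetchone())
    ```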

  18. Mass-storage management for distributed image/video archives

    NASA Astrophysics Data System (ADS)

    Franchi, Santina; Guarda, Roberto; Prampolini, Franco

    1993-04-01

    The realization of an image/video database requires a specific design for both the database structures and the mass storage management. This issue was addressed in the project for the digital image/video database system designed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog the image/video coding techniques with their related parameters, and the descriptions of image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server. Because of their large size, they are stored outside the database on network devices. The database contains the pointers to the image/video files and the descriptions of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management: they catalog devices and modify device status and device network location. The medium level manages image/video files on a physical basis; it handles file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to fit delivery/visualization requirements and to reduce archiving costs.
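    A minimal sketch of this pointer-based pattern, with an invented two-table catalog (not the actual schema of the system described):

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE devices (device_id TEXT PRIMARY KEY, kind TEXT, mount TEXT);
    CREATE TABLE assets  (asset_id TEXT PRIMARY KEY, codec TEXT,
                          device_id TEXT REFERENCES devices(device_id),
                          rel_path TEXT);
    """)
    con.execute("INSERT INTO devices VALUES ('opt1', 'optical-jukebox', '/mnt/opt1')")
    con.execute("INSERT INTO assets VALUES ('clip42', 'mjpeg', 'opt1', '1993/clip42.avi')")

    def resolve(asset_id):
        """Turn a catalog pointer into a full path on the storage device."""
        mount, rel_path = con.execute(
            """SELECT d.mount, a.rel_path FROM assets a
               JOIN devices d USING (device_id) WHERE a.asset_id = ?""",
            (asset_id,)).fetchone()
        return f"{mount}/{rel_path}"

    print(resolve("clip42"))   # -> /mnt/opt1/1993/clip42.avi
    ```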

  19. Acupuncture for treating sciatica: a systematic review protocol

    PubMed Central

    Qin, Zongshi; Liu, Xiaoxu; Yao, Qin; Zhai, Yanbing; Liu, Zhishun

    2015-01-01

    Introduction This systematic review aims to assess the effectiveness and safety of acupuncture for treating sciatica. Methods The following nine databases will be searched from their inception to 30 October 2014: MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials (CENTRAL), the Chinese Biomedical Literature Database (CBM), the Chinese Medical Current Content (CMCC), the Chinese Scientific Journal Database (VIP database), the Wan-Fang Database, the China National Knowledge Infrastructure (CNKI) and Citation Information by National Institute of Informatics (CiNii). Randomised controlled trials (RCTs) of acupuncture for sciatica in English, Chinese or Japanese without restriction of publication status will be included. Two researchers will independently undertake study selection, extraction of data and assessment of study quality. Meta-analysis will be conducted after screening of studies. Data will be analysed using risk ratio for dichotomous data, and standardised mean difference or weighted mean difference for continuous data. Dissemination This systematic review will be disseminated electronically through a peer-reviewed publication or conference presentations. Trial registration number PROSPERO CRD42014015001. PMID:25922105

  20. A spatial-temporal system for dynamic cadastral management.

    PubMed

    Nan, Liu; Renyi, Liu; Guangliang, Zhu; Jiong, Xie

    2006-03-01

    A practical spatio-temporal database (STDB) technique for dynamic urban land management is presented. One of the STDB models, the expanded Base State with Amendments (BSA) model, is selected as the basis for developing the dynamic cadastral management technique. Two approaches, Section Fast Indexing (SFI) and Storage Factors of Variable Granularity (SFVG), are used to improve the efficiency of the BSA model. Both spatial graphic data and attribute data are stored, through a succinct engine, in a standard relational database management system (RDBMS) for the actual implementation of the BSA model. The spatio-temporal database is divided into three interdependent sub-databases: the present DB, the history DB and the procedures-tracing DB. The efficiency of database operations is improved by handling the database connection in the bottom layer of Microsoft SQL Server. The spatio-temporal system can be provided at low cost while satisfying the basic needs of urban land management in China. The approaches presented in this paper may also be of significance to countries where land patterns change frequently or to agencies whose financial resources are limited.
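    A conceptual sketch of the BSA idea, with an invented record layout: the state at any moment is reconstructed by replaying the amendments recorded after the base state.

    ```python
    base_state = {"parcel-7": {"owner": "Li", "area_m2": 420}}

    amendments = [              # (sequence, parcel, field, new_value)
        (1, "parcel-7", "owner", "Zhu"),
        (2, "parcel-7", "area_m2", 380),
    ]

    def state_at(seq):
        """Replay amendments up to a sequence number onto the base state."""
        state = {k: dict(v) for k, v in base_state.items()}
        for s, parcel, field, value in amendments:
            if s <= seq:
                state[parcel][field] = value
        return state

    print(state_at(0))   # base state only (history)
    print(state_at(2))   # present state after both amendments
    ```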

  1. Delivery system characteristics and their association with quality and costs of care: implications for accountable care organizations.

    PubMed

    Chukmaitov, Askar; Harless, David W; Bazzoli, Gloria J; Carretta, Henry J; Siangphoe, Umaporn

    2015-01-01

    Implementation of accountable care organizations (ACOs) is currently underway, but there is limited empirical evidence on the merits of the ACO model. The aim was to study the associations between delivery system characteristics reflecting ACO competencies, including centralization strategies to manage organizations, hospital integration with physicians and outpatient facilities, health information technology, and infrastructure to monitor community health and report quality, and risk-adjusted 30-day all-cause mortality and case-mix-adjusted inpatient costs for the Medicare population. Panel data (2006-2009) were assembled from Florida and multiple sources: inpatient hospital discharge, vital statistics, the American Hospital Association, the Healthcare Information and Management Systems Society, and other databases. We applied a panel study design, controlling for hospital and market characteristics. Hospitals that were in centralized health systems or became more centralized over the study period had significantly larger reductions in mortality compared with hospitals that remained freestanding. Surprisingly, tightly integrated hospital-physician arrangements were associated with increased mortality; as such, hospitals may wish to proceed cautiously when developing specific types of alignment with local physician organizations. We observed no statistically significant differences in the growth rate of costs across hospitals in any of the health systems studied relative to freestanding hospitals. Although we observed quality improvement in some organizational types, these outcome improvements were not coupled with the additional desired objective of lower cost growth. This implies that additional changes not present during our study period, potentially changes in provider payment approaches, are essential for achieving the ACO objectives of higher quality of care at lower costs. Provider organizations implementing ACOs should consider centralizing service delivery as a viable strategy to improve quality of care, although this strategy did not result in lower cost growth.

  2. Central Colorado Assessment Project (CCAP)-Geochemical data for rock, sediment, soil, and concentrate sample media

    USGS Publications Warehouse

    Granitto, Matthew; DeWitt, Ed H.; Klein, Terry L.

    2010-01-01

    This database was initiated, designed, and populated to collect and integrate geochemical data from central Colorado in order to facilitate geologic mapping, petrologic studies, mineral resource assessment, definition of geochemical baseline values and statistics, environmental impact assessment, and medical geology. The Microsoft Access database serves as a geochemical data warehouse in support of the Central Colorado Assessment Project (CCAP) and contains data tables describing historical and new quantitative and qualitative geochemical analyses determined by 70 analytical laboratory and field methods for 47,478 rock, sediment, soil, and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey (USGS) personnel and analyzed either in the analytical laboratories of the USGS or by contract with commercial analytical laboratories. These data represent analyses of samples collected as part of various USGS programs and projects. In addition, geochemical data from 7,470 sediment and soil samples collected and analyzed under the Atomic Energy Commission National Uranium Resource Evaluation (NURE) Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program (henceforth called NURE) have been included in this database. In addition to data from 2,377 samples collected and analyzed under CCAP, this dataset includes archived geochemical data originally entered into the in-house Rock Analysis Storage System (RASS) database (used by the USGS from the mid-1960s through the late 1980s) and the in-house PLUTO database (used by the USGS from the mid-1970s through the mid-1990s). All of these data are maintained in the Oracle-based National Geochemical Database (NGDB). Retrievals from the NGDB and from the NURE database were used to generate most of this dataset. In addition, USGS data that have been excluded previously from the NGDB because the data predate earliest USGS geochemical databases, or were once excluded for programmatic reasons, have been included in the CCAP Geochemical Database and are planned to be added to the NGDB.

  3. The watershed and river systems management program

    USGS Publications Warehouse

    Markstrom, S.L.; Frevert, D.; Leavesley, G.H.; ,

    2005-01-01

    The Watershed and River System Management Program (WaRSMP), a joint effort between the U.S. Geological Survey (USGS) and the U.S. Bureau of Reclamation (Reclamation), is focused on research and development of decision support systems and their application to achieve an equitable balance among diverse water resource management demands. Considerations include: (1) legal and political constraints; (2) stakeholder and consensus-building; (3) sound technical knowledge; (4) flood control, consumptive use, and hydropower; (5) water transfers; (6) irrigation return flows and water quality; (7) recreation; (8) habitat for endangered species; (9) water supply and proration; (10) near-surface groundwater; and (11) water ownership, accounting, and rights. To address the interdisciplinary and multi-stakeholder needs of real-time watershed management, WaRSMP has developed a decision support system toolbox. The USGS Object User Interface facilitates the coupling of Reclamation's RiverWare reservoir operations model with the USGS Modular Modeling and Precipitation Runoff Modeling Systems through a central database. This integration is accomplished through the use of Model and Data Management Interfaces. WaRSMP applications include the Colorado River mainstem and Gunnison Basin, the Yakima Basin, the Middle Rio Grande Basin, the Truckee-Carson Basin, and the Umatilla Basin.

  4. Nutritional metabolomics: Progress in addressing complexity in diet and health

    PubMed Central

    Jones, Dean P.; Park, Youngja; Ziegler, Thomas R.

    2013-01-01

    Nutritional metabolomics is rapidly maturing to use small molecule chemical profiling to support integration of diet and nutrition in complex biosystems research. These developments are critical to facilitate transition of nutritional sciences from population-based to individual-based criteria for nutritional research, assessment and management. This review addresses progress in making these approaches manageable for nutrition research. Important concept developments concerning the exposome, predictive health and complex pathobiology, serve to emphasize the central role of diet and nutrition in integrated biosystems models of health and disease. Improved analytic tools and databases for targeted and non-targeted metabolic profiling, along with bioinformatics, pathway mapping and computational modeling, are now used for nutrition research on diet, metabolism, microbiome and health associations. These new developments enable metabolome-wide association studies (MWAS) and provide a foundation for nutritional metabolomics, along with genomics, epigenomics and health phenotyping, to support integrated models required for personalized diet and nutrition forecasting. PMID:22540256

  5. Design of Control Plane Architecture Based on Cloud Platform and Experimental Network Demonstration for Multi-domain SDON

    NASA Astrophysics Data System (ADS)

    Li, Ming; Yin, Hongxi; Xing, Fangyuan; Wang, Jingchao; Wang, Honghuan

    2016-02-01

    With its features of network virtualization and resource programmability, the Software Defined Optical Network (SDON) is considered the future development trend of optical networks, providing more flexible, efficient and open network functions and supporting the intraconnection and interconnection of data centers. Meanwhile, cloud platforms can provide powerful computing, storage and management capabilities. In this paper, through the coordination of SDON and a cloud platform, a multi-domain SDON architecture based on a cloud control plane is proposed, composed of data centers with databases (DB), path computation elements (PCE), SDON controllers and an orchestrator. In addition, the structures of the multi-domain SDON orchestrator and the OpenFlow-enabled optical node are proposed, to realize an effective management and control platform that combines centralized and distributed control. Finally, functional verification and demonstration are performed on our optical experimental network.

  6. Development of a Personal Digital Assistant (PDA) based client/server NICU patient data and charting system.

    PubMed

    Carroll, A E; Saluja, S; Tarczy-Hornoch, P

    2001-01-01

    Personal Digital Assistants (PDAs) offer clinicians the ability to enter and manage critical information at the point of care. Although PDAs have always been designed to be intuitive and easy to use, recent advances in technology have made them even more accessible. The ability to link data on a PDA (client) to a central database (server) allows for near-unlimited potential in developing point of care applications and systems for patient data management. Although many stand-alone systems exist for PDAs, none are designed to work in an integrated client/server environment. This paper describes the design, software and hardware selection, and preliminary testing of a PDA based patient data and charting system for use in the University of Washington Neonatal Intensive Care Unit (NICU). This system will be the subject of a subsequent study to determine its impact on patient outcomes and clinician efficiency.

  7. Computer Science and Technology: Modeling and Measurement Techniques for Evaluation of Design Alternatives in the Implementation of Database Management Software. Final Report.

    ERIC Educational Resources Information Center

    Deutsch, Donald R.

    This report describes a research effort that was carried out over a period of several years to develop and demonstrate a methodology for evaluating proposed Database Management System designs. The major proposition addressed by this study is embodied in the thesis statement: Proposed database management system designs can be evaluated best through…

  8. Database Management System

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In 1981 Wayne Erickson founded Microrim, Inc, a company originally focused on marketing a microcomputer version of RIM (Relational Information Manager). Dennis Comfort joined the firm and is now vice president, development. The team developed an advanced spinoff from the NASA system they had originally created, a microcomputer database management system known as R:BASE 4000. Microrim added many enhancements and developed a series of R:BASE products for various environments. R:BASE is now the second largest selling line of microcomputer database management software in the world.

  9. Qualitative research in teen experiences living with food-induced anaphylaxis: A meta-aggregation.

    PubMed

    Johnson, Sara F; Woodgate, Roberta L

    2017-11-01

    To describe the central experiences of teens living with food-induced anaphylaxis as a first step in responding to healthcare needs in this population. As prevalence of allergy increases and commonly outgrown allergies persist longer, chronic management for teens becomes increasingly important. Synthesizing existing research helps to recognize management needs specific to teens with food allergy. Meta-aggregation for qualitative systematic review, to create synthesis for clinical improvement; guided by Joanna Briggs Institute methods and their Qualitative Assessment and Review Instrument. Seven relevant databases were searched for original qualitative research July 2015; 10 studies (published 2007-2015) met inclusion criteria. Both authors undertook critical appraisal, with consensus by discussion. Findings from line-by-line extraction were grouped into categories and syntheses. In studies with mixed populations, we included only teens (age 12-19) with food-induced anaphylaxis. We developed three syntheses from nine categories and 64 subcategories to reflect central experiences of teens with food-induced anaphylaxis, including: (1) defining the allergic self; (2) finding a balance and (3) controlling the uncontrollable. The syntheses encompass importance of allergic identity/understanding, difficulties in coping with burdens of food allergy and reflect the complex risk interactions teens must negotiate in social contexts. There is a need to respect teens as active participants in managing food-induced anaphylaxis, while recognizing that social expectations and a lack of public awareness/safety can dangerously affect one's needs and decisions. This helps broaden how we conceptualize the needs of teens living with food-induced anaphylaxis, informing ongoing care and management. © 2017 John Wiley & Sons Ltd.

  10. NBIC: National Ballast Information Clearinghouse

    Science.gov Websites

  11. AGRICULTURAL BEST MANAGEMENT PRACTICE EFFECTIVENESS DATABASE

    EPA Science Inventory

    Resource Purpose: The Agricultural Best Management Practice Effectiveness Database contains the results of research projects which have collected water quality data for the purpose of determining the effectiveness of agricultural management practices in reducing pollutants ...

  12. Evaluation of relational and NoSQL database architectures to manage genomic annotations.

    PubMed

    Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard

    2016-12-01

    While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
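    As a hedged sketch of the document-oriented option for one dbSNP-style annotation (collection and field names are illustrative, not the study's; assumes a local MongoDB and pymongo):

    ```python
    from pymongo import ASCENDING, MongoClient

    client = MongoClient("localhost", 27017)
    coll = client.genomics.annotations

    coll.insert_one({"rsid": "rs7412", "chrom": "19", "pos": 44908822,
                     "gene": "APOE", "consequence": "missense_variant"})
    # Index the lookup key, mirroring the indexing step the comparison measures.
    coll.create_index([("rsid", ASCENDING)], unique=True)
    print(coll.find_one({"rsid": "rs7412"}, {"_id": 0}))
    ```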

  13. Assessing the quality of life history information in publicly available databases.

    PubMed

    Thorson, James T; Cope, Jason M; Patrick, Wesley S

    2014-01-01

    Single-species life history parameters are central to ecological research and management, including the fields of macro-ecology, fisheries science, and ecosystem modeling. However, there has been little independent evaluation of the precision and accuracy of the life history values in global and publicly available databases. We therefore develop a novel method based on a Bayesian errors-in-variables model that compares database entries with estimates from local experts, and we illustrate this process by assessing the accuracy and precision of entries in FishBase, one of the largest and oldest life history databases. This model distinguishes biases among seven life history parameters, two types of information available in FishBase (i.e., published values and those estimated from other parameters), and two taxa (i.e., bony and cartilaginous fishes) relative to values from regional experts in the United States, while accounting for additional variance caused by sex- and region-specific life history traits. For published values in FishBase, the model identifies a small positive bias in natural mortality and negative bias in maximum age, perhaps caused by unacknowledged mortality caused by fishing. For life history values calculated by FishBase, the model identified large and inconsistent biases. The model also demonstrates greatest precision for body size parameters, decreased precision for values derived from geographically distant populations, and greatest between-sex differences in age at maturity. We recommend that our bias and precision estimates be used in future errors-in-variables models as a prior on measurement errors. This approach is broadly applicable to global databases of life history traits and, if used, will encourage further development and improvements in these databases.
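    One plausible way to write the core of such an errors-in-variables model (notation ours, a sketch rather than the paper's exact specification):

    ```latex
    % Database entry x_{ij} for fish i, parameter j: the (error-prone) expert
    % value theta_{ij} plus a bias specific to parameter, information type
    % s(i) (published vs. derived) and taxon t(i), plus measurement noise.
    \[
      x_{ij} = \theta_{ij} + \delta_{j,\,s(i),\,t(i)} + \varepsilon_{ij},
      \qquad \varepsilon_{ij} \sim \mathcal{N}\bigl(0,\ \sigma_{j}^{2}\bigr)
    \]
    % The reported biases are the estimated \delta's; the reported precisions
    % correspond to the estimated \sigma_j's.
    ```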

  14. XML: James Webb Space Telescope Database Issues, Lessons, and Status

    NASA Technical Reports Server (NTRS)

    Detter, Ryan; Mooney, Michael; Fatig, Curtis

    2003-01-01

    This paper presents the current concept of using the eXtensible Markup Language (XML) as the underlying structure for the James Webb Space Telescope (JWST) database. The purpose of using XML is to provide a JWST database that is independent of any portion of the ground system, yet still compatible with the various systems using a variety of different structures. Testing of the JWST Flight Software (FSW) started in 2002, yet the launch is scheduled for 2011 with a planned 5-year mission and a 5-year follow-on option. The initial database and ground system elements, including the commands, telemetry, and ground system tools, will be used for 19 years, plus post-mission activities. During the Integration and Test (I&T) phases of JWST development, 24 distinct, geographically dispersed laboratories will have local database tools with an XML database. Each laboratory's database tools will be used for exporting and importing data both locally and to a central database system, for inputting data to the database certification process, and for providing various reports. A centralized certified database repository will be maintained by the Space Telescope Science Institute (STScI) in Baltimore, Maryland, USA. One of the challenges for the database is to be flexible enough to allow individual items to be upgraded, added or changed without affecting the entire ground system. Using XML should also allow the import and export formats needed by the various elements to be altered, the verification/validation of each database item to be tracked, many organizations to provide database inputs, and the many existing database processes to be merged into one central database structure throughout the JWST program. Many National Aeronautics and Space Administration (NASA) projects have attempted to take advantage of open-source and commercial technology. Often this causes a greater reliance on Commercial-Off-The-Shelf (COTS) software, which is often limiting. In our review of the database requirements and the COTS software available, only very expensive COTS software would meet 90% of the requirements. Even with the high projected initial cost of COTS, the cost of development and support for custom code over the 19-year mission period was forecast to be higher than the total licensing costs. A group also looked at reusing existing database tools and formats. If the JWST database were already in a mature state, reuse would have made sense; but with the database still needing to handle the addition of different types of command and telemetry structures, to define new spacecraft systems, and to accept input from and export to systems that have not been defined yet, XML provided the flexibility desired. It remains to be determined whether the XML database will reduce the overall cost of the JWST mission.
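    A small sketch of why XML suits a long-lived, multi-laboratory database: readers can ignore attributes and elements they do not yet know, easing schema evolution. The element and attribute names below are invented, not the JWST schema.

    ```python
    import xml.etree.ElementTree as ET

    doc = """<database version="0.3">
      <telemetry mnemonic="TMP_A1" units="degC" subsystem="ISIM"/>
      <command mnemonic="PWR_ON" criticality="high"/>
    </database>"""

    root = ET.fromstring(doc)
    for item in root:
        # An older reader simply skips attributes added by newer tools.
        print(item.tag, item.get("mnemonic"), item.get("units", "n/a"))
    ```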

  15. 15 years of monitoring occupational exposure to respirable dust and quartz within the European industrial minerals sector.

    PubMed

    Zilaout, Hicham; Vlaanderen, Jelle; Houba, Remko; Kromhout, Hans

    2017-07-01

    In 2000, a prospective Dust Monitoring Program (DMP) was started in which measurements of workers' exposure to respirable dust and quartz are collected in member companies of the European Industrial Minerals Association (IMA-Europe). After 15 years, the resulting IMA-DMP database allows a detailed overview of exposure levels to respirable dust and quartz over time within this industrial sector. Our aim is to describe the IMA-DMP and the current state of the corresponding database, which is still growing as the IMA-DMP continues. The future use of the database is also highlighted, including its utility for the industrial minerals producing sector. Exposure data are obtained following a common protocol comprising a standardized sampling strategy, standardized sampling and analytical methods, and a data management system. Following strict quality control procedures, exposure data are then added to a central database. The data comprise personal exposure measurements, including auxiliary information on work and other conditions during sampling. Currently, the IMA-DMP database consists of almost 28,000 personal measurements performed from 2000 until 2015, representing 29 half-yearly sampling campaigns. The exposure data have been collected from 160 different worksites owned by 35 industrial mineral companies in 23 European countries and cover approximately 5000 workers. The IMA-DMP database provides the European minerals sector with reliable data on workers' personal exposure to respirable dust and quartz. The database can be used as a powerful tool to address outstanding scientific issues on long-term exposure trends and exposure variability and, importantly, as a surveillance tool to evaluate exposure control measures. The database will be valuable for future epidemiological studies on respiratory health effects and will allow the estimation of quantitative exposure-response relationships. Copyright © 2017 The Authors. Published by Elsevier GmbH. All rights reserved.

  16. Orthotic management of instability of the knee related to neuromuscular and central nervous system disorders: systematic review, qualitative study, survey and costing analysis.

    PubMed

    O'Connor, Joanne; McCaughan, Dorothy; McDaid, Catriona; Booth, Alison; Fayter, Debra; Rodriguez-Lopez, Roccio; Bowers, Roy; Dyson, Lisa; Iglesias, Cynthia P; Lalor, Simon; O'Connor, Rory J; Phillips, Margaret; Ramdharry, Gita

    2016-07-01

    Patients who have knee instability that is associated with neuromuscular disease (NMD) and central nervous system (CNS) conditions can be treated using orthoses, such as knee-ankle-foot orthoses (KAFOs). To assess existing evidence on the effectiveness of orthoses; patient perspectives; types of orthotic devices prescribed in the UK NHS; and associated costs. Qualitative study of views of orthoses users - a qualitative in-depth interview study was undertaken. Data were analysed for thematic content. A coding scheme was developed and an inductive approach was used to identify themes. Systematic review - 18 databases were searched up to November 2014: MEDLINE, MEDLINE In-Process & Other Non-Indexed Citations, Cumulative Index to Nursing and Allied Health, EMBASE, PASCAL, Scopus, Science Citation Index, BIOSIS Previews, Physiotherapy Evidence Database, Recal Legacy, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, Health Technology Assessment database, Cochrane Central Register of Controlled Trials, Conference Proceedings Citation Index: Science, Health Management Consortium, ClinicalTrials.gov, International Clinical Trials Registry Platform and National Technical Information Service. Studies of adults using an orthosis for instability of the knee related to NMD or a CNS disorder were included. Data were extracted and quality was assessed by two researchers. Narrative synthesis was undertaken. Survey and costing analysis - a web survey of orthotists, physiotherapists and rehabilitation medicine physicians was undertaken. Telephone interviews with orthotists informed a costing analysis. Qualitative study - a total of 24 people participated. Potential for engagement in daily activities was of vital importance to patients; the extent to which their device enabled this was the yardstick by which it was measured. Patients' prime desired outcome was a reduction in pain, falls or trips, with improved balance and stability. Effectiveness, reliability, comfort and durability were the most valued features of orthoses. Many expressed frustration with perceived deficiencies in service provision relating to appointment and administrative systems and referral pathways. Systematic review - a total of 21 studies (478 participants) were included of people who had post-polio syndrome, inclusion body myositis, were post stroke or had spinal cord injury. The studies evaluated KAFOs (mainly carbon fibre), stance control KAFO and hip KAFOs. All of the studies were at risk of bias and, in general, were poorly reported. Survey and costing analysis - in total, 238 health-care professionals responded. A range of orthoses is prescribed for knee instability that is related to NMD or CNS conditions, approximately half being custom-made. At least 50% of respondents thought that comfort and confidence in mobility were extremely important treatment outcomes. The cost of individual KAFOs was highly variable, ranging from £73 to £3553. Various types of orthoses are used in the NHS to manage patients with NMD/CNS conditions and knee instability, both custom-made and prefabricated, of variable cost. Evidence on the effectiveness of the orthoses is limited, especially in relation to the outcomes that are important to orthoses users. The population included was broad, limiting any in-depth consideration of specific conditions. The response rate to the survey was low, and the costing analysis was based on some assumptions that may not reflect the true costs of providing KAFOs. 
Future work should include high-quality research on the effectiveness and cost-effectiveness of orthoses; development of a core set of outcome measures; further exploration of the views and experiences of patients; and the best models of service delivery. This study is registered as PROSPERO CRD42014010180. The qualitative study is registered as Current Controlled Trials ISRCTN65240228. The National Institute for Health Research Health Technology Assessment programme.

  17. Application of cloud database in the management of clinical data of patients with skin diseases.

    PubMed

    Mao, Xiao-fei; Liu, Rui; DU, Wei; Fan, Xue; Chen, Dian; Zuo, Ya-gang; Sun, Qiu-ning

    2015-04-01

    To evaluate the needs and applications of a cloud database in the daily practice of a dermatology department. A cloud database was established for systemic scleroderma and localized scleroderma. Paper forms were used to record the original data, including personal information, pictures, specimens, blood biochemical indicators, skin lesions, and scores of self-rating scales; the results were then input into the cloud database. The applications of the cloud database in the dermatology department were summarized and analyzed. The personal and clinical information of 215 systemic scleroderma patients and 522 localized scleroderma patients was included and analyzed using the cloud database. Disease status, quality of life, and prognosis were obtained by statistical calculation. The cloud database can efficiently and rapidly store and manage the data of patients with skin diseases. As a simple, prompt, safe, and convenient tool, it can be used in patient information management, clinical decision-making, and scientific research.

  18. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS FISH AND... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  19. A web based relational database management system for filariasis control

    PubMed Central

    Murty, Upadhyayula Suryanarayana; Kumar, Duvvuri Venkata Rama Satya; Sriram, Kumaraswamy; Rao, Kadiri Madhusudhan; Bhattacharyulu, Chakravarthula Hayageeva Narasimha Venakata; Praveen, Bhoopathi; Krishna, Amirapu Radha

    2005-01-01

    The present study describes an RDBMS (relational database management system) for the effective management of filariasis, a vector-borne disease. Filariasis infects 120 million people in 83 countries. The possible re-emergence of the disease and the complexity of existing control programs warrant the development of new strategies. A database containing comprehensive data associated with filariasis finds utility in disease control. We have developed a database containing information on the socio-economic status of patients, mosquito collection procedures, mosquito dissection data, filariasis survey reports and mass blood data. The database can be searched using a user-friendly web interface. Availability: http://www.webfil.org (login and password can be obtained from the authors) PMID:17597846

  20. 75 FR 55671 - Financial Assistance Use of Universal Identifier and Central Contractor Registration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-14

    ... of Universal Identifier and Central Contractor Registration AGENCY: Office of Federal Financial...) numbers and maintain current registrations in the Central Contractor Registration (CCR) database. An... CONTRACTOR REGISTRATION Sec. Subpart A--General 25.100 Purposes of this part. 25.105 Types of awards to which...

  1. Data mining and visualization of the Alabama accident database

    DOT National Transportation Integrated Search

    2000-08-01

    The Alabama Department of Public Safety has developed and maintains a centralized database that contains traffic accident data collected from crash reports completed by local police officers and state troopers. The Critical Analysis Reporting Environme...

  2. Integrating RFID technique to design mobile handheld inventory management system

    NASA Astrophysics Data System (ADS)

    Huang, Yo-Ping; Yen, Wei; Chen, Shih-Chung

    2008-04-01

    An RFID-based mobile handheld inventory management system is proposed in this paper. Unlike manual inventory management methods, the proposed system runs on a personal digital assistant (PDA) equipped with an RFID reader. The system identifies electronic tags on the properties and checks the property information in the back-end database server through a ubiquitous wireless network. It also provides a set of functions to manage the back-end inventory database and assigns different levels of access privilege to the various user categories. To prevent improper or illegal access, the back-end database server not only stores the inventory database and user privilege information, but also keeps track of user activities on the server, including login and logout times and locations, records of database accesses, and every modification of the tables. Some experimental results are presented to verify the applicability of the integrated RFID-based mobile handheld inventory management system.
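    A hedged sketch of the privilege check plus audit trail described, with an invented table layout and privilege levels:

    ```python
    import sqlite3, time

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE audit (ts REAL, user TEXT, role TEXT,
                                       action TEXT, detail TEXT)""")

    PRIVILEGES = {"admin": {"read", "modify"}, "clerk": {"read"}}

    def access(user, role, action, detail):
        """Record every attempt, allowed or not, then enforce the privilege."""
        con.execute("INSERT INTO audit VALUES (?,?,?,?,?)",
                    (time.time(), user, role, action, detail))
        if action not in PRIVILEGES.get(role, set()):
            raise PermissionError(f"{role} may not {action}")

    access("wei", "clerk", "read", "tag 04A2 property record")
    print(con.execute("SELECT user, role, action FROM audit").fetchall())
    ```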

  3. A new Volcanic managEment Risk Database desIgn (VERDI): Application to El Hierro Island (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Bartolini, S.; Becerril, L.; Martí, J.

    2014-11-01

    One of the most important issues in modern volcanology is the assessment of volcanic risk, which will depend - among other factors - on both the quantity and quality of the available data and an optimum storage mechanism. This will require the design of purpose-built databases that take into account data format and availability and afford easy data storage and sharing, and will provide for a more complete risk assessment that combines different analyses but avoids any duplication of information. Data contained in any such database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. A key design goal is to ensure that VERDI will serve as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure and the possibilities it offers for data organization are shown through its application on El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool to assist in volcanic risk assessment and management.

  4. Alternatives to relational database: comparison of NoSQL and XML approaches for clinical data storage.

    PubMed

    Lee, Ken Ka-Yin; Tang, Wai-Choi; Choi, Kup-Sze

    2013-04-01

    Clinical data are dynamic in nature, often arranged hierarchically and stored as free text and numbers. Effective management of clinical data and the transformation of the data into structured format for data analysis are therefore challenging issues in electronic health records development. Despite the popularity of relational databases, the scalability of the NoSQL database model and the document-centric data structure of XML databases appear to be promising features for effective clinical data management. In this paper, three database approaches--NoSQL, XML-enabled and native XML--are investigated to evaluate their suitability for structured clinical data. The database query performance is reported, together with our experience in the databases development. The results show that NoSQL database is the best choice for query speed, whereas XML databases are advantageous in terms of scalability, flexibility and extensibility, which are essential to cope with the characteristics of clinical data. While NoSQL and XML technologies are relatively new compared to the conventional relational database, both of them demonstrate potential to become a key database technology for clinical data management as the technology further advances. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
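
    The trade-off the paper measures can be illustrated in miniature. The sketch below is our own stand-in (the study evaluated real NoSQL and XML engines, not this in-memory toy): it contrasts storing one hierarchical clinical visit as a self-describing document with normalising the same data into relational tables:

```python
import json, sqlite3

# A hierarchical, variable-length clinical record: awkward to flatten.
visit = {
    "patient": "P001",
    "date": "2013-04-01",
    "observations": [
        {"code": "BP",    "value": "120/80"},
        {"code": "HbA1c", "value": 6.1, "unit": "%"},
    ],
    "notes": "free text fits naturally here",
}

# Document approach: store and retrieve the whole hierarchy as one blob;
# new attributes need no schema change.
doc_store = {"P001/2013-04-01": json.dumps(visit)}
print(json.loads(doc_store["P001/2013-04-01"])["observations"][1]["value"])

# Relational approach: the same data must be normalised into tables, and
# the schema must change whenever a new kind of attribute appears.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE visit (visit_id INTEGER PRIMARY KEY, patient TEXT, date TEXT, notes TEXT);
CREATE TABLE obs   (visit_id INTEGER, code TEXT, value TEXT, unit TEXT);
""")
db.execute("INSERT INTO visit VALUES (1, 'P001', '2013-04-01', ?)", (visit["notes"],))
for o in visit["observations"]:
    db.execute("INSERT INTO obs VALUES (1, ?, ?, ?)",
               (o["code"], str(o["value"]), o.get("unit")))
print(db.execute("SELECT value FROM obs WHERE code='HbA1c'").fetchone())
```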

  5. The NCBI BioSystems database.

    PubMed

    Geer, Lewis Y; Marchler-Bauer, Aron; Geer, Renata C; Han, Lianyi; He, Jane; He, Siqian; Liu, Chunlei; Shi, Wenyao; Bryant, Stephen H

    2010-01-01

    The NCBI BioSystems database, found at http://www.ncbi.nlm.nih.gov/biosystems/, centralizes and cross-links existing biological systems databases, increasing their utility and target audience by integrating their pathways and systems into NCBI resources. This integration allows users of NCBI's Entrez databases to quickly categorize proteins, genes and small molecules by metabolic pathway, disease state or other BioSystem type, without requiring time-consuming inference of biological relationships from the literature or multiple experimental datasets.

  6. A general temporal data model and the structured population event history register

    PubMed Central

    Clark, Samuel J.

    2010-01-01

    At this time there are 37 demographic surveillance system sites active in sub-Saharan Africa, Asia and Central America, and this number is growing continuously. These sites and other longitudinal population and health research projects generate large quantities of complex temporal data in order to describe, explain and investigate the event histories of individuals and the populations they constitute. This article presents possible solutions to some of the key data management challenges associated with those data. The fundamental components of a temporal system are identified and both they and their relationships to each other are given simple, standardized definitions. Further, a metadata framework is proposed to endow this abstract generalization with specific meaning and to bind the definitions of the data to the data themselves. The result is a temporal data model that is generalized, conceptually tractable, and inherently contains a full description of the primary data it organizes. Individual databases utilizing this temporal data model can be customized to suit the needs of their operators without modifying the underlying design of the database or sacrificing the potential to transparently share compatible subsets of their data with other similar databases. A practical working relational database design based on this general temporal data model is presented and demonstrated. This work has arisen out of experience with demographic surveillance in the developing world, and although the challenges and their solutions are more general, the discussion is organized around applications in demographic surveillance. An appendix contains detailed examples and working prototype databases that implement the examples discussed in the text. PMID:20396614
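
    The core idea of such a temporal data model (facts stored as episodes with validity intervals, so the state of an individual can be queried at any point in time) can be sketched as follows; this is our own toy construction, not the schema published in the article:

```python
import sqlite3

# Toy episode table: every fact carries a validity interval [start, stop).
# Table and column names are our own, not the article's published design.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE episode (
    individual TEXT, attribute TEXT, value TEXT, start TEXT, stop TEXT)""")
db.executemany("INSERT INTO episode VALUES (?,?,?,?,?)", [
    ("IND-42", "residence", "Village A", "1998-01-01", "2003-06-30"),
    ("IND-42", "residence", "Village B", "2003-06-30", "9999-12-31"),
    ("IND-42", "vital",     "alive",     "1998-01-01", "9999-12-31"),
])

def as_of(individual, date):
    """All attributes of `individual` valid on `date` (ISO dates compare as strings)."""
    return db.execute("""SELECT attribute, value FROM episode
                         WHERE individual = ? AND start <= ? AND ? < stop""",
                      (individual, date, date)).fetchall()

print(as_of("IND-42", "2001-05-01"))   # residence is still Village A
print(as_of("IND-42", "2010-05-01"))   # residence has become Village B
```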

  7. Analysis and preliminary design of Kunming land use and planning management information system

    NASA Astrophysics Data System (ADS)

    Li, Li; Chen, Zhenjie

    2007-06-01

    This article analyzes the Kunming land use planning and management information system in terms of its construction objectives and requirements, and identifies the system's users, functional requirements and construction requirements. On this basis, a three-tier architecture combining client/server (C/S) and browser/server (B/S) modes is defined: the user interface layer, the business logic layer and the data services layer. According to the requirements for the construction of a land use planning and management information database derived from standards of the Ministry of Land and Resources and the construction program of the Golden Land Project, this paper divides the system databases into a planning document database, planning implementation database, working map database and system maintenance database. In the design of the system interface, this paper uses various methods and data formats for data transmission and sharing between upper and lower administrative levels. Based on the system analysis results, the main modules of the system are designed as follows: planning data management, planning and annual plan preparation and control, day-to-day planning management, planning revision management, decision-making support, thematic query statistics, planning public participation and so on; in addition, implementation technologies are discussed with respect to the system's operation mode, development platform and other aspects.

  8. Profile: the Philippine Population Information Network.

    PubMed

    1991-06-01

    The profile of Philippine Population Information Network (POPIN) is described in this article as having changed management structure from the Population Center Foundation to the Government's Population Commission, Information Management and Research Division (IMRD) in 1989. This restructuring resulted in the transfer in 1990 of the Department of Social Welfare and Development to the Office of the President. POPIN also serves Asia/Pacific POPIN. POPCOM makes policy and coordinates and monitors population activities. POPIN's goal is to improve the flow and utilization of population information nationwide. The National Population Library was moved in 1989 to the POPCOM Central Office Building and became the Philippine Information Center. The collection includes 6000 books, 400 research reports, and 4000 other documents (brochures, reprints, conference materials, and so on); 42 video tapes about the Philippine population program and a cassette player are available. In 1989, 14 regional centers were set up in POPCOM regional offices and designated Regional Population Information Centers. There are also school-based information centers operating as satellite information centers. The Regional and school-based centers serve the purpose of providing technical information through collection development, cataloguing, classification, storage and retrieval, and circulation. The target users are policy makers, government and private research agencies, researchers, and faculty and students. Publications developed and produced by the Center include the 3rd Supplement of the Union Catalogue of Population Literature, the 1987-88 Annotated Bibliography of Philippine Population Literature (PPL), the forthcoming 1989-90 edition of the Annotated Bibliography of PPL, and a biyearly newsletter, POPINEWS. Microcomputers have been acquired for the Regional Centers, with the idea of computerizing POPIN. Computer upgrading is also being done within the IMRD to provide POPLINE CD-ROM capability. Central and regional staff have also had their skills upgraded; e.g., IMRD's staff in the use of Micro-ISIS software, which is used for developing databases and directories. Training is being conducted in the ESCAP database and directory grant program, and in information center management and desktop publishing. Linkages have been made with the local networks, which have contributed to the upgrading effort.

  9. Managing Written Directives: A Software Solution to Streamline Workflow.

    PubMed

    Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat S; Halama, James R; Bova, Davide

    2017-06-01

    A written directive is required by the U.S. Nuclear Regulatory Commission for any use of 131I above 1.11 MBq (30 μCi) and for patients receiving radiopharmaceutical therapy. This requirement has also been adopted and must be enforced by the agreement states. As the introduction of new radiopharmaceuticals increases therapeutic options in nuclear medicine, time spent on regulatory paperwork also increases. The pressure of managing these time-consuming regulatory requirements may heighten the potential for inaccurate or incomplete directive data and subsequent regulatory violations. To improve on the paper-trail method of directive management, we created a software tool using a Health Insurance Portability and Accountability Act (HIPAA)-compliant database. This software allows for secure data-sharing among physicians, technologists, and managers while saving time, reducing errors, and eliminating the possibility of loss and duplication. Methods: The software tool was developed using Visual Basic, which is part of the Visual Studio development environment for the Windows platform. Patient data are deposited in an Access database on a local HIPAA-compliant secure server or hard disk. Once a working version had been developed, it was installed at our institution and used to manage directives. Updates and modifications of the software were released regularly until no more significant problems were found with its operation. Results: The software has been used at our institution for over 2 y and has reliably kept track of all directives. All physicians and technologists use the software daily and find it superior to paper directives. They can retrieve active directives at any stage of completion, as well as completed directives. Conclusion: We have developed a software solution for the management of written directives that streamlines and structures the departmental workflow. This solution saves time, centralizes the information for all staff to share, and decreases confusion about the creation, completion, filing, and retrieval of directives. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
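
    The workflow the authors describe (a shared store of directives that staff advance through stages of completion) might look roughly like the following; the original tool is written in Visual Basic over an Access database, so this Python/SQLite sketch with invented field names illustrates the idea only:

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical stage names and columns; the real tool's fields are not
# published in the abstract.
STAGES = ["created", "signed", "administered", "completed"]

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE directive (
    id INTEGER PRIMARY KEY, patient TEXT, isotope TEXT,
    activity_MBq REAL, stage TEXT, updated TEXT)""")

def new_directive(patient, isotope, activity_mbq):
    db.execute("INSERT INTO directive VALUES (NULL,?,?,?,?,?)",
               (patient, isotope, activity_mbq, "created",
                datetime.now(timezone.utc).isoformat()))

def advance(directive_id):
    (stage,) = db.execute("SELECT stage FROM directive WHERE id=?",
                          (directive_id,)).fetchone()
    nxt = STAGES[STAGES.index(stage) + 1]      # raises past the final stage
    db.execute("UPDATE directive SET stage=?, updated=? WHERE id=?",
               (nxt, datetime.now(timezone.utc).isoformat(), directive_id))

new_directive("MRN-1001", "I-131", 5550.0)     # therapy dose, well above 1.11 MBq
advance(1)
print(db.execute("SELECT patient, stage FROM directive").fetchall())
```

    Because every staff member reads and writes the same table, a directive can be retrieved at any stage of completion, which is the behaviour the abstract reports.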

  10. Serials Management by Microcomputer: The Potential of DBMS.

    ERIC Educational Resources Information Center

    Vogel, J. Thomas; Burns, Lynn W.

    1984-01-01

    Describes serials management at Philadelphia College of Textiles and Science library via a microcomputer, a file manager called PFS, and a relational database management system called dBase II. Check-in procedures, programing with dBase II, "static" and "active" databases, and claim procedures are discussed. Check-in forms are…

  11. Some Reliability Issues in Very Large Databases.

    ERIC Educational Resources Information Center

    Lynch, Clifford A.

    1988-01-01

    Describes the unique reliability problems of very large databases that necessitate specialized techniques for hardware problem management. The discussion covers the use of controlled partial redundancy to improve reliability, issues in operating systems and database management systems design, and the impact of disk technology on very large…

  12. Tufts Health Sciences Database: Lessons, Issues, and Opportunities.

    ERIC Educational Resources Information Center

    Lee, Mary Y.; Albright, Susan A.; Alkasab, Tarik; Damassa, David A.; Wang, Paul J.; Eaton, Elizabeth K.

    2003-01-01

    Describes a seven-year experience with developing the Tufts Health Sciences Database, a database-driven information management system that combines the strengths of a digital library, content delivery tools, and curriculum management. Identifies major effects on teaching and learning. Also addresses issues of faculty development, copyright and…

  13. An object-oriented, technology-adaptive information model

    NASA Technical Reports Server (NTRS)

    Anyiwo, Joshua C.

    1995-01-01

    The primary objective was to develop a computer information system for effectively presenting NASA's technologies to American industries, for appropriate commercialization. To this end a comprehensive information management model, applicable to a wide variety of situations, and immune to computer software/hardware technological gyrations, was developed. The model consists of four main elements: a DATA_STORE, a data PRODUCER/UPDATER_CLIENT and a data PRESENTATION_CLIENT, anchored to a central object-oriented SERVER engine. This server engine facilitates exchanges among the other model elements and safeguards the integrity of the DATA_STORE element. It is designed to support new technologies, as they become available, such as Object Linking and Embedding (OLE), on-demand audio-video data streaming with compression (such as is required for video conferencing), Worldwide Web (WWW) and other information services and browsing, fax-back data requests, presentation of information on CD-ROM, and regular in-house database management, regardless of the data model in place. The four components of this information model interact through a system of intelligent message agents which are customized to specific information exchange needs. This model is at the leading edge of modern information management models. It is independent of technological changes and can be implemented in a variety of ways to meet the specific needs of any communications situation. This summer a partial implementation of the model has been achieved. The structure of the DATA_STORE has been fully specified and successfully tested using Microsoft's FoxPro 2.6 database management system. Data PRODUCER/UPDATER and PRESENTATION architectures have been developed and also successfully implemented in FoxPro; and work has started on a full implementation of the SERVER engine. The model has also been successfully applied to a CD-ROM presentation of NASA's technologies in support of Langley Research Center's TAG efforts.
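
    A toy rendering of the four-element model (ours, not the NASA implementation) in which producer and presentation clients never touch the data store directly but exchange messages through the central server engine, which guards the store's integrity:

```python
# All class and message names below are invented for illustration.
class DataStore:
    def __init__(self):
        self._records = {}

class Server:
    """Central engine: the only component allowed to touch the DataStore."""
    def __init__(self, store):
        self._store = store

    def handle(self, message):
        kind, key, payload = message
        if kind == "put":
            if not payload:                       # integrity gate
                return ("error", key, "empty record rejected")
            self._store._records[key] = payload
            return ("ok", key, None)
        if kind == "get":
            return ("ok", key, self._store._records.get(key))
        return ("error", key, "unknown message kind")

class ProducerClient:
    def __init__(self, server): self._server = server
    def publish(self, key, record):
        return self._server.handle(("put", key, record))

class PresentationClient:
    def __init__(self, server): self._server = server
    def view(self, key):
        return self._server.handle(("get", key, None))

server = Server(DataStore())
ProducerClient(server).publish("TECH-017", {"title": "composite materials"})
print(PresentationClient(server).view("TECH-017"))
```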

  14. Assessing historical fish community composition using surveys, historical collection data, and species distribution models.

    PubMed

    Labay, Ben; Cohen, Adam E; Sissel, Blake; Hendrickson, Dean A; Martin, F Douglas; Sarkar, Sahotra

    2011-01-01

    Accurate establishment of baseline conditions is critical to successful management and habitat restoration. We demonstrate the ability to robustly estimate historical fish community composition and assess the current status of the urbanized Barton Creek watershed in central Texas, U.S.A. Fish species were surveyed in 2008 and the resulting data compared to three sources of fish occurrence information: (i) historical records from a museum specimen database and literature searches; (ii) a nearly identical survey conducted 15 years earlier; and (iii) a modeled historical community constructed with species distribution models (SDMs). This holistic approach, and especially the application of SDMs, allowed us to discover that the fish community in Barton Creek was more diverse than the historical data and survey methods alone indicated. Sixteen native species with high modeled probability of occurrence within the watershed were not found in the 2008 survey, seven of these were not found in either survey or in any of the historical collection records. Our approach allowed us to more rigorously establish the true baseline for the pre-development fish fauna and then to more accurately assess trends and develop hypotheses regarding factors driving current fish community composition to better inform management decisions and future restoration efforts. Smaller, urbanized freshwater systems, like Barton Creek, typically have a relatively poor historical biodiversity inventory coupled with long histories of alteration, and thus there is a propensity for land managers and researchers to apply inaccurate baseline standards. Our methods provide a way around that limitation by using SDMs derived from larger and richer biodiversity databases of a broader geographic scope. Broadly applied, we propose that this technique has potential to overcome limitations of popular bioassessment metrics (e.g., IBI) to become a versatile and robust management tool for determining status of freshwater biotic communities.

  15. CRAB3: Establishing a new generation of services for distributed analysis at CMS

    NASA Astrophysics Data System (ADS)

    Cinquilli, M.; Spiga, D.; Grandi, C.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Riahi, H.; Vaandering, E.

    2012-12-01

    In CMS Computing the highest priorities for analysis tools are the improvement of the end users’ ability to produce and publish reliable samples and analysis results as well as a transition to a sustainable development and operations model. To achieve these goals CMS decided to incorporate analysis processing into the same framework as data and simulation processing. This strategy foresees that all workload tools (Tier0, Tier1, production, analysis) share a common core with long term maintainability as well as the standardization of the operator interfaces. The re-engineered analysis workload manager, called CRAB3, makes use of newer technologies, such as RESTful web services and NoSQL databases, aiming to increase the scalability and reliability of the system. As opposed to CRAB2, in CRAB3 all work is centrally injected and managed in a global queue. A pool of agents, which can be geographically distributed, consumes work from the central services serving the user tasks. The new architecture of CRAB substantially changes the deployment model and operations activities. In this paper we present the implementation of CRAB3, emphasizing how the new architecture improves the workflow automation and simplifies maintainability. In particular, we will highlight the impact of the new design on daily operations.
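
    The central-queue pattern described above can be sketched in a few lines; this is an illustration of the idea only (CRAB3 itself uses RESTful services and a NoSQL-backed global queue, not Python threads):

```python
import queue, threading, time

# Users inject tasks into one central queue; a pool of agents, which in
# the real system may be geographically distributed, pulls work from it.
central_queue = queue.Queue()
results = []

def agent(name):
    while True:
        task = central_queue.get()
        if task is None:                 # shutdown sentinel
            central_queue.task_done()
            return
        time.sleep(0.01)                 # stand-in for running the user task
        results.append((name, task))
        central_queue.task_done()

pool = [threading.Thread(target=agent, args=(f"agent-{i}",)) for i in range(3)]
for t in pool:
    t.start()

for task_id in range(10):                # "centrally injected" user tasks
    central_queue.put(f"task-{task_id}")
for _ in pool:                           # one sentinel per agent
    central_queue.put(None)

central_queue.join()
print(sorted(results)[:3])
```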

  16. Improving nutrition surveillance and public health research in Central and Eastern Europe/Balkan Countries using the Balkan Food Platform and dietary tools.

    PubMed

    Gurinović, Mirjana; Milešević, Jelena; Novaković, Romana; Kadvan, Agnes; Djekić-Ivanković, Marija; Šatalić, Zvonimir; Korošec, Mojca; Spiroski, Igor; Ranić, Marija; Dupouy, Eleonora; Oshaug, Arne; Finglas, Paul; Glibetić, Maria

    2016-02-15

    The objective of this paper is to share experience and provide updated information on Capacity Development in the Central and Eastern Europe/Balkan Countries (CEE/BC) region relevant to public health nutrition, particularly in creation of food composition databases (FCDBs), applying dietary intake assessment and monitoring tools, and harmonizing methodology for nutrition surveillance. Balkan Food Platform was established by a Memorandum of Understanding among EuroFIR AISBL, Institute for Medical Research, Belgrade, Capacity Development Network in Nutrition in CEE - CAPNUTRA and institutions from nine countries in the region. Inventory on FCDB status identified lack of harmonized and standardized research tools. To strengthen harmonization in CEE/BC in line with European research trends, the Network members collaborated in development of a Regional FCDB, using web-based food composition data base management software following EuroFIR standards. Comprehensive nutrition assessment and planning tool - DIET ASSESS & PLAN could enable synchronization of nutrition surveillance across countries. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. The COMPTEL Processing and Analysis Software system (COMPASS)

    NASA Astrophysics Data System (ADS)

    de Vries, C. P.; COMPTEL Collaboration

    The data analysis system of the gamma-ray Compton Telescope (COMPTEL) onboard the Compton-GRO spacecraft is described. A continuous stream of data on the order of 1 kbyte per second is generated by the instrument. The data processing and analysis software is built around a relational database management system (RDBMS) in order to be able to trace the heritage and processing status of all data in the processing pipeline. Four institutes cooperate in this effort, requiring procedures to keep local RDBMS contents identical between the sites and swift exchange of data using network facilities. Lately, there has been a gradual move of the system from central processing facilities towards clusters of workstations.
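
    The bookkeeping idea behind such a pipeline (each data product records its processing status and the product it derives from, so heritage can be traced) might be sketched as follows; the schema is invented for illustration, not taken from COMPASS:

```python
import sqlite3

# Each product carries a status and a pointer to its parent product.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE product (
    id INTEGER PRIMARY KEY, name TEXT,
    status TEXT, derived_from INTEGER REFERENCES product(id))""")
db.executemany("INSERT INTO product VALUES (?,?,?,?)", [
    (1, "raw telemetry",     "archived",  None),
    (2, "calibrated events", "processed", 1),
    (3, "sky image",         "pending",   2),
])

def heritage(product_id):
    """Walk the derivation chain back to the raw data."""
    chain = []
    while product_id is not None:
        name, status, parent = db.execute(
            "SELECT name, status, derived_from FROM product WHERE id=?",
            (product_id,)).fetchone()
        chain.append((name, status))
        product_id = parent
    return chain

print(heritage(3))   # sky image <- calibrated events <- raw telemetry
```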

  18. Three alternative structural configurations for phlebotomy: a comparison of effectiveness.

    PubMed

    Mannion, Heidi; Nadder, Teresa

    2007-01-01

    This study was designed to compare the effectiveness of three alternative structural configurations for inpatient phlebotomy. It was hypothesized that decentralized inpatient phlebotomy was less effective than centralized inpatient phlebotomy. A non-experimental prospective survey design was conducted at the institution level. Laboratory managers completed an organizational survey and collected data on inpatient blood specimens during a 30-day data collection period. A random sample (n=31) of hospitals with onsite laboratories in the United States was selected from a database purchased from the Joint Commission on Accreditation of Healthcare Organizations (JCAHO). Effectiveness of the blood collection process was measured by the percentage of specimens rejected during the data collection period. Analysis of variance showed a statistically significant difference in the percentage of specimens rejected for centralized, hybrid, and decentralized phlebotomy configurations [F (2, 28) = 4.27, p = .02] with an effect size of .23. Post-hoc comparison using Tukey's HSD indicated that the mean percentage of specimens rejected for centralized phlebotomy (M = .045, SD = 0.36) was significantly different from that for the decentralized configuration (M = 1.42, SD = 0.92, p = .03). The centralized configuration was thus found to be more effective than the decentralized configuration.

  19. [Establishment of a regional pelvic trauma database in Hunan Province].

    PubMed

    Cheng, Liang; Zhu, Yong; Long, Haitao; Yang, Junxiao; Sun, Buhua; Li, Kanghua

    2017-04-28

    To establish a database for pelvic trauma in Hunan Province and to start the work of a multicenter pelvic trauma registry.
 Methods: To establish the database, the literature relevant to pelvic trauma was screened, experience from established trauma databases in China and abroad was drawn upon, and the actual conditions of pelvic trauma rescue in Hunan Province were considered. The database for pelvic trauma was built on PostgreSQL and the programming language Java 1.6.
 Results: The complex procedure of pelvic trauma rescue was described structurally. The contents of the database include general patient information, injury condition, prehospital rescue, condition on admission, treatment in hospital, status on discharge, diagnosis, classification, complications, trauma scoring and therapeutic effect. The database can be accessed through the internet via a browser/server architecture. Its functions include patient information management, data export, history query, progress reporting, video-image management and personal information management.
 Conclusion: A database covering the whole life cycle of pelvic trauma has been established for the first time in China. It is scientific, functional, practical, and user-friendly.
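
    Purely to illustrate the content domains listed above (the registry itself runs on PostgreSQL with a Java front end, and every column name below is invented), a compressed SQLite sketch:

```python
import sqlite3

# Invented, compressed registry schema covering a few of the listed domains.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE patient   (pid INTEGER PRIMARY KEY, label TEXT, age INTEGER);
CREATE TABLE admission (pid INTEGER REFERENCES patient(pid),
                        injury_mechanism TEXT,      -- injury condition
                        prehospital_care TEXT,
                        condition_on_admission TEXT);
CREATE TABLE outcome   (pid INTEGER REFERENCES patient(pid),
                        fracture_class TEXT,        -- classification
                        trauma_score INTEGER,       -- trauma scoring
                        complication TEXT,
                        status_on_discharge TEXT);
""")
db.execute("INSERT INTO patient VALUES (1, 'case-001', 43)")
db.execute("INSERT INTO admission VALUES (1, 'traffic accident', 'splint, fluids', 'stable')")
db.execute("INSERT INTO outcome VALUES (1, 'B2', 22, NULL, 'recovered')")
print(db.execute("""SELECT p.age, a.injury_mechanism, o.fracture_class
                    FROM patient p JOIN admission a USING (pid)
                    JOIN outcome o USING (pid)""").fetchall())
```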

  20. Insertion algorithms for network model database management systems

    NASA Astrophysics Data System (ADS)

    Mamadolimov, Abdurashid; Khikmat, Saburov

    2017-12-01

    The network model is a database model conceived as a flexible way of representing objects and their relationships. Its distinguishing feature is that the schema, viewed as a graph in which object types are nodes and relationship types are arcs, forms a partial order. When a database is large and query comparisons are expensive, the efficiency requirement for managing algorithms is to minimize the number of query comparisons. We consider the updating operation for network model database management systems and develop a new sequential algorithm for it. We also suggest a distributed version of the algorithm.
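
    The abstract does not reproduce the algorithm itself, so the following is background only: a toy network-model store in which inserting a record links it into the member chains of its owner records, which is the kind of operation whose query comparisons the paper seeks to minimize:

```python
# Toy network-model (owner/member set) store; all names are invented.
class Record:
    def __init__(self, rtype, key):
        self.rtype, self.key = rtype, key
        self.members = {}            # set name -> list of member records

class NetworkDB:
    def __init__(self):
        self.records = {}

    def insert(self, rtype, key, owners=()):
        """Insert a record and link it under each (owner_key, set_name) pair."""
        rec = Record(rtype, key)
        self.records[key] = rec
        for owner_key, set_name in owners:
            self.records[owner_key].members.setdefault(set_name, []).append(rec)
        return rec

db = NetworkDB()
db.insert("department", "D1")
db.insert("project",    "P7")
# An employee record participates in two sets with different owners:
db.insert("employee", "E3", owners=[("D1", "works_in"), ("P7", "assigned_to")])
print([m.key for m in db.records["D1"].members["works_in"]])
```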

  1. Efficient data management in a large-scale epidemiology research project.

    PubMed

    Meyer, Jens; Ostrzinski, Stefan; Fredrich, Daniel; Havemann, Christoph; Krafczyk, Janina; Hoffmann, Wolfgang

    2012-09-01

    This article describes the concept of a "Central Data Management" (CDM) and its implementation within the large-scale population-based medical research project "Personalized Medicine". The CDM can be summarized as a conjunction of data capturing, data integration, data storage, data refinement, and data transfer. A wide spectrum of reliable "Extract Transform Load" (ETL) software for automatic integration of data, as well as "electronic Case Report Forms" (eCRFs), was developed in order to integrate decentralized and heterogeneously captured data. Due to the high sensitivity of the captured data, high system resource availability, data privacy, data security and quality assurance are of utmost importance. A complex data model was developed and implemented using an Oracle database in high availability cluster mode in order to integrate different types of participant-related data. Intelligent data capturing and storage mechanisms improve the quality of data. Data privacy is ensured by a multi-layered role/right system for access control and de-identification of identifying data. A well-defined backup process prevents data loss. Over a period of one and a half years, the CDM has captured a wide variety of data in the magnitude of approximately 5 terabytes without experiencing any critical incidents of system breakdown or loss of data. The aim of this article is to demonstrate one possible way of establishing a Central Data Management in large-scale medical and epidemiological studies. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
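
    Two of the safeguards named above, de-identification of identifying data and a role/right system for access control, can be sketched as follows (our own construction; key handling and the real CDM's layers are far more elaborate):

```python
import hashlib, hmac

# Invented illustration: identifying fields live in a restricted layer,
# research variables in a broad layer, joined only by a keyed pseudonym.
SECRET = b"site-specific-key"        # in practice held by a trusted party

def pseudonym(national_id: str) -> str:
    # Keyed hash: stable pseudonyms, not reversible without the key.
    return hmac.new(SECRET, national_id.encode(), hashlib.sha256).hexdigest()[:12]

identifying = {}    # pseudonym -> name (restricted layer)
research_db = {}    # pseudonym -> study variables (broad layer)

ROLE_RIGHTS = {"researcher": {"research_db"},
               "data_manager": {"research_db", "identifying"}}

def store(national_id, name, variables):
    p = pseudonym(national_id)
    identifying[p] = name
    research_db[p] = variables

def read_identifying(role, p):
    if "identifying" not in ROLE_RIGHTS[role]:
        raise PermissionError(role)
    return identifying[p]

store("19561123-456", "Jane Doe", {"bmi": 27.1, "smoker": False})
p = pseudonym("19561123-456")
print(research_db[p])                    # available to any study role
print(read_identifying("data_manager", p))
# read_identifying("researcher", p) would raise PermissionError
```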

  2. Study protocol: differential effects of diet and physical activity based interventions in pregnancy on maternal and fetal outcomes--individual patient data (IPD) meta-analysis and health economic evaluation.

    PubMed

    Ruifrok, Anneloes E; Rogozinska, Ewelina; van Poppel, Mireille N M; Rayanagoudar, Girish; Kerry, Sally; de Groot, Christianne J M; Yeo, SeonAe; Molyneaux, Emma; McAuliffe, Fionnuala M; Poston, Lucilla; Roberts, Tracy; Riley, Richard D; Coomarasamy, Arri; Khan, Khalid; Mol, Ben Willem; Thangaratinam, Shakila

    2014-11-04

    Pregnant women who gain excess weight are at risk of complications during pregnancy and in the long term. Interventions based on diet and physical activity minimise gestational weight gain with varied effect on clinical outcomes. The effect of interventions on varied groups of women based on body mass index, age, ethnicity, socioeconomic status, parity, and underlying medical conditions is not clear. Our individual patient data (IPD) meta-analysis of randomised trials will assess the differential effect of diet- and physical activity-based interventions on maternal weight gain and pregnancy outcomes in clinically relevant subgroups of women. Randomised trials on diet and physical activity in pregnancy will be identified by searching the following databases: MEDLINE, EMBASE, BIOSIS, LILACS, Pascal, Science Citation Index, Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, Database of Abstracts of Reviews of Effects, and Health Technology Assessment Database. Primary researchers of the identified trials are invited to join the International Weight Management in Pregnancy Collaborative Network and share their individual patient data. We will reanalyse each study separately and confirm the findings with the original authors. Then, for each intervention type and outcome, we will perform as appropriate either a one-step or a two-step IPD meta-analysis to obtain summary estimates of effects and 95% confidence intervals, for all women combined and for each subgroup of interest. The primary outcomes are gestational weight gain and composite adverse maternal and fetal outcomes. The difference in effects between subgroups will be estimated and between-study heterogeneity suitably quantified and explored. The potential for publication bias and availability bias in the IPD obtained will be investigated. We will conduct a model-based economic evaluation to assess the cost effectiveness of the interventions to manage weight gain in pregnancy and undertake a value of information analysis to inform future research. PROSPERO 2013: CRD42013003804.

  3. DICOM-compliant PACS with CD-based image archival

    NASA Astrophysics Data System (ADS)

    Cox, Robert D.; Henri, Christopher J.; Rubin, Richard K.; Bret, Patrice M.

    1998-07-01

    This paper describes the design and implementation of a low-cost PACS conforming to the DICOM 3.0 standard. The goal was to provide an efficient image archival and management solution on a heterogeneous hospital network as a basis for filmless radiology. The system follows a distributed, client/server model and was implemented at a fraction of the cost of a commercial PACS. It provides reliable archiving on recordable CD and allows access to digital images throughout the hospital and on the Internet. Dedicated servers have been designed for short-term storage, CD-based archival, data retrieval and remote data access or teleradiology. The short-term storage devices provide DICOM storage and query/retrieve services to scanners and workstations and approximately twelve weeks of 'on-line' image data. The CD-based archival and data retrieval processes are fully automated with the exception of CD loading and unloading. The system employs lossless compression on both short- and long-term storage devices. All servers communicate via the DICOM protocol in conjunction with both local and 'master' SQL-patient databases. Records are transferred from the local to the master database independently, ensuring that storage devices will still function if the master database server cannot be reached. The system features rules-based work-flow management and WWW servers to provide multi-platform remote data access. The WWW server system is distributed on the storage, retrieval and teleradiology servers allowing viewing of locally stored image data directly in a WWW browser without the need for data transfer to a central WWW server. An independent system monitors disk usage, processes, network and CPU load on each server and reports errors to the image management team via email. The PACS was implemented using a combination of off-the-shelf hardware, freely available software and applications developed in-house. The system has enabled filmless operation in CT, MR and ultrasound within the radiology department and throughout the hospital. The use of WWW technology has enabled the development of an intuitive web-based teleradiology and image management solution that provides complete access to image data.
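
    The local-to-master decoupling described above amounts to a store-and-forward queue; a minimal sketch of the idea (ours, not the hospital's code):

```python
# New records commit locally first; an independent process forwards them
# to the master database, so local storage keeps working when the master
# is unreachable. Record fields are invented for illustration.
local_db, master_db, outbox = [], [], []
master_reachable = False

def store_locally(record):
    local_db.append(record)       # storage never blocks on the master
    outbox.append(record)

def forward_to_master():
    global outbox
    if not master_reachable:
        return 0                  # nothing sent; try again later
    sent, outbox = outbox, []
    master_db.extend(sent)
    return len(sent)

store_locally({"patient": "P1", "study": "CT-123"})
store_locally({"patient": "P2", "study": "MR-456"})
assert forward_to_master() == 0 and len(master_db) == 0   # master down
master_reachable = True
assert forward_to_master() == 2 and len(master_db) == 2   # caught up
print(master_db)
```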

  4. 12 CFR 234.4 - Standards for central securities depositories and central counterparties.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... it meets or exceeds the following risk-management standards with respect to the payment, clearing... central counterparty's risk-management procedures. (9) The central securities depository or central... plausible market conditions. (b) The Board, by order, may apply heightened risk-management standards to a...

  5. 12 CFR 234.4 - Standards for central securities depositories and central counterparties.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... it meets or exceeds the following risk-management standards with respect to the payment, clearing... central counterparty's risk-management procedures. (9) The central securities depository or central... plausible market conditions. (b) The Board, by order, may apply heightened risk-management standards to a...

  6. Using Statistics for Database Management in an Academic Library.

    ERIC Educational Resources Information Center

    Hyland, Peter; Wright, Lynne

    1996-01-01

    Collecting statistical data about database usage by library patrons aids in the management of CD-ROM and database offerings, collection development, and evaluation of training programs. Two approaches to data collection are presented which should be used together: an automated or nonintrusive method which monitors search sessions while the…

  7. Database Software Selection for the Egyptian National STI Network.

    ERIC Educational Resources Information Center

    Slamecka, Vladimir

    The evaluation and selection of information/data management system software for the Egyptian National Scientific and Technical (STI) Network are described. An overview of the state-of-the-art of database technology elaborates on the differences between information retrieval and database management systems (DBMS). The desirable characteristics of…

  8. Teaching Database Management System Use in a Library School Curriculum.

    ERIC Educational Resources Information Center

    Cooper, Michael D.

    1985-01-01

    Description of database management systems course being taught to students at School of Library and Information Studies, University of California, Berkeley, notes course structure, assignments, and course evaluation. Approaches to teaching concepts of three types of database systems are discussed and systems used by students in the course are…

  9. A New Breed of Database System: Volcano Global Risk Identification and Analysis Project (VOGRIPA)

    NASA Astrophysics Data System (ADS)

    Crosweller, H. S.; Sparks, R. S.; Siebert, L.

    2009-12-01

    VOGRIPA originated as part of the Global Risk Identification Programme (GRIP) that is being co-ordinated from the Earth Institute of Columbia University under the auspices of the United Nations and World Bank. GRIP is a five-year programme aiming at improving global knowledge about risk from natural hazards and is part of the international response to the catastrophic 2004 Asian tsunami. VOGRIPA is also a formal IAVCEI project. The objectives of VOGRIPA are to create a global database of volcanic activity, hazards and vulnerability information that can be analysed to identify locations at high risk from volcanism, gaps in knowledge about hazards and risk, and will allow scientists and disaster managers at specific locations to analyse risk within a global context of systematic information. It is this added scope of risk and vulnerability as well as hazard which sets VOGRIPA apart from most previous databases. The University of Bristol is the central coordinating centre for the project, which is an international partnership including the Smithsonian Institution, the Geological Survey of Japan, the Earth Observatory of Singapore (Chris Newhall), the British Geological Survey, the University of Buffalo (SUNY) and Munich Re. The partnership is intended to grow and any individuals or institutions who are able to contribute resources to VOGRIPA objectives are welcome to participate. Work has already begun (funded principally by Munich Re) on populating a database of large magnitude explosive eruptions reaching back to the Quaternary, with extreme-value statistics being used to evaluate the magnitude-frequency relationship of such events, and also an assessment of how the quality of records affect the results. The following 4 years of funding from the European Research Council for VOGRIPA will be used to establish further international collaborations in order to develop different aspects of the database, with the data being accessible online once it is sufficiently complete and analyses have been carried out. It is anticipated that such a resource would be of use to the scientific community, civil authorities with responsibility for mitigating and managing volcanic hazards, and the public.

  10. The land management and operations database (LMOD)

    USDA-ARS?s Scientific Manuscript database

    This paper presents the design, implementation, deployment, and application of the Land Management and Operations Database (LMOD). LMOD is the single authoritative source for reference land management and operation reference data within the USDA enterprise data warehouse. LMOD supports modeling appl...

  11. Development of bilateral data transferability in the Virginia Department of Transportation's Geotechnical Database Management System Framework.

    DOT National Transportation Integrated Search

    2006-01-01

    An Internet-based, spatiotemporal Geotechnical Database Management System (GDBMS) Framework was designed, developed, and implemented at the Virginia Department of Transportation (VDOT) in 2002 to retrieve, manage, archive, and analyze geotechnical da...

  12. Database System Design and Implementation for Marine Air-Traffic-Controller Training

    DTIC Science & Technology

    2017-06-01

    Naval Postgraduate School thesis (approved for public release; distribution is unlimited). This project focused on the design, development, and implementation of a centralized...

  13. Geologic Map of the Wenatchee 1:100,000 Quadrangle, Central Washington: A Digital Database

    USGS Publications Warehouse

    Tabor, R.W.; Waitt, R.B.; Frizzell, V.A.; Swanson, D.A.; Byerly, G.R.; Bentley, R.D.

    2005-01-01

    This digital map database has been prepared by R.W. Tabor from the published Geologic map of the Wenatchee 1:100,000 Quadrangle, Central Washington. Together with the accompanying text files as PDF, it provides information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The authors mapped most of the bedrock geology at 1:100,000 scale, but compiled Quaternary units at 1:24,000 scale. The Quaternary contacts and structural data have been much simplified for the 1:100,000-scale map and database. The spatial resolution (scale) of the database is 1:100,000 or smaller. This database depicts the distribution of geologic materials and structures at a regional (1:100,000) scale. The report is intended to provide geologic information for the regional study of materials properties, earthquake shaking, landslide potential, mineral hazards, seismic velocity, and earthquake faults. In addition, the report contains information and interpretations about the regional geologic history and framework. However, the regional scale of this report does not provide sufficient detail for site development purposes.

  14. Toxoplasmosis in Iran: A guide for general physicians working in the Iranian health network setting: A systematic review.

    PubMed

    Alavi, Seyed Mohammad; Alavi, Leila

    2016-01-01

    Human toxoplasmosis is an important zoonotic infection worldwide which is caused by the intracellular parasite Toxoplasma gondii (T. gondii). The aim of this study was to briefly review the general aspects of toxoplasma infection in the Iranian health system network. We searched published toxoplasmosis-related articles in English and Persian databases including Science Direct, PubMed, Scopus, Google Scholar, Magiran, IranMedex, IranDoc and the Scientific Information Database (SID). Out of 1267 articles retrieved from the database search, 40 articles fit our research objectives and were selected for the study. It is estimated that at least a third of the world's human population is infected with T. gondii, making it one of the most common parasitic infections in the world. Maternal infection during pregnancy may cause dangerous outcomes for the fetus, or even intrauterine death. Reactivation of a previous infection in immunocompromised patients, such as those with drug-induced immunosuppression, AIDS or organ transplantation, can cause life-threatening central nervous system infection. Ocular toxoplasmosis is one of the most important causes of blindness, especially in individuals with a deficient immune system. Given the increasing burden of toxoplasmosis on human health, the findings of this study highlight appropriate preventive measures, diagnosis, and management of this disease.

  15. Human Disease Insight: An integrated knowledge-based platform for disease-gene-drug information.

    PubMed

    Tasleem, Munazzah; Ishrat, Romana; Islam, Asimul; Ahmad, Faizan; Hassan, Md Imtaiyaz

    2016-01-01

    The scope of the Human Disease Insight (HDI) database is not limited to researchers or physicians as it also provides basic information to non-professionals and creates disease awareness, thereby reducing the chances of patient suffering due to ignorance. HDI is a knowledge-based resource providing information on human diseases to both scientists and the general public. Here, our mission is to provide a comprehensive human disease database containing most of the available useful information, with extensive cross-referencing. HDI is a knowledge management system that acts as a central hub to access information about human diseases and associated drugs and genes. In addition, HDI contains well-classified bioinformatics tools with helpful descriptions. These integrated bioinformatics tools enable researchers to annotate disease-specific genes and perform protein analysis, search for biomarkers and identify potential vaccine candidates. Eventually, these tools will facilitate the analysis of disease-associated data. The HDI provides two types of search capabilities and includes provisions for downloading, uploading and searching disease/gene/drug-related information. The logistical design of the HDI allows for regular updating. The database is designed to work best with Mozilla Firefox and Google Chrome and is freely accessible at http://humandiseaseinsight.com. Copyright © 2015 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  16. Nerve growth factor for Bell’s palsy: A meta-analysis

    PubMed Central

    SU, YIPENG; DONG, XIAOMENG; LIU, JUAN; HU, YAOZHI; CHEN, JINBO

    2015-01-01

    A meta-analysis was performed to evaluate the efficacy and safety of nerve growth factor (NGF) in the treatment of Bell’s palsy. PubMed, the Cochrane Central Register of Controlled Trials, Embase and a number of Chinese databases, including the China National Knowledge Infrastructure, China Biology Medicine disc, VIP Database for Chinese Technical Periodicals and Wan Fang Data, were used to collect randomised controlled trials (RCTs) of NGF for Bell’s palsy. The span of the search covered data from the date of database establishment until December 2013. The included trials were screened comprehensively and rigorously. The efficacies of NGF were pooled via meta-analysis performed using Review Manager 5.2 software. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated using the fixed-effects model. The meta-analysis of eight RCTs showed favorable effects of NGF on the disease response rate (n=642; OR, 3.87; 95% CI, 2.13–7.03; P<0.01; I2=0%). However, evidence supporting the effectiveness of NGF for the treatment of Bell’s palsy is limited. The number and quality of trials are too low to form solid conclusions. Further meticulous RCTs are required to overcome the limitations identified in the present study. PMID:25574223
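
    For readers unfamiliar with fixed-effects pooling, the sketch below computes a pooled odds ratio by inverse-variance weighting of per-trial log odds ratios, the standard fixed-effects approach; the 2x2 counts are invented for the demonstration and are not the trials' data:

```python
import math

# Invented per-trial counts: (events_treat, n_treat, events_ctrl, n_ctrl).
trials = [
    (38, 45, 28, 44),
    (40, 42, 31, 41),
    (35, 40, 29, 42),
]

num = den = 0.0
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c
    log_or = math.log((a * d) / (b * c))
    var = 1/a + 1/b + 1/c + 1/d          # variance of the log odds ratio
    w = 1 / var                          # inverse-variance weight
    num += w * log_or
    den += w

pooled = num / den
se = math.sqrt(1 / den)
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled OR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")
```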

  17. ARCTOS: a relational database relating specimens, specimen-based science, and archival documentation

    USGS Publications Warehouse

    Jarrell, Gordon H.; Ramotnik, Cindy A.; McDonald, D.L.

    2010-01-01

    Data are preserved when they are perpetually discoverable, but even in the Information Age, discovery of legacy data appropriate to particular investigations is uncertain. Secure Internet storage is necessary but insufficient. Data can be discovered only when they are adequately described, and visibility increases markedly if the data are related to other data that are receiving usage. Such relationships can be built (1) within the framework of a relational database, or (2) among separate resources, within the framework of the Internet. Evolving primarily around biological collections, Arctos is a database that does both of these tasks. It includes data structures for a diversity of specimen attributes, essentially all collection-management tasks, plus literature citations, project descriptions, etc. As a centralized collaboration of several university museums, Arctos is an ideal environment for capitalizing on the many relationships that often exist between items in separate collections. Arctos is related to NIH's DNA-sequence repository (GenBank) with record-to-record reciprocal linkages, and it serves data to several discipline-specific web portals, including the Global Biodiversity Information Facility (GBIF). The University of Alaska Museum's paleontological collection is Arctos's recent extension beyond the constraints of neontology. Arctos holds about 1.3 million cataloged items, and additional collections are being added each year.

  18. Database-driven web interface automating gyrokinetic simulations for validation

    NASA Astrophysics Data System (ADS)

    Ernst, D. R.

    2010-11-01

    We are developing a web interface to connect plasma microturbulence simulation codes with experimental data. The website automates the preparation of gyrokinetic simulations utilizing plasma profile and magnetic equilibrium data from TRANSP analysis of experiments, read from MDSplus over the internet. This database-driven tool saves user sessions, allowing searches of previous simulations, which can be restored to repeat the same analysis for a new discharge. The website includes a multi-tab, multi-frame, publication-quality Java plotter, Webgraph, developed as part of this project. Input files can be uploaded as templates and edited with context-sensitive help. The website creates inputs for GS2 and GYRO using a well-tested and verified back-end, in use for several years for the GS2 code [D. R. Ernst et al., Phys. Plasmas 11(5) 2637 (2004)]. A centralized web site has the advantage that users receive bug fixes instantaneously, while avoiding the duplicated effort of local compilations. Possible extensions to the database to manage run outputs, toward prototyping for the Fusion Simulation Project, are envisioned. Much of the web development utilized support from the DoE National Undergraduate Fellowship program [e.g., A. Suarez and D. R. Ernst, http://meetings.aps.org/link/BAPS.2005.DPP.GP1.57].

  19. The NCBI BioSystems database

    PubMed Central

    Geer, Lewis Y.; Marchler-Bauer, Aron; Geer, Renata C.; Han, Lianyi; He, Jane; He, Siqian; Liu, Chunlei; Shi, Wenyao; Bryant, Stephen H.

    2010-01-01

    The NCBI BioSystems database, found at http://www.ncbi.nlm.nih.gov/biosystems/, centralizes and cross-links existing biological systems databases, increasing their utility and target audience by integrating their pathways and systems into NCBI resources. This integration allows users of NCBI’s Entrez databases to quickly categorize proteins, genes and small molecules by metabolic pathway, disease state or other BioSystem type, without requiring time-consuming inference of biological relationships from the literature or multiple experimental datasets. PMID:19854944

  20. Rolling Deck to Repository I: Designing a Database Infrastructure

    NASA Astrophysics Data System (ADS)

    Arko, R. A.; Miller, S. P.; Chandler, C. L.; Ferrini, V. L.; O'Hara, S. H.

    2008-12-01

    The NSF-supported academic research fleet collectively produces a large and diverse volume of scientific data, which are increasingly being shared across disciplines and contributed to regional and global syntheses. As both Internet connectivity and storage technology improve, it becomes practical for ships to routinely deliver data and documentation for a standard suite of underway instruments to a central shoreside repository. Routine delivery will facilitate data discovery and integration, quality assessment, cruise planning, compliance with funding agency and clearance requirements, and long-term data preservation. We are working collaboratively with ship operators and data managers to develop a prototype "data discovery system" for NSF-supported research vessels. Our goal is to establish infrastructure for a central shoreside repository, and to develop and test procedures for the routine delivery of standard data products and documentation to the repository. Related efforts are underway to identify tools and criteria for quality control of standard data products, and to develop standard interfaces and procedures for maintaining an underway event log. Development of a shoreside repository infrastructure will include: 1. Deployment and testing of a central catalog that holds cruise summaries and vessel profiles. A cruise summary will capture the essential details of a research expedition (operating institution, ports/dates, personnel, data inventory, etc.), as well as related documentation such as event logs and technical reports. A vessel profile will capture the essential details of a ship's installed instruments (manufacturer, model, serial number, reference location, etc.), with version control as the profile changes through time. The catalog's relational database schema will be based on the UNOLS Data Best Practices Committee's recommendations, and published as a formal XML specification. 2. Deployment and testing of a central repository that holds navigation and routine underway data. Based on discussion with ship operators and data managers at a workgroup meeting in September 2008, we anticipate that a subset of underway data could be delivered from ships to the central repository in near- realtime - enabling the integrated display of ship tracks at a public Web portal, for example - and a full data package could be delivered post-cruise by network transfer or disk shipment. Once ashore, data sets could be distributed to assembly centers such as the Shipboard Automated Meteorological and Oceanographic System (SAMOS) for routine processing, quality assessment, and synthesis efforts - as well as transmitted to national data centers such as NODC and NGDC for permanent archival. 3. Deployment and testing of a basic suite of Web services to make cruise summaries, vessel profiles, event logs, and navigation data easily available. A standard set of catalog records, maps, and navigation features will be published via the Open Archives Initiative (OAI) and Open Geospatial Consortium (OGC) protocols, which can then be harvested by partner data centers and/or embedded in client applications.
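
    A rough sketch of the two catalog entities the text describes, cruise summaries and vessel profiles versioned as installed instruments change (our own illustration; the recommended relational schema was to be published as a formal XML specification):

```python
import sqlite3

# Invented tables: vessel profiles are versioned through time, and each
# cruise summary points at the profile version in force during the cruise.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE vessel_profile (
    vessel TEXT, version INTEGER, instrument TEXT, serial TEXT,
    valid_from TEXT, PRIMARY KEY (vessel, version, instrument));
CREATE TABLE cruise_summary (
    cruise_id TEXT PRIMARY KEY, vessel TEXT, port_start TEXT,
    port_end TEXT, date_start TEXT, date_end TEXT, profile_version INTEGER);
""")
db.executemany("INSERT INTO vessel_profile VALUES (?,?,?,?,?)", [
    ("R/V Example", 1, "gravimeter", "SN-100", "2007-01-01"),
    ("R/V Example", 2, "gravimeter", "SN-200", "2008-06-01"),  # replaced unit
    ("R/V Example", 2, "multibeam",  "SN-300", "2008-06-01"),
])
db.execute("INSERT INTO cruise_summary VALUES "
           "('EX0801','R/V Example','Honolulu','Seattle',"
           "'2008-08-01','2008-08-20',2)")

# Which instruments were installed for a given cruise?
print(db.execute("""
    SELECT p.instrument, p.serial
    FROM cruise_summary c JOIN vessel_profile p
      ON p.vessel = c.vessel AND p.version = c.profile_version
    WHERE c.cruise_id = 'EX0801'""").fetchall())
```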

  1. Kristin Munch | NREL

    Science.gov Websites

    Staff profile excerpt: expertise in scientific data management, database and data systems design, database clusters, storage systems integration, and distributed data analytics; presented on photovoltaics informatics and a laboratory information management system at the Materials Research Society Fall Meeting (2013). She has used her experience in laboratory data management systems, lab...

  2. Development of the interconnectivity and enhancement (ICE) module in the Virginia Department of Transportation's Geotechnical Database Management System Framework.

    DOT National Transportation Integrated Search

    2007-01-01

    An Internet-based, spatiotemporal Geotechnical Database Management System (GDBMS) Framework was implemented at the Virginia Department of Transportation (VDOT) in 2002 to manage geotechnical data using a distributed Geographical Information System (G...

  3. 23 CFR 973.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS MANAGEMENT... system; (2) A process to operate and maintain the management systems and their associated databases; (3... systems shall use databases with a common or coordinated reference system that can be used to geolocate...

  4. 23 CFR 973.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS MANAGEMENT... system; (2) A process to operate and maintain the management systems and their associated databases; (3... systems shall use databases with a common or coordinated reference system that can be used to geolocate...

  5. A database of the coseismic effects following the 30 October 2016 Norcia earthquake in Central Italy

    PubMed Central

    Villani, Fabio; Civico, Riccardo; Pucci, Stefano; Pizzimenti, Luca; Nappi, Rosa; De Martini, Paolo Marco; Villani, Fabio; Civico, Riccardo; Pucci, Stefano; Pizzimenti, Luca; Nappi, Rosa; De Martini, Paolo Marco; Agosta, F.; Alessio, G.; Alfonsi, L.; Amanti, M.; Amoroso, S.; Aringoli, D.; Auciello, E.; Azzaro, R.; Baize, S.; Bello, S.; Benedetti, L.; Bertagnini, A.; Binda, G.; Bisson, M.; Blumetti, A.M.; Bonadeo, L.; Boncio, P.; Bornemann, P.; Branca, S.; Braun, T.; Brozzetti, F.; Brunori, C.A.; Burrato, P.; Caciagli, M.; Campobasso, C.; Carafa, M.; Cinti, F.R.; Cirillo, D.; Comerci, V.; Cucci, L.; De Ritis, R.; Deiana, G.; Del Carlo, P.; Del Rio, L.; Delorme, A.; Di Manna, P.; Di Naccio, D.; Falconi, L.; Falcucci, E.; Farabollini, P.; Faure Walker, J.P.; Ferrarini, F.; Ferrario, M.F.; Ferry, M.; Feuillet, N.; Fleury, J.; Fracassi, U.; Frigerio, C.; Galluzzo, F.; Gambillara, R.; Gaudiosi, G.; Goodall, H.; Gori, S.; Gregory, L.C.; Guerrieri, L.; Hailemikael, S.; Hollingsworth, J.; Iezzi, F.; Invernizzi, C.; Jablonská, D.; Jacques, E.; Jomard, H.; Kastelic, V.; Klinger, Y.; Lavecchia, G.; Leclerc, F.; Liberi, F.; Lisi, A.; Livio, F.; Lo Sardo, L.; Malet, J.P.; Mariucci, M.T.; Materazzi, M.; Maubant, L.; Mazzarini, F.; McCaffrey, K.J.W.; Michetti, A.M.; Mildon, Z.K.; Montone, P.; Moro, M.; Nave, R.; Odin, M.; Pace, B.; Paggi, S.; Pagliuca, N.; Pambianchi, G.; Pantosti, D.; Patera, A.; Pérouse, E.; Pezzo, G.; Piccardi, L.; Pierantoni, P.P.; Pignone, M.; Pinzi, S.; Pistolesi, E.; Point, J.; Pousse, L.; Pozzi, A.; Proposito, M.; Puglisi, C.; Puliti, I.; Ricci, T.; Ripamonti, L.; Rizza, M.; Roberts, G.P.; Roncoroni, M.; Sapia, V.; Saroli, M.; Sciarra, A.; Scotti, O.; Skupinski, G.; Smedile, A.; Soquet, A.; Tarabusi, G.; Tarquini, S.; Terrana, S.; Tesson, J.; Tondi, E.; Valentini, A.; Vallone, R.; Van der Woerd, J.; Vannoli, P.; Venuti, A.; Vittori, E.; Volatili, T.; Wedmore, L.N.J.; Wilkinson, M.; Zambrano, M.

    2018-01-01

    We provide a database of the coseismic geological surface effects following the Mw 6.5 Norcia earthquake that hit central Italy on 30 October 2016. This was one of the strongest seismic events to occur in Europe in the past thirty years, causing complex surface ruptures over an area of >400 km2. The database originated from the collaboration of several European teams (Open EMERGEO Working Group; about 130 researchers) coordinated by the Istituto Nazionale di Geofisica e Vulcanologia. The observations were collected by performing detailed field surveys in the epicentral region in order to describe the geometry and kinematics of surface faulting, and subsequently of landslides and other secondary coseismic effects. The resulting database consists of homogeneous georeferenced records identifying 7323 observation points, each of which contains 18 numeric and string fields of relevant information. This database will impact future earthquake studies focused on modelling of the seismic processes in active extensional settings, updating probabilistic estimates of slip distribution, and assessing the hazard of surface faulting. PMID:29583143
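
    A minimal sketch of how such georeferenced observation records might be modelled. The abstract specifies 7323 points with 18 numeric and string fields but does not name the fields, so every field below is a hypothetical illustration rather than the database's actual schema:

    ```python
    # Illustrative model of a georeferenced surface-effect observation.
    # All field names are hypothetical; the published database defines
    # 18 fields that the abstract does not enumerate.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class SurfaceEffectObservation:
        point_id: int                # unique identifier of the observation point
        latitude: float              # WGS84 latitude of the survey point
        longitude: float             # WGS84 longitude of the survey point
        effect_type: str             # e.g. "surface faulting" or "landslide"
        strike_deg: Optional[float]  # fault strike, degrees from north
        throw_cm: Optional[float]    # vertical offset measured in the field
        team: str                    # acronym of the surveying team
        notes: str                   # free-text description of the effect

    obs = SurfaceEffectObservation(
        point_id=1, latitude=42.79, longitude=13.09,
        effect_type="surface faulting", strike_deg=155.0, throw_cm=35.0,
        team="EMERGEO", notes="open fissure on slope")
    print(obs.effect_type, obs.latitude, obs.longitude)
    ```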

  6. A database of the coseismic effects following the 30 October 2016 Norcia earthquake in Central Italy.

    PubMed

    Villani, Fabio; Civico, Riccardo; Pucci, Stefano; Pizzimenti, Luca; Nappi, Rosa; De Martini, Paolo Marco

    2018-03-27

    We provide a database of the coseismic geological surface effects following the Mw 6.5 Norcia earthquake that hit central Italy on 30 October 2016. This was one of the strongest seismic events to occur in Europe in the past thirty years, causing complex surface ruptures over an area of >400 km2. The database originated from the collaboration of several European teams (Open EMERGEO Working Group; about 130 researchers) coordinated by the Istituto Nazionale di Geofisica e Vulcanologia. The observations were collected by performing detailed field surveys in the epicentral region in order to describe the geometry and kinematics of surface faulting, and subsequently of landslides and other secondary coseismic effects. The resulting database consists of homogeneous georeferenced records identifying 7323 observation points, each of which contains 18 numeric and string fields of relevant information. This database will impact future earthquake studies focused on modelling of the seismic processes in active extensional settings, updating probabilistic estimates of slip distribution, and assessing the hazard of surface faulting.

  7. A database of the coseismic effects following the 30 October 2016 Norcia earthquake in Central Italy

    NASA Astrophysics Data System (ADS)

    Villani, Fabio; Civico, Riccardo; Pucci, Stefano; Pizzimenti, Luca; Nappi, Rosa; de Martini, Paolo Marco; Agosta, F.; Alessio, G.; Alfonsi, L.; Amanti, M.; Amoroso, S.; Aringoli, D.; Auciello, E.; Azzaro, R.; Baize, S.; Bello, S.; Benedetti, L.; Bertagnini, A.; Binda, G.; Bisson, M.; Blumetti, A. M.; Bonadeo, L.; Boncio, P.; Bornemann, P.; Branca, S.; Braun, T.; Brozzetti, F.; Brunori, C. A.; Burrato, P.; Caciagli, M.; Campobasso, C.; Carafa, M.; Cinti, F. R.; Cirillo, D.; Comerci, V.; Cucci, L.; de Ritis, R.; Deiana, G.; Del Carlo, P.; Del Rio, L.; Delorme, A.; di Manna, P.; di Naccio, D.; Falconi, L.; Falcucci, E.; Farabollini, P.; Faure Walker, J. P.; Ferrarini, F.; Ferrario, M. F.; Ferry, M.; Feuillet, N.; Fleury, J.; Fracassi, U.; Frigerio, C.; Galluzzo, F.; Gambillara, R.; Gaudiosi, G.; Goodall, H.; Gori, S.; Gregory, L. C.; Guerrieri, L.; Hailemikael, S.; Hollingsworth, J.; Iezzi, F.; Invernizzi, C.; Jablonská, D.; Jacques, E.; Jomard, H.; Kastelic, V.; Klinger, Y.; Lavecchia, G.; Leclerc, F.; Liberi, F.; Lisi, A.; Livio, F.; Lo Sardo, L.; Malet, J. P.; Mariucci, M. T.; Materazzi, M.; Maubant, L.; Mazzarini, F.; McCaffrey, K. J. W.; Michetti, A. M.; Mildon, Z. K.; Montone, P.; Moro, M.; Nave, R.; Odin, M.; Pace, B.; Paggi, S.; Pagliuca, N.; Pambianchi, G.; Pantosti, D.; Patera, A.; Pérouse, E.; Pezzo, G.; Piccardi, L.; Pierantoni, P. P.; Pignone, M.; Pinzi, S.; Pistolesi, E.; Point, J.; Pousse, L.; Pozzi, A.; Proposito, M.; Puglisi, C.; Puliti, I.; Ricci, T.; Ripamonti, L.; Rizza, M.; Roberts, G. P.; Roncoroni, M.; Sapia, V.; Saroli, M.; Sciarra, A.; Scotti, O.; Skupinski, G.; Smedile, A.; Soquet, A.; Tarabusi, G.; Tarquini, S.; Terrana, S.; Tesson, J.; Tondi, E.; Valentini, A.; Vallone, R.; van der Woerd, J.; Vannoli, P.; Venuti, A.; Vittori, E.; Volatili, T.; Wedmore, L. N. J.; Wilkinson, M.; Zambrano, M.

    2018-03-01

    We provide a database of the coseismic geological surface effects following the Mw 6.5 Norcia earthquake that hit central Italy on 30 October 2016. This was one of the strongest seismic events to occur in Europe in the past thirty years, causing complex surface ruptures over an area of >400 km2. The database originated from the collaboration of several European teams (Open EMERGEO Working Group; about 130 researchers) coordinated by the Istituto Nazionale di Geofisica e Vulcanologia. The observations were collected by performing detailed field surveys in the epicentral region in order to describe the geometry and kinematics of surface faulting, and subsequently of landslides and other secondary coseismic effects. The resulting database consists of homogeneous georeferenced records identifying 7323 observation points, each of which contains 18 numeric and string fields of relevant information. This database will impact future earthquake studies focused on modelling of the seismic processes in active extensional settings, updating probabilistic estimates of slip distribution, and assessing the hazard of surface faulting.

  8. Salary Management System for Small and Medium-sized Enterprises

    NASA Astrophysics Data System (ADS)

    Hao, Zhang; Guangli, Xu; Yuhuan, Zhang; Yilong, Lei

    In the past, Small and Medium-sized Enterprises (SMEs) handled wage entry, calculation, and totalling manually; the data volume is large, processing is slow, and errors are easy to make, resulting in low efficiency. This paper presents a salary management system that establishes a scientific database and computerizes payroll so that the computer replaces much of the former manual work, reducing duplicated staff labor and improving working efficiency. The system is built around the actual needs of SMEs through in-depth study and practice of the C/S (client/server) mode, the PowerBuilder 10.0 development tool, databases, and the SQL language, covering requirements analysis, database design, and application design and development. It includes database files for wages, departments, units, and personnel, and provides data management, department management, personnel management, and other functions; query, add, delete, and modify operations are realized through control and management of the database. Testing shows that the design is sound, the functionality fairly complete, and operation stable, meeting the basic needs of the work.
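
    The abstract names query, add, delete, and modify as the system's core database operations. Below is a minimal sketch of those four operations using Python's built-in sqlite3 in place of the paper's PowerBuilder/SQL client-server stack; the table and column names are invented for illustration:

    ```python
    # Sketch of the query/add/delete/modify payroll operations the abstract
    # lists, with SQLite standing in for the paper's database. Schema invented.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE wages (
        emp_id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        department TEXT NOT NULL,
        base_pay REAL NOT NULL,
        bonus REAL NOT NULL DEFAULT 0)""")

    # add
    conn.execute("INSERT INTO wages VALUES (?, ?, ?, ?, ?)",
                 (1001, "Li Wei", "Sales", 4200.0, 300.0))
    # modify
    conn.execute("UPDATE wages SET bonus = ? WHERE emp_id = ?", (450.0, 1001))
    # query: totals computed by the engine instead of by hand
    for dept, total in conn.execute(
            "SELECT department, SUM(base_pay + bonus) FROM wages GROUP BY department"):
        print(dept, total)
    # delete
    conn.execute("DELETE FROM wages WHERE emp_id = ?", (1001,))
    ```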

  9. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    NASA Astrophysics Data System (ADS)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

    As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of the massive amounts of data generated by the MUSER, this paper proposes a novel data management technique called the negative database (ND) and uses it to implement a data management system for the MUSER. Built on a key-value database, the ND technique makes full use of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that were absent, its overall performance, including querying and deriving data, is comparable with that of an RDBMS. The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required by next-generation telescopes.
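
    The negative-database idea can be shown with a toy example: when observations arrive on a fixed cadence, only the complement (the absent records) needs to be stored, and the presence and metadata of every other record are derived on demand. The cadence, key scheme, and in-memory set below are invented for illustration; the abstract does not describe MUSER's actual keys or storage engine:

    ```python
    # Toy negative database: store only what is missing, derive the rest.
    CADENCE_MS = 25                 # hypothetical sampling interval
    missing = {75, 125}             # the "negative" set: absent timestamps

    def record_exists(ts_ms: int) -> bool:
        # A record exists iff it falls on the cadence and is not in the
        # negative set; no per-record row is stored for present data.
        return ts_ms % CADENCE_MS == 0 and ts_ms not in missing

    def derive_record(ts_ms: int):
        # Expected metadata follows from the cadence alone.
        if not record_exists(ts_ms):
            return None
        return {"timestamp_ms": ts_ms, "frame_index": ts_ms // CADENCE_MS}

    print(record_exists(50), record_exists(75))  # True False
    print(derive_record(100))                    # derived, never stored
    ```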

  10. 41 CFR 102-83.115 - What is a central city?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What is a central city? 102-83.115 Section 102-83.115 Public Contracts and Property Management Federal Property Management... Space Urban Areas § 102-83.115 What is a central city? Central cities are those central cities defined...

  11. 41 CFR 102-83.115 - What is a central city?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What is a central city? 102-83.115 Section 102-83.115 Public Contracts and Property Management Federal Property Management... Space Urban Areas § 102-83.115 What is a central city? Central cities are those central cities defined...

  12. 48 CFR 52.232-33 - Payment by Electronic Funds Transfer-System for Award Management.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... contained in the System for Award Management (SAM) database. In the event that the EFT information changes, the Contractor shall be responsible for providing the updated information to the SAM database. (c... 210. (d) Suspension of payment. If the Contractor's EFT information in the SAM database is incorrect...

  13. 48 CFR 52.232-33 - Payment by Electronic Funds Transfer-System for Award Management.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... contained in the System for Award Management (SAM) database. In the event that the EFT information changes, the Contractor shall be responsible for providing the updated information to the SAM database. (c... 210. (d) Suspension of payment. If the Contractor's EFT information in the SAM database is incorrect...

  14. Microcomputer-Based Access to Machine-Readable Numeric Databases.

    ERIC Educational Resources Information Center

    Wenzel, Patrick

    1988-01-01

    Describes the use of microcomputers and relational database management systems to improve access to numeric databases by the Data and Program Library Service at the University of Wisconsin. The internal records management system, in-house reference tools, and plans to extend these tools to the entire campus are discussed. (3 references) (CLB)

  15. How Database Management Systems Can Be Used To Evaluate Program Effectiveness in Small School Districts.

    ERIC Educational Resources Information Center

    Hoffman, Tony

    Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…

  16. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  17. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  18. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  19. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  20. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  1. A Database Design and Development Case: NanoTEK Networks

    ERIC Educational Resources Information Center

    Ballenger, Robert M.

    2010-01-01

    This case provides a real-world project-oriented case study for students enrolled in a management information systems, database management, or systems analysis and design course in which database design and development are taught. The case consists of a business scenario to provide background information and details of the unique operating…

  2. Database Access Systems.

    ERIC Educational Resources Information Center

    Dalrymple, Prudence W.; Roderer, Nancy K.

    1994-01-01

    Highlights the changes that have occurred from 1987-93 in database access systems. Topics addressed include types of databases, including CD-ROMs; enduser interface; database selection; database access management, including library instruction and use of primary literature; economic issues; database users; the search process; and improving…

  3. BoreholeAR: A mobile tablet application for effective borehole database visualization using an augmented reality technology

    NASA Astrophysics Data System (ADS)

    Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong

    2015-03-01

    Boring logs are widely used in geological field studies since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application that enables its user to search boring logs rapidly and visualize them using augmented reality (AR). For the development of the application, a standard borehole database appropriate for a mobile borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of the corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions over large borehole databases for the other modules. A field survey was also carried out using more than 100,000 borehole records.
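
    A sketch of the kind of standardized borehole record and location query the database module might serve to the AR and map modules; the field names and the flat-earth distance shortcut are illustrative assumptions, not the paper's schema:

    ```python
    # Hypothetical borehole records and a nearest-borehole lookup of the sort
    # an AR view needs. Coordinates and fields are invented for illustration.
    import math

    boreholes = [
        {"id": "BH-001", "lat": 37.5665, "lon": 126.9780, "depth_m": 30.0},
        {"id": "BH-002", "lat": 37.5651, "lon": 126.9895, "depth_m": 22.5},
    ]

    def nearest_borehole(lat: float, lon: float) -> dict:
        # Equirectangular approximation; adequate over the short ranges
        # an augmented-reality camera view covers.
        def dist(b):
            dx = (b["lon"] - lon) * math.cos(math.radians(lat))
            dy = b["lat"] - lat
            return math.hypot(dx, dy)
        return min(boreholes, key=dist)

    print(nearest_borehole(37.5660, 126.9800)["id"])  # BH-001
    ```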

  4. Facilitating quality control for spectra assignments of small organic molecules: nmrshiftdb2--a free in-house NMR database with integrated LIMS for academic service laboratories.

    PubMed

    Kuhn, Stefan; Schlörer, Nils E

    2015-08-01

    With its integrated laboratory information management system, nmrshiftdb2 supports bringing electronic lab administration and management into academic NMR facilities. It also offers the setup of a local database while granting full access to nmrshiftdb2's World Wide Web database. This freely available system allows, on the one hand, submission of orders for measurement, automatic or manual transfer of recorded data, and download of spectra via a web interface, as well as integrated access to the prediction, search, and assignment tools of the NMR database for lab users. On the other hand, staff and lab administration can supervise the flow of all orders; administrative tools also include user and hardware management, a statistics function for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database are based on a web front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.

  5. Evaluating the Potential of Commercial GIS for Accelerator Configuration Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.L. Larrieu; Y.R. Roblin; K. White

    2005-10-10

    The Geographic Information System (GIS) is a tool used by industries needing to track information about spatially distributed assets. A water utility, for example, must know not only the precise location of each pipe and pump, but also the respective pressure rating and flow rate of each. In many ways, an accelerator such as CEBAF (Continuous Electron Beam Accelerator Facility) can be viewed as an "electron utility". Whereas the water utility uses pipes and pumps, the "electron utility" uses magnets and RF cavities. At Jefferson Lab we are exploring the possibility of implementing ESRI's ArcGIS as the framework for building an all-encompassing accelerator configuration database that integrates location, configuration, maintenance, and connectivity details of all hardware and software. The possibilities of doing so are intriguing. From the GIS, software such as the model server could always extract the most up-to-date layout information maintained by Survey & Alignment for lattice modeling. The Mechanical Engineering department could use ArcGIS tools to generate CAD drawings of machine segments from the same database. Ultimately, the greatest benefit of the GIS implementation could be to liberate operators and engineers from the limitations of the current system-by-system view of machine configuration and allow a more integrated regional approach. The commercial GIS package provides a rich set of tools for database connectivity, versioning, distributed editing, importing and exporting, and graphical analysis and querying, and therefore obviates the need for much custom development. However, formidable challenges to implementation exist, and these challenges are not only technical and manpower issues but also organizational ones. The GIS approach would crosscut organizational boundaries and require departments, which heretofore have had free rein to manage their own data, to cede some control and agree to a centralized framework.

  6. Network Configuration of Oracle and Database Programming Using SQL

    NASA Technical Reports Server (NTRS)

    Davis, Melton; Abdurrashid, Jibril; Diaz, Philip; Harris, W. C.

    2000-01-01

    A database can be defined as a collection of information organized in such a way that it can be retrieved and used. A database management system (DBMS) can further be defined as the tool that enables us to manage and interact with the database. The Oracle 8 Server is a state-of-the-art information management environment. It is a repository for very large amounts of data, and gives users rapid access to that data. The Oracle 8 Server allows for sharing of data between applications; the information is stored in one place and used by many systems. My research will focus primarily on SQL (Structured Query Language) programming. SQL is the way you define and manipulate data in Oracle's relational database. SQL is the industry standard adopted by all database vendors. When programming with SQL, you work on sets of data (i.e., information is not processed one record at a time).
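
    The closing point, that SQL works on sets of data rather than one record at a time, is easy to illustrate. The sketch below uses Python's sqlite3 rather than the Oracle 8 Server, since the set-at-a-time principle is engine-independent; the table is invented:

    ```python
    # One declarative UPDATE replaces a record-at-a-time loop.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE parts (part_no INTEGER, status TEXT, mass_kg REAL)")
    conn.executemany("INSERT INTO parts VALUES (?, ?, ?)",
                     [(1, "spare", 2.5), (2, "active", 1.2), (3, "spare", 4.0)])

    # Set-at-a-time: every qualifying row is changed by a single statement.
    conn.execute("UPDATE parts SET status = 'retired' WHERE mass_kg > 2.0")

    print(conn.execute("SELECT part_no, status FROM parts ORDER BY part_no").fetchall())
    # [(1, 'retired'), (2, 'active'), (3, 'retired')]
    ```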

  7. Virtopsy - the concept of a centralized database in forensic medicine for analysis and comparison of radiological and autopsy data.

    PubMed

    Aghayev, Emin; Staub, Lukas; Dirnhofer, Richard; Ambrose, Tony; Jackowski, Christian; Yen, Kathrin; Bolliger, Stephan; Christe, Andreas; Roeder, Christoph; Aebi, Max; Thali, Michael J

    2008-04-01

    Recent developments in clinical radiology have resulted in additional developments in the field of forensic radiology. After the implementation of cross-sectional radiology and optical surface documentation in forensic medicine, difficulties were experienced in the validation and analysis of the acquired data. To address this problem, and to compare autopsy and radiological data, a centralized internet-based database for forensic cases was created. The main goals of the database are (1) creation of a digital and standardized documentation tool for forensic-radiological and pathological findings; (2) establishing a basis for validation of forensic cross-sectional radiology as a non-invasive examination method in forensic medicine, that is, comparing and evaluating the radiological and autopsy data and analyzing the accuracy of such data; and (3) providing a conduit for continuing research and education in forensic medicine. Considering the infrequent availability of CT or MRI to forensic institutions and the heterogeneous nature of case material in forensic medicine, an evaluation of the benefits and limitations of cross-sectional imaging for certain forensic features by a single institution may be of limited value. A centralized database permitting international forensic and cross-disciplinary collaborations may provide important support for forensic-radiological casework and research.

  8. Methods for structuring scientific knowledge from many areas related to aging research.

    PubMed

    Zhavoronkov, Alex; Cantor, Charles R

    2011-01-01

    Aging and age-related disease represent a substantial share of current natural, social, and behavioral science research efforts. Presently, no centralized system exists for tracking aging research projects across the numerous research disciplines involved. The multidisciplinary nature of this research complicates the understanding of underlying project categories, the establishment of project relations, and the development of a unified project classification scheme. We have developed a highly visual database, the International Aging Research Portfolio (IARP), available at AgingPortfolio.org, to address this issue. The database integrates information on research grants, peer-reviewed publications, and issued patent applications from multiple sources. Additionally, the database uses flexible project classification mechanisms and tools for analyzing project associations and trends. This system enables scientists to search the centralized project database, to classify and categorize aging projects, and to analyze funding across multiple research disciplines. The IARP is designed to improve the allocation and prioritization of scarce research funding and to reduce project overlap and improve scientific collaboration, thereby accelerating scientific and medical progress in a rapidly growing area of research. Grant applications often precede publications, and some grants do not result in publications at all; the system therefore offers an earlier and broader view of research activity in many disciplines. This project is a first attempt to provide a centralized database system for research grants and to categorize aging research projects into multiple subcategories, utilizing both advanced machine algorithms and a hierarchical environment for scientific collaboration.

  9. A geospatial database model for the management of remote sensing datasets at multiple spectral, spatial, and temporal scales

    NASA Astrophysics Data System (ADS)

    Ifimov, Gabriela; Pigeau, Grace; Arroyo-Mora, J. Pablo; Soffer, Raymond; Leblanc, George

    2017-10-01

    In this study, the development and implementation of a geospatial database model for the management of multiscale datasets encompassing airborne imagery and associated metadata are presented. To develop the multi-source geospatial database, we used a Relational Database Management System (RDBMS) on a Structured Query Language (SQL) server, which was then integrated into ArcGIS and implemented as a geodatabase. The acquired datasets were compiled, standardized, and integrated into the RDBMS, where logical associations between different types of information were linked (e.g., location, date, and instrument). Airborne data at different processing levels (digital numbers through geocorrected reflectance) were implemented in the geospatial database, where the datasets are linked spatially and temporally. An example dataset is presented, consisting of airborne hyperspectral imagery collected for inter- and intra-annual vegetation characterization and for detection of potential hydrocarbon seepage events over pipeline areas. Our work provides a model for the management of airborne imagery, which is a challenging aspect of data management in remote sensing, especially when large volumes of data are collected.

  10. How I do it: a practical database management system to assist clinical research teams with data collection, organization, and reporting.

    PubMed

    Lee, Howard; Chapiro, Julius; Schernthaner, Rüdiger; Duran, Rafael; Wang, Zhijun; Gorodetski, Boris; Geschwind, Jean-François; Lin, MingDe

    2015-04-01

    The objective of this study was to demonstrate that an intra-arterial liver therapy clinical research database system is a more workflow-efficient and robust tool for clinical research than a spreadsheet storage system. The database system could easily generate clinical research study populations with custom search and retrieval criteria. A questionnaire was designed and distributed to 21 board-certified radiologists to assess current data storage problems and clinician reception to a database management system. Based on the questionnaire findings, a customized database and user interface were created to perform automatic calculations of clinical scores, including staging systems such as Child-Pugh and Barcelona Clinic Liver Cancer, and to facilitate data input and output. Questionnaire participants were favorable to a database system. The interface retrieved study-relevant data accurately and effectively. The database effectively produced easy-to-read, study-specific patient populations with custom-defined inclusion/exclusion criteria. The database management system is workflow efficient and robust in retrieving, storing, and analyzing data. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
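
    One of the automatic calculations mentioned above, the Child-Pugh score, can be sketched directly from the standard published scale (five parameters, each scored 1-3; totals of 5-6, 7-9, and 10-15 map to classes A, B, and C). This illustrates the published scale, not the study's actual implementation, and is not for clinical use:

    ```python
    def child_pugh(bilirubin_mg_dl, albumin_g_dl, inr, ascites, encephalopathy):
        """Return (score, class) per the standard Child-Pugh scale."""
        def band(value, low, high, reverse=False):
            # Each parameter scores 1, 2 or 3 depending on its cutoff band.
            if reverse:                      # higher is better (albumin)
                return 1 if value > high else 2 if value >= low else 3
            return 1 if value < low else 2 if value <= high else 3

        score = (band(bilirubin_mg_dl, 2.0, 3.0)
                 + band(albumin_g_dl, 2.8, 3.5, reverse=True)
                 + band(inr, 1.7, 2.3)
                 + {"none": 1, "mild": 2, "moderate-severe": 3}[ascites]
                 + {"none": 1, "grade I-II": 2, "grade III-IV": 3}[encephalopathy])
        grade = "A" if score <= 6 else "B" if score <= 9 else "C"
        return score, grade

    print(child_pugh(1.5, 3.8, 1.2, "none", "none"))        # (5, 'A')
    print(child_pugh(2.5, 3.0, 2.0, "mild", "grade I-II"))  # (10, 'C')
    ```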

  11. Universal Index System

    NASA Technical Reports Server (NTRS)

    Kelley, Steve; Roussopoulos, Nick; Sellis, Timos; Wallace, Sarah

    1993-01-01

    The Universal Index System (UIS) is an index management system that uses a uniform interface to solve the heterogeneity problem among database management systems. UIS not only provides an easy-to-use common interface for accessing all underlying data, but also allows for different underlying database management systems, storage representations, and access methods.
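
    The uniform-interface idea lends itself to a short sketch: one access protocol with interchangeable index implementations underneath. The method names below are invented, as the summary does not specify UIS's actual interface:

    ```python
    # One access protocol, two interchangeable index backends (hypothetical).
    from abc import ABC, abstractmethod
    from typing import Optional
    import bisect

    class Index(ABC):
        @abstractmethod
        def put(self, key: str, value: str) -> None: ...
        @abstractmethod
        def get(self, key: str) -> Optional[str]: ...

    class HashIndex(Index):
        """Hash-table backend."""
        def __init__(self):
            self._d = {}
        def put(self, key, value):
            self._d[key] = value
        def get(self, key):
            return self._d.get(key)

    class SortedIndex(Index):
        """Sorted-list backend, a stand-in for a B-tree style access method."""
        def __init__(self):
            self._items = []
        def put(self, key, value):
            i = bisect.bisect_left(self._items, (key, ""))
            if i < len(self._items) and self._items[i][0] == key:
                self._items[i] = (key, value)   # overwrite existing key
            else:
                self._items.insert(i, (key, value))
        def get(self, key):
            i = bisect.bisect_left(self._items, (key, ""))
            if i < len(self._items) and self._items[i][0] == key:
                return self._items[i][1]
            return None

    # Client code never changes, whichever backend is plugged in.
    for index in (HashIndex(), SortedIndex()):
        index.put("fy1985", "budget report")
        print(index.get("fy1985"))
    ```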

  12. Maintaining Research Documents with Database Management Software.

    ERIC Educational Resources Information Center

    Harrington, Stuart A.

    1999-01-01

    Discusses taking notes for research projects and organizing them into card files; reviews the literature on personal filing systems; introduces the basic process of database management; and offers a plan for managing research notes. Describes field groups and field definitions, data entry, and creating reports. (LRW)

  13. 76 FR 15953 - Agency Information Collection Activities; Announcement of Office of Management and Budget...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-22

    ... CONSUMER PRODUCT SAFETY COMMISSION Agency Information Collection Activities; Announcement of Office of Management and Budget Approval; Publicly Available Consumer Product Safety Information Database... Product Safety Information Database has been approved by the Office of Management and Budget (OMB) under...

  14. Use of imagery and GIS for humanitarian demining management

    NASA Astrophysics Data System (ADS)

    Gentile, Jack; Gustafson, Glen C.; Kimsey, Mary; Kraenzle, Helmut; Wilson, James; Wright, Stephen

    1997-11-01

    In the Fall of 1996, the Center for Geographic Information Science at James Madison University became involved in a project for the Department of Defense evaluating the data needs and data management systems for humanitarian demining in the Third World. In particular, the effort focused on the information needs of demining in Cambodia and in Bosnia. In the first phase of the project one team attempted to identify all sources of unclassified country data, image data and map data. Parallel with this, another group collected information and evaluations on most of the commercial off-the-shelf computer software packages for the management of such geographic information. The result was a design for the kinds of data and the kinds of systems necessary to establish and maintain such a database as a humanitarian demining management tool. The second phase of the work involved acquiring the recommended data and systems, integrating the two, and producing a demonstration of the system. In general, the configuration involves ruggedized portable computers for field use with a greatly simplified graphical user interface, supported by a more capable central facility based on Pentium workstations and appropriate technical expertise.

  15. Processing of next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data for the DuPage County streamflow simulation system

    USGS Publications Warehouse

    Bera, Maitreyee; Ortel, Terry W.

    2018-01-12

    The U.S. Geological Survey, in cooperation with DuPage County Stormwater Management Department, is testing a near real-time streamflow simulation system that assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek and West Branch DuPage River drainage basins in DuPage County, Illinois. As part of this effort, the U.S. Geological Survey maintains a database of hourly meteorological and hydrologic data for use in this near real-time streamflow simulation system. Among these data are next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data, which are retrieved from the North Central River Forecasting Center of the National Weather Service. The DuPage County streamflow simulation system uses these quantitative precipitation forecast data to create streamflow predictions for the two simulated drainage basins. This report discusses in detail how these data are processed for inclusion in the Watershed Data Management files used in the streamflow simulation system for the Salt Creek and West Branch DuPage River drainage basins.

  16. Amyloid beta (Aβ) peptide modulators and other current treatment strategies for Alzheimer’s disease (AD)

    PubMed Central

    Lukiw, Walter J.

    2012-01-01

    Introduction Alzheimer’s disease (AD) is a common, progressive neurological disorder whose incidence is reaching epidemic proportions. The prevailing ‘amyloid cascade hypothesis’, which maintains that the aberrant proteolysis of beta-amyloid precursor protein (βAPP) into neurotoxic amyloid beta (Aβ)-peptides is central to the etiopathology of AD, continues to dominate pharmacological approaches to the clinical management of this insidious disorder. This review is a compilation and update on current pharmacological strategies designed to down-regulate Aβ42-peptide generation in an effort to ameliorate the tragedy of AD. Areas Covered This review utilized on-line data searches at various open online-access websites including the Alzheimer Association, Alzheimer Research Forum; individual drug company databases; the National Institutes of Health (NIH) Medline; Pharmaprojects database; Scopus; inter-University research communications and unpublished research data. Expert Opinion Aβ immunization-, anti-acetylcholinesterase-, β-secretase-, chelation-, γ-secretase-, N-methyl D-aspartate (NMDA) receptor antagonist-, statin-based and other strategies to modulate βAPP processing have dominated pharmacological approaches directed against AD-type neurodegenerative pathology. Cumulative clinical results of these efforts remain extremely disappointing, and have had little overall impact on the clinical management of AD. While a number of novel approaches are in consideration and development, to date there is still no effective treatment or cure for this expanding healthcare concern. PMID:22439907

  17. Online tools for individuals with depression and neurologic conditions: A scoping review.

    PubMed

    Lukmanji, Sara; Pham, Tram; Blaikie, Laura; Clark, Callie; Jetté, Nathalie; Wiebe, Samuel; Bulloch, Andrew; Holroyd-Leduc, Jayna; Macrodimitris, Sophia; Mackie, Aaron; Patten, Scott B

    2017-08-01

    Patients with neurologic conditions commonly have depression. Online tools have the potential to improve outcomes in these patients in an efficient and accessible manner. We aimed to identify evidence-informed online tools for patients with comorbid neurologic conditions and depression. A scoping review of online tools (free, publicly available, and not requiring a facilitator) for patients with depression and epilepsy, Parkinson disease (PD), multiple sclerosis (MS), traumatic brain injury (TBI), or migraine was conducted. MEDLINE, EMBASE, PsycINFO, Cochrane Database of Systematic Reviews, and Cochrane CENTRAL Register of Controlled Trials were searched from database inception to January 2017 for all 5 neurologic conditions. Gray literature using Google and Google Scholar as well as app stores for both Android and Apple devices were searched. Self-management or self-efficacy online tools were not included unless they were specifically targeted at depression and one of the neurologic conditions and met the other eligibility criteria. Only 4 online tools were identified. Of these 4 tools, 2 were web-based self-management programs for patients with migraine or MS and depression. The other 2 were mobile apps for patients with PD or TBI and depression. No online tools were found for epilepsy. There are limited depression tools for people with neurologic conditions that are evidence-informed, publicly available, and free. Future research should focus on the development of high-quality, evidence-based online tools targeted at neurologic patients.

  18. Using experimental and geospatial data to estimate regional carbon sequestration potential under no-till management

    USGS Publications Warehouse

    Tan, Z.; Lal, R.; Liu, S.

    2006-01-01

    Conservation management of croplands at the plot scale has demonstrated a great potential to mitigate the greenhouse effect through sequestration of atmospheric carbon (C) into soil. This study estimated the potential of soil to sequester C through the conversion of croplands from conventional tillage (CT) to no-till (NT) in the East Central United States between 1992 and 2012. The study used the baseline soil organic C (SOC) pool (SOCP) inventory and empirical models that relate the SOCP under CT and NT, respectively, to the baseline SOCP in the upper 30 cm of soil. The baseline SOCP was obtained from the State Soil Geographic database, and the cropland distribution map was generated from the 1992 National Land Cover Database. The results indicate that if all the croplands under CT in 1992 were converted to NT, the SOCP would increase by 16.8% by 2012, resulting in a total C sink of 136 Tg after 20 years. A greater sequestration rate would occur in soils with lower baseline SOCP, but the sink strength would weaken with increasing SOCP levels. CT-induced C sources tend to be larger in soils with higher baseline levels and can be significantly reduced by adopting NT. We conclude that baseline SOC contents are an indicator of C sequestration potential under NT practices. © 2006 Lippincott Williams & Wilkins, Inc.

  19. GIS-based Geospatial Infrastructure of Water Resource Assessment for Supporting Oil Shale Development in Piceance Basin of Northwestern Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Wei; Minnick, Matthew D; Mattson, Earl D

    Oil shale deposits of the Green River Formation (GRF) in Northwestern Colorado, Southwestern Wyoming, and Northeastern Utah may become one of the first oil shale deposits to be developed in the U.S. because of their richness, accessibility, and extensive prior characterization. Oil shale is an organic-rich, fine-grained sedimentary rock that contains significant amounts of kerogen from which liquid hydrocarbons can be produced. Water is needed to retort or extract oil shale, at an approximate rate of three volumes of water for every volume of oil produced. Concerns have been raised over the demand for and availability of water to produce oil shale, particularly in semiarid regions where water consumption must be limited and optimized to meet demands from other sectors. The economic benefit of oil shale development in this region may have tradeoffs within the local and regional environment. Due to these potential environmental impacts of oil shale development, water usage issues need to be studied further. A basin-wide baseline of oil shale and water resource data is the foundation of the study. This paper focuses on the design and construction of a centralized geospatial infrastructure for managing a large amount of oil shale and water resource related baseline data, and for setting up the frameworks for analytical and numerical models including, but not limited to, three-dimensional (3D) geologic, energy resource development system, and surface water models. Such a centralized geospatial infrastructure made it possible to generate model inputs directly from the same database and to couple the different models indirectly through inputs/outputs. This ensures consistency of analyses conducted by researchers from different institutions, and helps decision makers balance the water budget based on the spatial distribution of the oil shale and water resources and the spatial variations of the geologic, topographic, and hydrogeological characterization of the basin. This endeavor encountered many technical challenges and had not been done in the past for any oil shale basin. The database built during this study remains valuable for any future studies involving oil shale and water resource management in the Piceance Basin. The methodology applied in the development of the GIS-based geospatial infrastructure can be readily adapted by other professionals to develop database structures for other similar basins.

  20. Design of Student Information Management Database Application System for Office and Departmental Target Responsibility System

    NASA Astrophysics Data System (ADS)

    Zhou, Hui

    Implementing an office and departmental target responsibility system is an inevitable outcome of higher-education reform, and statistical processing of student information is an important part of student performance review under such a system. On the basis of an analysis of student evaluation, this paper designs a student information management database application system using relational database management system software. To implement the student information management functions, the functional requirements, overall structure, data sheets and fields, data sheet associations, and software code are designed in detail.

  1. Preliminary surficial geologic map of a Calico Mountains piedmont and part of Coyote Lake, Mojave desert, San Bernardino County, California

    USGS Publications Warehouse

    Dudash, Stephanie L.

    2006-01-01

    This 1:24,000 scale detailed surficial geologic map and digital database of a Calico Mountains piedmont and part of Coyote Lake in south-central California depict surficial deposits and generalized bedrock units. The mapping is part of a USGS project to investigate the spatial distribution of deposits linked to changes in climate, to provide framework geology for land use management (http://deserts.wr.usgs.gov), to understand the Quaternary tectonic history of the Mojave Desert, and to provide additional information on the history of Lake Manix, of which Coyote Lake is a sub-basin. Mapping is displayed on parts of four USGS 7.5 minute series topographic maps. The map area lies in the central Mojave Desert of California, northeast of Barstow, Calif. and south of Fort Irwin, Calif. and covers 258 sq.km. (99.5 sq.mi.). Geologic deposits in the area consist of Paleozoic metamorphic rocks, Mesozoic plutonic rocks, Miocene volcanic rocks, Pliocene-Pleistocene basin fill, and Quaternary surficial deposits. McCulloh (1960, 1965) conducted bedrock mapping, and a generalized version of his maps is compiled into this map. McCulloh's maps contain many bedrock structures within the Calico Mountains that are not shown on the present map. This study resulted in several new findings, including the discovery of previously unrecognized faults, one of which is the Tin Can Alley fault. The north-striking Tin Can Alley fault is part of the Paradise fault zone (Miller and others, 2005), a potentially important feature for studying neo-tectonic strain in the Mojave Desert. Additionally, many Anodonta shells were collected in Coyote Lake lacustrine sediments for radiocarbon dating. Preliminary results support some of Meek's (1999) conclusions on the timing of Mojave River inflow into the Coyote Basin. The database includes information on geologic deposits, samples, and geochronology. The database is distributed in three parts: spatial map-based data, documentation, and printable map graphics of the database. Spatial data are distributed as an ArcInfo personal geodatabase, or as tabular data in the form of Microsoft Access Database (MDB) or dBase Format (DBF) file formats. Documentation includes this file, which provides a discussion of the surficial geology and describes the format and content of the map data, and Federal Geographic Data Committee (FGDC) metadata for the spatial map information. Map graphics files are distributed as Postscript and Adobe Acrobat Portable Document Format (PDF) files, and are appropriate for representing a view of the spatial database at the mapped scale.

  2. Root resorption during orthodontic treatment with self-ligating or conventional brackets: a systematic review and meta-analysis.

    PubMed

    Yi, Jianru; Li, Meile; Li, Yu; Li, Xiaobing; Zhao, Zhihe

    2016-11-21

    The aim of this study was to compare external apical root resorption (EARR) in patients receiving fixed orthodontic treatment with self-ligating or conventional brackets. Studies comparing EARR between orthodontic patients using self-ligating or conventional brackets were identified through electronic searches in databases including CENTRAL, PubMed, EMBASE, the China National Knowledge Infrastructure (CNKI) and SIGLE, and through manual searches in relevant journals and the reference lists of the included studies, through April 2016. Data extraction and risk-of-bias evaluation were conducted by two investigators independently. The original outcomes underwent statistical pooling using Review Manager 5. Seven studies were included in the systematic review, of which five were statistically pooled in the meta-analysis. The EARR of maxillary central incisors in the self-ligating bracket group was significantly lower than that in the conventional bracket group (SMD -0.31; 95% CI: -0.60 to -0.01). No significant differences for the other incisors were observed between self-ligating and conventional brackets. Current evidence suggests that self-ligating brackets do not outperform conventional brackets in reducing EARR in maxillary lateral incisors, mandibular central incisors, or mandibular lateral incisors. However, self-ligating brackets appear to have an advantage in protecting the maxillary central incisors from EARR, which still needs to be confirmed by more high-quality studies.

  3. CNS sites cooperate to detect duplicate subjects with a clinical trial subject registry.

    PubMed

    Shiovitz, Thomas M; Wilcox, Charles S; Gevorgyan, Lilit; Shawkat, Adnan

    2013-02-01

    To report the results of the first 1,132 subjects in a pilot project where local central nervous system trial sites collaborated in the use of a subject database to identify potential duplicate subjects. Central nervous system sites in Los Angeles and Orange County, California, were contacted by the lead author to seek participation in the project. CTSdatabase, a central nervous system-focused trial subject registry, was utilized to track potential subjects at pre-screen. Subjects signed an institutional review board-approved authorization prior to participation, and site staff entered their identifiers by accessing a website. Sites were prompted to communicate with each other or with the database administrator when a match occurred between a newly entered subject and a subject already in the database. Between October 30, 2011, and August 31, 2012, 1,132 subjects were entered at nine central nervous system sites. Subjects continue to be entered, and more sites are anticipated to begin participation by the time of publication. Initially, there were concerns at a few sites over patient acceptance, financial implications, and/or legal and privacy issues, but these were eventually overcome. Patient acceptance was estimated to be above 95 percent. Duplicate Subjects (those that matched several key identifiers with subjects at different sites) made up 7.78 percent of the sample and Certain Duplicates (matching identifiers with a greater than 1 in 10 million likelihood of occurring by chance in the general population) accounted for 3.45 percent of pre-screens entered into the database. Many of these certain duplicates were not consented for studies because of the information provided by the registry. The use of a clinical trial subject registry and cooperation between central nervous system trial sites can reduce the number of duplicate and professional subjects entering clinical trials. To be fully effective, a trial subject database could be integrated into protocols across pharmaceutical companies, thereby mandating site participation and increasing the likelihood that duplicate subjects will be removed before they enter (and negatively affect) clinical trials.
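
    The general identifier-matching pattern such a registry relies on can be sketched with hashed identifier bundles, so that sites compare subjects without exchanging raw personal data. The choice of identifiers and the hashing scheme are assumptions for illustration; the article does not disclose CTSdatabase's actual matching algorithm:

    ```python
    # Hypothetical privacy-preserving duplicate check across sites.
    import hashlib

    def fingerprint(first_initial: str, birth_date: str, phone_last4: str) -> str:
        # One-way hash of a small identifier bundle; sites share only hashes.
        raw = f"{first_initial.lower()}|{birth_date}|{phone_last4}"
        return hashlib.sha256(raw.encode()).hexdigest()

    site_a_registry = {fingerprint("j", "1975-04-02", "8812")}
    candidate = fingerprint("J", "1975-04-02", "8812")

    if candidate in site_a_registry:
        print("potential duplicate -- sites should confer before consenting")
    ```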

  4. Development of the ageing management database of PUSPATI TRIGA reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramli, Nurhayati, E-mail: nurhayati@nm.gov.my; Tom, Phongsakorn Prak; Husain, Nurfazila

    Since its first criticality in 1982, the PUSPATI TRIGA Reactor (RTP) has been operated for more than 30 years. As RTP grows older, ageing problems have emerged as prominent issues. In addressing them, an Ageing Management (AgeM) database for managing related ageing matters was systematically developed. This paper presents the development of the AgeM database, taking into account all major RTP Systems, Structures and Components (SSCs) and the ageing mechanisms of these SSCs through the system surveillance program.

  5. The Design and Implementation of a Relational to Network Query Translator for a Distributed Database Management System.

    DTIC Science & Technology

    1985-12-01

    Relational to Network Query Translator for a Distributed Database Management System. Thesis presented to the Faculty of the School of Engineering of the Air Force Institute of Technology, Air University, in partial fulfillment of the requirements for the degree of Master of Science in Computer Systems. Kevin H. Mahoney, Captain, USAF (AFIT/GCS/ENG/85D-7).

  6. Central diabetes insipidus: a previously unreported side effect of temozolomide.

    PubMed

    Faje, Alexander T; Nachtigall, Lisa; Wexler, Deborah; Miller, Karen K; Klibanski, Anne; Makimura, Hideo

    2013-10-01

    Temozolomide (TMZ) is an alkylating agent primarily used to treat tumors of the central nervous system. We describe 2 patients with apparent TMZ-induced central diabetes insipidus. Using our institution's Research Patient Database Registry, we identified 3 additional potential cases of TMZ-induced diabetes insipidus among a group of 1545 patients treated with TMZ. A 53-year-old male with an oligoastrocytoma and a 38-year-old male with an oligodendroglioma each developed symptoms of polydipsia and polyuria approximately 2 months after the initiation of TMZ. Laboratory analyses demonstrated hypernatremia and urinary concentrating defects, consistent with the presence of diabetes insipidus, and the patients were successfully treated with desmopressin acetate. Desmopressin acetate was withdrawn after the discontinuation of TMZ, and diabetes insipidus did not recur. Magnetic resonance imaging of the pituitary and hypothalamus was unremarkable apart from the absence of a posterior pituitary bright spot in both of the cases. Anterior pituitary function tests were normal in both cases. Using the Research Patient Database Registry, we identified the 2 index cases and 3 additional potential cases of diabetes insipidus, for an estimated prevalence of 0.3% (5 cases of diabetes insipidus per 1545 patients prescribed TMZ). Central diabetes insipidus is a rare but reversible side effect of treatment with TMZ.

  7. Central Diabetes Insipidus: A Previously Unreported Side Effect of Temozolomide

    PubMed Central

    Nachtigall, Lisa; Wexler, Deborah; Miller, Karen K.; Klibanski, Anne; Makimura, Hideo

    2013-01-01

    Context: Temozolomide (TMZ) is an alkylating agent primarily used to treat tumors of the central nervous system. We describe 2 patients with apparent TMZ-induced central diabetes insipidus. Using our institution's Research Patient Database Registry, we identified 3 additional potential cases of TMZ-induced diabetes insipidus among a group of 1545 patients treated with TMZ. Case Presentations: A 53-year-old male with an oligoastrocytoma and a 38-year-old male with an oligodendroglioma each developed symptoms of polydipsia and polyuria approximately 2 months after the initiation of TMZ. Laboratory analyses demonstrated hypernatremia and urinary concentrating defects, consistent with the presence of diabetes insipidus, and the patients were successfully treated with desmopressin acetate. Desmopressin acetate was withdrawn after the discontinuation of TMZ, and diabetes insipidus did not recur. Magnetic resonance imaging of the pituitary and hypothalamus was unremarkable apart from the absence of a posterior pituitary bright spot in both of the cases. Anterior pituitary function tests were normal in both cases. Using the Research Patient Database Registry, we identified the 2 index cases and 3 additional potential cases of diabetes insipidus, for an estimated prevalence of 0.3% (5 cases of diabetes insipidus per 1545 patients prescribed TMZ). Conclusions: Central diabetes insipidus is a rare but reversible side effect of treatment with TMZ. PMID:23928668

  8. Tautomerism in chemical information management systems

    NASA Astrophysics Data System (ADS)

    Warr, Wendy A.

    2010-06-01

    Tautomerism has an impact on many of the processes in chemical information management systems including novelty checking during registration into chemical structure databases; storage of structures; exact and substructure searching in chemical structure databases; and depiction of structures retrieved by a search. The approaches taken by 27 different software vendors and database producers are compared. It is hoped that this comparison will act as a discussion document that could ultimately improve databases and software for researchers in the future.

  9. Phynx: an open source software solution supporting data management and web-based patient-level data review for drug safety studies in the general practice research database and other health care databases.

    PubMed

    Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan

    2010-01-01

    To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, Web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. The system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common Web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases, considering the individual clinical context. It can therefore make an important contribution to efficient validation of outcome assessment in drug safety database studies.

  10. A hierarchical spatial framework and database for the national river fish habitat condition assessment

    USGS Publications Warehouse

    Wang, L.; Infante, D.; Esselman, P.; Cooper, A.; Wu, D.; Taylor, W.; Beard, D.; Whelan, G.; Ostroff, A.

    2011-01-01

    Fisheries management programs, such as the National Fish Habitat Action Plan (NFHAP), urgently need a nationwide spatial framework and database for health assessment and policy development to protect and improve riverine systems. To meet this need, we developed a spatial framework and database using the National Hydrography Dataset Plus (1:100,000 scale; http://www.horizon-systems.com/nhdplus). This framework uses interconfluence river reaches and their local and network catchments as fundamental spatial river units, and a series of ecological and political spatial descriptors as hierarchy structures to allow users to extract or analyze information at spatial scales that they define. The database consists of variables describing channel characteristics, network position/connectivity, climate, elevation, gradient, and size. It contains a series of catchment natural and human-induced factors that are known to influence river characteristics. Our framework and database assemble all river reaches and their descriptors in one place for the first time for the conterminous United States, and provide users with the capability of adding data, conducting analyses, developing management scenarios and regulations, and tracking management progress at a variety of spatial scales. The database provides the essential data needed to achieve the objectives of NFHAP and other management programs. The downloadable beta version database is available at http://ec2-184-73-40-15.compute-1.amazonaws.com/nfhap/main/.
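
    A toy sketch of the framework's central idea: each river reach carries its own attributes plus a set of nested spatial descriptors, so the same data can be aggregated at whatever unit a user defines. The field names and values below are invented, not the database's actual schema:

    ```python
    # Reaches tagged with hierarchical descriptors; aggregation at any scale.
    reaches = [
        {"reach_id": 101, "gradient": 0.004, "state": "MI", "ecoregion": "N. Lakes"},
        {"reach_id": 102, "gradient": 0.012, "state": "MI", "ecoregion": "N. Lakes"},
        {"reach_id": 203, "gradient": 0.002, "state": "OH", "ecoregion": "E. Corn Belt"},
    ]

    def mean_gradient(unit_field: str, unit_value: str) -> float:
        # Aggregate a reach attribute at a user-chosen spatial unit.
        vals = [r["gradient"] for r in reaches if r[unit_field] == unit_value]
        return sum(vals) / len(vals)

    print(mean_gradient("state", "MI"))            # political scale
    print(mean_gradient("ecoregion", "N. Lakes"))  # ecological scale
    ```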

  11. Common Database Interface for Heterogeneous Software Engineering Tools.

    DTIC Science & Technology

    1987-12-01

    [Garbled DTIC record] A master's thesis from the Air Force Institute of Technology, Air University, submitted in partial fulfillment of the requirements for the degree of Master of Science in Information Systems. Recoverable subject descriptors: Database Management Systems; Programming (Computers); Computer Files; Information Transfer; Interfaces. Surviving table-of-contents fragments refer to the System 690 configuration, database functions, software engineering environments, and a data manager.

  12. Improving data management and dissemination in web based information systems by semantic enrichment of descriptive data aspects

    NASA Astrophysics Data System (ADS)

    Gebhardt, Steffen; Wehrmann, Thilo; Klinger, Verena; Schettler, Ingo; Huth, Juliane; Künzer, Claudia; Dech, Stefan

    2010-10-01

    The German-Vietnamese water-related information system for the Mekong Delta (WISDOM) project supports business processes in Integrated Water Resources Management in Vietnam. Multiple disciplines bring together earth- and ground-based observation themes, such as environmental monitoring, water management, demographics, economy, information technology, and infrastructural systems. This paper introduces the components of the web-based WISDOM system, including the data, logic, and presentation tiers. It focuses on the data models upon which the database management system is built, including techniques for tagging or linking metadata with the stored information. The model uses ordered groupings of spatial, thematic, and temporal reference objects to semantically tag datasets and enable fast data retrieval, such as finding all data in a specific administrative unit belonging to a specific theme. A spatial database extension is employed within the PostgreSQL database. This object-relational database was chosen over a purely relational one so that spatial objects can be tied to tabular data, improving the retrieval of census and observational data at regional, provincial, and local levels. Because the spatial database handles raster data less well, a work-around was built into WISDOM to permit efficient management of both raster and vector data. The data model also incorporates styling aspects of the spatial datasets through styled layer descriptors (SLD) and web map service (WMS) layer specifications, allowing retrieval of rendered maps. Metadata elements of the spatial data are based on the ISO19115 standard. XML-structured SLD and metadata information is stored in an XML database. The data models and the data management system are robust enough to manage the large quantity of spatial objects, sensor observations, census data, and documents. The operational WISDOM information system prototype contains modules for data management, automatic data integration, and web services for data retrieval, analysis, and distribution. The graphical user interfaces facilitate metadata cataloguing, data warehousing, web sensor data analysis, and thematic mapping.
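
    A minimal sketch of the semantic-tagging idea described above: each dataset carries spatial, thematic, and temporal reference tags, and retrieval filters on any combination, for example all data for one administrative unit and one theme. The dataset names, tags, and helper function here are hypothetical, not WISDOM's actual model.

        # Hypothetical registry: each dataset is tagged with spatial,
        # thematic, and temporal reference objects.
        datasets = [
            {"name": "landuse_2008", "space": ("VN", "Can Tho"),  "theme": "water management", "year": 2008},
            {"name": "census_2009",  "space": ("VN", "Can Tho"),  "theme": "demographics",     "year": 2009},
            {"name": "flood_2008",   "space": ("VN", "An Giang"), "theme": "water management", "year": 2008},
        ]

        def find(province=None, theme=None, year=None):
            """Retrieve dataset names whose tags match all given criteria."""
            hits = []
            for d in datasets:
                if province and d["space"][1] != province:
                    continue
                if theme and d["theme"] != theme:
                    continue
                if year and d["year"] != year:
                    continue
                hits.append(d["name"])
            return hits

        print(find(province="Can Tho", theme="water management"))  # ['landuse_2008']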

  13. Design and deployment of a large brain-image database for clinical and nonclinical research

    NASA Astrophysics Data System (ADS)

    Yang, Guo Liang; Lim, Choie Cheio Tchoyoson; Banukumar, Narayanaswami; Aziz, Aamer; Hui, Francis; Nowinski, Wieslaw L.

    2004-04-01

    An efficient database is an essential component of organizing diverse image metadata and patient information for research in medical imaging. This paper describes the design, development, and deployment of a large database system serving as a brain image repository that can be used across different platforms in a variety of medical research studies. It forms the infrastructure that links hospitals and institutions together and shares data among them. The database contains patient-, pathology-, image-, research-, and management-specific data. The functionalities of the database system include image uploading, storage, indexing, downloading, and sharing, as well as database querying and management, with security and data anonymization concerns well taken care of. The database has a multi-tier client-server architecture comprising a Relational Database Management System, Security Layer, Application Layer, and User Interface. An image source adapter has been developed to handle most of the popular image formats. The database has a web browser-based user interface and is easy to use. We used the Java programming language for its platform independence and vast function libraries. The brain image database can sort data according to clinically relevant information, which can be used effectively in research from the clinicians' point of view. The database is suitable for validating algorithms on large populations of cases, and medical images for processing can be identified and organized based on information in image metadata. Clinical research in various pathologies can thus be performed with greater efficiency, and large image repositories can be managed more effectively. A prototype of the system has been installed in a few hospitals and is working to the satisfaction of the clinicians.
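
    One building block such a repository needs is metadata anonymization before images are pooled for research. The sketch below is a generic illustration, not the system's actual code: it replaces direct identifiers in a hypothetical metadata record with salted one-way pseudonyms so records stay linkable within a site without exposing patient identity.

        import hashlib

        # Hypothetical metadata record mirroring the patient-, pathology-,
        # and image-specific fields the repository stores.
        record = {"patient_name": "DOE^JOHN", "patient_id": "12345",
                  "pathology": "glioma", "modality": "MR", "study_date": "2003-11-05"}

        IDENTIFIERS = ("patient_name", "patient_id")

        def anonymize(rec, salt="site-secret"):
            """Replace direct identifiers with one-way pseudonyms so the
            record stays linkable within a site but is de-identified."""
            out = dict(rec)
            for field in IDENTIFIERS:
                digest = hashlib.sha256((salt + out[field]).encode()).hexdigest()[:12]
                out[field] = digest
            return out

        print(anonymize(record))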

  14. Documentation of a spatial data-base management system for monitoring pesticide application in Washington

    USGS Publications Warehouse

    Schurr, K.M.; Cox, S.E.

    1994-01-01

    The Pesticide-Application Data-Base Management System was created as a demonstration project and was tested with data submitted to the Washington State Department of Agriculture by pesticide applicators from a small geographic area. These data were entered into the Department's relational data-base system and uploaded into the system's ARC/INFO files. Locations for pesticide applications are assigned within the Public Land Survey System grids, and ARC/INFO programs in the Pesticide-Application Data-Base Management System can subdivide each survey section into sixteen idealized quarter-quarter sections for display map grids. The system provides data retrieval and geographic information system plotting capabilities from a menu of seven basic retrieval options. Additionally, ARC/INFO coverages can be created from the retrieved data when required for particular applications. The Pesticide-Application Data-Base Management System, or the general principles used in the system, could be adapted to other applications or to other states.
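
    The subdivision scheme is easy to make concrete: each Public Land Survey System section is quartered, and each quarter is quartered again, giving the sixteen idealized quarter-quarter cells used for the display map grids. The Python sketch below generates the labels; the label format is illustrative, not the system's actual ARC/INFO output.

        # Each section splits into 4 quarters, each quarter into 4
        # quarter-quarters: 4 x 4 = 16 idealized cells per section.
        QUARTERS = ["NW", "NE", "SW", "SE"]

        def quarter_quarters(section):
            """Return the 16 quarter-quarter labels for a survey section."""
            return [f"{qq} of {q}, Sec. {section}"
                    for q in QUARTERS for qq in QUARTERS]

        cells = quarter_quarters(12)
        print(len(cells))   # 16
        print(cells[:2])    # ['NW of NW, Sec. 12', 'NE of NW, Sec. 12']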

  15. MaizeGDB: New tools and resource

    USDA-ARS?s Scientific Manuscript database

    MaizeGDB, the USDA-ARS genetics and genomics database, is a highly curated, community-oriented informatics service to researchers focused on the crop plant and model organism Zea mays. MaizeGDB facilitates maize research by curating, integrating, and maintaining a database that serves as the central...

  16. Does integrated care reduce hospital activity for patients with chronic diseases? An umbrella review of systematic reviews

    PubMed Central

    Damery, Sarah; Flanagan, Sarah; Combes, Gill

    2016-01-01

    Objective To summarise the evidence regarding the effectiveness of integrated care interventions in reducing hospital activity. Design Umbrella review of systematic reviews and meta-analyses. Setting Interventions must have delivered care crossing the boundary between at least two health and/or social care settings. Participants Adult patients with one or more chronic diseases. Data sources MEDLINE, Embase, ASSIA, PsycINFO, HMIC, CINAHL, Cochrane Library (HTA database, DARE, Cochrane Database of Systematic Reviews), EPPI-Centre, TRIP, HEED, manual screening of references. Outcome measures Any measure of hospital admission or readmission, length of stay (LoS), accident and emergency use, healthcare costs. Results 50 reviews were included. Interventions focused on case management (n=8), chronic care model (CCM) (n=9), discharge management (n=15), complex interventions (n=3), multidisciplinary teams (MDT) (n=10) and self-management (n=5). 29 reviews reported statistically significant improvements in at least one outcome. 11/21 reviews reported significantly reduced emergency admissions (15–50%); 11/24 showed significant reductions in all-cause (10–30%) or condition-specific (15–50%) readmissions; 9/16 reported LoS reductions of 1–7 days and 4/9 showed significantly lower A&E use (30–40%). 10/25 reviews reported significant cost reductions but provided little robust evidence. Effective interventions included discharge management with postdischarge support, MDT care with teams that include condition-specific expertise, specialist nurses and/or pharmacists and self-management as an adjunct to broader interventions. Interventions were most effective when targeting single conditions such as heart failure, and when care was provided in patients’ homes. Conclusions Although all outcomes showed some significant reductions, and a number of potentially effective interventions were found, interventions rarely demonstrated unequivocally positive effects. Despite the centrality of integrated care to current policy, questions remain about whether the magnitude of potentially achievable gains is enough to satisfy national targets for reductions in hospital activity. Trial registration number CRD42015016458. PMID:27872113

  17. Systematic Review of Integrated Medical and Psychiatric Self-Management Interventions for Adults with Serious Mental Illness

    PubMed Central

    Whiteman, Karen L.; Naslund, John A.; DiNapoli, Elizabeth A.; Bruce, Martha L.; Bartels, Stephen J.

    2016-01-01

    Objective Adults with serious mental illness are disproportionately affected by medical comorbidity, earlier onset of disease, and premature mortality. Integrated self-management interventions have been developed to address both medical and psychiatric illnesses. This systematic review aimed to: review the evidence of the effect of self-management interventions targeting both medical and psychiatric illnesses and evaluate the potential for implementation. Methods Databases including CINAHL, Cochrane Central, Ovid Medline, PsycINFO, and Web of Science were searched for articles published between 1946 and July 2015. Studies evaluating integrated medical and psychiatric self-management interventions for adults with schizophrenia spectrum or mood disorders and medical comorbidity were included. Results Fifteen studies reported on nine interventions (i.e., nine randomized controlled trials, six pre/post designs). Most studies demonstrated feasibility, acceptability, and preliminary effectiveness; however, clinical effectiveness could not be established in most of the studies due to methodological limitations. Factors identified that may deter implementation included operating costs, impractical length of the intervention, and the workforce needs of these interventions. Conclusions Integrated medical and psychiatric illness self-management interventions appear feasible and acceptable, with high potential for clinical effectiveness. However, implementation considerations were rarely considered in intervention development, contributing to limited uptake and reach in real-world settings. PMID:27301767

  18. Integrated technologies for solid waste bin monitoring system.

    PubMed

    Arebey, Maher; Hannan, M A; Basri, Hassan; Begum, R A; Abdullah, Huda

    2011-06-01

    Communication technologies such as radio frequency identification (RFID), the global positioning system (GPS), general packet radio service (GPRS), and a geographic information system (GIS) are integrated with a camera to construct a solid waste bin monitoring system. The aim is to improve the way customer inquiries and emergency cases are handled and to estimate the amount of solid waste without any involvement of the truck driver. The proposed system consists of an RFID tag mounted on each bin, an RFID reader in the truck, GPRS/GSM as the link to the web server, and GIS acting as map server, database server, and control server. The tracking devices mounted in the trucks collect location information in real time via GPS. This information is transferred continuously through GPRS to a central database. Users are able to view the current location of each truck during the collection stage via a web-based application and thereby manage the fleet. Truck positions and trash bin information are displayed on a digital map, which is made available by a map server. Thus, both the bins and the trucks are monitored using the developed system.
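
    A toy sketch of the data flow: each GPRS message carries a truck's GPS fix and, when present, the RFID of the bin just serviced, and the central database keeps the latest state per truck for the web map. All names and the record layout below are hypothetical.

        import datetime

        # Hypothetical central store keyed by truck; in the real system the
        # GPRS link delivers these records continuously to the central database.
        positions = {}

        def ingest(truck_id, lat, lon, bin_rfid=None):
            """Record the latest GPS fix for a truck and, when an RFID read
            accompanies it, note the bin-collection event."""
            positions[truck_id] = {
                "lat": lat, "lon": lon,
                "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
                "last_bin": bin_rfid,
            }

        ingest("truck-07", 2.9264, 101.6964, bin_rfid="BIN-00412")
        print(positions["truck-07"])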

  19. Investigation of the environmental impacts of municipal wastewater treatment plants through a Life Cycle Assessment software tool.

    PubMed

    De Feo, G; Ferrara, C

    2017-08-01

    This paper investigates the total and per capita environmental impacts of municipal wastewater treatment as a function of population equivalent (PE), using a Life Cycle Assessment (LCA) approach with the processes of the Ecoinvent 2.2 database available in the software tool SimaPro v.7.3. Besides the wastewater treatment plant (WWTP), the study also considers the sewerage system. The results confirm that there is a 'scale factor' for wastewater collection and treatment in environmental terms, in addition to the well-known scale factor in terms of management costs: the larger the treatment plant, the lower the per capita environmental impacts. However, the Ecoinvent 2.2 database does not contain information about treatment systems with a capacity lower than 30 PE. Nevertheless, there are many sparsely populated areas worldwide where a single centralized WWTP is not practical. It would therefore be very important to conduct an LCA study comparing alternative on-site small-scale systems with treatment capacities of a few PE.

  20. A multi-user real time inventorying system for radioactive materials: a networking approach.

    PubMed

    Mehta, S; Bandyopadhyay, D; Hoory, S

    1998-01-01

    A computerized system for radioisotope management and real-time inventory coordinated across a large organization is reported. It handles hundreds of individual users and their separate inventory records. Use of highly efficient computer network and database technologies makes it possible to accept, maintain, and furnish all records related to receipt, usage, and disposal of radioactive materials for the users separately and collectively. The system's central processor is an HP-9000/800 G60 RISC server, and users from across the organization use their personal computers to log in to this server using the TCP/IP networking protocol, which makes distributed use of the system possible. Radioisotope decay is automatically calculated by the program, so that up-to-date radioisotope inventory data for an entire institution are available immediately. The system is specifically designed to allow use by large numbers of users (about 300) and accommodates high volumes of data input and retrieval without compromising simplicity and accuracy. Overall, it is an example of a true multi-user, on-line, relational database information system that makes the functioning of a radiation safety department efficient.
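
    The automatic decay correction rests on the standard decay law A = A0 * 2**(-t / t_half). A minimal sketch follows; the isotope values shown are illustrative, not taken from the paper.

        def current_activity(a0_mbq, half_life_days, elapsed_days):
            """Decay law A = A0 * 2**(-t / t_half), used to keep an
            institution-wide inventory current without manual recalculation."""
            return a0_mbq * 2 ** (-elapsed_days / half_life_days)

        # e.g. a 100 MBq P-32 shipment (half-life ~14.3 d) after 30 days:
        print(round(current_activity(100.0, 14.3, 30), 1))  # ~23.4 MBq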

  1. Biological Databases for Behavioral Neurobiology

    PubMed Central

    Baker, Erich J.

    2014-01-01

    Databases are, at their core, abstractions of data and their intentionally derived relationships. They serve as a central organizing metaphor and repository, supporting or augmenting nearly all bioinformatics. Behavioral domains provide a unique stage for contemporary databases, as research in this area spans diverse data types, locations, and data relationships. This chapter provides foundational information on the diversity and prevalence of databases and on how data structures support the various needs of behavioral neuroscience analysis and interpretation. The focus is on the classes of databases, data curation, and advanced applications in bioinformatics, using examples largely drawn from research efforts in behavioral neuroscience. PMID:23195119

  2. Informatics in radiology: use of CouchDB for document-based storage of DICOM objects.

    PubMed

    Rascovsky, Simón J; Delgado, Jorge A; Sanz, Alexander; Calvo, Víctor D; Castrillón, Gabriel

    2012-01-01

    Picture archiving and communication systems traditionally have depended on schema-based Structured Query Language (SQL) databases for imaging data management. To optimize database size and performance, many such systems store a reduced set of Digital Imaging and Communications in Medicine (DICOM) metadata, discarding informational content that might be needed in the future. As an alternative to traditional database systems, document-based key-value stores recently have gained popularity. These systems store documents containing key-value pairs that facilitate data searches without predefined schemas. Document-based key-value stores are especially suited to archive DICOM objects because DICOM metadata are highly heterogeneous collections of tag-value pairs conveying specific information about imaging modalities, acquisition protocols, and vendor-supported postprocessing options. The authors used an open-source document-based database management system (Apache CouchDB) to create and test two such databases; CouchDB was selected for its overall ease of use, capability for managing attachments, and reliance on HTTP and Representational State Transfer standards for accessing and retrieving data. A large database was created first in which the DICOM metadata from 5880 anonymized magnetic resonance imaging studies (1,949,753 images) were loaded by using a Ruby script. To provide the usual DICOM query functionality, several predefined "views" (standard queries) were created by using JavaScript. For performance comparison, the same queries were executed in both the CouchDB database and a SQL-based DICOM archive. The capabilities of CouchDB for attachment management and database replication were separately assessed in tests of a similar, smaller database. Results showed that CouchDB allowed efficient storage and interrogation of all DICOM objects; with the use of information retrieval algorithms such as map-reduce, all the DICOM metadata stored in the large database were searchable with only a minimal increase in retrieval time over that with the traditional database management system. Results also indicated possible uses for document-based databases in data mining applications such as dose monitoring, quality assurance, and protocol optimization. RSNA, 2012
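
    A map view of the kind the authors describe can be defined over CouchDB's HTTP API from any language. The sketch below uses Python with the requests library against a hypothetical local CouchDB database named dicom; the view indexes documents by their Modality tag, and querying with a key returns matching studies. The design-document name and emitted value are assumptions for illustration.

        import requests  # assumes a CouchDB instance at localhost:5984 with a 'dicom' database

        DB = "http://localhost:5984/dicom"

        # A design document holding a JavaScript map function; CouchDB indexes
        # every document that carries a Modality tag.
        design = {
            "views": {
                "by_modality": {
                    "map": "function(doc) { if (doc.Modality) emit(doc.Modality, doc.StudyInstanceUID); }"
                }
            }
        }
        requests.put(DB + "/_design/queries", json=design)

        # All MR studies (view keys are JSON-encoded, hence the quoted string):
        r = requests.get(DB + "/_design/queries/_view/by_modality",
                         params={"key": '"MR"'})
        print(r.json()["rows"][:3])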

  3. Kingfisher: a system for remote sensing image database management

    NASA Astrophysics Data System (ADS)

    Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.

    2003-04-01

    At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing number of images collected by the ground stations of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases, with an innovative search tool based on image similarity. This methodology is quite innovative for this application; many systems exist for photographic images, for example QBIC and IKONA, but they are not able to extract and properly describe remote sensing image content. The target database is an archive of images originated from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without loss independently of image resolution. The latter property allows the Database Management System (DBMS) to process a small amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of image dimensions (width and height). During the retrieval phase, local and global content descriptors are compared with the query image, and the results seem very encouraging.
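
    Similarity retrieval over texture descriptors reduces, at its simplest, to nearest-neighbour search in feature space. The sketch below is a toy illustration with invented three-component feature vectors, not the paper's actual descriptors.

        import math

        # Hypothetical archive: each image is reduced to a texture feature
        # vector extracted from its quick-look, so matching stays cheap.
        archive = {
            "xsar_0012": [0.41, 0.18, 0.77],
            "xsar_0345": [0.39, 0.21, 0.74],
            "xsar_0890": [0.91, 0.65, 0.12],
        }

        def retrieve(query_vec, k=2):
            """Rank archived images by Euclidean distance to the query features."""
            return sorted(archive,
                          key=lambda name: math.dist(query_vec, archive[name]))[:k]

        print(retrieve([0.40, 0.20, 0.75]))  # ['xsar_0345', 'xsar_0012']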

  4. The ID Database: Managing the Instructional Development Process

    ERIC Educational Resources Information Center

    Piña, Anthony A.; Sanford, Barry K.

    2017-01-01

    Management is evolving as a foundational domain to the field of instructional design and technology. However, there are few tools dedicated to the management of instructional design and development projects and activities. In this article, we describe the development, features and implementation of an instructional design database--built from a…

  5. Expansion of the MANAGE database with forest and drainage studies

    USDA-ARS?s Scientific Manuscript database

    The “Measured Annual Nutrient loads from AGricultural Environments” (MANAGE) database was published in 2006 to expand an early 1980’s compilation of nutrient export (load) data from agricultural land uses at the field or farm spatial scale. Then in 2008, MANAGE was updated with 15 additional studie...

  6. 76 FR 12617 - Airworthiness Directives; The Boeing Company Model 777-200 and -300 Series Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-08

    ... installing new operational software for the electrical load management system and configuration database... the electrical load management system operational software and configuration database software, in... Management, P.O. Box 3707, MC 2H-65, Seattle, Washington 98124-2207; telephone 206- 544-5000, extension 1...

  7. Controlled Substance Reconciliation Accuracy Improvement Using Near Real-Time Drug Transaction Capture from Automated Dispensing Cabinets.

    PubMed

    Epstein, Richard H; Dexter, Franklin; Gratch, David M; Perino, Michael; Magrann, Jerry

    2016-06-01

    Accurate accounting of controlled drug transactions by inpatient hospital pharmacies is a requirement in the United States under the Controlled Substances Act. At many hospitals, manual distribution of controlled substances from pharmacies is being replaced by automated dispensing cabinets (ADCs) at the point of care. Despite the promise of improved accountability, a high prevalence (15%) of controlled substance discrepancies between ADC records and anesthesia information management systems (AIMS) has been published, with a similar incidence (15.8%; 95% confidence interval [CI], 15.3% to 16.2%) noted at our institution. Most reconciliation errors are clerical. In this study, we describe a method to capture drug transactions in near real-time from our ADCs, compare them with documentation in our AIMS, and evaluate subsequent improvement in reconciliation accuracy. ADC controlled substance transactions are transmitted to a hospital interface server, parsed, reformatted, and sent to a software script written in Perl. The script extracts the data and writes them to a SQL Server database. Concurrently, controlled drug totals for each patient receiving care are documented in the AIMS and compared with the balance of the ADC transactions (i.e., vending, transferring, wasting, and returning drug). Every minute, a reconciliation report is available to anesthesia providers over the hospital Intranet from AIMS workstations. The report lists all patients, the current provider, the balance of ADC transactions, the totals from the AIMS, the difference, and whether the case is still ongoing or has concluded. Accuracy and latency of the ADC transaction capture process were assessed via simulation and by comparison with pharmacy database records, maintained by the vendor on a central server located remotely from the hospital network. For assessment of reconciliation accuracy over time, data were collected from our AIMS from January 2012 to June 2013 (Baseline), July 2013 to April 2014 (Next Day Reports), and May 2014 to September 2015 (Near Real-Time Reports) and reconciled against pharmacy records from the central pharmacy database maintained by the vendor. Control chart (batch means) methods were used between successive epochs to determine whether improvement had taken place. During simulation, 100% of 10,000 messages, transmitted at a rate of 1295 per minute, were accurately captured and inserted into the database. Latency (transmission time to local database insertion time) was 46.3 ± 0.44 milliseconds (SEM). During acceptance testing, only 1 of 1384 transactions analyzed had a difference between the near real-time process and what was in the central database; this was for a "John Doe" patient whose name had been changed subsequent to data capture. Once a transaction was entered at the ADC workstation, 84.9% (n = 18 bins; 95% CI, 78.4% to 91.3%) of these transactions were available in the database on the AIMS server within 2 minutes. Within 5 minutes, 98.2% (n = 18 bins; 95% CI, 97.2% to 99.3%) were available. Among 145,642 transactions present in the central pharmacy database, only 24 were missing from the local database table (mean = 0.018%; 95% CI, 0.002% to 0.034%). Implementation of near real-time reporting improved the controlled substance reconciliation error rate compared to the previous Next Day Reports epoch, from 8.8% to 5.2% (difference = -3.6%; 95% CI, -4.3% to -2.8%; P < 10). Errors were distributed among staff, with 50% of discrepancies accounted for by 12.4% of providers and 80% accounted for by 28.5% of providers executing transactions during the Near Real-Time Reports epoch. The near real-time system for the capture of transactional data flowing over the hospital network was highly accurate and reliable, and exhibited acceptable latency. Other facilities can use this methodology to implement similar data capture for transactions from their ADCs. Reconciliation accuracy improved significantly as a result of implementation. Our approach may be of particular utility at facilities with limited pharmacy resources to audit anesthesia records for controlled substance administration and reconcile them against dispensing records.
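
    The reconciliation logic itself is simple once transactions are captured: per patient and drug, the signed sum of ADC transactions (vend minus return, waste, and transfer) must equal the AIMS-documented total. A minimal sketch with hypothetical records and units:

        # Hypothetical per-patient reconciliation, mirroring the report the
        # system assembles each minute.
        adc_transactions = [
            {"patient": "A", "kind": "vend",  "drug": "fentanyl", "mcg": 250},
            {"patient": "A", "kind": "waste", "drug": "fentanyl", "mcg": 50},
            {"patient": "B", "kind": "vend",  "drug": "fentanyl", "mcg": 100},
        ]
        aims_totals = {("A", "fentanyl"): 200, ("B", "fentanyl"): 50}

        SIGN = {"vend": 1, "return": -1, "waste": -1, "transfer": -1}

        balances = {}
        for t in adc_transactions:
            key = (t["patient"], t["drug"])
            balances[key] = balances.get(key, 0) + SIGN[t["kind"]] * t["mcg"]

        for key, bal in balances.items():
            documented = aims_totals.get(key, 0)
            status = "OK" if bal == documented else f"DISCREPANCY ({bal - documented} mcg)"
            print(key, status)  # A: OK; B: DISCREPANCY (50 mcg)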

  8. Transcriptome Analysis and Differential Gene Expression on the Testis of Orange Mud Crab, Scylla olivacea, during Sexual Maturation

    PubMed Central

    Waiho, Khor; Fazhan, Hanafiah; Shahreza, Md Sheriff; Moh, Julia Hwei Zhong; Noorbaiduri, Shaibani; Wong, Li Lian; Sinnasamy, Saranya

    2017-01-01

    Adequate genetic information is essential for sustainable crustacean fisheries and aquaculture management. The commercially important orange mud crab, Scylla olivacea, is prevalent in the Southeast Asian region and is highly sought after. Although it is a suitable aquaculture candidate, full domestication of this species is hampered by the lack of knowledge about the sexual maturation process and the molecular mechanisms behind it, especially in males. To date, no whole-genome data have been reported for S. olivacea. The transcriptome data published previously on this species focus primarily on females and on the role of the central nervous system in reproductive development. De novo transcriptome sequencing of the testes of S. olivacea from immature, maturing, and mature stages was performed. A total of approximately 144 million high-quality reads were generated and de novo assembled into 160,569 transcripts with a total length of 142.2 Mb. Approximately 15–23% of the total assembled transcripts were annotated when compared against public protein sequence databases (i.e., the UniProt, InterPro, Pfam, and Drosophila melanogaster protein databases) and categorised with Gene Ontology (GO) terms. A total of 156,181 high-quality single-nucleotide polymorphisms (SNPs) were mined from the transcriptome data of the present study. Transcriptome comparison among the testes of different maturation stages revealed one gene (a beta crystallin-like gene) with the most significant differential expression: up-regulated in the immature stage and down-regulated in the maturing and mature stages. This was further validated by qRT-PCR. In conclusion, a comprehensive transcriptome of the testis of the orange mud crab across different maturation stages was obtained. This report provides an invaluable resource for enhancing our understanding of this species' genome structure and biology, as expressed and controlled by the gonads. PMID:28135340

  9. The ALS patient care database: goals, design, and early results. ALS C.A.R.E. Study Group.

    PubMed

    Miller, R G; Anderson, F A; Bradley, W G; Brooks, B R; Mitsumoto, H; Munsat, T L; Ringel, S P

    2000-01-11

    The ALS Patient Care Database was created to improve the quality of care for patients with ALS by 1) providing neurologists with data to evaluate and improve their practices, 2) publishing data on temporal trends in the care of patients with ALS, and 3) developing hypotheses to be tested during formal clinical trials. Substantial variations exist in managing ALS, but there has been no North American database to measure outcomes in ALS until now. This observational database is open to all neurologists practicing in North America, who are encouraged to enroll both incident and prevalent ALS patients. Longitudinal data are collected at intervals of 3 to 6 months by using standard data collection instruments. Forms are submitted to a central data coordinating center, which mails quarterly reports to participating neurologists. From September 1996 through November 30, 1998, 1,857 patients were enrolled at 83 clinical sites. On enrollment, patients had a mean age of 58.6 +/- 12.9 (SD) years (range, 20.1 to 95.1 years), 92% were white, and 61% were men. The mean interval between onset of symptoms and diagnosis was 1.2 +/- 1.6 years (range, 0 to 31.9 years). Riluzole was the most frequently used disease-specific therapy (48%). Physical therapy was the most common nonpharmacologic intervention (45%). The primary caregiver was generally the spouse (77%). Advance directives were in place at the time of death for 70% of the 213 enrolled patients who were reported to have died. The ALS Patient Care Database appears to provide valuable data on physician practices and patient-focused outcomes in ALS.

  10. Database of the Geologic Map of North America - Adapted from the Map by J.C. Reed, Jr. and others (2005)

    USGS Publications Warehouse

    Garrity, Christopher P.; Soller, David R.

    2009-01-01

    The Geological Society of America's (GSA) Geologic Map of North America (Reed and others, 2005; 1:5,000,000) shows the geology of a significantly large area of the Earth, centered on North and Central America and including the submarine geology of parts of the Atlantic and Pacific Oceans. This map is now converted to a Geographic Information System (GIS) database that contains all geologic and base-map information shown on the two printed map sheets and the accompanying explanation sheet. We anticipate this map database will be revised at some unspecified time in the future, likely through the actions of a steering committee managed by the Geological Society of America (GSA) and staffed by scientists from agencies including, but not limited to, those responsible for the original map compilation (U.S. Geological Survey, Geological Survey of Canada, and Woods Hole Oceanographic Institute). Regarding the use of this product, as noted by the map's compilers: 'The Geologic Map of North America is an essential educational tool for teaching the geology of North America to university students and for the continuing education of professional geologists in North America and elsewhere. In addition, simplified maps derived from the Geologic Map of North America are useful for enlightening younger students and the general public about the geology of the continent.' With publication of this database, the preparation of any type of simplified map is made significantly easier. More important perhaps, the database provides a more accessible means to explore the map information and to compare and analyze it in conjunction with other types of information (for example, land use, soils, biology) to better understand the complex interrelations among factors that affect Earth resources, hazards, ecosystems, and climate.

  11. Pacific walrus coastal haulout database, 1852-2016— Background report

    USGS Publications Warehouse

    Fischbach, Anthony S.; Kochnev, Anatoly A.; Garlich-Miller, Joel L.; Jay, Chadwick V.

    2016-01-01

    Walruses are large benthic predators that rest out of water between foraging bouts. Coastal “haulouts” (places where walruses rest) are formed by adult males in summer and sometimes by females and young when sea ice is absent, and are often used repeatedly across seasons and years. Understanding the geography and historical use of haulouts provides a context for conservation efforts. We summarize information on Pacific walrus haulouts from available reports (n =151), interviews with coastal residents and aviators, and personal observations of the authors. We provide this in the form of a georeferenced database that can be queried and displayed with standard geographic information system and database management software. The database contains 150 records of Pacific walrus haulouts, with a summary of basic characteristics on maximum haulout aggregation size, age-sex composition, season of use, and decade of most recent use. Citations to reports are provided in the appendix and as a bibliographic database. Haulouts were distributed across the coasts of the Pacific walrus range; however, the largest (maximum >10,000 walruses) of the haulouts reported in the recent 4 decades (n=19) were concentrated on the Russian shores in regions near the Bering Strait and northward into the western Chukchi Sea (n=17). Haulouts of adult female and young walruses primarily occurred in the Bering Strait region and areas northward, with others occurring in the central Bering Sea, Gulf of Anadyr, and Saint Lawrence Island regions. The Gulf of Anadyr was the only region to contain female and young walrus haulouts, which formed after the northward spring migration and prior to autumn ice formation.

  12. The Astrobiology Habitable Environments Database (AHED)

    NASA Astrophysics Data System (ADS)

    Lafuente, B.; Stone, N.; Downs, R. T.; Blake, D. F.; Bristow, T.; Fonda, M.; Pires, A.

    2015-12-01

    The Astrobiology Habitable Environments Database (AHED) is a central, high-quality, long-term searchable repository for archiving and collaborative sharing of astrobiologically relevant data, including morphological, textural, and contextual images and chemical, biochemical, isotopic, sequencing, and mineralogical information. The aim of AHED is to foster long-term innovative research by supporting integration and analysis of diverse datasets in order to: 1) help understand and interpret planetary geology; 2) identify and characterize habitable environments and pre-biotic/biotic processes; 3) interpret returned data from present and past missions; 4) provide a citable database of NASA-funded published and unpublished data (after an agreed-upon embargo period). AHED uses the online open-source software "The Open Data Repository's Data Publisher" (ODR - http://www.opendatarepository.org) [1], which provides a user-friendly interface that research teams or individual scientists can use to design, populate, and manage their own database according to the characteristics of their data and the need to share data with collaborators or the broader scientific community. The platform can also be used as a laboratory notebook. The database will have the capability to import and export in a variety of standard formats. Advanced graphics will be implemented, including 3D graphing, multi-axis graphs, error bars, and similar scientific data functions, together with advanced online tools for data analysis (e.g., the statistical package R). A permissions system will be put in place so that data being actively collected and interpreted remain proprietary. A citation system will allow research data to be used and appropriately referenced by other researchers after the data are made public. This project is supported by the Science-Enabling Research Activity (SERA) and NASA NNX11AP82A, Mars Science Laboratory Investigations. [1] Nate et al. (2015) AGU, submitted.

  13. Database for content of mercury in Polish brown coal

    NASA Astrophysics Data System (ADS)

    Jastrząb, Krzysztof

    2018-01-01

    Poland is rated among the countries with the largest mercury emissions in Europe. According to information provided by the National Centre for Balancing and Management of Emissions (KOBiZE), more than 10.5 tons of mercury and its compounds were emitted into the atmosphere from the area of Poland in 2015. Within the scope of the BazaHg project, which ran from 2014 to 2015 and was co-financed by the National Centre of Research and Development (NCBiR), a database was set up specifying the mercury content of Polish hard steam coal, coking coal, and brown coal (lignite) grades. For domestic brown coal, the database comprises information on coal grades from the 'Bełchatów', 'Adamów', 'Turów', and 'Sieniawa' brown coal mines. Currently the database contains 130 records of brown coal parameters, where each record covers a technical analysis (content of moisture, ash, and volatile matter), an elemental analysis (CHNS), the content of chlorine and mercury, and the net calorific value and heat of combustion. The mercury content of the brown coal samples under test ranged from 44 to 985 μg Hg/kg, with an average of 345 μg Hg/kg. The established database is a reliable and trustworthy source of information about the mercury content of Polish fossil fuels. These details, combined with information about coal consumption by individual power stations and multiplied by appropriate emission coefficients, may serve as the basis for estimating the mercury loads emitted into the atmosphere by individual stations and by the entire power sector in total. It will also enable Polish central organizations and individual business entities to implement a reasonable policy with respect to mercury emissions into the atmosphere.
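
    The emission estimate described in the closing sentences is a straightforward multiplication: coal burned, times mercury content, times an emission coefficient. The sketch below uses the database's average lignite mercury content with otherwise invented numbers, purely to show the unit handling.

        # Illustrative only: emitted mercury ~ coal burned x mercury content
        # x emission factor. Numbers other than the 345 ug/kg average are
        # hypothetical, not from the database.
        coal_burned_t = 2_000_000          # tonnes of lignite per year
        hg_content_ug_per_kg = 345         # average from the BazaHg records
        emission_factor = 0.6              # hypothetical share reaching the stack

        hg_kg_per_year = (coal_burned_t * 1000            # kg of coal
                          * hg_content_ug_per_kg * 1e-9   # kg Hg per kg coal
                          * emission_factor)
        print(round(hg_kg_per_year, 1), "kg Hg/year")     # 414.0 kg Hg/year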

  14. Solving Relational Database Problems with ORDBMS in an Advanced Database Course

    ERIC Educational Resources Information Center

    Wang, Ming

    2011-01-01

    This paper introduces how to use the object-relational database management system (ORDBMS) to solve relational database (RDB) problems in an advanced database course. The purpose of the paper is to provide a guideline for database instructors who desire to incorporate the ORDB technology in their traditional database courses. The paper presents…

  15. Migration from relational to NoSQL database

    NASA Astrophysics Data System (ADS)

    Ghotiya, Sunita; Mandal, Juhi; Kandasamy, Saravanakumar

    2017-11-01

    Data generated by real-time applications, social networking sites, and sensor devices is huge in volume and largely unstructured, which makes it difficult for relational database management systems to handle. Data is a precious component of any application and needs to be analysed after being arranged in some structure. Relational databases can only deal with structured data, so there is a need for NoSQL database management systems that can also deal with semi-structured data. Relational databases provide the easiest way to manage data, but as the use of NoSQL grows it is becoming necessary to migrate data from relational to NoSQL databases. Various frameworks have been proposed previously that provide mechanisms for migrating data stored in SQL warehouses, as well as middle-layer solutions that allow data lacking structure to be stored in NoSQL databases. This paper provides a literature review of some recent approaches proposed by various researchers to migrate data from relational to NoSQL databases. Some researchers have proposed mechanisms for the co-existence of NoSQL and relational databases. The paper summarises mechanisms for mapping data stored in relational databases to NoSQL databases, along with techniques for data transformation and middle-layer solutions.
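
    The core of most SQL-to-NoSQL migration mechanisms is denormalization: rows linked by foreign keys become one document with embedded sub-records. A minimal sketch with hypothetical tables, not drawn from any specific framework in the review:

        # A common mapping when migrating: a one-to-many relation
        # (customers -> orders) becomes a single document with an embedded array.
        customers = [{"id": 1, "name": "Asha"}]
        orders = [
            {"id": 10, "customer_id": 1, "total": 42.0},
            {"id": 11, "customer_id": 1, "total": 17.5},
        ]

        def to_documents(customers, orders):
            """Denormalize relational rows into NoSQL-style documents."""
            docs = []
            for c in customers:
                docs.append({
                    "_id": f"customer:{c['id']}",
                    "name": c["name"],
                    "orders": [{"id": o["id"], "total": o["total"]}
                               for o in orders if o["customer_id"] == c["id"]],
                })
            return docs

        print(to_documents(customers, orders))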

  16. Plant Genome Resources at the National Center for Biotechnology Information

    PubMed Central

    Wheeler, David L.; Smith-White, Brian; Chetvernin, Vyacheslav; Resenchuk, Sergei; Dombrowski, Susan M.; Pechous, Steven W.; Tatusova, Tatiana; Ostell, James

    2005-01-01

    The National Center for Biotechnology Information (NCBI) integrates data from more than 20 biological databases through a flexible search and retrieval system called Entrez. A core Entrez database, Entrez Nucleotide, includes GenBank and is tightly linked to the NCBI Taxonomy database, the Entrez Protein database, and the scientific literature in PubMed. A suite of more specialized databases for genomes, genes, gene families, gene expression, gene variation, and protein domains dovetails with the core databases to make Entrez a powerful system for genomic research. Linked to the full range of Entrez databases is the NCBI Map Viewer, which displays aligned genetic, physical, and sequence maps for eukaryotic genomes including those of many plants. A specialized plant query page allows maps from all plant genomes covered by the Map Viewer to be searched in tandem to produce a display of aligned maps from several species. PlantBLAST searches against the sequences shown in the Map Viewer allow BLAST alignments to be viewed within a genomic context. In addition, precomputed sequence similarities, such as those for proteins offered by BLAST Link, enable fluid navigation from unannotated to annotated sequences, quickening the pace of discovery. NCBI Web pages for plants, such as Plant Genome Central, complete the system by providing centralized access to NCBI's genomic resources as well as links to organism-specific Web pages beyond NCBI. PMID:16010002
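
    Entrez can be queried programmatically through NCBI's public E-utilities interface, which postdates parts of this description. A minimal sketch in Python using the requests library; the search term is arbitrary and chosen only for illustration.

        import requests

        # Search Entrez Nucleotide via the standard public E-utilities endpoint.
        r = requests.get(
            "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
            params={"db": "nucleotide", "term": "Zea mays[Organism]",
                    "retmax": 5, "retmode": "json"},
        )
        print(r.json()["esearchresult"]["idlist"])  # first five matching record IDs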

  17. The Finnish disease heritage database (FinDis) update-a database for the genes mutated in the Finnish disease heritage brought to the next-generation sequencing era.

    PubMed

    Polvi, Anne; Linturi, Henna; Varilo, Teppo; Anttonen, Anna-Kaisa; Byrne, Myles; Fokkema, Ivo F A C; Almusa, Henrikki; Metzidis, Anthony; Avela, Kristiina; Aula, Pertti; Kestilä, Marjo; Muilu, Juha

    2013-11-01

    The Finnish Disease Heritage Database (FinDis) (http://findis.org) was originally published in 2004 as a centralized information resource for rare monogenic diseases enriched in the Finnish population. The FinDis database originally contained 405 causative variants for 30 diseases. At the time, the FinDis database was a comprehensive collection of data, but since 2004, a large amount of new information has emerged, making the necessity to update the database evident. We collected information and updated the database to contain genes and causative variants for 35 diseases, including six more genes and more than 1,400 additional disease-causing variants. Information for causative variants for each gene is collected under the LOVD 3.0 platform, enabling easy updating. The FinDis portal provides a centralized resource and user interface to link information on each disease and gene with variant data in the LOVD 3.0 platform. The software written to achieve this has been open-sourced and made available on GitHub (http://github.com/findis-db), allowing biomedical institutions in other countries to present their national data in a similar way, and to both contribute to, and benefit from, standardized variation data. The updated FinDis portal provides a unique resource to assist patient diagnosis, research, and the development of new cures. © 2013 WILEY PERIODICALS, INC.

  18. National Library of Medicine

    MedlinePlus

    ... Disasters and Public Health Emergencies The NLM Disaster Information Management Research Center has tools, guides, and databases to ...

  19. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Dezaki, Kyoko; Saeki, Makoto

    Rapid progress in information technology has increased the need to strengthen documentation activities in industry. In response, Tokin Corporation has been constructing databases of patent information, technical reports, and other documents accumulated inside the company. Two systems have resulted: TOPICS, an in-house patent information management system, and TOMATIS, a management and technical information system built with personal computers and general-purpose relational database software. These systems aim to compile databases of patent and technical management information generated internally and externally, at low labor cost, and to provide comprehensive information company-wide. This paper outlines these systems and how they are used in practice.

  20. Innovative Use of Existing Public and Private Data Sources for Postmarketing Surveillance of Central Line-Associated Bloodstream Infections Associated With Intravenous Needleless Connectors

    PubMed Central

    Tabak, Ying P.; Johannes, Richard S.; Sun, Xiaowu; Crosby, Cynthia T.

    2016-01-01

    The Centers for Medicare and Medicaid Services (CMS) Hospital Compare central line-associated bloodstream infection (CLABSI) data and private databases containing new-generation intravenous needleless connector (study NC) use at the hospital level were linked. The relative risk (RR) of CLABSI associated with the study NCs was estimated, adjusting for hospital characteristics. Among 3074 eligible hospitals in the 2013 CMS database, 758 (25%) hospitals used the study NCs. The study NC hospitals had a lower unadjusted CLABSI rate (1.03 vs 1.13 CLABSIs per 1000 central line days, P < .0001) compared with comparator hospitals. The adjusted RR for CLABSI was 0.94 (95% confidence interval: 0.86, 1.02; P = .11). PMID:27598072
