Sample records for technology assessment database

  1. Potential use of routine databases in health technology assessment.

    PubMed

    Raftery, J; Roderick, P; Stevens, A

    2005-05-01

    To develop criteria for classifying databases in relation to their potential use in health technology (HT) assessment and to apply them to a list of databases of relevance in the UK. To explore the extent to which prioritized databases could pick up those HTs being assessed by the National Coordinating Centre for Health Technology Assessment (NCCHTA) and the extent to which these databases have been used in HT assessment. To explore the validation of the databases and their cost. Electronic databases. Key literature sources. Experienced users of routine databases. A 'first principles' examination of the data necessary for each type of HT assessment was carried out, supplemented by literature searches and a historical review. The principal investigators applied the criteria to the databases. Comments of the 'keepers' of the prioritized databases were incorporated. Details of 161 topics funded by the NHS R&D Health Technology Assessment (HTA) programme were reviewed iteratively by the principal investigators. Uses of databases in HTAs were identified by literature searches, which included the title of each prioritized database as a keyword. Annual reports of databases were examined and 'keepers' queried. The validity of each database was assessed using criteria based on a literature search and involvement by the authors in a national academic network. The costs of databases were established from annual reports, enquiries to 'keepers' of databases and 'guesstimates' based on cost per record. For assessing effectiveness, equity and diffusion, routine databases were classified into three broad groups: (1) group I databases, identifying both HTs and health states, (2) group II databases, identifying the HTs, but not a health state, and (3) group III databases, identifying health states, but not an HT. Group I datasets were disaggregated into clinical registries, clinical administrative databases and population-oriented databases. 
Group III was disaggregated into adverse event reporting, confidential enquiries, disease-only registers and health surveys. Databases in group I can be used not only to assess effectiveness but also to assess diffusion and equity. Databases in group II can only assess diffusion. Group III has restricted scope for assessing HTs, except for analysis of adverse events. For use in costing, databases need to include unit costs or prices. Some databases included unit cost as well as a specific HT. A list of around 270 databases was identified at the level of the UK, England and Wales or England (over 1000 including Scotland, Wales and Northern Ireland). Allocation of these to the above groups identified around 60 databases with some potential for HT assessment, roughly half to group I. Eighteen clinical registers were identified as having the greatest potential, although the clinical administrative datasets had potential mainly owing to their inclusion of a wide range of technologies. Only two databases were identified that could directly be used in costing. The review of the potential capture of HTs prioritized by the UK's NHS R&D HTA programme showed that only 10% would be captured in these databases, mainly drugs prescribed in primary care. The review of the use of routine databases in any form of HT assessment indicated that clinical registers were mainly used for national comparative audit. Some databases have only been used in annual reports, usually time trend analysis. A few peer-reviewed papers used a clinical register to assess the effectiveness of a technology. Accessibility is suggested as a barrier to using most databases. Clinical administrative databases (group Ib) have mainly been used to build population needs indices and performance indicators. A review of the validity of used databases showed that although internal consistency checks were common, relatively few had any form of external audit. 
Some comparative audit databases have data scrutinised by participating units. Issues around coverage and coding have, in general, received little attention. NHS funding of databases has been mainly for 'Central Returns' for management purposes, which excludes those databases with the greatest potential for HT assessment. Funding for databases varied, but some are unfunded, relying on goodwill. The total cost of databases in group I plus selected databases from groups II and III was estimated at £50 million, or around 0.1% of annual NHS spend. A few databases with limited potential for HT assessment account for the bulk of spending. Suggestions for policy include clarification of responsibility for the strategic development of databases, improved resourcing, and issues around coding, confidentiality, ownership and access, maintenance of clinical support, optimal use of information technology, filling gaps and remedying deficiencies. Recommendations for researchers include closer policy links between routine data and R&D, and selective investment in the more promising databases. Recommended research topics include optimal capture and coding of the range of HTs, international comparisons of the role, funding and use of routine data in healthcare systems, and use of routine databases in trials and in modelling. Independent evaluations are recommended for information strategies (such as those around the National Service Frameworks and various collaborations) and for electronic patient and health records.
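    The classification at the heart of the review above reduces to two questions: does the database identify the health technology (HT), and does it identify the health state? A minimal sketch of that grouping logic (function and label names are illustrative, not from the review):

```python
# Group I: identifies both HT and health state (effectiveness, diffusion, equity).
# Group II: identifies the HT only (diffusion only).
# Group III: identifies the health state only (restricted scope, e.g. adverse events).

def classify_database(identifies_ht: bool, identifies_health_state: bool) -> str:
    """Return the review's broad group label for a routine database."""
    if identifies_ht and identifies_health_state:
        return "I"
    if identifies_ht:
        return "II"
    if identifies_health_state:
        return "III"
    return "unclassified"

# e.g. a clinical registry records both the intervention and the condition
print(classify_database(True, True))   # "I"
print(classify_database(True, False))  # "II"
```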

  2. Designing Corporate Databases to Support Technology Innovation

    ERIC Educational Resources Information Center

    Gultz, Michael Jarett

    2012-01-01

    Based on a review of the existing literature on database design, this study proposed a unified database model to support corporate technology innovation. This study assessed potential support for the model based on the opinions of 200 technology industry executives, including Chief Information Officers, Chief Knowledge Officers and Chief Learning…

  3. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research.

    PubMed

    Saokaew, Surasak; Sugimoto, Takashi; Kamae, Isao; Pratoomsoot, Chayanin; Chaiyakunapruk, Nathorn

    2015-01-01

    Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, which has seen a surge in use in Western countries. Although HTA agencies have been established in the Asia-Pacific region, application and understanding of healthcare databases for HTA remain rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand, where HTA has been used officially, and Japan, where HTA is about to be officially introduced. Existing healthcare databases in Thailand and Japan were compiled and reviewed. Database characteristics, e.g. name, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables, were described. Databases were assessed for their potential HTA use in terms of safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. The request route for each database was also provided. Forty databases (20 from Thailand and 20 from Japan) were included. These comprised national censuses, surveys, registries, administrative data, and claims databases. All databases could potentially be used for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited, since information about the databases was not available in public sources. Our findings show that existing databases provide valuable information for HTA research, with limitations on accessibility. Mutual dialogue on healthcare database development and usage for HTA within the Asia-Pacific region is needed.

  4. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research

    PubMed Central

    Saokaew, Surasak; Sugimoto, Takashi; Kamae, Isao; Pratoomsoot, Chayanin; Chaiyakunapruk, Nathorn

    2015-01-01

    Background Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, which has seen a surge in use in Western countries. Although HTA agencies have been established in the Asia-Pacific region, application and understanding of healthcare databases for HTA remain rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand, where HTA has been used officially, and Japan, where HTA is about to be officially introduced. Method Existing healthcare databases in Thailand and Japan were compiled and reviewed. Database characteristics, e.g. name, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables, were described. Databases were assessed for their potential HTA use in terms of safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. The request route for each database was also provided. Results Forty databases (20 from Thailand and 20 from Japan) were included. These comprised national censuses, surveys, registries, administrative data, and claims databases. All databases could potentially be used for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited, since information about the databases was not available in public sources. Conclusion Our findings show that existing databases provide valuable information for HTA research, with limitations on accessibility. Mutual dialogue on healthcare database development and usage for HTA within the Asia-Pacific region is needed. PMID:26560127

  5. Design and implementation of website information disclosure assessment system.

    PubMed

    Cho, Ying-Chiang; Pan, Jen-Yi

    2015-01-01

    Internet application technologies, such as cloud computing and cloud storage, have increasingly changed people's lives. Websites contain vast amounts of personal privacy information. In order to protect this information, network security technologies, such as database protection and data encryption, attract many researchers. The most serious problems concerning web vulnerability are e-mail address and network database leakages. These leakages have many causes. For example, malicious users can steal database contents, taking advantage of mistakes made by programmers and administrators. In order to mitigate this type of abuse, a website information disclosure assessment system is proposed in this study. This system utilizes a series of technologies, such as web crawler algorithms, SQL injection attack detection, and web vulnerability mining, to assess a website's information disclosure. Thirty websites, randomly sampled from the top 50 world colleges, were used to collect leakage information. This testing showed the importance of increasing the security and privacy of website information for academic websites.
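    One step of the assessment described above, scanning crawled page content for exposed e-mail addresses, can be sketched as follows. This is a hypothetical illustration rather than the authors' implementation; the real system also performs crawling, SQL injection attack detection, and web vulnerability mining.

```python
import re

# A simple pattern for e-mail addresses embedded in page text or HTML.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def find_leaked_emails(pages: dict[str, str]) -> dict[str, set[str]]:
    """Map each URL to the set of e-mail addresses exposed on that page.

    `pages` holds already-fetched page content keyed by URL; a crawler
    would populate it in a full system.
    """
    leaks = {}
    for url, html in pages.items():
        found = set(EMAIL_RE.findall(html))
        if found:
            leaks[url] = found
    return leaks

# Hypothetical crawled page from an academic site
pages = {"https://example.edu/staff": "<li>Contact: jane.doe@example.edu</li>"}
print(find_leaked_emails(pages))
```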

  6. New perspectives in toxicological information management, and the role of ISSTOX databases in assessing chemical mutagenicity and carcinogenicity.

    PubMed

    Benigni, Romualdo; Battistelli, Chiara Laura; Bossa, Cecilia; Tcheremenskaia, Olga; Crettaz, Pierre

    2013-07-01

    Currently, the public has access to a variety of databases containing mutagenicity and carcinogenicity data. These resources are crucial for toxicologists and regulators involved in the risk assessment of chemicals, which necessitates access to all the relevant literature and the capability to search across toxicity databases using both biological and chemical criteria. Towards the larger goal of screening chemicals for a wide range of toxicity end points of potential interest, publicly available resources across a large spectrum of biological and chemical data space must be effectively harnessed with current and evolving information technologies (i.e. systematised, integrated and mined), if long-term screening and prediction objectives are to be achieved. A key to rapid progress in the field of chemical toxicity databases is combining information technology with the chemical structure as the identifier of molecules. This permits an enormous range of operations (e.g. retrieving chemicals or chemical classes, describing the content of databases, finding similar chemicals, crossing biological and chemical interrogations, etc.) that more classical databases cannot support. This article describes the progress in the technology of toxicity databases, including the concepts of Chemical Relational Database and Toxicological Standardized Controlled Vocabularies (Ontology). It then describes the ISSTOX cluster of toxicological databases at the Istituto Superiore di Sanità. It consists of freely available databases characterised by the use of modern information technologies and by curation of the quality of the biological data. Finally, this article provides examples of analyses and results made possible by ISSTOX.

  7. Solar Sail Propulsion Technology Readiness Level Database

    NASA Technical Reports Server (NTRS)

    Adams, Charles L.

    2004-01-01

    The NASA In-Space Propulsion Technology (ISPT) Projects Office has been sponsoring two solar sail system design and development hardware demonstration activities over the past 20 months. Able Engineering Company (AEC) of Goleta, CA is leading one team and L Garde, Inc. of Tustin, CA is leading the other team. Component, subsystem and system fabrication and testing have been completed successfully. The goal of these activities is to advance the technology readiness level (TRL) of solar sail propulsion from 3 towards 6 by 2006. These activities will culminate in the deployment and testing of 20-meter solar sail system ground demonstration hardware in the 30-meter-diameter thermal-vacuum chamber at NASA Glenn Plum Brook in 2005. This paper will describe the features of a computer database system that documents the results of the solar sail development activities to date. Illustrations of the hardware components and systems, test results, analytical models, relevant space environment definition and current TRL assessment, as stored and manipulated within the database, are presented. This database could serve as a central repository for all data related to the advancement of solar sail technology sponsored by the ISPT, providing an up-to-date assessment of the TRL of this technology. Current plans are to eventually make the database available to the Solar Sail community through the Space Transportation Information Network (STIN).

  8. Combining new technologies for effective collection development: a bibliometric study using CD-ROM and a database management program.

    PubMed Central

    Burnham, J F; Shearer, B S; Wall, J C

    1992-01-01

    Librarians have used bibliometrics for many years to assess collections and to provide data for making selection and deselection decisions. With the advent of new technology--specifically, CD-ROM databases and reprint file database management programs--new cost-effective procedures can be developed. This paper describes a recent multidisciplinary study conducted by two library faculty members and one allied health faculty member to test a bibliometric method that used the MEDLINE and CINAHL databases on CD-ROM and the Papyrus database management program to produce a new collection development methodology. PMID:1600424

  9. Examining the Factors That Contribute to Successful Database Application Implementation Using the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Nworji, Alexander O.

    2013-01-01

    Most organizations spend millions of dollars due to the impact of improperly implemented database application systems as evidenced by poor data quality problems. The purpose of this quantitative study was to use, and extend, the technology acceptance model (TAM) to assess the impact of information quality and technical quality factors on database…

  10. Design and Implementation of Website Information Disclosure Assessment System

    PubMed Central

    Cho, Ying-Chiang; Pan, Jen-Yi

    2015-01-01

    Internet application technologies, such as cloud computing and cloud storage, have increasingly changed people’s lives. Websites contain vast amounts of personal privacy information. In order to protect this information, network security technologies, such as database protection and data encryption, attract many researchers. The most serious problems concerning web vulnerability are e-mail address and network database leakages. These leakages have many causes. For example, malicious users can steal database contents, taking advantage of mistakes made by programmers and administrators. In order to mitigate this type of abuse, a website information disclosure assessment system is proposed in this study. This system utilizes a series of technologies, such as web crawler algorithms, SQL injection attack detection, and web vulnerability mining, to assess a website’s information disclosure. Thirty websites, randomly sampled from the top 50 world colleges, were used to collect leakage information. This testing showed the importance of increasing the security and privacy of website information for academic websites. PMID:25768434

  11. A novel approach: chemical relational databases, and the role of the ISSCAN database on assessing chemical carcinogenicity.

    PubMed

    Benigni, Romualdo; Bossa, Cecilia; Richard, Ann M; Yang, Chihae

    2008-01-01

    Mutagenicity and carcinogenicity databases are crucial resources for toxicologists and regulators involved in chemicals risk assessment. Until recently, existing public toxicity databases have been constructed primarily as "look-up-tables" of existing data, and most often did not contain chemical structures. Concepts and technologies originated from the structure-activity relationships science have provided powerful tools to create new types of databases, where the effective linkage of chemical toxicity with chemical structure can facilitate and greatly enhance data gathering and hypothesis generation, by permitting: a) exploration across both chemical and biological domains; and b) structure-searchability through the data. This paper reviews the main public databases, together with the progress in the field of chemical relational databases, and presents the ISSCAN database on experimental chemical carcinogens.

  12. Soil Organic Carbon for Global Benefits - assessing potential SOC increase under SLM technologies worldwide and evaluating tradeoffs and gains of upscaling SLM technologies

    NASA Astrophysics Data System (ADS)

    Wolfgramm, Bettina; Hurni, Hans; Liniger, Hanspeter; Ruppen, Sebastian; Milne, Eleanor; Bader, Hans-Peter; Scheidegger, Ruth; Amare, Tadele; Yitaferu, Birru; Nazarmavloev, Farrukh; Conder, Malgorzata; Ebneter, Laura; Qadamov, Aslam; Shokirov, Qobiljon; Hergarten, Christian; Schwilch, Gudrun

    2013-04-01

    There is a fundamental mutual interest between enhancing soil organic carbon (SOC) in the world's soils and the objectives of the major global environmental conventions (UNFCCC, UNCBD, UNCCD). While there is evidence at the case study level that sustainable land management (SLM) technologies increase SOC stocks and SOC-related benefits, no quantitative data are available on the potential for increasing SOC benefits from different SLM technologies, especially from case studies in developing countries, and a clear understanding of the trade-offs related to SLM up-scaling is missing. This study aims at assessing the potential increase of SOC under SLM technologies worldwide, and at evaluating tradeoffs and gains of up-scaling SLM for case studies in Tajikistan, Ethiopia and Switzerland. It makes use of the SLM technologies documented in the online database of the World Overview of Conservation Approaches and Technologies (WOCAT). The study consists of three components: 1) identifying SOC benefits contributing to the major global environmental issues for SLM technologies worldwide, as documented in the WOCAT global database; 2) validation of SOC storage potentials and SOC benefit predictions for SLM technologies from the WOCAT database, using results from existing comparative case studies at the plot level, soil spectral libraries and standardized documentation of ecosystem services from the WOCAT database; 3) understanding trade-offs and win-win scenarios of up-scaling SLM technologies from the plot to the household and landscape level using material flow analysis. This study builds on the premise that the most promising way to increase benefits from land management is to consider already existing sustainable strategies. Such SLM technologies from all over the world are documented and accessible in a standardized way in the WOCAT online database. 
The study thus evaluates SLM technologies from the WOCAT database by calculating the potential SOC storage increase and related benefits, comparing SOC estimates before and after establishment of the SLM technology. These results are validated using comparative case studies of plots with and without SLM technologies (existing SLM systems versus surrounding, degrading systems). In view of upscaling SLM technologies, it is crucial to understand the tradeoffs and gains supporting or hindering their further spread. Systemic biomass management analysis using material flow analysis allows quantifying organic carbon flows and storages for different land management options at the household as well as the landscape level. The study delivers results relevant for science, policy and practice in accounting, monitoring and evaluating SOC-related ecosystem services: a comprehensive methodology for SLM impact assessments allowing quantification of SOC storage and SOC-related benefits under different SLM technologies, and an improved understanding of upscaling options for SLM technologies, as well as of tradeoffs and win-win opportunities for biomass management, SOC content increase, and ecosystem services improvement at the plot and household level.
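    The before-and-after comparison described above amounts to a simple difference of SOC stock estimates for a plot. A minimal sketch, with hypothetical values and field names:

```python
def soc_increase(soc_before_t_ha: float, soc_after_t_ha: float) -> float:
    """Potential SOC storage increase, in tonnes of carbon per hectare,
    estimated as the difference between SOC stocks after and before
    establishment of an SLM technology."""
    return soc_after_t_ha - soc_before_t_ha

# e.g. a hypothetical terracing technology raising SOC from 42 to 55 t C/ha
print(soc_increase(42.0, 55.0))  # 13.0
```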

  13. Impact of Technology on the University of Miami.

    ERIC Educational Resources Information Center

    Little, Robert O.; Temares, M. Lewis

    As part of a long-range information systems planning effort at the University of Miami, the impact of technology on the organization was assessed. The assessment covered hardware, office automation, systems and database software, and communications. The trends in computer hardware point toward continued decreasing size and cost, placing computer…

  14. The human role in space (THURIS) applications study. Final briefing

    NASA Technical Reports Server (NTRS)

    Maybee, George W.

    1987-01-01

    The THURIS (The Human Role in Space) application is an iterative process involving successive assessments of man/machine mixes in terms of performance, cost and technology to arrive at an optimum man/machine mode for the mission application. The process begins with user inputs which define the mission in terms of an event sequence and performance time requirements. The desired initial operational capability date is also an input requirement. THURIS terms and definitions (e.g., generic activities) are applied to the input data converting it into a form which can be analyzed using the THURIS cost model outputs. The cost model produces tabular and graphical outputs for determining the relative cost-effectiveness of a given man/machine mode and generic activity. A technology database is provided to enable assessment of support equipment availability for selected man/machine modes. If technology gaps exist for an application, the database contains information supportive of further investigation into the relevant technologies. The present study concentrated on testing and enhancing the THURIS cost model and subordinate data files and developing a technology database which interfaces directly with the user via technology readiness displays. This effort has resulted in a more powerful, easy-to-use applications system for optimization of man/machine roles. Volume 1 is an executive summary.

  15. Finding Qualitative Research Evidence for Health Technology Assessment.

    PubMed

    DeJean, Deirdre; Giacomini, Mita; Simeonov, Dorina; Smith, Andrea

    2016-08-01

    Health technology assessment (HTA) agencies increasingly use reviews of qualitative research as evidence for evaluating social, experiential, and ethical aspects of health technologies. We systematically searched three bibliographic databases (MEDLINE, CINAHL, and Social Science Citation Index [SSCI]) using published search filters or "hedges" and our hybrid filter to identify qualitative research studies pertaining to chronic obstructive pulmonary disease and early breast cancer. The search filters were compared in terms of sensitivity, specificity, and precision. Our screening by title and abstract revealed that qualitative research constituted only slightly more than 1% of all published research on each health topic. The performance of the published search filters varied greatly across topics and databases. Compared with existing search filters, our hybrid filter demonstrated a consistently high sensitivity across databases and topics, and minimized the resource-intensive process of sifting through false positives. We identify opportunities for qualitative health researchers to improve the uptake of qualitative research into evidence-informed policy making. © The Author(s) 2016.
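    The filter comparison above rests on standard retrieval metrics computed against hand-screened gold labels: sensitivity is the share of relevant (qualitative) records a filter retrieves, specificity the share of irrelevant records it excludes, and precision the share of retrieved records that are relevant. A minimal sketch of those definitions; the counts are hypothetical, not the study's data:

```python
def filter_metrics(tp: int, fp: int, fn: int, tn: int) -> dict[str, float]:
    """Retrieval metrics for a search filter, from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # qualitative studies retrieved
        "specificity": tn / (tn + fp),  # non-qualitative records excluded
        "precision": tp / (tp + fp),    # fewer false positives to sift through
    }

# hypothetical screening counts for one filter on one topic and database
print(filter_metrics(tp=45, fp=155, fn=5, tn=1795))
```

    A filter with high sensitivity but low precision forces reviewers to sift many false positives, which is the resource burden the hybrid filter aims to minimize.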

  16. Demand Response Advanced Controls Framework and Assessment of Enabling Technology Costs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potter, Jennifer; Cappers, Peter

    The Demand Response Advanced Controls Framework and Assessment of Enabling Technology Costs research describes a variety of DR opportunities and the various bulk power system services they can provide. The bulk power system services are mapped to a generalized taxonomy of DR “service types”, which allows us to discuss DR opportunities and bulk power system services in fewer yet broader categories that share similar technological requirements, which mainly drive DR enablement costs. The research presents a framework for the costs to automate DR and provides descriptions of the various elements that drive enablement costs. The report introduces the various DR enabling technologies and end-uses, identifies the various services that each can provide to the grid and provides the cost assessment for each enabling technology. In addition to a report, this research includes a Demand Response Advanced Controls Database and User Manual. They are intended to provide users with the data that underlies this research and instructions for how to use that database more effectively and efficiently.

  17. Stealth Aircraft Technology. (Latest Citations from the Aerospace Database)

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The bibliography contains citations concerning the design, manufacture, and history of aircraft incorporating stealth technology. Citations focus on construction materials, testing, aircraft performance, and technology assessment. Coverage includes fighter aircraft, bombers, missiles, and helicopters. (Contains 50-250 citations and includes a subject term index and title list.)

  18. Using a Technology-Based Case to Aid in Improving Assessment

    ERIC Educational Resources Information Center

    Zelin, Robert C., II

    2008-01-01

    This paper describes how a technology-based case using Microsoft Access can aid in the assessment process. A case was used in lieu of giving a final examination in an Accounting Information Systems course. Students worked in small groups to design a database-driven payroll system for a hypothetical company. Each group submitted its results along…

  19. Hybrid Wing Body Aircraft System Noise Assessment with Propulsion Airframe Aeroacoustic Experiments

    NASA Technical Reports Server (NTRS)

    Thomas, Russell H.; Burley, Casey L.; Olson, Erik D.

    2010-01-01

    A system noise assessment of a hybrid wing body configuration was performed using NASA's best available aircraft models, engine model, and system noise assessment method. A propulsion airframe aeroacoustic effects experimental database for key noise sources and interaction effects was used to provide data directly in the noise assessment where prediction methods are inadequate. NASA engine and aircraft system models were created to define the hybrid wing body aircraft concept as a twin engine aircraft with a 7500 nautical mile mission. The engines were modeled as existing technology high bypass ratio turbofans. The baseline hybrid wing body aircraft was assessed at 22 dB cumulative below the FAA Stage 4 certification level. To determine the potential for noise reduction with relatively near term technologies, seven other configurations were assessed beginning with moving the engines two fan nozzle diameters upstream of the trailing edge and then adding technologies for reduction of the highest noise sources. Aft radiated noise was expected to be the most challenging to reduce and, therefore, the experimental database focused on jet nozzle and pylon configurations that could reduce jet noise through a combination of source reduction and shielding effectiveness. The best configuration for reduction of jet noise used state-of-the-art technology chevrons with a pylon above the engine in the crown position. This configuration resulted in jet source noise reduction, favorable azimuthal directivity, and noise source relocation upstream where it is more effectively shielded by the limited airframe surface, and additional fan noise attenuation from acoustic liner on the crown pylon internal surfaces. Vertical and elevon surfaces were also assessed to add shielding area. The elevon deflection above the trailing edge showed some small additional noise reduction whereas vertical surfaces resulted in a slight noise increase. 
With the effects of the configurations from the database included, the best available noise reduction was 40 dB cumulative. Projected effects from additional technologies were assessed for an advanced noise reduction configuration including landing gear fairings and advanced pylon and chevron nozzles. Incorporating the three additional technology improvements, the aircraft noise is projected at 42.4 dB cumulative below the Stage 4 level.

  20. Technology-Enhanced Formative Assessment in Mathematics for English Language Learners

    ERIC Educational Resources Information Center

    Lekwa, Adam Jens

    2012-01-01

    This paper reports the results of a descriptive study on the use of a technology-enhanced formative assessment system called Accelerated Math (AM) for ELLs and their native-English-speaking (NES) peers. It was comprised of analyses of an extant database of 18,549 students, including 2,057 ELLs, from grades 1 through 8 across 30 U.S. states. These…

  1. The use of intelligent database systems in acute pancreatitis--a systematic review.

    PubMed

    van den Heever, Marc; Mittal, Anubhav; Haydock, Matthew; Windsor, John

    2014-01-01

    Acute pancreatitis (AP) is a complex disease with multiple aetiological factors, wide-ranging severity, and multiple challenges to effective triage and management. Databases, data mining and machine learning algorithms (MLAs), including artificial neural networks (ANNs), may assist by storing and interpreting data from multiple sources, potentially improving clinical decision-making. The aims were to: 1) identify database technologies used to store AP data, 2) collate and categorise variables stored in AP databases, 3) identify the MLA technologies, including ANNs, used to analyse AP data, and 4) identify clinical and non-clinical benefits and obstacles in establishing a national or international AP database. A comprehensive systematic search of online reference databases was performed. The predetermined inclusion criteria were all papers discussing 1) databases, 2) data mining or 3) MLAs pertaining to AP; papers were independently assessed by two reviewers, with conflicts resolved by a third author. Forty-three papers were included. Three data mining technologies and five ANN methodologies were reported in the literature. A total of 187 collected variables were identified. ANNs increase the accuracy of severity prediction; one study showed ANNs had a sensitivity of 0.89 and a specificity of 0.96 six hours after admission, compared with 0.80 and 0.85, respectively, for APACHE II (cutoff score ≥8). Problems with databases were incomplete data, missing clinical data, and diagnostic reliability. This is the first systematic review examining the use of databases, MLAs and ANNs in the management of AP. The clinical benefits these technologies have over current systems, and other advantages to adopting them, are identified. Copyright © 2013 IAP and EPC. Published by Elsevier B.V. All rights reserved.
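    The APACHE II comparison above rests on a score cutoff and the usual sensitivity/specificity definitions. A minimal sketch of that evaluation, using hypothetical patient data rather than the study's; it is not the reviewed ANN itself, only the scoring rule it was compared against:

```python
def apache_predictions(scores, cutoff=8):
    """Predict severe AP when the APACHE II score meets the cutoff (>= 8)."""
    return [s >= cutoff for s in scores]

def sens_spec(predicted, actual):
    """Sensitivity and specificity of binary predictions against outcomes."""
    tp = sum(p and a for p, a in zip(predicted, actual))
    tn = sum(not p and not a for p, a in zip(predicted, actual))
    fn = sum(not p and a for p, a in zip(predicted, actual))
    fp = sum(p and not a for p, a in zip(predicted, actual))
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical APACHE II scores and observed severe/mild outcomes
scores = [12, 9, 3, 7, 15, 2, 8, 4]
severe = [True, True, False, True, True, False, False, False]
print(sens_spec(apache_predictions(scores), severe))  # (0.75, 0.75)
```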

  2. Surgical research using national databases

    PubMed Central

    Leland, Hyuma; Heckmann, Nathanael

    2016-01-01

    Recent changes in healthcare and advances in technology have increased the use of large-volume national databases in surgical research. These databases have been used to develop perioperative risk stratification tools, assess postoperative complications, calculate costs, and investigate numerous other topics across multiple surgical specialties. The results of these studies contain variable information but are subject to unique limitations. The use of large-volume national databases is increasing in popularity, and thorough understanding of these databases will allow for a more sophisticated and better educated interpretation of studies that utilize such databases. This review will highlight the composition, strengths, and weaknesses of commonly used national databases in surgical research. PMID:27867945

  3. Surgical research using national databases.

    PubMed

    Alluri, Ram K; Leland, Hyuma; Heckmann, Nathanael

    2016-10-01

    Recent changes in healthcare and advances in technology have increased the use of large-volume national databases in surgical research. These databases have been used to develop perioperative risk stratification tools, assess postoperative complications, calculate costs, and investigate numerous other topics across multiple surgical specialties. The results of these studies contain variable information but are subject to unique limitations. The use of large-volume national databases is increasing in popularity, and thorough understanding of these databases will allow for a more sophisticated and better educated interpretation of studies that utilize such databases. This review will highlight the composition, strengths, and weaknesses of commonly used national databases in surgical research.

  4. NASA Aerospace Flight Battery Systems Program Update

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle; ODonnell, Patricia

    1997-01-01

    The objectives of NASA's Aerospace Flight Battery Systems Program are to: develop, maintain and provide tools for the validation and assessment of aerospace battery technologies; accelerate the readiness of technology advances and provide infusion paths for emerging technologies; provide NASA projects with the required database and validation guidelines for technology selection of hardware and processes relating to aerospace batteries; disseminate validation and assessment tools, quality assurance, reliability, and availability information to the NASA and aerospace battery communities; and ensure that safe, reliable batteries are available for NASA's future missions.

  5. Health technology assessment in Iran: challenges and views

    PubMed Central

    Olyaeemanesh, Alireza; Doaee, Shila; Mobinizadeh, Mohammadreza; Nedjati, Mina; Aboee, Parisa; Emami-Razavi, Seyed Hassan

    2014-01-01

    Background: Various decisions are made on technology application at all levels of the health system in different countries around the world. Health technology assessment is considered one of the best scientific tools at the service of policy-makers. This study investigates the current challenges of Iran's health technology assessment and proposes appropriate strategies to establish and institutionalize the program. Methods: This study was carried out in two independent phases. In the first phase, electronic databases such as Medline (via PubMed) and the Scientific Information Database (SID) were searched to compile a list of challenges facing Iran's health technology assessment. In the second phase, the views and opinions of experts and practitioners on HTA challenges were collected through a questionnaire and analyzed using SPSS software, version 16. This was an observational, analytical study with thematic analysis. Results: In the first phase, seven papers were retrieved, from which the researchers extracted twenty-two HTA challenges in Iran; these formed the basis of the structured questionnaire used in the second phase. The experts' views on the challenges of health technology assessment were categorized as follows: organizational culture, stewardship, stakeholders, health system management, infrastructures and external pressures, each mentioned in more than 60% of cases. Conclusion: The identification and prioritization of HTA challenges, as approved by the experts involved in strategic planning at the Department of Health Technology Assessment, will be a step forward in promoting evidence-based policy-making and producing comprehensive scientific evidence. PMID:25695015

  6. BUILDING A DATABASE FOR LIFE CYCLE PERFORMANCE ASSESSMENT OF TRENCHLESS TECHNOLOGIES - abstract

    EPA Science Inventory

    Trenchless pipe rehabilitation has steadily increased over the past 40 years and represents an increasing proportion of the annual expenditure on the nation’s water infrastructure. Despite the massive public investment in these technologies, there has been little quantitative ev...

  7. Alberta Carpenter | NREL

    Science.gov Websites

    Life cycle assessment in industrial by-product management, waste management, biofuels and manufacturing technologies; life cycle inventory database management. Research interests: life cycle assessment, life cycle inventory management, biofuels, advanced manufacturing, supply chain analysis. Education: Ph.D. in environmental

  8. TOWARDS A CORE DATA SET FOR LANDSCAPE ASSESSMENTS

    EPA Science Inventory

    One of the primary goals of the NATO Committee on Challenges to Modern Society (CCMS) Landscape Pilot Study is to further develop, apply, and share landscape assessment technologies and spatial databases among participating countries, with the ultimate aim of sustaining environme...

  9. Image database for digital hand atlas

    NASA Astrophysics Data System (ADS)

    Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente; Dey, Partha S.; Gertych, Arkadiusz; Pospiech-Kurkowska, Sywia

    2003-05-01

    Bone age assessment is a procedure frequently performed in pediatric patients to evaluate growth disorders. A commonly used method is atlas matching: visual comparison of a hand radiograph with a small reference set of images from the old Greulich-Pyle atlas. We have developed a new digital hand atlas with a large set of clinically normal hand images from diverse ethnic groups. In this paper, we present our system design and implementation of the digital atlas database to support computer-aided atlas matching for bone age assessment. The system consists of a hand atlas image database, a computer-aided diagnostic (CAD) software module for image processing and atlas matching, and a Web user interface. Users can use a Web browser to push DICOM images, directly or indirectly from PACS, to the CAD server for a bone age assessment. Quantitative features reflecting skeletal maturity are extracted from the examined image and compared with patterns from the atlas image database to assess the bone age. The digital atlas method, built on a large image database and current Internet technology, provides a quantitative, accurate and cost-effective alternative that can supplement or replace the traditional approach.
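    Matching extracted features against atlas patterns, as described above, can be sketched as a nearest-neighbour lookup. The feature values and ages below are invented for illustration; the real CAD module extracts quantitative skeletal-maturity measures:

```python
import math

# Hypothetical atlas entries: (bone age in years, feature vector).
atlas = [
    (6.0,  [0.41, 0.12, 0.55]),
    (8.0,  [0.52, 0.19, 0.61]),
    (10.0, [0.63, 0.27, 0.70]),
]

def assess_bone_age(features, atlas):
    """Return the bone age of the atlas pattern whose feature
    vector is closest (Euclidean distance) to the examined image's."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    age, _ = min(atlas, key=lambda entry: dist(entry[1], features))
    return age

print(assess_bone_age([0.50, 0.20, 0.60], atlas))  # 8.0
```

    A database-backed system would store one such pattern per normal reference image, stratified by sex and ethnic group, rather than three hard-coded entries.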

  10. Criminal genomic pragmatism: prisoners' representations of DNA technology and biosecurity.

    PubMed

    Machado, Helena; Silva, Susana

    2012-01-01

    Within the context of the use of DNA technology in crime investigation, biosecurity is perceived by different stakeholders according to their particular rationalities and interests. Very little is known about prisoners' perceptions and assessments of the uses of DNA technology in solving crime. To propose a conceptual model that serves to analyse and interpret prisoners' representations of DNA technology and biosecurity. A qualitative study using an interpretative approach based on 31 semi-structured tape-recorded interviews was carried out between May and September 2009, involving male inmates in three prisons located in the north of Portugal. The content analysis focused on the following topics: the meanings attributed to DNA and assessments of the risks and benefits of the uses of DNA technology and databasing in forensic applications. DNA was described as a record of identity, an exceptional material, and a powerful biometric identifier. The interviewees believed that DNA can be planted to incriminate suspects. Convicted offenders argued for the need to extend the criteria for the inclusion of DNA profiles in forensic databases and to restrict the removal of profiles. The conceptual model entitled criminal genomic pragmatism allows for an understanding of the views of prison inmates regarding DNA technology and biosecurity.

  11. A Support Database System for Integrated System Health Management (ISHM)

    NASA Technical Reports Server (NTRS)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. 
The system hierarchy model replicates the physical relationships between system elements to provide the logical context for the database. The historical data archive provides a common repository for sensor data that can be shared between developers and applications. The firmware codebase is used by the developer to organize the intelligent element firmware into atomic units which can be assembled into complete firmware for specific elements.
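    The three-component structure described (hierarchy model, historical data archive, firmware codebase) could be sketched in relational terms as below; the table and column names are assumptions for illustration, not the actual HADS schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- System hierarchy model: each element references its parent,
-- replicating the physical relationships between system elements.
CREATE TABLE element (
    id        INTEGER PRIMARY KEY,
    name      TEXT NOT NULL,
    parent_id INTEGER REFERENCES element(id)
);

-- Historical data archive: shared repository of sensor readings.
CREATE TABLE sensor_reading (
    element_id INTEGER REFERENCES element(id),
    ts         TEXT,
    value      REAL
);

-- Firmware codebase: atomic firmware units per element.
CREATE TABLE firmware_unit (
    id         INTEGER PRIMARY KEY,
    element_id INTEGER REFERENCES element(id),
    version    TEXT
);
""")

conn.execute("INSERT INTO element VALUES (1, 'test_stand', NULL)")
conn.execute("INSERT INTO element VALUES (2, 'methane_thruster', 1)")
conn.execute("INSERT INTO sensor_reading VALUES (2, '2007-01-01T00:00:00', 101.3)")
row = conn.execute("""
    SELECT e.name, r.value FROM sensor_reading r
    JOIN element e ON e.id = r.element_id
""").fetchone()
print(row)  # ('methane_thruster', 101.3)
```

    The self-referencing `parent_id` column is what lets applications walk the hierarchy to give sensor data its logical context.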

  12. Building a Database for Life Cycle Performance Assessment of Water and Wastewater Rehabilitation Technologies

    EPA Science Inventory

    The deployment of trenchless pipe rehabilitation technologies has steadily increased over the past 30 to 40 years and continues to represent a growing proportion of the approximately $25 billion annual expenditure on operations and maintenance of the nation’s water and wastewater infr...

  13. About the Cancer Biomarkers Research Group | Division of Cancer Prevention

    Cancer.gov

    The Cancer Biomarkers Research Group promotes research to identify, develop, and validate biological markers for early cancer detection and cancer risk assessment. Activities include development and validation of promising cancer biomarkers, collaborative databases and informatics systems, and new technologies or the refinement of existing technologies. NCI DCP News Note

  14. National Database Structure for Life Cycle Performance Assessment of Water and Wastewater Rehabilitation Technologies (Retrospective Evaluation)

    EPA Science Inventory

    This report builds upon a previous pilot study to document the in-service performance of trenchless pipe rehabilitation techniques. The use of pipe rehabilitation and trenchless pipe replacement technologies has increased over the past 30 to 40 years and represents an increasing...

  15. The application of various digital subscriber line (xDSL) technologies to ITS : traffic video field assessments

    DOT National Transportation Integrated Search

    1999-06-01

    This paper is an addendum to an earlier report (see PATH Database record no. 19339) which gives an overview of xDSL technologies. This supplement documents the field testing of an xDSL-based traffic video prototype that was built during laboratory st...

  16. Building a Database for Life Cycle Performance Assessment of Water and Wastewater Rehabilitation Technologies - abstract

    EPA Science Inventory

    Pipe rehabilitation and trenchless pipe replacement technologies have seen a steady increase in use over the past 30 to 40 years and represent an increasing proportion of the approximately $25 billion annual expenditure on operations and maintenance of the nation’s water and wast...

  17. The Digital Workforce: Update, August 2000 [and] The Digital Work Force: State Data & Rankings, September 2000.

    ERIC Educational Resources Information Center

    Sargent, John

    The Office of Technology Policy analyzed Bureau of Labor Statistics' growth projections for the core occupational classifications of IT (information technology) workers to assess future demand in the United States. Classifications studied were computer engineers, systems analysts, computer programmers, database administrators, computer support…

  18. Open source database of images DEIMOS: extension for large-scale subjective image quality assessment

    NASA Astrophysics Data System (ADS)

    Vítek, Stanislav

    2014-09-01

    DEIMOS (Database of Images: Open Source) is an open-source database of images and video sequences for testing, verification and comparison of various image and/or video processing techniques such as compression, reconstruction and enhancement. This paper deals with an extension of the database that allows large-scale web-based subjective image quality assessment. The extension implements both an administrative and a client interface. The proposed system is aimed mainly at mobile communication devices and takes advantage of HTML5 technology, meaning that participants do not need to install any application: assessment can be performed in a web browser. The assessment campaign administrator can select images from the large database and then apply rules defined by various test procedure recommendations. The standard test procedures may be fully customized and saved as templates. Alternatively, the administrator can define a custom test using images from the pool and other components, such as evaluation forms and ongoing questionnaires. The image sequence is delivered to the online client, e.g. a smartphone or tablet, as a fully automated assessment sequence, or the viewer can control the timing of the assessment if required. Environmental data and viewing conditions (e.g. illumination, vibrations, GPS coordinates) may be collected and subsequently analyzed.

  19. A Systems Model for Power Technology Assessment

    NASA Technical Reports Server (NTRS)

    Hoffman, David J.

    2002-01-01

    A computer model is under continuing development at NASA Glenn Research Center that enables first-order assessments of space power technology. The model, an evolution of NASA Glenn's Array Design Assessment Model (ADAM), is an Excel workbook that consists of numerous spreadsheets containing power technology performance data and sizing algorithms. Underlying the model are a number of databases that contain default values for various power generation, energy storage and power management and distribution component parameters. These databases are actively maintained by a team of systems analysts so that they contain state-of-the-art data as well as the most recent technology performance projections. Sizing of the power subsystems can be accomplished either by using an assumed mass specific power (W/kg) or energy (Wh/kg) or by a bottom-up calculation that accounts for individual component performance and masses. The power generation, energy storage and power management and distribution subsystems are sized for given mission requirements for a baseline case and up to three alternatives. This allows four different power systems to be sized and compared using consistent assumptions and sizing algorithms. The component sizing models contained in the workbook are modular so that they can be easily maintained and updated. All significant inputs have default values loaded from the databases that can be overwritten by the user. The default data and sizing algorithms for each of the power subsystems are described in some detail. The user interface and workbook navigational features are also discussed. Finally, an example study case that illustrates the model's capability is presented.

  20. International contributions to IAEA-NEA heat transfer databases for supercritical fluids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, L. K. H.; Yamada, K.

    2012-07-01

    An IAEA Coordinated Research Project on 'Heat Transfer Behaviour and Thermohydraulics Code Testing for SCWRs' is being conducted to facilitate collaboration and interaction among participants from 15 organizations. While the project covers several key technology areas relevant to the development of SCWR concepts, it focuses mainly on the heat transfer aspect, which has been identified as the most challenging. Through this collaborative effort, large heat-transfer databases have been compiled for supercritical water and surrogate fluids in tubes, annuli, and bundle subassemblies of various orientations over a wide range of flow conditions. Assessments of several supercritical heat-transfer correlations were performed using the compiled databases, and the assessment results are presented. (authors)

  1. Developing a Technology Workshop Series for Your Faculty and Staff.

    ERIC Educational Resources Information Center

    Zeitz, Leigh E.

    1995-01-01

    Based on a needs assessment questionnaire, 13 technology workshops were designed for school personnel. Topics included an introduction; troubleshooting; e-mail and the Internet; ERIC and CD-ROM databases; Microsoft Works; desktop publishing; presentation software; resources on CD-ROM; the X-Press news service; and interactive laser video discs. A…

  2. From Policy to Practice: The Implementation and Negotiation of Technologies in Everyday Child Welfare

    ERIC Educational Resources Information Center

    Peckover, Sue; Hall, Christopher; White, Sue

    2009-01-01

    A central element of the Every Child Matters reforms in England is a set of measures aimed at improving information sharing. Amongst these are the children's database and the Common Assessment Framework, both representing technological solutions to long-standing concerns about information sharing in child welfare. This article reports some findings…

  3. INFOMAT: The international materials assessment and application centre's internet gateway

    NASA Astrophysics Data System (ADS)

    Branquinho, Carmen Lucia; Colodete, Leandro Tavares

    2004-08-01

    INFOMAT is an electronic directory structured to facilitate the search and retrieval of materials science and technology information sources. Linked to the homepage of the International Materials Assessment and Application Centre, INFOMAT presents descriptions of 392 proprietary databases with links to their host systems as well as direct links to over 180 public domain databases and over 2,400 web sites. Among the web sites are associations/unions, governmental and non-governmental institutions, industries, library holdings, market statistics, news services, on-line publications, standardization and intellectual property organizations, and universities/research groups.

  4. Quantifying innovation in surgery.

    PubMed

    Hughes-Hallett, Archie; Mayer, Erik K; Marcus, Hani J; Cundy, Thomas P; Pratt, Philip J; Parston, Greg; Vale, Justin A; Darzi, Ara W

    2014-08-01

    The objectives of this study were to assess the applicability of patents and publications as metrics of surgical technology and innovation; to evaluate the historical relationship between patents and publications; and to develop a methodology that can be used to determine the rate of innovation growth in any given health care technology. The study of health care innovation represents an emerging academic field, yet it is limited by a lack of valid scientific methods for quantitative analysis. This article explores and cross-validates 2 innovation metrics using surgical technology as an exemplar. Electronic patenting databases and the MEDLINE database were searched between 1980 and 2010 for "surgeon" OR "surgical" OR "surgery." Resulting patent codes were grouped into technology clusters. Growth curves were plotted for these technology clusters to establish the rate and characteristics of growth. The initial search retrieved 52,046 patents and 1,801,075 publications. The top performing technology cluster of the last 30 years was minimally invasive surgery. Robotic surgery, surgical staplers, and image guidance were the most emergent technology clusters. The growth curves for these clusters were found to follow an S-shaped pattern, with the emergent technologies lying on the exponential phases of their respective curves. In addition, publication and patent counts were closely correlated in areas of technology expansion. This article demonstrates the utility of publicly available patent and publication data to quantify innovations within surgical technology and proposes a novel methodology for assessing and forecasting areas of technological innovation.
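    S-shaped growth of the kind described is conventionally modelled with a logistic function. A sketch with invented parameters (the ceiling, midpoint and rate below are not values from the study):

```python
import math

def logistic(t, ceiling, midpoint, rate):
    """Cumulative output (e.g. patent count) under logistic growth:
    near-exponential before the midpoint year, saturating after it."""
    return ceiling / (1 + math.exp(-rate * (t - midpoint)))

# Hypothetical technology cluster: saturates at 10,000 patents,
# with its inflection (midpoint) in 2005.
CEIL, MID, RATE = 10_000, 2005, 0.2

# An "emergent" technology lies on the exponential phase, well before
# the midpoint, where year-on-year growth is still accelerating.
growth_2000 = logistic(2001, CEIL, MID, RATE) - logistic(2000, CEIL, MID, RATE)
growth_2010 = logistic(2011, CEIL, MID, RATE) - logistic(2010, CEIL, MID, RATE)
print(growth_2000 > growth_2010)  # True
```

    Fitting such a curve to annual patent counts is one way to estimate where on its growth trajectory a technology cluster currently sits.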

  5. Identification of the condition of crops based on geospatial data embedded in graph databases

    NASA Astrophysics Data System (ADS)

    Idziaszek, P.; Mueller, W.; Górna, K.; Okoń, P.; Boniecki, P.; Koszela, K.; Fojud, A.

    2017-07-01

    The Web application presented here supports plant production and works with the Neo4j graph database to support the assessment of the condition of crops on the basis of geospatial data, including raster and vector data. The adoption of a graph database as a tool to store and manage such data is well justified for agricultural holdings with a wide range of crop types and sizes. In addition, the authors tested the use of Microsoft Cognitive Services within the application, enabling image analysis through the services provided. The application was designed using ASP.NET MVC technology and a wide range of leading IT tools.

  6. XML Technology Assessment

    DTIC Science & Technology

    2001-01-01

    System (GCCS) Track Database Management System (TDBM) (3) GCCS Integrated Imagery and Intelligence (3) Intelligence Shared Data Server (ISDS) General ...The CTH is a powerful model that will allow more than just message systems to exchange information. It could be used for object-oriented databases, as...of the Naval Integrated Tactical Environmental System I (NITES I) is used as a case study to demonstrate the utility of this distributed component

  7. A systematic review of randomised control trials of sexual health interventions delivered by mobile technologies.

    PubMed

    Burns, Kara; Keating, Patrick; Free, Caroline

    2016-08-12

    Sexually transmitted infections (STIs) pose a serious public health problem globally. The rapid spread of mobile technology creates an opportunity to use innovative methods to reduce the burden of STIs. This systematic review identified recent randomised controlled trials that employed mobile technology to improve sexual health outcomes. The following databases were searched for randomised controlled trials of mobile-technology-based sexual health interventions with any outcome measures and all patient populations: MEDLINE, EMBASE, PsycINFO, Global Health, The Cochrane Library (Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, Cochrane Methodology Register), NHS Health Technology Assessment Database, and Web of Science (science and social science citation indexes) (Jan 1999-July 2014). Interventions designed to increase adherence to HIV medication were not included. Two authors independently extracted data on the following elements: interventions, allocation concealment, allocation sequence, blinding, completeness of follow-up, and measures of effect. Trials were assessed for methodological quality using the Cochrane risk of bias tool. We calculated effect estimates using intention-to-treat analysis. A total of ten randomised trials were identified with nine separate study groups. No trials had a low risk of bias. The trials targeted: 1) promotion of uptake of sexual health services, 2) reduction of risky sexual behaviours and 3) reduction of recall bias in reporting sexual activity. Interventions employed up to five behaviour change techniques. Meta-analysis was not possible due to heterogeneity in trial assessment and reporting. Two trials reported statistically significant improvements in the uptake of sexual health services using SMS reminders compared to controls. One trial increased knowledge.
    One trial reported promising results in increasing condom use, but no trial reported statistically significant increases in condom use. Finally, one trial showed that collecting sexual health information using mobile technology was acceptable. The findings suggest that interventions delivered by SMS can increase uptake of sexual health services and STI testing. High-quality trials of interventions using standardised objective measures and employing a wider range of behavioural change techniques are needed to assess whether interventions delivered by mobile phone can promote safer sex behaviours between couples and reduce STIs.

  8. Advanced instrumentation: Technology database enhancement, volume 4, appendix G

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The purpose of this task was to add to the McDonnell Douglas Space Systems Company's Sensors Database, including providing additional information on the instruments and sensors applicable to physical/chemical Environmental Control and Life Support System (P/C ECLSS) or Closed Ecological Life Support System (CELSS) which were not previously included. The Sensors Database was reviewed in order to determine the types of data required, define the data categories, and develop an understanding of the data record structure. An assessment of the MDSSC Sensors Database identified limitations and problems in the database. Guidelines and solutions were developed to address these limitations and problems in order that the requirements of the task could be fulfilled.

  9. Technology as a Threat to Privacy: Ethical Challenges and Guidelines for the Information Professionals.

    ERIC Educational Resources Information Center

    Britz, J. J.

    1996-01-01

    Assesses the impact of technology on privacy. Discusses electronic monitoring of people in the workplace; interception and reading of e-mail messages; merging of databases which contain personal information; rise in the number of hackers; and the development of software that makes the decoding of digital information virtually impossible. Presents…

  10. National Database for Autism Research (NDAR): Big Data Opportunities for Health Services Research and Health Technology Assessment.

    PubMed

    Payakachat, Nalin; Tilford, J Mick; Ungar, Wendy J

    2016-02-01

    The National Database for Autism Research (NDAR) is a US National Institutes of Health (NIH)-funded research data repository created by integrating heterogeneous datasets through data sharing agreements between autism researchers and the NIH. To date, NDAR is considered the largest neuroscience and genomic data repository for autism research. In addition to biomedical data, NDAR contains a large collection of clinical and behavioral assessments and health outcomes from novel interventions. Importantly, NDAR has a global unique patient identifier that can be linked to aggregated individual-level data for hypothesis generation and testing, and for replicating research findings. As such, NDAR promotes collaboration and maximizes public investment in the original data collection. As screening and diagnostic technologies as well as interventions for children with autism are expensive, health services research (HSR) and health technology assessment (HTA) are needed to generate more evidence to facilitate implementation when warranted. This article describes NDAR and explains its value to health services researchers and decision scientists interested in autism and other mental health conditions. We provide a description of the scope and structure of NDAR and illustrate how data are likely to grow over time and become available for HSR and HTA.
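    Linking heterogeneous datasets through a shared global unique identifier, as NDAR's GUID permits, amounts to a join on that key. A toy sketch with invented records (the GUID strings and field names below are illustrative, not NDAR's actual data dictionary):

```python
# Hypothetical records from two contributed datasets, keyed by a
# shared global unique identifier (GUID).
behavioral = {
    "GUID_001": {"assessment_score": 12},
    "GUID_002": {"assessment_score": 7},
}
outcomes = {
    "GUID_001": {"intervention": "early_start", "improved": True},
    "GUID_003": {"intervention": "standard", "improved": False},
}

def link_on_guid(a, b):
    """Aggregate individual-level records that share a GUID,
    merging each pair of matching records into one."""
    return {guid: {**a[guid], **b[guid]} for guid in a.keys() & b.keys()}

linked = link_on_guid(behavioral, outcomes)
print(sorted(linked))  # ['GUID_001']
```

    Only subjects present in both datasets survive the join, which is exactly the property that lets researchers test hypotheses across independently collected studies.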

  11. Databases, data integration, and expert systems: new directions in mineral resource assessment and mineral exploration

    USGS Publications Warehouse

    McCammon, Richard B.; Ramani, Raja V.; Mozumdar, Bijoy K.; Samaddar, Arun B.

    1994-01-01

    Overcoming future difficulties in searching for ore deposits deeper in the earth's crust will require closer attention to the collection and analysis of more diverse types of data and to more efficient use of current computer technologies. Computer technologies of greatest interest include methods of storage and retrieval of resource information, methods for integrating geologic, geochemical, and geophysical data, and the introduction of advanced computer technologies such as expert systems, multivariate techniques, and neural networks. Much experience has been gained in the past few years in applying these technologies. More experience is needed if they are to be implemented for everyday use in future assessments and exploration.

  12. Assessment of COPD-related outcomes via a national electronic medical record database.

    PubMed

    Asche, Carl; Said, Quayyim; Joish, Vijay; Hall, Charles Oaxaca; Brixner, Diana

    2008-01-01

    The technology and sophistication of healthcare utilization databases have expanded over the last decade to include results of lab tests, vital signs, and other clinical information. This review provides an assessment of the methodological and analytical challenges of conducting chronic obstructive pulmonary disease (COPD) outcomes research in a national electronic medical records (EMR) dataset and its potential application towards the assessment of national health policy issues, as well as a description of the challenges or limitations. An EMR database and its application to measuring outcomes for COPD are described. The ability to measure adherence to the COPD evidence-based practice guidelines, generated by the NIH and HEDIS quality indicators, in this database was examined. Case studies, before and after their publication, were used to assess the adherence to guidelines and gauge the conformity to quality indicators. EMR was the only source of information for pulmonary function tests, but low frequency in ordering by primary care was an issue. The EMR data can be used to explore impact of variation in healthcare provision on clinical outcomes. The EMR database permits access to specific lab data and biometric information. The richness and depth of information on "real world" use of health services for large population-based analytical studies at relatively low cost render such databases an attractive resource for outcomes research. Various sources of information exist to perform outcomes research. It is important to understand the desired endpoints of such research and choose the appropriate database source.

  13. Technologies Assessing Limb Bradykinesia in Parkinson's Disease.

    PubMed

    Hasan, Hasan; Athauda, Dilan S; Foltynie, Thomas; Noyce, Alastair J

    2017-01-01

    The MDS-UPDRS (Movement Disorders Society - Unified Parkinson's Disease Rating Scale) is the most widely used scale for rating impairment in PD. Subscores measuring bradykinesia have low reliability and are subject to rater variability. Novel technological tools can be used to overcome such issues. To systematically explore and describe the available technologies for measuring limb bradykinesia in PD that were published between 2006 and 2016. A systematic literature search using PubMed (MEDLINE), IEEE Xplore, Web of Science, Scopus and Engineering Village (Compendex and Inspec) databases was performed to identify relevant technologies published until 18 October 2016. 47 technologies assessing bradykinesia in PD were identified, 17 of which offered home and clinic-based assessment whilst 30 provided clinic-based assessment only. Of the eligible studies, 7 were validated in a PD patient population only, whilst 40 were tested in both PD and healthy control groups. 19 of the 47 technologies assessed bradykinesia only, whereas 28 assessed other parkinsonian features as well. 33 technologies have been described in additional PD-related studies, whereas 14 are not known to have been tested beyond the pilot phase. Technology-based tools offer advantages including objective motor assessment and home monitoring of symptoms, and can be used to assess response to intervention in clinical trials or routine care. This review provides an up-to-date repository and synthesis of the current literature regarding technology used for assessing limb bradykinesia in PD. The review also discusses current trends and future directions in technology development.

  14. Quality labeled faces in the wild (QLFW): a database for studying face recognition in real-world environments

    NASA Astrophysics Data System (ADS)

    Karam, Lina J.; Zhu, Tong

    2015-03-01

    The varying quality of face images is an important challenge that limits the effectiveness of face recognition technology when applied in real-world applications. Existing face image databases do not consider the effect of distortions that commonly occur in real-world environments. This database (QLFW) represents an initial attempt to provide a set of labeled face images spanning the wide range of quality, from no perceived impairment to strong perceived impairment for face detection and face recognition applications. Types of impairment include JPEG2000 compression, JPEG compression, additive white noise, Gaussian blur and contrast change. Subjective experiments are conducted to assess the perceived visual quality of faces under different levels and types of distortions and also to assess the human recognition performance under the considered distortions. One goal of this work is to enable automated performance evaluation of face recognition technologies in the presence of different types and levels of visual distortions. This will consequently enable the development of face recognition systems that can operate reliably on real-world visual content in the presence of real-world visual distortions. Another goal is to enable the development and assessment of visual quality metrics for face images and for face detection and recognition applications.
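
    The distortion types listed above are straightforward to reproduce in code. As a minimal sketch under stated assumptions, the snippet below applies one of them (additive white Gaussian noise) to a tiny grayscale image; the pixel values, sigma level and clamping rule are chosen purely for illustration and are not the QLFW protocol.

```python
import random

# Apply additive white Gaussian noise to a grayscale image (illustrative only).
def add_white_noise(image, sigma, seed=0):
    rng = random.Random(seed)  # fixed seed: reproducible distortion
    # Perturb each pixel, then clamp back to the valid 0-255 range.
    return [[min(255, max(0, round(p + rng.gauss(0, sigma)))) for p in row]
            for row in image]

clean = [[120, 121], [119, 122]]
noisy = add_white_noise(clean, sigma=10.0)
print(noisy)  # same 2x2 shape, values jittered around the originals
```

    Stronger impairment levels would simply use a larger sigma; the other QLFW distortions (blur, compression, contrast change) follow the same pattern of one parameterized transform per level.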

  15. TechTracS: NASA's commercial technology management system

    NASA Astrophysics Data System (ADS)

    Barquinero, Kevin; Cannon, Douglas

    1996-03-01

    The Commercial Technology Mission is a primary NASA mission, comparable in importance to those in aeronautics and space. This paper discusses TechTracS, NASA's Commercial Technology Management System, which was put into place in FY 1995 to implement this mission. The system is designed to identify and capture NASA technologies with commercial potential in an off-the-shelf database application, and then to track each technology's progress in realizing that potential through collaborations with industry. The management system consists of four stages. The first is to develop an inventory database of the agency's entire technology portfolio and assess it for relevance to the commercial marketplace. In the second stage, technologies identified as having commercial potential are actively marketed to appropriate industries. The third stage is when a NASA-industry partnership is entered into for the purpose of commercializing the technology. The final stage is to track the technology's success or failure in the marketplace. Collecting this information in TechTracS enables metrics evaluation and can accelerate the establishment of direct contacts between a NASA technologist and an industry technologist. This connection is the beginning of the technology commercialization process.

  16. Utilizing GIS Technology to Improve Fire Prevention Activities in an Urban Fire Department.

    PubMed

    Shields, Wendy C; Shields, Timothy M; McDonald, Eileen M; Perry, Elise C; Hanna, Peter; Gielen, Andrea C

    2015-01-01

    The Baltimore City Fire Department (BCFD) has been installing smoke alarms city wide for more than three decades. Though data on each visit are entered into a database, no system existed for using these data for planning or evaluation. The objective of this study is to use Geographic Information System (GIS) technology and existing databases to 1) determine the number of residences in need of a home visit; 2) determine total visits, visits per household, and number of homes entered for eligible households; and 3) demonstrate integration of various data via GIS for use in prevention planning. The tax assessment database was queried to determine the number of eligible (as determined by BCFD policy) residences in need of a visit. Each attempted BCFD home visit was coded to identify whether BCFD personnel interacted with residents ("pass door") and installed alarms. Home visits were geocoded and compared to the tax assessment database to determine city wide pass door rates. The frequency of visits was tabulated by individual residence to measure efficiency. A total of 206,850 residences met BCFD eligibility for a home visit. In 2007, the BCFD attempted 181,757 home visits and 177,213 were successfully geocoded to 122,118 addresses. A total of 122,118 eligible residences (59%) received a home visit. A total of 35,317 residences (29%) received a repeat visit attempt. The pass door rate was 22% (46,429) of all residences. GIS technology offers a promising means for fire departments to plan and evaluate the fire prevention services they provide.
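
    The coverage figures above follow directly from the raw counts. A minimal sketch of the arithmetic, with variable names invented for illustration:

```python
# Residence-level counts reported in the study.
eligible = 206_850       # residences eligible for a home visit
visited = 122_118        # distinct eligible residences that received a visit
repeat_visits = 35_317   # visited residences with a repeat visit attempt
pass_door = 46_429       # residences where personnel interacted with residents

coverage_rate = visited / eligible      # share of eligible homes reached
repeat_rate = repeat_visits / visited   # share of visited homes revisited
pass_door_rate = pass_door / eligible   # share of all eligible homes with contact

print(f"{coverage_rate:.0%} {repeat_rate:.0%} {pass_door_rate:.0%}")  # 59% 29% 22%
```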

  17. A personal digital assistant application (MobilDent) for dental fieldwork data collection, information management and database handling.

    PubMed

    Forsell, M; Häggström, M; Johansson, O; Sjögren, P

    2008-11-08

    To develop a personal digital assistant (PDA) application for oral health assessment fieldwork, including back-office and database systems (MobilDent). System design, construction and implementation of PDA, back-office and database systems. System requirements for MobilDent were collected, analysed and translated into system functions. User interfaces were implemented and system architecture was outlined. MobilDent was based on a platform with .NET (Microsoft) components, using an SQL Server 2005 (Microsoft) database for data storage and the Windows Mobile (Microsoft) operating system. The PDA devices were Dell Axim. System functions and user interfaces were specified for MobilDent. User interfaces for PDA, back-office and database systems were based on .NET programming. The PDA user interface was based on a Windows layout suitable for a PDA display, whereas the back-office interface was designed for a normal-sized computer screen. A synchronisation module (MS ActiveSync, Microsoft) was used to enable download of field data from the PDA to the database. MobilDent is a feasible application for oral health assessment fieldwork, and the oral health assessment database may prove a valuable source for care planning, educational and research purposes. Further development of the MobilDent system will include wireless connectivity with download-on-demand technology.

  18. DOE technology information management system database study report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  19. An intermediary's perspective of online databases for local governments

    NASA Technical Reports Server (NTRS)

    Jack, R. F.

    1984-01-01

    Numerous public administration studies have indicated that local government agencies for a variety of reasons lack access to comprehensive information resources; furthermore, such entities are often unwilling or unable to share information regarding their own problem-solving innovations. The NASA/University of Kentucky Technology Applications Program devotes a considerable effort to providing scientific and technical information and assistance to local agencies, relying on its access to over 500 distinct online databases offered by 20 hosts. The author presents a subjective assessment, based on his own experiences, of several databases which may prove useful in obtaining information for this particular end-user community.

  20. USE OF LANDSCAPE SCIENCE FOR ENVIRONMENTAL ASSESSMENT PILOT STUDY

    EPA Science Inventory

    Landscape metrics or indicators are calculated by combining various scientific databases using technologies from geographic information systems. These metrics facilitate the understanding that events that might occur in one ecosystem or resource can affect the conditions of many ...

  1. Planned and ongoing projects (pop) database: development and results.

    PubMed

    Wild, Claudia; Erdös, Judit; Warmuth, Marisa; Hinterreiter, Gerda; Krämer, Peter; Chalon, Patrice

    2014-11-01

    The aim of this study was to present the development, structure and results of a database on planned and ongoing health technology assessment (HTA) projects (POP Database) in Europe. The POP Database (POP DB) was set up in an iterative process from a basic Excel sheet to a multifunctional electronic online database. The functionalities, such as the search terminology, the procedures to fill and update the database, the access rules to enter the database, as well as the maintenance roles, were defined in a multistep participatory feedback loop with EUnetHTA Partners. The POP Database has become an online database that hosts not only the titles and MeSH categorizations, but also some basic information on status and contact details about the listed projects of EUnetHTA Partners. Currently, it stores more than 1,200 planned, ongoing or recently published projects of forty-three EUnetHTA Partners from twenty-four countries. Because the POP Database aims to facilitate collaboration, it also provides a matching system to assist in identifying similar projects. Overall, more than 10 percent of the projects in the database are identical both in terms of pathology (indication or disease) and technology (drug, medical device, intervention). In addition, approximately 30 percent of the projects are similar, meaning that they have at least some overlap in content. Although most national HTA agencies within EUnetHTA successfully keep the POP DB up to date, little is known about its actual effects on collaboration in Europe. Moreover, many non-nationally nominated HTA producing agencies neither have access to the POP DB nor can share their projects.
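
    The distinction drawn above between "identical" projects (same pathology and same technology) and "similar" projects (at least some overlap) can be expressed as a simple matching rule. The sketch below is a hypothetical illustration; the field names and example projects are assumptions, not the actual POP DB schema:

```python
# Classify a pair of HTA projects by overlap in pathology and technology.
def match_level(a: dict, b: dict) -> str:
    same_pathology = a["pathology"] == b["pathology"]
    same_technology = a["technology"] == b["technology"]
    if same_pathology and same_technology:
        return "identical"   # full overlap: strong candidates for collaboration
    if same_pathology or same_technology:
        return "similar"     # partial overlap in content
    return "unrelated"

p1 = {"pathology": "COPD", "technology": "telemonitoring"}
p2 = {"pathology": "COPD", "technology": "telemonitoring"}
p3 = {"pathology": "COPD", "technology": "pulmonary rehabilitation"}
print(match_level(p1, p2))  # identical
print(match_level(p1, p3))  # similar
```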

  2. Technologies Assessing Limb Bradykinesia in Parkinson’s Disease

    PubMed Central

    Hasan, Hasan; Athauda, Dilan S.; Foltynie, Thomas; Noyce, Alastair J.

    2017-01-01

    Background: The MDS-UPDRS (Movement Disorders Society – Unified Parkinson’s Disease Rating Scale) is the most widely used scale for rating impairment in PD. Subscores measuring bradykinesia have low reliability and are subject to rater variability. Novel technological tools can be used to overcome such issues. Objective: To systematically explore and describe the available technologies for measuring limb bradykinesia in PD that were published between 2006 and 2016. Methods: A systematic literature search using PubMed (MEDLINE), IEEE Xplore, Web of Science, Scopus and Engineering Village (Compendex and Inspec) databases was performed to identify relevant technologies published until 18 October 2016. Results: 47 technologies assessing bradykinesia in PD were identified, 17 of which offered home and clinic-based assessment whilst 30 provided clinic-based assessment only. Of the eligible studies, 7 were validated in a PD patient population only, whilst 40 were tested in both PD and healthy control groups. 19 of the 47 technologies assessed bradykinesia only, whereas 28 assessed other parkinsonian features as well. 33 technologies have been described in additional PD-related studies, whereas 14 are not known to have been tested beyond the pilot phase. Conclusion: Technology-based tools offer advantages including objective motor assessment and home monitoring of symptoms, and can be used to assess response to intervention in clinical trials or routine care. This review provides an up-to-date repository and synthesis of the current literature regarding technology used for assessing limb bradykinesia in PD. The review also discusses current trends and future directions in technology development. PMID:28222539

  3. Computers in the Cop Car: Impact of the Mobile Digital Terminal Technology on Motor Vehicle Theft Clearance and Recovery Rates in a Texas City.

    ERIC Educational Resources Information Center

    Nunn, Samuel

    1993-01-01

    Assessed the impact of the Mobile Digital Terminal technology (computers used to communicate with remote crime databases) on motor vehicle theft clearance (arresting a perpetrator) and recovery rates in Fort Worth (Texas), using a time series analysis. Impact has been ambiguous, with little evidence of improved clearance or recovery. (SLD)

  4. Real-Time Integrity Monitoring of Stored Geo-Spatial Data Using Forward-Looking Remote Sensing Technology

    NASA Technical Reports Server (NTRS)

    Young, Steven D.; Harrah, Steven D.; deHaag, Maarten Uijt

    2002-01-01

    Terrain Awareness and Warning Systems (TAWS) and Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data (e.g. terrain, obstacles, and/or features). As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. This lack of a quantifiable integrity level is one of the constraints that has limited certification and operational approval of TAWS/SVS to "advisory-only" systems for civil aviation. Previous work demonstrated the feasibility of using a real-time monitor to bound database integrity by using downward-looking remote sensing technology (i.e. radar altimeters). This paper describes an extension of the integrity monitor concept to include a forward-looking sensor to cover additional classes of terrain database faults and to reduce the exposure time associated with integrity threats. An operational concept is presented that combines established feature extraction techniques with a statistical assessment of similarity measures between the sensed and stored features using principles from classical detection theory. Finally, an implementation is presented that uses existing commercial-off-the-shelf weather radar sensor technology.
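
    The monitor's core idea, though not the actual TAWS/SVS implementation, can be sketched as a threshold test on a disagreement statistic between sensed and stored terrain; the similarity measure, threshold and elevation values below are assumptions for illustration:

```python
import statistics

# Flag the terrain database when sensed and stored elevations disagree
# by more than a chosen threshold (a simple detection-theory style test).
def integrity_test(sensed, stored, threshold_m=30.0):
    residuals = [s - d for s, d in zip(sensed, stored)]
    statistic = statistics.fmean(abs(r) for r in residuals)  # mean |error|
    return statistic, statistic > threshold_m

sensed = [512.0, 530.5, 548.0, 561.2]  # metres, from forward-looking sensor
stored = [510.0, 531.0, 549.5, 560.0]  # metres, from terrain database
stat, fault = integrity_test(sensed, stored)
print(f"statistic = {stat:.2f} m, fault flagged: {fault}")  # 1.30 m, False
```

    In a real monitor the threshold would be derived from the residual statistics to meet a specified false-alarm and integrity-risk requirement, in line with the classical detection theory the paper invokes.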

  5. Time to publication for NIHR HTA programme-funded research: a cohort study

    PubMed Central

    Chinnery, Fay; Young, Amanda; Goodman, Jennie; Ashton-Key, Martin; Milne, Ruairidh

    2013-01-01

    Objective To assess the time to publication of primary research and evidence syntheses funded by the National Institute for Health Research (NIHR) Health Technology Assessment (HTA) Programme published as a monograph in Health Technology Assessment and as a journal article in the wider biomedical literature. Study design Retrospective cohort study. Setting Primary research and evidence synthesis projects funded by the HTA Programme were included in the cohort if they were registered in the NIHR research programmes database and the draft final report was planned for submission for publication in Health Technology Assessment on or before 9 December 2011. Main outcome measures The median time to publication and publication at 30 months in Health Technology Assessment and in an external journal were determined by searching the NIHR research programmes database and HTA Programme website. Results Of 458 included projects, 184 (40.2%) were primary research projects and 274 (59.8%) were evidence syntheses. A total of 155 primary research projects had a completion date; the median time to publication was 23 months (26.5 and 35.5 months to publish a monograph and to publish in an external journal, respectively) and 69% were published within 30 months. The median time to publication of HTA-funded trials (n=126) was 24 months and 67.5% were published within 30 months. Among the evidence syntheses with a protocol online date (n=223), the median time to publication was 25.5 months (28 months to publication as a monograph), but only 44.4% of evidence synthesis projects were published in an external journal; 65% of evidence synthesis studies had been published within 30 months. Conclusions Research funded by the HTA Programme publishes promptly. The importance of Health Technology Assessment was highlighted, as the median time to publication was 9 months shorter for a monograph than for an external journal article. PMID:24285634

  6. A systematic review of model-based economic evaluations of diagnostic and therapeutic strategies for lower extremity artery disease.

    PubMed

    Vaidya, Anil; Joore, Manuela A; ten Cate-Hoek, Arina J; Kleinegris, Marie-Claire; ten Cate, Hugo; Severens, Johan L

    2014-01-01

    Lower extremity artery disease (LEAD) is a sign of widespread atherosclerosis also affecting coronary, cerebral and renal arteries and is associated with increased risk of cardiovascular events. Many economic evaluations have been published for LEAD due to its clinical, social and economic importance. The aim of this systematic review was to assess the modelling methods used in published economic evaluations in the field of LEAD. Our review appraised and compared the general characteristics, model structure and methodological quality of published models. The electronic databases MEDLINE and EMBASE were searched until February 2013 via the OVID interface. The Cochrane Database of Systematic Reviews, the Health Technology Assessment database hosted by the National Institute for Health Research, and the National Health Services Economic Evaluation Database (NHSEED) were also searched. The methodological quality of the included studies was assessed by using the Philips' checklist. Sixteen model-based economic evaluations were identified and included. Eleven models compared therapeutic health technologies; three models compared diagnostic tests and two models compared a combination of diagnostic and therapeutic options for LEAD. Results of this systematic review revealed an acceptable to low methodological quality of the included studies. Methodological diversity and insufficient information posed a challenge for valid comparison of the included studies. In conclusion, there is a need for transparent, methodologically comparable and scientifically credible model-based economic evaluations in the field of LEAD. Future modelling studies should include clinically and economically important cardiovascular outcomes to reflect the wider impact of LEAD on individual patients and on society.

  7. Database Technology Activities and Assessment for Defense Modeling and Simulation Office (DMSO) (August 1991-November 1992). A Documented Briefing

    DTIC Science & Technology

    1994-01-01

    databases and identifying new data entities, data elements, and relationships. - Standard data naming conventions, schema, and definition processes ... management system. The use of such a tool could offer: (1) structured support for representation of objects and their relationships to each other (and ... their relationships to related multimedia objects such as an engineering drawing of the tank object or a satellite image that contains the installation

  8. Towards G2G: Systems of Technology Database Systems

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David

    2005-01-01

    We present an approach and methodology for developing Government-to-Government (G2G) Systems of Technology Database Systems. G2G will deliver technologies for distributed and remote integration of technology data, for internal use in analysis and planning as well as for external communications. G2G enables NASA managers, engineers, operational teams and information systems to "compose" technology roadmaps and plans by selecting, combining, extending, specializing and modifying components of technology database systems. G2G will interoperate information and knowledge that is distributed across the organizational entities involved, which is ideal for NASA's future Exploration Enterprise. Key contributions of the G2G system will include the creation of an integrated approach that sustains effective management of technology investments while allowing the various technology database systems to be independently managed. The integration technology will comply with emerging open standards. Applications can thus be customized for local needs while enabling an integrated management-of-technology approach that serves the global needs of NASA. The G2G capabilities will use NASA's breakthrough in database "composition" and integration technology, will use and advance emerging open standards, and will use commercial information technologies to enable effective Systems of Technology Database Systems.

  9. The research of network database security technology based on web service

    NASA Astrophysics Data System (ADS)

    Meng, Fanxing; Wen, Xiumei; Gao, Liting; Pang, Hui; Wang, Qinglin

    2013-03-01

    Database technology is one of the most widely applied computer technologies, and its security is becoming more and more important. This paper introduces database security and network database security levels, studies the security technology of the network database, and analyzes in particular a sub-key encryption algorithm, which has been applied successfully in a campus one-card system. The realization process of the encryption algorithm is discussed; the method can serve as a reference in many fields, particularly in management information system security and e-commerce.

  10. Health Technology Assessment in nursing: a literature review.

    PubMed

    Ramacciati, N

    2013-03-01

    The Health Technology Assessment (HTA) approach, which provides scientific support for the decisions taken within the health field, is of increasing importance worldwide. In a context of limited resources, HTA has the potential of being an efficient tool for addressing the sustainability problems and the allocation choices arising from the constant increase in demand. This study aims to investigate HTA use in nursing, both in terms of quantifying HTA evaluations of nursing phenomena which have been conducted and in terms of the extent to which nursing has used the HTA approach. The Italian context has been analysed because of the growing diffusion of the HTA in Italy along with the recent developments in the nursing profession. A narrative review of international literature was undertaken using the following databases: HTA, PubMed, CINAHL, ILISI. Seventy evaluation studies on nursing were identified from the HTA database (1.12% of all studies in the database). The areas of nursing intervention and the country of origin of the studies were identified. Two nursing studies on the HTA approach were found in the PubMed, CINAHL and HTA databases. The first focused on the evaluation of nursing technology process and analysed 126 studies in six main thematic areas; the second was a systematic review on HTA in nursing and analysed 192 studies (46 meta-analyses, 31 Finnish primary studies, 117 international primary studies). Three Italian studies were identified from the ILISI database and Italian grey literature. In the international literature, although analyses regarding the efficacy of nursing interventions have been conducted, there are to date very few research projects that focus exclusively on the HTA process as applied to nursing. The recent development of a standardized nursing language coupled with the open debate as to which research method (qualitative vs. quantitative) best serves to 'read' nursing phenomena may explain the scarce diffusion of HTA in the field of nursing. © 2012 The Author. International Nursing Review © 2012 International Council of Nurses.

  11. XML technology planning database : lessons learned

    NASA Technical Reports Server (NTRS)

    Some, Raphael R.; Neff, Jon M.

    2005-01-01

    A hierarchical Extensible Markup Language (XML) database called XCALIBR (XML Analysis LIBRary) has been developed by the New Millennium Program to assist in technology return on investment (ROI) analysis and technology portfolio optimization. The database contains mission requirements and technology capabilities, which are related by use of an XML dictionary. The XML dictionary codifies a standardized taxonomy for space missions, systems, subsystems and technologies. In addition to being used for ROI analysis, the database is being examined for use in project planning, tracking and documentation. During the past year, the database has moved from development into alpha testing. This paper describes the lessons learned during construction and testing of the prototype database and the motivation for moving from an XML taxonomy to a standard XML-based ontology.
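
    The dictionary mechanism described above can be illustrated with a toy XML fragment in which mission requirements and technologies are related through a shared capability term. The element and attribute names below are invented for illustration and are not the XCALIBR schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical taxonomy fragment relating a mission requirement to a technology.
doc = """
<taxonomy>
  <mission id="M1">
    <requirement capability="radiation-hard-memory"/>
  </mission>
  <technology id="T7" capability="radiation-hard-memory" trl="5"/>
</taxonomy>
"""

root = ET.fromstring(doc)
# Index technologies by the capability term they provide.
techs = {t.get("capability"): t.get("id") for t in root.iter("technology")}
# Resolve each mission requirement through the shared vocabulary.
for req in root.iter("requirement"):
    cap = req.get("capability")
    print(cap, "->", techs.get(cap))  # radiation-hard-memory -> T7
```

    A controlled vocabulary like this is what lets roadmapping and ROI tools join requirements to candidate technologies without free-text matching.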

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soldevilla, M.; Salmons, S.; Espinosa, B.

    The new application BDDR (Reactor Database) has been developed at CEA in order to manage technological and operating data on nuclear reactors. This application is a knowledge management tool which meets several internal needs: -) to facilitate scenario studies for any set of reactors, e.g. non-proliferation assessments; -) to make core physics studies easier, whatever the reactor design (PWR - Pressurized Water Reactor, BWR - Boiling Water Reactor, MAGNOX - Magnesium Oxide reactor, CANDU - CANada Deuterium Uranium, FBR - Fast Breeder Reactor, etc.); -) to preserve the technological data of all reactors (past and present, power generating or experimental, naval propulsion, ...) in a unique repository. The application database contains location data and operating history data, as well as a tree-like structure holding numerous technological data. These data address all kinds of reactor features and components. A few neutronics data are also included (neutron fluxes). The BDDR application is based on open-source technologies and a thin client/server architecture. The software architecture has been made flexible enough to allow for any change. (authors)

  13. FINDING COMMON GROUND IN MANAGING DATA USED IN REGIONAL ENVIRONMENTAL ASSESSMENTS

    EPA Science Inventory

    Evaluating the overall environmental health of a region invariably involves using data-bases from multiple organizations. Several approaches to deal with the related technological and sociological issues have been used by various programs. Flexible data systems are required to de...

  14. ExpoCastDB: A Publicly Accessible Database for Observational Exposure Data

    EPA Science Inventory

    The application of environmental informatics tools for human health risk assessment will require the development of advanced exposure information technology resources. Exposure data for chemicals is often not readily accessible. There is a pressing need for easily accessible, che...

  15. The Design and Implementation of Network Teaching Platform Basing on .NET

    NASA Astrophysics Data System (ADS)

    Yanna, Ren

    This paper addresses the problem that students under the traditional teaching model have poor operational skills, and studies in depth the network teaching platforms of domestic colleges and universities, proposing a design concept for a .NET + C# + SQL network teaching platform for an excellent course and designing the platform's overall structure, function modules and back-end database. The paper expounds the use of MD5 encryption techniques to solve data security problems, and the assessment of student learning using ADO.NET database access technology together with a mathematical formula. The example shows that a network teaching platform developed using WEB application technology has higher safety and availability, and thus improves students' operational skills.
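
    The MD5 step described above amounts to storing a digest of the credential rather than the plaintext. A minimal Python sketch of that step follows (the original platform used C#); note that MD5 is no longer considered secure for passwords, and a modern system should prefer a salted scheme such as bcrypt or scrypt:

```python
import hashlib

# Hash a password with MD5, as the platform described above does.
def md5_digest(password: str) -> str:
    return hashlib.md5(password.encode("utf-8")).hexdigest()

stored = md5_digest("s3cret")          # what the database would hold
print(len(stored))                     # 32 (hex characters)
print(md5_digest("s3cret") == stored)  # True: login check compares digests
```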

  16. The comparative effectiveness of conventional and digital image libraries.

    PubMed

    McColl, R I; Johnson, A

    2001-03-01

    Before introducing a hospital-wide image database to improve access, navigation and retrieval speed, a comparative study between a conventional slide library and a matching image database was undertaken to assess its relative benefits. Paired time trials and personal questionnaires revealed faster retrieval rates, higher image quality, and easier viewing for the pilot digital image database. Analysis of confidentiality, copyright and data protection exposed similar issues for both systems, thus concluding that the digital image database is a more effective library system. The authors suggest that in the future, medical images will be stored on large, professionally administered, centrally located file servers, allowing specialist image libraries to be tailored locally for individual users. The further integration of the database with web technology will enable cheap and efficient remote access for a wide range of users.

  17. OfftargetFinder: a web tool for species-specific RNAi design.

    PubMed

    Good, R T; Varghese, T; Golz, J F; Russell, D A; Papanicolaou, A; Edwards, O; Robin, C

    2016-04-15

    RNA interference (RNAi) technology is being developed as a weapon for pest insect control. To maximize the specificity that such an approach affords, we have developed a bioinformatic web tool that searches the ever-growing arthropod transcriptome databases so that pest-specific RNAi sequences can be identified. This will help technology developers finesse the design of RNAi sequences and suggests which non-target species should be assessed in the risk assessment process. http://rnai.specifly.org crobin@unimelb.edu.au. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
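
    The search such a tool performs can be sketched as scanning short subsequences of a candidate RNAi molecule against non-target transcripts. The 21-nt exact-match rule, sequences and species names below are illustrative assumptions, not OfftargetFinder's actual algorithm:

```python
# Count how many k-mers of a dsRNA sequence occur in each species' transcripts.
def offtarget_hits(dsrna: str, transcripts: dict, k: int = 21) -> dict:
    kmers = {dsrna[i:i + k] for i in range(len(dsrna) - k + 1)}
    hits = {}
    for species, seq in transcripts.items():
        shared = sum(1 for m in kmers if m in seq)
        if shared:
            hits[species] = shared
    return hits

dsrna = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGA"
transcripts = {
    "pest_target": dsrna,                       # the intended target matches
    "honeybee": "TTTTATGGCAGATCGATCGATCGTTGCA", # non-target: no shared 21-mers
}
print(offtarget_hits(dsrna, transcripts))  # only the pest species is hit
```

    A species appearing in the hit dictionary besides the pest would be a candidate for inclusion in the risk assessment.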

  18. Current mHealth Technologies for Physical Activity Assessment and Promotion

    PubMed Central

    O’Reilly, Gillian A.; Spruijt-Metz, Donna

    2014-01-01

    Context: Novel mobile assessment and intervention capabilities are changing the face of physical activity (PA) research. A comprehensive systematic review of how mobile technology has been used for measuring PA and promoting PA behavior change is needed. Evidence acquisition: Article collection was conducted using six databases from February to June 2012 with search terms related to mobile technology and PA. Articles that described the use of mobile technologies for PA assessment, sedentary behavior assessment, and/or interventions for PA behavior change were included. Articles were screened for inclusion and study information was extracted. Evidence synthesis: Analyses were conducted from June to September 2012. Mobile phone–based journals and questionnaires, short message service (SMS) prompts, and on-body PA sensing systems were the mobile technologies most utilized. Results indicate that mobile journals and questionnaires are effective PA self-report measurement tools. Intervention studies that reported successful promotion of PA behavior change employed SMS communication, mobile journaling, or both SMS and mobile journaling. Conclusions: mHealth technologies are increasingly being employed to assess and intervene on PA in clinical, epidemiologic, and intervention research. The wide variations in technologies used and outcomes measured limit comparability across studies, and hamper identification of the most promising technologies. Further, the pace of technologic advancement currently outstrips that of scientific inquiry. New adaptive, sequential research designs that take advantage of ongoing technology development are needed. At the same time, scientific norms must shift to accept “smart,” adaptive, iterative, evidence-based assessment and intervention technologies that will, by nature, improve during implementation. PMID:24050427

  19. Current mHealth technologies for physical activity assessment and promotion.

    PubMed

    O'Reilly, Gillian A; Spruijt-Metz, Donna

    2013-10-01

    Novel mobile assessment and intervention capabilities are changing the face of physical activity (PA) research. A comprehensive systematic review of how mobile technology has been used for measuring PA and promoting PA behavior change is needed. Article collection was conducted using six databases from February to June 2012 with search terms related to mobile technology and PA. Articles that described the use of mobile technologies for PA assessment, sedentary behavior assessment, and/or interventions for PA behavior change were included. Articles were screened for inclusion and study information was extracted. Analyses were conducted from June to September 2012. Mobile phone-based journals and questionnaires, short message service (SMS) prompts, and on-body PA sensing systems were the mobile technologies most utilized. Results indicate that mobile journals and questionnaires are effective PA self-report measurement tools. Intervention studies that reported successful promotion of PA behavior change employed SMS communication, mobile journaling, or both SMS and mobile journaling. mHealth technologies are increasingly being employed to assess and intervene on PA in clinical, epidemiologic, and intervention research. The wide variations in technologies used and outcomes measured limit comparability across studies, and hamper identification of the most promising technologies. Further, the pace of technologic advancement currently outstrips that of scientific inquiry. New adaptive, sequential research designs that take advantage of ongoing technology development are needed. At the same time, scientific norms must shift to accept "smart," adaptive, iterative, evidence-based assessment and intervention technologies that will, by nature, improve during implementation. © 2013 American Journal of Preventive Medicine.

  20. Selecting the Right Courseware for Your Online Learning Program.

    ERIC Educational Resources Information Center

    O'Mara, Heather

    2000-01-01

    Presents criteria for selecting courseware for online classes. Highlights include ease of use, including navigation; assessment tools; advantages of Java-enabled courseware; advantages of Oracle databases, including scalability; future possibilities for multimedia technology; and open architecture that will integrate with other systems. (LRW)

  1. Building a Database for Life Cycle Performance Assessment of Trenchless Technologies

    EPA Science Inventory

    Deployment of trenchless pipe rehabilitation methods has steadily increased over the past 40 years and has represented an increasing proportion of the annual expenditure on the nation’s water and sewer infrastructure. Until recently, despite the massive public investments in these...

  2. Genetic testing in the European Union: does economic evaluation matter?

    PubMed

    Antoñanzas, Fernando; Rodríguez-Ibeas, R; Hutter, M F; Lorente, R; Juárez, C; Pinillos, M

    2012-10-01

    We review the published economic evaluation studies of genetic technologies in the EU to identify the main diseases addressed, to describe how the studies were conducted, and to assess the efficiency of these new technologies. The final aim of this review was to understand how far the economic evaluations performed to date can serve as a tool to support decision making in this area. We reviewed articles found in several databases up to March 2010. Literature searches were made in the following databases: PubMed; Euronheed; the Centre for Reviews and Dissemination of the University of York (Health Technology Assessment Database, Database of Abstracts of Reviews of Effects, and NHS Economic Evaluation Database); and Scopus. The search algorithm was "(screening or diagnosis) and genetic and (cost or economic) and (country EU27)". We included studies if they met the following criteria: (1) a genetic technology was analysed; (2) human DNA was tested for; (3) the analysis was a full economic evaluation or a cost study; and (4) the article related to an EU Member State. We initially found 3,559 papers on genetic testing, but only 92 economic analyses, covering a wide range of genetic diseases, matched the inclusion criteria. The most studied diseases were: cystic fibrosis (12), breast and ovarian cancer (8), hereditary hemochromatosis (6), Down's syndrome (7), colorectal cancer (5), familial hypercholesterolaemia (5), prostate cancer (4), and thrombophilia (4). Genetic tests were mostly used for screening purposes, and cost-effectiveness analysis was the most common type of economic study. The gene technologies analysed are deemed efficient for some specific population groups and screening algorithms, with cost-effectiveness ratios below the commonly accepted threshold of €30,000. Economic evaluation of genetic technologies matters, but the number of published studies is still too low for them to be widely used in decision making across EU jurisdictions. Furthermore, decision-making bodies across the EU27 are fragmented, with responsibilities located at different levels of the decision process, so it is difficult to establish whether a given decision on genetic tests was supported by economic evaluation results.
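
    The cost-effectiveness ratios discussed above are incremental cost-effectiveness ratios (ICERs) compared against a willingness-to-pay threshold such as the €30,000 cited in the review; a minimal sketch with hypothetical numbers:

```python
# Illustrative ICER check against a €30,000-per-QALY threshold.
# All costs and QALY figures below are hypothetical.

def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost per QALY gained: delta-cost / delta-effect."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical genetic screening programme vs. no screening
ratio = icer(cost_new=250_000, cost_old=100_000, qaly_new=110.0, qaly_old=100.0)
print(ratio)            # 15000.0 euros per QALY gained
print(ratio < 30_000)   # True: below the threshold, deemed cost-effective
```

Studies below the threshold are the ones the review describes as efficient for specific population groups and screening algorithms.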

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, P.Y.; Wassom, J.S.

    Scientific and technological developments bring unprecedented stress to our environment. Society has to predict the potential health risks of technologically based actions that may have serious, far-reaching consequences. The potential for error in making such predictions or assessments is great and multiplies with the increasing size and complexity of the problem being studied. Because of this, the availability and use of reliable data is the key to any successful forecasting effort. Scientific research and development generate new data and information, and much of the scientific data being produced daily is stored in computers for subsequent analysis. This situation provides both an invaluable resource and an enormous challenge. With large amounts of government funds being devoted to health and environmental research programs, and with maintenance of our living environment at stake, we must make maximum use of the resulting data to forecast and avert catastrophic effects. The most efficient means of obtaining the data necessary for assessing the health effects of chemicals is to use readily available resources; applications include the toxicology databases and information files developed at ORNL. To make the most efficient use of the data and information that has already been prepared, attention and resources should be directed toward projects that meticulously evaluate the available data and create specialized, peer-reviewed, value-added databases. Such projects include the National Library of Medicine's Hazardous Substances Data Bank and the U.S. Air Force Installation Restoration Toxicology Guide. These and similar value-added toxicology databases were developed at ORNL and are being maintained and updated. These databases and supporting information files, as well as some data evaluation techniques, are discussed in this paper, with special focus on how they are used to assess the potential health effects of environmental agents. 
19 refs., 5 tabs.

  4. Integration of environmental simulation models with satellite remote sensing and geographic information systems technologies: case studies

    USGS Publications Warehouse

    Steyaert, Louis T.; Loveland, Thomas R.; Brown, Jesslyn F.; Reed, Bradley C.

    1993-01-01

    Environmental modelers are testing and evaluating a prototype land cover characteristics database for the conterminous United States developed by the EROS Data Center of the U.S. Geological Survey and the University of Nebraska Center for Advanced Land Management Information Technologies. This database was developed from multitemporal, 1-kilometer advanced very high resolution radiometer (AVHRR) data for 1990 and various ancillary data sets such as elevation, ecological regions, and selected climatic normals. Several case studies using this database were analyzed to illustrate the integration of satellite remote sensing and geographic information systems technologies with land-atmosphere interaction models at a variety of spatial and temporal scales. The case studies are representative of contemporary environmental simulation modeling at local to regional levels in global change research, land and water resource management, and environmental risk assessment. The case studies feature land surface parameterizations for atmospheric mesoscale and global climate models; biogenic-hydrocarbon emissions models; distributed-parameter watershed and other hydrological models; and various ecological models such as ecosystem dynamics, biogeochemical cycle, ecotone variability, and equilibrium vegetation models. The case studies demonstrate the importance of multitemporal AVHRR data for developing and maintaining a flexible, near-realtime land cover characteristics database. Moreover, such a flexible database is needed to derive various vegetation classification schemes, to aggregate data for nested models, to develop remote sensing algorithms, and to provide data on dynamic landscape characteristics. The case studies illustrate how such a database supports research on spatial heterogeneity, land use, sensitivity analysis, and scaling issues involving regional extrapolations and parameterizations of dynamic land processes within simulation models.

  5. Alternative treatment technology information center computer database system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, D.

    1995-10-01

    The Alternative Treatment Technology Information Center (ATTIC) computer database system was developed pursuant to the 1986 Superfund law amendments. It provides up-to-date information on innovative treatment technologies to clean up hazardous waste sites. ATTIC v2.0 provides access to several independent databases as well as a mechanism for retrieving full-text documents of key literature. It can be accessed with a personal computer and modem 24 hours a day, and there are no user fees. ATTIC provides "one-stop shopping" for information on alternative treatment options by accessing several databases: (1) treatment technology database; this contains abstracts from the literature on all types of treatment technologies, including biological, chemical, physical, and thermal methods. The best literature as viewed by experts is highlighted. (2) treatability study database; this provides performance information on technologies to remove contaminants from wastewaters and soils. It is derived from treatability studies. This database is available through ATTIC or separately as a disk that can be mailed to you. (3) underground storage tank database; this presents information on underground storage tank corrective actions, surface spills, emergency response, and remedial actions. (4) oil/chemical spill database; this provides abstracts on treatment and disposal of spilled oil and chemicals. In addition to these separate databases, ATTIC allows immediate access to other disk-based systems such as the Vendor Information System for Innovative Treatment Technologies (VISITT) and the Bioremediation in the Field Search System (BFSS). The user may download these programs to their own PC via a high-speed modem. Also via modem, users are able to download entire documents through the ATTIC system. Currently, about fifty publications are available, including Superfund Innovative Technology Evaluation (SITE) program documents.

  6. Assessment methodologies and statistical issues for computer-aided diagnosis of lung nodules in computed tomography: contemporary research topics relevant to the lung image database consortium.

    PubMed

    Dodd, Lori E; Wagner, Robert F; Armato, Samuel G; McNitt-Gray, Michael F; Beiden, Sergey; Chan, Heang-Ping; Gur, David; McLennan, Geoffrey; Metz, Charles E; Petrick, Nicholas; Sahiner, Berkman; Sayre, Jim

    2004-04-01

    Cancer of the lung and bronchus is the leading fatal malignancy in the United States. Five-year survival is low, but treatment of early stage disease considerably improves chances of survival. Advances in multidetector-row computed tomography technology provide detection of smaller lung nodules and offer a potentially effective screening tool. The large number of images per exam, however, requires considerable radiologist time for interpretation and is an impediment to clinical throughput. Thus, computer-aided diagnosis (CAD) methods are needed to assist radiologists with their decision making. To promote the development of CAD methods, the National Cancer Institute formed the Lung Image Database Consortium (LIDC). The LIDC is charged with developing the consensus and standards necessary to create an image database of multidetector-row computed tomography lung images as a resource for CAD researchers. To develop such a prospective database, its potential uses must be anticipated. The ultimate applications will influence the information that must be included along with the images, the relevant measures of algorithm performance, and the number of required images. In this article we outline assessment methodologies and statistical issues as they relate to several potential uses of the LIDC database. We review methods for performance assessment and discuss issues of defining "truth" as well as the complications that arise when truth information is not available. We also discuss issues about sizing and populating a database.
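
    One building block of the performance assessment methods discussed above is ROC analysis; a minimal sketch of an empirical ROC area (AUC) computation, using hypothetical CAD scores rather than LIDC data:

```python
# Minimal sketch: the empirical ROC area (AUC) equals the probability that a
# randomly chosen positive case scores higher than a randomly chosen negative
# case (ties count half). Scores and labels below are hypothetical; the LIDC
# assessment methodology is far richer than this.

def empirical_auc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]  # hypothetical CAD nodule scores
labels = [1,   1,   0,   1,   0,   0]    # 1 = true nodule, 0 = non-nodule
print(empirical_auc(scores, labels))     # 8 of 9 pairs ranked correctly
```

Defining which cases count as positives is exactly the "truth" problem the article raises: without a reference standard, the labels themselves are uncertain.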

  7. Database security and encryption technology research and application

    NASA Astrophysics Data System (ADS)

    Zhu, Li-juan

    2013-03-01

    The main purpose of this paper is to discuss the current problem of database information leakage and the important role played by message encryption techniques in database security, as well as the principle of MD5 encryption technology and its use in websites and applications. The article comprises an introduction, an overview of MD5 encryption technology, the use of MD5 encryption technology, and a final summary. In terms of requirements and applications, the paper gives readers a more detailed and clear understanding of the principle of MD5 encryption technology, its importance in database security, and its use.
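
    The hashing idea described above can be illustrated with Python's standard hashlib module (a sketch of the hash-before-store pattern; note that MD5 is cryptographically broken, so modern systems favor salted, purpose-built password hashes):

```python
import hashlib

# Sketch of the hash-before-store idea: the database keeps only the digest,
# so a leaked table does not directly expose the plaintext. MD5 is shown
# because the paper uses it; it is cryptographically broken, and salted,
# slow hashes (bcrypt, scrypt, Argon2) are preferred in practice.

def md5_hex(plaintext: str) -> str:
    return hashlib.md5(plaintext.encode("utf-8")).hexdigest()

stored = md5_hex("s3cret")          # what goes into the database column
print(stored)                       # 32-character hex digest
print(md5_hex("s3cret") == stored)  # login check: re-hash and compare
```

Because MD5 is deterministic, the same input always yields the same digest, which is what makes the stored-hash comparison work.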

  8. WEB-GIS Decision Support System for CO2 storage

    NASA Astrophysics Data System (ADS)

    Gaitanaru, Dragos; Leonard, Anghel; Radu Gogu, Constantin; Le Guen, Yvi; Scradeanu, Daniel; Pagnejer, Mihaela

    2013-04-01

    The environmental decision support system (DSS) paradigm evolves and changes as more knowledge and technology become available to the environmental community. Geographic Information Systems (GIS) can be used to extract, assess and disseminate some types of information that are otherwise difficult to access by traditional methods. At the same time, with the help of the Internet and accompanying tools, creating and publishing online interactive maps has become easier and richer in options. The Decision Support System (MDSS) developed for the MUSTANG (A MUltiple Space and Time scale Approach for the quaNtification of deep saline formations for CO2 storaGe) project is a user-friendly web-based application that uses GIS capabilities. MDSS can be used by experts in CO2 injection and storage in deep saline aquifers. The main objective of the MDSS is to help experts make decisions based on large, structured collections of data and information. To achieve this objective, the MDSS has a geospatial object-oriented database structure for a wide variety of data and information. The entire application is based on several principles leading to a series of capabilities and specific characteristics: (i) open source: the entire platform (MDSS) is based on open-source technologies, including the (1) database engine, (2) application server, (3) geospatial server, (4) user interfaces, and (5) add-ons; (ii) multiple database connections: MDSS can connect to different databases located on different server machines; (iii) desktop user experience: the MDSS architecture and design follow the structure of desktop software; (iv) communication: the server side and the desktop are bound together by a series of functions that allow the user to upload, use, modify and download data within the application. The architecture of the system involves one database and a modular application composed of: (1) a visualization module, (2) an analysis module, (3) a guidelines module, and (4) a risk assessment module. The Database component is built using the PostgreSQL and PostGIS open-source technologies. The visualization module allows the user to view data from CO2 injection sites in different ways: (1) geospatial visualization, (2) table view, and (3) 3D visualization. The analysis module allows the user to perform analyses such as injectivity, containment and capacity analysis. The Risk Assessment module focuses on the site risk matrix approach. The Guidelines module contains guidelines on methodologies for CO2 injection and storage in deep saline aquifers.

  9. Skin Testing for Allergic Rhinitis: A Health Technology Assessment

    PubMed Central

    Kabali, Conrad; Chan, Brian; Higgins, Caroline; Holubowich, Corinne

    2016-01-01

    Background Allergic rhinitis is the most common type of allergy worldwide. The accuracy of skin testing for allergic rhinitis is still debated. This health technology assessment had two objectives: to determine the diagnostic accuracy of skin-prick and intradermal testing in patients with suspected allergic rhinitis and to estimate the costs to the Ontario health system of skin testing for allergic rhinitis. Methods We searched All Ovid MEDLINE, Embase, and Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, CRD Health Technology Assessment Database, Cochrane Central Register of Controlled Trials, and NHS Economic Evaluation Database for studies that evaluated the diagnostic accuracy of skin-prick and intradermal testing for allergic rhinitis using nasal provocation as the reference standard. For the clinical evidence review, data extraction and quality assessment were performed using the QUADAS-2 tool. We used the bivariate random-effects model for meta-analysis. For the economic evidence review, we assessed studies using a modified checklist developed by the (United Kingdom) National Institute for Health and Care Excellence. We estimated the annual cost of skin testing for allergic rhinitis in Ontario for 2015 to 2017 using provincial data on testing volumes and costs. Results We meta-analyzed seven studies with a total of 430 patients that assessed the accuracy of skin-prick testing. The pooled pair of sensitivity and specificity for skin-prick testing was 85% and 77%, respectively. We did not perform a meta-analysis for the diagnostic accuracy of intradermal testing due to the small number of studies (n = 4). Of these, two evaluated the accuracy of intradermal testing in confirming negative skin-prick testing results, with sensitivity ranging from 27% to 50% and specificity ranging from 60% to 100%. 
The other two studies evaluated the accuracy of intradermal testing as a stand-alone tool for diagnosing allergic rhinitis, with sensitivity ranging from 60% to 79% and specificity ranging from 68% to 69%. We estimated the budget impact of continuing to publicly fund skin testing for allergic rhinitis in Ontario to be between $2.5 million and $3.0 million per year. Conclusions Skin-prick testing is moderately accurate in identifying subjects with or without allergic rhinitis. The diagnostic accuracy of intradermal testing could not be well established from this review. Our best estimate is that publicly funding skin testing for allergic rhinitis costs the Ontario government approximately $2.5 million to $3.0 million per year. PMID:27279928
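
    The sensitivity and specificity figures above come from 2×2 tables against the nasal-provocation reference standard; a minimal sketch with hypothetical counts (the review's bivariate random-effects pooling is not reproduced here):

```python
# Sensitivity and specificity from a 2x2 diagnostic table against a reference
# standard. Counts below are hypothetical; the review pools studies with a
# bivariate random-effects model, which this sketch does not attempt.

def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)  # fraction of true positives detected
    specificity = tn / (tn + fp)  # fraction of true negatives cleared
    return sensitivity, specificity

# e.g. skin-prick testing vs. nasal provocation in one hypothetical study
sens, spec = sens_spec(tp=85, fn=15, tn=77, fp=23)
print(sens, spec)  # 0.85 0.77, mirroring the pooled estimates above
```

In a meta-analysis, each included study contributes one such table, and the pooled model accounts for between-study variation and the correlation between sensitivity and specificity.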

  10. Skin Testing for Allergic Rhinitis: A Health Technology Assessment.

    PubMed

    2016-01-01

    Allergic rhinitis is the most common type of allergy worldwide. The accuracy of skin testing for allergic rhinitis is still debated. This health technology assessment had two objectives: to determine the diagnostic accuracy of skin-prick and intradermal testing in patients with suspected allergic rhinitis and to estimate the costs to the Ontario health system of skin testing for allergic rhinitis. We searched All Ovid MEDLINE, Embase, and Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, CRD Health Technology Assessment Database, Cochrane Central Register of Controlled Trials, and NHS Economic Evaluation Database for studies that evaluated the diagnostic accuracy of skin-prick and intradermal testing for allergic rhinitis using nasal provocation as the reference standard. For the clinical evidence review, data extraction and quality assessment were performed using the QUADAS-2 tool. We used the bivariate random-effects model for meta-analysis. For the economic evidence review, we assessed studies using a modified checklist developed by the (United Kingdom) National Institute for Health and Care Excellence. We estimated the annual cost of skin testing for allergic rhinitis in Ontario for 2015 to 2017 using provincial data on testing volumes and costs. We meta-analyzed seven studies with a total of 430 patients that assessed the accuracy of skin-prick testing. The pooled pair of sensitivity and specificity for skin-prick testing was 85% and 77%, respectively. We did not perform a meta-analysis for the diagnostic accuracy of intradermal testing due to the small number of studies (n = 4). Of these, two evaluated the accuracy of intradermal testing in confirming negative skin-prick testing results, with sensitivity ranging from 27% to 50% and specificity ranging from 60% to 100%. 
The other two studies evaluated the accuracy of intradermal testing as a stand-alone tool for diagnosing allergic rhinitis, with sensitivity ranging from 60% to 79% and specificity ranging from 68% to 69%. We estimated the budget impact of continuing to publicly fund skin testing for allergic rhinitis in Ontario to be between $2.5 million and $3.0 million per year. Skin-prick testing is moderately accurate in identifying subjects with or without allergic rhinitis. The diagnostic accuracy of intradermal testing could not be well established from this review. Our best estimate is that publicly funding skin testing for allergic rhinitis costs the Ontario government approximately $2.5 million to $3.0 million per year.

  11. Applications of Technology to CAS Data-Base Production.

    ERIC Educational Resources Information Center

    Weisgerber, David W.

    1984-01-01

    Reviews the economic importance of applying computer technology to Chemical Abstracts Service database production from 1973 to 1983. Database building, technological applications for editorial processing (online editing, Author Index Manufacturing System), and benefits (increased staff productivity, reduced rate of increase of cost of services,…

  12. Early economic evaluation of emerging health technologies: protocol of a systematic review

    PubMed Central

    2014-01-01

    Background: The concept of early health technology assessment, discussed for well over a decade, has now been collaboratively implemented by industry, government, and academia to select and expedite the development of emerging technologies that may address the needs of patients and health systems. Early economic evaluation is essential to assess the value of emerging technologies, but empirical data to inform the current practice of early evaluation are limited. We propose a systematic review of early economic evaluation studies in order to better understand the current practice. Methods/design: This protocol describes a systematic review of economic evaluation studies of regulated health technologies in which the evaluation is conducted prior to regulatory approval and when the technology effectiveness is not well established. Included studies must report an economic evaluation, defined as the comparative analysis of alternatives with respect to their associated costs and health consequences, and must evaluate some regulated health technology such as pharmaceuticals, biologics, high-risk medical devices, or biomarkers. We will conduct the literature search on multiple databases, including MEDLINE, EMBASE, the Centre for Reviews and Dissemination Databases, and EconLit. Additional citations will be identified via scanning reference lists and author searching. We suspect that many early economic evaluation studies are unpublished, especially those conducted for internal use only. Additionally, we use a chain-referral sampling approach to identify authors of unpublished studies who work in technology discovery and development, starting out with our contact lists and authors who published relevant studies. Citation screening and full-text review will be conducted by pairs of reviewers. Abstracted data will include those related to the decision context and decision problem of the early evaluation, evaluation methods (e.g., data sources, methods, and assumptions used to identify, measure, and value the likely effectiveness and the costs and consequences of the new technology, handling of uncertainty), and whether the study results adequately address the main study question or objective. Data will be summarized overall and stratified by publication status. Discussion: This study is timely to inform early economic evaluation practice, given the international trend in early health technology assessment initiatives. PMID:25055987

  13. Early economic evaluation of emerging health technologies: protocol of a systematic review.

    PubMed

    Pham, Ba'; Tu, Hong Anh Thi; Han, Dolly; Pechlivanoglou, Petros; Miller, Fiona; Rac, Valeria; Chin, Warren; Tricco, Andrea C; Paulden, Mike; Bielecki, Joanna; Krahn, Murray

    2014-07-23

    The concept of early health technology assessment, discussed for well over a decade, has now been collaboratively implemented by industry, government, and academia to select and expedite the development of emerging technologies that may address the needs of patients and health systems. Early economic evaluation is essential to assess the value of emerging technologies, but empirical data to inform the current practice of early evaluation are limited. We propose a systematic review of early economic evaluation studies in order to better understand the current practice. This protocol describes a systematic review of economic evaluation studies of regulated health technologies in which the evaluation is conducted prior to regulatory approval and when the technology effectiveness is not well established. Included studies must report an economic evaluation, defined as the comparative analysis of alternatives with respect to their associated costs and health consequences, and must evaluate some regulated health technology such as pharmaceuticals, biologics, high-risk medical devices, or biomarkers. We will conduct the literature search on multiple databases, including MEDLINE, EMBASE, the Centre for Reviews and Dissemination Databases, and EconLit. Additional citations will be identified via scanning reference lists and author searching. We suspect that many early economic evaluation studies are unpublished, especially those conducted for internal use only. Additionally, we use a chain-referral sampling approach to identify authors of unpublished studies who work in technology discovery and development, starting out with our contact lists and authors who published relevant studies. Citation screening and full-text review will be conducted by pairs of reviewers. 
Abstracted data will include those related to the decision context and decision problem of the early evaluation, evaluation methods (e.g., data sources, methods, and assumptions used to identify, measure, and value the likely effectiveness and the costs and consequences of the new technology, handling of uncertainty), and whether the study results adequately address the main study question or objective. Data will be summarized overall and stratified by publication status. This study is timely to inform early economic evaluation practice, given the international trend in early health technology assessment initiatives.

  14. ENVIRONMENTAL IMPACT ASSESSMENT OF A HEALTH TECHNOLOGY: A SCOPING REVIEW.

    PubMed

    Polisena, Julie; De Angelis, Gino; Kaunelis, David; Gutierrez-Ibarluzea, Iñaki

    2018-06-13

    The Health Technology Expert Review Panel is an advisory body to the Canadian Agency for Drugs and Technologies in Health (CADTH) that develops recommendations on health technology assessments (HTAs) for nondrug health technologies using a deliberative framework. The framework spans several domains, including the environmental impact of the health technology(ies). Our research objective was to identify articles on frameworks, methods, or case studies on the environmental impact assessment of health technologies. A literature search in major databases and a focused gray literature search were conducted. The main search concepts were HTA and environmental impact/sustainability. Eligible articles were those that described a conceptual framework or methods used to conduct an environmental assessment of health technologies, or case studies on the application of an environmental assessment. From the 1,710 citations identified, thirteen publications were included. Two articles presented a framework to incorporate environmental assessment in HTAs. Other approaches described weight-of-evidence practices and comprehensive, integrated environmental impact assessments. Central themes derived include transparency and repeatability, integration of components in a framework or of evidence into a single outcome, data availability to ensure the accuracy of findings, and familiarity with the approach used. Each framework and method presented has a different focus related to the ecosystem, health economics, or engineering practices. Their descriptions suggested transparency, repeatability, and the integration of components or of evidence into a single outcome as their main strengths. Our review is an initial step in a larger initiative by CADTH to develop the methods and processes to address the environmental impact question in an HTA.

  15. U.S. states and territories national tsunami hazard assessment, historic record and sources for waves

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Weaver, C.

    2007-12-01

In 2005, the U.S. National Science and Technology Council (NSTC) released a joint report by the sub-committee on Disaster Reduction and the U.S. Group on Earth Observations titled Tsunami Risk Reduction for the United States: A Framework for Action (Framework). The Framework outlines the President's strategy for reducing the United States tsunami risk. The first specific action called for in the Framework is to "Develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories." Since NOAA is the lead agency for providing tsunami forecasts and warnings and NOAA's National Geophysical Data Center (NGDC) catalogs information on global historic tsunamis, NOAA/NGDC was asked to take the lead in conducting the first national tsunami hazard assessment. Earthquakes or earthquake-generated landslides caused more than 85% of the tsunamis in the NGDC tsunami database. Since the United States Geological Survey (USGS) conducts research on earthquake hazards facing all of the United States and its territories, NGDC and USGS partnered to conduct the first tsunami hazard assessment for the United States and its territories. A complete tsunami hazard and risk assessment consists of a hazard assessment, an exposure and vulnerability assessment of buildings and people, and a loss assessment. This report is an interim step toward a tsunami risk assessment. The goal of this report is to provide a qualitative assessment of the United States tsunami hazard at the national level. Two different methods are used to assess the U.S. tsunami hazard. The first method involves a careful examination of the NGDC historical tsunami database. This resulted in a qualitative national tsunami hazard assessment based on the distribution of runup heights and the frequency of runups.
Although tsunami deaths are a measure of risk rather than hazard, the known tsunami deaths found in the NGDC database search were compared with the qualitative assessments based on frequency and amplitude. The second method to assess tsunami hazard involved using the USGS earthquake databases to search for possible earthquake sources near American coastlines to extend the NOAA/NGDC tsunami databases backward in time. The qualitative tsunami hazard assessment based on the results of the NGDC and USGS database searches will be presented.
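    The first method described above (ranking regions qualitatively by the distribution and frequency of historical runups) can be sketched in a few lines. The thresholds and labels below are illustrative assumptions for the sake of the sketch, not NGDC's actual criteria:

    ```python
    # Toy qualitative hazard ranking from a region's historical runup record.
    # Thresholds and labels are invented for illustration.

    def qualitative_hazard(runups_m, years_of_record):
        """runups_m: historical runup heights (m) for one coastal region."""
        if not runups_m:
            return "very low"
        events_per_century = 100.0 * len(runups_m) / years_of_record
        max_runup = max(runups_m)
        if max_runup >= 10 or events_per_century >= 10:
            return "high"
        if max_runup >= 1 or events_per_century >= 1:
            return "moderate"
        return "low"

    print(qualitative_hazard([0.3, 12.0, 2.5], 200))  # large runup observed -> "high"
    print(qualitative_hazard([0.2], 200))             # rare, small runup -> "low"
    ```

    A real assessment would combine both amplitude and frequency distributions over the full database rather than two simple cutoffs, but the structure of the ranking is the same.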

  16. Mobile Health Technology in Late-Life Mental Illness: A Focused Literature Review.

    PubMed

    Moussa, Yara; Mahdanian, Artin A; Yu, Ching; Segal, Marilyn; Looper, Karl J; Vahia, Ipsit V; Rej, Soham

    2017-08-01

In an era of rising geriatric mental health care needs worldwide, technological advances can help address care needs in a cost-effective fashion. Our objective in this review was to assess whether mobile health technologies, such as tablets and smartphones, are feasible to use in patients with late-life mental and cognitive disorders, as well as whether they are generally reliable modes of mental health/cognitive assessment. We performed a focused literature review of the MEDLINE, PsychInfo, and Embase databases, including papers specifically assessing the implementation of mobile health technologies: electronic tablets (e.g., iPad), smartphones, and other mobile computerized equipment in older adults (age ≥65 years) diagnosed with or at risk of a mental and/or cognitive disorder. A total of 2,079 records were assessed, of which 7 papers were of direct relevance. Studies investigated a broad variety of mobile health technologies. Almost all examined samples with dementia/cognitive dysfunction or at risk for those disorders. All studies exclusively examined the use of mobile health technologies for the assessment of cognitive and/or mental illness symptoms or disorders. None of the studies reported participants having any difficulties using the mobile health technology assessments, and overall reliability was similar to paper-and-pencil modes of assessment. Overall, mobile health technologies were found to be feasible for patients and had promising reliability for the assessment of cognitive and mental illness domains in older adults. Future clinical trials will be necessary to assess whether portable communication interventions (e.g., symptom tracking) can improve geriatric mental health outcomes. Copyright © 2017 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  17. Potentials of Advanced Database Technology for Military Information Systems

    DTIC Science & Technology

    2001-04-01

UNCLASSIFIED Defense Technical Information Center Compilation Part Notice ADP010866. TITLE: Potentials of Advanced Database Technology for Military Information Systems. Sunil Choenni (National Aerospace Laboratory, NLR, P.O. Box 90502, 1006 BM Amsterdam) and Ben Bruggeman. The excerpt discusses the application of advanced information technology, including database technology, as an underpinning for military information systems.

  18. Implications of Multilingual Interoperability of Speech Technology for Military Use (Les implications de l’interoperabilite multilingue des technologies vocales pour applications militaires)

    DTIC Science & Technology

    2004-09-01

Contents: 2.3 Databases; 2.3.1 Translanguage English Database; 2.3.2 Australian National Database of Spoken Language; 2.3.3 Strange Corpus; 2.3.4 ... The report covers databases of some relevance to speech technology research. 2.3.1 Translanguage English Database: In a daring plan, Joseph Mariani, then at LIMSI-CNRS, proposed to ... native speakers. The database is known as the 'Translanguage English Database' but is often referred to as the 'terrible English database.' About 28 ...

  19. The US Geological Survey's national coal resource assessment: The results

    USGS Publications Warehouse

    Ruppert, Leslie F.; Kirschbaum, Mark A.; Warwick, Peter D.; Flores, Romeo M.; Affolter, Ronald H.; Hatch, Joseph R.

    2002-01-01

    The US Geological Survey and the State geological surveys of many coal-bearing States recently completed a new assessment of the top producing coal beds and coal zones in five major producing coal regions—the Appalachian Basin, Gulf Coast, Illinois Basin, Colorado Plateau, and Northern Rocky Mountains and Great Plains. The assessments, which focused on both coal quality and quantity, utilized geographic information system technology and large databases. Over 1,600,000 million short tons of coal remain in over 60 coal beds and coal zones that were assessed. Given current economic, environmental, and technological restrictions, the majority of US coal production will occur in that portion of the assessed coal resource that is lowest in sulfur content. These resources are concentrated in parts of the central Appalachian Basin, Colorado Plateau, and the Northern Rocky Mountains.

  20. Consortial IT Services: Collaborating To Reduce the Pain.

    ERIC Educational Resources Information Center

    Klonoski, Ed

    The Connecticut Distance Learning Consortium (CTDLC) provides its 32 members with Information Technologies (IT) services including a portal Web site, course management software, course hosting and development, faculty training, a help desk, online assessment, and a student financial aid database. These services are supplied to two- and four-year…

  1. Analyzing critical material demand: A revised approach.

    PubMed

    Nguyen, Ruby Thuy; Fishman, Tomer; Zhao, Fu; Imholte, D D; Graedel, T E

    2018-07-15

    Apparent consumption has been widely used as a metric to estimate material demand. However, with technology advancement and complexity of material use, this metric has become less useful in tracking material flows, estimating recycling feedstocks, and conducting life cycle assessment of critical materials. We call for future research efforts to focus on building a multi-tiered consumption database for the global trade network of critical materials. This approach will help track how raw materials are processed into major components (e.g., motor assemblies) and eventually incorporated into complete pieces of equipment (e.g., wind turbines). Foreseeable challenges would involve: 1) difficulty in obtaining a comprehensive picture of trade partners due to business sensitive information, 2) complexity of materials going into components of a machine, and 3) difficulty maintaining such a database. We propose ways to address these challenges such as making use of digital design, learning from the experience of building similar databases, and developing a strategy for financial sustainability. We recommend that, with the advancement of information technology, small steps toward building such a database will contribute significantly to our understanding of material flows in society and the associated human impacts on the environment. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Military, Charter, Unreported Domestic Traffic and General Aviation 1976, 1984, 1992, and 2015 Emission Scenarios

    NASA Technical Reports Server (NTRS)

    Mortlock, Alan; VanAlstyne, Richard

    1998-01-01

The report describes the development of databases estimating aircraft engine exhaust emissions for the years 1976 and 1984 from global operations of Military, Charter, historic Soviet and Chinese, Unreported Domestic traffic, and General Aviation (GA). These databases were developed under the National Aeronautics and Space Administration's (NASA) Advanced Subsonic Assessment (AST). McDonnell Douglas Corporation (MDC), now part of the Boeing Company, previously estimated engine exhaust emissions databases for the baseline year of 1992 and a 2015 forecast year scenario. Since their original creation (Ward, 1994 and Metwally, 1995), revised technology algorithms have been developed. Additionally, GA databases have been created and all past MDC emission inventories have been updated to reflect the new technology algorithms. Revised data (Baughcum, 1996 and Baughcum, 1997) for the scheduled inventories have been used in this report to provide a comparison of the total aviation emission forecasts from various components. Global results of two historic years (1976 and 1984), a baseline year (1992), and a forecast year (2015) are presented. Since engine emissions are directly related to fuel usage, an overview of individual aviation annual global fuel use for each inventory component is also given in this report.

  3. Laptop Computer - Based Facial Recognition System Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. A. Cain; G. B. Singleton

    2001-03-01

The objective of this project was to assess the performance of the leading commercial-off-the-shelf (COTS) facial recognition software package when used as a laptop application. We performed the assessment to determine the system's usefulness for enrolling facial images in a database from remote locations and conducting real-time searches against a database of previously enrolled images. The assessment involved creating a database of 40 images and conducting 2 series of tests to determine the product's ability to recognize and match subject faces under varying conditions. This report describes the test results and includes a description of the factors affecting the results. After an extensive market survey and a review of the Facial Recognition Vendor Test 2000 (FRVT 2000), we selected Visionics' FaceIt® software package for evaluation. The FRVT 2000 was co-sponsored by the US Department of Defense (DOD) Counterdrug Technology Development Program Office, the National Institute of Justice, and the Defense Advanced Research Projects Agency (DARPA). Administered in May-June 2000, the FRVT 2000 assessed the capabilities of facial recognition systems that were then available for purchase on the US market. Our selection of this Visionics product does not indicate that it is the 'best' facial recognition software package for all uses; it was the most appropriate package for the specific applications and requirements of this project. In this assessment, the system configuration was evaluated for effectiveness in identifying individuals by searching for facial images captured from video displays against those stored in a facial image database. An additional criterion was that the system be capable of operating discreetly.
For this application, an operational facial recognition system would consist of one central computer hosting the master image database, with multiple standalone systems configured with duplicates of the master operating in remote locations. Remote users could perform real-time searches where network connectivity is not available. As images are enrolled at the remote locations, periodic database synchronization is necessary.

  4. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.
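    The idea of a technology database keyed to timeframes can be illustrated with a toy lookup. The parameter name, years, and values below are invented for illustration and are not actual TTB entries:

    ```python
    # Toy technology database keyed by (parameter, timeframe year).
    # All entries are hypothetical placeholders, not real TTB data.
    ttb = {
        ("solar_array_specific_power_W_per_kg", 2005): 60,
        ("solar_array_specific_power_W_per_kg", 2015): 120,
    }

    def lookup(param, year):
        """Return the value from the latest timeframe at or before `year`."""
        years = sorted(y for (p, y) in ttb if p == param and y <= year)
        if not years:
            raise KeyError(f"no data for {param} at or before {year}")
        return ttb[(param, years[-1])]

    print(lookup("solar_array_specific_power_W_per_kg", 2010))  # falls back to the 2005 entry
    ```

    The point of such keying is that every system model draws the same number for the same technology and timeframe, so architecture comparisons stay consistent.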

  5. 77 FR 66617 - HIT Policy and Standards Committees; Workgroup Application Database

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... Database AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of New ONC HIT FACA Workgroup Application Database. The Office of the National Coordinator (ONC) has launched a new Health Information Technology Federal Advisory Committee Workgroup Application Database...

  6. What Adherence Measures Should Be Used in Trials of Home-Based Rehabilitation Interventions? A Systematic Review of the Validity, Reliability, and Acceptability of Measures.

    PubMed

    Frost, Rachael; Levati, Sara; McClurg, Doreen; Brady, Marian; Williams, Brian

    2017-06-01

    To systematically review methods for measuring adherence used in home-based rehabilitation trials and to evaluate their validity, reliability, and acceptability. In phase 1 we searched the CENTRAL database, NHS Economic Evaluation Database, and Health Technology Assessment Database (January 2000 to April 2013) to identify adherence measures used in randomized controlled trials of allied health professional home-based rehabilitation interventions. In phase 2 we searched the databases of MEDLINE, Embase, CINAHL, Allied and Complementary Medicine Database, PsycINFO, CENTRAL, ProQuest Nursing and Allied Health, and Web of Science (inception to April 2015) for measurement property assessments for each measure. Studies assessing the validity, reliability, or acceptability of adherence measures. Two reviewers independently extracted data on participant and measure characteristics, measurement properties evaluated, evaluation methods, and outcome statistics and assessed study quality using the COnsensus-based Standards for the selection of health Measurement INstruments checklist. In phase 1 we included 8 adherence measures (56 trials). In phase 2, from the 222 measurement property assessments identified in 109 studies, 22 high-quality measurement property assessments were narratively synthesized. Low-quality studies were used as supporting data. StepWatch Activity Monitor validly and acceptably measured short-term step count adherence. The Problematic Experiences of Therapy Scale validly and reliably assessed adherence to vestibular rehabilitation exercises. Adherence diaries had moderately high validity and acceptability across limited populations. The Borg 6 to 20 scale, Bassett and Prapavessis scale, and Yamax CW series had insufficient validity. Low-quality evidence supported use of the Joint Protection Behaviour Assessment. Polar A1 series heart monitors were considered acceptable by 1 study. Current rehabilitation adherence measures are limited. 
Some possess promising validity and acceptability for certain parameters of adherence, situations, and populations and should be used in these situations. Rigorous evaluation of adherence measures in a broader range of populations is needed. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  7. Cross-cultural adaptation of the assistive technology device - Predisposition assessment (ATD PA) for use in Brazil (ATD PA Br).

    PubMed

    Alves, Ana Cristina de Jesus; Matsukura, Thelma Simões; Scherer, Marcia J

    2017-02-01

The purpose of this study was to conduct a cross-cultural adaptation of the Assistive Technology Device Predisposition Assessment (ATD PA) for use in Brazil. The selection of the ATD PA was determined by previous literature reviews of articles published in 2014 and 2016 in six databases with the terms "assistive device" or "assistive technology" or "self-help device" combined with "evidence-based practice" or "framework" or "measurement scale" or "model and outcome assessment". This review indicated that the conceptual model of Assistive Technology (AT) most discussed in the literature was the Matching Person and Technology (MPT) model, and this finding determined the selection of the ATD PA as an assessment within the MPT portfolio of measures. The procedures for cross-cultural adaptation were as follows: Equivalence of Concept, Semantic and Operational. Five experts were asked to translate 725 items; these translations were evaluated, and a high level of agreement was demonstrated. The Portuguese version, Avaliação de Tecnologia Assistiva - Predisposição ao Uso - ATD PA Br, was derived from the original English version (ATD PA). The ATD PA Br will support professionals and people with disabilities in Brazil in better selecting AT devices according to clients' needs. Implications for rehabilitation: Provides a systematic way of selecting assistive technology devices for individuals with disabilities according to the Brazilian reality. A systematic way of selecting assistive technology can help decrease abandonment of assistive technology use. The use of the Matching Person and Technology theoretical model and of the ATD PA Br assessment is essential to guide research and clinical practice in Brazil.

  8. Market Assessment of Biomass Gasification and Combustion Technology for Small- and Medium-Scale Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, D.; Haase, S.

    2009-07-01

This report provides a market assessment of gasification and direct combustion technologies that use wood and agricultural resources to generate heat, power, or combined heat and power (CHP) for small- to medium-scale applications. It contains a brief overview of wood and agricultural resources in the U.S.; a description and discussion of gasification and combustion conversion technologies that utilize solid biomass to generate heat, power, and CHP; an assessment of the commercial status of gasification and combustion technologies; a summary of gasification and combustion system economics; a discussion of the market potential for small- to medium-scale gasification and combustion systems; and an inventory of direct combustion system suppliers and gasification technology companies. The report indicates that while direct combustion and close-coupled gasification boiler systems used to generate heat, power, or CHP are commercially available from a number of manufacturers, two-stage gasification systems are largely in development, with a number of technologies currently in demonstration. The report also cites the need for a searchable, comprehensive database of operating combustion and gasification systems that generate heat, power, or CHP built in the U.S., as well as a national assessment of the market potential for the systems.

  9. Surrogate outcomes in health technology assessment: an international comparison.

    PubMed

    Velasco Garrido, Marcial; Mangiapane, Sandra

    2009-07-01

Our aim was to review the recommendations given by health technology assessment (HTA) institutions in their methodological guidelines concerning the use of surrogate outcomes in their assessments. In a second step, we aimed to quantify the role surrogate parameters play in assessment reports. We analyzed methodological papers and guidelines from HTA agencies with International Network of Agencies for Health Technology Assessment membership, as well as from institutions related to pharmaceutical regulation (i.e., reimbursement, pricing). We analyzed the use of surrogate outcomes in a sample of HTA reports randomly drawn from the HTA database. We checked methods, results (including evidence tables), and conclusions sections and extracted the outcomes reported. We report descriptive statistics on the presence of surrogate outcomes in the reports. We identified thirty-four methodological guidelines, twenty of them addressing the choice of outcome parameters and the problem of surrogate outcomes. Overall, HTA agencies urge caution regarding reliance on surrogate outcomes. None of the agencies has provided a list or catalog of acceptable and validated surrogate outcomes. We extracted the outcome parameters of 140 HTA reports. Only around half of the reports determined the outcomes for the assessment prospectively. Surrogate outcomes had been used in 62 percent of the reports. However, only 3.6 percent were based upon surrogate outcomes exclusively. All of these assessed diagnostic or screening technologies, and the surrogate outcomes were predominantly test characteristics. HTA institutions seem to agree on a cautious approach to the use of surrogate outcomes in technology assessment. Thorough assessment of health technologies should not rely exclusively on surrogate outcomes.

  10. Usefulness of Canadian Public Health Insurance Administrative Databases to Assess Breast and Ovarian Cancer Screening Imaging Technologies for BRCA1/2 Mutation Carriers.

    PubMed

    Larouche, Geneviève; Chiquette, Jocelyne; Plante, Marie; Pelletier, Sylvie; Simard, Jacques; Dorval, Michel

    2016-11-01

In Canada, recommendations for clinical management of hereditary breast and ovarian cancer among individuals carrying a deleterious BRCA1 or BRCA2 mutation have been available since 2007. Eight years later, very little is known about the uptake of screening and risk-reduction measures in this population. Because Canada's public health care system falls under provincial jurisdictions, using provincial health care administrative databases appears to be a valuable option for assessing management of BRCA1/2 mutation carriers. The objective was to explore the usefulness of public health insurance administrative databases in British Columbia, Ontario, and Quebec to assess management after BRCA1/2 genetic testing. Official public health insurance documents were considered potentially useful if they had specific procedure codes and pertained to procedures performed in the public and private health care systems. All 3 administrative databases have specific procedure codes for mammography and breast ultrasounds. Only Quebec and Ontario have a specific procedure code for breast magnetic resonance imaging. It is impossible to assess, on an individual basis, the frequency of other screening exams, with the exception of CA-125 testing in British Columbia. Screenings done in private practice are excluded from the administrative databases unless covered by special agreements for reimbursement, such as all breast imaging exams in Ontario and mammograms in British Columbia and Quebec. There are no specific procedure codes for risk-reduction surgeries for breast and ovarian cancer. Population-based assessment of breast and ovarian cancer risk management strategies other than mammographic screening, using only administrative data, is currently challenging in the 3 Canadian provinces studied. Copyright © 2016 Canadian Association of Radiologists. Published by Elsevier Inc. All rights reserved.

  11. Disbiome database: linking the microbiome to disease.

    PubMed

    Janssens, Yorick; Nielandt, Joachim; Bronselaer, Antoon; Debunne, Nathan; Verbeke, Frederick; Wynendaele, Evelien; Van Immerseel, Filip; Vandewynckel, Yves-Paul; De Tré, Guy; De Spiegeleer, Bart

    2018-06-04

Recent research has provided fascinating indications and evidence that host health is linked to its microbial inhabitants. Due to the development of high-throughput sequencing technologies, more and more data covering microbial composition changes in different disease types are emerging. However, this information is dispersed over a wide variety of medical and biomedical disciplines. Disbiome is a database that collects and presents published microbiota-disease information in a standardized way. The diseases are classified using the MedDRA classification system and the micro-organisms are linked to their NCBI and SILVA taxonomies. Finally, each study included in the Disbiome database is assessed for its reporting quality using a standardized questionnaire. Disbiome is the first database giving a clear, concise and up-to-date overview of microbial composition differences in diseases, together with the relevant information from the studies published. The strength of this database lies in the combination of references to other databases, which enables both specific and diverse search strategies within Disbiome, and human annotation, which ensures a simple and structured presentation of the available data.

  12. Accuracy of LightCycler(R) SeptiFast for the detection and identification of pathogens in the blood of patients with suspected sepsis: a systematic review protocol.

    PubMed

    Dark, Paul; Wilson, Claire; Blackwood, Bronagh; McAuley, Danny F; Perkins, Gavin D; McMullan, Ronan; Gates, Simon; Warhurst, Geoffrey

    2012-01-01

Background: There is growing interest in the potential utility of molecular diagnostics in improving the detection of life-threatening infection (sepsis). LightCycler® SeptiFast is a multipathogen probe-based real-time PCR system targeting DNA sequences of bacteria and fungi present in blood samples, with results available within a few hours. We report here the protocol of the first systematic review of published clinical diagnostic accuracy studies of this technology when compared with blood culture in the setting of suspected sepsis. Methods/design: Data sources: the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA), the NHS Economic Evaluation Database (NHSEED), The Cochrane Library, MEDLINE, EMBASE, ISI Web of Science, BIOSIS Previews, MEDION and the Aggressive Research Intelligence Facility Database (ARIF). Eligible studies: diagnostic accuracy studies that compare the real-time PCR technology with standard culture results performed on a patient's blood sample during the management of sepsis. Data extraction: three reviewers, working independently, will determine the level of evidence, methodological quality and a standard data set relating to demographics and diagnostic accuracy metrics for each study. Statistical analysis/data synthesis: heterogeneity of studies will be investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in Receiver Operator Characteristic (ROC) space. The bivariate model method will be used to estimate summary sensitivity and specificity. The authors will investigate reporting biases using funnel plots based on effective sample size and regression tests of asymmetry. Subgroup analyses are planned for adults, children and infection setting (hospital vs community) if sufficient data are uncovered.
Dissemination: Recommendations will be made to the Department of Health (as part of an open-access HTA report) as to whether the real-time PCR technology has sufficient clinical diagnostic accuracy potential to move forward to efficacy testing during the provision of routine clinical care. Registration: PROSPERO-NIHR Prospective Register of Systematic Reviews (CRD42011001289).
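    The per-study accuracy metrics such a review pools are computed from each study's 2x2 table against blood culture as the reference standard. A minimal sketch; the counts below are invented for illustration, not data from any SeptiFast study:

    ```python
    # Sensitivity and specificity from a diagnostic-accuracy 2x2 table,
    # with blood culture as the reference standard. Counts are hypothetical.

    def sens_spec(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)  # detected fraction of culture-positives
        specificity = tn / (tn + fp)  # correctly negative fraction of culture-negatives
        return sensitivity, specificity

    # hypothetical study: 40 culture-positives (32 detected by PCR),
    # 160 culture-negatives (144 PCR-negative)
    se, sp = sens_spec(tp=32, fp=16, fn=8, tn=144)
    print(f"sensitivity={se:.2f}, specificity={sp:.2f}")  # sensitivity=0.80, specificity=0.90
    ```

    The bivariate model named in the protocol then pools these paired (sensitivity, specificity) estimates across studies while modeling their correlation; that pooling step requires a statistics package and is beyond this sketch.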

  13. Quantifying Data Quality for Clinical Trials Using Electronic Data Capture

    PubMed Central

    Nahm, Meredith L.; Pieper, Carl F.; Cunningham, Maureen M.

    2008-01-01

    Background Historically, only partial assessments of data quality have been performed in clinical trials, for which the most common method of measuring database error rates has been to compare the case report form (CRF) to database entries and count discrepancies. Importantly, errors arising from medical record abstraction and transcription are rarely evaluated as part of such quality assessments. Electronic Data Capture (EDC) technology has had a further impact, as paper CRFs typically leveraged for quality measurement are not used in EDC processes. Methods and Principal Findings The National Institute on Drug Abuse Treatment Clinical Trials Network has developed, implemented, and evaluated methodology for holistically assessing data quality on EDC trials. We characterize the average source-to-database error rate (14.3 errors per 10,000 fields) for the first year of use of the new evaluation method. This error rate was significantly lower than the average of published error rates for source-to-database audits, and was similar to CRF-to-database error rates reported in the published literature. We attribute this largely to an absence of medical record abstraction on the trials we examined, and to an outpatient setting characterized by less acute patient conditions. Conclusions Historically, medical record abstraction is the most significant source of error by an order of magnitude, and should be measured and managed during the course of clinical trials. Source-to-database error rates are highly dependent on the amount of structured data collection in the clinical setting and on the complexity of the medical record, dependencies that should be considered when developing data quality benchmarks. PMID:18725958
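    The error-rate metric reported above is a straightforward ratio: errors found in an audit divided by fields inspected, scaled to errors per 10,000 fields. The audit counts below are made up for illustration:

    ```python
    # Data-quality error rate, expressed per 10,000 fields as in the study above.
    # The audit counts are hypothetical.

    def errors_per_10k(n_errors, n_fields_audited):
        return 10_000 * n_errors / n_fields_audited

    rate = errors_per_10k(n_errors=43, n_fields_audited=30_000)
    print(round(rate, 1))  # 14.3, the same scale as the source-to-database rate quoted above
    ```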

  14. AN INTEGRATED ASSESSMENT OF THE IMPACTS OF HYDROGEN ECONOMY ON TRANSPORTATION, ENERGY USE, AND AIR EMISSIONS

    EPA Science Inventory

    This paper presents an analysis of the potential energy, economic and environmental implications of hydrogen fuel cell vehicle (H2-FCV) penetration into the U.S. light duty vehicle fleet. The approach, which uses the U.S. EPA MARKet ALlocation technology database and model, allow...

  15. Duplicates, redundancies and inconsistencies in the primary nucleotide databases: a descriptive study.

    PubMed

    Chen, Qingyu; Zobel, Justin; Verspoor, Karin

    2017-01-01

    GenBank, the EMBL European Nucleotide Archive and the DNA DataBank of Japan, known collectively as the International Nucleotide Sequence Database Collaboration or INSDC, are the three most significant nucleotide sequence databases. Their records are derived from laboratory work undertaken by different individuals, by different teams, with a range of technologies and assumptions and over a period of decades. As a consequence, they contain a great many duplicates, redundancies and inconsistencies, but neither the prevalence nor the characteristics of various types of duplicates have been rigorously assessed. Existing duplicate detection methods in bioinformatics only address specific duplicate types, with inconsistent assumptions; and the impact of duplicates in bioinformatics databases has not been carefully assessed, making it difficult to judge the value of such methods. Our goal is to assess the scale, kinds and impact of duplicates in bioinformatics databases, through a retrospective analysis of merged groups in INSDC databases. Our outcomes are threefold: (1) We analyse a benchmark dataset consisting of duplicates manually identified in INSDC-a dataset of 67 888 merged groups with 111 823 duplicate pairs across 21 organisms from INSDC databases - in terms of the prevalence, types and impacts of duplicates. (2) We categorize duplicates at both sequence and annotation level, with supporting quantitative statistics, showing that different organisms have different prevalence of distinct kinds of duplicate. (3) We show that the presence of duplicates has practical impact via a simple case study on duplicates, in terms of GC content and melting temperature. We demonstrate that duplicates not only introduce redundancy, but can lead to inconsistent results for certain tasks. 
Our findings lead to a better understanding of the problem of duplication in biological databases. Database URL: the merged records are available at https://cloudstor.aarnet.edu.au/plus/index.php/s/Xef2fvsebBEAv9w. © The Author(s) 2017. Published by Oxford University Press.
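
    The GC-content and melting-temperature case study can be sketched in a few lines of Python. The two "duplicate" sequences below are invented placeholders, not INSDC records, and the Wallace rule is only one of several common Tm estimates; this is an illustration of how a single-base difference between duplicate records shifts both metrics, not the authors' pipeline.

```python
def gc_content(seq: str) -> float:
    """Fraction of G and C bases in a nucleotide sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def melting_temp_wallace(seq: str) -> float:
    """Wallace-rule Tm estimate for short oligos:
    Tm = 2*(A+T) + 4*(G+C), in degrees Celsius."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

# Two hypothetical near-duplicate records of the same gene:
rec_a = "ATGGCGTACGTTAGC"
rec_b = "ATGGCGTACGTTAGA"  # one substitution at the 3' end

print(gc_content(rec_a), gc_content(rec_b))
print(melting_temp_wallace(rec_a), melting_temp_wallace(rec_b))
```

    Even this toy pair yields different GC fractions and melting temperatures, which is the kind of downstream inconsistency the study reports.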

  16. Duplicates, redundancies and inconsistencies in the primary nucleotide databases: a descriptive study

    PubMed Central

    Chen, Qingyu; Zobel, Justin; Verspoor, Karin

    2017-01-01

    GenBank, the EMBL European Nucleotide Archive and the DNA DataBank of Japan, known collectively as the International Nucleotide Sequence Database Collaboration or INSDC, are the three most significant nucleotide sequence databases. Their records are derived from laboratory work undertaken by different individuals, by different teams, with a range of technologies and assumptions and over a period of decades. As a consequence, they contain a great many duplicates, redundancies and inconsistencies, but neither the prevalence nor the characteristics of various types of duplicates have been rigorously assessed. Existing duplicate detection methods in bioinformatics only address specific duplicate types, with inconsistent assumptions; and the impact of duplicates in bioinformatics databases has not been carefully assessed, making it difficult to judge the value of such methods. Our goal is to assess the scale, kinds and impact of duplicates in bioinformatics databases, through a retrospective analysis of merged groups in INSDC databases. Our outcomes are threefold: (1) We analyse a benchmark dataset consisting of duplicates manually identified in INSDC (a dataset of 67 888 merged groups with 111 823 duplicate pairs across 21 organisms from INSDC databases) in terms of the prevalence, types and impacts of duplicates. (2) We categorize duplicates at both sequence and annotation level, with supporting quantitative statistics, showing that different organisms have different prevalence of distinct kinds of duplicate. (3) We show that the presence of duplicates has practical impact via a simple case study on duplicates, in terms of GC content and melting temperature. We demonstrate that duplicates not only introduce redundancy, but can lead to inconsistent results for certain tasks. Our findings lead to a better understanding of the problem of duplication in biological databases. 
Database URL: the merged records are available at https://cloudstor.aarnet.edu.au/plus/index.php/s/Xef2fvsebBEAv9w. PMID: 28077566

  17. A methodology and decision support tool for informing state-level bioenergy policymaking: New Jersey biofuels as a case study

    NASA Astrophysics Data System (ADS)

    Brennan-Tonetta, Margaret

    This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reductions. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock and policy options. The methodology and decision support tool is applicable to other states and regions (with location specific modifications), thus contributing to the achievement of state and federal goals of renewable energy utilization. (3) Development of policy recommendations based on the results of the decision support tool that will help to guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. As such, the current database is not able to perform analysis of tradeoffs across broad policy objectives such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste. 
Instead, it operates one level below that, with comparisons of kWh or GGE generated by different feedstock/technology combinations at the state and county level. Modification of the model to incorporate factors that would enable the analysis of broader energy policy issues, such as those mentioned above, is recommended for future research efforts.
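
    The kWh-level comparison that the decision support tool performs can be sketched as a feedstock inventory crossed with conversion technologies. All figures below (tonnages, energy contents, efficiencies, and the feedstock and technology names themselves) are illustrative assumptions, not values from the New Jersey database described above.

```python
# Hypothetical feedstock inventory: dry tonnes/year and energy content (GJ/dry tonne).
feedstocks = {
    "food_waste":   {"tonnes": 120_000, "gj_per_tonne": 15.0},
    "wood_residue": {"tonnes":  80_000, "gj_per_tonne": 19.0},
}
# Hypothetical conversion technologies and assumed electrical efficiencies.
technologies = {
    "anaerobic_digestion": 0.25,
    "direct_combustion":   0.22,
}

GJ_TO_KWH = 277.78  # 1 GJ = 277.78 kWh

def annual_kwh(feedstock: str, technology: str) -> float:
    """Electricity potential (kWh/yr) of one feedstock/technology pairing."""
    f = feedstocks[feedstock]
    eff = technologies[technology]
    return f["tonnes"] * f["gj_per_tonne"] * GJ_TO_KWH * eff

# Rank all pairings, the comparison a policymaker would see:
for f in feedstocks:
    for t in technologies:
        print(f, t, f"{annual_kwh(f, t):,.0f} kWh/yr")
```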

  18. From LDEF to a national Space Environment and Effects (SEE) program: A natural progression

    NASA Technical Reports Server (NTRS)

    Bowles, David E.; Calloway, Robert L.; Funk, Joan G.; Kinard, William H.; Levine, Arlene S.

    1995-01-01

    As the LDEF program draws to a close, it leaves in place the fundamental building blocks for a Space Environment and Effects (SEE) program. Results from LDEF data analyses and investigations now form a substantial core of knowledge on the long-term effects of the space environment on materials, systems and structures. In addition, these investigations form the basic structure of a critically needed SEE archive and database system. An agency-wide effort is required to capture all elements of a SEE program to provide a more comprehensive and focused approach to understanding the space environment, determining the best techniques for both flight and ground-based experimentation, updating the models which predict both the environments and their effects on subsystems and spacecraft, and, finally, ensuring that this wealth of information is properly maintained and inserted into spacecraft design programs. Many parts and pieces of a SEE program already exist at various locations to fulfill specific needs. The primary purpose of this program, under the direction of the Office of Advanced Concepts and Technology (OACT) in NASA Headquarters, is to take advantage of these parts; apply synergies where possible; identify and, when possible, fill in gaps; and coordinate and advocate a comprehensive SEE program. The SEE program must coordinate and support the efforts of well-established technical communities wherein the bulk of the work will continue to be done. The SEE program will consist of a NASA-led SEE Steering Committee, consisting of government and industry users, with responsibility for coordination between technology developers and NASA customers; and Technical Working Groups with primary responsibility for program technical content in response to user needs. 
The Technical Working Groups are as follows: Materials and Processes; Plasma and Fields; Ionizing Radiation; Meteoroids and Orbital Debris; Neutral External Contamination; Thermosphere, Thermal, and Solar Conditions; Electromagnetic Effects; Integrated Assessments and Databases. Specific technology development tasks will be solicited through a NASA Research Announcement to be released in May of 1994. The areas in which tasks are solicited include: (1) engineering environment definitions, (2) environments and effects design guidelines, (3) environments and effects assessment models and databases, and (4) flight/ground simulation/technology assessment data.

  19. From LDEF to a national Space Environment and Effects (SEE) program: A natural progression

    NASA Astrophysics Data System (ADS)

    Bowles, David E.; Calloway, Robert L.; Funk, Joan G.; Kinard, William H.; Levine, Arlene S.

    1995-02-01

    As the LDEF program draws to a close, it leaves in place the fundamental building blocks for a Space Environment and Effects (SEE) program. Results from LDEF data analyses and investigations now form a substantial core of knowledge on the long-term effects of the space environment on materials, systems and structures. In addition, these investigations form the basic structure of a critically needed SEE archive and database system. An agency-wide effort is required to capture all elements of a SEE program to provide a more comprehensive and focused approach to understanding the space environment, determining the best techniques for both flight and ground-based experimentation, updating the models which predict both the environments and their effects on subsystems and spacecraft, and, finally, ensuring that this wealth of information is properly maintained and inserted into spacecraft design programs. Many parts and pieces of a SEE program already exist at various locations to fulfill specific needs. The primary purpose of this program, under the direction of the Office of Advanced Concepts and Technology (OACT) in NASA Headquarters, is to take advantage of these parts; apply synergies where possible; identify and, when possible, fill in gaps; and coordinate and advocate a comprehensive SEE program. The SEE program must coordinate and support the efforts of well-established technical communities wherein the bulk of the work will continue to be done. The SEE program will consist of a NASA-led SEE Steering Committee, consisting of government and industry users, with responsibility for coordination between technology developers and NASA customers; and Technical Working Groups with primary responsibility for program technical content in response to user needs. 
The Technical Working Groups are as follows: Materials and Processes; Plasma and Fields; Ionizing Radiation; Meteoroids and Orbital Debris; Neutral External Contamination; Thermosphere, Thermal, and Solar Conditions; Electromagnetic Effects; Integrated Assessments and Databases. Specific technology development tasks will be solicited through a NASA Research Announcement to be released in May of 1994. The areas in which tasks are solicited include: (1) engineering environment definitions, (2) environments and effects design guidelines, (3) environments and effects assessment models and databases, and (4) flight/ground simulation/technology assessment data.

  20. The Technology Education Graduate Research Database, 1892-2000. CTTE Monograph.

    ERIC Educational Resources Information Center

    Reed, Philip A., Ed.

    The Technology Education Graduate Research Database (TEGRD) was designed in two parts. The first part was a 384 page bibliography of theses and dissertations from 1892-2000. The second part was an online, searchable database of graduate research completed within technology education from 1892 to the present. The primary goals of the project were:…

  1. LIRIS flight database and its use toward noncooperative rendezvous

    NASA Astrophysics Data System (ADS)

    Mongrard, O.; Ankersen, F.; Casiez, P.; Cavrois, B.; Donnard, A.; Vergnol, A.; Southivong, U.

    2018-06-01

    ESA's fifth and last Automated Transfer Vehicle, ATV Georges Lemaître, tested new rendezvous technology before docking with the International Space Station (ISS) in August 2014. The technology demonstration, called Laser Infrared Imaging Sensors (LIRIS), provided a previously unseen view of the ISS. During Georges Lemaître's rendezvous, the LIRIS sensors, composed of two infrared cameras, one visible camera and a scanning LIDAR (Light Detection and Ranging), were turned on at two and a half hours and 3500 m from the Space Station. All sensors worked as expected and a large amount of data was recorded and stored within ATV-5's cargo hold before being returned to Earth with the Soyuz flight 38S in September 2014. As part of the LIRIS post-flight activities, the information gathered by all sensors was collected into a flight database together with the reference ATV trajectory and attitude estimated by the ATV main navigation sensors. Although decoupled from the ATV main computer, the LIRIS data were carefully synchronized with ATV guidance, navigation and control (GNC) data. Hence, the LIRIS database can be used to assess the performance of various image processing algorithms that provide range and line-of-sight (LoS) navigation at long/medium range, as well as 6 degree-of-freedom (DoF) navigation at short range. The database also contains information on the overall ATV position with respect to Earth and the Sun direction within the ATV frame, so that the effect of the environment on the sensors can also be investigated. This paper introduces the structure of the LIRIS database and provides some examples of applications to increase the technology readiness level of noncooperative rendezvous.
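
    The range and line-of-sight quantities against which such image-processing algorithms are benchmarked can be derived from a relative position vector in a few lines. This is a generic sketch; the axis convention (x forward, y right, z down) and the example numbers are assumptions for illustration, not the LIRIS frame definitions.

```python
import math

def range_and_los(rel_pos):
    """Range (m), azimuth and elevation (rad) of a target from a
    chaser-fixed relative position vector (x, y, z).
    Assumed convention: x forward, y right, z down."""
    x, y, z = rel_pos
    rng = math.sqrt(x * x + y * y + z * z)
    azimuth = math.atan2(y, x)       # positive to the right
    elevation = math.asin(-z / rng)  # positive up
    return rng, azimuth, elevation

# Target roughly 3500 m ahead, slightly right and above (the LIRIS activation range):
rng, az, el = range_and_los((3500.0, 60.0, -25.0))
print(f"range={rng:.1f} m, az={math.degrees(az):.2f} deg, el={math.degrees(el):.2f} deg")
```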

  2. RAMI modeling of plant systems for proposed tritium production and extraction facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanchard, A.

    2000-04-05

    The control of life-cycle cost is a primary concern during the development, construction, operation, and decommissioning of DOE systems and facilities. An effective tool that can be used to control these costs, beginning with the design stage, is a reliability, availability, maintainability, and inspectability analysis or, simply, RAMI for short. In 1997, RAMI technology was introduced to the Savannah River Site with applications at the conceptual design stage, beginning with the Accelerator Production of Tritium (APT) Project and later extended to the Commercial Light Water Reactor (CLWR) Tritium Extraction Facility (TEF) Project. More recently it has been applied to the as-built Water Treatment Facilities designed for ground water environmental restoration. This new technology and database was applied to the assessment of balance-of-plant systems for the APT Conceptual Design Report. Initial results from the Heat Removal System Assessment revealed that the system conceptual design would cause the APT to fall short of its annual production goal. Using RAMI technology to immediately assess this situation, it was demonstrated that the product loss could be gained back by upgrading the system's chiller unit capacity at a cost of less than $1.3 million. The reclaimed production is worth approximately $100 million. The RAMI technology has now been extended to assess the conceptual design for the CLWR-TEF Project. More specifically, this technology and database is being used to translate high-level availability goals into lower-level system design requirements that will ensure the TEF meets its production goal. Results from the limited number of system assessments performed to date have already been used to modify the conceptual design for a remote handling system, improving its availability to the point that a redundant system, with its associated costs of installation and operation, may no longer be required. 
RAMI results were also used to justify the elimination of a metal uranium bed in the design of a water cracker system, producing a significant reduction in the estimated construction and operating costs.
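
    The availability arithmetic at the heart of a RAMI trade-off (such as whether a redundant unit is worth its installation and operating cost) reduces to two standard formulas: steady-state availability of a repairable unit, and the availability of units in parallel. The sketch below uses textbook formulas with made-up numbers, not figures from the APT or TEF analyses.

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability of a single repairable unit:
    A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def parallel(*unit_availabilities: float) -> float:
    """Availability of redundant units in parallel: the system is
    down only when every unit is down simultaneously."""
    unavail = 1.0
    for a in unit_availabilities:
        unavail *= (1.0 - a)
    return 1.0 - unavail

# Illustrative numbers only:
a_single = availability(mtbf_hours=2000.0, mttr_hours=48.0)
print(f"single unit:     {a_single:.4f}")
print(f"with redundancy: {parallel(a_single, a_single):.6f}")
```

    Comparing the availability gain of the redundant configuration against its cost is exactly the kind of trade the abstract describes for the remote handling system.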

  3. Solutions for data integration in functional genomics: a critical assessment and case study.

    PubMed

    Smedley, Damian; Swertz, Morris A; Wolstencroft, Katy; Proctor, Glenn; Zouberakis, Michael; Bard, Jonathan; Hancock, John M; Schofield, Paul

    2008-11-01

    The torrent of data emerging from the application of new technologies to functional genomics and systems biology can no longer be contained within the traditional modes of data sharing and publication, with the consequence that data is being deposited in, distributed across and disseminated through an increasing number of databases. The resulting fragmentation poses serious problems for the model organism community, which increasingly relies on data mining and computational approaches that require gathering of data from a range of sources. In the light of these problems, the European Commission has funded a coordination action, CASIMIR (coordination and sustainability of international mouse informatics resources), with a remit to assess the technical and social aspects of database interoperability that currently prevent the full realization of the potential of data integration in mouse functional genomics. In this article, we assess the current problems with interoperability, with particular reference to mouse functional genomics, and critically review the technologies that can be deployed to overcome them. We describe a typical use-case in which an investigator wishes to gather data on variation, genomic context and metabolic pathway involvement for genes discovered in a genome-wide screen. We go on to develop an automated approach involving an in silico experimental workflow tool, Taverna, using web services, BioMart and MOLGENIS technologies for data retrieval. Finally, we focus on the current impediments to adopting such an approach in a wider context, and strategies to overcome them.

  4. Assessing telemedicine: a systematic review of the literature.

    PubMed

    Roine, R; Ohinmaa, A; Hailey, D

    2001-09-18

    To clarify the current status of telemedicine, we carried out a systematic review of the literature. We identified controlled assessment studies of telemedicine that reported patient outcomes, administrative changes or economic assessments, and assessed the quality of that literature. We carried out a systematic electronic search for articles published from 1966 to early 2000 using the MEDLINE (1966-April 2000), HEALTHSTAR (1975-January 2000), EMBASE (1988-February 2000) and CINAHL (1982-January 2000) databases. In addition, the HSTAT database (Health Services/Technology Assessment Text, US National Library of Medicine), the Database of Abstracts of Reviews of Effectiveness (DARE, NHS Centre for Reviews and Dissemination, United Kingdom), the NHS Economic Evaluation Database and the Cochrane Controlled Trials Register were searched. We consulted experts in the field and did a manual search of the reference lists of review articles. A total of 1124 studies were identified. Based on a review of the abstracts, 133 full-text articles were obtained for closer inspection. Of these, 50 were deemed to represent assessment studies fulfilling the inclusion criteria of the review. Thirty-four of the articles assessed at least some clinical outcomes; the remaining 16 were mainly economic analyses. Most of the available literature referred only to pilot projects and short-term outcomes, and most of the studies were of low quality. Relatively convincing evidence of effectiveness was found only for teleradiology, teleneurosurgery, telepsychiatry, transmission of echocardiographic images, and the use of electronic referrals enabling e-mail consultations and video conferencing between primary and secondary health care providers. Economic analyses suggested that teleradiology, especially transmission of CT images, can be cost-saving. Evidence regarding the effectiveness or cost-effectiveness of telemedicine is still limited. 
Based on current scientific evidence, only a few telemedicine applications can be recommended for broader use.

  5. Unified communication to reach vulnerable mothers.

    PubMed

    Tezcan, B; Von Rege, I; Henkson, H; Oteng-Ntim, E

    2011-01-01

    The feasibility of using mobile text messaging to reach vulnerable patient groups was assessed in this study. A total of 121 pregnant or postnatal women were randomly asked to complete a questionnaire. The questionnaire was given to them in the antenatal clinic, postnatal ward, antenatal ward or in the day assessment unit at St Thomas' Hospital, London. The forms were collected and analysed using an Excel database. The results of this survey show that mobile technology is readily available for 97% of the obstetric population. Among mothers from vulnerable groups and mothers from deprived areas, 61% possessed 3rd generation mobile technology. The majority of mothers surveyed wanted their care supplemented by the use of their mobile phones.

  6. Evolution of Database Replication Technologies for WLCG

    NASA Astrophysics Data System (ADS)

    Baranowski, Zbigniew; Lobato Pardavila, Lorena; Blaszczyk, Marcin; Dimitrov, Gancho; Canali, Luca

    2015-12-01

    In this article we summarize several years of experience with database replication technologies used at WLCG and provide a short review of the available Oracle technologies and their key characteristics. One of the notable changes and improvements in this area in the recent past has been the introduction of Oracle GoldenGate as a replacement for Oracle Streams. We report on the preparation and later upgrades for remote replication done in collaboration with ATLAS and Tier 1 database administrators, including the experience from running Oracle GoldenGate in production. Moreover, we report on another key technology in this area, Oracle Active Data Guard, which has been adopted in several of the mission-critical use cases for database replication between the online and offline databases of the LHC experiments.

  7. Pilot Aircraft Interface Objectives/Rationale

    NASA Technical Reports Server (NTRS)

    Shively, Jay

    2010-01-01

    Objective: Database and proof of concept for guidelines for GCS compliance a) Rationale: 1) Provide research test-bed to develop guidelines. 2) Modify GCS for NAS Compliance to provide proof of concept. b) Approach: 1) Assess current state of GCS technology. 2) Information Requirements Definition. 3) SME Workshop. 4) Modify an Existing GCS for NAS Compliance. 5) Define exemplar UAS (choose system to develop prototype). 6) Define Candidate Displays & Controls. 7) Evaluate/ refine in Simulations. 8) Demonstrate in flight. c) Deliverables: 1) Information Requirements Report. 2) Workshop Proceedings. 3) Technical Reports/ papers on Simulations & Flight Demo. 4) Database for guidelines.

  8. Property Graph vs RDF Triple Store: A Comparison on Glycan Substructure Search

    PubMed Central

    Alocci, Davide; Mariethoz, Julien; Horlacher, Oliver; Bolleman, Jerven T.; Campbell, Matthew P.; Lisacek, Frederique

    2015-01-01

    Resource description framework (RDF) and Property Graph databases are emerging technologies that are used for storing graph-structured data. We compare these technologies through a molecular biology use case: glycan substructure search. Glycans are branched tree-like molecules composed of building blocks linked together by chemical bonds. The molecular structure of a glycan can be encoded into a directed acyclic graph where each node represents a building block and each edge serves as a chemical linkage between two building blocks. In this context, graph databases are possible software solutions for storing glycan structures, and graph query languages, such as SPARQL and Cypher, can be used to perform a substructure search. Glycan substructure searching is an important feature for querying structure and experimental glycan databases and retrieving biologically meaningful data. This applies, for example, to identifying a region of the glycan recognised by a glycan binding protein (GBP). In this study, 19,404 glycan structures were selected from GlycomeDB (www.glycome-db.org) and modelled for being stored into an RDF triple store and a Property Graph. We then performed two different sets of searches and compared the query response times and the results from both technologies to assess performance and accuracy. The two implementations produced the same results, but interestingly we noted a difference in the query response times. Qualitative measures such as portability were also used to define further criteria for choosing the technology adapted to solving glycan substructure search and other comparable issues. PMID:26656740
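
    The substructure search being benchmarked can be sketched as rooted-subtree matching over a glycan encoded as a tree of building blocks. This is a simplified, greedy Python sketch (a full implementation would need linkage labels and proper bipartite matching of children); the monosaccharide names below are placeholders, not GlycomeDB entries.

```python
# A glycan as a rooted tree: each node is (building_block, [children]).
def matches_at(node, query):
    """True if `query` matches the tree rooted at `node`: same building
    block, and every query child greedily matched by a distinct child."""
    if node[0] != query[0]:
        return False
    children = list(node[1])
    for qc in query[1]:
        for i, c in enumerate(children):
            if matches_at(c, qc):
                del children[i]  # each child used at most once
                break
        else:
            return False
    return True

def substructure_search(root, query):
    """True if `query` occurs anywhere in the glycan rooted at `root`."""
    if matches_at(root, query):
        return True
    return any(substructure_search(c, query) for c in root[1])

glycan = ("GlcNAc", [("Man", [("Man", []), ("Man", [("Gal", [])])])])
print(substructure_search(glycan, ("Man", [("Gal", [])])))  # found
print(substructure_search(glycan, ("Fuc", [])))             # absent
```

    In the paper's setting this same traversal is expressed declaratively, as a SPARQL pattern over the triple store or a Cypher path pattern over the Property Graph.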

  9. Review and assessment of the database and numerical modeling for turbine heat transfer

    NASA Technical Reports Server (NTRS)

    Gladden, H. J.; Simoneau, R. J.

    1989-01-01

    The objectives of the NASA Hot Section Technology (HOST) Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.

  10. Using bibliographic databases in technology transfer

    NASA Technical Reports Server (NTRS)

    Huffman, G. David

    1987-01-01

    When technology developed for a specific purpose is used in another application, the process is called technology transfer--the application of an existing technology to a new use or user for purposes other than those for which the technology was originally intended. Using Bibliographical Databases in Technology Transfer deals with demand-pull transfer, technology transfer that arises from need recognition, and is a guide for conducting demand-pull technology transfer studies. It can be used by a researcher as a self-teaching manual or by an instructor as a classroom text. A major problem of technology transfer is finding applicable technology to transfer. Described in detail is the solution to this problem, the use of computerized, bibliographic databases, which currently contain virtually all documented technology of the past 15 years. A general framework for locating technology is described. NASA technology organizations and private technology transfer firms are listed for consultation.

  11. HEALTH TECHNOLOGY ASSESSMENT EVIDENCE ON E-HEALTH/M-HEALTH TECHNOLOGIES: EVALUATING THE TRANSPARENCY AND THOROUGHNESS.

    PubMed

    Vukovic, Vladimir; Favaretti, Carlo; Ricciardi, Walter; de Waure, Chiara

    2018-01-01

    Evaluation is crucial for the integration of e-Health/m-Health into healthcare systems, and health technology assessment (HTA) could offer a sound methodological basis for these evaluations. The aim of this study was to look for HTA reports on e-Health/m-Health technologies and to analyze their transparency, consistency and thoroughness, with the goal of detecting areas that need improvement. The PubMed, ISI-WOS and University of York Centre for Reviews and Dissemination electronic databases were searched to identify reports on e-Health/m-Health technologies published up until April 1, 2016. The International Network of Agencies for Health Technology Assessment (INAHTA) checklist was used to evaluate the transparency and consistency of included reports. Thoroughness was assessed by checking for the presence of the domains suggested by the European network for Health Technology Assessment (EUnetHTA) HTA Core Model. Twenty-eight reports published between 1999 and 2015 were included. Most were delivered by non-European countries (71.4 percent) and only 35.7 percent were classified as full reports. All the HTA reports defined the scope of research, whereas more than 80 percent provided author details, summary, discussed findings, and conclusion. On the contrary, policy and research questions were clearly defined in only around 30 percent and 50 percent of reports, respectively. With respect to the EUnetHTA Core Model, around 70 percent of reports dealt with effectiveness and economic evaluation, more than 50 percent described the health problem and approximately 40 percent organizational and social aspects. E-Health/m-Health technologies are increasingly present in the field of HTA. Yet, our review identified several missing elements. Most of the reports failed to address relevant assessment components, especially ethical, social and organizational implications.

  12. A Methodology for Integrated, Multiregional Life Cycle Assessment Scenarios under Large-Scale Technological Change.

    PubMed

    Gibon, Thomas; Wood, Richard; Arvesen, Anders; Bergesen, Joseph D; Suh, Sangwon; Hertwich, Edgar G

    2015-09-15

    Climate change mitigation demands large-scale technological change on a global level and, if successfully implemented, will significantly affect how products and services are produced and consumed. In order to anticipate the life cycle environmental impacts of products under climate mitigation scenarios, we present the modeling framework of an integrated hybrid life cycle assessment model covering nine world regions. Life cycle assessment databases and multiregional input-output tables are adapted using forecasted changes in technology and resources up to 2050 under a 2 °C scenario. We call the result of this modeling "technology hybridized environmental-economic model with integrated scenarios" (THEMIS). As a case study, we apply THEMIS in an integrated environmental assessment of concentrating solar power. Life-cycle greenhouse gas emissions for this plant range from 33 to 95 g CO2 eq./kWh across different world regions in 2010, falling to 30-87 g CO2 eq./kWh in 2050. Using regional life cycle data yields insightful results. More generally, these results also highlight the need for systematic life cycle frameworks that capture the actual consequences and feedback effects of large-scale policies in the long term.

  13. Atmospheric Effects of Subsonic Aircraft: Interim Assessment Report of the Advanced Subsonic Technology Program

    NASA Technical Reports Server (NTRS)

    Friedl, Randall R. (Editor)

    1997-01-01

    This first interim assessment of the subsonic assessment (SASS) project attempts to summarize concisely the status of our knowledge concerning the impacts of present and future subsonic aircraft fleets. It also highlights the major areas of scientific uncertainty, through review of existing databases and model-based sensitivity studies. In view of the need for substantial improvements in both model formulations and experimental databases, this interim assessment cannot provide confident numerical predictions of aviation impacts. However, a number of quantitative estimates are presented, which provide some guidance to policy makers.

  14. The National Landslide Database and GIS for Great Britain: construction, development, data acquisition, application and communication

    NASA Astrophysics Data System (ADS)

    Pennington, Catherine; Dashwood, Claire; Freeborough, Katy

    2014-05-01

    The National Landslide Database has been developed by the British Geological Survey (BGS) and is the focus for national geohazard research for landslides in Great Britain. The history and structure of the geospatial database and associated Geographical Information System (GIS) are explained, along with the future developments of the database and its applications. The database is the most extensive source of information on landslides in Great Britain with over 16,500 records of landslide events, each documented as fully as possible. Data are gathered through a range of procedures, including: incorporation of other databases; automated trawling of current and historical scientific literature and media reports; new field- and desk-based mapping technologies with digital data capture, and crowd-sourcing information through social media and other online resources. This information is invaluable for the investigation, prevention and mitigation of areas of unstable ground in accordance with Government planning policy guidelines. The national landslide susceptibility map (GeoSure) and a national landslide domain map currently under development rely heavily on the information contained within the landslide database. Assessing susceptibility to landsliding requires knowledge of the distribution of failures and an understanding of causative factors and their spatial distribution, whilst understanding the frequency and types of landsliding present is integral to modelling how rainfall will influence the stability of a region. Communication of landslide data through the Natural Hazard Partnership (NHP) contributes to national hazard mitigation and disaster risk reduction with respect to weather and climate. Daily reports of landslide potential are published by BGS through the NHP and data collected for the National Landslide Database is used widely for the creation of these assessments. 
The National Landslide Database is freely available via an online GIS and is used by a variety of stakeholders for research purposes.

  15. Probabilistic seismic hazard assessment for northern Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Kosuwan, S.; Nguyen, M. L.; Shi, X.; Sieh, K.

    2016-12-01

    We assess seismic hazard for northern Southeast Asia by constructing an earthquake and fault database, conducting a series of ground-shaking scenarios, and proposing regional seismic hazard maps. Our earthquake database contains earthquake parameters from global and local seismic catalogues, including the ISC, the ISC-GEM, the global ANSS Comprehensive Catalogues, the Seismological Bureau of the Thai Meteorological Department, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. To harmonize the earthquake parameters from the various catalogue sources, we removed duplicate events and unified magnitudes onto the same scale. Our active fault database includes active fault data from previous studies, e.g. the active fault parameters determined by Wang et al. (2014), the Department of Mineral Resources, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. Based on the parameters derived from these databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and time elapsed since the last events), we determined the earthquake recurrence models of the seismogenic sources. To evaluate ground-shaking behaviour in different tectonic regimes, we conducted a series of tests matching the felt intensities of historical earthquakes to ground motions modelled with ground motion prediction equations (GMPEs). By incorporating the best-fitting GMPEs and site conditions, we accounted for site effects and assessed probabilistic seismic hazard. The highest seismic hazard is in the region close to the Sagaing Fault, which cuts through several major cities in central Myanmar. The northern segment of the Sunda megathrust, which could potentially generate an M8-class earthquake, brings significant hazard along the western coast of Myanmar and eastern Bangladesh. 
In addition, we find a notable hazard level in northern Vietnam and at the border between Myanmar, Thailand and Laos, owing to a series of strike-slip faults that could potentially produce moderate to large earthquakes. Note that although much of the region has a low probability of damaging shaking, low-probability events have recently caused great destruction in SE Asia (e.g. the 2008 Wenchuan and 2015 Sabah earthquakes).
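    The recurrence modelling described above hinges on the Gutenberg-Richter relationship, log10(N) = a - b*M. A minimal sketch, using illustrative a- and b-values rather than any derived in the study, of how such a magnitude-frequency rate feeds a Poisson exceedance estimate:

```python
import math

def gr_annual_rate(m, a, b):
    """Annual rate of earthquakes with magnitude >= m implied by the
    Gutenberg-Richter relationship log10(N) = a - b*m."""
    return 10 ** (a - b * m)

def prob_at_least_one(rate, years):
    """Poisson probability of at least one event in the given time window."""
    return 1.0 - math.exp(-rate * years)

# Illustrative parameters only (real a- and b-values come from the
# harmonized catalogue): a = 5.0, b = 1.0.
rate_m7 = gr_annual_rate(7.0, 5.0, 1.0)    # 0.01 events per year with M >= 7
p_50yr = prob_at_least_one(rate_m7, 50.0)  # ~0.39 chance of at least one in 50 years
```

The same rate-to-probability step underlies the hazard-map exceedance probabilities, once GMPEs translate event rates into ground-motion levels.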

  16. "Mr. Database": Jim Gray and the History of Database Technologies.

    PubMed

    Hanwahr, Nils C

    2017-12-01

    Although the widespread use of the term "Big Data" is comparatively recent, it invokes a phenomenon in the development of database technology with distinct historical contexts. The database engineer Jim Gray, known as "Mr. Database" in Silicon Valley before his disappearance at sea in 2007, was involved in many of the crucial developments since the 1970s that constitute the foundation of exceedingly large and distributed databases. Jim Gray was involved in the development of relational database systems based on the concepts of Edgar F. Codd at IBM in the 1970s before he went on to develop principles of transaction processing that enable the parallel and highly distributed operation of databases today. He was also involved in creating forums for discourse between academia and industry, which influenced industry performance standards as well as database research agendas. As a co-founder of the San Francisco branch of Microsoft Research, Gray increasingly turned toward scientific applications of database technologies, e.g. leading the TerraServer project, an online database of satellite images. Inspired by Vannevar Bush's idea of the memex, Gray laid out his vision of a Personal Memex as well as a World Memex, eventually postulating a new era of data-based scientific discovery termed "Fourth Paradigm Science". This article gives an overview of Gray's contributions to the development of database technology as well as his research agendas, and shows that central notions of Big Data have occupied database engineers for much longer than the actual term has been in use.

  17. Economic evaluation of manual therapy for musculoskeletal diseases: a protocol for a systematic review and narrative synthesis of evidence.

    PubMed

    Kim, Chang-Gon; Mun, Su-Jeong; Kim, Ka-Na; Shin, Byung-Cheul; Kim, Nam-Kwen; Lee, Dong-Hyo; Lee, Jung-Han

    2016-05-13

    Manual therapy is the non-surgical conservative management of musculoskeletal disorders using the practitioner's hands on the patient's body for diagnosing and treating disease. The aim of this study is to systematically review trial-based economic evaluations of manual therapy relative to other interventions used for the management of musculoskeletal diseases. Randomised clinical trials (RCTs) on the economic evaluation of manual therapy for musculoskeletal diseases will be included in the review. The following databases will be searched from their inception: Medline, Embase, Cochrane Central Register of Controlled Trials (CENTRAL), Cumulative Index to Nursing and Allied Health Literature (CINAHL), Econlit, Mantis, Index to Chiropractic Literature, Science Citation Index, Social Science Citation Index, Allied and Complementary Medicine Database (AMED), Cochrane Database of Systematic Reviews (CDSR), National Health Service Database of Abstracts of Reviews of Effects (NHS DARE), National Health Service Health Technology Assessment Database (NHS HTA), National Health Service Economic Evaluation Database (NHS EED), five Korean medical databases (Oriental Medicine Advanced Searching Integrated System (OASIS), Research Information Service System (RISS), DBPIA, Korean Traditional Knowledge Portal (KTKP) and KoreaMed) and three Chinese databases (China National Knowledge Infrastructure (CNKI), VIP and Wanfang). The evidence for the cost-effectiveness, cost-utility and cost-benefit of manual therapy for musculoskeletal diseases will be assessed as the primary outcome. Health-related quality of life and adverse effects will be assessed as secondary outcomes. We will critically appraise the included studies using the Cochrane risk of bias tool and the Drummond checklist. Results will be summarised using Slavin's qualitative best-evidence synthesis approach. 
The results of the study will be disseminated via a peer-reviewed journal and/or conference presentations. PROSPERO CRD42015026757.

  18. Surface Transportation Security Priority Assessment

    DTIC Science & Technology

    2010-03-01

    ...intercity buses), and pipelines, and related infrastructure (including roads and highways), that are within the territory of the United States... Modernizing the information technology infrastructure used to vet the identity of travelers and transportation workers... Using terrorist databases to... examination of persons travelling, surface transportation modes tend to operate in a much more open environment, making it difficult to screen workers

  19. Assessment of Technologies for the Space Shuttle External Tank Thermal Protection System and Recommendations for Technology Improvement - Part III: Material Property Characterization, Analysis, and Test Methods

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.

    2005-01-01

    The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.

  20. Architecture Knowledge for Evaluating Scalable Databases

    DTIC Science & Technology

    2015-01-16

    ...problems, arising from the proliferation of new data models and distributed technologies for building scalable, available data stores. Architects must... longer are relational databases the de facto standard for building data repositories. Highly distributed, scalable “NoSQL” databases [11] have emerged... This is especially challenging at the data storage layer. The multitude of competing NoSQL database technologies creates a complex and rapidly

  1. What Is eHealth (4): A Scoping Exercise to Map the Field

    PubMed Central

    Sloan, David; Gregor, Peter; Sullivan, Frank; Detmer, Don; Kahan, James P; Oortwijn, Wija; MacGillivray, Steve

    2005-01-01

    Background Lack of consensus on the meaning of eHealth has led to uncertainty among academics, policymakers, providers and consumers. This project was commissioned in light of the rising profile of eHealth on the international policy agenda and the emerging UK National Programme for Information Technology (now called Connecting for Health) and related developments in the UK National Health Service. Objectives To map the emergence and scope of eHealth as a topic and to identify its place within the wider health informatics field, as part of a larger review of research and expert analysis pertaining to current evidence, best practice and future trends. Methods Multiple databases of scientific abstracts were explored in a nonsystematic fashion to assess the presence of eHealth or conceptually related terms within their taxonomies, to identify journals in which articles explicitly referring to eHealth are contained and the topics covered, and to identify published definitions of the concept. The databases were Medline (PubMed), the Cumulative Index of Nursing and Allied Health Literature (CINAHL), the Science Citation Index (SCI), the Social Science Citation Index (SSCI), the Cochrane Database (including DARE, CENTRAL, NHS Economic Evaluation Database [NHS EED], Health Technology Assessment [HTA] database, NHS EED bibliographic) and ISTP (now known as ISI Proceedings). We used the search query, “Ehealth OR e-health OR e*health”. The timeframe searched was 1997-2003, although some analyses contain data emerging subsequent to this period. This was supplemented by iterative searches of Web-based sources, such as commercial and policy reports, research commissioning programmes and electronic news pages. Definitions extracted from both searches were thematically analyzed and compared in order to assess conceptual heterogeneity. Results The term eHealth only came into use in the year 2000, but has since become widely prevalent. 
The scope of the topic was not immediately discernible from that of the wider health informatics field, for which over 320,000 publications are listed in Medline alone, and it is not explicitly represented within the existing Medical Subject Headings (MeSH) taxonomy. Applying eHealth as a narrative search term to multiple databases yielded 387 relevant articles, distributed across 154 different journals, most commonly related to information technology and telemedicine, but extending to such areas as law. Most eHealth articles are represented on Medline. Definitions of eHealth vary with respect to the functions, stakeholders, contexts and theoretical issues targeted. Most encompass a broad range of medical informatics applications either specified (e.g., decision support, consumer health information) or presented in more general terms (e.g., to manage, arrange or deliver health care). However, the majority emphasize the communicative functions of eHealth and specify the use of networked digital technologies, primarily the Internet, thus differentiating eHealth from the field of medical informatics. While some definitions explicitly target health professionals or patients, most encompass applications for all stakeholder groups. The nature of the scientific and broader literature pertaining to eHealth closely reflects these conceptualizations. Conclusions We surmise that the field – as it stands today – may be characterized by the global definitions suggested by Eysenbach and Eng. PMID:15829481

  2. Cutaneous lichen planus: A systematic review of treatments.

    PubMed

    Fazel, Nasim

    2015-06-01

    Various treatment modalities are available for cutaneous lichen planus. PubMed, EMBASE, the Cochrane Database of Systematic Reviews, the Cochrane Central Register of Controlled Trials, the Database of Abstracts of Reviews of Effects, and the Health Technology Assessment Database were searched for all systematic reviews and randomized controlled trials related to cutaneous lichen planus. Two systematic reviews and nine relevant randomized controlled trials were identified. Acitretin, griseofulvin, hydroxychloroquine and narrow-band ultraviolet B have been demonstrated to be effective in the treatment of cutaneous lichen planus. Sulfasalazine is effective but has an unfavorable safety profile. KH1060, a vitamin D analogue, is not beneficial in the management of cutaneous lichen planus. Evidence from large-scale randomized trials demonstrating the safety and efficacy of many other treatment modalities used to treat cutaneous lichen planus is not available.

  3. Magnetic Resonance Imaging as an Adjunct to Mammography for Breast Cancer Screening in Women at Less Than High Risk for Breast Cancer: A Health Technology Assessment

    PubMed Central

    Nikitovic-Jokic, Milica; Holubowich, Corinne

    2016-01-01

    Background Screening with mammography can detect breast cancer early, before clinical symptoms appear. Some cancers, however, are not captured with mammography screening alone. Among women at high risk for breast cancer, magnetic resonance imaging (MRI) has been suggested as a safe adjunct (supplemental) screening tool that can detect breast cancers missed on screening mammography, potentially reducing the number of deaths associated with the disease. However, the use of adjunct screening tests may also increase the number of false-positive test results, which may lead to unnecessary follow-up testing, as well as patient stress and anxiety. We investigated the benefits and harms of MRI as an adjunct to mammography compared with mammography alone for screening women at less than high risk (average or higher than average risk) for breast cancer. Methods We searched Ovid MEDLINE, Ovid Embase, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects (DARE), Centre for Reviews and Dissemination (CRD) Health Technology Assessment Database, and National Health Service (NHS) Economic Evaluation Database, from January 2002 to January 2016, for evidence of effectiveness, harms, and diagnostic accuracy. Only studies evaluating the use of screening breast MRI as an adjunct to mammography in the specified populations were included. Results No studies in women at less than high risk for breast cancer met our inclusion criteria. Conclusions It remains uncertain if the use of adjunct screening breast MRI in women at less than high risk (average or higher than average risk) for breast cancer will reduce breast cancer–related mortality without significant increases in unnecessary follow-up testing and treatment. PMID:27990198

  4. A New Approach To Secure Federated Information Bases Using Agent Technology.

    ERIC Educational Resources Information Center

    Weippi, Edgar; Klug, Ludwig; Essmayr, Wolfgang

    2003-01-01

    Discusses database agents which can be used to establish federated information bases by integrating heterogeneous databases. Highlights include characteristics of federated information bases, including incompatible database management systems, schemata, and frequently changing context; software agent technology; Java agents; system architecture;…

  5. EST databases and web tools for EST projects.

    PubMed

    Shen, Yao-Qing; O'Brien, Emmet; Koski, Liisa; Lang, B Franz; Burger, Gertraud

    2009-01-01

    This chapter outlines key considerations for constructing and implementing an EST database. Instead of showing the technological details step by step, emphasis is put on the design of an EST database suited to the specific needs of EST projects and how to choose the most suitable tools. Using TBestDB as an example, we illustrate the essential factors to be considered for database construction and the steps for data population and annotation. This process employs technologies such as PostgreSQL, Perl, and PHP to build the database and interface, and tools such as AutoFACT for data processing and annotation. We discuss these in comparison to other available technologies and tools, and explain the reasons for our choices.
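    As a rough, hypothetical illustration of the relational core such a project sits on (TBestDB itself is built on PostgreSQL with Perl and PHP front ends; sqlite3 merely keeps this sketch self-contained, and the table and column names are invented):

```python
import sqlite3

# Minimal, invented schema in the spirit of an EST database: one row per
# sequenced EST, with an annotation column of the kind a tool such as
# AutoFACT would populate.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE est (
        est_id      INTEGER PRIMARY KEY,
        clone_name  TEXT NOT NULL,
        sequence    TEXT NOT NULL,
        annotation  TEXT
    )""")
conn.execute(
    "INSERT INTO est (clone_name, sequence, annotation) VALUES (?, ?, ?)",
    ("cl0001", "ATGGCGTTA", "putative ribosomal protein"))
rows = conn.execute("SELECT clone_name, annotation FROM est").fetchall()
```

A real deployment would add tables for libraries, clusters, and BLAST hits, which is exactly the kind of design decision the chapter argues should be driven by the project's needs.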

  6. Relational Database Technology: An Overview.

    ERIC Educational Resources Information Center

    Melander, Nicole

    1987-01-01

    Describes the development of relational database technology as it applies to educational settings. Discusses some of the new tools and models being implemented in an effort to provide educators with technologically advanced ways of answering questions about education programs and data. (TW)

  7. Orthographic and Phonological Neighborhood Databases across Multiple Languages.

    PubMed

    Marian, Viorica

    2017-01-01

    The increased globalization of science and technology and the growing number of bilinguals and multilinguals in the world have made research with multiple languages a mainstay for scholars who study human function and especially those who focus on language, cognition, and the brain. Such research can benefit from large-scale databases and online resources that describe and measure lexical, phonological, orthographic, and semantic information. The present paper discusses currently-available resources and underscores the need for tools that enable measurements both within and across multiple languages. A general review of language databases is followed by a targeted introduction to databases of orthographic and phonological neighborhoods. A specific focus on CLEARPOND illustrates how databases can be used to assess and compare neighborhood information across languages, to develop research materials, and to provide insight into broad questions about language. As an example of how using large-scale databases can answer questions about language, a closer look at neighborhood effects on lexical access reveals that not only orthographic, but also phonological neighborhoods can influence visual lexical access both within and across languages. We conclude that capitalizing upon large-scale linguistic databases can advance, refine, and accelerate scientific discoveries about the human linguistic capacity.
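    To make the neighborhood notion concrete: an orthographic neighbor is commonly defined as a word of the same length differing in exactly one letter position (Coltheart's N). A small sketch of that computation over a toy lexicon (not CLEARPOND's actual interface):

```python
def orthographic_neighbors(word, lexicon):
    """Return the orthographic neighbors of `word`: lexicon entries of the
    same length that differ from it in exactly one letter position."""
    def one_letter_apart(a, b):
        return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1
    return sorted(w for w in lexicon if one_letter_apart(word, w))

# Toy lexicon for illustration only.
lexicon = {"cat", "cot", "coat", "bat", "can", "dog"}
neighbors = orthographic_neighbors("cat", lexicon)  # ['bat', 'can', 'cot']
```

Phonological neighborhoods are computed the same way over phoneme transcriptions rather than letters, which is what allows cross-linguistic comparison in databases of this kind.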

  8. Osteoporosis therapies: evidence from health-care databases and observational population studies.

    PubMed

    Silverman, Stuart L

    2010-11-01

    Osteoporosis is a well-recognized disease with severe consequences if left untreated. Randomized controlled trials are the most rigorous method for determining the efficacy and safety of therapies. Nevertheless, randomized controlled trials underrepresent the real-world patient population and are costly in both time and money. Modern technology has enabled researchers to use information gathered from large health-care or medical-claims databases to assess the practical utilization of available therapies in appropriate patients. Observational database studies lack randomization but, if carefully designed and successfully completed, can provide valuable information that complements results obtained from randomized controlled trials and extends our knowledge to real-world clinical patients. Randomized controlled trials comparing fracture outcomes among osteoporosis therapies are difficult to perform. In this regard, large observational database studies could be useful in identifying clinically important differences among therapeutic options. Database studies can also provide important information with regard to osteoporosis prevalence, health economics, and compliance and persistence with treatment. This article describes the strengths and limitations of both randomized controlled trials and observational database studies, discusses considerations for observational study design, and reviews a wealth of information generated by database studies in the field of osteoporosis.

  9. Review of early assessment models of innovative medical technologies.

    PubMed

    Fasterholdt, Iben; Krahn, Murray; Kidholm, Kristian; Yderstræde, Knud Bonnet; Pedersen, Kjeld Møller

    2017-08-01

    Hospitals increasingly make decisions regarding the early development of and investment in technologies, but a formal evaluation model for assisting hospitals early on in assessing the potential of innovative medical technologies is lacking. This article provides an overview of models for early assessment in different health organisations and discusses which models hold most promise for hospital decision makers. A scoping review of published studies between 1996 and 2015 was performed using nine databases. The following information was collected: decision context, decision problem, and a description of the early assessment model. In total, 2362 articles were identified and 12 studies fulfilled the inclusion criteria. An additional 12 studies were identified and included in the review by searching reference lists. The majority of the 24 early assessment studies were variants of traditional cost-effectiveness analysis. Around one fourth of the studies presented an evaluation model with a broader focus than cost-effectiveness. Uncertainty was mostly handled by simple sensitivity or scenario analysis. This review shows that evaluation models using known cost-effectiveness methods are the most prevalent in early assessment but seem ill-suited for early assessment in hospitals. Four models provided some usable elements for the development of a hospital-based model.
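    For readers unfamiliar with the cost-effectiveness analyses that dominate these models, the core quantity is the incremental cost-effectiveness ratio (ICER). A minimal sketch with purely hypothetical numbers:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    effect (e.g. per QALY gained) of a new technology over a comparator."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical figures: a technology costing 12,000 vs 10,000 per patient,
# yielding 1.30 vs 1.20 QALYs -> 2,000 / 0.10 = 20,000 per QALY gained.
value = icer(12000, 10000, 1.30, 1.20)
```

Decision makers then compare the ICER with a willingness-to-pay threshold; the sensitivity and scenario analyses mentioned above probe how that comparison shifts when the inputs are uncertain.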

  10. Use of dual-energy X-ray absorptiometry (DXA) for diagnosis and fracture risk assessment; WHO-criteria, T- and Z-score, and reference databases.

    PubMed

    Dimai, Hans P

    2017-11-01

    Dual-energy X-ray absorptiometry (DXA) is a two-dimensional imaging technology developed to assess bone mineral density (BMD) of the entire human skeleton, and specifically of the skeletal sites known to be most vulnerable to fracture. In order to simplify interpretation of BMD measurement results and allow comparability among different DXA devices, the T-score concept was introduced. In this concept an individual's BMD is compared with the mean value of a young healthy reference population, with the difference expressed in standard deviations (SD). Since the early 1990s, the diagnostic categories "normal, osteopenia, and osteoporosis", as recommended by a WHO working group, have been based on this concept. Thus, DXA is still the globally accepted "gold-standard" method for the noninvasive diagnosis of osteoporosis. Another score obtained from DXA measurement, termed the Z-score, describes the number of SDs by which the BMD of an individual differs from the mean value expected for age and sex. Although not intended for the diagnosis of osteoporosis in adults, it nevertheless provides information about an individual's fracture risk compared with peers. DXA measurement can either be used as a stand-alone means of assessing an individual's fracture risk or be incorporated into one of the available fracture risk assessment tools such as FRAX® or Garvan, improving the predictive power of such tools. The question of which reference databases should be used by DXA-device manufacturers for T-score reference standards has recently been addressed by an expert group, who recommended using the National Health and Nutrition Examination Survey III (NHANES III) database for the hip reference standard but the manufacturers' own databases for the lumbar spine. Furthermore, in men it is recommended to use a female reference database for calculation of the T-score and a male reference database for calculation of the Z-score.
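    The T- and Z-score arithmetic described above is a simple standardization against a reference mean and SD. A sketch with hypothetical reference values (real ones come from databases such as NHANES III):

```python
def t_score(bmd, young_mean, young_sd):
    """T-score: SDs by which an individual's BMD differs from the mean of a
    young healthy reference population (WHO criteria: T <= -2.5 osteoporosis,
    -2.5 < T < -1.0 osteopenia, T >= -1.0 normal)."""
    return (bmd - young_mean) / young_sd

def z_score(bmd, peer_mean, peer_sd):
    """Z-score: SDs by which BMD differs from the mean expected for age and sex."""
    return (bmd - peer_mean) / peer_sd

# Hypothetical values: a measured BMD of 0.690 g/cm2 against a young-adult
# mean of 0.950 and SD of 0.100 gives T = -2.6, in the osteoporosis range.
t = t_score(0.690, 0.950, 0.100)
```

The same measurement can yield a near-zero Z-score in an elderly patient, since the Z-score is referenced to age- and sex-matched peers rather than young adults.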

  11. Advanced life support study

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Summary reports on each of the eight tasks undertaken by this contract are given. Discussed here is an evaluation of a Closed Ecological Life Support System (CELSS), including modeling and analysis of Physical/Chemical Closed Loop Life Support (P/C CLLS); the Environmental Control and Life Support Systems (ECLSS) evolution - Intermodule Ventilation study; advanced technologies interface requirements relative to ECLSS; an ECLSS resupply analysis; the ECLSS module addition relocation systems engineering analysis; an ECLSS cost/benefit analysis to identify rack-level interface requirements of the alternate technologies evaluated in the ventilation study, with a comparison of these with the rack level interface requirements for the baseline technologies; advanced instrumentation - technology database enhancement; and a clean room survey and assessment of various ECLSS evaluation options for different growth scenarios.

  12. Aviation Trends Related to Atmospheric Environment Safety Technologies Project Technical Challenges

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Barr, Lawrence C.; Evans, Joni K.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    Current and future aviation safety trends related to the National Aeronautics and Space Administration's Atmospheric Environment Safety Technologies Project's three technical challenges (engine icing characterization and simulation capability; airframe icing simulation and engineering tool capability; and atmospheric hazard sensing and mitigation technology capability) were assessed by examining the National Transportation Safety Board (NTSB) accident database (1989 to 2008), incidents from the Federal Aviation Administration (FAA) accident/incident database (1989 to 2006), and literature from various industry and government sources. The accident and incident data were examined for events involving fixed-wing airplanes operating under Federal Aviation Regulation (FAR) Parts 121, 135, and 91 for atmospheric conditions related to airframe icing, ice-crystal engine icing, turbulence, clear air turbulence, wake vortex, lightning, and low visibility (fog, low ceiling, clouds, precipitation, and low lighting). Five future aviation safety risk areas associated with the three AEST technical challenges were identified after an exhaustive survey of a variety of sources and include: approach and landing accident reduction, icing/ice detection, loss of control in flight, super density operations, and runway safety.

  13. A comparison of two search methods for determining the scope of systematic reviews and health technology assessments.

    PubMed

    Forsetlund, Louise; Kirkehei, Ingvild; Harboe, Ingrid; Odgaard-Jensen, Jan

    2012-01-01

    This study aims to compare two different search methods for determining the scope of a requested systematic review or health technology assessment. The first method (called the Direct Search Method) included performing direct searches in the Cochrane Database of Systematic Reviews (CDSR), the Database of Abstracts of Reviews of Effects (DARE) and the Health Technology Assessment (HTA) database. Using the comparison method (called the NHS Search Engine) we performed searches by means of the search engine of the British National Health Service, NHS Evidence. We used an adapted cross-over design with a random allocation of fifty-five requests for systematic reviews. The main analyses were based on repeated measurements adjusted for the order in which the searches were conducted. The Direct Search Method generated on average fewer hits (48 percent [95 percent confidence interval (CI), 6 percent to 72 percent]), had a higher precision (0.22 [95 percent CI, 0.13 to 0.30]) and more unique hits than searching by means of the NHS Search Engine (50 percent [95 percent CI, 7 percent to 110 percent]). On the other hand, the Direct Search Method took longer (14.58 minutes [95 percent CI, 7.20 to 21.97]) and was perceived as somewhat less user-friendly than the NHS Search Engine (-0.60 [95 percent CI, -1.11 to -0.09]). Although the Direct Search Method had some drawbacks, being more time-consuming and less user-friendly, it generated more unique hits than the NHS Search Engine, retrieved on average fewer references, and returned fewer irrelevant results.
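    A note on the precision figure: in information retrieval, precision is the fraction of retrieved records that are relevant, so the 0.22 above means roughly 22 relevant hits per 100 retrieved. As a one-line sketch:

```python
def precision(relevant_retrieved, total_retrieved):
    """Fraction of retrieved records that are relevant to the request."""
    return relevant_retrieved / total_retrieved

# Illustrative: 22 relevant hits among 100 retrieved gives precision 0.22,
# matching the average reported for the Direct Search Method.
p = precision(22, 100)
```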

  14. Alternatives to relational database: comparison of NoSQL and XML approaches for clinical data storage.

    PubMed

    Lee, Ken Ka-Yin; Tang, Wai-Choi; Choi, Kup-Sze

    2013-04-01

    Clinical data are dynamic in nature, often arranged hierarchically and stored as free text and numbers. Effective management of clinical data and the transformation of the data into a structured format for data analysis are therefore challenging issues in electronic health records development. Despite the popularity of relational databases, the scalability of the NoSQL database model and the document-centric data structure of XML databases appear to be promising features for effective clinical data management. In this paper, three database approaches--NoSQL, XML-enabled and native XML--are investigated to evaluate their suitability for structured clinical data. The database query performance is reported, together with our experience in the databases' development. The results show that the NoSQL database is the best choice for query speed, whereas XML databases are advantageous in terms of scalability, flexibility and extensibility, which are essential to cope with the characteristics of clinical data. While NoSQL and XML technologies are relatively new compared with the conventional relational database, both demonstrate potential to become a key database technology for clinical data management as the technology further advances.
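    The document-versus-relational trade-off the study evaluates can be seen in miniature: a hierarchical clinical record stored naturally as one nested document, and the flattening a normalized relational schema would impose (the record shape and field names below are invented for illustration):

```python
# A hypothetical hierarchical clinical record, the shape that document-centric
# stores (NoSQL document databases, native XML databases) handle directly:
record = {
    "patient_id": "P001",
    "encounters": [
        {"date": "2012-03-01",
         "observations": [{"code": "BP", "value": "120/80"},
                          {"code": "HR", "value": "72"}]},
    ],
}

# The same data flattened into rows, as a normalized relational table requires;
# reassembling the hierarchy then takes joins at query time.
rows = [
    (record["patient_id"], enc["date"], obs["code"], obs["value"])
    for enc in record["encounters"]
    for obs in enc["observations"]
]
```

Schema changes illustrate the flexibility point too: adding a new observation field touches only the documents that use it, whereas the relational layout requires altering the table.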

  15. Overcoming Dietary Assessment Challenges in Low-Income Countries: Technological Solutions Proposed by the International Dietary Data Expansion (INDDEX) Project.

    PubMed

    Coates, Jennifer C; Colaiezzi, Brooke A; Bell, Winnie; Charrondiere, U Ruth; Leclercq, Catherine

    2017-03-16

    An increasing number of low-income countries (LICs) exhibit high rates of malnutrition coincident with rising rates of overweight and obesity. Individual-level dietary data are needed to inform effective responses, yet dietary data from large-scale surveys conducted in LICs remain extremely limited. This discussion paper first seeks to highlight the barriers to collection and use of individual-level dietary data in LICs. Second, it introduces readers to new technological developments and research initiatives to remedy this situation, led by the International Dietary Data Expansion (INDDEX) Project. Constraints to conducting large-scale dietary assessments include significant costs, time burden, technical complexity, and limited investment in dietary research infrastructure, including the necessary tools and databases required to collect individual-level dietary data in large surveys. To address existing bottlenecks, the INDDEX Project is developing a dietary assessment platform for LICs, called INDDEX24, consisting of a mobile application integrated with a web database application, which is expected to facilitate seamless data collection and processing. These tools will be subject to rigorous testing including feasibility, validation, and cost studies. To scale up dietary data collection and use in LICs, the INDDEX Project will also invest in food composition databases, an individual-level dietary data dissemination platform, and capacity development activities. Although the INDDEX Project activities are expected to improve the ability of researchers and policymakers in low-income countries to collect, process, and use dietary data, the global nutrition community is urged to commit further significant investments in order to adequately address the range and scope of challenges described in this paper.

  16. Overcoming Dietary Assessment Challenges in Low-Income Countries: Technological Solutions Proposed by the International Dietary Data Expansion (INDDEX) Project

    PubMed Central

    Coates, Jennifer C.; Colaiezzi, Brooke A.; Bell, Winnie; Charrondiere, U. Ruth; Leclercq, Catherine

    2017-01-01

    An increasing number of low-income countries (LICs) exhibit high rates of malnutrition coincident with rising rates of overweight and obesity. Individual-level dietary data are needed to inform effective responses, yet dietary data from large-scale surveys conducted in LICs remain extremely limited. This discussion paper first seeks to highlight the barriers to collection and use of individual-level dietary data in LICs. Second, it introduces readers to new technological developments and research initiatives to remedy this situation, led by the International Dietary Data Expansion (INDDEX) Project. Constraints to conducting large-scale dietary assessments include significant costs, time burden, technical complexity, and limited investment in dietary research infrastructure, including the necessary tools and databases required to collect individual-level dietary data in large surveys. To address existing bottlenecks, the INDDEX Project is developing a dietary assessment platform for LICs, called INDDEX24, consisting of a mobile application integrated with a web database application, which is expected to facilitate seamless data collection and processing. These tools will be subject to rigorous testing including feasibility, validation, and cost studies. To scale up dietary data collection and use in LICs, the INDDEX Project will also invest in food composition databases, an individual-level dietary data dissemination platform, and capacity development activities. Although the INDDEX Project activities are expected to improve the ability of researchers and policymakers in low-income countries to collect, process, and use dietary data, the global nutrition community is urged to commit further significant investments in order to adequately address the range and scope of challenges described in this paper. PMID:28300759

  17. A Holistic approach to assess older adults' wellness using e-health technologies.

    PubMed

    Thompson, Hilaire J; Demiris, George; Rue, Tessa; Shatil, Evelyn; Wilamowska, Katarzyna; Zaslavsky, Oleg; Reeder, Blaine

    2011-12-01

To date, methodologies are lacking that address a holistic assessment of wellness in older adults. Technology applications may provide a platform for such an assessment, but have not been validated. We set out to demonstrate whether e-health applications could support the holistic assessment of wellness in community-dwelling older adults. Twenty-seven residents of an independent retirement community were followed over 8 weeks. Subjects engaged in the use of diverse technologies to assess cognitive performance, physiological and functional variables, as well as psychometric components of wellness. Data were integrated from various e-health sources into one study database. Correlations were assessed between different parameters, and hierarchical cluster analysis was used to explore the validity of the wellness model. We found strong associations across multiple parameters of wellness within the conceptual model, including cognitive, functional, and physical. However, spirituality did not correlate with any other parameter studied, in contrast to prior studies of older adults. Participants expressed overall positive attitudes toward the e-health tools and the holistic approach to the assessment of wellness, without expressing any privacy concerns. Parameters were highly correlated across multiple domains of wellness. Important clusters were noted to be formed across cognitive and physiological domains, giving further evidence of the need for an integrated approach to the assessment of wellness. This finding warrants further replication in larger and more diverse samples of older adults to standardize and deploy these technologies across population groups.
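The correlation analysis this study describes (associations between wellness parameters) reduces to pairwise Pearson coefficients; a minimal pure-Python sketch, where the two score series and domain names are invented for illustration and are not data from the study:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score series."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical per-subject scores for two wellness domains
cognitive  = [55, 60, 62, 70, 71, 80]
functional = [50, 58, 61, 66, 72, 79]
r = pearson_r(cognitive, functional)
```

A matrix of such coefficients across all measured domains is what a hierarchical cluster analysis, as used in the study, would then group into related clusters.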

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Link, P.K.

    A total of 48 papers were presented at the Engineering Geology and Geotechnical Engineering 30th Symposium. These papers are presented in this proceedings under the following headings: site characterization--Pocatello area; site characterization--Boise Area; site assessment; Idaho National Engineering Laboratory; geophysical methods; remediation; geotechnical engineering; and hydrogeology, northern and western Idaho. Individual papers have been processed separately for inclusion in the Energy Science and Technology Database.

  19. Probabilistic simulation of concurrent engineering of propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

Technology readiness and the available infrastructure are assessed for timely computational simulation of concurrent engineering for propulsion systems. Results for initial coupled multidisciplinary, fabrication-process, and system simulators are presented, including uncertainties inherent in various facets of engineering processes. An approach is outlined for computationally formalizing the concurrent engineering process from cradle to grave via discipline-dedicated workstations linked with a common database.

  20. Databases for Assessment of Military Speech Technology Equipment. (les Bases de données pour l’évaluation des équipements de technologie vocale militaire)

    DTIC Science & Technology

    2000-03-01


  1. Dynamic Analytics-Driven Assessment of Vulnerabilities and Exploitation

    DTIC Science & Technology

    2016-07-15

    integration with big data technologies such as Hadoop, nor does it natively support exporting of events to external relational databases. OSSIM supports...power of big data analytics to determine correlations and temporal causality among vulnerabilities and cyber events. The vulnerability dependencies...via the SCAPE (formerly known as LLCySA [6]). This is illustrated as a big data cyber analytic system architecture in...

  2. Leveraging Relational Technology through Industry Partnerships.

    ERIC Educational Resources Information Center

    Brush, Leonard M.; Schaller, Anthony J.

    1988-01-01

    Carnegie Mellon University has leveraged its technological expertise with database management systems (DBMS) into joint technological and developmental partnerships with DBMS and application software vendors. Carnegie's relational database strategy, the strategy of partnerships and how they were formed, and how the partnerships are doing are…

  3. The USGS national geothermal resource assessment: An update

    USGS Publications Warehouse

    Williams, C.F.; Reed, M.J.; Galanis, S.P.; DeAngelo, J.

    2007-01-01

    The U. S. Geological Survey (USGS) is working with the Department of Energy's (DOE) Geothermal Technologies Program and other geothermal organizations on a three-year effort to produce an updated assessment of available geothermal resources. The new assessment will introduce significant changes in the models for geothermal energy recovery factors, estimates of reservoir volumes, and limits to temperatures and depths for electric power production. It will also include the potential impact of evolving Enhanced Geothermal Systems (EGS) technology. An important focus in the assessment project is on the development of geothermal resource models consistent with the production histories and observed characteristics of exploited geothermal fields. New models for the recovery of heat from heterogeneous, fractured reservoirs provide a physically realistic basis for evaluating the production potential of both natural geothermal reservoirs and reservoirs that may be created through the application of EGS technology. Project investigators have also made substantial progress studying geothermal systems and the factors responsible for their formation through studies in the Great Basin-Modoc Plateau region, Coso, Long Valley, the Imperial Valley and central Alaska. Project personnel are also entering the supporting data and resulting analyses into geospatial databases that will be produced as part of the resource assessment.

  4. [Effect of 3D printing technology on pelvic fractures: a Meta-analysis].

    PubMed

    Zhang, Yu-Dong; Wu, Ren-Yuan; Xie, Ding-Ding; Zhang, Lei; He, Yi; Zhang, Hong

    2018-05-25

    To evaluate the effect of 3D printing technology applied in the surgical treatment of pelvic fractures through the published literature by Meta-analysis. The PubMed database, EMCC database, CBM database, CNKI database, VIP database and Wanfang database were searched from the date of database foundation to August 2017 to collect the controlled clinical trials in which 3D printing technology was applied in preoperative planning of pelvic fracture surgery. The retrieved literatures were screened according to predefined inclusion and exclusion criteria, and quality evaluation was performed. Then, the available data were extracted and analyzed with the RevMan5.3 software. Totally 9 controlled clinical trials including 638 cases were chosen. Among them, 279 cases were assigned to the 3D printing technology group and 359 cases to the conventional group. The Meta-analysis results showed that the operative time [SMD=-2.81, 95%CI(-3.76, -1.85)], intraoperative blood loss [SMD=-3.28, 95%CI(-4.72, -1.85)] and the rate of complication [OR=0.47, 95%CI(0.25, 0.87)] in the 3D printing technology group were all lower than those in the conventional group; the excellent and good rate of pelvic fracture reduction [OR=2.09, 95%CI(1.32, 3.30)] and postoperative pelvic functional restoration [OR=1.94, 95%CI(1.15, 3.28)] in the 3D printing technology group were all superior to those in the conventional group. 3D printing technology applied in the surgical treatment of pelvic fractures has the advantages of shorter operative time, less intraoperative blood loss and lower rate of complication, and can improve the quality of pelvic fracture reduction and the recovery of postoperative pelvic function. Copyright© 2018 by the China Journal of Orthopaedics and Traumatology Press.
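Pooled odds ratios with confidence intervals, such as the OR=0.47 for complications above, can be produced by inverse-variance weighting of per-trial log odds ratios (one of the fixed-effect pooling methods offered by RevMan). A minimal sketch, using hypothetical 2x2 counts rather than the trials in this review:

```python
import math

def log_or_and_var(a, b, c, d):
    """Log odds ratio and its variance for one trial's 2x2 table:
    a/b = events/non-events in the intervention group,
    c/d = events/non-events in the control group."""
    lor = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d
    return lor, var

def pooled_or(tables):
    """Fixed-effect (inverse-variance) pooled OR with a 95% CI."""
    num = den = 0.0
    for table in tables:
        lor, var = log_or_and_var(*table)
        weight = 1.0 / var          # trials with less variance count more
        num += weight * lor
        den += weight
    mean = num / den
    se = math.sqrt(1.0 / den)
    return math.exp(mean), math.exp(mean - 1.96 * se), math.exp(mean + 1.96 * se)

# Hypothetical complication counts for three trials:
# (events_3dp, no_events_3dp, events_conventional, no_events_conventional)
tables = [(5, 25, 12, 18), (3, 27, 9, 21), (4, 30, 10, 24)]
or_, ci_low, ci_high = pooled_or(tables)
```

Because every invented trial here favours the intervention, the pooled OR falls below 1 with a CI that brackets it, mirroring the direction of the complication result reported above.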

  5. Accuracy of LightCycler® SeptiFast for the detection and identification of pathogens in the blood of patients with suspected sepsis: a systematic review protocol

    PubMed Central

    Wilson, Claire; Blackwood, Bronagh; McAuley, Danny F; Perkins, Gavin D; McMullan, Ronan; Gates, Simon; Warhurst, Geoffrey

    2012-01-01

    Background: There is growing interest in the potential utility of molecular diagnostics in improving the detection of life-threatening infection (sepsis). LightCycler® SeptiFast is a multipathogen probe-based real-time PCR system targeting DNA sequences of bacteria and fungi present in blood samples within a few hours. We report here the protocol of the first systematic review of published clinical diagnostic accuracy studies of this technology when compared with blood culture in the setting of suspected sepsis. Methods/design: Data sources: the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA), the NHS Economic Evaluation Database (NHSEED), The Cochrane Library, MEDLINE, EMBASE, ISI Web of Science, BIOSIS Previews, MEDION and the Aggressive Research Intelligence Facility Database (ARIF). Study selection: diagnostic accuracy studies that compare the real-time PCR technology with standard culture results performed on a patient's blood sample during the management of sepsis. Data extraction: three reviewers, working independently, will determine the level of evidence, methodological quality and a standard data set relating to demographics and diagnostic accuracy metrics for each study. Statistical analysis/data synthesis: heterogeneity of studies will be investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in Receiver Operator Characteristic (ROC) space. A bivariate model will be used to estimate summary sensitivity and specificity. The authors will investigate reporting biases using funnel plots based on effective sample size and regression tests of asymmetry. Subgroup analyses are planned for adults, children and infection setting (hospital vs community) if sufficient data are uncovered. 
Dissemination: Recommendations will be made to the Department of Health (as part of an open-access HTA report) as to whether the real-time PCR technology has sufficient clinical diagnostic accuracy potential to move forward to efficacy testing during the provision of routine clinical care. Registration: PROSPERO, the NIHR Prospective Register of Systematic Reviews (CRD42011001289). PMID:22240646
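The diagnostic accuracy metrics to be extracted from each study (sensitivity and specificity of the PCR test against the blood-culture reference standard) follow directly from a 2x2 table; a minimal sketch with hypothetical counts, not results from any study in this review:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity and specificity of an index test against a reference standard.
    tp/fn: reference-positive cases the index test did/did not detect;
    tn/fp: reference-negative cases the index test did/did not clear."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts for one study
sens, spec = diagnostic_accuracy(tp=80, fp=10, fn=20, tn=90)
```

Each study contributes one (1 - specificity, sensitivity) point in ROC space; the coupled forest plot and bivariate summary described in the protocol are built from exactly these per-study pairs.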

  6. Quality assessment of clinical practice guidelines for integrative medicine in China: A systematic review.

    PubMed

    Yao, Sha; Wei, Dang; Chen, Yao-Long; Wang, Qi; Wang, Xiao-Qin; Zeng, Zhao; Li, Hui

    2017-05-01

    To assess the quality of integrative medicine clinical practice guidelines (CPGs) published before 2014. A systematic search of the scientific literature published before 2014 was conducted to select integrative medicine CPGs. Four major Chinese integrated databases and one guideline database were searched: the Chinese Biomedical Literature Database (CBM), the China National Knowledge Infrastructure (CNKI), China Science and Technology Journal Database (VIP), Wanfang Data, and the China Guideline Clearinghouse (CGC). Four reviewers independently assessed the quality of the included guidelines using the Appraisal of Guidelines for Research and Evaluation (AGREE) II Instrument. Overall consensus among the reviewers was assessed using the intra-class correlation coefficient (ICC). A total of 41 guidelines published from 2003 to 2014 were included. The overall consensus among the reviewers was good [ICC: 0.928; 95% confidence interval (CI): 0.920 to 0.935]. The scores on the 6 AGREE domains were: 17% for scope and purpose (range: 6% to 32%), 11% for stakeholder involvement (range: 0 to 24%), 10% for rigor of development (range: 3% to 22%), 39% for clarity and presentation (range: 25% to 64%), 11% for applicability (range: 4% to 24%), and 1% for editorial independence (range: 0 to 15%). The quality of integrative medicine CPGs was low; their development should be guided by systematic methodology. More emphasis should be placed on multi-disciplinary guideline development groups, quality of evidence, management of funding and conflicts of interest, and guideline updates in the process of developing integrative medicine CPGs in China.
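The domain percentages reported above are scaled AGREE II domain scores, computed as (obtained score - minimum possible score) / (maximum possible score - minimum possible score), with each item rated 1-7 by each appraiser. A minimal sketch with hypothetical ratings, not the appraisals from this review:

```python
def agree_domain_score(item_scores):
    """Scaled AGREE II domain score in [0, 1].

    item_scores: one list of 1-7 item ratings per appraiser
    (all appraisers rate the same items in the domain)."""
    n_appraisers = len(item_scores)
    n_items = len(item_scores[0])
    obtained = sum(sum(row) for row in item_scores)
    min_possible = 1 * n_items * n_appraisers   # every item rated 1
    max_possible = 7 * n_items * n_appraisers   # every item rated 7
    return (obtained - min_possible) / (max_possible - min_possible)

# Hypothetical ratings: 4 appraisers x 3 items for one domain
ratings = [[2, 3, 2], [1, 2, 2], [3, 2, 1], [2, 2, 2]]
score = agree_domain_score(ratings)
```

With mostly low ratings like these, the scaled score lands near the 10-17% range reported for most domains in the review, which is what "low quality" means operationally under AGREE II.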

  7. ECLSS evolution: Advanced instrumentation interface requirements. Volume 3: Appendix C

    NASA Technical Reports Server (NTRS)

    1991-01-01

    An Advanced ECLSS (Environmental Control and Life Support System) Technology Interfaces Database was developed primarily to provide ECLSS analysts with a centralized and portable source of ECLSS technology interface requirements data. The database contains 20 technologies which were previously identified in the MDSSC ECLSS Technologies database. The primary interfaces of interest in this database are fluid, electrical, and data/control interfaces, and resupply requirements. Each record contains fields describing the function and operation of the technology. Fields include an interface diagram, a description of applicable design points and operating ranges, and an explanation of data, as required. A complete set of data was entered for six of the twenty components, including Solid Amine Water Desorbed (SAWD), Thermoelectric Integrated Membrane Evaporation System (TIMES), Electrochemical Carbon Dioxide Concentrator (EDC), Solid Polymer Electrolysis (SPE), Static Feed Electrolysis (SFE), and BOSCH. Additional data were collected for Reverse Osmosis Water Reclamation - Potable (ROWRP), Reverse Osmosis Water Reclamation - Hygiene (ROWRH), Static Feed Solid Polymer Electrolyte (SFSPE), Trace Contaminant Control System (TCCS), and Multifiltration Water Reclamation - Hygiene (MFWRH). A summary of the database contents is presented in this report.

  8. Keyless Entry: Building a Text Database Using OCR Technology.

    ERIC Educational Resources Information Center

    Grotophorst, Clyde W.

    1989-01-01

    Discusses the use of optical character recognition (OCR) technology to produce an ASCII text database. A tutorial on digital scanning and OCR is provided, and a systems integration project which used the Calera CDP-3000XF scanner and text retrieval software to construct a database of dissertations at George Mason University is described. (four…

  9. A Database Practicum for Teaching Database Administration and Software Development at Regis University

    ERIC Educational Resources Information Center

    Mason, Robert T.

    2013-01-01

    This research paper compares a database practicum at the Regis University College for Professional Studies (CPS) with technology oriented practicums at other universities. Successful andragogy for technology courses can motivate students to develop a genuine interest in the subject, share their knowledge with peers and can inspire students to…

  10. Alternatives to relational databases in precision medicine: Comparison of NoSQL approaches for big data storage using supercomputers

    NASA Astrophysics Data System (ADS)

    Velazquez, Enrique Israel

    Improvements in medical and genomic technologies have dramatically increased the production of electronic data over the last decade. As a result, data management is rapidly becoming a major determinant, and urgent challenge, for the development of Precision Medicine. Although successful data management is achievable using Relational Database Management Systems (RDBMS), exponential data growth is a significant contributor to failure scenarios. Growing amounts of data can also be observed in other sectors, such as economics and business, which, together with the previous facts, suggests that alternate database approaches (NoSQL) may soon be required for efficient storage and management of big databases. However, this hypothesis has been difficult to test in the Precision Medicine field since alternate database architectures are complex to assess and means to integrate heterogeneous electronic health records (EHR) with dynamic genomic data are not easily available. In this dissertation, we present a novel set of experiments for identifying NoSQL database approaches that enable effective data storage and management in Precision Medicine using patients' clinical and genomic information from the cancer genome atlas (TCGA). The first experiment draws on performance and scalability from biologically meaningful queries with differing complexity and database sizes. The second experiment measures performance and scalability in database updates without schema changes. The third experiment assesses performance and scalability in database updates with schema modifications due to dynamic data. We have identified two NoSQL approaches, based on Cassandra and Redis, which seem to be the ideal database management systems for our precision medicine queries in terms of performance and scalability. We present NoSQL approaches and show how they can be used to manage clinical and genomic big data. 
Our research is relevant to public health since we are focusing on one of the main challenges to the development of Precision Medicine and, consequently, investigating a potential solution to the progressively increasing demands on health care.
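The Redis-style access pattern evaluated in the dissertation can be sketched with a plain dict standing in for the key-value store; the key scheme and record layout below are hypothetical illustrations, not the schema used in the study:

```python
# A plain dict stands in for a Redis-like key-value store.
# Composed keys ("entity:id:facet") replace relational joins.
kv_store = {
    "patient:42:clinical": {"age": 61, "diagnosis": "LUAD"},
    "patient:42:variants": ["TP53:p.R273H", "KRAS:p.G12C"],
    "patient:99:clinical": {"age": 55, "diagnosis": "BRCA"},
}

def get_patient_variants(patient_id):
    """Constant-time lookup by composed key; returns [] for unknown patients."""
    return kv_store.get(f"patient:{patient_id}:variants", [])

variants = get_patient_variants(42)
```

The design point this illustrates is that reads stay O(1) per key as the database grows, and adding a new facet (a new "patient:id:facet" key) requires no schema migration, which is the property the update-with-schema-modification experiment probes.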

  11. Current Abstracts Nuclear Reactors and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bales, J.D.; Hicks, S.C.

    1993-01-01

    This publication Nuclear Reactors and Technology (NRT) announces on a monthly basis the current worldwide information available from the open literature on nuclear reactors and technology, including all aspects of power reactors, components and accessories, fuel elements, control systems, and materials. This publication contains the abstracts of DOE reports, journal articles, conference papers, patents, theses, and monographs added to the Energy Science and Technology Database during the past month. Also included are US information obtained through acquisition programs or interagency agreements and international information obtained through the International Energy Agency's Energy Technology Data Exchange or government-to-government agreements. The digests in NRT and other citations to information on nuclear reactors back to 1948 are available for online searching and retrieval on the Energy Science and Technology Database and Nuclear Science Abstracts (NSA) database. Current information, added daily to the Energy Science and Technology Database, is available to DOE and its contractors through the DOE Integrated Technical Information System. Customized profiles can be developed to provide current information to meet each user's needs.

  12. Nuclear Reactors and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cason, D.L.; Hicks, S.C.

    1992-01-01

    This publication Nuclear Reactors and Technology (NRT) announces on a monthly basis the current worldwide information available from the open literature on nuclear reactors and technology, including all aspects of power reactors, components and accessories, fuel elements, control systems, and materials. This publication contains the abstracts of DOE reports, journal articles, conference papers, patents, theses, and monographs added to the Energy Science and Technology Database during the past month. Also included are US information obtained through acquisition programs or interagency agreements and international information obtained through the International Energy Agency's Energy Technology Data Exchange or government-to-government agreements. The digests in NRT and other citations to information on nuclear reactors back to 1948 are available for online searching and retrieval on the Energy Science and Technology Database and Nuclear Science Abstracts (NSA) database. Current information, added daily to the Energy Science and Technology Database, is available to DOE and its contractors through the DOE Integrated Technical Information System. Customized profiles can be developed to provide current information to meet each user's needs.

  13. Implementing a national early awareness and alert system for new and emerging health technologies in Italy: the COTE Project.

    PubMed

    Migliore, Antonio; Perrini, Maria Rosaria; Jefferson, Tom; Cerbo, Marina

    2012-07-01

    The aim of this study was to establish a national Early Awareness and Alert (EAA) system for the identification and assessment of new and emerging health technologies in Italy. In 2008, Agenas, a public body supporting Regions and the Ministry of Health (MoH) in health services research, started a project named COTE (Observatory of New and Emerging Health Technologies) with the ultimate aim of implementing a national EAA system. The COTE project involved all stakeholders (MoH, Regions, Industry, Universities, technical government bodies, and Scientific Societies) in defining the key characteristics and methods of the EAA system. Agreement with stakeholders was reached using three separate workshops. During the workshops, participants shared and agreed methods for identification of new and emerging health technologies, prioritization, and assessment. The structure of the Horizon Scanning (HS) reports was discussed and defined. The main channels for dissemination of outputs were identified as the EuroScan database and the stakeholders' Web portals. During the final workshop, Agenas presented the first three HS reports produced at national level and proposed the establishment of a permanent national EAA system. The COTE Project created the basis for a permanent national EAA system in Italy. An infrastructure to enable the stakeholder network to grow was created, methods to submit new and emerging health technologies for possible evaluation were established, methods for assessment of the technologies selected were defined, and stakeholder involvement was delineated (in the identification, assessment, and dissemination stages).

  14. Economic evaluations in gastroenterology in Brazil: A systematic review.

    PubMed

    de Paiva Haddad, Luciana Bertocco; Decimoni, Tassia Cristina; Turri, Jose Antonio; Leandro, Roseli; de Soárez, Patrícia Coelho

    2016-02-06

    To systematically review economic evaluations in gastroenterology, relating to Brazil, published between 1980 and 2013. We selected full and partial economic evaluations from among those retrieved by searching the following databases: MEDLINE (PubMed); Excerpta Medica; the Latin American and Caribbean Health Sciences Literature database; the Scientific Electronic Library Online; the database of the Centre for Reviews and Dissemination; the National Health Service (NHS) Economic Evaluation Database; the NHS Health Technology Assessment database; the Health Economics database of the Brazilian Virtual Library of Health; Scopus; Web of Science; and the Brazilian Network for the Evaluation of Health Technologies. Two researchers, working independently, selected the studies and extracted the data. We identified 535 health economic evaluations relating to Brazil and published in the 1980-2013 period. Of those 535 articles, only 40 dealt with gastroenterology. Full and partial economic evaluations respectively accounted for 23 (57.5%) and 17 (42.5%) of the 40 studies included. Among the 23 full economic evaluations, there were 11 cost-utility analyses, seven cost-effectiveness analyses, four cost-consequence analyses, and one cost-minimization analysis. Of the 40 studies, 25 (62.5%) evaluated medications; 7 (17.5%) evaluated procedures; and 3 (7.5%) evaluated equipment. Most (55%) of the studies were related to viral hepatitis, and most (63.4%) were published after 2010. Other topics included gastrointestinal cancer, liver transplantation, digestive diseases and hernias. Over the 33-year period examined, the number of such economic evaluations relating to Brazil, especially of those evaluating medications for the treatment of hepatitis, increased considerably. Further studies are needed in order to ensure that expenditures on health care in Brazil are made as fairly and efficiently as possible.

  15. Economic evaluations in gastroenterology in Brazil: A systematic review

    PubMed Central

    de Paiva Haddad, Luciana Bertocco; Decimoni, Tassia Cristina; Turri, Jose Antonio; Leandro, Roseli; de Soárez, Patrícia Coelho

    2016-01-01

    AIM: To systematically review economic evaluations in gastroenterology, relating to Brazil, published between 1980 and 2013. METHODS: We selected full and partial economic evaluations from among those retrieved by searching the following databases: MEDLINE (PubMed); Excerpta Medica; the Latin American and Caribbean Health Sciences Literature database; the Scientific Electronic Library Online; the database of the Centre for Reviews and Dissemination; the National Health Service (NHS) Economic Evaluation Database; the NHS Health Technology Assessment database; the Health Economics database of the Brazilian Virtual Library of Health; Scopus; Web of Science; and the Brazilian Network for the Evaluation of Health Technologies. Two researchers, working independently, selected the studies and extracted the data. RESULTS: We identified 535 health economic evaluations relating to Brazil and published in the 1980-2013 period. Of those 535 articles, only 40 dealt with gastroenterology. Full and partial economic evaluations respectively accounted for 23 (57.5%) and 17 (42.5%) of the 40 studies included. Among the 23 full economic evaluations, there were 11 cost-utility analyses, seven cost-effectiveness analyses, four cost-consequence analyses, and one cost-minimization analysis. Of the 40 studies, 25 (62.5%) evaluated medications; 7 (17.5%) evaluated procedures; and 3 (7.5%) evaluated equipment. Most (55%) of the studies were related to viral hepatitis, and most (63.4%) were published after 2010. Other topics included gastrointestinal cancer, liver transplantation, digestive diseases and hernias. Over the 33-year period examined, the number of such economic evaluations relating to Brazil, especially of those evaluating medications for the treatment of hepatitis, increased considerably. CONCLUSION: Further studies are needed in order to ensure that expenditures on health care in Brazil are made as fairly and efficiently as possible. PMID:26855823

  16. Enhanced Living by Assessing Voice Pathology Using a Co-Occurrence Matrix

    PubMed Central

    Muhammad, Ghulam; Alhamid, Mohammed F.; Hossain, M. Shamim; Almogren, Ahmad S.; Vasilakos, Athanasios V.

    2017-01-01

    A large proportion of the population around the world suffers from various disabilities. Disabilities affect not only children but also adults of different professions. Smart technology can assist the disabled population and lead to a comfortable life in an enhanced living environment (ELE). In this paper, we propose an effective voice pathology assessment system that works in a smart home framework. The proposed system takes input from various sensors, and processes the acquired voice signals and electroglottography (EGG) signals. Co-occurrence matrices in different directions and neighborhoods from the spectrograms of these signals were obtained. Several features such as energy, entropy, contrast, and homogeneity from these matrices were calculated and fed into a Gaussian mixture model-based classifier. Experiments were performed with a publicly available database, namely, the Saarbrucken voice database. The results demonstrate the feasibility of the proposed system in light of its high accuracy and speed. The proposed system can be extended to assess other disabilities in an ELE. PMID:28146069

  17. Enhanced Living by Assessing Voice Pathology Using a Co-Occurrence Matrix.

    PubMed

    Muhammad, Ghulam; Alhamid, Mohammed F; Hossain, M Shamim; Almogren, Ahmad S; Vasilakos, Athanasios V

    2017-01-29

    A large proportion of the population around the world suffers from various disabilities. Disabilities affect not only children but also adults of different professions. Smart technology can assist the disabled population and lead to a comfortable life in an enhanced living environment (ELE). In this paper, we propose an effective voice pathology assessment system that works in a smart home framework. The proposed system takes input from various sensors, and processes the acquired voice signals and electroglottography (EGG) signals. Co-occurrence matrices in different directions and neighborhoods from the spectrograms of these signals were obtained. Several features such as energy, entropy, contrast, and homogeneity from these matrices were calculated and fed into a Gaussian mixture model-based classifier. Experiments were performed with a publicly available database, namely, the Saarbrucken voice database. The results demonstrate the feasibility of the proposed system in light of its high accuracy and speed. The proposed system can be extended to assess other disabilities in an ELE.
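The co-occurrence matrix features named in this abstract (energy, entropy, contrast, homogeneity) have standard textbook definitions over the normalized matrix entries. A minimal sketch over a toy matrix, with counts invented for illustration rather than taken from the Saarbrucken database:

```python
import math

def glcm_features(glcm):
    """Energy, entropy, contrast and homogeneity of a co-occurrence
    matrix of raw counts (normalized internally to probabilities)."""
    total = sum(sum(row) for row in glcm)
    energy = entropy = contrast = homogeneity = 0.0
    for i, row in enumerate(glcm):
        for j, count in enumerate(row):
            p = count / total
            if p > 0:
                energy += p * p                          # sum of squared probabilities
                entropy -= p * math.log2(p)              # Shannon entropy in bits
                contrast += (i - j) ** 2 * p             # weights distant level pairs
                homogeneity += p / (1 + abs(i - j))      # weights near-diagonal mass
    return energy, entropy, contrast, homogeneity

# Toy 3x3 co-occurrence counts (diagonal-heavy, i.e. a smooth signal)
glcm = [[4, 2, 0],
        [2, 4, 1],
        [0, 1, 2]]
energy, entropy, contrast, homogeneity = glcm_features(glcm)
```

Each matrix (one per direction and neighborhood) yields one such feature vector; concatenating the vectors gives the input the abstract describes feeding into the Gaussian mixture model classifier.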

  18. Creation of a digital slide and tissue microarray resource from a multi-institutional predictive toxicology study in the rat: an initial report from the PredTox group.

    PubMed

    Mulrane, Laoighse; Rexhepaj, Elton; Smart, Valerie; Callanan, John J; Orhan, Diclehan; Eldem, Türkan; Mally, Angela; Schroeder, Susanne; Meyer, Kirstin; Wendt, Maria; O'Shea, Donal; Gallagher, William M

    2008-08-01

    The widespread use of digital slides has only recently come to the fore with the development of high-throughput scanners and high performance viewing software. This development, along with the optimisation of compression standards and image transfer techniques, has allowed the technology to be used in wide reaching applications including integration of images into hospital information systems and histopathological training, as well as the development of automated image analysis algorithms for prediction of histological aberrations and quantification of immunohistochemical stains. Here, the use of this technology in the creation of a comprehensive library of images of preclinical toxicological relevance is demonstrated. The images, acquired using the Aperio ScanScope CS and XT slide acquisition systems, form part of the ongoing EU FP6 Integrated Project, Innovative Medicines for Europe (InnoMed). In more detail, PredTox (abbreviation for Predictive Toxicology) is a subproject of InnoMed and comprises a consortium of 15 industrial (13 large pharma, 1 technology provider and 1 SME) and three academic partners. The primary aim of this consortium is to assess the value of combining data generated from 'omics technologies (proteomics, transcriptomics, metabolomics) with the results from more conventional toxicology methods, to facilitate further informed decision making in preclinical safety evaluation. A library of 1709 scanned images was created of full-face sections of liver and kidney tissue specimens from male Wistar rats treated with 16 proprietary and reference compounds of known toxicity; additional biological materials from these treated animals were separately used to create 'omics data, that will ultimately be used to populate an integrated toxicological database. 
With respect to the assessment of the digital slides, a web-enabled digital slide management system, Digital SlideServer (DSS), was employed to enable integration of the digital slide content into the 'omics database and to facilitate remote viewing by pathologists connected with the project. DSS also facilitated manual annotation of the digital slides by the pathologists, specifically in relation to marking particular lesions of interest. Tissue microarrays (TMAs) were constructed from the specimens for the purpose of creating a repository of tissue from animals used in the study, with a view to later-stage biomarker assessment. As the PredTox consortium itself aims to identify new biomarkers of toxicity, these TMAs will be a valuable means of validation. In summary, a large repository of histological images was created, enabling the subsequent pathological analysis of samples through remote viewing and, along with the utilisation of TMA technology, allowing the validation of biomarkers identified by the PredTox consortium. The population of the PredTox database with these digitised images represents the creation of the first toxicological database integrating 'omics and preclinical data with histological images.

  19. The Use of Technology in Identifying Hospital Malnutrition: Scoping Review.

    PubMed

    Trtovac, Dino; Lee, Joon

    2018-01-19

    Malnutrition is a condition most commonly arising from the inadequate consumption of nutrients necessary to maintain physiological health and is associated with the development of cardiovascular disease, osteoporosis, and sarcopenia. Malnutrition occurring in the hospital setting is caused by insufficient monitoring, identification, and assessment efforts. Furthermore, the ability of health care workers to identify and recognize malnourished patients is suboptimal. Therefore, interventions focusing on the identification and treatment of malnutrition are valuable, as they reduce the risks and rates of malnutrition within hospitals. Technology may be a particularly useful ally in identifying malnutrition due to scalability, timeliness, and effectiveness. In an effort to explore the issue, this scoping review synthesized the availability of technological tools to detect and identify hospital malnutrition. Our objective was to conduct a scoping review of the different forms of technology used in addressing malnutrition among adults admitted to hospital to (1) identify the extent of the published literature on this topic, (2) describe key findings, and (3) identify outcomes. We designed and implemented a search strategy in 3 databases (PubMed, Scopus, and CINAHL). We completed a descriptive numerical summary and analyzed study characteristics. One reviewer independently extracted data from the databases. We retrieved and reviewed a total of 21 articles. We categorized articles by the computerized tool or app type: malnutrition assessment (n=15), food intake monitoring (n=5), or both (n=1). Within those categories, we subcategorized the different technologies as either hardware (n=4), software (n=13), or both (n=4). An additional subcategory under software was cloud-based apps (n=1). 
Malnutrition in the acute hospital setting was largely an unrecognized problem, owing to insufficient monitoring, identification, and initial assessment of both patients who are already malnourished and those who are at risk of malnourishment. Studies also examined the ability of health care workers (nurses and doctors), whose knowledge base is focused on clinical care, to accurately and consistently identify malnourished geriatric patients within that setting. Most articles reported that the tools were effective in increasing malnutrition detection and awareness. Computerized tools and apps may also help reduce health care workers' workload and time spent assessing patients for malnutrition. Hospitals may also benefit from implementing malnutrition technology through decreased length of stay, along with decreased foregone costs related to missed malnutrition diagnoses. It is beneficial to study the impact of these technologies to examine possible areas of improvement. A future systematic review would further contribute to the evidence and effectiveness of the use of technologies in assessing and monitoring hospital malnutrition. ©Dino Trtovac, Joon Lee. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 19.01.2018.

  20. Enablers and barriers to using two-way information technology in the management of adults with diabetes: A descriptive systematic review.

    PubMed

    Macdonald, Emma M; Perrin, Byron M; Kingsley, Michael Ic

    2017-01-01

    Background This systematic review aimed to explore the enablers and barriers faced by adults with diabetes using two-way information communication technologies to support diabetes self-management. Methods Relevant literature was obtained from five databases using search strategies combining four major constructs: adults with diabetes, biomedical technology, communication technology and patient utilisation. Results Of 8430 unique articles identified, 48 were included for review. Risk of bias was assessed using either the Newcastle-Ottawa or Cochrane risk of bias assessment tools. Seventy-one percent of studies were of cohort design, with the majority assessed at high or unclear risk of bias. Consistently identified barriers included poorly designed interfaces requiring manual data entry and systems that lacked functionalities valued by patients. Commonly cited enablers included access to reliable technology, highly automated data entry and transmission, graphical display of data with immediate feedback, and supportive health care professionals and family members. Conclusions People with diabetes face a number of potentially modifiable barriers in using technology to support their diabetes management. In order to address these barriers, end users should be consulted in the design process and consideration given to theories of technology adoption to inform design and implementation. Systems should be designed to solve clinical or behavioural problems that are identified by patients as priorities. Technology should be as automated, streamlined, mobile, low cost and integrated as possible in order to limit the burden of usage for the patient and maximise clinical usefulness.

  1. Development of expert systems for analyzing electronic documents

    NASA Astrophysics Data System (ADS)

    Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.

    2018-05-01

    The paper analyses a database management system (DBMS). Expert systems, databases, and database technology have become an essential component of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment, consisting of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans and Alfresco, together with system components previously developed at Tomsk State University at the Faculty of Innovative Technology.

  2. Systematic review and modelling of the cost-effectiveness of cardiac magnetic resonance imaging compared with current existing testing pathways in ischaemic cardiomyopathy.

    PubMed

    Campbell, Fiona; Thokala, Praveen; Uttley, Lesley C; Sutton, Anthea; Sutton, Alex J; Al-Mohammad, Abdallah; Thomas, Steven M

    2014-09-01

    Cardiac magnetic resonance imaging (CMR) is increasingly used to assess patients for myocardial viability prior to revascularisation. This is important to ensure that only those likely to benefit are subjected to the risk of revascularisation. To assess current evidence on the accuracy and cost-effectiveness of CMR to test patients prior to revascularisation in ischaemic cardiomyopathy; to develop an economic model to assess cost-effectiveness for different imaging strategies; and to identify areas for further primary research. Initial searches were conducted in March 2011 in the following databases (with coverage dates): MEDLINE including MEDLINE In-Process & Other Non-Indexed Citations via Ovid (1946 to March 2011); Bioscience Information Service (BIOSIS) Previews via Web of Science (1969 to March 2011); EMBASE via Ovid (1974 to March 2011); Cochrane Database of Systematic Reviews via The Cochrane Library (1996 to March 2011); Cochrane Central Register of Controlled Trials via The Cochrane Library (1998 to March 2011); Database of Abstracts of Reviews of Effects via The Cochrane Library (1994 to March 2011); NHS Economic Evaluation Database via The Cochrane Library (1968 to March 2011); Health Technology Assessment Database via The Cochrane Library (1989 to March 2011); and the Science Citation Index via Web of Science (1900 to March 2011).
Additional searches were conducted from October to November 2011 in the following databases with dates: MEDLINE including MEDLINE In-Process & Other Non-Indexed Citations via Ovid (1946 to November 2011); BIOSIS Previews via Web of Science (1969 to October 2011); EMBASE via Ovid (1974 to November 2011); Cochrane Database of Systematic Reviews via The Cochrane Library (1996 to November 2011); Cochrane Central Register of Controlled Trials via The Cochrane Library (1998 to November 2011); Database of Abstracts of Reviews of Effects via The Cochrane Library (1994 to November 2011); NHS Economic Evaluation Database via The Cochrane Library (1968 to November 2011); Health Technology Assessment Database via The Cochrane Library (1989 to November 2011); and the Science Citation Index via Web of Science (1900 to October 2011). Electronic databases were searched March-November 2011. The systematic review selected studies that assessed the clinical effectiveness and cost-effectiveness of CMR to establish the role of CMR in viability assessment compared with other imaging techniques: stress echocardiography, single-photon emission computed tomography (SPECT) and positron emission tomography (PET). Studies had to have an appropriate reference standard and contain accuracy data or sufficient details so that accuracy data could be calculated. Data were extracted by two reviewers and discrepancies resolved by discussion. Quality of studies was assessed using the QUADAS II tool (University of Bristol, Bristol, UK). A rigorous diagnostic accuracy systematic review assessed clinical and cost-effectiveness of CMR in viability assessment. A health economic model estimated costs and quality-adjusted life-years (QALYs) accrued by diagnostic pathways for identifying patients with viable myocardium in ischaemic cardiomyopathy with a view to revascularisation. The pathways involved CMR, stress echocardiography, SPECT, PET alone or in combination. 
Strategies of no testing and revascularisation were included to determine the most cost-effective strategy. Twenty-four studies met the inclusion criteria. All were prospective. Participant numbers ranged from 8 to 52. The mean left ventricular ejection fraction in studies reporting this outcome was 24-62%. CMR approaches included stress CMR and late gadolinium-enhanced cardiovascular magnetic resonance imaging (CE CMR). Recovery following revascularisation was the reference standard. Twelve studies assessed diagnostic accuracy of stress CMR and 14 studies assessed CE CMR. A bivariate regression model was used to calculate the sensitivity and specificity of CMR. Summary sensitivity and specificity for stress CMR was 82.2% [95% confidence interval (CI) 73.2% to 88.7%] and 87.1% (95% CI 80.4% to 91.7%) and for CE CMR was 95.5% (95% CI 94.1% to 96.7%) and 53% (95% CI 40.4% to 65.2%) respectively. The sensitivity and specificity of PET, SPECT and stress echocardiography were calculated using data from 10 studies and systematic reviews. The sensitivity of PET was 94.7% (95% CI 90.3% to 97.2%), of SPECT was 85.1% (95% CI 78.1% to 90.2%) and of stress echocardiography was 77.6% (95% CI 70.7% to 83.3%). The specificity of PET was 68.8% (95% CI 50% to 82.9%), of SPECT was 62.1% (95% CI 52.7% to 70.7%) and of stress echocardiography was 69.6% (95% CI 62.4% to 75.9%). All currently used diagnostic strategies were cost-effective compared with no testing at current National Institute for Health and Care Excellence thresholds. If the annual mortality rates for non-viable patients were assumed to be higher for revascularised patients, then testing with CE CMR was most cost-effective at a threshold of £20,000/QALY. The proportion of model runs in which each strategy was most cost-effective, at a threshold of £20,000/QALY, was 40% for CE CMR, 42% for PET and 16.5% for revascularising everyone. The expected value of perfect information at £20,000/QALY was £620 per patient. 
If all patients (viable or not) gained benefit from revascularisation, then it was most cost-effective to revascularise all patients. Definitions and techniques assessing viability were highly variable, making data extraction and comparisons difficult. Lack of evidence meant assumptions were made in the model leading to uncertainty; differing scenarios were generated around key assumptions. All the diagnostic pathways are a cost-effective use of NHS resources. Given the uncertainty in the mortality rates, the cost-effectiveness analysis was performed using a set of scenarios. The cost-effectiveness analyses suggest that CE CMR and revascularising everyone were the optimal strategies. Future research should look at implementation costs for this type of imaging service, provide guidance on consistent reporting of diagnostic testing data for viability assessment, and focus on the impact of revascularisation or best medical therapy in this group of high-risk patients. The National Institute of Health Technology Assessment programme.
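
The decision rule underlying these cost-effectiveness comparisons, choosing the strategy with the highest net monetary benefit (NMB = threshold × QALYs − cost) at a given willingness-to-pay threshold, can be sketched briefly. Only the £20,000/QALY threshold comes from the review; the per-strategy costs and QALYs below are invented for illustration and are not the review's figures.

```python
# Hypothetical costs and QALYs per diagnostic strategy; illustrative
# figures only, not taken from the review.
LAMBDA = 20_000  # willingness-to-pay threshold, GBP per QALY

strategies = {
    "CE CMR":            {"cost": 11_200, "qalys": 6.15},
    "PET":               {"cost": 12_000, "qalys": 6.18},
    "Revascularise all": {"cost": 14_500, "qalys": 6.20},
    "No testing":        {"cost":  9_000, "qalys": 5.90},
}

def net_monetary_benefit(cost, qalys, wtp=LAMBDA):
    # NMB = lambda * QALYs - cost; the highest NMB is optimal at that threshold
    return wtp * qalys - cost

ranked = sorted(strategies.items(),
                key=lambda kv: net_monetary_benefit(**kv[1]),
                reverse=True)
for name, v in ranked:
    print(f"{name}: NMB = £{net_monetary_benefit(**v):,.0f}")
```

With these made-up inputs the ranking happens to favour CE CMR, mirroring the review's finding at the £20,000/QALY threshold; with different inputs the ranking would change.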

  3. An Analysis Platform for Mobile Ad Hoc Network (MANET) Scenario Execution Log Data

    DTIC Science & Technology

    2016-01-01

    these technologies. 4.1 Backend Technologies: Java 1.8; mysql-connector-java-5.0.8.jar; Tomcat; VirtualBox; Kali MANET Virtual Machine. 4.2 Frontend Technologies: LAMPP. 4.3 Database: MySQL Server. 5. Database: The SEDAP database settings and structure are described in this section ... contains all the backend Java functionality, including the web services, and should be placed in the webapps directory inside the Tomcat installation

  4. ICT based technology to support play for children with severe physical disabilities.

    PubMed

    van den Heuvel, Renée; Lexis, Monique; de Witte, Luc

    2015-01-01

    Play is important for a child's development. Children with severe physical disabilities experience difficulties engaging in play. With the progress of technology the possibilities to support play are increasing. The purpose of this review was to gain insight into the possibilities and availability of ICT based technology to support play in children with severe physical disabilities. A systematic literature search within the databases PubMed, CINAHL, IEEE and ERIC was carried out. Three reviewers assessed titles and abstracts independently. Additionally, Google Scholar, conference proceedings and reference lists were used. The included publications reported on 27 different technologies, which can be classified into three main groups; robots, virtual reality systems and computer systems. There are several options that may have great potential in supporting play for this target group.

  5. Text mining to decipher free-response consumer complaints: insights from the NHTSA vehicle owner's complaint database.

    PubMed

    Ghazizadeh, Mahtab; McDonald, Anthony D; Lee, John D

    2014-09-01

    This study applies text mining to extract clusters of vehicle problems and associated trends from free-response data in the National Highway Traffic Safety Administration's vehicle owner's complaint database. As the automotive industry adopts new technologies, it is important to systematically assess the effect of these changes on traffic safety. Driving simulators, naturalistic driving data, and crash databases all contribute to a better understanding of how drivers respond to changing vehicle technology, but other approaches, such as automated analysis of incident reports, are needed. Free-response data from incidents representing two severity levels (fatal incidents and incidents involving injury) were analyzed using a text mining approach: latent semantic analysis (LSA). LSA and hierarchical clustering identified clusters of complaints for each severity level, which were compared and analyzed across time. Cluster analysis identified eight clusters of fatal incidents and six clusters of incidents involving injury. Comparisons showed that although the airbag clusters across the two severity levels have the same most frequent terms, the circumstances around the incidents differ. The time trends show clear increases in complaints surrounding the Ford/Firestone tire recall and the Toyota unintended acceleration recall. Increases in complaints may be partially driven by these recall announcements and the associated media attention. Text mining can reveal useful information from free-response databases that would otherwise be prohibitively time-consuming and difficult to summarize manually. Text mining can extend human analysis capabilities for large free-response databases to support earlier detection of problems and more timely safety interventions.
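
The core of the pipeline described above, latent semantic analysis over free-response complaint text, can be sketched in plain NumPy: build a term-document matrix, take a truncated SVD, and compare documents in the reduced space. The six toy complaints are invented, and a simple cosine-similarity comparison stands in for the study's hierarchical clustering step.

```python
import numpy as np

# Tiny corpus standing in for free-text complaint narratives (made up).
docs = [
    "airbag failed to deploy in crash",
    "airbag did not deploy during crash",
    "tire tread separated on highway",
    "tire blew out tread separation",
    "sudden unintended acceleration pedal",
    "vehicle accelerated suddenly pedal stuck",
]

# Build a term-document count matrix.
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}
X = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        X[index[w], j] += 1

# LSA: a truncated SVD projects documents into a low-rank semantic space.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 3
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T          # one k-dim vector per doc
doc_vecs /= np.linalg.norm(doc_vecs, axis=1, keepdims=True)

# Cosine similarity in the reduced space; related complaints group together.
sim = doc_vecs @ doc_vecs.T
print(np.round(sim, 2))
```

On real data the reduced document vectors would then be fed to hierarchical clustering, as in the study, rather than inspected pairwise.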

  6. The economic impact of Clostridium difficile infection: a systematic review.

    PubMed

    Nanwa, Natasha; Kendzerska, Tetyana; Krahn, Murray; Kwong, Jeffrey C; Daneman, Nick; Witteman, William; Mittmann, Nicole; Cadarette, Suzanne M; Rosella, Laura; Sander, Beate

    2015-04-01

    With Clostridium difficile infection (CDI) on the rise, knowledge of the current economic burden of CDI can inform decisions on interventions related to CDI. We systematically reviewed CDI cost-of-illness (COI) studies. We performed literature searches in six databases: MEDLINE, Embase, the Health Technology Assessment Database, the National Health Service Economic Evaluation Database, the Cost-Effectiveness Analysis Registry, and EconLit. We also searched gray literature and conducted reference list searches. Two reviewers screened articles independently. One reviewer abstracted data and assessed quality using a modified guideline for economic evaluations. The second reviewer validated the abstraction and assessment. We identified 45 COI studies between 1988 and June 2014. Most (84%) of the studies were from the United States, calculating costs of hospital stays (87%), and focusing on direct costs (100%). Attributable mean CDI costs ranged from $8,911 to $30,049 for hospitalized patients. Few studies stated resource quantification methods (0%), an epidemiological approach (0%), or a justified study perspective (16%) in their cost analyses. In addition, few studies conducted sensitivity analyses (7%). Forty-five COI studies quantified and confirmed the economic impact of CDI. Costing methods across studies were heterogeneous. Future studies should follow standard COI methodology, expand study perspectives (e.g., patient), and explore populations least studied (e.g., community-acquired CDI).

  7. Measurement tools for the diagnosis of nasal septal deviation: a systematic review

    PubMed Central

    2014-01-01

    Objective To perform a systematic review of measurement tools utilized for the diagnosis of nasal septal deviation (NSD). Methods Electronic database searches were performed using MEDLINE (from 1966 to second week of August 2013), EMBASE (from 1966 to second week of August 2013), Web of Science (from 1945 to second week of August 2013) and all Evidence Based Medicine Reviews Files (EBMR); Cochrane Database of Systematic Reviews (CDSR), Cochrane Central Register of Controlled Trials (CCTR), Cochrane Methodology Register (CMR), Database of Abstracts of Reviews of Effects (DARE), American College of Physicians Journal Club (ACP Journal Club), Health Technology Assessments (HTA), NHS Economic Evaluation Database (NHSEED) until the second quarter of 2013. The search terms used in database searches were ‘nasal septum’, ‘deviation’, ‘diagnosis’, ‘nose deformities’ and ‘nose malformation’. The studies were reviewed using the updated Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) tool. Results Online searches resulted in 23 abstracts after removal of duplicates that resulted from overlap of studies between the electronic databases. An additional 15 abstracts were excluded due to lack of relevance. A total of 8 studies were systematically reviewed. Conclusions Diagnostic modalities such as acoustic rhinometry, rhinomanometry and nasal spectral sound analysis may be useful in identifying NSD in the anterior region of the nasal cavity, but these tests in isolation are of limited utility. Compared to anterior rhinoscopy, nasal endoscopy, and imaging, the above-mentioned index tests lack sensitivity and specificity in identifying the presence, location, and severity of NSD. PMID:24762010

  8. Assessment and diffusion of medical innovations in France: an overview

    PubMed Central

    Dubromel, Amélie; Geffroy, Loïc; Aulagner, Gilles; Dussart, Claude

    2018-01-01

    Background: In France, a significant part of health expenditure is publicly funded, which puts a heavy burden on society. In an economic context requiring tight control of public spending, it seems relevant to control the diffusion of medical innovations. That is why health technology assessment is attracting increasing interest at the national level for management and approval decisions. This article provides an overview of the assessment and diffusion of medical innovation in France. Method: The data are extracted from the websites and documents of French authorities and organisations, and from French legislative texts. In addition, for the discussion, a search of the MEDLINE database was carried out. Results: An overview of the assessment and diffusion of medical innovation in France is given by presenting the different types of medical innovations according to the French health system definition (I); introducing the French authorities participating in health technology assessment and describing the assessment procedures for medical innovations (II); and giving details about the market access process for innovative health products in France (III). Key opportunities and challenges of medical innovation assessment and diffusion in France are discussed at the end of this article. Conclusion: In France, medical innovation is considered a crucial component of quality of care and of the performance of the healthcare system. The aim of health technology assessment is to promote secure and timely access to innovation for patients. Nevertheless, it appears necessary to improve regulatory mechanisms. PMID:29686802

  9. Assessment and diffusion of medical innovations in France: an overview.

    PubMed

    Dubromel, Amélie; Geffroy, Loïc; Aulagner, Gilles; Dussart, Claude

    2018-01-01

    Background: In France, a significant part of health expenditure is publicly funded, which puts a heavy burden on society. In an economic context requiring tight control of public spending, it seems relevant to control the diffusion of medical innovations. That is why health technology assessment is attracting increasing interest at the national level for management and approval decisions. This article provides an overview of the assessment and diffusion of medical innovation in France. Method: The data are extracted from the websites and documents of French authorities and organisations, and from French legislative texts. In addition, for the discussion, a search of the MEDLINE database was carried out. Results: An overview of the assessment and diffusion of medical innovation in France is given by presenting the different types of medical innovations according to the French health system definition (I); introducing the French authorities participating in health technology assessment and describing the assessment procedures for medical innovations (II); and giving details about the market access process for innovative health products in France (III). Key opportunities and challenges of medical innovation assessment and diffusion in France are discussed at the end of this article. Conclusion: In France, medical innovation is considered a crucial component of quality of care and of the performance of the healthcare system. The aim of health technology assessment is to promote secure and timely access to innovation for patients. Nevertheless, it appears necessary to improve regulatory mechanisms.

  10. Expanding the Security Dimension of Surety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SENGLAUB, MICHAEL E.

    1999-10-01

    A small effort was conducted at Sandia National Laboratories to explore the use of a number of modern analytic technologies in the assessment of terrorist actions and to predict trends. This work focuses on Bayesian networks as a means of capturing correlations between groups, tactics, and targets. The data used to test the methodology were obtained by using a special parsing algorithm written in Java to create records in a database from information articles captured electronically. As a vulnerability assessment technique the approach proved very useful. The technology also proved to be a valuable development medium because of the ability to integrate blocks of information into a deployed network rather than waiting to fully deploy only after all relevant information has been assembled.
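
The report's core idea, a Bayesian network capturing correlations between groups, tactics, and targets, can be sketched with a toy discrete network and inference by enumeration. The structure (group as parent of tactic and target) and all probabilities below are invented for illustration; the actual Sandia networks and data are not described in the abstract.

```python
# Toy Bayesian network: Group -> Tactic, Group -> Target.
# All structure and probabilities are hypothetical.
P_group = {"G1": 0.6, "G2": 0.4}
P_tactic_given_group = {
    "G1": {"bombing": 0.7, "kidnapping": 0.3},
    "G2": {"bombing": 0.2, "kidnapping": 0.8},
}
P_target_given_group = {
    "G1": {"infrastructure": 0.8, "personnel": 0.2},
    "G2": {"infrastructure": 0.3, "personnel": 0.7},
}

def posterior_group(tactic, target):
    """P(group | tactic, target) by enumeration over the joint distribution."""
    joint = {g: P_group[g]
                * P_tactic_given_group[g][tactic]
                * P_target_given_group[g][target]
             for g in P_group}
    z = sum(joint.values())          # normalising constant P(tactic, target)
    return {g: p / z for g, p in joint.items()}

print(posterior_group("bombing", "infrastructure"))
```

Observing a tactic/target pair updates the belief over which group is responsible; with these made-up numbers, a bombing against infrastructure points strongly to G1.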

  11. Cryogenic hydrogen-induced air liquefaction technologies

    NASA Technical Reports Server (NTRS)

    Escher, William J. D.

    1990-01-01

    Extensively utilizing a special advanced airbreathing propulsion archives database, as well as direct contacts with individuals who were active in the field in previous years, a technical assessment of cryogenic hydrogen-induced air liquefaction, as a prospective onboard aerospace vehicle process, was performed and documented. The resulting assessment report is summarized. Technical findings are presented on the status of air liquefaction technology, both as a singular technical area and as part of a cluster of collateral technical areas, including: compact lightweight cryogenic heat exchangers; alleviation of heat-exchanger fouling by atmospheric constituents; para/ortho hydrogen shift conversion catalysts; hydrogen turbine expanders, cryogenic air compressors and liquid air pumps; hydrogen recycling using slush hydrogen as a heat sink; liquid hydrogen/liquid air rocket-type combustion devices; air collection and enrichment systems (ACES); and technically related engine concepts.

  12. Review and assessment of the database and numerical modeling for turbine heat transfer

    NASA Technical Reports Server (NTRS)

    Gladden, H. J.; Simoneau, R. J.

    1988-01-01

    The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.

  13. Sensor Acquisition for Water Utilities: Survey, Down Selection Process, and Technology List

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alai, M; Glascoe, L; Love, A

    2005-06-29

    The early detection of the biological and chemical contamination of water distribution systems is a necessary capability for securing the nation's water supply. Current and emerging early-detection technology capabilities and shortcomings need to be identified and assessed to provide government agencies and water utilities with an improved methodology for assessing the value of installing these technologies. The Department of Homeland Security (DHS) has tasked a multi-laboratory team to evaluate current and future needs to protect the nation's water distribution infrastructure by supporting an objective evaluation of current and new technologies. The LLNL deliverable from this Operational Technology Demonstration (OTD) was to assist the development of a technology acquisition process for a water distribution early warning system. The technology survey includes a review of previous sensor surveys and current test programs, and a compiled database of relevant technologies. In the survey paper we discuss previous efforts by governmental agencies, research organizations, and private companies. We provide a survey of previous sensor studies with regard to the use of Early Warning Systems (EWS) that includes earlier surveys, testing programs, and response studies. The list of sensor technologies was ultimately developed to assist in the recommendation of candidate technologies for laboratory and field testing. A set of recommendations for future sensor selection efforts has been appended to this document, as has a down-selection example for a hypothetical water utility.

  14. The Health Information Technology Competencies Tool: Does It Translate for Nursing Informatics in the United States?

    PubMed

    Sipes, Carolyn; Hunter, Kathleen; McGonigle, Dee; West, Karen; Hill, Taryn; Hebda, Toni

    2017-12-01

    Information technology use in healthcare delivery mandates a prepared workforce. The initial Health Information Technology Competencies tool resulted from a 2-year transatlantic effort by experts from the US and European Union to identify approaches to develop skills and knowledge needed by healthcare workers. It was determined that competencies must be identified before strategies are established, resulting in a searchable database of more than 1000 competencies representing five domains, five skill levels, and more than 250 roles. Health Information Technology Competencies is available at no cost and supports role- or competency-based queries. Health Information Technology Competencies developers suggest its use for curriculum planning, job descriptions, and professional development. The Chamberlain College of Nursing informatics research team examined Health Information Technology Competencies for its possible application to our research and our curricular development, comparing it originally with the TIGER-based Assessment of Nursing Informatics Competencies and Nursing Informatics Competency Assessment of Level 3 and Level 4 tools, which examine informatics competencies at four levels of nursing practice. Additional analysis involved the 2015 Nursing Informatics: Scope and Standards of Practice. Informatics is a Health Information Technology Competencies domain, so clear delineation of nursing-informatics competencies was expected. Researchers found TIGER-based Assessment of Nursing Informatics Competencies and Nursing Informatics Competency Assessment of Level 3 and Level 4 differed from Health Information Technology Competencies 2016 in focus, definitions, ascribed competencies, and defined levels of expertise. When Health Information Technology Competencies 2017 was compared against the nursing informatics scope and standards, researchers found an increase in the number of informatics competencies but not to a significant degree. 
This is not surprising, given that Health Information Technology Competencies includes all healthcare workers, while the TIGER-based Assessment of Nursing Informatics Competencies and Nursing Informatics Competency Assessment of Level 3 and Level 4 tools and the American Nurses Association Nursing Informatics: Scope and Standards of Practice are nurse specific. No clear cross mapping across these tools and the standards of nursing informatics practice exists. Further examination and review are needed to translate Health Information Technology Competencies as a viable tool for nursing informatics use in the US.

  15. Many Miles to Go: A Systematic Review of the State of Cost-Utility Analyses in Brazil.

    PubMed

    Campolina, Alessandro G; Rozman, Luciana M; Decimoni, Tassia C; Leandro, Roseli; Novaes, Hillegonda M D; De Soárez, Patrícia Coelho

    2017-04-01

    Little is known about the quality and quantity of cost-utility analyses (CUAs) in Brazil. The objective of this study was to provide a systematic review of published CUAs of healthcare technologies in Brazil. We performed a systematic review of economic evaluations studies published in MEDLINE, EMBASE, LILACS (Latin American and Caribbean Health Sciences Literature), SciELO (Scientific Electronic Library Online), NHS EED (National Health Service Economic Evaluation Database), HTA (Health Technology Assessment) Database, Web of Science, Scopus, Bireme (Biblioteca Regional de Medicina), BVS ECOS (Health Economics database of the Brazilian Virtual Library of Health), and SISREBRATS (Sistema de Informação da Rede Brasileira de Avaliação de Tecnologias em Saúde [Brazilian Network for the Evaluation of Health Technologies]) from 1980 to 2013. Articles were included if they were CUAs according to the classification devised by Drummond et al. Two independent reviewers screened articles for relevance and carried out data extraction. Disagreements were resolved through discussion or through consultation with a third reviewer. We performed a qualitative narrative synthesis. Of the 535 health economic evaluations (HEEs) relating to Brazil, only 40 were CUAs and therefore included in the analysis. Most studies adhered to methodological guidelines for quality of reporting and 77.5% used quality-adjusted life-years (QALYs) as the health outcome. Of these studies, 51.6% did not report the population used to elicit preferences for outcomes and 45.2% used a specific population such as expert opinion. The preference elicitation method was not reported in 58.1% of these studies. The majority (80.6%) of studies did not report the instrument used to derive health state valuations and no publication reported whether tariffs (or preference weights) were national or international. No study mentioned the methodology used to estimate QALYs. 
Many published Brazilian cost-utility studies adhere to key recommended general methods for HEE; however, the use of QALY calculations is far from being the current international standard. Development of health preferences research can contribute to quality improvement of health technology assessment reports in Brazil.

  16. [Conceptual foundations of creation of branch database of technology and intellectual property rights owned by scientific institutions, organizations, higher medical educational institutions and enterprises of healthcare sphere of Ukraine].

    PubMed

    Horban', A Ie

    2013-09-01

    This article considers the implementation of state policy on technology transfer in the medical branch under the Law of Ukraine No. 5407-VI of 02.10.2012, "On Amendments to the Law of Ukraine 'On State Regulation of Activity in the Field of Technology Transfer'", in particular the formation of a branch database of technologies and intellectual property rights owned by budget-funded scientific institutions, organizations, higher medical educational institutions, and enterprises of the healthcare sphere of Ukraine. International and domestic experience in processing information about intellectual property rights and in supporting the transfer of new technologies is analyzed. The main conceptual principles for creating this branch technology transfer database and the branch technology transfer network are defined.

  17. Evaluation of relational and NoSQL database architectures to manage genomic annotations.

    PubMed

    Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard

    2016-12-01

    While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
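The trade-off the authors measure can be pictured with a toy sketch. This is not the study's benchmark code: Python's built-in sqlite3 stands in for MySQL on the relational side, a plain nested dictionary stands in for a MongoDB document, and all identifiers and coordinates are invented.

```python
import sqlite3

# Hypothetical illustration of the two storage models compared in the study;
# every rsid, chromosome, and position below is made up.

# Relational model: one row per variant, a fixed schema, an explicit index.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE variants (
    rsid TEXT PRIMARY KEY, chrom TEXT, pos INTEGER, ref TEXT, alt TEXT)""")
conn.execute("INSERT INTO variants VALUES ('rs0000001', '1', 12345, 'T', 'A')")
conn.execute("CREATE INDEX idx_pos ON variants (chrom, pos)")  # indexing step
row = conn.execute(
    "SELECT ref, alt FROM variants WHERE chrom = '1' AND pos = 12345"
).fetchone()

# Document model: the same record as one nested document; annotation fields
# of varying shape can be added per document without a schema migration.
doc = {"rsid": "rs0000001", "chrom": "1", "pos": 12345,
       "alleles": {"ref": "T", "alt": "A"},
       "annotations": ["toy annotation 1", "toy annotation 2"]}
print(row, doc["alleles"]["alt"])
```

The document layout hints at why a schema-less store can suit dbSNP-style data: each entry may carry a different set of annotations, which is awkward to express in a fixed relational schema.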

  18. Solving Relational Database Problems with ORDBMS in an Advanced Database Course

    ERIC Educational Resources Information Center

    Wang, Ming

    2011-01-01

    This paper introduces how to use the object-relational database management system (ORDBMS) to solve relational database (RDB) problems in an advanced database course. The purpose of the paper is to provide a guideline for database instructors who desire to incorporate the ORDB technology in their traditional database courses. The paper presents…

  19. Constraints on global temperature target overshoot.

    PubMed

    Ricke, K L; Millar, R J; MacMartin, D G

    2017-11-07

    In the aftermath of the Paris Agreement, the climate science and policy communities are beginning to assess the feasibility and potential benefits of limiting global warming to 1.5 °C or 2 °C above preindustrial levels. Understanding the dependence of the magnitude and duration of possible temporary exceedance (i.e., "overshoot") of temperature targets on sustainable energy decarbonization futures and carbon dioxide (CO2) removal rates will be an important contribution to this policy discussion. Drawing upon results from the mitigation literature and the IPCC Working Group 3 (WG3) scenario database, we examine the global mean temperature implications of differing, independent pathways for the decarbonization of global energy supply and the implementation of negative emissions technologies. We find that within the scope of scenarios broadly consistent with the WG3 database, the magnitude of temperature overshoot is more sensitive to the rate of decarbonization. However, limiting the duration of overshoot to less than two centuries requires ambitious deployment of both decarbonization and negative emissions technology. The dependence of the properties of temperature target overshoot on currently untested negative emissions technologies suggests that it will be important to consider how climate impacts depend on both the magnitude and duration of overshoot, not just long-term residual warming.

  20. [Neuromuscular electric stimulation therapy in otorhinolaryngology].

    PubMed

    Miller, S; Kühn, D; Jungheim, M; Schwemmle, C; Ptok, M

    2014-02-01

    Animal experiments have shown that after specific nerve traumatization, neuromuscular electrostimulation (NMES) can promote nerve regeneration and reduce synkinesia without negatively interfering with normal regeneration processes. NMES is used routinely in physical rehabilitation medicine. This systematic literature search in the Cochrane Central Register of Controlled Trials, the Cochrane Database of Systematic Reviews, the DAHTA database, the Health Technology Assessment Database and MEDLINE or PubMed considered studies on the use of NMES in otorhinolaryngology that have been published in German or English. The search identified 180 studies. These were evaluated and relevant studies were included in the further evaluation. In the fields of otorhinolaryngology and phoniatry/paediatric audiology, clinical studies investigating the effects of NMES on facial and laryngeal paresis, as well as dysphonia and dysphagia have been carried out. The evidence collected to date is encouraging; particularly for the treatment of certain forms of dysphagia and laryngeal paresis.

  1. In situ remediation of DNAPL compounds in low permeability media fate/transport, in situ control technologies, and risk reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-08-01

    In this project, in situ remediation technologies are being tested and evaluated for both source control and mass removal of dense, non-aqueous phase liquid (DNAPL) compounds in low permeability media (LPM). This effort is focused on chlorinated solvents (e.g., trichloroethylene and perchloroethylene) in the vadose and saturated zones of low permeability, massive deposits, and stratified deposits with inter-bedded clay lenses. The project includes technology evaluation and screening analyses and field-scale testing at both clean and contaminated sites in the US and Canada. Throughout this project, activities have been directed at understanding the processes that influence DNAPL compound migration and treatment in LPM and at assessing the operation and performance of the remediation technologies developed and tested. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  2. A Toolkit for Active Object-Oriented Databases with Application to Interoperability

    NASA Technical Reports Server (NTRS)

    King, Roger

    1996-01-01

    In our original proposal we stated that our research would 'develop a novel technology that provides a foundation for collaborative information processing.' The essential ingredient of this technology is the notion of 'deltas,' which are first-class values representing collections of proposed updates to a database. The Heraclitus framework provides a variety of algebraic operators for building up, combining, inspecting, and comparing deltas. Deltas can be directly applied to the database to yield a new state, or used 'hypothetically' in queries against the state that would arise if the delta were applied. The central point here is that the step of elevating deltas to 'first-class' citizens in database programming languages will yield tremendous leverage on the problem of supporting updates in collaborative information processing. In short, our original intention was to develop the theoretical and practical foundation for a technology based on deltas in an object-oriented database context, develop a toolkit for active object-oriented databases, and apply this toward collaborative information processing.
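The delta idea can be sketched in a few lines of Python. This is a hypothetical illustration of the concept only, not the Heraclitus framework's actual operators or API; a real delta algebra would also represent deletions and detect conflicts.

```python
# Deltas as first-class values: collections of proposed updates that can be
# combined, inspected, and applied -- or used hypothetically in a query.

def apply_delta(state, delta):
    """Return the state that would arise if `delta` were applied."""
    new_state = dict(state)
    new_state.update(delta)
    return new_state

def merge(d1, d2):
    """Combine two deltas; later updates win, as in sequential composition."""
    combined = dict(d1)
    combined.update(d2)
    return combined

db = {"balance": 100, "owner": "unassigned"}
d1 = {"balance": 150}     # one collaborator's proposed update
d2 = {"owner": "editor"}  # another collaborator's proposed update

# Hypothetical query: inspect the would-be state without committing anything.
hypothetical = apply_delta(db, merge(d1, d2))
print(db["balance"], hypothetical["balance"], hypothetical["owner"])
```

The point of the sketch is the separation it makes visible: the database state is untouched until a delta is explicitly applied, so collaborators can build up, exchange, and compare proposed updates freely.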

  4. Video Discs in Libraries.

    ERIC Educational Resources Information Center

    Barker, Philip

    1986-01-01

    Discussion of developments in information storage technology likely to have significant impact upon library utilization focuses on hardware (videodisc technology) and software developments (knowledge databases; computer networks; database management systems; interactive video, computer, and multimedia user interfaces). Three generic computer-based…

  5. Results of a national survey indicating information technology skills needed by nurses at time of entry into the work force.

    PubMed

    McCannon, Melinda; O'Neal, Pamela V

    2003-08-01

    A national survey was conducted to determine the information technology skills nurse administrators consider critical for new nurses entering the work force. The sample consisted of 2,000 randomly selected members of the American Organization of Nurse Executives. Seven hundred fifty-two usable questionnaires were returned, for a response rate of 38%. The questionnaire used a 5-point Likert scale and consisted of 17 items that assessed various technology skills and demographic information. The questionnaire was developed and pilot tested with content experts to establish content validity. Descriptive analysis of the data revealed that using e-mail effectively, operating basic Windows applications, and searching databases were critical information technology skills. The most critical information technology skill involved knowing nursing-specific software, such as bedside charting and computer-activated medication dispensers. To effectively prepare nursing students with technology skills needed at the time of entry into practice, nursing faculty need to incorporate information technology skills into undergraduate nursing curricula.

  6. A content analysis of Health Technology Assessment programs in Latin America.

    PubMed

    Arellano, Luis E; Reza, Mercedes; Blasco, Juan Antonio; Andradas, Elena

    2009-10-01

    Health Technology Assessment (HTA) is a relatively new concept in Latin America (LA). The objectives of this exploratory study were to identify HTA programs in LA, review HTA documents produced by those programs, and assess the extent to which HTA aims are being achieved. An electronic search through two databases was performed to identify HTA programs in LA. A content analysis was performed on HTA documents (n = 236) produced by six programs between January 2000 and March 2007. Results were analyzed by comparing document content with the main goals of HTA. The number of HTA documents increased incrementally during the study period. The documents produced were mostly short HTA documents (82 percent) that assessed technologies such as drugs (31 percent), diagnostic and/or screening technologies (18 percent), or medical procedures (18 percent). Two-thirds (66 percent) of all HTA documents addressed issues related to clinical effectiveness and economic evaluations. Ethical, social, and/or legal issues were rarely addressed (<1 percent). The two groups most often targeted for dissemination of HTA information were third-party payers (55 percent) or government policy makers (41 percent). This study showed that while HTA programs in LA have attempted to address the main goals of HTA, they have done so through the production of short documents that focus on practical high-technology areas of importance to two specific target groups. Clinical and economic considerations still take precedence over ethical, social, and/or legal issues. Thus, an integrated conceptual framework in LA is wanting.

  7. Collaboration in health technology assessment (EUnetHTA joint action, 2010-2012): four case studies.

    PubMed

    Huić, Mirjana; Nachtnebel, Anna; Zechmeister, Ingrid; Pasternak, Iris; Wild, Claudia

    2013-07-01

    The aim of this study was to present the first four collaborative health technology assessment (HTA) processes on health technologies of different types and life cycles, targeted toward diverse HTA users, together with the facilitators of and barriers to these collaborations. Retrospective analysis, through four case studies, was performed on the first four collaboration experiences of agencies participating in the EUnetHTA Joint Action project (2010-12), comprising different types and life cycles of health technologies for a diverse target audience, and different types of collaboration. The methods used to initiate collaboration, partner contributions, the assessment methodology, report structure, time frame, and factors acting as possible barriers to and facilitators of this collaboration were described. Two ways were used to initiate collaboration in the first four collaborative HTA processes: active brokering of information, so-called "calls for collaboration," and individual contact between agencies after identifying a topic common to two agencies in the Planned and Ongoing Projects database. Several success factors are recognized: predefined project management; a high degree of commitment to the project; adherence to timelines; high relevance of the technology; a common understanding of the methods applied and advanced experience in HTA; and, finally, acceptance of English-written reports by decision makers in non-English-speaking countries. Timely and efficient collaborative HTA processes on the relative efficacy/effectiveness and safety of different types and life cycles of health technologies, targeted toward diverse HTA users in Europe, are possible. However, barriers such as late identification of collaborative partners, non-acceptance of English-language reports, and differing assessment methodologies must still be overcome.

  8. Collaboration spotting for dental science.

    PubMed

    Leonardi, E; Agocs, A; Fragkiskos, S; Kasfikis, N; Le Goff, J M; Cristalli, M P; Luzzi, V; Polimeni, A

    2014-10-06

    The goal of the Collaboration Spotting project is to create an automatic system to collect information about publications and patents related to a given technology, to identify the key players involved, and to highlight collaborations and related technologies. The collected information can be visualized in a web browser as interactive graphical maps showing in an intuitive way the players and their collaborations (Sociogram) and the relations among the technologies (Technogram). We propose to use the system to study technologies related to Dental Science. In order to create a Sociogram, we create a logical filter based on a set of keywords related to the technology under study. This filter is used to extract a list of publications from the Web of Science™ database. The list is validated by an expert in the technology and sent to CERN, where it is inserted in the Collaboration Spotting database. Here, an automatic software system uses the data to generate the final maps. We studied a set of recent technologies related to bone regeneration procedures for oro-maxillo-facial critical size defects, namely the use of porous hydroxyapatite (HA) as a bone substitute alone (bone graft) or as a tridimensional support (scaffold) for insemination and differentiation ex vivo of mesenchymal stem cells. We produced the Sociograms for these technologies and the resulting maps are now accessible on-line. The Collaboration Spotting system allows the automatic creation of interactive maps to show the current and historical state of research on a specific technology. These maps are an ideal tool both for researchers who want to assess the state-of-the-art in a given technology, and for research organizations who want to evaluate their contribution to the technological development in a given field. We demonstrated that the system can be used for Dental Science and produced the maps for an initial set of technologies in this field. 
We now plan to enlarge the set of mapped technologies in order to make the Collaboration Spotting system a useful reference tool for Dental Science research.
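The "logical filter" step described above can be pictured with a toy sketch. The records, keywords, and matching rule below are invented for illustration; the actual system queries the Web of Science database with expert-validated keyword expressions.

```python
# Toy keyword filter: select publication records whose titles satisfy a
# boolean keyword expression for the technology under study (hypothetical).

records = [
    {"title": "Porous hydroxyapatite scaffolds for bone regeneration"},
    {"title": "Dental caries epidemiology in children"},
    {"title": "Stem cells seeded on hydroxyapatite bone grafts"},
]

def matches(title):
    """(hydroxyapatite) AND (scaffold OR graft) -- an invented filter."""
    t = title.lower()
    return "hydroxyapatite" in t and ("scaffold" in t or "graft" in t)

selected = [r["title"] for r in records if matches(r["title"])]
print(selected)
```

In the project workflow, the list such a filter produces would then be validated by a domain expert before being loaded into the Collaboration Spotting database.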

  9. Collaboration Spotting for oral medicine.

    PubMed

    Leonardi, E; Agocs, A; Fragkiskos, S; Kasfikis, N; Le Goff, J M; Cristalli, M P; Luzzi, V; Polimeni, A

    2014-09-01

    The goal of the Collaboration Spotting project is to create an automatic system to collect information about publications and patents related to a given technology, to identify the key players involved, and to highlight collaborations and related technologies. The collected information can be visualized in a web browser as interactive graphical maps showing in an intuitive way the players and their collaborations (Sociogram) and the relations among the technologies (Technogram). We propose to use the system to study technologies related to oral medicine. In order to create a sociogram, we create a logical filter based on a set of keywords related to the technology under study. This filter is used to extract a list of publications from the Web of Science™ database. The list is validated by an expert in the technology and sent to CERN, where it is inserted in the Collaboration Spotting database. Here, an automatic software system uses the data to generate the final maps. We studied a set of recent technologies related to bone regeneration procedures for oro-maxillo-facial critical size defects, namely the use of porous hydroxyapatite (HA) as a bone substitute alone (bone graft) or as a tridimensional support (scaffold) for insemination and differentiation ex vivo of mesenchymal stem cells. We produced the sociograms for these technologies and the resulting maps are now accessible on-line. The Collaboration Spotting system allows the automatic creation of interactive maps to show the current and historical state of research on a specific technology. These maps are an ideal tool both for researchers who want to assess the state-of-the-art in a given technology, and for research organizations who want to evaluate their contribution to the technological development in a given field. We demonstrated that the system can be used in oral medicine and produced the maps for an initial set of technologies in this field. 
We now plan to enlarge the set of mapped technologies in order to make the Collaboration Spotting system a useful reference tool for oral medicine research.

  10. [Review of the health technology assessment on surgeries in Japan].

    PubMed

    Nishigori, Tatsuto; Kawakami, Koji; Goto, Rei; Hida, Koya; Sakai, Yoshiharu

    2015-01-01

    Health Technology Assessment (HTA) is the systematic evaluation of the value of new health technologies. It improves the quality of choices among cost-effective health technologies that are considered valuable. Japan has built a long-lived society on the institution of a universal health care system, which is financially unsustainable. In Japan, no independent HTA organization has been publicly established, but the government is contemplating the implementation of such a system. To extend the use of HTA to surgery, we need to establish methods for evaluating new surgical technologies with steep learning curves. The promotion of clinical research is also essential, especially by taking advantage of observational studies from medical big data such as the Japanese nationwide database, which has more than four million surgical cases registered. In addition, we need more clinical information regarding each surgical patient's quality of life and socioeconomic status. Countries that have already introduced HTA into their health care systems have taken measures to solve the problems that arose and have developed the necessary evaluation methods. To introduce and promote HTA in Japan without taking away the benefits of our current healthcare, surgeons must collaborate with other specialists such as methodologists and health economists.

  11. Assessment of pollution prevention and control technology for plating operations

    NASA Technical Reports Server (NTRS)

    Chalmer, Paul D.; Sonntag, William A.; Cushnie, George C., Jr.

    1995-01-01

    The National Center for Manufacturing Sciences (NCMS) is sponsoring an on-going project to assess pollution prevention and control technology available to the plating industry and to make this information available to those who can benefit from it. Completed project activities include extensive surveys of the plating industry and vendors of technologies and an in-depth literature review. The plating industry survey was performed in cooperation with the National Association of Metal Finishers. The contractor that conducted the surveys and prepared the project products was CAI Engineering. The initial products of the project were made available in April 1994. These products include an extensive report that presents the results of the surveys and literature review and an electronic database. The project results are useful for all those associated with pollution prevention and control in the plating industry. The results show which treatment, recovery and bath maintenance technologies have been most successful for different plating processes and the costs for purchasing and operating these technologies. The project results also cover trends in chemical substitution, the identification of compliance-problem pollutants, sludge generation rates, off-site sludge recovery and disposal options, and many other pertinent topics.

  12. Application of GIS Rapid Mapping Technology in Disaster Monitoring

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Tu, J.; Liu, G.; Zhao, Q.

    2018-04-01

    With the rapid development of GIS and RS technology in recent years, GIS software functions have become increasingly mature and capable, and the rapid development of mathematical and statistical tools for spatial modeling and simulation has promoted the widespread application of quantitative methods in the field of geology. Based on a field disaster investigation and the construction of a spatial database, this paper uses remote sensing imagery, DEM data, and GIS technology to obtain the data needed for disaster vulnerability analysis, and applies an information model to produce disaster risk assessment maps. Using ArcGIS software and its spatial data modeling methods, the basic data for the disaster risk mapping process were acquired and processed, and the spatial data simulation tools were used to map the disaster rapidly.

  13. Technology and the Modern Library.

    ERIC Educational Resources Information Center

    Boss, Richard W.

    1984-01-01

    Overview of the impact of information technology on libraries highlights turnkey vendors, bibliographic utilities, commercial suppliers of records, state and regional networks, computer-to-computer linkages, remote database searching, terminals and microcomputers, building local databases, delivery of information, digital telefacsimile,…

  14. Metabolomics: building on a century of biochemistry to guide human health

    PubMed Central

    German, J. Bruce; Hammock, Bruce D.; Watkins, Steven M.

    2006-01-01

    Medical diagnosis and treatment efficacy will improve significantly when a more personalized system for health assessment is implemented. This system will require diagnostics that provide sufficiently detailed information about the metabolic status of individuals such that assay results will be able to guide food, drug and lifestyle choices to maintain or improve distinct aspects of health without compromising others. Achieving this goal will use the new science of metabolomics – comprehensive metabolic profiling of individuals linked to the biological understanding of human integrative metabolism. Candidate technologies to accomplish this goal are largely available, yet they have not been brought into practice for this purpose. Metabolomic technologies must be sufficiently rapid, accurate and affordable to be routinely accessible to both healthy and acutely ill individuals. The use of metabolomic data to predict the health trajectories of individuals will require bioinformatic tools and quantitative reference databases. These databases containing metabolite profiles from the population must be built, stored and indexed according to metabolic and health status. Building and annotating these databases with the knowledge to predict how a specific metabolic pattern from an individual can be adjusted with diet, drugs and lifestyle to improve health represents a logical application of the biochemistry knowledge that the life sciences have produced over the past 100 years. PMID:16680201

  15. [Construction and application of special analysis database of geoherbs based on 3S technology].

    PubMed

    Guo, Lan-ping; Huang, Lu-qi; Lv, Dong-mei; Shao, Ai-juan; Wang, Jian

    2007-09-01

    In this paper, the structures, data sources, and data codes of "the spatial analysis database of geoherbs" based on 3S technology are introduced, and the essential functions of the database, such as data management, remote sensing, spatial interpolation, spatial statistics, spatial analysis, and development, are described. Finally, two examples of database use are given: one is the classification and calculation of the NDVI index from remote sensing images of the geoherbal area of Atractylodes lancea; the other is an adaptation analysis of A. lancea. These indicate that "the spatial analysis database of geoherbs" has a bright prospect in the spatial analysis of geoherbs.
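The NDVI classification mentioned in the first example rests on a standard formula, NDVI = (NIR − Red) / (NIR + Red). The sketch below is not code from the paper; the reflectance values and the 0.3 vegetation threshold are invented for illustration.

```python
# NDVI = (NIR - Red) / (NIR + Red): near +1 for dense vegetation,
# near 0 for bare soil, negative for open water.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red)

# Classify a few made-up (NIR, Red) reflectance pairs with an
# illustrative threshold of 0.3 for vegetated pixels.
pixels = [(0.45, 0.08), (0.20, 0.18), (0.05, 0.10)]
classes = ["vegetation" if ndvi(n, r) > 0.3
           else ("sparse/soil" if ndvi(n, r) >= 0 else "water")
           for n, r in pixels]
print(classes)
```

In practice the same per-pixel computation is applied across whole remote sensing rasters, which is what makes NDVI useful for delineating vegetated geoherbal areas.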

  16. Systematic Review of Health Economic Evaluation Studies Developed in Brazil from 1980 to 2013.

    PubMed

    Decimoni, Tassia Cristina; Leandro, Roseli; Rozman, Luciana Martins; Craig, Dawn; Iglesias, Cynthia P; Novaes, Hillegonda Maria Dutilh; de Soárez, Patrícia Coelho

    2018-01-01

    Brazil has sought to use economic evaluation to support healthcare decision-making processes. While a number of health economic evaluations (HEEs) have been conducted, no study has systematically reviewed the quality of Brazilian HEEs. The objective of this systematic review was to provide an overview of the state of HEE research and to evaluate the number, characteristics, and quality of reporting of published HEE studies conducted in a Brazilian setting. We systematically searched electronic databases (MEDLINE, EMBASE, Latin American and Caribbean Literature on Health Sciences Database, Scientific Electronic Library Online, NHS Economic Evaluation Database, Health Technology Assessment Database, Bireme, and Biblioteca Virtual em Saúde Economia da Saúde); citation indexes (SCOPUS, Web of Science); and Sistema de Informação da Rede Brasileira de Avaliação de Tecnologia em Saúde. Partial and full HEEs published between 1980 and 2013 that referred to a Brazilian setting were considered for inclusion. In total, 535 studies were included in the review, 36.8% of which were considered to be full HEEs. The categories of healthcare technology most frequently assessed were procedures (34.8%) and drugs (28.8%), whose main objective was treatment (72.1%). Forty-four percent of the studies reported their funding source and 36% reported a conflict of interest. Overall, the quality of reporting of the full HEEs was satisfactory, but some items were generally poorly reported and significant improvement is required: (1) methods used to estimate healthcare resource use quantities and unit costs, (2) methods used to estimate utility values, (3) sources of funding, and (4) conflicts of interest. A steady number of HEEs have been published in Brazil since 1980. To improve their contribution to informing national healthcare policy, efforts need to be made to enhance the quality of reporting of HEEs and to promote improvements in the way HEEs are designed, implemented (i.e., using sound methods) and reported.

  17. Generalized Database Management System Support for Numeric Database Environments.

    ERIC Educational Resources Information Center

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  18. [A Terahertz Spectral Database Based on Browser/Server Technique].

    PubMed

    Zhang, Zhuo-yong; Song, Yue

    2015-09-01

    With the solution of key scientific and technical problems and the development of instrumentation, the application of terahertz technology in various fields has attracted more and more attention. Owing to its unique advantages, terahertz technology has shown broad promise for fast, non-damaging detection, as well as in many other fields, and combined with complementary methods it can be used to tackle many difficult practical problems that could not be solved before. Further development of practical terahertz detection methods depends on a good and reliable terahertz spectral database. We recently developed a browser/server (B/S) -based terahertz spectral database, designing its main structure and functions to meet practical requirements. The database now includes more than 240 items, with spectral information collected from three sources: (1) citation from other terahertz spectral databases abroad; (2) published literature; and (3) spectral data measured in our laboratory. The present paper introduces the basic structure and fundamental functions of the database developed in our laboratory. One of its key functions is the calculation of optical parameters: parameters such as the absorption coefficient and refractive index can be calculated from input THz time-domain spectra. The other main functions and search methods of the browser/server-based database are also discussed. The search system provides registered users with convenient functions including user registration, inquiry, display of spectral figures and molecular structures, spectral matching, and on-line searching. Registered users can compare an input THz spectrum with the spectra in the database, and the resulting correlation coefficients allow the search to be performed quickly and conveniently. The database can be accessed at http://www.teralibrary.com. It is currently based on spectral information and will be improved in the future. We hope this terahertz spectral database can provide users with powerful, convenient and highly efficient functions, and promote broader applications of terahertz technology.
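
    The abstract does not specify how the database derives optical parameters from THz time-domain spectra. A common textbook approach works from the complex sample/reference transmission ratio under a thick-slab, no-echo approximation; a hedged sketch of that generic method (all names and the synthetic round-trip example are illustrative, not the database's actual code):

```python
import numpy as np

C = 2.99792458e8  # speed of light in vacuum, m/s

def thz_optical_params(E_sample, E_ref, freq_hz, d):
    """Refractive index n and absorption coefficient alpha (1/m) from
    complex sample/reference THz spectra of a slab of thickness d (m),
    using the thick-slab, no-echo approximation."""
    T = E_sample / E_ref                     # complex transmission
    omega = 2.0 * np.pi * freq_hz
    phase = -np.unwrap(np.angle(T))          # phase delay added by sample
    n = 1.0 + phase * C / (omega * d)
    # amplitude model: |T| = 4n/(n+1)^2 * exp(-alpha*d/2)
    alpha = -(2.0 / d) * np.log(np.abs(T) * (n + 1.0) ** 2 / (4.0 * n))
    return n, alpha

# Synthetic round-trip check: 100-um slab with n = 1.5, alpha = 2000 1/m
freq = np.linspace(0.1e12, 1.0e12, 50)
d, n0, a0 = 1e-4, 1.5, 2000.0
omega = 2 * np.pi * freq
E_ref = np.ones_like(freq, dtype=complex)
E_sam = (4 * n0 / (n0 + 1) ** 2) * np.exp(-a0 * d / 2) \
        * np.exp(-1j * omega * d * (n0 - 1) / C)
n, alpha = thz_optical_params(E_sam, E_ref, freq, d)
```

    The synthetic check recovers the slab parameters used to generate the spectrum, which is the usual sanity test for this kind of extraction routine.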

  19. Experience with Multi-Tier Grid MySQL Database Service Resiliency at BNL

    NASA Astrophysics Data System (ADS)

    Wlodek, Tomasz; Ernst, Michael; Hover, John; Katramatos, Dimitrios; Packard, Jay; Smirnov, Yuri; Yu, Dantong

    2011-12-01

    We describe the use of F5's BIG-IP smart switch technology (3600 Series and Local Traffic Manager v9.0) to provide load balancing and automatic fail-over to multiple Grid services (GUMS, VOMS) and their associated back-end MySQL databases. This resiliency is introduced in front of the external application servers and also for the back-end database systems, which is what makes it "multi-tier". The combination of solutions chosen to ensure high availability of the services, in particular the database replication and fail-over mechanism, is discussed in detail. The paper explains the design and configuration of the overall system, including virtual servers, machine pools, and health monitors (which govern routing), as well as the master-slave database scheme and fail-over policies and procedures. Pre-deployment planning and stress testing are outlined, integration of the systems with our Nagios-based facility monitoring and alerting is described, and the application characteristics of GUMS and VOMS that enable effective clustering are explained. We then summarize our practical experiences and real-world scenarios from operating a major US Grid center, and assess the applicability of our approach to other Grid services in the future.
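
    The health monitors that govern routing can be illustrated, in spirit, by a trivial TCP probe plus a first-healthy routing rule. This is only a hypothetical sketch; the BIG-IP monitors and fail-over policies described in the paper are far richer (application-level MySQL checks, weighted pools, etc.):

```python
import socket

def healthy(host, port, timeout=2.0):
    """Minimal TCP health probe: can a connection be opened at all?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def pick_server(pool):
    """Route to the first healthy pool member (naive fail-over)."""
    for host, port in pool:
        if healthy(host, port):
            return (host, port)
    return None  # entire pool is down
```

    A real load balancer runs such probes continuously and removes failing members from rotation, rather than probing on each request as this sketch does.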

  20. Relational Data Bases--Are You Ready?

    ERIC Educational Resources Information Center

    Marshall, Dorothy M.

    1989-01-01

    Migrating from a traditional to a relational database technology requires more than traditional project management techniques. An overview of what to consider before migrating to relational database technology is presented. Leadership, staffing, vendor support, hardware, software, and application development are discussed. (MLW)

  1. A Database for Decision-Making in Training and Distributed Learning Technology

    DTIC Science & Technology

    1998-04-01

    developer must answer these questions: ♦ Who will develop the courseware? Should we outsource? ♦ What media should we use? How much will it cost? ♦ What...to develop, the database can be useful for answering staffing questions and planning transitions to technology-assisted courses. The database...of distributed learning curricula in comparison to traditional methods. To develop a military-wide distributed learning plan, the existing course

  2. Psychometric Properties of Patient-Facing eHealth Evaluation Measures: Systematic Review and Analysis

    PubMed Central

    Turvey, Carolyn L; Nazi, Kim M; Holman, John E; Hogan, Timothy P; Shimada, Stephanie L; Kennedy, Diana R

    2017-01-01

    Background Significant resources are being invested into eHealth technology to improve health care. Few resources have focused on evaluating the impact of use on patient outcomes. A standardized set of metrics used across health systems and research will enable aggregation of data to inform improved implementation, clinical practice, and ultimately health outcomes associated with use of patient-facing eHealth technologies. Objective The objective of this project was to conduct a systematic review to (1) identify existing instruments for eHealth research and implementation evaluation from the patient’s point of view, (2) characterize measurement components, and (3) assess psychometrics. Methods Concepts from existing models and published studies of technology use and adoption were identified and used to inform a search strategy. Search terms were broadly categorized as platforms (eg, email), measurement (eg, survey), function/information use (eg, self-management), health care occupations (eg, nurse), and eHealth/telemedicine (eg, mHealth). A computerized database search was conducted through June 2014. Included articles (1) described development of an instrument, or (2) used an instrument that could be traced back to its original publication, or (3) modified an instrument; all included articles (4) had full text available in English, and (5) focused on the patient perspective on technology, including patient preferences and satisfaction, engagement with technology, usability, competency and fluency with technology, computer literacy, and trust in and acceptance of technology. The review was limited to instruments that reported at least one psychometric property. Excluded were investigator-developed measures, disease-specific assessments delivered via technology or telephone (eg, a cancer-coping measure delivered via computer survey), and measures focused primarily on clinician use (eg, the electronic health record). Results The search strategy yielded 47,320 articles.
Following elimination of duplicates and non-English language publications (n=14,550) and books (n=27), another 31,647 articles were excluded through review of titles. Following a review of the abstracts of the remaining 1096 articles, 68 were retained for full-text review. Of these, 16 described an instrument and six used an instrument; one instrument was drawn from the GEM database, resulting in 23 articles for inclusion. None included a complete psychometric evaluation. The most frequently assessed property was internal consistency (21/23, 91%). Testing for aspects of validity ranged from 48% (11/23) to 78% (18/23). Approximately half (13/23, 57%) reported how to score the instrument. Only six (26%) assessed the readability of the instrument for end users, although all the measures rely on self-report. Conclusions Although most measures identified in this review were published after the year 2000, rapidly changing technology makes instrument development challenging. Platform-agnostic measures need to be developed that focus on concepts important for use of any type of eHealth innovation. At present, there are important gaps in the availability of psychometrically sound measures to evaluate eHealth technologies. PMID:29021128
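
    Internal consistency, the property most often assessed in the reviewed instruments, is typically reported as Cronbach's alpha, α = k/(k−1) · (1 − Σσᵢ²/σ²_total). A minimal sketch of that standard formula (the function name is illustrative; this is not code from the review):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Perfectly parallel items yield alpha = 1
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))
```

    Values above roughly 0.7 are conventionally read as acceptable internal consistency, though the appropriate threshold depends on the instrument's purpose.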

  3. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  4. Fecal Microbiota Therapy for Clostridium difficile Infection: A Health Technology Assessment.

    PubMed

    2016-01-01

    Fecal microbiota therapy is increasingly being used to treat patients with Clostridium difficile infection. This health technology assessment primarily evaluated the effectiveness and cost-effectiveness of fecal microbiota therapy compared with the usual treatment (antibiotic therapy). We performed a literature search using Ovid MEDLINE, Embase, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, CRD Health Technology Assessment Database, Cochrane Central Register of Controlled Trials, and NHS Economic Evaluation Database. For the economic review, we applied economic filters to these search results. We also searched the websites of agencies for other health technology assessments. We conducted a meta-analysis to analyze effectiveness. The quality of the body of evidence for each outcome was examined according to the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) Working Group criteria. Using a step-wise, structural methodology, we determined the overall quality to be high, moderate, low, or very low. We used a survey to examine physicians' perception of patients' lived experience, and a modified grounded theory method to analyze information from the survey. For the review of clinical effectiveness, 16 of 1,173 citations met the inclusion criteria. A meta-analysis of two randomized controlled trials found that fecal microbiota therapy significantly improved diarrhea associated with recurrent C. difficile infection versus treatment with vancomycin (relative risk 3.24, 95% confidence interval [CI] 1.85-5.68) (GRADE: moderate). While fecal microbiota therapy is not associated with a significant decrease in mortality compared with antibiotic therapy (relative risk 0.69, 95% CI 0.14-3.39) (GRADE: low), it is associated with a significant increase in adverse events (e.g., short-term diarrhea, relative risk 30.76, 95% CI 4.46-212.44; abdominal cramping, relative risk 14.81, 95% CI 2.07-105.97) (GRADE: low). 
For the value-for-money component, two of 151 economic evaluations met the inclusion criteria. One reported that fecal microbiota therapy was dominant (more effective and less expensive) compared with vancomycin; the other reported an incremental cost-effectiveness ratio of $17,016 USD per quality-adjusted life-year for fecal microbiota therapy compared with vancomycin. This ratio for the second study indicated that there would be additional cost associated with each recurrent C. difficile infection resolved. In Ontario, if fecal microbiota therapy were adopted to treat recurrent C. difficile infection, considering it from the perspective of the Ministry of Health and Long-Term Care as the payer, an estimated $1.5 million would be saved after the first year of adoption and $2.9 million after 3 years. The contradiction between the second economic evaluation and the savings we estimated may be a result of the lower cost of fecal microbiota therapy and hospitalization in Ontario compared with the cost of therapy used in the US model. Physicians reported that C. difficile infection significantly reduced patients' quality of life. Physicians saw fecal microbiota therapy as improving patients' quality of life because patients could resume daily activities. Physicians reported that their patients were happy with the procedures required to receive fecal microbiota therapy. In patients with recurrent C. difficile infection, fecal microbiota therapy improves outcomes that are important to patients and provides good value for money.
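
    The pooled relative risks above come from a meta-analysis of study-level results. A standard generic approach is fixed-effect inverse-variance pooling on the log-RR scale, reconstructing each study's standard error from its reported 95% CI; the report does not state its exact pooling model, so this sketch is an assumption, with invented inputs:

```python
import math

Z = 1.959963984540054  # 97.5th percentile of the standard normal

def pool_relative_risks(studies):
    """Fixed-effect inverse-variance pooling of (RR, ci_low, ci_high)
    tuples; returns the pooled (RR, ci_low, ci_high)."""
    w_sum = wlog_sum = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * Z)  # SE of log RR
        w = 1.0 / se ** 2
        w_sum += w
        wlog_sum += w * math.log(rr)
    log_rr = wlog_sum / w_sum
    se = math.sqrt(1.0 / w_sum)
    return (math.exp(log_rr), math.exp(log_rr - Z * se),
            math.exp(log_rr + Z * se))

# Hypothetical inputs: two studies with identical results pool to the
# same RR but a tighter confidence interval
print(pool_relative_risks([(2.0, 1.0, 4.0), (2.0, 1.0, 4.0)]))
```

    A random-effects model would add a between-study variance term to each weight; with only two trials, as here, that estimate is very imprecise.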

  5. Fecal Microbiota Therapy for Clostridium difficile Infection: A Health Technology Assessment

    PubMed Central

    2016-01-01

    Background Fecal microbiota therapy is increasingly being used to treat patients with Clostridium difficile infection. This health technology assessment primarily evaluated the effectiveness and cost-effectiveness of fecal microbiota therapy compared with the usual treatment (antibiotic therapy). Methods We performed a literature search using Ovid MEDLINE, Embase, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, CRD Health Technology Assessment Database, Cochrane Central Register of Controlled Trials, and NHS Economic Evaluation Database. For the economic review, we applied economic filters to these search results. We also searched the websites of agencies for other health technology assessments. We conducted a meta-analysis to analyze effectiveness. The quality of the body of evidence for each outcome was examined according to the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) Working Group criteria. Using a step-wise, structural methodology, we determined the overall quality to be high, moderate, low, or very low. We used a survey to examine physicians’ perception of patients’ lived experience, and a modified grounded theory method to analyze information from the survey. Results For the review of clinical effectiveness, 16 of 1,173 citations met the inclusion criteria. A meta-analysis of two randomized controlled trials found that fecal microbiota therapy significantly improved diarrhea associated with recurrent C. difficile infection versus treatment with vancomycin (relative risk 3.24, 95% confidence interval [CI] 1.85–5.68) (GRADE: moderate). 
While fecal microbiota therapy is not associated with a significant decrease in mortality compared with antibiotic therapy (relative risk 0.69, 95% CI 0.14–3.39) (GRADE: low), it is associated with a significant increase in adverse events (e.g., short-term diarrhea, relative risk 30.76, 95% CI 4.46–212.44; abdominal cramping, relative risk 14.81, 95% CI 2.07–105.97) (GRADE: low). For the value-for-money component, two of 151 economic evaluations met the inclusion criteria. One reported that fecal microbiota therapy was dominant (more effective and less expensive) compared with vancomycin; the other reported an incremental cost-effectiveness ratio of $17,016 USD per quality-adjusted life-year for fecal microbiota therapy compared with vancomycin. This ratio for the second study indicated that there would be additional cost associated with each recurrent C. difficile infection resolved. In Ontario, if fecal microbiota therapy were adopted to treat recurrent C. difficile infection, considering it from the perspective of the Ministry of Health and Long-Term Care as the payer, an estimated $1.5 million would be saved after the first year of adoption and $2.9 million after 3 years. The contradiction between the second economic evaluation and the savings we estimated may be a result of the lower cost of fecal microbiota therapy and hospitalization in Ontario compared with the cost of therapy used in the US model. Physicians reported that C. difficile infection significantly reduced patients’ quality of life. Physicians saw fecal microbiota therapy as improving patients’ quality of life because patients could resume daily activities. Physicians reported that their patients were happy with the procedures required to receive fecal microbiota therapy. Conclusions In patients with recurrent C. difficile infection, fecal microbiota therapy improves outcomes that are important to patients and provides good value for money. PMID:27516814

  6. WEB-BASED DATABASE ON RENEWAL TECHNOLOGIES ...

    EPA Pesticide Factsheets

    As U.S. utilities continue to shore up their aging infrastructure, renewal needs now represent over 43% of annual expenditures compared to new construction for drinking water distribution and wastewater collection systems (Underground Construction [UC], 2016). An increased understanding of renewal options will ultimately assist drinking water utilities in reducing water loss and help wastewater utilities to address infiltration and inflow issues in a cost-effective manner. It will also help to extend the service lives of both drinking water and wastewater mains. This research effort involved collecting case studies on the use of various trenchless pipeline renewal methods and providing the information in an online searchable database. The overall objective was to further support technology transfer and information sharing regarding emerging and innovative renewal technologies for water and wastewater mains. The result of this research is a Web-based, searchable database that utility personnel can use to obtain technology performance and cost data, as well as case study references. The renewal case studies include: technologies used; the conditions under which the technology was implemented; costs; lessons learned; and utility contact information. The online database also features a data mining tool for automated review of the technologies selected and cost data. Based on a review of the case study results and industry data, several findings are presented on tren

  7. An Entropy Approach to Disclosure Risk Assessment: Lessons from Real Applications and Simulated Domains

    PubMed Central

    Airoldi, Edoardo M.; Bai, Xue; Malin, Bradley A.

    2011-01-01

    We live in an increasingly mobile world, which leads to the duplication of information across domains. Though organizations attempt to obscure the identities of their constituents when sharing information for worthwhile purposes, such as basic research, the uncoordinated nature of such environments can lead to privacy vulnerabilities. For instance, disparate healthcare providers can collect information on the same patient. Federal policy requires that such providers share “de-identified” sensitive data, such as biomedical (e.g., clinical and genomic) records. At the same time, such providers can share identified information, devoid of sensitive biomedical data, for administrative functions. On a provider-by-provider basis, the biomedical and identified records appear unrelated; however, links can be established when multiple providers’ databases are studied jointly. This problem, known as trail disclosure, is a generalized phenomenon and occurs because an individual’s location access pattern can be matched across the shared databases. Due to technical and legal constraints, it is often difficult to coordinate between providers, so it is critical to assess the disclosure risk in distributed environments in order to develop techniques to mitigate such risks. Research on privacy protection has so far focused on developing technologies to suppress or encrypt identifiers associated with sensitive information. There is a growing body of work on the formal assessment of the disclosure risk of database entries in publicly shared databases, but less attention has been paid to the distributed setting. In this research, we review the trail disclosure problem in several domains with known vulnerabilities and show that disclosure risk is influenced by the distribution of how people visit service providers. Based on empirical evidence, we propose an entropy metric for assessing such risk in shared databases prior to their release. This metric assesses risk by leveraging the statistical characteristics of a visit distribution, as opposed to person-level data. It is computationally efficient and superior to existing risk assessment methods, which rely on ad hoc assessments that are often computationally expensive and unreliable. We evaluate our approach on a range of location access patterns in simulated environments. Our results demonstrate that the approach is effective at estimating trail disclosure risks and that the amount of self-information contained in a distributed system is one of the main driving factors. PMID:21647242
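
    The abstract does not reproduce the proposed entropy metric itself, but its core ingredient is the Shannon entropy of a visit distribution. A generic sketch (the function name and the choice of bits as the unit are assumptions of this illustration, not the paper's formulation):

```python
import math
from collections import Counter

def visit_entropy(visits):
    """Shannon entropy (bits) of a sequence of provider visits.
    0 bits means all visits go to one provider; log2(k) bits means
    visits are spread uniformly over k providers."""
    counts = Counter(visits)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

# Uniform visits across 4 providers vs. a fully concentrated pattern
print(visit_entropy(["A", "B", "C", "D"]), visit_entropy(["A", "A", "A"]))
```

    Computed over the distribution rather than over person-level records, such a statistic can be evaluated before release, which matches the paper's motivation for a computationally efficient pre-release risk estimate.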

  8. Nuclear science abstracts (NSA) database 1948--1974 (on the Internet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Nuclear Science Abstracts (NSA) is a comprehensive abstract and index collection of the international nuclear science and technology literature for the period 1948 through 1976. Included are scientific and technical reports of the US Atomic Energy Commission, US Energy Research and Development Administration and its contractors, other agencies, universities, and industrial and research organizations. Coverage of the literature since 1976 is provided by the Energy Science and Technology Database. Approximately 25% of the records in the file contain abstracts; these are from the following volumes of the print Nuclear Science Abstracts: Volumes 12--18, Volume 29, and Volume 33. The database contains over 900,000 bibliographic records. All aspects of nuclear science and technology are covered, including: Biomedical Sciences; Metals, Ceramics, and Other Materials; Chemistry; Nuclear Materials and Waste Management; Environmental and Earth Sciences; Particle Accelerators; Engineering; Physics; Fusion Energy; Radiation Effects; Instrumentation; Reactor Technology; Isotope and Radiation Source Technology. The database includes all records contained in Volume 1 (1948) through Volume 33 (1976) of the printed version of Nuclear Science Abstracts (NSA). This worldwide coverage includes books, conference proceedings, papers, patents, dissertations, engineering drawings, and journal literature. This database is now available for searching through the GOV. Research Center (GRC) service. GRC is a single online web-based search service to well known Government databases. Featuring powerful search and retrieval software, GRC is an important research tool. The GRC web site is at http://grc.ntis.gov.

  9. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Senoo, Tetsuo

    As computer technology, communication technology and related fields have progressed, many corporations have come to place the construction and use of their own databases at the center of their information activities, aiming to develop those activities in new directions. This paper considers how information management in a corporation is affected by changing management and technology environments, and clarifies and generalizes what in-house databases should be constructed and utilized, from the viewpoints of the requirements to be met, the types and forms of information to be handled, indexing, type and frequency of use, evaluation methods, and so on. The author outlines an information system of Matsushita called MATIS (Matsushita Technical Information System) as a concrete example, and describes the present status of, and some points to keep in mind when, constructing and utilizing its REP, BOOK and SYMP databases.

  10. 77 FR 71089 - Pilot Loading of Aeronautical Database Updates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-29

    ... the use of newer systems and data-transfer mechanisms such as those employing wireless technology. In... which enables wireless updating of systems and databases. The current regulation does not accommodate... maintenance); Recordkeeping requirements; Training for pilots; Technological advancements in data-transfer...

  11. Technology use for health education to caregivers: an integrative review of nursing literature.

    PubMed

    Nogueira, Paula Cristina; de Carvalho Nagliate, Patrícia; de Godoy, Simone; Rangel, Elaine Maria Leite; Trevizan, Maria Auxiliadora; Mendes, Isabel Amélia Costa

    2013-08-01

    Providing caregivers with health education through educational technologies enhances safe care and stimulates the decision process and communication among professionals, caregivers and patients. This article is an integrative review to identify which educational technologies have been used for health education directed at caregivers. The databases Web of Science, Bireme and Scopus were consulted. The inclusion criteria were as follows: full papers, published between 2001 and 2011, in English, Portuguese or Spanish. The descriptors used were: educational technology, health education and caregivers. Thirty-four papers were found, 27 of which were excluded because they did not comply with the inclusion criteria, resulting in a final sample of 7 papers. The results evidenced the use of light and hard technologies in health education for caregivers, aimed at the therapeutic discussion of care as well as telehealth service delivery. Research is needed that uses and assesses hard educational technologies in health education for caregivers. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Health and Wellness Technology Use by Historically Underserved Health Consumers: Systematic Review

    PubMed Central

    Perchonok, Jennifer

    2012-01-01

    Background The implementation of health technology is a national priority in the United States and widely discussed in the literature. However, literature about the use of this technology by historically underserved populations is limited. Information on culturally informed health and wellness technology and the use of these technologies to reduce health disparities facing historically underserved populations in the United States is sparse in the literature. Objective To examine ways in which technology is being used by historically underserved populations to decrease health disparities through facilitating or improving health care access and health and wellness outcomes. Methods We conducted a systematic review in four library databases (PubMed, PsycINFO, Web of Science, and Engineering Village) to investigate the use of technology by historically underserved populations. Search strings consisted of three topics (eg, technology, historically underserved populations, and health). Results A total of 424 search phrases applied in the four databases returned 16,108 papers. After review, 125 papers met the selection criteria. Within the selected papers, 30 types of technology, 19 historically underserved groups, and 23 health issues were discussed. Further, almost half of the papers (62 papers) examined the use of technology to create effective and culturally informed interventions or educational tools. Finally, 12 evaluation techniques were used to assess the technology. Conclusions While the reviewed studies show how technology can be used to positively affect the health of historically underserved populations, the technology must be tailored toward the intended population, as personally relevant and contextually situated health technology is more likely than broader technology to create behavior changes. Social media, cell phones, and videotapes are types of technology that should be used more often in the future. 
Further, culturally informed health information technology should be used more for chronic diseases and disease management, as it is an innovative way to provide holistic care and reminders to otherwise underserved populations. Additionally, design processes should be stated regularly so that best practices can be created. Finally, the evaluation process should be standardized to create a benchmark for culturally informed health information technology. PMID:22652979

  13. Objective quality assessment for multiexposure multifocus image fusion.

    PubMed

    Hassen, Rania; Wang, Zhou; Salama, Magdy M A

    2015-09-01

    There has been a growing interest in image fusion technologies, but how to objectively evaluate the quality of fused images has not been fully understood. Here, we propose a method for objective quality assessment of multiexposure multifocus image fusion based on the evaluation of three key factors of fused image quality: 1) contrast preservation; 2) sharpness; and 3) structure preservation. Subjective experiments are conducted to create an image fusion database, based on which, performance evaluation shows that the proposed fusion quality index correlates well with subjective scores, and gives a significant improvement over the existing fusion quality measures.

  14. JICST Factual Database: JICST DNA Database

    NASA Astrophysics Data System (ADS)

    Shirokizawa, Yoshiko; Abe, Atsushi

    Japan Information Center of Science and Technology (JICST) has started the on-line service of DNA database in October 1988. This database is composed of EMBL Nucleotide Sequence Library and Genetic Sequence Data Bank. The authors outline the database system, data items and search commands. Examples of retrieval session are presented.

  15. Literature searching for clinical and cost-effectiveness studies used in health technology assessment reports carried out for the National Institute for Clinical Excellence appraisal system.

    PubMed

    Royle, P; Waugh, N

    2003-01-01

To contribute to making searching for Technology Assessment Reports (TARs) more cost-effective by suggesting an optimum literature retrieval strategy. A sample of 20 recent TARs. All sources used to search for clinical and cost-effectiveness studies were recorded. In addition, all studies that were included in the clinical and cost-effectiveness sections of the TARs were identified, and their characteristics recorded, including author, journal, year, study design, study size and quality score. Each was also classified by publication type, and then checked to see whether it was indexed in the following databases: MEDLINE, EMBASE, and then either the Cochrane Controlled Trials Register (CCTR) for clinical effectiveness studies or the NHS Economic Evaluation Database (NHS EED) for the cost-effectiveness studies. Any study not found in at least one of these databases was checked to see whether it was indexed in the Science Citation Index (SCI) and BIOSIS, and the American Society of Clinical Oncology (ASCO) Online if a cancer review. Any studies still not found were checked to see whether they were in a number of additional databases. The median number of sources searched per TAR was 20, with a range from 13 to 33 sources. Six sources (CCTR, DARE, EMBASE, MEDLINE, NHS EED and sponsor/industry submissions to the National Institute for Clinical Excellence) were used in all reviews. After searching the MEDLINE, EMBASE and NHS EED databases, 87.3% of the clinical effectiveness studies and 94.8% of the cost-effectiveness studies were found; these figures rose to 98.2% and 97.9%, respectively, when SCI, BIOSIS and ASCO Online (for clinical effectiveness) and SCI and ASCO Online (for cost-effectiveness) were added. The median number of sources searched for the 14 TARs that included an economic model was 9.0 per TAR. 
A sensitive search filter for identifying non-randomised controlled trial (non-RCT) studies, constructed for MEDLINE and using the search terms from the bibliographic records in the included studies, retrieved only 85% of the known sample. Therefore, it is recommended that when searching for non-RCT studies a search is done for the intervention alone, and records are then scanned manually for those that look relevant. Searching additional databases beyond the Cochrane Library (which includes CCTR, NHS EED and the HTA database), MEDLINE, EMBASE and SCI, plus BIOSIS limited to meeting abstracts only, was seldom found to be effective in retrieving additional studies for inclusion in the clinical and cost-effectiveness sections of TARs (apart from reviews of cancer therapies, where a search of the ASCO database is recommended). A more selective approach to database searching would suffice in most cases and would save resources, thereby making the TAR process more efficient. However, searching non-database sources (including submissions from manufacturers, recent meeting abstracts, contact with experts and checking reference lists) does appear to be a productive way of identifying further studies.

  16. Digital health technology for use in patients with serious mental illness: a systematic review of the literature.

    PubMed

    Batra, Sonal; Baker, Ross A; Wang, Tao; Forma, Felicia; DiBiasi, Faith; Peters-Strickland, Timothy

    2017-01-01

    As the capabilities and reach of technology have expanded, there is an accompanying proliferation of digital technologies developed for use in the care of patients with mental illness. The objective of this review was to systematically search published literature to identify currently available health technologies and their intended uses for patients with serious mental illness. The Medline, Embase, and BIOSIS Previews electronic databases were searched to identify peer-reviewed English language articles that reported the use of digital, mobile, and other advanced technology in patients with schizophrenia/schizoaffective disorder, bipolar disorder, and major depressive disorder. Eligible studies were systematically reviewed based on Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Eighteen studies that met the inclusion criteria were identified. Digital health technologies (DHTs) assessed in the selected studies included mobile applications (apps), digital medicine, digital personal health records, and an electronic pill container. Smartphone apps accounted for the largest share of DHTs. The intended uses of DHTs could be broadly classified as monitoring to gain a better understanding of illness, clinical assessment, and intervention. Overall, studies indicated high usability/feasibility and efficacy/effectiveness, with several reporting validity against established clinical scales. Users were generally engaged with the DHT, and mobile assessments were deemed helpful in monitoring disease symptoms. Rapidly proliferating digital technologies seem to be feasible for short-term use in patients with serious mental illness; nevertheless, long-term effectiveness data from naturalistic studies will help demonstrate their usefulness and facilitate their adoption and integration into the mental health-care system.

  17. Digital health technology for use in patients with serious mental illness: a systematic review of the literature

    PubMed Central

    Batra, Sonal; Baker, Ross A; Wang, Tao; Forma, Felicia; DiBiasi, Faith; Peters-Strickland, Timothy

    2017-01-01

    Background As the capabilities and reach of technology have expanded, there is an accompanying proliferation of digital technologies developed for use in the care of patients with mental illness. The objective of this review was to systematically search published literature to identify currently available health technologies and their intended uses for patients with serious mental illness. Materials and methods The Medline, Embase, and BIOSIS Previews electronic databases were searched to identify peer-reviewed English language articles that reported the use of digital, mobile, and other advanced technology in patients with schizophrenia/schizoaffective disorder, bipolar disorder, and major depressive disorder. Eligible studies were systematically reviewed based on Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Results Eighteen studies that met the inclusion criteria were identified. Digital health technologies (DHTs) assessed in the selected studies included mobile applications (apps), digital medicine, digital personal health records, and an electronic pill container. Smartphone apps accounted for the largest share of DHTs. The intended uses of DHTs could be broadly classified as monitoring to gain a better understanding of illness, clinical assessment, and intervention. Overall, studies indicated high usability/feasibility and efficacy/effectiveness, with several reporting validity against established clinical scales. Users were generally engaged with the DHT, and mobile assessments were deemed helpful in monitoring disease symptoms. Conclusion Rapidly proliferating digital technologies seem to be feasible for short-term use in patients with serious mental illness; nevertheless, long-term effectiveness data from naturalistic studies will help demonstrate their usefulness and facilitate their adoption and integration into the mental health-care system. PMID:29042823

  18. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  19. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  20. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  1. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...

  2. Human spaceflight technology needs: a foundation for JSC's technology strategy

    NASA Astrophysics Data System (ADS)

    Stecklein, J. M.

Human space exploration has always been heavily influenced by goals to achieve a specific mission on a specific schedule. This approach drove rapid technology development, the rapidity of which added risks and became a major driver for costs and cost uncertainty. The National Aeronautics and Space Administration (NASA) is now approaching the extension of human presence throughout the solar system by balancing a proactive yet less schedule-driven development of technology with opportunistic scheduling of missions as the needed technologies are realized. This approach should provide cost effective, low risk technology development that will enable efficient and effective manned spaceflight missions. As a first step, the NASA Human Spaceflight Architecture Team (HAT) has identified a suite of critical technologies needed to support future manned missions across a range of destinations, including in cis-lunar space, near earth asteroid visits, lunar exploration, Mars moons, and Mars exploration. The challenge now is to develop a strategy and plan for technology development that efficiently enables these missions over a reasonable time period, without increasing technology development costs unnecessarily due to schedule pressure, and subsequently mitigating development and mission risks. NASA's Johnson Space Center (JSC), as the nation's primary center for human exploration, is addressing this challenge through an innovative approach in allocating Internal Research and Development funding to projects. The HAT Technology Needs (TechNeeds) Database has been developed to correlate across critical technologies and the NASA Office of Chief Technologist Technology Area Breakdown Structure (TABS). The TechNeeds Database illuminates that many critical technologies may support a single technical capability gap, that many HAT technology needs may map to a single TABS technology discipline, and that a single HAT technology need may map to multiple TABS technology disciplines. 
The TechNeeds Database greatly clarifies understanding of the complex relationships of critical technologies to mission and architecture element needs. Extensions to the core TechNeeds Database allow JSC to factor in and appropriately weight JSC core technology competencies, and considerations of commercialization potential and partnership potential. The inherent coupling among these, along with an appropriate importance weighting, has provided an initial prioritization for allocation of technology development research funding at JSC. The HAT Technology Needs Database, with a core of built-in reports, clarifies and communicates complex technology needs for cost effective human space exploration so that an organization seeking to assure that research prioritization supports human spaceflight of the future can be successful.
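The abstract emphasizes that one HAT technology need may map to several TABS disciplines and vice versa. A minimal sketch of such a many-to-many mapping as a relational link table (the table, column, and entry names below are invented for illustration and are not the actual TechNeeds schema):

```python
import sqlite3

# Illustrative schema: a link table lets one technology need map to
# multiple TABS disciplines, and one discipline serve multiple needs.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE tech_need (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE tabs_discipline (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE need_discipline (
    need_id INTEGER REFERENCES tech_need(id),
    tabs_id INTEGER REFERENCES tabs_discipline(id),
    PRIMARY KEY (need_id, tabs_id)
);
""")
con.execute("INSERT INTO tech_need VALUES (1, 'Closed-loop life support')")
con.executemany("INSERT INTO tabs_discipline VALUES (?, ?)",
                [(1, 'Human health and life support'),
                 (2, 'Exploration destination systems')])
# One need linked to two disciplines:
con.executemany("INSERT INTO need_discipline VALUES (?, ?)", [(1, 1), (1, 2)])

rows = con.execute("""
    SELECT d.name FROM tabs_discipline d
    JOIN need_discipline nd ON nd.tabs_id = d.id
    WHERE nd.need_id = 1 ORDER BY d.id
""").fetchall()
```

Reports like those mentioned in the abstract (needs per discipline, disciplines per need) are then simple joins and aggregations over the link table.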

  3. Human Spaceflight Technology Needs - A Foundation for JSC's Technology Strategy

    NASA Technical Reports Server (NTRS)

    Stecklein, Jonette M.

    2013-01-01

Human space exploration has always been heavily influenced by goals to achieve a specific mission on a specific schedule. This approach drove rapid technology development, the rapidity of which adds risks and is a major driver for costs and cost uncertainty. The National Aeronautics and Space Administration (NASA) is now approaching the extension of human presence throughout the solar system by balancing a proactive yet less schedule-driven development of technology with opportunistic scheduling of missions as the needed technologies are realized. This approach should provide cost effective, low risk technology development that will enable efficient and effective manned spaceflight missions. As a first step, the NASA Human Spaceflight Architecture Team (HAT) has identified a suite of critical technologies needed to support future manned missions across a range of destinations, including in cis-lunar space, near earth asteroid visits, lunar exploration, Mars moons, and Mars exploration. The challenge now is to develop a strategy and plan for technology development that efficiently enables these missions over a reasonable time period, without increasing technology development costs unnecessarily due to schedule pressure, and subsequently mitigating development and mission risks. NASA's Johnson Space Center (JSC), as the nation's primary center for human exploration, is addressing this challenge through an innovative approach in allocating Internal Research and Development funding to projects. The HAT Technology Needs (TechNeeds) Database has been developed to correlate across critical technologies and the NASA Office of Chief Technologist Technology Area Breakdown Structure (TABS). 
The TechNeeds Database illuminates that many critical technologies may support a single technical capability gap, that many HAT technology needs may map to a single TABS technology discipline, and that a single HAT technology need may map to multiple TABS technology disciplines. The TechNeeds Database greatly clarifies understanding of the complex relationships of critical technologies to mission and architecture element needs. Extensions to the core TechNeeds Database allow JSC to factor in and appropriately weight JSC Center Core Technology Competencies, and considerations of Commercialization Potential and Partnership Potential. The inherent coupling among these, along with an appropriate importance weighting, has provided an initial prioritization for allocation of technology development research funding for JSC. The HAT Technology Needs Database, with a core of built-in reports, clarifies and communicates complex technology needs for cost effective human space exploration such that an organization seeking to assure that research prioritization supports human spaceflight of the future can be successful.

  4. NOVAC - Network for Observation of Volcanic and Atmospheric Change: Data archiving and management

    NASA Astrophysics Data System (ADS)

    Lehmann, T.; Kern, C.; Vogel, L.; Platt, U.; Johansson, M.; Galle, B.

    2009-12-01

    The potential for volcanic risk assessment using real-time gas emissions data and the recognized power of sharing data from multiple eruptive centers were the motivation for a European Union FP6 Research Program project entitled NOVAC: Network for Observation of Volcanic and Atmospheric Change. Starting in 2005, a worldwide network of permanent scanning Differential Optical Absorption Spectroscopy (DOAS) instruments was installed at 26 volcanoes around the world. These ground-based remote sensing instruments record the characteristic absorption of volcanic gas emissions (e.g. SO2, BrO) in the ultra-violet wavelength region. A real-time DOAS retrieval was implemented to evaluate the measured spectra, thus providing the respective observatories with gas emission data which can be used for volcanic risk assessment and hazard prediction. Observatory personnel at each partner institution were trained on technical and scientific aspects of the DOAS technique, and a central database was created to allow the exchange of data and ideas between all partners. A bilateral benefit for volcano observatories as well as scientific institutions (e.g. universities and research centers) resulted. Volcano observatories were provided with leading edge technology for measuring volcanic SO2 emission fluxes, and now use this technology for monitoring and risk assessment, while the involved universities and research centers are working on global studies and characterizing the atmospheric impact of the observed gas emissions. The NOVAC database takes into account that project members use the database in a variety of different ways. Therefore, the data is structured in layers, the top of which contains basic information about each instrument. The second layer contains evaluated emission data such as SO2 column densities, SO2 emission fluxes, and BrO/SO2 ratios. The lowest layer contains all spectra measured by the individual instruments. 
Online since the middle of 2006, the NOVAC database currently contains 26 volcanoes, 56 instruments and more than 50 million spectra, and is scalable to 200 or more volcanoes, as the NOVAC project is open to outside participation. The data is archived in a MySQL database system; storing and querying are done with PHP functions. The web interface is dynamically created based on the existing dataset and offers approximately 150 different search, display, and sorting options. Each user has a separate account and can save his personal search configuration from session to session. Search results are displayed in table form and can also be downloaded. Both evaluated data files and measured spectra can be downloaded as single files or in packages. The spectra can be plotted directly from the database, as can several measurement values and evaluated parameters over selectable timescales. Because of the large extent of the dataset, major emphasis was placed on performance optimization.
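The three-layer organization described above (instrument metadata, evaluated emission data, raw spectra) can be sketched as a relational schema. This is an illustrative sketch only; all table, column, and sample values below are assumptions, not the actual NOVAC MySQL schema:

```python
import sqlite3

# Layer 1: instrument metadata; layer 2: evaluated emission data
# (SO2 column densities, fluxes, BrO/SO2 ratios); layer 3: raw spectra.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE instrument (
    id INTEGER PRIMARY KEY, volcano TEXT, serial TEXT);
CREATE TABLE emission (
    instrument_id INTEGER REFERENCES instrument(id),
    measured_at TEXT, so2_flux_kg_s REAL, bro_so2_ratio REAL);
CREATE TABLE spectrum (
    instrument_id INTEGER REFERENCES instrument(id),
    measured_at TEXT, data BLOB);
""")
# Hypothetical sample rows (serial number and values are invented):
con.execute("INSERT INTO instrument VALUES (1, 'Masaya', 'I-001')")
con.execute(
    "INSERT INTO emission VALUES (1, '2009-07-01T12:00Z', 12.5, 1.2e-4)")

# A typical layered query: evaluated flux data for one volcano, without
# touching the bulky spectrum layer.
flux, = con.execute(
    "SELECT so2_flux_kg_s FROM emission e "
    "JOIN instrument i ON i.id = e.instrument_id "
    "WHERE i.volcano = 'Masaya'").fetchone()
```

Separating the layers this way lets routine monitoring queries stay fast even though the spectrum layer holds tens of millions of rows.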

  5. Advanced Technology Composite Fuselage-Structural Performance

    NASA Technical Reports Server (NTRS)

    Walker, T. H.; Minguet, P. J.; Flynn, B. W.; Carbery, D. J.; Swanson, G. D.; Ilcewicz, L. B.

    1997-01-01

    Boeing is studying the technologies associated with the application of composite materials to commercial transport fuselage structure under the NASA-sponsored contracts for Advanced Technology Composite Aircraft Structures (ATCAS) and Materials Development Omnibus Contract (MDOC). This report addresses the program activities related to structural performance of the selected concepts, including both the design development and subsequent detailed evaluation. Design criteria were developed to ensure compliance with regulatory requirements and typical company objectives. Accurate analysis methods were selected and/or developed where practical, and conservative approaches were used where significant approximations were necessary. Design sizing activities supported subsequent development by providing representative design configurations for structural evaluation and by identifying the critical performance issues. Significant program efforts were directed towards assessing structural performance predictive capability. The structural database collected to perform this assessment was intimately linked to the manufacturing scale-up activities to ensure inclusion of manufacturing-induced performance traits. Mechanical tests were conducted to support the development and critical evaluation of analysis methods addressing internal loads, stability, ultimate strength, attachment and splice strength, and damage tolerance. Unresolved aspects of these performance issues were identified as part of the assessments, providing direction for future development.

  6. Databases in the Central Government : State-of-the-art and the Future

    NASA Astrophysics Data System (ADS)

    Ohashi, Tomohiro

Management and Coordination Agency, Prime Minister’s Office, conducted a questionnaire survey of all Japanese Ministries and Agencies in November 1985 on the present status of databases produced, or planned to be produced, by the central government. According to the results, 132 databases had been produced by 19 Ministries and Agencies. Many of these databases were held by the Defence Agency, the Ministry of Construction, the Ministry of Agriculture, Forestry & Fisheries, and the Ministry of International Trade & Industry, chiefly in the fields of architecture & civil engineering, science & technology, R & D, agriculture, forestry and fishery. However, only 39 percent of the produced databases were available to other Ministries and Agencies, while 60 percent were unavailable to them, being in-house databases and so forth. The outline of the survey results is reported, and the databases produced by the central government are introduced under the items of (1) databases commonly used by all Ministries and Agencies, (2) integrated databases, (3) statistical databases and (4) bibliographic databases. Future problems are also described from the viewpoints of technology development and the mutual use of databases.

  7. Development blocks in innovation networks: The Swedish manufacturing industry, 1970-2007.

    PubMed

    Taalbi, Josef

    2017-01-01

    The notion of development blocks (Dahmén, 1950, 1991) suggests the co-evolution of technologies and industries through complementarities and the overcoming of imbalances. This study proposes and applies a methodology to analyse development blocks empirically. To assess the extent and character of innovational interdependencies between industries the study combines analysis of innovation biographies and statistical network analysis. This is made possible by using data from a newly constructed innovation output database for Sweden. The study finds ten communities of closely related industries in which innovation activity has been prompted by the emergence of technological imbalances or by the exploitation of new technological opportunities. The communities found in the Swedish network of innovation are shown to be stable over time and often characterized by strong user-supplier interdependencies. These findings serve to stress how historical imbalances and opportunities are key to understanding the dynamics of the long-run development of industries and new technologies.

  8. HARMONIZING HEALTH TECHNOLOGY ASSESSMENT PRACTICES IN UNIVERSITY HOSPITALS: TO WHAT EXTENT IS THE MINI-HTA MODEL SUITABLE IN THE FRENCH CONTEXT?

    PubMed

    Martelli, Nicolas; Devaux, Capucine; van den Brink, Hélène; Billaux, Mathilde; Pineau, Judith; Prognon, Patrice; Borget, Isabelle

    2017-01-01

    The number of new medical devices for individual use that are launched annually exceeds the assessment capacity of the French national health technology assessment (HTA) agency. This has resulted in hospitals, and particularly university hospitals (UHs), developing hospital-based HTA initiatives to support their decisions for purchasing innovative devices. However, the methodologies used in such hospitals have no common basis. The aim of this study was to assess a mini-HTA model as a potential solution to harmonize HTA methodology in French UHs. A systematic review was conducted on Medline, Embase, Health Technology Assessment database, and Google Scholar to identify published articles reporting the use of mini-HTA tools and decision support-like models. A survey was also carried out in eighteen French UHs to identify in-house decision support tools. Finally, topics evaluated in the Danish mini-HTA model and in French UHs were compared using Jaccard similarity coefficients. Our findings showed differences between topics evaluated in French UHs and those assessed in decision support models from the literature. Only five topics among the thirteen most evaluated in French UHs were similar to those assessed in the Danish mini-HTA model. The organizational and ethical/social impacts were rarely explored among the surveyed models used in French UHs when introducing new medical devices. Before its widespread and harmonized use in French UHs, the mini-HTA model would first require adaptations to the French context.

  9. Integrating heterogeneous databases in clustered medical care environments using object-oriented technology

    NASA Astrophysics Data System (ADS)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side-effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object-oriented semantic association method to model information found in different databases into an integrated conceptual global model. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without attacking the autonomy of the underlying databases.
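The integration approach described above can be sketched with thin adapter objects that map each site's local schema into one shared conceptual model, leaving the local databases untouched. All class names, field names, and sample records below are invented for illustration; they are not the paper's actual method or any real site's schema:

```python
from dataclasses import dataclass

@dataclass
class GlobalPatient:
    """Integrated conceptual model shared across all sites (illustrative)."""
    patient_id: str
    name: str
    diagnosis: str

class CancerCenterAdapter:
    # Hypothetical local schema: {"mrn", "full_name", "dx"}
    def to_global(self, rec):
        return GlobalPatient(rec["mrn"], rec["full_name"], rec["dx"])

class DiabetesClinicAdapter:
    # Hypothetical local schema: {"pid", "first", "last", "condition"}
    def to_global(self, rec):
        return GlobalPatient(rec["pid"],
                             f'{rec["first"]} {rec["last"]}',
                             rec["condition"])

# Each site keeps its own format; applications query one unified view.
patients = [
    CancerCenterAdapter().to_global(
        {"mrn": "A1", "full_name": "Jane Doe", "dx": "melanoma"}),
    DiabetesClinicAdapter().to_global(
        {"pid": "B2", "first": "John", "last": "Roe",
         "condition": "type 2 diabetes"}),
]
```

Because the mapping lives entirely in the adapters, the underlying databases keep their autonomy, which is the property the abstract highlights.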

  10. Use of technology in children’s dietary assessment

    PubMed Central

    Boushey, CJ; Kerr, DA; Wright, J; Lutes, KD; Ebert, DS; Delp, EJ

    2010-01-01

    Background Information on dietary intake provides some of the most valuable insights for mounting intervention programmes for the prevention of chronic diseases. With the growing concern about adolescent overweight, the need to accurately measure diet becomes imperative. Assessment among adolescents is problematic as this group has irregular eating patterns and they have less enthusiasm for recording food intake. Subjects/Methods We used qualitative and quantitative techniques among adolescents to assess their preferences for dietary assessment methods. Results Dietary assessment methods using technology, for example, a personal digital assistant (PDA) or a disposable camera, were preferred over the pen and paper food record. Conclusions There was a strong preference for using methods that incorporate technology such as capturing images of food. This suggests that for adolescents, dietary methods that incorporate technology may improve cooperation and accuracy. Current computing technology includes higher resolution images, improved memory capacity and faster processors that allow small mobile devices to process information not previously possible. Our goal is to develop, implement and evaluate a mobile device (for example, PDA, mobile phone) food record that will translate to an accurate account of daily food and nutrient intake among adolescents. This mobile computing device will include digital images, a nutrient database and image analysis for identification and quantification of food consumption. Mobile computing devices provide a unique vehicle for collecting dietary information that reduces the burden on record keepers. Images of food can be marked with a variety of input methods that link the item for image processing and analysis to estimate the amount of food. Images before and after the foods are eaten can estimate the amount of food consumed. The initial stages and potential of this project will be described. PMID:19190645

  11. Use of technology in children's dietary assessment.

    PubMed

    Boushey, C J; Kerr, D A; Wright, J; Lutes, K D; Ebert, D S; Delp, E J

    2009-02-01

    Information on dietary intake provides some of the most valuable insights for mounting intervention programmes for the prevention of chronic diseases. With the growing concern about adolescent overweight, the need to accurately measure diet becomes imperative. Assessment among adolescents is problematic as this group has irregular eating patterns and they have less enthusiasm for recording food intake. We used qualitative and quantitative techniques among adolescents to assess their preferences for dietary assessment methods. Dietary assessment methods using technology, for example, a personal digital assistant (PDA) or a disposable camera, were preferred over the pen and paper food record. There was a strong preference for using methods that incorporate technology such as capturing images of food. This suggests that for adolescents, dietary methods that incorporate technology may improve cooperation and accuracy. Current computing technology includes higher resolution images, improved memory capacity and faster processors that allow small mobile devices to process information not previously possible. Our goal is to develop, implement and evaluate a mobile device (for example, PDA, mobile phone) food record that will translate to an accurate account of daily food and nutrient intake among adolescents. This mobile computing device will include digital images, a nutrient database and image analysis for identification and quantification of food consumption. Mobile computing devices provide a unique vehicle for collecting dietary information that reduces the burden on record keepers. Images of food can be marked with a variety of input methods that link the item for image processing and analysis to estimate the amount of food. Images before and after the foods are eaten can estimate the amount of food consumed. The initial stages and potential of this project will be described.

  12. 49 CFR 1104.3 - Copies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...-compatible format. All databases must be supported with adequate documentation on data attributes, SQL...

  13. A New Methodology for Systematic Exploitation of Technology Databases.

    ERIC Educational Resources Information Center

    Bedecarrax, Chantal; Huot, Charles

    1994-01-01

    Presents the theoretical aspects of a data analysis methodology that can help transform sequential raw data from a database into useful information, using the statistical analysis of patents as an example. Topics discussed include relational analysis and a technology watch approach. (Contains 17 references.) (LRW)

  14. Curriculum Connection. Take Technology Outdoors.

    ERIC Educational Resources Information Center

    Dean, Bruce Robert

    1992-01-01

    Technology can support hands-on science as elementary students use computers to formulate field guides to nature surrounding their school. Students examine other field guides; open databases for recording information; collect, draw, and identify plants, insects, and animals; enter data into the database; then generate a computerized field guide.…

  15. Massage therapy for children with autism spectrum disorders: a systematic review.

    PubMed

    Lee, Myeong Soo; Kim, Jong-In; Ernst, Edzard

    2011-03-01

    We aimed to assess the effectiveness of massage as a treatment option for autism. We searched the following electronic databases from their inception through March 2010: MEDLINE, AMED, CINAHL, EMBASE, PsycINFO, Health Technology Assessment, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, Psychology and Behavioral Sciences Collection, 6 Korean medical databases (KSI, DBpia, KISTEP, RISS, KoreaMed, and National Digital Library), China Academic Journal (through China National Knowledge Infrastructure), and 3 Japanese medical databases (Journal@rchive, Science Links Japan, and Japan Science & Technology link). The search phrase used was "(massage OR touch OR acupressure) AND (autistic OR autism OR Asperger's syndrome OR pervasive developmental disorder)." The references in all located articles were also searched. No language restrictions were imposed. Prospective controlled clinical studies of any type of massage therapy for autistic patients were included. Trials in which massage was part of a complex intervention were also included. Case studies, case series, qualitative studies, uncontrolled trials, studies that failed to provide detailed results, and trials that compared one type of massage with another were excluded. All articles were read by 2 independent reviewers (M.S.L. and J-I.K.), who extracted data from the articles according to predefined criteria. Risk of bias was assessed using the Cochrane classification. Of 132 articles, only 6 studies met our inclusion criteria. One randomized clinical trial found that massage plus conventional language therapy was superior to conventional language therapy alone for symptom severity (P < .05) and communication attitude (P < .01). 
    Two randomized clinical trials reported a significant benefit of massage for sensory profile (P < .01), adaptive behavior (P < .05), and language and social abilities (P < .01) as compared with a special education program. The fourth randomized clinical trial showed beneficial effects of massage for social communication (P < .05). Two nonrandomized controlled clinical trials suggested that massage therapy is effective. However, all of the included trials had a high risk of bias. The main limitations of the included studies were small sample sizes, predefined primary outcome measures, inadequate control for nonspecific effects, and a lack of power calculations or adequate follow-up. Limited evidence exists for the effectiveness of massage as a symptomatic treatment of autism. Because the risk of bias was high, firm conclusions cannot be drawn. Future, more rigorous randomized clinical trials seem to be warranted. © Copyright 2011 Physicians Postgraduate Press, Inc.

  16. An international aerospace information system - A cooperative opportunity

    NASA Technical Reports Server (NTRS)

    Blados, Walter R.; Cotter, Gladys A.

    1992-01-01

    This paper presents for consideration new possibilities for uniting the various aerospace database efforts toward a cooperative international aerospace database initiative that can optimize the cost-benefit equation for all members. The development of astronautics and aeronautics in individual nations has led to initiatives for national aerospace databases. Technological developments in information technology and science, as well as the reality of scarce resources, make it necessary to reconsider the mutually beneficial possibilities offered by cooperation and international resource sharing.

  17. Quality Attribute-Guided Evaluation of NoSQL Databases: A Case Study

    DTIC Science & Technology

    2015-01-16

    evaluations of NoSQL databases specifically, and big data systems in general, that have become apparent during our study. Keywords—NoSQL, distributed...technology, namely that of big data, software systems [1]. At the heart of big data systems are a collection of database technologies that are more...born organizations such as Google and Amazon [3][4], along with those of numerous other big data innovators, have created a variety of open source and

  18. The National Landslide Database of Great Britain: Acquisition, communication and the role of social media

    NASA Astrophysics Data System (ADS)

    Pennington, Catherine; Freeborough, Katy; Dashwood, Claire; Dijkstra, Tom; Lawrie, Kenneth

    2015-11-01

    The British Geological Survey (BGS) is the national geological agency for Great Britain that provides geoscientific information to government, other institutions and the public. The National Landslide Database has been developed by the BGS and is the focus for national geohazard research for landslides in Great Britain. The history and structure of the geospatial database and associated Geographical Information System (GIS) are explained, along with the future developments of the database and its applications. The database is the most extensive source of information on landslides in Great Britain with over 17,000 records of landslide events to date, each documented as fully as possible for inland, coastal and artificial slopes. Data are gathered through a range of procedures, including: incorporation of other databases; automated trawling of current and historical scientific literature and media reports; new field- and desk-based mapping technologies with digital data capture, and using citizen science through social media and other online resources. This information is invaluable for directing the investigation, prevention and mitigation of areas of unstable ground in accordance with Government planning policy guidelines. The national landslide susceptibility map (GeoSure) and a national landslide domains map currently under development, as well as regional mapping campaigns, rely heavily on the information contained within the landslide database. Assessing susceptibility to landsliding requires knowledge of the distribution of failures, an understanding of causative factors, their spatial distribution and likely impacts, whilst understanding the frequency and types of landsliding present is integral to modelling how rainfall will influence the stability of a region. 
Communication of landslide data through the Natural Hazard Partnership (NHP) and Hazard Impact Model contributes to national hazard mitigation and disaster risk reduction with respect to weather and climate. Daily reports of landslide potential are published by BGS through the NHP partnership and data collected for the National Landslide Database are used widely for the creation of these assessments. The National Landslide Database is freely available via an online GIS and is used by a variety of stakeholders for research purposes.

  19. An Improved Database System for Program Assessment

    ERIC Educational Resources Information Center

    Haga, Wayne; Morris, Gerard; Morrell, Joseph S.

    2011-01-01

    This research paper presents a database management system for tracking course assessment data and reporting related outcomes for program assessment. It improves on a database system previously presented by the authors and in use for two years. The database system presented is specific to assessment for ABET (Accreditation Board for Engineering and…

  20. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  1. Health technology management: a database analysis as support of technology managers in hospitals.

    PubMed

    Miniati, Roberto; Dori, Fabrizio; Iadanza, Ernesto; Fregonara, Mario M; Gentili, Guido Biffi

    2011-01-01

    Technology management in healthcare must continually respond and adapt itself to new improvements in medical equipment. Multidisciplinary approaches which consider the interaction of different technologies, their use and user skills, are necessary in order to improve safety and quality. An easy and sustainable methodology is vital to Clinical Engineering (CE) services in healthcare organizations in order to define criteria regarding technology acquisition and replacement. This article underlines the critical aspects of technology management in hospitals by providing appropriate indicators for benchmarking CE services exclusively referring to the maintenance database from the CE department at the Careggi Hospital in Florence, Italy.

  2. Optoelectronics-related competence building in Japanese and Western firms

    NASA Astrophysics Data System (ADS)

    Miyazaki, Kumiko

    1992-05-01

    In this paper, an analysis is made of how different firms in Japan and the West have developed competence related to optoelectronics on the basis of their previous experience and corporate strategies. The sample consists of a set of seven Japanese and four Western firms in the industrial, consumer electronics and materials sectors. Optoelectronics is divided into subfields including optical communications systems, optical fibers, optoelectronic key components, liquid crystal displays, optical disks, and others. The relative strengths and weaknesses of companies in the various subfields are determined using the INSPEC database covering 1976 to 1989. Parallel data are analyzed using OTAF U.S. patent statistics and the two sets of data are compared. The statistical analysis from the database is summarized for firms in each subfield in the form of an intra-firm technology index (IFTI), a new technique introduced to assess the revealed technology advantage of firms. The quantitative evaluation is complemented by results from intensive interviews with the management and scientists of the firms involved. The findings show a marked variation in how firms' technological trajectories have evolved, giving rise to strengths in some subfields and weaknesses in others; these differences are related to the firms' accumulated core competencies, previous core business activities, and organizational, marketing, and competitive factors.
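    The abstract does not give the IFTI formula. A standard revealed-advantage construction (analogous to revealed comparative advantage in trade statistics) compares a firm's own output share in a subfield with that subfield's share of all firms' output; the sketch below assumes that form, and the subfield counts are invented:

```python
def revealed_technology_index(firm_counts, all_counts, subfield):
    """Hypothetical RTA-style index: (firm's share of its own output in a
    subfield) / (that subfield's share of total output across all firms).
    Values above 1 suggest relative strength, below 1 relative weakness."""
    firm_share = firm_counts[subfield] / sum(firm_counts.values())
    overall_share = all_counts[subfield] / sum(all_counts.values())
    return firm_share / overall_share

# Invented publication counts per subfield, for one firm and for all firms.
firm = {"optical_fibers": 30, "lcd": 10, "optical_disks": 10}
total = {"optical_fibers": 100, "lcd": 100, "optical_disks": 100}
print(round(revealed_technology_index(firm, total, "optical_fibers"), 2))
```

    Here the firm devotes 60% of its output to optical fibers while the subfield accounts for only a third of all output, so the index exceeds 1, flagging a relative strength.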

  3. Methods for systematic reviews of health economic evaluations: a systematic review, comparison, and synthesis of method literature.

    PubMed

    Mathes, Tim; Walgenbach, Maren; Antoine, Sunya-Lee; Pieper, Dawid; Eikermann, Michaela

    2014-10-01

    The quality of systematic reviews of health economic evaluations (SR-HE) is often limited because of methodological shortcomings. One reason for this poor quality is that there are no established standards for the preparation of SR-HE. The objective of this study is to compare existing methods and suggest best practices for the preparation of SR-HE. To identify the relevant methodological literature on SR-HE, a systematic literature search was performed in Embase, Medline, the National Health System Economic Evaluation Database, the Health Technology Assessment Database, and the Cochrane methodology register, and webpages of international health technology assessment agencies were searched. The study selection was performed independently by 2 reviewers. Data were extracted by one reviewer and verified by a second reviewer. On the basis of the overlaps in the recommendations for the methods of SR-HE in the included papers, suggestions for best practices for the preparation of SR-HE were developed. Nineteen relevant publications were identified. The recommendations within them often differed. However, for most process steps there was some overlap between recommendations for the methods of preparation. The overlaps were taken as basis on which to develop suggestions for the following process steps of preparation: defining the research question, developing eligibility criteria, conducting a literature search, selecting studies, assessing the methodological study quality, assessing transferability, and synthesizing data. The differences in the proposed recommendations are not always explainable by the focus on certain evaluation types, target audiences, or integration in the decision process. Currently, there seem to be no standard methods for the preparation of SR-HE. The suggestions presented here can contribute to the harmonization of methods for the preparation of SR-HE. © The Author(s) 2014.

  4. Directory of Assistive Technology: Data Sources.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    The annotated directory describes in detail both on-line and print databases in the area of assistive technology for individuals with disabilities. For each database, the directory provides the name, address, and telephone number of the sponsoring organization; disability areas served; number of hardware and software products; types of information…

  5. The Effect of Relational Database Technology on Administrative Computing at Carnegie Mellon University.

    ERIC Educational Resources Information Center

    Golden, Cynthia; Eisenberger, Dorit

    1990-01-01

    Carnegie Mellon University's decision to standardize its administrative system development efforts on relational database technology and structured query language is discussed and its impact is examined in one of its larger, more widely used applications, the university information system. Advantages, new responsibilities, and challenges of the…

  6. Utility-Scale Energy Technology Capacity Factors | Energy Analysis | NREL

    Science.gov Websites

    Transparent Cost Database. This chart indicates the range of recent capacity factor estimates for utility-scale technologies. For technology cost and performance estimates, including NREL's information regarding vehicles, biofuels, electricity generation, and capital costs, please visit the Transparent Cost Database website.

  7. Understanding transit accidents using the National Transit Database and the role of Transit Intelligent Vehicle Initiative Technology in reducing accidents

    DOT National Transportation Integrated Search

    2004-06-01

    This report documents the results of bus accident data analysis using the 2002 National Transit Database (NTD) and discusses the potential of using advanced technology being studied and developed under the U.S. Department of Transportation's (U.S. ...

  8. Research on sudden environmental pollution public service platform construction based on WebGIS

    NASA Astrophysics Data System (ADS)

    Bi, T. P.; Gao, D. Y.; Zhong, X. Y.

    2016-08-01

    In order to enable social sharing and public service delivery of emergency-response information for sudden pollution accidents, so that the public can access risk-source information, dangerous-goods control technology services, and related resources, SQL Server and ArcSDE are used to establish a spatial database storing information on risk sources, hazardous chemicals, and handling methods in case of accidents. Combined with Chinese atmospheric environmental assessment standards, the SCREEN3 atmospheric dispersion model and a one-dimensional liquid diffusion model are implemented to support queries of related information and display of dispersion effects under a B/S (browser/server) structure. Based on WebGIS technology, the C#.NET language is used to develop the sudden environmental pollution public service platform. As a result, the public service platform can support risk assessments and provide the most suitable emergency processing services.
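    SCREEN3 is built around a Gaussian plume formulation. As a rough illustration only (not SCREEN3 itself, which adds stability-class dispersion curves, mixing-height limits, and downwash corrections), the ground-reflected plume concentration can be sketched as follows; the release parameters are hypothetical:

```python
import math

def gaussian_plume(Q, u, sigma_y, sigma_z, y, z, H):
    """Ground-reflected Gaussian plume concentration (g/m^3).
    Q: emission rate (g/s); u: wind speed (m/s); H: effective stack
    height (m); sigma_y/sigma_z: dispersion coefficients (m) at the
    downwind distance of interest; y: crosswind offset (m); z: receptor
    height (m). The second vertical term reflects the plume off the ground."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical release: 100 g/s source, 5 m/s wind, 50 m effective height,
# dispersion coefficients taken at some fixed downwind distance.
c = gaussian_plume(Q=100, u=5, sigma_y=60, sigma_z=30, y=0, z=0, H=50)
print(f"centerline ground-level concentration: {c:.2e} g/m^3")
```

    A screening tool evaluates this expression over a range of downwind distances (each with its own sigma values) and reports the maximum.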

  9. Is home health technology adequate for proactive self-care?

    PubMed

    Horwitz, C M; Mueller, M; Wiley, D; Tentler, A; Bocko, M; Chen, L; Leibovici, A; Quinn, J; Shar, A; Pentland, A P

    2008-01-01

    To understand whether home health technology in the market and in development can satisfy the needs of patients and their non-professional caregivers for proactive support in managing health and chronic conditions in the home. A panel of clinical providers and technology researchers was assembled to examine whether home health technology addresses consumer-defined requirements for self-care devices. A lexicon of home care and self-care technology terms was then created. A global survey of home health technology for patients with heart disease and dementia was conducted. The 254 items identified were categorized by conditions treated, primary user, function, and purpose. A focus group of patients and caregivers was convened to describe their expectations of self-care technology. Items identified in the database were then assessed for these attributes. Patients and family caregivers indicated a need for intelligent self-care technology which supports early diagnosis of health changes, intervention enablement, and improvement of communication quality among patients and the health care system. Of these, only intervention enablement was commonly found in the home health technology items identified. An opportunity exists to meet consumer self-care needs through increased research and development in intelligent self-care technology.

  10. Regional Educational Laboratory Electronic Network Phase 2 System

    NASA Technical Reports Server (NTRS)

    Cradler, John

    1995-01-01

    The Far West Laboratory, in collaboration with the other regional educational laboratories, is establishing a regionally coordinated telecommunication network to electronically interconnect each of the ten regional laboratories with educators and education stakeholders from the school to the state level. For the national distributed information database, each lab is working with mid-level networks to establish a common interface for networking throughout the country and to include topics of importance to education reform, such as assessment and technology planning.

  11. Remote Sensing Applied to Geology (Latest Citations from the Aerospace Database)

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The bibliography contains citations concerning the use of remote sensing in geological resource exploration. Technologies discussed include thermal, optical, photographic, and electronic imaging using ground-based, aerial, and satellite-borne devices. Analog and digital techniques to locate, classify, and assess geophysical features, structures, and resources are also covered. Application of remote sensing to petroleum and minerals exploration is treated in a separate bibliography. (Contains 50-250 citations and includes a subject term index and title list.)

  12. Comparing the sustainability impacts of solar thermal and natural gas combined cycle for electricity production in Mexico: Accounting for decision makers' priorities

    NASA Astrophysics Data System (ADS)

    Rodríguez-Serrano, Irene; Caldés, Natalia; Oltra, Christian; Sala, Roser

    2017-06-01

    The aim of this paper is to conduct a comprehensive sustainability assessment of electricity generation with two alternative technologies by estimating their economic, environmental and social impacts through the "Framework for Integrated Sustainability Assessment" (FISA). Based on a Multiregional Input Output (MRIO) model linked to a social risk database (the Social Hotspot Database), the framework accounts for up to fifteen impacts across the three sustainability pillars along the supply chain of electricity production from Solar Thermal Electricity (STE) and Natural Gas Combined Cycle (NGCC) technologies in Mexico. Except for value creation, results show larger negative impacts for NGCC, particularly in the environmental pillar. Next, these impacts are transformed into "Aggregated Sustainability Endpoints" (ASE points) as a way to support decision making in selecting the most sustainable project. The ASE points obtained are then compared with the points weighted by the reported priorities of Mexican decision makers in the energy sector, obtained from a questionnaire survey. The comparison shows that NGCC receives a negative score 1.94 times worse than STE's, but after incorporating decision makers' priorities the ratio increases to 2.06, owing to the relevance given to environmental impacts such as photochemical oxidant formation and climate change potential, as well as social risks such as human rights risks.
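    The priority-weighting step can be illustrated generically. The sketch below uses invented impact scores and weights, not the study's data or the FISA method itself; it only shows how shifting weight onto the environmental pillar can widen the ratio between two technologies' aggregate scores, mirroring the pattern reported above:

```python
def weighted_score(impacts, weights):
    """Aggregate normalized impact scores (negative = adverse) under a set
    of pillar weights. impacts/weights: dicts keyed by pillar; weights sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(impacts[k] * weights[k] for k in impacts)

# Invented, normalized pillar scores for two technologies.
ngcc = {"economic": -0.2, "environmental": -0.9, "social": -0.4}
ste  = {"economic": -0.3, "environmental": -0.3, "social": -0.2}

equal = {"economic": 1/3, "environmental": 1/3, "social": 1/3}
prio  = {"economic": 0.2, "environmental": 0.5, "social": 0.3}  # env-heavy

print(weighted_score(ngcc, equal) / weighted_score(ste, equal))  # unweighted ratio
print(weighted_score(ngcc, prio) / weighted_score(ste, prio))    # ratio after priorities
```

    With equal weights the ratio reflects the raw scores; once environmental impacts carry more weight, the technology with the worse environmental score falls further behind.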

  13. Methods to elicit probability distributions from experts: a systematic review of reported practice in health technology assessment.

    PubMed

    Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken

    2013-11-01

    Elicitation is a technique that can be used to obtain probability distributions from experts about unknown quantities. We conducted a methodology review of reports where probability distributions had been elicited from experts to be used in model-based health technology assessments. Databases including MEDLINE, EMBASE and the CRD database were searched from inception to April 2013. Reference lists were checked and citation mapping was also used. Studies describing their approach to the elicitation of probability distributions were included. Data were abstracted on pre-defined aspects of the elicitation technique. Reports were critically appraised on their consideration of the validity, reliability and feasibility of the elicitation exercise. Fourteen articles were included. Across these studies, the most marked features were heterogeneity in elicitation approach and failure to report key aspects of the elicitation method. The most frequently used approaches to elicitation were the histogram technique and the bisection method. Only three papers explicitly considered the validity, reliability and feasibility of the elicitation exercises. Judged by the studies identified in the review, reports of expert elicitation are insufficiently detailed, and this impacts on the perceived usability of expert-elicited probability distributions. In this context, the wider credibility of elicitation will only be improved by better reporting and greater standardisation of approach. Until then, the advantage of eliciting probability distributions from experts may be lost.
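    As an illustration of what the bisection method produces: the expert is asked for a median, then for the quartiles that bisect each half. The sketch below fits a normal distribution to those three judgments. The elicited values are hypothetical, and fitting a normal is one convenient choice among many, not something the review prescribes:

```python
def normal_from_quartiles(q1, median, q3):
    """Return (mu, sigma) of a normal matching an elicited median and IQR.
    For a normal, each quartile sits 0.6745 standard deviations from the
    mean, so the interquartile range spans 2 * 0.6745 sigma."""
    mu = median
    sigma = (q3 - q1) / (2 * 0.6745)
    return mu, sigma

# Hypothetical elicitation: the expert judges the quantity's median to be
# 0.30, with lower and upper quartiles at 0.25 and 0.35.
mu, sigma = normal_from_quartiles(0.25, 0.30, 0.35)
print(mu, round(sigma, 4))
```

    A markedly asymmetric set of quartiles would signal that a skewed family (e.g. lognormal or beta) is a better fit than the normal assumed here.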

  14. ICT-based applications to improve social health and social participation in older adults with dementia. A systematic literature review.

    PubMed

    Pinto-Bruno, Ángel C; García-Casal, J Antonio; Csipke, Emese; Jenaro-Río, Cristina; Franco-Martín, Manuel

    2017-01-01

    Information and communication technology (ICT) developers, together with dementia experts, have created several technological solutions to improve and facilitate social health, social participation, and quality of life for older adults living with dementia. However, there is a need to carry out a systematic literature review focusing on the validity and efficacy of these new technologies, assessing their utility for promoting 'social health' and 'active ageing' in people with dementia. Searches in electronic databases identified 3824 articles, of which 6 met the inclusion criteria and were coded according to their methodological approach, sample sizes, type of outcomes and results. The six papers reported the use of 10 different interventions with people with dementia. Four qualitative studies showed a benefit of the use of technologies to foster social participation in people with dementia. At the same time, barriers to a widespread use of these technologies in this population were identified. A quantitative study and a mixed-method study with quantitative outcomes showed that ICT-based interventions promote more social behaviours than non-technology-based interventions. In recent years, several technological devices for living independently and fostering social health and social participation in people with dementia have been developed. However, specific outcome measures to assess social health and social participation are needed. Even though the analysed studies provided some evidence base for the use of technology in this field, there is an urgent need to develop high-quality studies and specific outcome measures.

  15. Mobile and Web 2.0 interventions for weight management: an overview of review evidence and its methodological quality

    PubMed Central

    Smith, Jane R.; Samaha, Laya; Abraham, Charles

    2016-01-01

    Background: The use of Internet and related technologies for promoting weight management (WM), physical activity (PA), or dietary-related behaviours has been examined in many articles and systematic reviews. This overview aims to summarize and assess the quality of the review evidence specifically focusing on mobile and Web 2.0 technologies, which are the most utilized, currently available technologies. Methods: Following a registered protocol (CRD42014010323), we searched 16 databases for articles published in English until 31 December 2014 discussing the use of either mobile or Web 2.0 technologies to promote WM or related behaviors, i.e. diet and physical activity (PA). Two reviewers independently selected reviews and assessed their methodological quality using the AMSTAR checklist. Citation matrices were used to determine the overlap among reviews. Results: Forty-four eligible reviews were identified, 39 of which evaluated the effects of interventions using mobile or Web 2.0 technologies. Methodological quality was generally low with only 7 reviews (16%) meeting the highest standards. Suggestive evidence exists for positive effects of mobile technologies on weight-related outcomes and, to a lesser extent, PA. Evidence is inconclusive regarding Web 2.0 technologies. Conclusions: Reviews on mobile and Web 2.0 interventions for WM and related behaviors suggest that these technologies can, under certain circumstances, be effective, but conclusions are limited by poor review quality based on a heterogeneous evidence base. PMID:27335330

  16. Application and Exploration of Big Data Mining in Clinical Medicine.

    PubMed

    Zhang, Yue; Guo, Shu-Li; Han, Li-Na; Li, Tie-Ling

    2016-03-20

    To review theories and technologies of big data mining and their application in clinical medicine. Literature published in English or Chinese regarding theories and technologies of big data mining and the concrete applications of data mining technology in clinical medicine was obtained from PubMed and the Chinese Hospital Knowledge Database from 1975 to 2015. Original articles regarding big data mining theory/technology and big data mining's application in the medical field were selected. This review characterized the basic theories and technologies of big data mining including fuzzy theory, rough set theory, cloud theory, Dempster-Shafer theory, artificial neural network, genetic algorithm, inductive learning theory, Bayesian network, decision tree, pattern recognition, high-performance computing, and statistical analysis. The application of big data mining in clinical medicine was analyzed in the fields of disease risk assessment, clinical decision support, prediction of disease development, guidance of rational use of drugs, medical management, and evidence-based medicine. Big data mining has the potential to play an important role in clinical medicine.

  17. FJET Database Project: Extract, Transform, and Load

    NASA Technical Reports Server (NTRS)

    Samms, Kevin O.

    2015-01-01

    The Data Mining & Knowledge Management team at Kennedy Space Center is providing data management services to the Frangible Joint Empirical Test (FJET) project at Langley Research Center (LARC). FJET is a project under the NASA Engineering and Safety Center (NESC). The purpose of FJET is to conduct an assessment of mild detonating fuse (MDF) frangible joints (FJs) for human spacecraft separation tasks in support of the NASA Commercial Crew Program. The Data Mining & Knowledge Management team has been tasked with creating and managing a database for the efficient storage and retrieval of FJET test data. This paper details the Extract, Transform, and Load (ETL) process as it relates to gathering FJET test data into a Microsoft SQL relational database and making that data available to the data users. Lessons learned, procedures implemented, and programming code samples are discussed to illustrate how the Data Mining & Knowledge Management team adapted to changing requirements and new technology while maintaining flexibility of design in various aspects of the data management project.
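    The ETL pattern described above can be sketched compactly. In this sketch Python's built-in sqlite3 stands in for the project's Microsoft SQL Server database, and the table, column names, and row values are hypothetical rather than taken from the FJET schema:

```python
import csv
import io
import sqlite3

# Extract: raw test data as it might arrive from the field (hypothetical).
raw = "test_id,joint_type,peak_pressure_psi\nT-001,MDF,1250\nT-002,MDF,1310\n"

conn = sqlite3.connect(":memory:")  # stand-in for the SQL Server instance
conn.execute("""CREATE TABLE fjet_results (
    test_id TEXT PRIMARY KEY,
    joint_type TEXT,
    peak_pressure_psi REAL)""")

# Transform and Load: parse each row, cast the numeric field, and insert
# via a parameterized statement.
for row in csv.DictReader(io.StringIO(raw)):
    conn.execute(
        "INSERT INTO fjet_results VALUES (?, ?, ?)",
        (row["test_id"], row["joint_type"], float(row["peak_pressure_psi"])))
conn.commit()

# Retrieval: the loaded data is now queryable by the data users.
avg, = conn.execute("SELECT AVG(peak_pressure_psi) FROM fjet_results").fetchone()
print(avg)
```

    Keeping the transform step as explicit per-column casts is what makes the pipeline easy to adjust when upstream file formats change, which is the flexibility concern the paper discusses.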

  18. Development of Climate Change Adaptation Platform using Spatial Information

    NASA Astrophysics Data System (ADS)

    Lee, J.; Oh, K. Y.; Lee, M. J.; Han, W. J.

    2014-12-01

    Climate change adaptation has attracted growing attention with the recent extreme weather conditions that affect people around the world. More and more countries, including the Republic of Korea, have begun to develop adaptation plans to address these concerns. All of them, however, have noted that the first step is to integrate climate information across all analysed areas, because climate information does not come from a single independent source; rather, the different kinds of climate information are interconnected in complicated ways. For this reason, an integrated climate change adaptation platform must be in place before a climate change adaptation plan can be established. A large-scale project has therefore been launched to this end. To date, we have reviewed 620 literature sources and interviewed 51 government organizations. Based on the results of these reviews and interviews, we identified 2,725 impacts relevant to vulnerability assessment in areas such as monitoring and forecasting, health, disaster, agriculture, forest, water management, ecosystem, ocean/fisheries, and industry/energy. Of these 2,725 impacts, 995 have been entered into a database so far. The database is organized into the three sub-categories presented by the IPCC: climate exposure, sensitivity, and adaptive capacity. Based on the constructed database, vulnerability assessments were carried out to evaluate the climate change capacity of local governments across the country, using a web-based vulnerability assessment tool newly developed through this project. The results show that metropolitan areas such as Seoul, Pusan, and Incheon have risk scores more than twice those of rural areas. 
    Acknowledgements: The authors appreciate the support that this study has received from "Development of integrated model for climate change impact and vulnerability assessment and strengthening the framework for model implementation", an initiative of the Korea Environmental & Industry Technology Institute.

  19. Computer-assisted history-taking systems (CAHTS) in health care: benefits, risks and potential for further development.

    PubMed

    Pappas, Yannis; Anandan, Chantelle; Liu, Joseph; Car, Josip; Sheikh, Aziz; Majeed, Azeem

    2011-01-01

    A computer-assisted history-taking system (CAHTS) is a tool that aids clinicians in gathering data from patients to inform a diagnosis or treatment plan. Despite the many possible applications and even though CAHTS have been available for nearly three decades, these remain underused in routine clinical practice. Through an interpretative review of the literature, we provide an overview of the field of CAHTS, which also offers an understanding of the impact of these systems on policy, practice and research. We conducted a search and critique of the literature on CAHTS. Using a comprehensive set of terms, we searched: MEDLINE, EMBASE, The Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, The Cochrane Central Register of Controlled Trials, The Cochrane Methodology Register, Health Technology Assessment Database and the NHS Economic Evaluation Database over a ten-year period (January 1997 to May 2007) to identify systematic reviews, technical reports and health technology assessments, and randomised controlled trials. The systematic review of the literature suggests that CAHTS can save professionals' time, improve delivery of care to those with special needs and also facilitate the collection of information, especially potentially sensitive information (e.g. sexual history, alcohol consumption). The use of CAHTS also has disadvantages that impede the process of history taking and may pose risks to patients. CAHTS are inherently limited when detecting non-verbal communication, may pose irrelevant questions and frustrate the users with technical problems. Our review suggests that barriers such as a preference for pen-and-paper methods and concerns about data loss and security still exist and affect the adoption of CAHTS. 
In terms of policy and practice, CAHTS make it possible to input data from disparate sites, facilitating both multi-site working and the collection of data for nationwide screening programmes such as the vascular risk assessment programme for people aged 40-74, now starting in England. Our review shows that for CAHTS to be adopted in mainstream health care, important changes must take place in how we conceive, plan and conduct primary and secondary research on the topic, so as to provide the framework for a comprehensive evaluation that will build an evidence base to inform policy and practice.

  20. Realization of Real-Time Clinical Data Integration Using Advanced Database Technology

    PubMed Central

    Yoo, Sooyoung; Kim, Boyoung; Park, Heekyong; Choi, Jinwook; Chun, Jonghoon

    2003-01-01

As information and communication technologies have advanced, interest in mobile health care systems has grown. In order to obtain information seamlessly from distributed and fragmented clinical data held by heterogeneous institutions, we need solutions that integrate data. In this article, we introduce a method for information integration based on real-time message communication using trigger and advanced database technologies. Messages were devised to conform to HL7, a standard for electronic data exchange in healthcare environments. The HL7-based system provides us with an integrated environment in which we are able to manage the complexities of medical data. We developed this message communication interface to generate and parse HL7 messages automatically from the database point of view. We discuss how easily real-time data exchange is performed in the clinical information system, given the requirement for minimum loading of the database system. PMID:14728271

  1. Distribution System Upgrade Unit Cost Database

    DOE Data Explorer

    Horowitz, Kelsey

    2017-11-30

This database contains unit cost information for different components that may be used to integrate distributed photovoltaic (D-PV) systems onto distribution systems. Some of these upgrades and costs may also apply to the integration of other distributed energy resources (DER). Which components are required, and how many of each, is system-specific and should be determined by analyzing the effects of distributed PV at a given penetration level on the circuit of interest, in combination with engineering assessments of the efficacy of different solutions to increase the ability of the circuit to host additional PV as desired. The current state of the distribution system should always be considered in these types of analyses. The data in this database were collected from a variety of utilities, PV developers, technology vendors, and published research reports. Where possible, we have included information on the source of each data point and relevant notes. In some cases where the data provided are sensitive or proprietary, we were not able to specify the source, but we provide other information that may be useful to the user (e.g., year, location where equipment was installed). NREL has carefully reviewed these sources prior to inclusion in this database. Additional information about the database, data sources, and assumptions is included in the "Unit_cost_database_guide.doc" file included in this submission. This guide provides important information on what costs are included in each entry. Please refer to this guide before using the unit cost database for any purpose.

  2. Evaluation of a Computerized Clinical Information System (Micromedex).

    PubMed Central

    Lundsgaarde, H. P.; Moreshead, G. E.

    1991-01-01

    This paper summarizes data collected as part of a project designed to identify and assess the technical and organizational problems associated with the implementation and evaluation of a Computerized Clinical Information System (CCIS), Micromedex, in three U.S. Department of Veterans Affairs Medical Centers (VAMCs). The study began in 1987 as a national effort to implement decision support technologies in the Veterans Administration Decentralized Hospital Computer Program (DHCP). The specific objectives of this project were to (1) examine one particular decision support technology, (2) identify the technical and organizational barriers to the implementation of a CCIS in the VA host environment, (3) assess the possible benefits of this system to VA clinicians in terms of therapeutic decision making, and (4) develop new methods for identifying the clinical utility of a computer program designed to provide clinicians with a new information tool. The project was conducted intermittently over a three-year period at three VA medical centers chosen as implementation and evaluation test sites for Micromedex. Findings from the Kansas City Medical Center in Missouri are presented to illustrate some of the technical problems associated with the implementation of a commercial database program in the DHCP host environment, the organizational factors influencing clinical use of the system, and the methods used to evaluate its use. Data from 4581 provider encounters with the CCIS are summarized. Usage statistics are presented to illustrate the methodological possibilities for assessing the "benefits and burdens" of a computerized information system by using an automated collection of user demographics and program audit trails that allow evaluators to monitor user interactions with different segments of the database. PMID:1807583

  4. Visibility of medical informatics regarding bibliometric indices and databases

    PubMed Central

    2011-01-01

Background The quantitative study of publication output (bibliometrics) deeply influences how scientific work is perceived (bibliometric visibility). Recently, new bibliometric indices and databases have been established, which may change the visibility of disciplines, institutions and individuals. This study examines the effects of the new indices on the visibility of Medical Informatics. Methods By objective criteria, three sets of journals were chosen, two representing Medical Informatics and a third addressing Internal Medicine as a benchmark. The availability of index data (index coverage) and the aggregate scores of these corpora are compared for journal-related (journal impact factor, Eigenfactor metrics, SCImago journal rank) and author-related indices (Hirsch index, Egghe's g-index). Correlation analysis compares the dependence of author-related indices. Results The bibliometric visibility depended on the research focus and the citation database: Scopus covers more journals relevant to Medical Informatics than ISI/Thomson Reuters. Journals focused on Medical Informatics methodology were negatively affected by the Eigenfactor metrics, while visibility benefited from an interdisciplinary research focus. The correlation between Hirsch indices computed on citation databases and on the Internet was strong. Conclusions The visibility of smaller technology-oriented disciplines like Medical Informatics is changed by the new bibliometric indices and databases, possibly prompting adapted publication strategies. Freely accessible author-related indices enable an easy and adequate individual assessment. PMID:21496230

  5. Automating Relational Database Design for Microcomputer Users.

    ERIC Educational Resources Information Center

    Pu, Hao-Che

    1991-01-01

    Discusses issues involved in automating the relational database design process for microcomputer users and presents a prototype of a microcomputer-based system (RA, Relation Assistant) that is based on expert systems technology and helps avoid database maintenance problems. Relational database design is explained and the importance of easy input…

  6. Marine and Hydrokinetic Data | Geospatial Data Science | NREL

    Science.gov Websites

The U.S. Department of Energy's Marine and Hydrokinetic Technology Database provides information on wave, tidal, current, and ocean thermal energy and contains information about energy projects and technologies. It includes an assessment of the wave energy resource using a 51-month Wavewatch III hindcast database.

  7. The impact of parallel regulatory-health technology assessment scientific advice on clinical development. Assessing the uptake of regulatory and health technology assessment recommendations.

    PubMed

    Tafuri, Giovanni; Lucas, Inês; Estevão, Steve; Moseley, Jane; d'Andon, Anne; Bruehl, Hannah; Gajraj, Elangovan; Garcia, Sonia; Hedberg, Niklas; Massari, Marco; Molina, Andrea; Obach, Mercè; Osipenko, Leeza; Petavy, Frank; Petschulies, Marco; Pontes, Caridad; Russo, Pierluigi; Schiel, Anja; Van de Casteele, Marc; Zebedin-Brandl, Eva-Maria; Rasi, Guido; Vamvakas, Spiros

    2018-05-01

The parallel regulatory-health technology assessment scientific advice (PSA) procedure allows manufacturers to receive simultaneous feedback from both EU regulators and health technology assessment (HTA) bodies on development plans for new medicines. The primary objective of the present study is to investigate whether PSA is integrated into the clinical development programmes for which advice was sought. Contents of the PSA provided by regulators and HTA bodies for each procedure between 2010 and 2015 were analysed. The development of all clinical studies for which PSA had been sought was tracked using three different databases. The rate of uptake of the advice provided by regulators and HTA bodies was assessed on two key variables: comparator(s) and primary endpoint. In terms of uptake of comparator recommendations at the time of PSA in the actual development, our analysis showed that manufacturers implemented comparators that addressed both the needs of regulators and of at least one HTA body in 12 of 21 studies. For primary endpoints, in all included studies manufacturers addressed both the needs of the regulators and of at least one HTA body. One of the key findings of this analysis is that manufacturers tend to implement changes to the development programme based on both regulatory and HTA advice with regard to the choice of primary endpoint and comparator. It also confirms the challenging choice of the study comparator, for which manufacturers seem to be more inclined to satisfy the regulatory advice. Continuous research efforts in this area are of paramount importance from a public health perspective. © 2018 The British Pharmacological Society.

  8. Energy science and technology database (on the internet). Online data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

The Energy Science and Technology Database (EDB) is a multidisciplinary file containing worldwide references to basic and applied scientific and technical research literature. The information is collected for use by government managers, researchers at the national laboratories, and other research efforts sponsored by the U.S. Department of Energy, and the results of this research are transferred to the public. Abstracts are included for records from 1976 to the present. The EDB also contains the Nuclear Science Abstracts, a comprehensive abstract and index collection of the international nuclear science and technology literature for the period 1948 through 1976. Included are scientific and technical reports of the U.S. Atomic Energy Commission, U.S. Energy Research and Development Administration and its contractors, other agencies, universities, and industrial and research organizations. Approximately 25% of the records in the file contain abstracts. Nuclear Science Abstracts contains over 900,000 bibliographic records. The entire Energy Science and Technology Database contains over 3 million bibliographic records. The database is now available for searching through the GOV.Research-Center (GRC) service. GRC is a single online web-based search service to well-known Government databases. Featuring powerful search and retrieval software, GRC is an important research tool. The GRC web site is at http://grc.ntis.gov.

  9. Cost collection and analysis for health economic evaluation.

    PubMed

    Smith, Kristine A; Rudmik, Luke

    2013-08-01

To improve the understanding of common health care cost collection, estimation, analysis, and reporting methodologies. Ovid MEDLINE (1947 to December 2012), Cochrane Central Register of Controlled Trials, Database of Systematic Reviews, Health Technology Assessment, and National Health Service Economic Evaluation Database. This article discusses the following cost collection methods: defining relevant resources, quantification of consumed resources, and resource valuation. It outlines the recommendations for cost reporting in economic evaluations and reviews techniques for handling cost data uncertainty. Lastly, it discusses the controversial topics of future costs and patient productivity losses. Health care cost collection and estimation can be challenging, and an organized approach is required to optimize the accuracy of economic evaluation outcomes. Understanding health care cost collection and estimation techniques will improve both the critical appraisal and the development of future economic evaluations.

  10. Reflections on CD-ROM: Bridging the Gap between Technology and Purpose.

    ERIC Educational Resources Information Center

    Saviers, Shannon Smith

    1987-01-01

    Provides a technological overview of CD-ROM (Compact Disc-Read Only Memory), an optically-based medium for data storage offering large storage capacity, computer-based delivery system, read-only medium, and economic mass production. CD-ROM database attributes appropriate for information delivery are also reviewed, including large database size,…

  11. Proposal for Implementing Multi-User Database (MUD) Technology in an Academic Library.

    ERIC Educational Resources Information Center

    Filby, A. M. Iliana

    1996-01-01

    Explores the use of MOO (multi-user object oriented) virtual environments in academic libraries to enhance reference services. Highlights include the development of multi-user database (MUD) technology from gaming to non-recreational settings; programming issues; collaborative MOOs; MOOs as distinguished from other types of virtual reality; audio…

  12. Psychometric Properties of Patient-Facing eHealth Evaluation Measures: Systematic Review and Analysis.

    PubMed

    Wakefield, Bonnie J; Turvey, Carolyn L; Nazi, Kim M; Holman, John E; Hogan, Timothy P; Shimada, Stephanie L; Kennedy, Diana R

    2017-10-11

Significant resources are being invested into eHealth technology to improve health care, but few resources have focused on evaluating the impact of use on patient outcomes. A standardized set of metrics used across health systems and research will enable aggregation of data to inform improved implementation, clinical practice, and ultimately health outcomes associated with use of patient-facing eHealth technologies. The objective of this project was to conduct a systematic review to (1) identify existing instruments for eHealth research and implementation evaluation from the patient's point of view, (2) characterize measurement components, and (3) assess psychometrics. Concepts from existing models and published studies of technology use and adoption were identified and used to inform a search strategy. Search terms were broadly categorized as platforms (eg, email), measurement (eg, survey), function/information use (eg, self-management), health care occupations (eg, nurse), and eHealth/telemedicine (eg, mHealth). A computerized database search was conducted through June 2014. Included articles (1) described development of an instrument, or (2) used an instrument that could be traced back to its original publication, or (3) modified an instrument, and (4) had full text available in English, and (5) focused on the patient perspective on technology, including patient preferences and satisfaction, engagement with technology, usability, competency and fluency with technology, computer literacy, and trust in and acceptance of technology. The review was limited to instruments that reported at least one psychometric property. Excluded were investigator-developed measures, disease-specific assessments delivered via technology or telephone (eg, a cancer-coping measure delivered via computer survey), and measures focused primarily on clinician use (eg, the electronic health record). The search strategy yielded 47,320 articles.
Following elimination of duplicates and non-English language publications (n=14,550) and books (n=27), another 31,647 articles were excluded through review of titles. Following a review of the abstracts of the remaining 1096 articles, 68 were retained for full-text review. Of these, 16 described an instrument and six used an instrument; one instrument was drawn from the GEM database, resulting in 23 articles for inclusion. None included a complete psychometric evaluation. The most frequently assessed property was internal consistency (21/23, 91%). Testing for aspects of validity ranged from 48% (11/23) to 78% (18/23). Approximately half (13/23, 57%) reported how to score the instrument. Only six (26%) assessed the readability of the instrument for end users, although all the measures rely on self-report. Although most measures identified in this review were published after the year 2000, rapidly changing technology makes instrument development challenging. Platform-agnostic measures need to be developed that focus on concepts important for use of any type of eHealth innovation. At present, there are important gaps in the availability of psychometrically sound measures to evaluate eHealth technologies. ©Bonnie J Wakefield, Carolyn L Turvey, Kim M Nazi, John E Holman, Timothy P Hogan, Stephanie L Shimada, Diana R Kennedy. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 11.10.2017.

  13. Erbium Laser Technology vs Traditional Drilling for Caries Removal: A Systematic Review with Meta-Analysis.

    PubMed

    Tao, Siying; Li, Lan; Yuan, He; Tao, Sibei; Cheng, Yiming; He, Libang; Li, Jiyao

    2017-12-01

The study aimed to assess the efficacy of erbium laser technology compared with traditional drilling for caries removal. A systematic search was conducted through Medline (via PubMed), Embase, the Cochrane databases, and CNKI up to December 2016. Randomised controlled trials, quasi-randomised controlled trials, or controlled clinical trials with data comparing the efficacy of erbium laser technology versus traditional drilling for caries removal were included. Fourteen studies were included in our meta-analysis. Erbium laser technology showed an increased time when removing caries compared with drilling (mean difference: 3.48, 95% confidence interval: 1.90-5.06, P < .0001). However, erbium laser technology reduced the requirement for local anesthesia (risk ratio: 0.28, 95% confidence interval: 0.13-0.62, P = .002). Erbium laser technology was also not significantly different from traditional drilling with regard to restoration loss, pulpal vitality, and postoperative sensitivity. Erbium laser technology showed an increased time for cavity preparation compared with traditional drilling. However, erbium laser technology reduced the requirement for local anesthesia. There was no significant difference between erbium laser technology and traditional drilling regarding restoration loss, pulpal vitality, and postoperative sensitivity. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. USGS international activities in coal resources

    USGS Publications Warehouse

    ,

    1999-01-01

During the last 30 years the U.S. Geological Survey (USGS) has been engaged in coal exploration and characterization in more than 30 foreign countries, including India, Pakistan, China, Turkey, several Eastern European countries, Russia, and other former Soviet Union countries. Through this work, the USGS has developed an internationally recognized capability for assessing coal resources and defining their geochemical and physical characteristics. More recently, these data have been incorporated into digital databases and Geographic Information System (GIS) digital map products. The USGS has developed a high level of expertise in assessing the technological, economic, environmental, and human health impacts of coal occurrences and utilization, based on comprehensive characterization of representative coal samples.

  15. Academic medical center libraries on the Web.

    PubMed Central

    Tannery, N H; Wessel, C B

    1998-01-01

    Academic medical center libraries are moving towards publishing electronically, utilizing networked technologies, and creating digital libraries. The catalyst for this movement has been the Web. An analysis of academic medical center library Web pages was undertaken to assess the information created and communicated in early 1997. A summary of present uses and suggestions for future applications is provided. A method for evaluating and describing the content of library Web sites was designed. The evaluation included categorizing basic information such as description and access to library services, access to commercial databases, and use of interactive forms. The main goal of the evaluation was to assess original resources produced by these libraries. PMID:9803298

  16. A Knowledge Database on Thermal Control in Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Hirasawa, Shigeki; Satoh, Isao

A prototype version of a knowledge database on thermal control in manufacturing processes, specifically molding, semiconductor manufacturing, and micro-scale manufacturing, has been developed. The knowledge database has search functions for technical data, evaluated benchmark data, academic papers, and patents. The database also displays trends and future roadmaps for research topics, and it has quick-calculation functions for basic design. This paper summarizes present research topics and future research on thermal control in manufacturing engineering so as to collate the information into the knowledge database. In the molding process, the initial mold and melt temperatures are very important parameters. In addition, thermal control is related to many semiconductor processes, where the main parameter is temperature variation across wafers; accurate in-situ temperature measurement of wafers is important. Many technologies are also being developed to manufacture micro-structures. Accordingly, the knowledge database will help further advance these technologies.

  17. Old Dog New Tricks: Use of Point-based Crop Models in Grid-based Regional Assessment of Crop Management Technologies Impact on Future Food Security

    NASA Astrophysics Data System (ADS)

    Koo, J.; Wood, S.; Cenacchi, N.; Fisher, M.; Cox, C.

    2012-12-01

HarvestChoice (harvestchoice.org) generates knowledge products to guide strategic investments to improve the productivity and profitability of smallholder farming systems in sub-Saharan Africa (SSA). A key component of the HarvestChoice analytical framework is a grid-based overlay of SSA - a cropping simulation platform powered by process-based crop models. Calibrated against the best available representation of cropping production systems in SSA, the simulation platform couples the DSSAT Crop Systems Model with the CENTURY Soil Organic Matter model (DSSAT-CENTURY) and provides a virtual experimentation module with which to explore the impact of a range of technological, managerial and environmental factors on future crop productivity and profitability, as well as input use. Each 5 (or 30) arc-minute grid cell in SSA is underlain by a stack of model inputs: datasets that cover soil properties and fertility, historic and future climate scenarios, and farmers' management practices, all compiled from analyses of existing global and regional databases and consultations with other CGIAR centers. Running a simulation model is not always straightforward, especially when certain cropping systems or management practices are not yet practiced by resource-poor farmers (e.g., precision agriculture) or were never included in the existing simulation framework (e.g., water harvesting). In such cases, we used DSSAT-CENTURY as a function to iteratively estimate the relative responses of cropping systems to technology-driven changes in water and nutrient balances compared to zero adoption by farmers, while adjusting model input parameters to best mimic farmers' implementation of technologies in the field. We then fed the results of the simulation into the economic and food trade model framework, IMPACT, to assess the potential implications for future food security.
The outputs of the overall simulation analyses are packaged as a web-accessible database and published on the web with an interface that allows users to explore the simulation results in each country with user-defined baseline and what-if scenarios. The results are dynamically presented on maps, charts, and tables. This paper discusses the development of the simulation platform and its underlying data layers, a case study that assessed the role of potential crop management technology development, and the development of a web-based application that visualizes the simulation results.

  18. Worldwide nanotechnology development: a comparative study of USPTO, EPO, and JPO patents (1976-2004)

    NASA Astrophysics Data System (ADS)

    Li, Xin; Lin, Yiling; Chen, Hsinchun; Roco, Mihail C.

    2007-12-01

To assess worldwide development of nanotechnology, this paper compares the numbers and contents of nanotechnology patents in the United States Patent and Trademark Office (USPTO), European Patent Office (EPO), and Japan Patent Office (JPO). It uses the patent databases as indicators of nanotechnology trends via bibliographic analysis, content map analysis, and citation network analysis of nanotechnology patents per country, institution, and technology field. The numbers of nanotechnology patents published in the USPTO and EPO have continued to increase quasi-exponentially since 1980, while those published in the JPO stabilized after 1993. Institutions and individuals located in the same region as a repository's patent office contribute disproportionately to the nanotechnology patents published in that repository (a "home advantage" effect). The USPTO and EPO databases had similar high-productivity contributing countries and technology fields with large numbers of patents, but quite different high-impact countries and technology fields when ranked by the average number of citations received. Bibliographic analysis of USPTO and EPO patents shows that researchers in the United States and Japan published larger numbers of patents than other countries, and that their patents were more frequently cited by other patents. Nanotechnology patents covered physics research topics in all three repositories. In addition, the USPTO showed the broadest coverage of biomedical and electronics areas. The analysis of citations by technology field indicates that the USPTO had a clear pattern of knowledge diffusion from highly cited fields to less cited fields, while in the EPO knowledge exchange occurred mainly among highly cited fields.

  19. Integrating evidence-based teaching into clinical practice should improve outcomes.

    PubMed

    Richards, Derek

    2005-01-01

Sources used were Medline, Embase, the Education Resources Information Centre, Cochrane Controlled Trials Register, Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects, the Health Technology Assessment database, Best Evidence, Best Evidence Medical Education and Science Citation Index, along with the reference lists of known systematic reviews. Studies were chosen for inclusion if they evaluated the effects of postgraduate evidence-based medicine (EBM) or critical appraisal teaching in comparison with a control group or with a baseline before teaching, using a measure of participants' learning achievements or patients' health gains as outcomes. Articles were graded as either level 1 (randomised controlled trials (RCTs)) or level 2 (non-randomised studies that had either a comparison with a control group or a before-and-after comparison without a control group). Learning achievement was assessed separately for knowledge, critical appraisal skills, attitudes and behaviour. Because of obvious heterogeneity in the features of individual studies, their quality and the assessment tools used, a meta-analysis could not be carried out. Conclusions were weighted by methodological quality. Twenty-three relevant studies were identified, comprising four RCTs, seven non-RCTs, and 12 before-and-after comparison studies. Eighteen studies (including two RCTs) evaluated a standalone teaching method and five studies (including two RCTs) evaluated a clinically integrated teaching method. Standalone teaching improved knowledge but not skills, attitudes or behaviour. Clinically integrated teaching improved knowledge, skills, attitudes and behaviour. Teaching of EBM should be moved from classrooms to clinical practice to achieve improvements in substantial outcomes.

  20. Smartphone Apps for Schizophrenia: A Systematic Review

    PubMed Central

    2015-01-01

    Background There is increasing interest in using mobile technologies such as smartphones for improving the care of patients with schizophrenia. However, less is known about the current clinical evidence for the feasibility and effectiveness of smartphone apps in this population. Objective To review the published literature of smartphone apps applied for the care of patients with schizophrenia and other psychotic disorders. Methods An electronic database search of Ovid MEDLINE, the Cochrane Central Register of Controlled Trials, Health Technology Assessment Database, Allied and Complementary Medicine, Health and Psychosocial Instruments, PsycINFO, and Embase was conducted on May 24, 2015. All eligible studies were systematically reviewed, and proportional meta-analyses were applied to pooled data on recruitment, retention, and adherence to examine the overall feasibility of smartphone interventions for schizophrenia. Results Our search produced 226 results from which 7 eligible articles were identified, reporting on 5 studies of smartphone apps for patients with schizophrenia. All examined feasibility, and one assessed the preliminary efficacy of a smartphone intervention for schizophrenia. Study lengths varied between 6 and 130 days. Overall retention was 92% (95% CI 82-98%). Participants consistently used the smartphone apps on more than 85% of days during the study period, averaging 3.95 interactions per person per day. Furthermore, participants responded to 71.9% of automated prompts (95% CI 65.7-77.8%). Participants reported a range of potential benefits from the various interventions, and user experience was largely positive. Conclusions Although small, the current published literature demonstrates strong evidence for the feasibility of using smartphones to enhance the care of people with schizophrenia. High rates of engagement and satisfaction with a broad range of apps suggest the nascent potential of this mobile technology. 
However, there remains limited data on the efficacy of such interventions. PMID:26546039
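The proportional meta-analysis described above pools per-study retention proportions into one estimate with a confidence interval. A minimal sketch of one common approach, fixed-effect inverse-variance pooling on the logit scale; the study counts below are hypothetical, not from the reviewed trials:

```python
import math

def pooled_proportion(events_totals):
    """Fixed-effect inverse-variance pooling of proportions on the logit scale.

    events_totals: list of (events, total) tuples, one per study.
    Returns (pooled proportion, 95% CI low, 95% CI high).
    A 0.5 continuity correction is applied when a study has 0 or all events.
    """
    num = den = 0.0
    for e, n in events_totals:
        if e == 0 or e == n:               # continuity correction
            e, n = e + 0.5, n + 1.0
        logit = math.log(e / (n - e))      # log-odds of the proportion
        var = 1.0 / e + 1.0 / (n - e)      # approximate variance of the logit
        w = 1.0 / var                      # inverse-variance weight
        num += w * logit
        den += w
    pooled_logit = num / den
    se = math.sqrt(1.0 / den)
    inv = lambda x: 1.0 / (1.0 + math.exp(-x))   # back-transform to a proportion
    return (inv(pooled_logit),
            inv(pooled_logit - 1.96 * se),
            inv(pooled_logit + 1.96 * se))

# Hypothetical retention data (completers, enrolled) for five studies:
p, lo, hi = pooled_proportion([(11, 12), (20, 21), (17, 19), (30, 33), (8, 9)])
print(f"pooled retention {p:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A random-effects model (e.g. DerSimonian-Laird) would widen the interval when between-study heterogeneity is present.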

  1. On Noise Assessment for Blended Wing Body Aircraft

    NASA Technical Reports Server (NTRS)

    Guo, Yueping; Burley, Casey L; Thomas, Russell H.

    2014-01-01

    A system noise study is presented for the blended-wing-body (BWB) aircraft configured with advanced technologies that are projected to be available in the 2025 timeframe of the NASA N+2 definition. This system noise assessment shows that the noise levels of the baseline configuration, measured by the cumulative Effective Perceived Noise Level (EPNL), have a large margin of 34 dB to the aircraft noise regulation of Stage 4. This confirms the acoustic benefits of the BWB shielding of engine noise, as well as other projected noise reduction technologies, but the noise margins are less than previously published assessments and fall short of meeting the NASA N+2 noise goal. To establish the relevance of the acoustic assessment framework, the design of the BWB configuration, the technical approach of the noise analysis, and the databases and prediction tools used in the assessment are first described and discussed. The predicted noise levels and the component decomposition are then analyzed to identify the ranking order of importance of various noise components, revealing the prominence of airframe noise, which holds up the levels at all three noise certification locations and renders engine noise reduction technologies less effective. When projected airframe component noise reduction is added to the BWB configuration, it is shown that the cumulative noise margin to Stage 4 can reach 41.6 dB, nearly at the NASA goal. These results are compared with a previous NASA assessment that used a different study framework. The approaches that yield projections of such low noise levels are discussed, including aggressive assumptions on future technologies, assumptions on flight profile management, engine installation, and component noise reduction technologies. It is shown that reliable predictions of component noise also play an important role in the system noise assessment. 
The comparisons and discussions illustrate the importance of practical feasibilities and constraints in aircraft system noise studies, which include aerodynamic performance, propulsion efficiency, flight profile limitation and many other factors. For a future aircraft concept to achieve the NASA N+2 noise goal it will require a range of fully successful noise reduction technology developments.
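The cumulative EPNL margin quoted above is, by definition, the sum of the margins at the three noise certification points (lateral, flyover, approach), and Stage 4 requires a cumulative margin of at least 10 EPNdB relative to the Stage 3 limits. A sketch of the arithmetic, with illustrative per-point margins chosen only so the total matches the 34 dB figure; the actual component breakdown is in the paper:

```python
# Hypothetical per-certification-point EPNL margins (EPNdB) relative to the
# Stage 3 limits at the three measurement locations.
margins_to_stage3 = {"lateral": 15.0, "flyover": 16.0, "approach": 13.0}

# Stage 4 requires a *cumulative* margin of at least 10 EPNdB below Stage 3,
# so the cumulative margin to Stage 4 is the summed Stage 3 margin minus 10.
cum_margin_stage3 = sum(margins_to_stage3.values())
cum_margin_stage4 = cum_margin_stage3 - 10.0
print(f"cumulative margin to Stage 4: {cum_margin_stage4:.1f} EPNdB")
```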

  2. Analysis of 2015 Winter In-Flight Icing Case Studies with Ground-Based Remote Sensing Systems Compared to In-Situ SLW Sondes

    NASA Technical Reports Server (NTRS)

    Serke, David J.; King, Michael Christopher; Hansen, Reid; Reehorst, Andrew L.

    2016-01-01

    National Aeronautics and Space Administration (NASA) and the National Center for Atmospheric Research (NCAR) have developed an icing remote sensing technology that has demonstrated skill at detecting and classifying icing hazards in a vertical column above an instrumented ground station. This technology has recently been extended to provide volumetric coverage surrounding an airport. Building on the existing vertical pointing system, the new method for providing volumetric coverage utilizes a vertical pointing cloud radar, a multi-frequency microwave radiometer with azimuth and elevation pointing, and a NEXRAD radar. The new terminal area icing remote sensing system processes the data streams from these instruments to derive temperature, liquid water content, and cloud droplet size for each examined point in space. These data are then combined to ultimately provide icing hazard classification along defined approach paths into an airport. To date, statistical comparisons of the vertical profiling technology have been made to Pilot Reports and Icing Forecast Products. With the extension into relatively large area coverage and the output of microphysical properties in addition to icing severity, the use of these comparators is not appropriate and a more rigorous assessment is required. NASA conducted a field campaign during the early months of 2015 to develop a database to enable the assessment of the new terminal area icing remote sensing system and further refinement of terminal area icing weather information technologies in general. In addition to the ground-based remote sensors listed earlier, in-situ icing environment measurements by weather balloons were performed to produce a comprehensive comparison database. Balloon data gathered consisted of temperature, humidity, pressure, super-cooled liquid water content, and 3-D position with time. 
Comparison data plots of weather balloon and remote measurements, weather balloon flight paths, bulk comparisons of integrated liquid water content and icing cloud extent agreement, and terminal-area hazard displays are presented. Discussions of agreement quality and paths for future development are also included.
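The system described above fuses temperature, liquid water content, and droplet size into an icing hazard classification for each point in space. A toy rule-based sketch of that final classification step; the thresholds and category names here are assumptions for illustration, not the NASA/NCAR algorithm:

```python
def icing_severity(temp_c, lwc_g_m3, mvd_um):
    """Toy icing-hazard classifier from retrieved microphysics.

    temp_c:   air temperature (deg C)
    lwc_g_m3: liquid water content (g/m^3)
    mvd_um:   median volume droplet diameter (micrometres)
    All thresholds are illustrative assumptions, not operational values.
    """
    if temp_c >= 0 or lwc_g_m3 <= 0:
        return "none"        # no supercooled liquid water present
    if lwc_g_m3 > 0.5 or mvd_um > 40:
        return "heavy"       # high water loading or large supercooled drops
    if lwc_g_m3 > 0.2:
        return "moderate"
    return "light"

# Example: a cold cloud layer with moderate liquid water content.
print(icing_severity(-5.0, 0.25, 20))
```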

  3. Wenxin Keli for atrial fibrillation

    PubMed Central

    He, Zhuogen; Zheng, Minan; Xie, Pingchang; Wang, Yuanping; Yan, Xia; Deng, Dingwei

    2018-01-01

    Abstract Background: Atrial fibrillation (AF) is one of the most common cardiac arrhythmias in clinical practice. In China, Wenxin Keli (WXKL) therapy is a common treatment for AF, but its effects and safety remain uncertain. This protocol provides the methods that will be used to assess the effectiveness and safety of WXKL for the treatment of patients with AF. Methods: We will comprehensively search 4 English databases, EMBASE, the Cochrane Central Register of Controlled Trials (Cochrane Library), PubMed, and Medline, and 3 Chinese databases, China National Knowledge Infrastructure (CNKI), Chinese Biomedical Literature Database (CBM), and Chinese Science and Technology Periodical Database (VIP), in March 2018 for randomized controlled trials (RCTs) of WXKL for AF. Therapeutic effects according to sinus rhythm and p-wave dispersion (Pwd) will be accepted as the primary outcomes. We will use RevMan V.5.3 software for data synthesis when a meta-analysis is permitted. Results: This study will provide a high-quality synthesis of current evidence of WXKL for AF. Conclusion: The conclusion of our systematic review will provide evidence to judge whether WXKL is an effective intervention for patients with AF. PROSPERO registration number: CRD42018082045. PMID:29702984

  4. Using Technology to Deliver Mental Health Services to Children and Youth: A Scoping Review

    PubMed Central

    Boydell, Katherine M.; Hodgins, Michael; Pignatiello, Antonio; Teshima, John; Edwards, Helen; Willis, David

    2014-01-01

    Objective: To conduct a scoping review on the use of technology to deliver mental health services to children and youth in order to identify the breadth of peer-reviewed literature, summarize findings and identify gaps. Method: A literature database search identified 126 original studies meeting criteria for review. Descriptive numerical summary and thematic analyses were conducted. Two reviewers independently extracted data. Results: Studies were characterized by diverse technologies including videoconferencing, telephone and mobile phone applications and Internet-based applications such as email, web sites and CD-ROMs. Conclusion: The use of technologies plays a major role in the delivery of mental health services and supports to children and youth in providing prevention, assessment, diagnosis, counseling and treatment programs. Strategies are growing exponentially on a global basis, thus it is critical to study the impact of these technologies on child and youth mental health service delivery. An in-depth review and synthesis of the quality of findings of studies on effectiveness of the use of technologies in service delivery are also warranted. A full systematic review would provide that opportunity. PMID:24872824

  5. Using technology to deliver mental health services to children and youth: a scoping review.

    PubMed

    Boydell, Katherine M; Hodgins, Michael; Pignatiello, Antonio; Teshima, John; Edwards, Helen; Willis, David

    2014-05-01

    To conduct a scoping review on the use of technology to deliver mental health services to children and youth in order to identify the breadth of peer-reviewed literature, summarize findings and identify gaps. A literature database search identified 126 original studies meeting criteria for review. Descriptive numerical summary and thematic analyses were conducted. Two reviewers independently extracted data. Studies were characterized by diverse technologies including videoconferencing, telephone and mobile phone applications and Internet-based applications such as email, web sites and CD-ROMs. The use of technologies plays a major role in the delivery of mental health services and supports to children and youth in providing prevention, assessment, diagnosis, counseling and treatment programs. Strategies are growing exponentially on a global basis, thus it is critical to study the impact of these technologies on child and youth mental health service delivery. An in-depth review and synthesis of the quality of findings of studies on effectiveness of the use of technologies in service delivery are also warranted. A full systematic review would provide that opportunity.

  6. Using X-band Weather Radar Measurements to Monitor the Integrity of Digital Elevation Models for Synthetic Vision Systems

    NASA Technical Reports Server (NTRS)

    Young, Steve; UijtdeHaag, Maarten; Sayre, Jonathon

    2003-01-01

    Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data representing terrain, obstacles, and cultural features. As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. Further, updates to the databases may not be provided as changes occur. These issues limit the certification level and constrain the operational context of SVS for civil aviation. Previous work demonstrated the feasibility of using a real-time monitor to bound the integrity of Digital Elevation Models (DEMs) by using radar altimeter measurements during flight. This paper describes an extension of this concept to include X-band Weather Radar (WxR) measurements. This enables the monitor to detect additional classes of DEM errors and to reduce the exposure time associated with integrity threats. Feature extraction techniques are used along with a statistical assessment of similarity measures between the sensed and stored features that are detected. Recent flight testing in the area around the airport in Juneau, Alaska (JNU) has resulted in a comprehensive set of sensor data that is being used to assess the feasibility of the proposed monitor technology. Initial results of this assessment are presented.
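One simple similarity measure of the kind such a monitor could apply between a sensed terrain profile and the corresponding stored DEM profile is normalized cross-correlation; the elevation profiles and alert threshold below are hypothetical, not taken from the flight tests:

```python
import math

def normalized_cross_correlation(sensed, stored):
    """Similarity in [-1, 1] between two equal-length elevation profiles."""
    n = len(sensed)
    ms = sum(sensed) / n
    mt = sum(stored) / n
    num = sum((a - ms) * (b - mt) for a, b in zip(sensed, stored))
    den = math.sqrt(sum((a - ms) ** 2 for a in sensed)
                    * sum((b - mt) ** 2 for b in stored))
    return num / den if den else 0.0

# Hypothetical along-track elevation profiles (metres); a monitor would raise
# an integrity alert when similarity drops below a set threshold.
dem = [120, 135, 150, 170, 160, 140, 130]      # stored DEM profile
radar = [122, 133, 152, 168, 161, 142, 128]    # radar-sensed profile
similarity = normalized_cross_correlation(radar, dem)
ALERT_THRESHOLD = 0.9                          # assumed value, not from the paper
print("integrity OK" if similarity >= ALERT_THRESHOLD else "DEM integrity alert")
```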

  7. Assessment of Graph Databases as a Viable Materiel Solution for the Army’s Dynamic Force Structure (DFS) Portal Implementation

    DTIC Science & Technology

    2018-03-30

    information : Francisco L. Loaiza-Lemos, Project Leader floaiza@ida.org, 703-845-687 Margaret E. Myers, Director, Information Technology and Systems...analysis is aligned with the goals and objectives of the Department of Defense (DoD) as expressed in its Global Force Management Data Initiative...the previous phases of the analysis and how can they help inform the decision process for determining the optimal mix needed to implement the planned

  8. Collaborative Biomechanics Data Network (CBDN): Promoting Human Protection and Performance in Hazardous Environments Through Modeling and Data Mining of Human-Centric Data Bases

    DTIC Science & Technology

    2011-09-01

    unique data requirements. For an excellent primer on the CBDN process see Keller and Plaga [2010]. This report shows an example of a database, in...data [Buhrman, Plaga , Cheng, Mosher, 2001]. 2.1.2.2 Web application development All technologies are based upon and run on top of the .NET...builds). Better manikin design leads to better assessment of personnel equipment under acceleration [such as helmet systems, see Plaga and Boehmer

  9. Antimicrobial Resistance in the Environment.

    PubMed

    Waseem, Hassan; Williams, Maggie R; Stedtfeld, Robert D; Hashsham, Syed A

    2017-10-01

    This review summarizes selected publications of 2016 with emphasis on occurrence and treatment of antibiotic resistance genes and bacteria in the aquatic environment and wastewater and drinking water treatment plants. The review is conducted with emphasis on fate, modeling, risk assessment and data analysis methodologies for characterizing abundance. After providing a brief introduction, the review is divided into the following four sections: i) Occurrence of AMR in the Environment, ii) Treatment Technologies for AMR, iii) Modeling of Fate, Risk, and Environmental Impact of AMR, and iv) ARG Databases and Pipelines.

  10. An Examination of Selected Software Testing Tools: 1992

    DTIC Science & Technology

    1992-12-01

    Report ....................................................... 27-19 Figure 27-17. Metrics Manager Database Full Report...historical test database , the test management and problem reporting tools were examined using the sample test database provided by each supplier. 4-4...track the impact of new methods, organi- zational structures, and technologies. Metrics Manager is supported by an industry database that allows

  11. e-Addictology: An Overview of New Technologies for Assessing and Intervening in Addictive Behaviors.

    PubMed

    Ferreri, Florian; Bourla, Alexis; Mouchabac, Stephane; Karila, Laurent

    2018-01-01

    New technologies can profoundly change the way we understand psychiatric pathologies and addictive disorders. New concepts are emerging with the development of more accurate means of collecting live data, computerized questionnaires, and the use of passive data. Digital phenotyping, a paradigmatic example, refers to the use of computerized measurement tools to capture the characteristics of different psychiatric disorders. Similarly, machine learning, a form of artificial intelligence, can improve the classification of patients based on patterns that clinicians have not always considered in the past. Remote or automated interventions (web-based or smartphone-based apps), as well as virtual reality and neurofeedback, are already available or under development. These recent changes have the potential to disrupt practices, as well as practitioners' beliefs, ethics and representations, and may even call into question their professional culture. However, the impact of new technologies on health professionals' practice in addictive disorder care has yet to be determined. In the present paper, we therefore present an overview of new technology in the field of addiction medicine. Using the keywords [e-health], [m-health], [computer], [mobile], [smartphone], [wearable], [digital], [machine learning], [ecological momentary assessment], [biofeedback] and [virtual reality], we searched the PubMed database for the most representative articles in the field of assessment and interventions in substance use disorders. We screened 595 abstracts and analyzed 92 articles, dividing them into seven categories: e-health programs and web-based interventions, machine learning, computerized adaptive testing, wearable devices and digital phenotyping, ecological momentary assessment, biofeedback, and virtual reality. This overview shows that new technologies can improve assessment and interventions in the field of addictive disorders. 
The precise role of connected devices, artificial intelligence and remote monitoring remains to be defined. If they are to be used effectively, these tools must be explained and adapted to the different profiles of physicians and patients. The involvement of patients, caregivers and other health professionals is essential to their design and assessment.

  12. Wearable technology for spine movement assessment: A systematic review.

    PubMed

    Papi, Enrica; Koh, Woon Senn; McGregor, Alison H

    2017-11-07

    Continuous monitoring of spine movement function could enhance our understanding of low back pain development. Wearable technologies have gained popularity as a promising alternative to laboratory systems in allowing ambulatory movement analysis. This paper aims to review the state of the art in the current use of wearable technology to assess spine kinematics and kinetics. Four electronic databases and reference lists of relevant articles were searched to find studies employing wearable technologies to assess the spine in adults performing dynamic movements. Two reviewers independently identified relevant papers. Customised data extraction and quality appraisal forms were developed to extract key details and identify risk of biases of each study. Twenty-two articles were retrieved that met the inclusion criteria: 12 were deemed of medium quality (score 33.4-66.7%), and 10 of high quality (score >66.8%). The majority of articles (19/22) reported validation type studies. Only 6 reported data collection in real-life environments. Multiple sensor types were used: electrogoniometers (3/22), strain gauge based sensors (3/22), textile piezoresistive sensors (1/22) and accelerometers, often used with gyroscopes and magnetometers (15/22). Two sensor units were mainly used, and placement was commonly reported on the lumbar and sacral regions of the spine. The sensors were often wired to a data transmitter/logger, resulting in cumbersome systems. Outcomes were mostly reported relative to the lumbar segment and in the sagittal plane, including angles, range of motion, angular velocity, joint moments and forces. This review demonstrates the applicability of wearable technology to assess the spine, although this technique is still at an early stage of development. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
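For the accelerometer-based systems reviewed, a common building block is estimating sagittal-plane tilt from the gravity vector and taking the difference between two sensor units as a segment angle. A minimal sketch under quasi-static assumptions; the readings and sensor placement below are hypothetical:

```python
import math

def sagittal_tilt_deg(ax, ay, az):
    """Tilt of a body-worn accelerometer in the sagittal plane, in degrees,
    estimated from the gravity vector (valid only for quasi-static postures,
    where the accelerometer measures gravity rather than motion)."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

# Two sensor units, e.g. one over the lumbar spine and one over the sacrum
# (hypothetical static readings in g); lumbar flexion is their tilt difference.
upper = sagittal_tilt_deg(0.42, 0.05, 0.90)
lower = sagittal_tilt_deg(0.10, 0.02, 0.99)
lumbar_flexion = upper - lower
print(f"lumbar flexion = {lumbar_flexion:.1f} degrees")
```

During dynamic movement, this gravity-only estimate degrades, which is why the reviewed systems typically fuse accelerometers with gyroscopes and magnetometers.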

  13. Veterans Administration Databases

    Cancer.gov

    The Veterans Administration Information Resource Center provides database and informatics experts, customer service, expert advice, information products, and web technology to VA researchers and others.

  14. Directory of On-Line Networks, Databases and Bulletin Boards on Assistive Technology. Second Edition. RESNA Technical Assistance Project.

    ERIC Educational Resources Information Center

    RESNA: Association for the Advancement of Rehabilitation Technology, Washington, DC.

    This resource directory provides a selective listing of electronic networks, online databases, and bulletin boards that highlight technology-related services and products. For each resource, the following information is provided: name, address, and telephone number; description; target audience; hardware/software needs to access the system;…

  15. An Overview to Research on Education Technology Based on Constructivist Learning Approach

    ERIC Educational Resources Information Center

    Asiksoy, Gulsum; Ozdamli, Fezile

    2017-01-01

    The aim of this research is to determine the trends in educational technology research on the Constructivist Learning Approach published in the ScienceDirect database between 2010 and 2016. It also aims to guide researchers who will do studies in this field. After scanning the database, 81 articles published in ScienceDirect's database…

  16. Environmental impact assessment of european non-ferro mining industries through life-cycle assessment

    NASA Astrophysics Data System (ADS)

    Hisan Farjana, Shahjadi; Huda, Nazmul; Parvez Mahmud, M. A.

    2018-05-01

    European mining is a vast industrial sector that contributes substantially to the continent's economy and comprises both ferrous and non-ferrous metal and mineral industries. The non-ferrous metal extraction and processing industries warrant particular attention on sustainability grounds, as their manufacturing processes are highly energy intensive and have a global environmental impact. This paper analyses the major environmental effects of European metal industries using life-cycle impact analysis techniques. It is the first study to present a comparative environmental impact analysis of European non-ferrous metal industries, revealing their technological similarities and dissimilarities in order to assess their environmental loads. The life-cycle inventory datasets are taken from the EcoInvent database, and the analysis is performed with the CML baseline and ReCiPe endpoint methods using SimaPro software version 8.4. The CML and ReCiPe methods were chosen because they are impact assessment methods specialized for the European continent. The impact categories discussed here are human health, global warming and ecotoxicity. The results reveal that the gold industry imposes the greatest environmental burden due to waste emissions, with silver mining showing a similar but smaller effect, whereas copper, lead, manganese and zinc mining processes are comparatively environmentally benign in terms of metal extraction technologies and waste emissions.
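Midpoint methods such as CML compute each impact category as a weighted sum of inventory flows and characterization factors. A minimal sketch for the global warming category, using IPCC AR5 100-year GWP factors and a hypothetical inventory (the emission amounts are invented for illustration):

```python
# Midpoint characterization as used by CML-type LCIA methods:
# category indicator = sum(flow amount x characterization factor).
# 100-year GWP factors in kg CO2-eq per kg (IPCC AR5 values, without
# climate-carbon feedback).
GWP100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

def global_warming_potential(inventory_kg):
    """Global warming indicator (kg CO2-eq) for a life-cycle inventory,
    given as a mapping from greenhouse gas name to emitted mass in kg."""
    return sum(amount * GWP100[gas] for gas, amount in inventory_kg.items())

# Hypothetical emissions for producing 1 kg of refined metal:
gwp = global_warming_potential({"CO2": 12.0, "CH4": 0.03, "N2O": 0.001})
print(f"{gwp:.3f} kg CO2-eq per kg of metal")
```

Other categories (ecotoxicity, human toxicity) follow the same pattern with their own characterization factor tables.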

  17. Critical Needs for Robust and Reliable Database for Design and Manufacturing of Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Singh, M.

    1999-01-01

    Ceramic matrix composite (CMC) components are being designed, fabricated, and tested for a number of high temperature, high performance applications in aerospace and ground based systems. The critical need for and the role of reliable and robust databases for the design and manufacturing of ceramic matrix composites are presented. A number of issues related to engineering design, manufacturing technologies, joining, and attachment technologies are also discussed. Examples of various ongoing activities in the areas of composite databases, designing to codes and standards, and design for manufacturing are given.

  18. Development of the Tailored Rett Intervention and Assessment Longitudinal (TRIAL) database and the Rett Evaluation of Symptoms and Treatments (REST) Questionnaire.

    PubMed

    Santosh, Paramala; Lievesley, Kate; Fiori, Federico; Singh, Jatinder

    2017-06-21

    Rett syndrome (RTT) is a pervasive neurodevelopmental disorder that presents with deficits in brain functioning leading to language and learning regression, characteristic hand stereotypies and developmental delay. Different mutations in the gene implicated in RTT, methyl-CpG-binding protein 2 (MECP2), establish RTT as a disorder with divergent symptomatology, ranging from individuals with severe to milder phenotypes. A reliable, single multidimensional questionnaire is needed that can embrace all symptoms, and the relationships between them, and can map clinically meaningful data to symptomatology across the lifespan in patients with RTT. As part of the HealthTracker-based Tailored Rett Intervention and Assessment Longitudinal (TRIAL) database, the Rett Evaluation of Symptoms and Treatments (REST) Questionnaire will be combined with the physiological aspects of the disease obtained using wearable sensor technology, along with genetic and psychosocial data, to stratify patients. Taken together, the web-based TRIAL database will empower clinicians and researchers with the confidence to delineate between different aspects of disorder symptomatology to streamline care pathways for individuals or for those patients entering clinical trials. This protocol describes the anticipated development of the REST questionnaire and the TRIAL database, which links with the outcomes of the wearable sensor technology and will serve as a barometer for longitudinal patient monitoring in patients with RTT. The US Food and Drug Administration Guidance for Patient-Reported Outcome Measures will be used as a template to inform the methodology of the study. It will follow an iterative framework that will include item/concept identification, item/concept elicitation in parent/carer-mediated focus groups, expert clinician feedback, web-based presentation of questionnaires, initial scale development, instrument refinement and instrument validation. 
The study has received a favourable opinion from the National Health Service (NHS) Research Ethics Committee (REC): London - Bromley Research Ethics Committee (reference: 15/LO/1772). © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. Internet-based distributed collaborative environment for engineering education and design

    NASA Astrophysics Data System (ADS)

    Sun, Qiuli

    2001-07-01

    This research investigates the use of the Internet for engineering education, design, and analysis through the presentation of a Virtual City environment. The main focus of this research was to provide an infrastructure for engineering education, test the concept of distributed collaborative design and analysis, develop and implement the Virtual City environment, and assess the environment's effectiveness in the real world. A three-tier architecture was adopted in the development of the prototype, which contains an online database server, a Web server as well as multi-user servers, and client browsers. The environment is composed of five components: a 3D virtual world, multiple Internet-based multimedia modules, an online database, a collaborative geometric modeling module, and a collaborative analysis module. The environment was designed using multiple Internet-based technologies, such as Shockwave, Java, Java 3D, VRML, Perl, ASP, SQL, and a database. These various technologies together formed the basis of the environment and were programmed to communicate smoothly with each other. Three assessments were conducted over a period of three semesters. The Virtual City is open to the public at www.vcity.ou.edu. The online database was designed to manage the changeable data related to the environment. The virtual world was used to implement 3D visualization and tie the multimedia modules together. Students are allowed to build segments of the 3D virtual world upon completion of appropriate undergraduate courses in civil engineering. The end result is a complete virtual world that contains designs from all of their coursework and is viewable on the Internet. The environment is a content-rich educational system, which can be used to teach multiple engineering topics with the help of 3D visualization, animations, and simulations. The concept of collaborative design and analysis using the Internet was investigated and implemented. 
Geographically dispersed users can build the same geometric model simultaneously over the Internet and communicate with each other through a chat room. They can also conduct finite element analysis collaboratively on the same object over the Internet. They can mesh the same object, apply and edit the same boundary conditions and forces, obtain the same analysis results, and then discuss the results through the Internet.

  20. Initial experiences with building a health care infrastructure based on Java and object-oriented database technology.

    PubMed

    Dionisio, J D; Sinha, U; Dai, B; Johnson, D B; Taira, R K

    1999-01-01

    A multi-tiered telemedicine system based on Java and object-oriented database technology has yielded a number of practical insights and experiences on their effectiveness and suitability as implementation bases for a health care infrastructure. The advantages and drawbacks to their use, as seen within the context of the telemedicine system's development, are discussed. Overall, these technologies deliver on their early promise, with a few remaining issues that are due primarily to their relative newness.

  1. High-Performance Secure Database Access Technologies for HEP Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities, a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are ongoing in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, which states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc., for numerous applications." There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. 
We believe that an innovative database architecture in which secure authorization is pushed into the database engine will eliminate inefficient data transfer bottlenecks. Furthermore, traditionally separated database and security layers provide an extra vulnerability, leaving weak clear-text password authorization as the only protection on the database core systems. Due to the legacy limitations of the systems' security models, the allowed passwords often cannot even comply with the DOE password guideline requirements. We see an opportunity for the tight integration of the secure authorization layer with the database server engine, resulting in both improved performance and improved security. Phase I has focused on the development of a proof-of-concept prototype using Argonne National Laboratory's (ANL) Argonne Tandem-Linac Accelerator System (ATLAS) project as a test scenario. By developing a grid-security enabled version of the ATLAS project's current relational database solution, MySQL, PIOCON Technologies aims to offer a more efficient solution to secure database access.

  2. Life Cycle Assessment of Titania Perovskite Solar Cell Technology for Sustainable Design and Manufacturing.

    PubMed

    Zhang, Jingyi; Gao, Xianfeng; Deng, Yelin; Li, Bingbing; Yuan, Chris

    2015-11-01

    Perovskite solar cells have attracted enormous attention in recent years due to their low cost and superior technical performance. However, the use of toxic metals, such as lead, in the perovskite dye and toxic chemicals in perovskite solar cell manufacturing causes grave concerns for its environmental performance. To understand and facilitate the sustainable development of perovskite solar cell technology from its design to manufacturing, a comprehensive environmental impact assessment has been conducted on titanium dioxide nanotube based perovskite solar cells by using an attributional life cycle assessment approach, from cradle to gate, with manufacturing data from our laboratory-scale experiments and upstream data collected from professional databases and the literature. The results indicate that the perovskite dye is the primary source of environmental impact, associated with 64.77% of total embodied energy and 31.38% of embodied materials consumption, contributing to more than 50% of the life cycle impact in almost all impact categories, although lead used in the perovskite dye only contributes about 1.14% of the human toxicity potential. A comparison of perovskite solar cells with commercial silicon and cadmium telluride solar cells reveals that perovskite solar cells could be a promising alternative technology for future large-scale industrial applications. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. The data and system Nikkei Telecom "Industry/Technology Information Service"

    NASA Astrophysics Data System (ADS)

    Kurata, Shizuya; Sueyoshi, Yukio

    Nihon Keizai Shimbun began supplying the "Industry/Technology Information Service" in July 1989 as part of the Nikkei Telecom package, an online information service that uses personal computers as its terminals. Previously, Nikkei's database services mainly covered areas such as the economy, corporations and markets. The new "Industry/Technology Information Service" (whose main data cover industry-by-industry, semi-macro information) is attracting a good deal of attention as Nikkei's first science- and technology-related database. It is also notable technically, as it provides gateway access to JOIS, a first-class science and technology file in Japan. This report briefly introduces the data and system of the "Industry/Technology Information Service".

  4. The use of artificial intelligence technology to predict lymph node spread in men with clinically localized prostate carcinoma.

    PubMed

    Crawford, E D; Batuello, J T; Snow, P; Gamito, E J; McLeod, D G; Partin, A W; Stone, N; Montie, J; Stock, R; Lynch, J; Brandt, J

    2000-05-01

    The current study assesses artificial intelligence methods to identify prostate carcinoma patients at low risk for lymph node spread. If patients can be assigned accurately to a low risk group, unnecessary lymph node dissections can be avoided, thereby reducing morbidity and costs. A rule-derivation technology for simple decision-tree analysis was trained and validated using patient data from a large database (4,133 patients) to derive low risk cutoff values for Gleason sum and prostate specific antigen (PSA) level. An empiric analysis was used to derive a low risk cutoff value for clinical TNM stage. These cutoff values then were applied to 2 additional, smaller databases (227 and 330 patients, respectively) from separate institutions. The decision-tree protocol derived cutoff values of < or = 6 for Gleason sum and < or = 10.6 ng/mL for PSA. The empiric analysis yielded a clinical TNM stage low risk cutoff value of < or = T2a. When these cutoff values were applied to the larger database, 44% of patients were classified as being at low risk for lymph node metastases (0.8% false-negative rate). When the same cutoff values were applied to the smaller databases, between 11 and 43% of patients were classified as low risk with a false-negative rate of between 0.0 and 0.7%. The results of the current study indicate that a population of prostate carcinoma patients at low risk for lymph node metastases can be identified accurately using a simple decision algorithm that considers preoperative PSA, Gleason sum, and clinical TNM stage. The risk of lymph node metastases in these patients is < or = 1%; therefore, pelvic lymph node dissection may be avoided safely. The implications of these findings in surgical and nonsurgical treatment are significant.
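    The abstract's decision rule (all three cutoffs must be met) can be sketched directly in code. The cutoff values below are those reported in the study; the function name and the set of clinical stages at or below T2a are illustrative assumptions, not from the source.

```python
# Low-risk rule from the abstract: Gleason sum <= 6, PSA <= 10.6 ng/mL,
# clinical TNM stage <= T2a. The stage encoding below is an assumption.
LOW_RISK_STAGES = {"T1a", "T1b", "T1c", "T2a"}

def low_risk_for_lymph_node_spread(gleason_sum: int, psa_ng_ml: float, stage: str) -> bool:
    """Return True only if all three low-risk cutoffs are satisfied."""
    return gleason_sum <= 6 and psa_ng_ml <= 10.6 and stage in LOW_RISK_STAGES

print(low_risk_for_lymph_node_spread(6, 8.4, "T2a"))  # meets all three cutoffs
print(low_risk_for_lymph_node_spread(7, 8.4, "T1c"))  # Gleason sum exceeds 6
```

    In the study, patients classified low risk by this conjunction had a false-negative rate of at most 0.8%, motivating the suggestion that pelvic lymph node dissection may be safely avoided for them.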

  5. Using non-local databases for the environmental assessment of industrial activities: The case of Latin America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osses de Eicker, Margarita, E-mail: Margarita.Osses@empa.c; Hischier, Roland, E-mail: Roland.Hischier@empa.c; Hurni, Hans, E-mail: Hans.Hurni@cde.unibe.c

    2010-04-15

    Nine non-local databases were evaluated with respect to their suitability for the environmental assessment of industrial activities in Latin America. Three assessment methods were considered, namely Life Cycle Assessment (LCA), Environmental Impact Assessment (EIA) and air emission inventories. The analysis focused on data availability in the databases and the applicability of their international data to Latin American industry. The study showed that the European EMEP/EEA Guidebook and the U.S. EPA AP-42 database are the most suitable ones for air emission inventories, whereas the LCI database Ecoinvent is the most suitable one for LCA and EIA. Due to the data coverage in the databases, air emission inventories are easier to develop than LCA or EIA, which require more comprehensive information. One strategy to overcome the limitations of non-local databases for Latin American industry is the combination of validated data from international databases with newly developed local datasets.

  6. Assessment of IT solutions used in the Hungarian income tax microsimulation system

    NASA Astrophysics Data System (ADS)

    Molnar, I.; Hardhienata, S.

    2017-01-01

    This paper focuses on the use of information technology (IT) in diverse microsimulation studies and presents state-of-the-art solutions in the traditional application field of personal income tax simulation. The aim of the paper is to promote solutions that can improve the efficiency and quality of microsimulation model implementation, assess their applicability, and help shift attention from microsimulation model implementation and data analysis towards experiment design and model use. First, the authors briefly discuss the relevant characteristics of the microsimulation application field and the managerial decision-making problem. After examining the salient problems, advanced IT solutions, such as meta-databases and service-oriented architecture, are presented. The authors show how selected technologies can be applied to support data-driven, behavior-driven and even agent-based personal income tax microsimulation model development. Finally, examples are presented and references made to the Hungarian Income Tax Simulator (HITS) models and their results. The paper concludes with a summary of the IT assessment and application-related author remarks dedicated to an Indonesian income tax microsimulation model.

  7. Economic evidence for the prevention and treatment of atopic eczema: a protocol for a systematic review.

    PubMed

    Sach, Tracey Helen; McManus, Emma; Mcmonagle, Christopher; Levell, Nick

    2016-05-27

    Eczema, synonymous with atopic eczema or atopic dermatitis, is a chronic skin disease that has a similar impact on health-related quality of life as other chronic diseases. The proposed research aims to provide a comprehensive systematic assessment of the economic evidence base available to inform economic modelling and decision making on interventions to prevent and treat eczema at any stage of the life course. Whilst the Global Resource of Eczema Trials (GREAT) database collects together the effectiveness evidence for eczema, there is currently no such systematic resource on the economics of eczema. It is important to gain an overview of the current state of the art of economic methods in the field of eczema in order to strengthen the economic evidence base further. The proposed study is a systematic review of the economic evidence surrounding interventions for the prevention and treatment of eczema. Relevant search terms will be used to search MEDLINE, EMBASE, Database of Abstracts of Reviews of Effects, Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, National Health Service (NHS) Economic Evaluation Database, Health Technology Assessment, Cumulative Index to Nursing and Allied Health Literature, EconLit, Scopus, Cost-Effectiveness Analysis Registry and Web of Science in order to identify relevant evidence. To be eligible for inclusion studies will be primary empirical studies evaluating the cost, utility or full economic evaluation of interventions for preventing or treating eczema. Two reviewers will independently assess studies for eligibility and perform data abstraction. Evidence tables will be produced presenting details of study characteristics, costing methods, outcome methods and quality assessment. The methodological quality of studies will be assessed using accepted checklists. 
The systematic review is being undertaken to identify the type of economic evidence available, summarise the results of the available economic evidence and critically appraise the quality of economic evidence currently available to inform future economic modelling and resource allocation decisions about interventions to prevent or treat eczema. We aim to use the review to offer guidance about how to gather economic evidence in studies of eczema and/or what further research is necessary in order to inform this. PROSPERO CRD42015024633.

  8. Reactome graph database: Efficient access to complex pathway data

    PubMed Central

    Korninger, Florian; Viteri, Guilherme; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D’Eustachio, Peter

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types. PMID:29377902

  9. Reactome graph database: Efficient access to complex pathway data.

    PubMed

    Fabregat, Antonio; Korninger, Florian; Viteri, Guilherme; Sidiropoulos, Konstantinos; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D'Eustachio, Peter; Hermjakob, Henning

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types.
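    The efficiency argument in both Reactome records (entries 8 and 9) is that a relational store must re-run join-style scans for every traversal, whereas a graph store follows direct node-to-node references. The toy sketch below illustrates that contrast in plain Python; the pathway and reaction names are invented for illustration and are not Reactome's actual data model, which is queried via Cypher in Neo4j.

```python
# "Relational" storage: flat tables linked by foreign keys. Each lookup
# scans the reactions table and matches keys (a join-style operation).
pathways = {1: "Signal Transduction", 2: "Metabolism"}
reactions = {10: (1, "Receptor binding"), 11: (1, "Kinase cascade"), 12: (2, "Glycolysis step")}

def reactions_in_pathway_relational(pathway_id):
    return [name for _, (pid, name) in reactions.items() if pid == pathway_id]

# "Graph" storage: each node holds direct references to its neighbours,
# so traversal follows pointers instead of re-running joins.
graph = {
    "Signal Transduction": ["Receptor binding", "Kinase cascade"],
    "Metabolism": ["Glycolysis step"],
}

def reactions_in_pathway_graph(pathway_name):
    return graph[pathway_name]

print(reactions_in_pathway_relational(1))
print(reactions_in_pathway_graph("Signal Transduction"))
```

    For deeply interconnected data such as nested pathways, the join cost compounds at every hop, which is consistent with the 93% reduction in average query time the authors report after moving to Neo4j.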

  10. Empowering radiologic education on the Internet: a new virtual website technology for hosting interactive educational content on the World Wide Web.

    PubMed

    Frank, M S; Dreyer, K

    2001-06-01

    We describe a virtual web site hosting technology that enables educators in radiology to emblazon and make available for delivery on the world wide web their own interactive educational content, free from dependencies on in-house resources and policies. This suite of technologies includes a graphically oriented software application, designed for the computer novice, to facilitate the input, storage, and management of domain expertise within a database system. The database stores this expertise as choreographed and interlinked multimedia entities including text, imagery, interactive questions, and audio. Case-based presentations or thematic lectures can be authored locally, previewed locally within a web browser, then uploaded at will as packaged knowledge objects to an educator's (or department's) personal web site housed within a virtual server architecture. This architecture can host an unlimited number of unique educational web sites for individuals or departments in need of such service. Each virtual site's content is stored within that site's protected back-end database connected to Internet Information Server (Microsoft Corp, Redmond WA) using a suite of Active Server Page (ASP) modules that incorporate Microsoft's Active Data Objects (ADO) technology. Each person's or department's electronic teaching material appears as an independent web site with different levels of access--controlled by a username-password strategy--for teachers and students. There is essentially no static hypertext markup language (HTML). Rather, all pages displayed for a given site are rendered dynamically from case-based or thematic content that is fetched from that virtual site's database. The dynamically rendered HTML is displayed within a web browser in a Socratic fashion that can assess the recipient's current fund of knowledge while providing instantaneous user-specific feedback. Each site is emblazoned with the logo and identification of the participating institution. 
Individuals with teacher-level access can use a web browser to upload new content as well as manage content already stored on their virtual site. Each virtual site stores, collates, and scores participants' responses to the interactive questions posed online. This virtual web site strategy empowers the educator with an end-to-end solution for creating interactive educational content and hosting that content within the educator's personalized and protected educational site on the world wide web, thus providing a valuable outlet that can magnify the impact of his or her talents and contributions.

  11. Rhinoplasty perioperative database using a personal digital assistant.

    PubMed

    Kotler, Howard S

    2004-01-01

    To construct a reliable, accurate, and easy-to-use handheld computer database that facilitates the point-of-care acquisition of perioperative text and image data specific to rhinoplasty. A user-modified database (Pendragon Forms [v.3.2]; Pendragon Software Corporation, Libertyville, Ill) and graphic image program (Tealpaint [v.4.87]; Tealpaint Software, San Rafael, Calif) were used to capture text and image data, respectively, on a Palm OS (v.4.11) handheld operating with 8 megabytes of memory. The handheld and desktop databases were maintained secure using PDASecure (v.2.0) and GoldSecure (v.3.0) (Trust Digital LLC, Fairfax, Va). The handheld data were then uploaded to a desktop database of either FileMaker Pro 5.0 (v.1) (FileMaker Inc, Santa Clara, Calif) or Microsoft Access 2000 (Microsoft Corp, Redmond, Wash). Patient data were collected from 15 patients undergoing rhinoplasty in a private practice outpatient ambulatory setting. Data integrity was assessed after 6 months' disk and hard drive storage. The handheld database was able to facilitate data collection and accurately record, transfer, and reliably maintain perioperative rhinoplasty data. Query capability allowed rapid search using a multitude of keyword search terms specific to the operative maneuvers performed in rhinoplasty. Handheld computer technology provides a method of reliably recording and storing perioperative rhinoplasty information. The handheld computer facilitates the reliable and accurate storage and query of perioperative data, assisting the retrospective review of one's own results and enhancement of surgical skills.

  12. Technology in Science and Mathematics Education.

    ERIC Educational Resources Information Center

    Buccino, Alphonse

    Provided are several perspectives on technology, addressing changes in learners related to technology, changes in contemporary life related to technology, and changes in subject areas related to technology (indicating that technology has created such new tools for inquiry as computer programming, word processing, online database searches, and…

  13. [Health technology assessment in Ecuador's ministry of public health as a tool for drug purchasing from 2012 to 2015].

    PubMed

    Armijos, Luciana; Escalante, Santiago; Villacrés, Tatiana

    2017-06-08

    To learn how the Ministry of Public Health (MSP, the Spanish acronym) of Ecuador uses health technology assessment (HTA) in decision-making on the purchase of drugs that are not on the National List of Essential Medicines (NLEM). Information from databases of the Health Intelligence Directorate (DIS, the Spanish acronym) and the National Directorate of Drugs and Medical Devices (DNMDM, the Spanish acronym) was used to compare decisions made by both entities and to learn about the use and consistency of HTA reports in decisions on purchasing drugs not included in the NLEM. From 2012 to 2015, 227 reports were issued, of which 87 covered drugs; 36, devices; 29, medical procedures; 34, health programs; and 41, other medical technologies. The DNMDM requested 59 of the reports on drugs. There was 80% agreement in decisions made by the two directorates that participate in the process. The MSP, through the DIS, began using HTA in 2012. Given that the majority of reports evaluate drugs, it is essential that reports be prepared for other types of medical technologies and that they be prepared and used as widely as possible. Despite a high level of agreement in decisions, it is important to continue to improve the reports' scope and quality, and to monitor the adoption and dissemination of authorized and funded technologies to learn the effectiveness and impact of HTA in Ecuador.

  14. Marine Hydrokinetic Resource Assessment for Domestic Army, Air Force, and Coast Guard Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robichaud, Robi J; Ingram, Michael

    NREL/DOE undertook a study for the US Army, Coast Guard and Air Force to investigate the potential for marine hydrokinetic (MHK) devices to meet the energy load at coastal bases in the future as MHK technology evolves. A wide range of data from tidal, wave, environmental, shipping and other databases was used to screen the DOD bases. A series of scoring algorithms was developed to facilitate site review, leading to an eventual down-selection for more detailed, site-specific bathymetric tidal resource evaluation. The Army's Camp Edwards, MA and the Coast Guard's Training Center Cape May, NJ (TRACEN Cape May) were selected, and the Georgia Institute of Technology performed the analyses. An NREL/DOE MHK team visited the bases to discuss further with base personnel the potential for MHK technology to provide power to the bases in the future, and to frame the potential impact on existing power systems.
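    The record does not describe the scoring algorithms themselves. The sketch below shows one common weighted-criteria approach to screening candidate sites; the criteria, weights and raw scores are invented for illustration and are not taken from the NREL/DOE study.

```python
# Hypothetical weighted-criteria site screening. Each criterion score is
# normalized to 0..1; weights reflect assumed relative importance.
WEIGHTS = {"tidal_resource": 0.4, "wave_resource": 0.3,
           "proximity_to_load": 0.2, "shipping_conflict": 0.1}

def site_score(raw_scores):
    """Weighted sum of normalized criterion scores."""
    return sum(WEIGHTS[c] * raw_scores[c] for c in WEIGHTS)

candidates = {
    "Site A": {"tidal_resource": 0.9, "wave_resource": 0.2,
               "proximity_to_load": 0.8, "shipping_conflict": 0.5},
    "Site B": {"tidal_resource": 0.4, "wave_resource": 0.7,
               "proximity_to_load": 0.5, "shipping_conflict": 0.9},
}

# Rank candidates for down-selection, highest score first.
ranked = sorted(candidates, key=lambda s: site_score(candidates[s]), reverse=True)
print(ranked)
```

    The top-ranked sites would then proceed to the detailed, site-specific bathymetric evaluation described in the record.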

  15. Development Status of the Advanced Life Support On-Line Project Information System

    NASA Technical Reports Server (NTRS)

    Levri, Julie A.; Hogan, John A.; Cavazzoni, Jim; Brodbeck, Christina; Morrow, Rich; Ho, Michael; Kaehms, Bob; Whitaker, Dawn R.

    2005-01-01

    The Advanced Life Support Program has recently accelerated an effort to develop an On-line Project Information System (OPIS) for research project and technology development data centralization and sharing. The core functionality of OPIS will launch in October of 2005. This paper presents the current OPIS development status. OPIS core functionality involves a Web-based annual solicitation of project and technology data directly from ALS Principal Investigators (PIs) through customized data collection forms. Data provided by PIs will be reviewed by a Technical Task Monitor (TTM) before the information is posted to OPIS for ALS Community viewing via the Web. The data will be stored in an object-oriented relational database (created in MySQL(R)) located on a secure server at NASA ARC. Upon launch, OPIS can be utilized by Managers to identify research and technology development gaps and to assess task performance. Analysts can employ OPIS to obtain.

  16. Quality of life in children with adverse drug reactions: a narrative and systematic review.

    PubMed

    Del Pozzo-Magaña, Blanca R; Rieder, Michael J; Lazo-Langner, Alejandro

    2015-10-01

    Adverse drug reactions are a common problem affecting adults and children. The economic impact of adverse drug reactions has been widely evaluated; however, studies of the impact on the quality of life of children with adverse drug reactions are scarce. The aim was to evaluate studies assessing the health-related quality of life of children with adverse drug reactions. We conducted a systematic review that included the following electronic databases: MEDLINE, EMBASE and the Cochrane Library (including the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects, the Cochrane Controlled Trials Register and the Health Technology Assessment Databases). Nine studies were included. Four of the studies were conducted in children with epilepsy; the rest of them involved children with chronic viral hepatitis, Crohn's disease, paediatric cancer and multiple adverse drug reactions compared with healthy children. Based on their findings, authors of all studies concluded that adverse drug reactions had a negative impact on the quality of life of children. No meta-analysis was conducted given the heterogeneous nature of the studies. To date, there is no specific instrument that measures quality of life of children with adverse drug reactions, and the information available is poor and variable. In general, adverse drug reactions have a negative impact on the quality of life of affected children. For those interested in this area, more work needs to be done to improve tools that help to evaluate efficiently the health-related quality of life of children with adverse drug reactions and chronic diseases. © 2014 The British Pharmacological Society.

  17. Quality of life in children with adverse drug reactions: a narrative and systematic review

    PubMed Central

    Del Pozzo-Magaña, Blanca R; Rieder, Michael J; Lazo-Langner, Alejandro

    2015-01-01

    Aims Adverse drug reactions are a common problem affecting adults and children. The economic impact of adverse drug reactions has been widely evaluated; however, studies of the impact on the quality of life of children with adverse drug reactions are scarce. The aim was to evaluate studies assessing the health-related quality of life of children with adverse drug reactions. Methods We conducted a systematic review that included the following electronic databases: MEDLINE, EMBASE and the Cochrane Library (including the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects, the Cochrane Controlled Trials Register and the Health Technology Assessment Databases). Results Nine studies were included. Four of the studies were conducted in children with epilepsy; the rest of them involved children with chronic viral hepatitis, Crohn's disease, paediatric cancer and multiple adverse drug reactions compared with healthy children. Based on their findings, authors of all studies concluded that adverse drug reactions had a negative impact on the quality of life of children. No meta-analysis was conducted given the heterogeneous nature of the studies. Conclusions To date, there is no specific instrument that measures quality of life of children with adverse drug reactions, and the information available is poor and variable. In general, adverse drug reactions have a negative impact on the quality of life of affected children. For those interested in this area, more work needs to be done to improve tools that help to evaluate efficiently the health-related quality of life of children with adverse drug reactions and chronic diseases. PMID:24833305

  18. Investigation of an artificial intelligence technology--Model trees. Novel applications for an immediate release tablet formulation database.

    PubMed

    Shao, Q; Rowe, R C; York, P

    2007-06-01

    This study has investigated an artificial intelligence technology, model trees, as a modelling tool applied to an immediate release tablet formulation database. The modelling performance was compared with artificial neural networks, which are well established and widely applied in pharmaceutical product formulation. The predictability of the generated models was validated on unseen data and judged by the coefficient of determination R^2. Output from the model tree analyses produced multivariate linear equations which predicted tablet tensile strength, disintegration time, and drug dissolution profiles of similar quality to neural network models. However, additional and valuable knowledge hidden in the formulation database was extracted from these equations. It is concluded that, as a transparent technology, model trees are useful tools for formulators.
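    A model tree partitions the input space like a decision tree but fits a linear equation in each leaf, which is why its output is a readable set of multivariate linear equations rather than an opaque network. The minimal sketch below illustrates the structure; the split point, feature names and coefficients are invented for illustration, not taken from the study.

```python
# Minimal model tree: one split on compression force, a linear model per leaf.
# All numbers and feature names are invented for illustration.

def leaf_low(x):
    # Linear equation applied when compression force is low.
    return 0.5 * x["binder_pct"] + 0.1 * x["force_kN"]

def leaf_high(x):
    # Linear equation applied when compression force is high.
    return 0.2 * x["binder_pct"] + 0.3 * x["force_kN"] + 1.0

def predict_tensile_strength(x):
    # The tree routes each formulation to a leaf, then evaluates
    # that leaf's transparent linear equation.
    return leaf_low(x) if x["force_kN"] <= 10 else leaf_high(x)

print(predict_tensile_strength({"binder_pct": 2.0, "force_kN": 8.0}))
print(predict_tensile_strength({"binder_pct": 2.0, "force_kN": 15.0}))
```

    Because each leaf is a plain linear equation, a formulator can read off how each ingredient or process variable contributes within a region of the formulation space, which is the transparency the authors highlight.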

  19. Data logger technologies for manual wheelchairs: A scoping review.

    PubMed

    Routhier, François; Lettre, Josiane; Miller, William C; Borisoff, Jaimie F; Keetch, Kate; Mitchell, Ian M; Research Team, CanWheel

    2018-01-01

    In recent years, studies have increasingly employed data logger technologies to record objective driving and physiological characteristics of manual wheelchair users. However, the technologies used offer significant differences in characteristics, such as measured outcomes, ease of use, and level of burden. In order to identify and describe the extent of published research activity that relies on data logger technologies for manual wheelchair users, we performed a scoping review of the scientific and gray literature. Five databases were searched: Medline, Compendex, CINAHL, EMBASE, and Google Scholar. The 119 retained papers document a wide variety of logging devices and sensing technologies measuring a range of outcomes. The most commonly used technologies were accelerometers installed on the user (18.8%), odometers installed on the wheelchair (12.4%), accelerometers installed on the wheelchair (9.7%), and heart monitors (9.7%). Not surprisingly, the most reported outcomes were distance, mobility events, heart rate, speed/velocity, acceleration, and driving time. With decreasing costs and technological improvements, data loggers are likely to have future widespread clinical (and even personal) use. Future research may be needed to assess the usefulness of different outcomes and to develop methods more appropriate to wheelchair users in order to optimize the practicality of wheelchair data loggers.
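    As a simple example of how such loggers derive the most commonly reported outcomes (distance, speed, driving time), the sketch below converts wheel-revolution counts from a hypothetical odometer into distance and mean speed. The wheel diameter, epoch length and counts are invented for illustration.

```python
import math

# Hypothetical odometer log: wheel revolutions per 60-second epoch.
WHEEL_DIAMETER_M = 0.6            # assumed rear-wheel diameter
revs_per_epoch = [30, 45, 0, 20]  # invented data

circumference_m = math.pi * WHEEL_DIAMETER_M
total_m = sum(revs_per_epoch) * circumference_m          # distance travelled
active_epochs = sum(1 for r in revs_per_epoch if r > 0)  # epochs with movement
avg_speed_mps = total_m / (len(revs_per_epoch) * 60)     # mean speed over the log

print(round(total_m, 1), active_epochs, round(avg_speed_mps, 2))
```

    Accelerometer-based loggers require more processing (e.g. integrating or thresholding acceleration signals) but follow the same pattern of reducing raw sensor counts to clinically interpretable mobility outcomes.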

  20. Mental health economics: insights from Brazil.

    PubMed

    Cruz, Luciane; Lima, Ana Flavia Da Silva; Graeff-Martins, Ana; Maia, Carlos Renato Moreira; Ziegelmann, Patricia; Miguel, Sandoro; Fleck, Marcelo; Polanczyk, Carisi

    2013-04-01

    As the responsibility and demand on health care grow and resources do not increase at the same pace, the healthcare system has been forced to reconsider the benefits and costs of its actions to ensure a rational and effective decision-making process regarding the adoption of interventions and the allocation of resources. Cost-effectiveness (CE) studies represent one of the basic tools to achieve this goal. To present the current state of Health Technology Assessment (HTA) and health economics in mental health in Brazil and their importance to the decision-making process. Descriptive paper on HTA and health economics in Brazil. Databases from government and universities, as well as some scientific databases used to assess the information, are presented. Economic analysis to evaluate interventions in mental health care is a relatively recent addition to the field of health economics; in Brazil, it is also considered a topic within the epidemiology research area. There has been an increased number of studies developed in high-income countries; however, there are fewer CE studies in low- and middle-income ones. Psychiatric disorders represent a significant burden in developing countries, where resources devoted to health care are even scarcer.

  1. ENVI-PV: An Interactive Web Client for Multi-Criteria Life Cycle Assessment of Photovoltaic Systems Worldwide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez-Lopez, Paula; Gschwind, Benoit; Blanc, Philippe

    Solar photovoltaics (PV) is the second largest source of new capacity among renewable energies. The worldwide capacity reached 135 GW in 2013 and is estimated to increase to 1721 GW in 2030 and 4674 GW in 2050, according to a prospective high-renewable scenario. To achieve this production level while minimizing environmental impacts, decision makers must have access to environmental performance data that accurately reflect their high spatial variability. We propose ENVI-PV (http://viewer.webservice-energy.org/project_iea), a new interactive tool that provides maps and screening-level data, based on weighted average supply chains, for the environmental performance of common PV technologies. Environmental impacts of PV systems are evaluated according to a life cycle assessment approach. ENVI-PV was developed using a state-of-the-art, interoperable and open-standard Web Service framework from the Open Geospatial Consortium (OGC). It combines the latest life cycle inventories, published in 2015 by the International Energy Agency (IEA) under the Photovoltaic Power Systems Program (PVPS) Task 12, and some inventories previously published in the Ecoinvent v2.2 database, with solar irradiation estimates computed from the worldwide NASA SSE database. ENVI-PV is the first tool to offer worldwide coverage of the environmental performance of PV systems using a multi-criteria assessment. The user can compare PV environmental performance to the environmental footprint of country electricity mixes. ENVI-PV is designed as an interactive environmental tool to generate PV technological options and evaluate their performance in different spatial and techno-economic contexts. Its potential applications are illustrated in this paper with several examples.

  2. Geological and technological assessment of artificial reef sites, Louisiana outer continental shelf

    USGS Publications Warehouse

    Pope, D.L.; Moslow, T.F.; Wagner, J.B.

    1993-01-01

    This paper describes the general procedures used to select sites for obsolete oil and gas platforms as artificial reefs on the Louisiana outer continental shelf (OCS). The methods employed incorporate six basic steps designed to resolve multiple-use conflicts that might otherwise arise with daily industry and commercial fishery operations, and to identify and assess both geological and technological constraints that could affect placement of the structures. These steps include: (1) exclusion mapping; (2) establishment of artificial reef planning areas; (3) database compilation; (4) assessment and interpretation of the database; (5) mapping of geological and man-made features within each proposed reef site; and (6) site selection. Nautical charts, bathymetric maps, and offshore oil and gas maps were used for exclusion mapping, and to select nine regional planning areas. Pipeline maps were acquired from federal agencies and private industry to determine their general locations within each planning area, and to establish exclusion fairways along each pipeline route. Approximately 1600 line kilometers of high-resolution geophysical data collected by federal agencies and private industry were acquired for the nine planning areas. These data were interpreted to determine the nature and extent of near-surface geologic features that could affect placement of the structures. Seismic reflection patterns were also characterized to evaluate near-bottom sedimentation processes in the vicinity of each reef site. Geotechnical borings were used to determine the lithological and physical properties of the sediment, and for correlation with the geophysical data. Since 1987, five sites containing 10 obsolete production platforms have been selected on the Louisiana OCS using these procedures. Industry participants have realized a total savings of approximately US $1 500 000 in salvaging costs by converting these structures into artificial reefs. © 1993.

  3. MiDAS 2.0: an ecosystem-specific taxonomy and online database for the organisms of wastewater treatment systems expanded for anaerobic digester groups

    PubMed Central

    McIlroy, Simon Jon; Kirkegaard, Rasmus Hansen; McIlroy, Bianca; Nierychlo, Marta; Kristensen, Jannie Munk; Karst, Søren Michael; Albertsen, Mads

    2017-01-01

    Wastewater is increasingly viewed as a resource, with anaerobic digester technology being routinely implemented for biogas production. Characterising the microbial communities involved in wastewater treatment facilities and their anaerobic digesters is considered key to their optimal design and operation. Amplicon sequencing of the 16S rRNA gene allows high-throughput monitoring of these systems. The MiDAS field guide is a public resource providing amplicon sequencing protocols and an ecosystem-specific taxonomic database optimized for use with wastewater treatment facility samples. The curated taxonomy endeavours to provide a genus-level classification for abundant phylotypes and the online field guide links this identity to published information regarding their ecology, function and distribution. This article describes the expansion of the database resources to cover the organisms of the anaerobic digester systems fed primary sludge and surplus activated sludge. The updated database includes descriptions of the abundant genus-level taxa in influent wastewater, activated sludge and anaerobic digesters. Abundance information is also included to allow assessment of the role of emigration in the ecology of each phylotype. MiDAS is intended as a collaborative resource for the progression of research into the ecology of wastewater treatment, by providing a public repository for knowledge that is accessible to all interested in these biotechnologically important systems. Database URL: http://www.midasfieldguide.org PMID:28365734

  4. Construction of databases: advances and significance in clinical research.

    PubMed

    Long, Erping; Huang, Bingjie; Wang, Liming; Lin, Xiaoyu; Lin, Haotian

    2015-12-01

    Widely used in clinical research, the database is a new type of data management automation technology and the most efficient tool for data management. In this article, we first explain some basic concepts, such as the definition, classification, and establishment of databases. Afterward, the workflow for establishing databases, inputting data, verifying data, and managing databases is presented. Meanwhile, by discussing the application of databases in clinical research, we illuminate the important role of databases in clinical research practice. Lastly, we introduce the reanalysis of randomized controlled trials (RCTs) and cloud computing techniques, showing the most recent advancements of databases in clinical research.
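
    The workflow the article outlines, establishing a database, inputting data, and verifying data, can be illustrated with a minimal SQLite sketch. The table layout, the range check, and the toy records are hypothetical assumptions, not the authors' actual design.

```python
import sqlite3

# Minimal sketch of a clinical-research database workflow using SQLite.

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE patients (
        patient_id INTEGER PRIMARY KEY,
        age        INTEGER NOT NULL CHECK (age BETWEEN 0 AND 120),
        sbp_mmhg   REAL             -- systolic blood pressure
    )
""")

# Data input: the CHECK constraint does first-pass verification on entry.
rows = [(1, 54, 128.0), (2, 61, 141.5), (3, 47, None)]
conn.executemany("INSERT INTO patients VALUES (?, ?, ?)", rows)

# Data verification: flag records with missing measurements for follow-up.
missing = conn.execute(
    "SELECT patient_id FROM patients WHERE sbp_mmhg IS NULL"
).fetchall()
print(missing)  # [(3,)]
```

    Declarative constraints catch impossible values at entry time; queries like the one above implement the periodic verification step the article describes.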

  5. Pain Assessment–Can it be Done with a Computerised System? A Systematic Review and Meta-Analysis

    PubMed Central

    Pombo, Nuno; Garcia, Nuno; Bousson, Kouamana; Spinsante, Susanna; Chorbev, Ivan

    2016-01-01

    Background: Mobile and web technologies are increasingly used to support the treatment of chronic pain conditions. However, the subjectivity of pain perception makes its management and evaluation very difficult. Pain treatment requires a multi-dimensional approach (e.g., sensory, affective, cognitive), yet evidence of technology effects across these dimensions is lacking. This study aims to describe computerised monitoring systems and to suggest a methodology, based on statistical analysis, to evaluate their effects on pain assessment. Methods: We conducted a review of the English-language literature about computerised systems related to chronic pain complaints that included data collected via mobile devices or the Internet, published since 2000 in three relevant bibliographic databases: BioMed Central, PubMed Central and ScienceDirect. The extracted data include: objective and duration of the study, age and condition of the participants, and type of collected information (e.g., questionnaires, scales). Results: Sixty-two studies were included, encompassing 13,338 participants. A total of 50 (81%) studies related to mobile systems, and 12 (19%) to web-based systems. Technology and pen-and-paper approaches presented equivalent outcomes for pain intensity. Conclusions: Technology proved as accurate and feasible as pen-and-paper methods. The proposed assessment model, based on data fusion combined with a qualitative assessment method, was found to be suitable. Data integration raises several concerns and challenges for the design, development and application of monitoring systems applied to pain. PMID:27089351
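
    The pooling step behind a meta-analysis of this kind is commonly an inverse-variance weighted average. A minimal fixed-effect sketch; the effect sizes and standard errors below are fabricated for illustration, not data from the review.

```python
import math

# Fixed-effect (inverse-variance) pooling of per-study effect estimates.

def pool_fixed_effect(effects, std_errors):
    """Return the pooled effect and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# e.g. mean differences in pain intensity (technology minus pen-and-paper)
effects = [0.10, -0.05, 0.02]
std_errors = [0.08, 0.10, 0.06]
pooled, se = pool_fixed_effect(effects, std_errors)
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(round(pooled, 3), [round(x, 3) for x in ci])
```

    A confidence interval straddling zero, as here, is the arithmetic behind a conclusion of "equivalent outcomes" between the two approaches.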

  6. Inorganic Crystal Structure Database (ICSD)

    National Institute of Standards and Technology Data Gateway

    SRD 84 FIZ/NIST Inorganic Crystal Structure Database (ICSD) (PC database for purchase)   The Inorganic Crystal Structure Database (ICSD) is produced cooperatively by the Fachinformationszentrum Karlsruhe (FIZ) and the National Institute of Standards and Technology (NIST). The ICSD is a comprehensive collection of crystal structure data of inorganic compounds containing more than 140,000 entries and covering the literature from 1915 to the present.

  7. Satellite Communications Technology Database. Part 2

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The Satellite Communications Technology Database is a compilation of data on state-of-the-art Ka-band technologies current as of January 2000. Most U.S. organizations have not published much of their Ka-band technology data, and so the great majority of this data is drawn largely from Japanese, European, and Canadian publications and Web sites. The data covers antennas, high power amplifiers, low noise amplifiers, MMIC devices, microwave/IF switch matrices, SAW devices, ASIC devices, power and data storage. The data herein is raw, and is often presented simply as the download of a table or figure from a site, showing specified technical characteristics, with no further explanation.

  8. New generic indexing technology

    NASA Technical Reports Server (NTRS)

    Freeston, Michael

    1996-01-01

    There has been no fundamental change in the dynamic indexing methods supporting database systems since the invention of the B-tree twenty-five years ago. And yet the whole classical approach to dynamic database indexing has long since become inappropriate and increasingly inadequate. We are moving rapidly from the conventional one-dimensional world of fixed-structure text and numbers to a multi-dimensional world of variable structures, objects and images, in space and time. But, even before leaving the confines of conventional database indexing, the situation is highly unsatisfactory. In fact, our research has led us to question the basic assumptions of conventional database indexing. We have spent the past ten years studying the properties of multi-dimensional indexing methods, and in this paper we draw the strands of a number of developments together - some quite old, some very new, to show how we now have the basis for a new generic indexing technology for the next generation of database systems.
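
    The contrast the paper draws, between one-dimensional B-tree indexing and multi-dimensional access methods, can be made concrete with a classic multi-dimensional structure. Below is a bare k-d tree sketch (build plus orthogonal range query); it is a textbook alternative, not the generic indexing technology the author proposes.

```python
# A k-d tree answers 2-D range queries that a 1-D B-tree index cannot
# serve without scanning one whole dimension.

def build_kdtree(points, depth=0):
    """Recursively partition 2-D points, alternating the split axis."""
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "axis": axis,
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def range_query(node, lo, hi, out):
    """Collect points inside the rectangle [lo[0],hi[0]] x [lo[1],hi[1]]."""
    if node is None:
        return out
    x, axis = node["point"], node["axis"]
    if all(lo[d] <= x[d] <= hi[d] for d in (0, 1)):
        out.append(x)
    if lo[axis] <= x[axis]:          # the rectangle may extend left of the split
        range_query(node["left"], lo, hi, out)
    if x[axis] <= hi[axis]:          # ... or right of it
        range_query(node["right"], lo, hi, out)
    return out

tree = build_kdtree([(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)])
hits = range_query(tree, lo=(4, 2), hi=(8, 5), out=[])
print(sorted(hits))  # [(5, 4), (7, 2)]
```

    Unlike a B-tree, the k-d tree prunes in both dimensions at once, which is the property that multi-dimensional index research generalises.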

  9. Application and Exploration of Big Data Mining in Clinical Medicine

    PubMed Central

    Zhang, Yue; Guo, Shu-Li; Han, Li-Na; Li, Tie-Ling

    2016-01-01

    Objective: To review theories and technologies of big data mining and their application in clinical medicine. Data Sources: Literatures published in English or Chinese regarding theories and technologies of big data mining and the concrete applications of data mining technology in clinical medicine were obtained from PubMed and Chinese Hospital Knowledge Database from 1975 to 2015. Study Selection: Original articles regarding big data mining theory/technology and big data mining's application in the medical field were selected. Results: This review characterized the basic theories and technologies of big data mining including fuzzy theory, rough set theory, cloud theory, Dempster–Shafer theory, artificial neural network, genetic algorithm, inductive learning theory, Bayesian network, decision tree, pattern recognition, high-performance computing, and statistical analysis. The application of big data mining in clinical medicine was analyzed in the fields of disease risk assessment, clinical decision support, prediction of disease development, guidance of rational use of drugs, medical management, and evidence-based medicine. Conclusion: Big data mining has the potential to play an important role in clinical medicine. PMID:26960378
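
    Decision trees are among the mining techniques the review lists. A minimal illustration of the core step: choosing the single best threshold split by Gini impurity, the criterion CART-style trees apply recursively. The toy risk-score data are fabricated for the sketch.

```python
# Choose the best single-feature split by minimising weighted Gini impurity.

def gini(labels):
    """Gini impurity of a set of binary labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(xs, ys):
    """Return (weighted impurity, threshold) for the best split x <= t."""
    best = (float("inf"), None)
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[0]:
            best = (score, t)
    return best

scores = [0.1, 0.3, 0.35, 0.6, 0.8, 0.9]   # e.g. a hypothetical disease risk score
labels = [0, 0, 0, 1, 1, 1]                # 1 = disease present
impurity, threshold = best_split(scores, labels)
print(threshold, impurity)  # 0.35 separates the classes perfectly (impurity 0.0)
```

    A full tree repeats this search on each resulting partition until a stopping rule is met; clinical risk-assessment applications add pruning and validation on held-out data.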

  10. Summary of performance data for technologies to control gaseous, odor, and particulate emissions from livestock operations: Air management practices assessment tool (AMPAT)

    PubMed Central

    Maurer, Devin L.; Koziel, Jacek A.; Harmon, Jay D.; Hoff, Steven J.; Rieck-Hinz, Angela M.; Andersen, Daniel S.

    2016-01-01

    The livestock and poultry production industry, regulatory agencies, and researchers lack a current, science-based guide and database for evaluating air quality mitigation technologies. Data collected from a science-based review of mitigation technologies, using practical, stakeholder-oriented evaluation criteria to identify knowledge gaps and needs and to focus future research efforts on technologies and areas with the greatest impact potential, are presented in the Literature Database tab of the air management practices assessment tool (AMPAT). The AMPAT is web-based (available at www.agronext.iastate.edu/ampat) and provides an objective overview of mitigation practices best suited to address odor, gaseous, and particulate matter (PM) emissions at livestock operations. The data were compiled into Excel spreadsheets from a literature review of 265 papers performed to (1) evaluate the performance of mitigation technologies for emissions of odor, volatile organic compounds (VOCs), ammonia (NH3), hydrogen sulfide (H2S), PM, and greenhouse gases (GHGs) and (2) inform future research needs. PMID:27158660

  11. The effect of technology on student science achievement

    NASA Astrophysics Data System (ADS)

    Hilton, June Kraft

    2003-10-01

    Prior research indicates that technology has had little effect on raising student achievement. Little empirical research exists, however, studying the effects of technology as a tool to improve student achievement through development of higher order thinking skills. Also, prior studies have not focused on the manner in which technology is being used in the classroom and at home to enhance teaching and learning. Empirical data from a secondary school representative of those in California were analyzed to determine the effects of technology on student science achievement. The quantitative analysis methods for the school data study included a multiple linear path analysis, using final course grade as the ultimate exogenous variable. In addition, empirical data from a nationwide survey on how Americans use the Internet were disaggregated by age and analyzed to determine the relationships between computer and Internet experience and (a) Internet use at home for school assignments and (b) more general computer use at home for school assignments for school age children. Analysis of data collected from the "A Nation Online" Survey conducted by the United States Census Bureau assessed these relationships via correlations and cross-tabulations. Finally, results from these data analyses were assessed in conjunction with systemic reform efforts from 12 states designed to address improvements in science and mathematics education in light of the Third International Mathematics and Science Study (TIMSS). Examination of the technology efforts in those states provided a more nuanced understanding of the impact technology has on student achievement.
Key findings included evidence that technology training for teachers increased their use of the computer for instruction but students' final science course grade did not improve; school age children across the country did not use the computer at home for such higher-order cognitive activities as graphics and design or spreadsheets/databases; and states whose systemic reform initiatives included a mix of capacity building and alignment to state standards realized improved student achievement on the 2000 NAEP Science Assessment.

  12. Some Reliability Issues in Very Large Databases.

    ERIC Educational Resources Information Center

    Lynch, Clifford A.

    1988-01-01

    Describes the unique reliability problems of very large databases that necessitate specialized techniques for hardware problem management. The discussion covers the use of controlled partial redundancy to improve reliability, issues in operating systems and database management systems design, and the impact of disk technology on very large…

  13. Library Instruction and Online Database Searching.

    ERIC Educational Resources Information Center

    Mercado, Heidi

    1999-01-01

    Reviews changes in online database searching in academic libraries. Topics include librarians conducting all searches; the advent of end-user searching and the need for user instruction; compact disk technology; online public catalogs; the Internet; full text databases; electronic information literacy; user education and the remote library user;…

  14. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  15. EPAUS9R - An Energy Systems Database for use with the Market Allocation (MARKAL) Model

    EPA Pesticide Factsheets

    EPA’s MARKAL energy system databases estimate future-year technology dispersals and associated emissions. These databases are valuable tools for exploring a variety of future scenarios for the U.S. energy-production systems that can impact climate change c

  16. Workforce Professionalism in Drug Treatment Services: Impact of California’s Proposition 36

    PubMed Central

    Wu, Fei; Hser, Yih-Ing

    2011-01-01

    This article investigates whether California’s Proposition 36 has promoted the workforce professionalism of drug treatment services during its first five years of implementation. Program surveys inquiring about organizational information, Proposition 36 implementation, and staffing were conducted in 2003 and 2005 among all treatment providers serving Proposition 36 clients in five selected California counties (San Diego, Riverside, Kern, Sacramento, and San Francisco). A one-hour self-administered questionnaire was completed by 118 treatment providers representing 102 programs. This article examines five topics that are relevant to drug treatment workforce professionalism: resources and capability, standardized intake assessment and outcome evaluation, staff qualification, program accreditation, and information technology. Results suggest that Proposition 36 had a positive influence on the drug treatment workforce’s professionalism. Improvements have been observed in program resources, client intake assessment and outcome evaluation databases, staff professionalization, program accreditation, and information technology system. However, some areas remain problematic, including, for example, the consistent lack of adequate resources serving women with children. PMID:21036513

  17. The effect of care pathways for hip fractures: a systematic review.

    PubMed

    Leigheb, Fabrizio; Vanhaecht, Kris; Sermeus, Walter; Lodewijckx, Cathy; Deneckere, Svin; Boonen, Steven; Boto, Paulo Alexandre Faria; Mendes, Rita Veloso; Panella, Massimiliano

    2012-07-01

    We performed a systematic review for primary studies on care pathways (CPs) for hip fracture (HF). The online databases MEDLINE-PubMed, Ovid-EMBASE, CINAHL-EBSCO host, and The Cochrane Library (Cochrane Central Register of Clinical Trials, Health Technology Assessment Database, NHS Economic Evaluation Database) were searched. Two researchers reviewed the literature independently. Primary studies that met predefined inclusion criteria were assessed for their methodological quality. A total of 15 publications were included: 15 primary studies corresponding with 12 main investigations. Primary studies were evaluated for clinical outcomes, process outcomes, and economic outcomes. The studies assessed a wide range of outcome measures. While a number of divergent clinical outcomes were reported, most studies showed positive results of process management and health-services utilization. In terms of mortality, the results provided evidence for a positive impact of CPs on in-hospital mortality. Most studies also showed a significantly reduced risk of complications, including medical complications, wound infections, and pressure sores. Moreover, time-span process measures showed that an improvement in the organization of care was achieved through the use of CPs. Conflicting results were observed with regard to functional recovery and mobility between patients treated with CPs compared to usual care. Although our review suggests that CPs can have positive effects in patients with HF, the available evidence is insufficient for formal recommendations. There is a need for more research on CPs with selected process and outcome indicators, for in-hospital and postdischarge management of HF, with an emphasis on well-designed randomized trials.

  18. EPA Facility Registry Service (FRS): Facility Interests Dataset - Intranet

    EPA Pesticide Factsheets

    This web feature service consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in the FRS individual feature layers. The layers comprise the FRS major program databases, including:Assessment Cleanup and Redevelopment Exchange System (ACRES) : brownfields sites ; Air Facility System (AFS) : stationary sources of air pollution ; Air Quality System (AQS) : ambient air pollution data from monitoring stations; Bureau of Indian Affairs (BIA) : schools data on Indian land; Base Realignment and Closure (BRAC) facilities; Clean Air Markets Division Business System (CAMDBS) : market-based air pollution control programs; Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) : hazardous waste sites; Integrated Compliance Information System (ICIS) : integrated enforcement and compliance information; National Compliance Database (NCDB) : Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA); National Pollutant Discharge Elimination System (NPDES) module of ICIS : NPDES surface water permits; Radiation Information Database (RADINFO) : radiation and radioactivity facilities; RACT/BACT/LAER Clearinghouse (RBLC) : best available air pollution technology requirements; Resource Conservation and Recovery Act Information System (RCRAInfo) : tracks generators, transporters, treaters, storers, and disposers of haz

  19. EPA Facility Registry Service (FRS): Facility Interests Dataset - Intranet Download

    EPA Pesticide Factsheets

    This downloadable data package consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in the FRS individual feature layers. The layers comprise the FRS major program databases, including:Assessment Cleanup and Redevelopment Exchange System (ACRES) : brownfields sites ; Air Facility System (AFS) : stationary sources of air pollution ; Air Quality System (AQS) : ambient air pollution data from monitoring stations; Bureau of Indian Affairs (BIA) : schools data on Indian land; Base Realignment and Closure (BRAC) facilities; Clean Air Markets Division Business System (CAMDBS) : market-based air pollution control programs; Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) : hazardous waste sites; Integrated Compliance Information System (ICIS) : integrated enforcement and compliance information; National Compliance Database (NCDB) : Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA); National Pollutant Discharge Elimination System (NPDES) module of ICIS : NPDES surface water permits; Radiation Information Database (RADINFO) : radiation and radioactivity facilities; RACT/BACT/LAER Clearinghouse (RBLC) : best available air pollution technology requirements; Resource Conservation and Recovery Act Information System (RCRAInfo) : tracks generators, transporters, treaters, storers, and disposers

  20. EPA Facility Registry Service (FRS): Facility Interests Dataset Download

    EPA Pesticide Factsheets

    This downloadable data package consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in the FRS individual feature layers. The layers comprise the FRS major program databases, including:Assessment Cleanup and Redevelopment Exchange System (ACRES) : brownfields sites ; Air Facility System (AFS) : stationary sources of air pollution ; Air Quality System (AQS) : ambient air pollution data from monitoring stations; Bureau of Indian Affairs (BIA) : schools data on Indian land; Base Realignment and Closure (BRAC) facilities; Clean Air Markets Division Business System (CAMDBS) : market-based air pollution control programs; Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) : hazardous waste sites; Integrated Compliance Information System (ICIS) : integrated enforcement and compliance information; National Compliance Database (NCDB) : Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA); National Pollutant Discharge Elimination System (NPDES) module of ICIS : NPDES surface water permits; Radiation Information Database (RADINFO) : radiation and radioactivity facilities; RACT/BACT/LAER Clearinghouse (RBLC) : best available air pollution technology requirements; Resource Conservation and Recovery Act Information System (RCRAInfo) : tracks generators, transporters, treaters, storers, and disposers

  1. EPA Facility Registry Service (FRS): Facility Interests Dataset

    EPA Pesticide Factsheets

    This web feature service consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in the FRS individual feature layers. The layers comprise the FRS major program databases, including:Assessment Cleanup and Redevelopment Exchange System (ACRES) : brownfields sites ; Air Facility System (AFS) : stationary sources of air pollution ; Air Quality System (AQS) : ambient air pollution data from monitoring stations; Bureau of Indian Affairs (BIA) : schools data on Indian land; Base Realignment and Closure (BRAC) facilities; Clean Air Markets Division Business System (CAMDBS) : market-based air pollution control programs; Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) : hazardous waste sites; Integrated Compliance Information System (ICIS) : integrated enforcement and compliance information; National Compliance Database (NCDB) : Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA); National Pollutant Discharge Elimination System (NPDES) module of ICIS : NPDES surface water permits; Radiation Information Database (RADINFO) : radiation and radioactivity facilities; RACT/BACT/LAER Clearinghouse (RBLC) : best available air pollution technology requirements; Resource Conservation and Recovery Act Information System (RCRAInfo) : tracks generators, transporters, treaters, storers, and disposers of haz

  2. Next Generation Models for Storage and Representation of Microbial Biological Annotation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quest, Daniel J; Land, Miriam L; Brettin, Thomas S

    2010-01-01

    Background Traditional genome annotation systems were developed in a very different computing era, one where the World Wide Web was just emerging. Consequently, these systems are built as centralized black boxes focused on generating high quality annotation submissions to GenBank/EMBL supported by expert manual curation. The exponential growth of sequence data drives a growing need for increasingly higher quality and automatically generated annotation. Typical annotation pipelines utilize traditional database technologies, clustered computing resources, Perl, C, and UNIX file systems to process raw sequence data, identify genes, and predict and categorize gene function. These technologies tightly couple the annotation software system to hardware and third party software (e.g. relational database systems and schemas). This makes annotation systems hard to reproduce, inflexible to modification over time, difficult to assess, difficult to partition across multiple geographic sites, and difficult to understand for those who are not domain experts. These systems are not readily open to scrutiny and therefore not scientifically tractable. The advent of Semantic Web standards such as Resource Description Framework (RDF) and OWL Web Ontology Language (OWL) enables us to construct systems that address these challenges in a new comprehensive way. Results Here, we develop a framework for linking traditional data to OWL-based ontologies in genome annotation. We show how data standards can decouple hardware and third party software tools from annotation pipelines, thereby making annotation pipelines easier to reproduce and assess. An illustrative example shows how TURTLE (Terse RDF Triple Language) can be used as a human readable, but also semantically-aware, equivalent to GenBank/EMBL files.
    Conclusions The power of this approach lies in its ability to assemble annotation data from multiple databases across multiple locations into a representation that is understandable to researchers. In this way, all researchers, experimental and computational, will more easily understand the informatics processes constructing genome annotation and ultimately be able to help improve the systems that produce them.
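
    The Turtle representation the study advocates can be sketched in plain Python without an RDF library. The prefix, gene identifier, and predicate names below are invented for illustration; they are not the ontology terms the paper defines.

```python
# Serialise a toy gene-annotation record as RDF Turtle triples.

PREFIXES = {"ex": "http://example.org/genome#"}

triples = [
    ("ex:gene0001", "ex:hasLocusTag", '"ECK0001"'),
    ("ex:gene0001", "ex:encodesProduct", '"thr operon leader peptide"'),
    ("ex:gene0001", "ex:startsAt", "190"),
    ("ex:gene0001", "ex:endsAt", "255"),
]

def to_turtle(prefixes, triples):
    """Emit @prefix declarations followed by subject-predicate-object lines."""
    lines = [f"@prefix {p}: <{uri}> ." for p, uri in prefixes.items()]
    lines += [f"{s} {p} {o} ." for s, p, o in triples]
    return "\n".join(lines)

doc = to_turtle(PREFIXES, triples)
print(doc)
```

    Each line is a self-describing statement, which is what makes the format both human-readable and queryable, unlike the positional fields of a GenBank flat file.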

  3. Economic evaluation in the field of mental health: conceptual basis.

    PubMed

    Lima, Ana Flávia Barros da Silva; Cruz, Luciane Nascimento; Polanczyk, Carisi Anne; Maia, Carlos Renato Moreira

    2013-01-01

    Technological advances in medicine have given rise to a dilemma concerning the use of new health technologies in a context of limited financial resources. In the field of psychiatry, health economic evaluation is a recent method that can assist in choosing interventions with different cost and/or effectiveness for specific populations or conditions. This article introduces clinicians to the fundamental concepts required for critical assessment of health economic evaluations. The authors conducted a review with systematic methods to assess the essential theoretical framework of health economic evaluation and mental health in Brazil through textbooks and studies indexed in the PubMed, Cochrane Central, LILACS, NHS CRD, and REBRATS databases. A total of 334 studies were found using the specified terms (MeSH - Mental Health AND Economic, Medical) and filters (Brazil AND Humans); however, only five Brazilian economic evaluations were found. Economic evaluation studies are growing exponentially in the medical literature. Publications focusing on health economics as applied to psychiatry are increasingly common, but Brazilian data are still very incipient. In a country where financial resources are so scarce, economic analyses are necessary to ensure better use of public resources and wider population access to effective health technologies.

  4. Developments in a centrifugal compressor surge control -- a technology assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Botros, K.K.; Henderson, J.F.

    1994-04-01

    There are a number of surge control schemes in current use for centrifugal compressors employed in natural gas transmission systems. Basically, these schemes consist of a set of detection devices that either anticipate surge or detect it at its inception, and a set of control devices that act to prevent surge from occurring. A patent search was conducted in an attempt to assess the level and direction of technology development over the last 20 years and to define the focus for future R&D activities. In addition, the paper presents the current state of technology in three areas: surge control, surge detection, and surge suppression. Patent data obtained from on-line databases showed that most of the emphasis has been on surge control rather than on detection and suppression, and that the current trend in surge control will likely continue toward incremental improvement of a basic or conventional surge control strategy. Various surge suppression techniques can be grouped in two categories: (1) those that are focused on better compressor interior design, and (2) others that attempt to suppress surge by external and operational means.
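
    A conventional surge control strategy of the kind the survey describes keeps the operating point a safe margin to the right of the surge line and opens a recycle valve when flow drops below a control line. The quadratic surge-line model and the 10% margin below are illustrative assumptions, not values from the paper.

```python
# Sketch of a conventional anti-surge controller decision rule.

SURGE_COEFF = 2.0   # head = SURGE_COEFF * flow**2 along the (assumed) surge line
MARGIN = 0.10       # control line sits 10% higher in flow than the surge line

def recycle_valve_open(flow, head):
    """Open the anti-surge recycle valve when flow falls below the control line."""
    surge_flow = (head / SURGE_COEFF) ** 0.5          # flow on the surge line at this head
    control_flow = surge_flow * (1.0 + MARGIN)
    return flow < control_flow

print(recycle_valve_open(flow=1.2, head=2.0))  # safely away from surge -> False
print(recycle_valve_open(flow=1.0, head=2.0))  # inside the margin -> True
```

    Real controllers replace the hard threshold with proportional-integral action and add rate limits, but the control-line comparison is the core of the basic strategy.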

  5. An Animated Introduction to Relational Databases for Many Majors

    ERIC Educational Resources Information Center

    Dietrich, Suzanne W.; Goelman, Don; Borror, Connie M.; Crook, Sharon M.

    2015-01-01

    Database technology affects many disciplines beyond computer science and business. This paper describes two animations developed with images and color that visually and dynamically introduce fundamental relational database concepts and querying to students of many majors. The goal is for educators in diverse academic disciplines to incorporate the…

  6. Market Pressure and Government Intervention in the Administration and Development of Molecular Databases.

    ERIC Educational Resources Information Center

    Sillince, J. A. A.; Sillince, M.

    1993-01-01

    Discusses molecular databases and the role that government and private companies play in their administration and development. Highlights include copyright and patent issues relating to public databases and the information contained in them; data quality; data structures and technological questions; the international organization of molecular…

  7. Computer Security Products Technology Overview

    DTIC Science & Technology

    1988-10-01

    ...this paper addresses fall into the areas of multi-user hosts, database management systems (DBMS), workstations, networks, guards and gateways, and... provide a portion of that protection, for example, a password scheme, a file protection mechanism, a secure database management system, or even a...

  8. Database Software Selection for the Egyptian National STI Network.

    ERIC Educational Resources Information Center

    Slamecka, Vladimir

    The evaluation and selection of information/data management system software for the Egyptian National Scientific and Technical (STI) Network are described. An overview of the state-of-the-art of database technology elaborates on the differences between information retrieval and database management systems (DBMS). The desirable characteristics of…

  9. 76 FR 6789 - Unlicensed Operation in the TV Broadcast Bands

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-08

    ...., Spectrum Bridge Inc., Telcordia Technologies, and WSdb LLC--as TV bands device database administrators. The TV bands databases will be used by fixed and personal portable unlicensed devices to identify unused... administrators to develop the databases that are necessary to enable the introduction of this new class of...

  10. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers, such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis; and generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. 
The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. 
The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
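
    As a hedged illustration of the simplest of these models, the sketch below fits a power-law learning curve (one of several curve shapes considered in this literature) to a single operator's series of operation times by least squares in log-log space. The function name, data, and parameter values are invented for illustration and are not drawn from the review.

    ```python
    import numpy as np

    def fit_power_law_learning_curve(times):
        """Fit t_i = a * i**(-b) by least squares on log-transformed data.

        `times` is the sequence of operation times for consecutive cases
        performed by one operator; returns the estimated (a, b), where a is
        the initial operation time and b the learning-rate exponent.
        """
        times = np.asarray(times, dtype=float)
        case_number = np.arange(1, len(times) + 1)
        # Linear regression in log-log space: log t = log a - b * log i
        slope, intercept = np.polyfit(np.log(case_number), np.log(times), 1)
        return np.exp(intercept), -slope

    # Illustrative synthetic series: 120-minute initial operation time,
    # learning exponent 0.2 (both values invented)
    cases = np.arange(1, 101)
    times = 120.0 * cases ** -0.2
    a, b = fit_power_law_learning_curve(times)
    ```

    Splitting the series into consecutive halves or thirds, the method the review found most common, discards this curve structure; a fitted model keeps it.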

  11. JPEG2000 and dissemination of cultural heritage over the Internet.

    PubMed

    Politou, Eugenia A; Pavlidis, George P; Chamzas, Christodoulos

    2004-03-01

    By applying the latest technologies in image compression for managing the storage of massive image data within cultural heritage databases and by exploiting the universality of the Internet we are now able not only to effectively digitize, record and preserve, but also to promote the dissemination of cultural heritage. In this work we present an application of the latest image compression standard JPEG2000 in managing and browsing image databases, focusing on the image transmission aspect rather than database management and indexing. We combine the technologies of JPEG2000 image compression with client-server socket connections and client browser plug-in, as to provide with an all-in-one package for remote browsing of JPEG2000 compressed image databases, suitable for the effective dissemination of cultural heritage.

  12. Innovative Technologies for Global Space Exploration

    NASA Technical Reports Server (NTRS)

    Hay, Jason; Gresham, Elaine; Mullins, Carie; Graham, Rachael; Williams-Byrd; Reeves, John D.

    2012-01-01

    Under the direction of NASA's Exploration Systems Mission Directorate (ESMD), Directorate Integration Office (DIO), The Tauri Group with NASA's Technology Assessment and Integration Team (TAIT) completed several studies and white papers that identify novel technologies for human exploration. These studies provide technical inputs to space exploration roadmaps, identify potential organizations for exploration partnerships, and detail crosscutting technologies that may meet some of NASA's critical needs. These studies are supported by a relational database of more than 400 externally funded technologies relevant to current exploration challenges. The identified technologies can be integrated into existing and developing roadmaps to leverage external resources, thereby reducing the cost of space exploration. This approach to identifying potential spin-in technologies and partnerships could apply to other national space programs, as well as international and multi-government activities. This paper highlights innovative technologies and potential partnerships from economic sectors that historically are less connected to space exploration. It includes breakthrough concepts that could have a significant impact on space exploration and discusses the role of breakthrough concepts in technology planning. Technologies and partnerships are from NASA's Technology Horizons and Technology Frontiers game-changing and breakthrough technology reports as well as the External Government Technology Dataset, briefly described in the paper. The paper highlights example novel technologies that could be spun-in from government and commercial sources, including virtual worlds, synthetic biology, and human augmentation. It will consider how these technologies can impact space exploration and will discuss ongoing activities for planning and preparing them.

  13. Spatial Query for Planetary Data

    NASA Technical Reports Server (NTRS)

    Shams, Khawaja S.; Crockett, Thomas M.; Powell, Mark W.; Joswig, Joseph C.; Fox, Jason M.

    2011-01-01

    Science investigators need to quickly and effectively assess past observations of specific locations on a planetary surface. This innovation involves a location-based search technology that was adapted and applied to planetary science data to support a spatial query capability for mission operations software. High-performance location-based searching requires the use of spatial data structures for database organization. Spatial data structures are designed to organize datasets based on their coordinates in a way that is optimized for location-based retrieval. The particular spatial data structure that was adapted for planetary data search is the R+ tree.
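
    The structure named in the abstract is the R+ tree; as a minimal sketch of the underlying idea only (organizing records by coordinates so a bounding-box query scans just the buckets it overlaps), the example below uses a uniform grid index instead of a tree, with invented observation ids and coordinates.

    ```python
    from collections import defaultdict

    class GridIndex:
        """Minimal spatial index: bucket observations into uniform grid
        cells so a bounding-box query only scans overlapping cells.
        (Illustrative stand-in for tree structures such as the R+ tree.)"""

        def __init__(self, cell_size=1.0):
            self.cell_size = cell_size
            self.cells = defaultdict(list)  # (cx, cy) -> [(id, x, y), ...]

        def _cell(self, x, y):
            return (int(x // self.cell_size), int(y // self.cell_size))

        def insert(self, obs_id, x, y):
            self.cells[self._cell(x, y)].append((obs_id, x, y))

        def query(self, xmin, ymin, xmax, ymax):
            """Return ids of observations inside [xmin,xmax] x [ymin,ymax]."""
            cx0, cy0 = self._cell(xmin, ymin)
            cx1, cy1 = self._cell(xmax, ymax)
            hits = []
            for cx in range(cx0, cx1 + 1):
                for cy in range(cy0, cy1 + 1):
                    for obs_id, x, y in self.cells.get((cx, cy), ()):
                        if xmin <= x <= xmax and ymin <= y <= ymax:
                            hits.append(obs_id)
            return hits

    index = GridIndex(cell_size=10.0)
    index.insert("obs-1", 3.0, 4.0)
    index.insert("obs-2", 55.0, 60.0)
    index.insert("obs-3", 7.5, 2.5)
    nearby = index.query(0.0, 0.0, 10.0, 10.0)
    ```

    Tree-based structures refine the same idea by adapting bucket boundaries to the data distribution instead of fixing a cell size.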

  14. [Meta-analysis for correlation between multiple lung lobe lesions and prognostic influence on acquired pneumonia in hospitalized elderly patients].

    PubMed

    Huang, Wenjie; Feng, Wei; Li, Yang; Chen, Yu

    2014-11-01

    To explore, via meta-analysis, the correlation between multiple lung lobe lesions and the prognosis of acquired pneumonia in hospitalized elderly patients. We collected all studies that investigated this prognostic correlation by searching China National Knowledge Infrastructure, Wanfang Database, Chinese Science and Technology Periodical Database, Chinese Biological Medical Literature Database, PubMed, and EMBase in accordance with the inclusion and exclusion criteria. The retrieval period ran from database establishment to July 2014. The meta-analysis was performed using RevMan 5.2 software. We calculated the odds ratio (OR) and 95% confidence interval (95% CI) and applied heterogeneity tests. Publication bias was assessed by Egger's test and funnel plot, and sensitivity was analyzed. Ten studies involving 1836 patients were finally included, with 487 cases (the dead group) and 1349 controls (the survival group). The meta-analysis demonstrated that multiple lung lobe lesions were strongly correlated with prognosis in elderly patients with acquired pneumonia (OR=3.22, 95% CI 1.84 to 5.63). Multiple lung lobe lesions increase the risk of death in elderly patients with acquired pneumonia.
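
    A minimal sketch of the computation behind such OR and 95% CI figures, assuming the standard log-odds-ratio standard error and fixed-effect inverse-variance pooling (the review itself used RevMan; the 2x2 counts below are invented for illustration, not data from the included studies):

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio and 95% CI for one 2x2 table
        (a, b = deaths/survivors exposed; c, d = deaths/survivors unexposed),
        using the standard error of the log odds ratio."""
        log_or = math.log((a * d) / (b * c))
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)
        return (math.exp(log_or),
                math.exp(log_or - z * se),
                math.exp(log_or + z * se))

    def pooled_odds_ratio(tables, z=1.96):
        """Fixed-effect (inverse-variance) pooling of log odds ratios."""
        weights, weighted_logs = [], []
        for a, b, c, d in tables:
            log_or = math.log((a * d) / (b * c))
            var = 1/a + 1/b + 1/c + 1/d
            weights.append(1 / var)
            weighted_logs.append(log_or / var)
        pooled_log = sum(weighted_logs) / sum(weights)
        pooled_se = math.sqrt(1 / sum(weights))
        return (math.exp(pooled_log),
                math.exp(pooled_log - z * pooled_se),
                math.exp(pooled_log + z * pooled_se))

    # Invented counts for one hypothetical study
    # (dead/alive, with and without multilobar involvement)
    or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
    ```

    Random-effects pooling, which RevMan also offers, additionally widens the interval by a between-study variance term when heterogeneity is present.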

  15. Retinal imaging using adaptive optics technology☆

    PubMed Central

    Kozak, Igor

    2014-01-01

    Adaptive optics (AO) is a technology used to improve the performance of optical systems by reducing the effect of wavefront distortions. Retinal imaging using AO aims to compensate for higher-order aberrations originating from the cornea and the lens by using a deformable mirror. The main application of AO retinal imaging has been to assess photoreceptor cell density, spacing, and mosaic regularity in normal and diseased eyes. Apart from photoreceptors, the retinal pigment epithelium, retinal nerve fiber layer, retinal vessel walls and lamina cribrosa can also be visualized with AO technology. Recent interest in AO technology in eye research has resulted in a growing number of reports and publications utilizing this technology in both animals and humans. With the first instruments now commercially available, AO technology is being transformed from a research tool into a diagnostic instrument. The current challenges include imaging eyes with less than perfect optical media, formation of normative databases for acquired images such as cone mosaics, and the cost of the technology. The opportunities for AO will include more detailed diagnosis, with descriptions of new findings in retinal diseases and glaucoma, as well as the expansion of AO into clinical trials, which has already started. PMID:24843304

  16. JNDMS Task Authorization 2 Report

    DTIC Science & Technology

    2013-10-01

    uses Barnyard to store alarms from all DREnet Snort sensors in a MySQL database. Barnyard is an open source tool designed to work with Snort to take…ITI Information Technology Infrastructure; J2EE Java 2 Enterprise Edition; JAR Java Archive, an archive file format defined by Java…standards; JDBC Java Database Connectivity; JDW JNDMS Data Warehouse; JNDMS Joint Network and Defence Management System

  17. Study protocol: differential effects of diet and physical activity based interventions in pregnancy on maternal and fetal outcomes--individual patient data (IPD) meta-analysis and health economic evaluation.

    PubMed

    Ruifrok, Anneloes E; Rogozinska, Ewelina; van Poppel, Mireille N M; Rayanagoudar, Girish; Kerry, Sally; de Groot, Christianne J M; Yeo, SeonAe; Molyneaux, Emma; McAuliffe, Fionnuala M; Poston, Lucilla; Roberts, Tracy; Riley, Richard D; Coomarasamy, Arri; Khan, Khalid; Mol, Ben Willem; Thangaratinam, Shakila

    2014-11-04

    Pregnant women who gain excess weight are at risk of complications during pregnancy and in the long term. Interventions based on diet and physical activity minimise gestational weight gain with varied effect on clinical outcomes. The effect of interventions on varied groups of women based on body mass index, age, ethnicity, socioeconomic status, parity, and underlying medical conditions is not clear. Our individual patient data (IPD) meta-analysis of randomised trials will assess the differential effect of diet- and physical activity-based interventions on maternal weight gain and pregnancy outcomes in clinically relevant subgroups of women. Randomised trials on diet and physical activity in pregnancy will be identified by searching the following databases: MEDLINE, EMBASE, BIOSIS, LILACS, Pascal, Science Citation Index, Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, Database of Abstracts of Reviews of Effects, and Health Technology Assessment Database. Primary researchers of the identified trials are invited to join the International Weight Management in Pregnancy Collaborative Network and share their individual patient data. We will reanalyse each study separately and confirm the findings with the original authors. Then, for each intervention type and outcome, we will perform as appropriate either a one-step or a two-step IPD meta-analysis to obtain summary estimates of effects and 95% confidence intervals, for all women combined and for each subgroup of interest. The primary outcomes are gestational weight gain and composite adverse maternal and fetal outcomes. The difference in effects between subgroups will be estimated and between-study heterogeneity suitably quantified and explored. The potential for publication bias and availability bias in the IPD obtained will be investigated. 
We will conduct a model-based economic evaluation to assess the cost effectiveness of the interventions to manage weight gain in pregnancy and undertake a value of information analysis to inform future research. PROSPERO 2013: CRD42013003804.

  18. A Novel Approach: Chemical Relational Databases, and the Role of the ISSCAN Database on Assessing Chemical Carcinogenicity

    EPA Science Inventory

    Mutagenicity and carcinogenicity databases are crucial resources for toxicologists and regulators involved in chemicals risk assessment. Until recently, existing public toxicity databases have been constructed primarily as "look-up-tables" of existing data, and most often did no...

  19. DIMA quick start, database for inventory, monitoring and assessment

    USDA-ARS?s Scientific Manuscript database

    The Database for Inventory, Monitoring and Assessment (DIMA) is a highly-customized Microsoft Access database for collecting data electronically in the field and for organizing, storing and reporting those data for monitoring and assessment. While DIMA can be used for any number of different monito...

  20. An overview and methodological assessment of systematic reviews and meta-analyses of enhanced recovery programmes in colorectal surgery

    PubMed Central

    Chambers, Duncan; Paton, Fiona; Wilson, Paul; Eastwood, Alison; Craig, Dawn; Fox, Dave; Jayne, David; McGinnes, Erika

    2014-01-01

    Objectives To identify and critically assess the extent to which systematic reviews of enhanced recovery programmes for patients undergoing colorectal surgery differ in their methodology and reported estimates of effect. Design Review of published systematic reviews. We searched the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE) and Health Technology Assessment (HTA) Database from 1990 to March 2013. Systematic reviews of enhanced recovery programmes for patients undergoing colorectal surgery were eligible for inclusion. Primary and secondary outcome measures The primary outcome was length of hospital stay. We assessed changes in pooled estimates of treatment effect over time and how these might have been influenced by decisions taken by researchers as well as by the availability of new trials. The quality of systematic reviews was assessed using the Centre for Reviews and Dissemination (CRD) DARE critical appraisal process. Results 10 systematic reviews were included. Systematic reviews of randomised controlled trials have consistently shown a reduction in length of hospital stay with enhanced recovery compared with traditional care. The estimated effect tended to increase from 2006 to 2010 as more trials were published but has not altered significantly in the most recent review, despite the inclusion of several unique trials. The best estimate appears to be an average reduction of around 2.5 days in primary postoperative length of stay. Differences between reviews reflected differences in interpretation of inclusion criteria, searching and analytical methods or software. Conclusions Systematic reviews of enhanced recovery programmes show a high level of research waste, with multiple reviews covering identical or very similar groups of trials. Where multiple reviews exist on a topic, interpretation may require careful attention to apparently minor differences between reviews. 
Researchers can help readers by acknowledging existing reviews and through clear reporting of key decisions, especially on inclusion/exclusion and on statistical pooling. PMID:24879828

  1. UTILIZATION OF GEOGRAPHIC INFORMATION SYSTEMS TECHNOLOGY IN THE ASSESSMENT OF REGIONAL GROUND-WATER QUALITY.

    USGS Publications Warehouse

    Nebert, Douglas; Anderson, Dean

    1987-01-01

    The U. S. Geological Survey (USGS) in cooperation with the U. S. Environmental Protection Agency Office of Pesticide Programs and several State agencies in Oregon has prepared a digital spatial database at 1:500,000 scale to be used as a basis for evaluating the potential for ground-water contamination by pesticides and other agricultural chemicals. Geographic information system (GIS) software was used to assemble, analyze, and manage spatial and tabular environmental data in support of this project. Physical processes were interpreted relative to published spatial data and an integrated database to support the appraisal of regional ground-water contamination was constructed. Ground-water sampling results were reviewed relative to the environmental factors present in several agricultural areas to develop an empirical knowledge base which could be used to assist in the selection of future sampling or study areas.

  2. Interpretation guidelines of a standard Y-chromosome STR 17-plex PCR-CE assay for crime casework.

    PubMed

    Roewer, Lutz; Geppert, Maria

    2012-01-01

    Y-STR analysis is an invaluable tool for examining evidence in sexual assault cases and other forensic casework. Unambiguous detection of the male component in DNA mixtures with a high female background remains the main field of application of forensic Y-STR haplotyping. In recent years, powerful technologies, including a 17-locus multiplex PCR assay, have been introduced into forensic laboratories. At the same time, statistical methods have been developed and adapted for interpretation of a nonrecombining, linear marker such as the Y-chromosome, which shows a strongly clustered geographical distribution due to its linear inheritance and the patrilocality of ancestral groups. Large population databases, namely the Y-STR Haplotype Reference Database (YHRD), have been established to assess the evidentiary value of Y-STR matches by means of frequency estimation methods (counting and extrapolation).
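
    A minimal sketch of the counting method mentioned above, under the common convention that a haplotype absent from the database is assigned the upper 95% confidence bound 1 - alpha**(1/N) (the largest frequency still consistent with zero observations in N profiles); the function name and example numbers are invented for illustration.

    ```python
    def haplotype_frequency_estimate(count, database_size, alpha=0.05):
        """Counting method for Y-STR haplotype frequency.

        If the haplotype was observed, estimate its frequency as count/N.
        If it is absent from the database, return the upper (1 - alpha)
        confidence bound 1 - alpha**(1/N), i.e. the largest frequency
        still consistent with seeing zero copies in N sampled profiles.
        """
        if count > 0:
            return count / database_size
        return 1 - alpha ** (1 / database_size)

    # A haplotype seen 3 times in a database of 20,000 profiles ...
    freq_observed = haplotype_frequency_estimate(3, 20_000)
    # ... and one never seen in a database of 1,000 profiles
    freq_unseen = haplotype_frequency_estimate(0, 1_000)
    ```

    Extrapolation methods go further, correcting the raw count for the fact that a finite database undersamples rare haplotypes.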

  3. Development of a database of instruments for resource-use measurement: purpose, feasibility, and design.

    PubMed

    Ridyard, Colin H; Hughes, Dyfrig A

    2012-01-01

    Health economists frequently rely on methods based on patient recall to estimate resource utilization. Access to questionnaires and diaries, however, is often limited. This study examined the feasibility of establishing an open-access Database of Instruments for Resource-Use Measurement, identified relevant fields for data extraction, and outlined its design. An electronic survey was sent to authors of full UK economic evaluations listed in the National Health Service Economic Evaluation Database (2008-2010), authors of monographs of Health Technology Assessments (1998-2010), and subscribers to the JISCMail health economics e-mailing list. The survey included questions on piloting, validation, recall period, and data capture method. Responses were analyzed and data extracted to generate relevant fields for the database. A total of 143 responses to the survey provided data on 54 resource-use instruments for inclusion in the database. All were reliant on patient or carer recall, and a majority (47) were questionnaires. Thirty-seven were designed for self-completion by the patient, carer, or guardian, and the remainder were designed for completion by researchers or health care professionals while interviewing patients. Methods of development were diverse, particularly in areas such as the planning of resource itemization (evident in 25 instruments), piloting (25), and validation (29). On the basis of the present analysis, we developed a Web-enabled Database of Instruments for Resource-Use Measurement, accessible via www.DIRUM.org. This database may serve as a practical resource for health economists, as well as a means to facilitate further research in the area of resource-use data collection. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  4. Global Collaboration Enhances Technology Literacy

    ERIC Educational Resources Information Center

    Cook, Linda A.; Bell, Meredith L.; Nugent, Jill; Smith, Walter S.

    2016-01-01

    Today's learners routinely use technology outside of school to communicate, collaborate, and gather information about the world around them. Classroom learning experiences are relevant when they include communication technologies such as social networking, blogging, and video conferencing, and information technologies such as databases, browsers,…

  5. Integration of an Evidence Base into a Probabilistic Risk Assessment Model. The Integrated Medical Model Database: An Organized Evidence Base for Assessing In-Flight Crew Health Risk and System Design

    NASA Technical Reports Server (NTRS)

    Saile, Lynn; Lopez, Vilma; Bickham, Grandin; FreiredeCarvalho, Mary; Kerstman, Eric; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) database, an organized evidence base for assessing in-flight crew health risk. The IMM database is a relational database accessible to many users. It quantifies model inputs by ranking the highest-value data with a Level of Evidence (LOE) and a Quality of Evidence (QOE) score, which together assess the evidence base for each medical condition. The IMM evidence base has already provided invaluable information for designers and for other uses.

  6. Database of Industrial Technological Information in Kanagawa : Networks for Technology Activities

    NASA Astrophysics Data System (ADS)

    Saito, Akira; Shindo, Tadashi

    This system is a database that requires participation by its members and is premised on all of its data being open. Aiming at free technological cooperation and exchange among industries, it was constructed by Kanagawa Prefecture in collaboration with enterprises located there. The input data comprise 36 items, such as major product, special and advantageous technology, technology wanted for cooperation, and facility and equipment, which technologically characterize each enterprise. They are expressed in 2,000 characters and written in natural language, including Kanji, except for some coded items. Twenty-four search items are accessible by natural language, so that in addition to interactive searching procedures, including menu-type, it enables extensive searching. The information service started in October 1986, covering data from 2,000 enterprises.

  7. Facilitating Collaboration, Knowledge Construction and Communication with Web-Enabled Databases.

    ERIC Educational Resources Information Center

    McNeil, Sara G.; Robin, Bernard R.

    This paper presents an overview of World Wide Web-enabled databases that dynamically generate Web materials and focuses on the use of this technology to support collaboration, knowledge construction, and communication. Database applications have been used in classrooms to support learning activities for over a decade, but, although business and…

  8. Implementing a Dynamic Database-Driven Course Using LAMP

    ERIC Educational Resources Information Center

    Laverty, Joseph Packy; Wood, David; Turchek, John

    2011-01-01

    This paper documents the formulation of a database driven open source architecture web development course. The design of a web-based curriculum faces many challenges: a) relative emphasis of client and server-side technologies, b) choice of a server-side language, and c) the cost and efficient delivery of a dynamic web development, database-driven…

  9. An Experimental Investigation of Complexity in Database Query Formulation Tasks

    ERIC Educational Resources Information Center

    Casterella, Gretchen Irwin; Vijayasarathy, Leo

    2013-01-01

    Information Technology professionals and other knowledge workers rely on their ability to extract data from organizational databases to respond to business questions and support decision making. Structured query language (SQL) is the standard programming language for querying data in relational databases, and SQL skills are in high demand and are…

  10. A UNIMARC Bibliographic Format Database for ABCD

    ERIC Educational Resources Information Center

    Megnigbeto, Eustache

    2012-01-01

    Purpose: ABCD is a web-based open and free software suite for library management derived from the UNESCO CDS/ISIS software technology. The first version was launched officially in December 2009 with a MARC 21 bibliographic format database. This paper aims to detail the building of the UNIMARC bibliographic format database for ABCD.…

  11. Generation of an Aerothermal Data Base for the X33 Spacecraft

    NASA Technical Reports Server (NTRS)

    Roberts, Cathy; Huynh, Loc

    1998-01-01

    The X-33 experimental program is a cooperative program between industry and NASA, managed by Lockheed-Martin Skunk Works to develop an experimental vehicle to demonstrate new technologies for a single-stage-to-orbit, fully reusable launch vehicle (RLV). One of the new technologies to be demonstrated is an advanced Thermal Protection System (TPS) being designed by BF Goodrich (formerly Rohr, Inc.) with support from NASA. The calculation of an aerothermal database is crucial to identifying the critical design environment data for the TPS. The NASA Ames X-33 team has generated such a database using Computational Fluid Dynamics (CFD) analyses, engineering analysis methods and various programs to compare and interpolate the results from the CFD and the engineering analyses. This database, along with a program used to query the database, is used extensively by several X-33 team members to help them in designing the X-33. This paper will describe the methods used to generate this database, the program used to query the database, and will show some of the aerothermal analysis results for the X-33 aircraft.

  12. SQL is Dead; Long-live SQL: Relational Database Technology in Science Contexts

    NASA Astrophysics Data System (ADS)

    Howe, B.; Halperin, D.

    2014-12-01

    Relational databases are often perceived as a poor fit in science contexts: Rigid schemas, poor support for complex analytics, unpredictable performance, significant maintenance and tuning requirements --- these idiosyncrasies often make databases unattractive in science contexts characterized by heterogeneous data sources, complex analysis tasks, rapidly changing requirements, and limited IT budgets. In this talk, I'll argue that although the value proposition of typical relational database systems is weak in science, the core ideas that power relational databases have become incredibly prolific in open source science software, and are emerging as a universal abstraction for both big data and small data. In addition, I'll talk about two open source systems we are building to "jailbreak" the core technology of relational databases and adapt them for use in science. The first is SQLShare, a Database-as-a-Service system supporting collaborative data analysis and exchange by reducing database use to an Upload-Query-Share workflow with no installation, schema design, or configuration required. The second is Myria, a service that supports much larger-scale data, complex analytics, and multiple back-end systems. Finally, I'll describe some of the ways our collaborators in oceanography, astronomy, biology, fisheries science, and more are using these systems to replace script-based workflows for reasons of performance, flexibility, and convenience.
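
    A minimal sketch of the Upload-Query-Share idea, using Python's built-in sqlite3 as a stand-in relational engine (SQLShare itself is a hosted service, not sqlite; the table and data below are invented oceanographic-style records for illustration):

    ```python
    import sqlite3

    # "Upload" step: load a small tabular dataset into a relational table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE casts (station TEXT, depth_m REAL, temp_c REAL)")
    conn.executemany(
        "INSERT INTO casts VALUES (?, ?, ?)",
        [("P4", 10.0, 11.2), ("P4", 50.0, 9.8),
         ("P12", 10.0, 10.1), ("P12", 50.0, 8.9)],
    )

    # "Query" step: per-station mean temperature, expressed declaratively
    # in SQL rather than as a hand-written script with loops.
    rows = conn.execute(
        "SELECT station, AVG(temp_c) FROM casts "
        "GROUP BY station ORDER BY station"
    ).fetchall()
    ```

    The "Share" step in such a service amounts to publishing the table and the query text, so collaborators rerun or modify the query instead of exchanging scripts.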

  13. Achieving high confidence protein annotations in a sea of unknowns

    NASA Astrophysics Data System (ADS)

    Timmins-Schiffman, E.; May, D. H.; Noble, W. S.; Nunn, B. L.; Mikan, M.; Harvey, H. R.

    2016-02-01

    Increased sensitivity of mass spectrometry (MS) technology allows deep and broad insight into community functional analyses. Metaproteomics holds the promise to reveal functional responses of natural microbial communities, whereas metagenomics alone can only hint at potential functions. The complex datasets resulting from ocean MS have the potential to inform diverse realms of the biological, chemical, and physical ocean sciences, yet the extent of bacterial functional diversity and redundancy has not been fully explored. To take advantage of these impressive datasets, we need a clear bioinformatics pipeline for metaproteomics peptide identification and annotation with a database that can provide confident identifications. Researchers must consider whether it is sufficient to leverage the vast quantities of available ocean sequence data or if they must invest in site-specific metagenomic sequencing. We have sequenced, to our knowledge, the first western Arctic metagenomes from the Bering Strait and the Chukchi Sea. We have addressed the long-standing question: Is a metagenome required to accurately complete metaproteomics and assess the biological distribution of metabolic functions controlling nutrient acquisition in the ocean? Two different protein databases were constructed from 1) a site-specific metagenome and 2) subarctic/arctic groups available in NCBI's non-redundant database. Multiple proteomic search strategies were employed, against each individual database and against both databases combined, to determine the algorithm and approach that yielded the best balance of high sensitivity and confident identification. Results yielded over 8200 confidently identified proteins. Our comparison of these results allows us to quantify the utility of investing resources in a metagenome versus using the constantly expanding and immediately available public databases for metaproteomic studies.

  14. The Effectiveness of eHealth Technologies on Weight Management in Pregnant and Postpartum Women: Systematic Review and Meta-Analysis.

    PubMed

    Sherifali, Diana; Nerenberg, Kara A; Wilson, Shanna; Semeniuk, Kevin; Ali, Muhammad Usman; Redman, Leanne M; Adamo, Kristi B

    2017-10-13

    The emergence and utilization of electronic health (eHealth) technologies has increased in a variety of health interventions. Exploiting the real-time advantages offered by mobile technologies during and after pregnancy has the potential to empower women and encourage behaviors that may improve maternal and child health. The objective of this study was to assess the effectiveness of eHealth technologies for weight management during pregnancy and the postpartum period and to review the efficacy of eHealth technologies on health behaviors, specifically nutrition and physical activity. A systematic search was conducted of the following databases: MEDLINE, EMBASE, Cochrane database of systematic reviews (CDSR), Cochrane central register of controlled trials (CENTRAL), CINAHL (Cumulative Index to Nursing and Allied Health Literature), and PsycINFO. The search included studies published from 1990 to July 5, 2016. All relevant primary studies that involved randomized controlled trials (RCTs), non-RCTs, before-and-after studies, historically controlled studies, and pilot studies were included. The study population was adult women of childbearing age either during pregnancy or the postpartum period. eHealth weight management intervention studies targeting physical activity, nutrition, or both, over a minimum 3-month period were included. Titles and abstracts, as well as full-text screening were conducted. Study quality was assessed using Cochrane's risk of bias tool. Data extraction was completed by a single reviewer, which was then verified by a second independent reviewer. Results were meta-analyzed to calculate pooled estimates of the effect, wherever possible. Overall, 1787 and 176 citations were reviewed at the abstract and full-text screening stages, respectively. A total of 10 studies met the inclusion criteria ranging from high to low risk of bias. 
Pooled estimates from studies of the effect for postpartum women resulted in a significant reduction in weight (-2.55 kg, 95% CI -3.81 to -1.28) after 3 to 12 months and six studies found a nonsignificant reduction in weight gain for pregnant women (-1.62 kg, 95% CI -3.57 to 0.33) at approximately 40 weeks. This review found evidence for benefits of eHealth technologies on weight management in postpartum women only. Further research is still needed regarding the use of these technologies during and after pregnancy. ©Diana Sherifali, Kara A Nerenberg, Shanna Wilson, Kevin Semeniuk, Muhammad Usman Ali, Leanne M Redman, Kristi B Adamo. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 13.10.2017.
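Pooled estimates like those above come from an inverse-variance meta-analysis. A minimal DerSimonian-Laird random-effects sketch, using hypothetical per-study weight changes rather than the review's actual study data:

```python
import math

def pool_random_effects(effects, ses):
    """DerSimonian-Laird random-effects pooled estimate.

    effects: per-study mean differences (e.g. weight change in kg)
    ses: their standard errors
    Returns the pooled estimate and a 95% confidence interval.
    """
    w = [1 / se**2 for se in ses]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    w_star = [1 / (se**2 + tau2) for se in ses]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical per-study weight changes (kg) and standard errors:
pooled, ci = pool_random_effects([-2.1, -3.0, -1.8], [0.6, 0.9, 0.7])
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, tau-squared is truncated to zero and the result collapses to the fixed-effect estimate.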

  15. DataBase on Demand

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, R.; Gomez, D.; Coterillo Coz, I.; Wojcik, D.

    2012-12-01

At CERN a number of key database applications run on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the centralised Oracle-based database services. Database on Demand (DBoD) empowers users to perform certain actions traditionally carried out by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, at present the open community version of MySQL and single-instance Oracle database servers. This article describes the technology approach taken to meet this challenge, the service level agreement (SLA) that the project provides, and possible evolution scenarios.

  16. The NASA ASTP Combined-Cycle Propulsion Database Project

    NASA Technical Reports Server (NTRS)

    Hyde, Eric H.; Escher, Daric W.; Heck, Mary T.; Roddy, Jordan E.; Lyles, Garry (Technical Monitor)

    2000-01-01

The National Aeronautics and Space Administration (NASA) communicated its long-term R&D goals for aeronautics and space transportation technologies in its 1997-98 annual progress report (Reference 1). Under "Pillar 3, Goal 9" a 25-year-horizon set of objectives has been stated for the Generation 3 Reusable Launch Vehicle ("Gen 3 RLV") class of space transportation systems. An initiative referred to as "Spaceliner 100" is being conducted to identify technology roadmaps in support of these objectives. Responsibility for running "Spaceliner 100" technology development and demonstration activities has been assigned to NASA's agency-wide Advanced Space Transportation Program (ASTP) office located at the Marshall Space Flight Center. A key technology area in which advances will be required in order to meet these objectives is propulsion. In 1996, in order to expand their focus beyond "all-rocket" propulsion systems and technologies (see Appendix A for further discussion), ASTP initiated technology development and demonstration work on combined-cycle airbreathing/rocket propulsion systems (ARTT Contracts NAS8-40890 through 40894). Combined-cycle propulsion (CCP) activities (see Appendix B for definitions) have been pursued in the U.S. for over four decades, resulting in a large documented knowledge base on this subject (see Reference 2). In the fall of 1999 the Combined-Cycle Propulsion Database (CCPD) project was established with the primary purpose of collecting and consolidating CCP-related technical information in support of the ASTP's ongoing technology development and demonstration program. Science Applications International Corporation (SAIC) was selected to perform the initial development of the Database under its existing support contract with MSFC (Contract NAS8-99060) because of the company's unique combination of capabilities in database development, information technology (IT) and CCP knowledge.
The CCPD is summarized in the descriptive 2-page flyer appended to this paper as Appendix C. The purpose of this paper is to provide the reader with an understanding of the objectives of the CCPD and relate the progress that has been made toward meeting those objectives.

  17. Computerized Design Synthesis (CDS), A database-driven multidisciplinary design tool

    NASA Technical Reports Server (NTRS)

    Anderson, D. M.; Bolukbasi, A. O.

    1989-01-01

The Computerized Design Synthesis (CDS) system under development at McDonnell Douglas Helicopter Company (MDHC) is targeted to make revolutionary improvements in both response time and resource efficiency in the conceptual and preliminary design of rotorcraft systems. It makes the accumulated design database and supporting technology analysis results readily available to designers and analysts of technology, systems, and production, and makes powerful design synthesis software available in a user-friendly format.

  18. Extending the data dictionary for data/knowledge management

    NASA Technical Reports Server (NTRS)

    Hydrick, Cecile L.; Graves, Sara J.

    1988-01-01

    Current relational database technology provides the means for efficiently storing and retrieving large amounts of data. By combining techniques learned from the field of artificial intelligence with this technology, it is possible to expand the capabilities of such systems. This paper suggests using the expanded domain concept, an object-oriented organization, and the storing of knowledge rules within the relational database as a solution to the unique problems associated with CAD/CAM and engineering data.
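The paper's suggestion of storing knowledge rules inside the relational database alongside the data can be sketched with a toy example. The tables, thresholds and advice below are hypothetical, and SQLite stands in for whichever relational engine is actually used:

```python
import sqlite3

# Design data and knowledge rules live in the same relational store.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE part (name TEXT, material TEXT, max_temp_c REAL)")
con.execute("CREATE TABLE rule (attribute TEXT, threshold REAL, advice TEXT)")
con.execute("INSERT INTO part VALUES ('bracket', 'aluminium', 180.0)")
con.execute("INSERT INTO part VALUES ('duct', 'steel', 450.0)")
con.execute(
    "INSERT INTO rule VALUES ('max_temp_c', 200.0, 'review thermal margin')")

# A thin inference step: join the design data against the stored rules,
# flagging any part whose temperature rating falls below a rule threshold.
rows = list(con.execute(
    """SELECT p.name, r.advice
         FROM part p JOIN rule r
           ON r.attribute = 'max_temp_c' AND p.max_temp_c < r.threshold"""))
```

Because the rules are ordinary rows, they can be queried, versioned and updated with the same tooling as the engineering data they govern.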

  19. Person-Generated Health Data in Simulated Rehabilitation Using Kinect for Stroke: Literature Review.

    PubMed

    Dimaguila, Gerardo Luis; Gray, Kathleen; Merolli, Mark

    2018-05-08

    Person- or patient-generated health data (PGHD) are health, wellness, and clinical data that people generate, record, and analyze for themselves. There is potential for PGHD to improve the efficiency and effectiveness of simulated rehabilitation technologies for stroke. Simulated rehabilitation is a type of telerehabilitation that uses computer technologies and interfaces to allow the real-time simulation of rehabilitation activities or a rehabilitation environment. A leading technology for simulated rehabilitation is Microsoft's Kinect, a video-based technology that uses infrared to track a user's body movements. This review attempts to understand to what extent Kinect-based stroke rehabilitation systems (K-SRS) have used PGHD and to what benefit. The review is conducted in two parts. In part 1, aspects of relevance for PGHD were searched for in existing systematic reviews on K-SRS. The following databases were searched: IEEE Xplore, Association of Computing Machinery Digital Library, PubMed, Biomed Central, Cochrane Library, and Campbell Collaboration. In part 2, original research papers that presented or used K-SRS were reviewed in terms of (1) types of PGHD, (2) patient access to PGHD, (3) PGHD use, and (4) effects of PGHD use. The search was conducted in the same databases as part 1 except Cochrane and Campbell Collaboration. Reference lists on K-SRS of the reviews found in part 1 were also included in the search for part 2. There was no date restriction. The search was closed in June 2017. The quality of the papers was not assessed, as it was not deemed critical to understanding PGHD access and use in studies that used K-SRS. In part 1, 192 papers were identified, and after assessment only 3 papers were included. Part 1 showed that previous reviews focused on technical effectiveness of K-SRS with some attention on clinical effectiveness. None of those reviews reported on home-based implementation or PGHD use. 
In part 2, 163 papers were identified and, after assessment, 41 papers were included. Part 2 showed that there is a gap in understanding how PGHD use may affect patients using K-SRS and a lack of patient participation in the design of such systems. This paper calls for further studies of K-SRS, and of technologies that allow patients to generate their own health data in general, to pay more attention to how patients' own use of their data may influence their care processes and outcomes. Future studies that trial the effectiveness of K-SRS outside the clinic should also explore how patients and carers use PGHD in home rehabilitation programs. ©Gerardo Luis Dimaguila, Kathleen Gray, Mark Merolli. Originally published in JMIR Rehabilitation and Assistive Technology (http://rehab.jmir.org), 08.05.2018.

  20. Quality control and assurance for validation of DOS/I measurements

    NASA Astrophysics Data System (ADS)

    Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.

    2010-02-01

    Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency domain and spectral calibration procedures using tissue simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing) 3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology) 4. Standardize and coordinate trial data entry (from individual sites) into centralized database 5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.

  1. Nanotechnology in food processing sector-An assessment of emerging trends.

    PubMed

    Kalpana Sastry, R; Anshul, Shrivastava; Rao, N H

    2013-10-01

Use of nanoscience-based technology in the food industry is fast emerging as a new area for research and development. Several research groups, including private companies in the industry, have initiated research programmes exploring the wide scope of nanotechnology in the value chain of food processing and manufacturing. This paper discusses the current focus of research in this area and assesses its potential impacts. Using a relational database framework with R&D indicators such as literature and patent documents to assess the potential of nanotechnology in the food sector, a model to organize and map nanoresearch areas to the food processing sector was developed. The study indicates that five basic categories of nanotechnology applications and functionalities are currently under development in the food sector: food processing, packaging, nutraceutical delivery, food safety and functional foods.

  2. DIMA.Tools: An R package for working with the database for inventory, monitoring, and assessment

    USDA-ARS?s Scientific Manuscript database

    The Database for Inventory, Monitoring, and Assessment (DIMA) is a Microsoft Access database used to collect, store and summarize monitoring data. This database is used by both local and national monitoring efforts within the National Park Service, the Forest Service, the Bureau of Land Management, ...

  3. Viscoelastic point-of-care testing to assist with the diagnosis, management and monitoring of haemostasis: a systematic review and cost-effectiveness analysis.

    PubMed

    Whiting, Penny; Al, Maiwenn; Westwood, Marie; Ramos, Isaac Corro; Ryder, Steve; Armstrong, Nigel; Misso, Kate; Ross, Janine; Severens, Johan; Kleijnen, Jos

    2015-07-01

    Patients with substantive bleeding usually require transfusion and/or (re-)operation. Red blood cell (RBC) transfusion is independently associated with a greater risk of infection, morbidity, increased hospital stay and mortality. ROTEM (ROTEM® Delta, TEM International GmbH, Munich, Germany; www.rotem.de), TEG (TEG® 5000 analyser, Haemonetics Corporation, Niles, IL, USA; www.haemonetics.com) and Sonoclot (Sonoclot® coagulation and platelet function analyser, Sienco Inc., Arvada, CO) are point-of-care viscoelastic (VE) devices that use thromboelastometry to test for haemostasis in whole blood. They have a number of proposed advantages over standard laboratory tests (SLTs): they provide a result much quicker, are able to identify what part of the clotting process is disrupted, and provide information on clot formation over time and fibrinolysis. This assessment aimed to assess the clinical effectiveness and cost-effectiveness of VE devices to assist with the diagnosis, management and monitoring of haemostasis disorders during and after cardiac surgery, trauma-induced coagulopathy and post-partum haemorrhage (PPH). Sixteen databases were searched to December 2013: MEDLINE (OvidSP), MEDLINE In-Process and Other Non-Indexed Citations and Daily Update (OvidSP), EMBASE (OvidSP), BIOSIS Previews (Web of Knowledge), Science Citation Index (SCI) (Web of Science), Conference Proceedings Citation Index (CPCI-S) (Web of Science), Cochrane Database of Systematic Reviews (CDSR), Cochrane Central Register of Controlled Trials (CENTRAL), Database of Abstracts of Reviews of Effects (DARE), Health Technology Assessment (HTA) database, Latin American and Caribbean Health Sciences Literature (LILACS), International Network of Agencies for Health Technology Assessment (INAHTA), National Institute for Health Research (NIHR) HTA programme, Aggressive Research Intelligence Facility (ARIF), Medion, and the International Prospective Register of Systematic Reviews (PROSPERO). 
Randomised controlled trials (RCTs) were assessed for quality using the Cochrane Risk of Bias tool. Prediction studies were assessed using QUADAS-2. For RCTs, summary relative risks (RRs) were estimated using random-effects models. Continuous data were summarised narratively. For prediction studies, the odds ratio (OR) was selected as the primary effect estimate. The health-economic analysis considered the costs and quality-adjusted life-years of ROTEM, TEG and Sonoclot compared with SLTs in cardiac surgery and trauma patients. A decision tree was used to take into account short-term complications and longer-term side effects from transfusion. The model assumed a 1-year time horizon. Thirty-one studies (39 publications) were included in the clinical effectiveness review. Eleven RCTs (n=1089) assessed VE devices in patients undergoing cardiac surgery; six assessed thromboelastography (TEG) and five assessed ROTEM. There was a significant reduction in RBC transfusion [RR 0.88, 95% confidence interval (CI) 0.80 to 0.96; six studies], platelet transfusion (RR 0.72, 95% CI 0.58 to 0.89; six studies) and fresh frozen plasma transfusion (RR 0.47, 95% CI 0.35 to 0.65; five studies) in VE testing groups compared with control. There were no significant differences between groups in terms of other blood products transfused. Continuous data on blood product use supported these findings. Clinical outcomes did not differ significantly between groups. There were no apparent differences between ROTEM or TEG; none of the RCTs evaluated Sonoclot. There were no data on the clinical effectiveness of VE devices in trauma patients or women with PPH. VE testing was cost-saving and more effective than SLTs. For the cardiac surgery model, the cost-saving was £43 for ROTEM, £79 for TEG and £132 for Sonoclot.
For the trauma population, the cost-savings owing to VE testing were more substantial, amounting to per-patient savings of £688 for ROTEM compared with SLTs, £721 for TEG, and £818 for Sonoclot. This finding was entirely dependent on material costs, which are slightly higher for ROTEM. VE testing remained cost-saving following various scenario analyses. VE testing is cost-saving and more effective than SLTs, in both patients undergoing cardiac surgery and trauma patients. However, there were no data on the clinical effectiveness of Sonoclot or of VE devices in trauma patients. This study is registered as PROSPERO CRD42013005623. The NIHR Health Technology Assessment programme.
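The decision-tree structure of the cost model can be illustrated with a minimal expected-cost calculation. Every probability and cost below is hypothetical, chosen only to show the mechanics, not taken from the study:

```python
# One branch per testing strategy: pay the test cost, then (with some
# probability) a transfusion, which itself risks a costly complication.
def expected_cost(p_transfusion, cost_transfusion, cost_test,
                  p_complication, cost_complication):
    """Expected per-patient cost over a 1-year horizon for one strategy."""
    return (cost_test
            + p_transfusion * (cost_transfusion
                               + p_complication * cost_complication))

# Hypothetical inputs: VE testing costs more per test but, as in the
# trials above, reduces the probability of transfusion.
slt = expected_cost(0.60, 400.0, 20.0, 0.10, 2000.0)
ve = expected_cost(0.50, 400.0, 60.0, 0.10, 2000.0)
saving = slt - ve  # positive when avoided transfusions offset the dearer test
```

Under these inputs the dearer VE test is still cost-saving, which mirrors the review's qualitative finding: the result hinges on the transfusion reduction outweighing the material costs.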

  4. Construction of an ortholog database using the semantic web technology for integrative analysis of genomic data.

    PubMed

    Chiba, Hirokazu; Nishide, Hiroyo; Uchiyama, Ikuo

    2015-01-01

    Recently, various types of biological data, including genomic sequences, have been rapidly accumulating. To discover biological knowledge from such growing heterogeneous data, a flexible framework for data integration is necessary. Ortholog information is a central resource for interlinking corresponding genes among different organisms, and the Semantic Web provides a key technology for the flexible integration of heterogeneous data. We have constructed an ortholog database using the Semantic Web technology, aiming at the integration of numerous genomic data and various types of biological information. To formalize the structure of the ortholog information in the Semantic Web, we have constructed the Ortholog Ontology (OrthO). While the OrthO is a compact ontology for general use, it is designed to be extended to the description of database-specific concepts. On the basis of OrthO, we described the ortholog information from our Microbial Genome Database for Comparative Analysis (MBGD) in the form of Resource Description Framework (RDF) and made it available through the SPARQL endpoint, which accepts arbitrary queries specified by users. In this framework based on the OrthO, the biological data of different organisms can be integrated using the ortholog information as a hub. Besides, the ortholog information from different data sources can be compared with each other using the OrthO as a shared ontology. Here we show some examples demonstrating that the ortholog information described in RDF can be used to link various biological data such as taxonomy information and Gene Ontology. Thus, the ortholog database using the Semantic Web technology can contribute to biological knowledge discovery through integrative data analysis.
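The "ortholog group as hub" pattern described above can be sketched with plain (subject, predicate, object) triples. The predicate names and identifiers below are illustrative, not the actual OrthO vocabulary:

```python
# RDF-style triples: an ortholog group links genes from different
# organisms, each of which carries its own annotations.
triples = [
    ("group:G1", "hasMember", "ecoli:geneA"),
    ("group:G1", "hasMember", "bsub:geneB"),
    ("ecoli:geneA", "goAnnotation", "GO:0006096"),
    ("bsub:geneB", "taxon", "Bacillus subtilis"),
]

def members(group):
    """Genes belonging to an ortholog group."""
    return [o for s, p, o in triples if s == group and p == "hasMember"]

def annotations(gene):
    """All non-membership facts attached to a gene."""
    return [(p, o) for s, p, o in triples if s == gene and p != "hasMember"]

# Via the ortholog group, data about different organisms are linked:
linked = {g: annotations(g) for g in members("group:G1")}
```

In the real system the same traversal is expressed as a SPARQL query against the MBGD endpoint; the hub role of the group node is what lets taxonomy and Gene Ontology data from different sources be joined.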

  5. The Design and Product of National 1:1000000 Cartographic Data of Topographic Map

    NASA Astrophysics Data System (ADS)

    Wang, Guizhi

    2016-06-01

The National Administration of Surveying, Mapping and Geoinformation launched the project of national fundamental geographic information database dynamic updating in 2012. Under this project, the 1:50000 database is updated once a year, and the 1:250000 database is downsized and linkage-updated on that basis. In 2014, using the latest achievements of the 1:250000 database, the 1:1000000 digital line graph database was comprehensively updated; at the same time, cartographic data of topographic maps and digital elevation model data were generated. This article mainly introduces the national 1:1000000 cartographic data of topographic maps, including feature content, database structure, database-driven mapping technology, workflow and so on.

  6. Recent Results of NASA's Space Environments and Effects Program

    NASA Technical Reports Server (NTRS)

    Minor, Jody L.; Brewer, Dana S.

    1998-01-01

The Space Environments and Effects (SEE) Program is a multi-center, multi-agency program managed by the NASA Marshall Space Flight Center. The program evolved from the Long Duration Exposure Facility (LDEF), analysis of LDEF data, and recognition of the importance of the environments and environmental effects on future space missions. It is a comprehensive and focused approach to understanding the space environments, defining the best techniques for both flight and ground-based experimentation, updating the models which predict both the environments and the environmental effects on spacecraft, and finally ensuring that this information is properly maintained and inserted into spacecraft design programs. Formal funding of the SEE Program began in FY95. A NASA Research Announcement (NRA) solicited research proposals in the following categories: 1) Engineering environment definitions; 2) Environments and effects design guidelines; 3) Environments and effects assessment models and databases; and, 4) Flight/ground simulation/technology assessment data. This solicitation resulted in funding for eighteen technology development activities (TDAs). This paper presents and describes technical results from the first set of TDAs of the SEE Program. It also describes the second set of technology development activities, which are expected to begin in January 1998. These new technology development activities will enable the SEE Program to start numerous new development activities in support of mission customer needs.

  7. A systematic review of patient tracking systems for use in the pediatric emergency department.

    PubMed

    Dobson, Ian; Doan, Quynh; Hung, Geoffrey

    2013-01-01

    Patient safety is of great importance in the pediatric emergency department (PED). The combination of acutely and critically ill patients and high patient volumes creates a need for systems to support physicians in making accurate and timely diagnoses. Electronic patient tracking systems can potentially improve PED safety by reducing overcrowding and enhancing security. To enhance our understanding of current electronic tracking technologies, how they are implemented in a clinical setting, and resulting effect on patient care outcomes including patient safety. Nine databases were searched. Two independent reviewers identified articles that contained reference to patient tracking technologies in pediatrics or emergency medicine. Quantitative studies were assessed independently for methodological strength by two reviewers using an external assessment tool. Of 2292 initial articles, 22 were deemed relevant. Seventeen were qualitative, and the remaining five quantitative articles were assessed as being methodologically weak. Existing patient tracking systems in the ED included: infant monitoring/abduction prevention; barcode identification; radiofrequency identification (RFID)- or infrared (IR)-based patient tracking. Twenty articles supported the use of tracking technology to enhance patient safety or improve efficiency. One article failed to support the use of IR patient sensors due to study design flaws. Support exists for the use of barcode-, IR-, and RFID-based patient tracking systems to improve ED patient safety and efficiency. A lack of methodologically strong studies indicates a need for further evidence-based support for the implementation of patient tracking technology in a clinical or research setting. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Evaluation of consumer drug information databases.

    PubMed

    Choi, J A; Sullivan, J; Pankaskie, M; Brufsky, J

    1999-01-01

To evaluate prescription drug information contained in six consumer drug information databases available on CD-ROM, and to make health care professionals aware of the information provided, so that they may appropriately recommend these databases for use by their patients. Observational study of six consumer drug information databases: The Corner Drug Store, Home Medical Advisor, Mayo Clinic Family Pharmacist, Medical Drug Reference, Mosby's Medical Encyclopedia, and PharmAssist. Information on 20 frequently prescribed drugs was evaluated in each database. The databases were ranked using a point-scale system based on primary and secondary assessment criteria. For the primary assessment, 20 categories of information based on those included in the 1998 edition of the USP DI Volume II, Advice for the Patient: Drug Information in Lay Language were evaluated for each of the 20 drugs, and each database could earn up to 400 points (for example, 1 point was awarded if the database mentioned a drug's mechanism of action). For the secondary assessment, the inclusion of 8 additional features that could enhance the utility of the databases was evaluated (for example, 1 point was awarded if the database contained a picture of the drug), and each database could earn up to 8 points. The results of the primary and secondary assessments, listed in order of highest to lowest number of points earned, are as follows: Primary assessment--Mayo Clinic Family Pharmacist (379), Medical Drug Reference (251), PharmAssist (176), Home Medical Advisor (113.5), The Corner Drug Store (98), and Mosby's Medical Encyclopedia (18.5); secondary assessment--The Mayo Clinic Family Pharmacist (8), The Corner Drug Store (5), Mosby's Medical Encyclopedia (5), Home Medical Advisor (4), Medical Drug Reference (4), and PharmAssist (3).
The Mayo Clinic Family Pharmacist was the most accurate and complete source of prescription drug information based on the USP DI Volume II and would be an appropriate database for health care professionals to recommend to patients.
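The two-tier point-scale ranking described above can be reproduced mechanically. The scores below are the primary (out of 400) and secondary (out of 8) totals reported for the top three databases; using the secondary score only as a tiebreaker is an assumption about how the ranking was applied:

```python
# Primary assessment: points across 20 USP DI categories x 20 drugs.
primary = {
    "Mayo Clinic Family Pharmacist": 379,
    "Medical Drug Reference": 251,
    "PharmAssist": 176,
}
# Secondary assessment: up to 8 utility features (e.g. drug pictures).
secondary = {
    "Mayo Clinic Family Pharmacist": 8,
    "Medical Drug Reference": 4,
    "PharmAssist": 3,
}
# Rank by primary score, breaking ties with the secondary score.
ranked = sorted(primary, key=lambda db: (primary[db], secondary[db]),
                reverse=True)
```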

  9. e-PTSD: an overview on how new technologies can improve prediction and assessment of Posttraumatic Stress Disorder (PTSD).

    PubMed

    Bourla, Alexis; Mouchabac, Stephane; El Hage, Wissam; Ferreri, Florian

    2018-01-01

Background: New technologies may profoundly change our way of understanding psychiatric disorders, including posttraumatic stress disorder (PTSD). Imaging and biomarkers, along with technological and medical informatics developments, might help identify at-risk patients. Recent advances in the concept of the 'digital phenotype', which refers to the capture of characteristics of a psychiatric disorder by computerized measurement tools, are one paradigmatic example. Objective: The impact of the new technologies on health professionals' practice in PTSD care remains to be determined. Recent developments could disrupt clinical practices and challenge practitioners in their beliefs, ethics and representations, going as far as questioning their professional culture. In the present paper, we conducted an extensive search to highlight articles that reflect the potential of these new technologies. Method: We conducted an overview by querying the PubMed database with the terms [PTSD] [Posttraumatic stress disorder] AND [Computer] OR [Computerized] OR [Mobile] OR [Automatic] OR [Automated] OR [Machine learning] OR [Sensor] OR [Heart rate variability] OR [HRV] OR [actigraphy] OR [actimetry] OR [digital] OR [motion] OR [temperature] OR [virtual reality]. Results: We summarized the synthesized literature in two categories: prediction and assessment (including diagnosis, screening and monitoring). Two independent reviewers screened, extracted data from and quality-appraised the sources. Results were synthesized narratively. Conclusions: This overview shows that many studies are under way, allowing researchers to start building a PTSD digital phenotype using passive data obtained from biometric sensors. Active data obtained from Ecological Momentary Assessment (EMA) could allow clinicians to assess PTSD patients. The place of connected objects, artificial intelligence and remote monitoring of patients with psychiatric pathology remains to be defined.
These tools must be explained and adapted to the different profiles of physicians and patients. The involvement of patients, caregivers and health professionals is essential to the design and evaluation of these new tools.
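The search terms listed in the Method section can be assembled programmatically. This is one plausible reconstruction; the bracketed-term grouping into two OR-clauses joined by AND is an assumption about the intended query structure:

```python
# Core disorder terms and technology terms, as listed in the abstract.
core = ["PTSD", "Posttraumatic stress disorder"]
tech = ["Computer", "Computerized", "Mobile", "Automatic", "Automated",
        "Machine learning", "Sensor", "Heart rate variability", "HRV",
        "actigraphy", "actimetry", "digital", "motion", "temperature",
        "virtual reality"]

# (term1 OR term2) AND (tech1 OR tech2 OR ...)
query = "({}) AND ({})".format(" OR ".join(core), " OR ".join(tech))
```

Grouping with explicit parentheses avoids the ambiguity of the flat AND/OR listing in the abstract, where operator precedence would otherwise change which records match.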

  10. Improving outcomes for people in mental health crisis: a rapid synthesis of the evidence for available models of care.

    PubMed

    Paton, Fiona; Wright, Kath; Ayre, Nigel; Dare, Ceri; Johnson, Sonia; Lloyd-Evans, Brynmor; Simpson, Alan; Webber, Martin; Meader, Nick

    2016-01-01

The Crisis Concordat was established to improve outcomes for people experiencing a mental health crisis. The Crisis Concordat sets out four stages of the crisis care pathway: (1) access to support before crisis point; (2) urgent and emergency access to crisis care; (3) quality treatment and care in crisis; and (4) promoting recovery. To evaluate the clinical effectiveness and cost-effectiveness of the models of care for improving outcomes at each stage of the care pathway. Electronic databases were searched for guidelines, reviews and, where necessary, primary studies. The searches were performed on 25 and 26 June 2014 for NHS Evidence, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, NHS Economic Evaluation Database, and the Health Technology Assessment (HTA) and PROSPERO databases, and on 11 November 2014 for MEDLINE, PsycINFO and the Criminal Justice Abstracts databases. Relevant reports and reference lists of retrieved articles were scanned to identify additional studies. When guidelines covered a topic comprehensively, further literature was not assessed; however, where there were gaps, systematic reviews and then primary studies were assessed in order of priority. Systematic reviews were critically appraised using the Risk Of Bias In Systematic reviews assessment tool, trials were assessed using the Cochrane risk-of-bias tool, studies without a control group were assessed using the National Institute for Health and Care Excellence (NICE) prognostic studies tool and qualitative studies were assessed using the Critical Appraisal Skills Programme quality assessment tool. A narrative synthesis was conducted for each stage of the care pathway structured according to the type of care model assessed. The type and range of evidence identified precluded the use of meta-analysis. One review of reviews, six systematic reviews, nine guidelines and 15 primary studies were included.
There was very limited evidence for access to support before crisis point. There was evidence of benefits for liaison psychiatry teams in improving service-related outcomes in emergency departments, but this was often limited by potential confounding in most studies. There was limited evidence regarding models to improve urgent and emergency access to crisis care to guide police officers in their Mental Health Act responsibilities. There was positive evidence on clinical effectiveness and cost-effectiveness of crisis resolution teams but variability in implementation. Current work from the Crisis resolution team Optimisation and RElapse prevention study aims to improve fidelity in delivering these models. Crisis houses and acute day hospital care are also currently recommended by NICE. There was a large evidence base on promoting recovery with a range of interventions recommended by NICE likely to be important in helping people stay well. Most evidence was rated as low or very low quality, but this partly reflects the difficulty of conducting research into complex interventions for people in a mental health crisis and does not imply that all research was poorly conducted. However, there are currently important gaps in research for a number of stages of the crisis care pathway. Particular gaps in research on access to support before crisis point and urgent and emergency access to crisis care were found. In addition, more high-quality research is needed on the clinical effectiveness and cost-effectiveness of mental health crisis care, including effective components of inpatient care, post-discharge transitional care and Community Mental Health Teams/intensive case management teams. This study is registered as PROSPERO CRD42014013279. The National Institute for Health Research HTA programme.

  11. Distributed Episodic Exploratory Planning (DEEP)

    DTIC Science & Technology

    2008-12-01

    For DEEP, Hibernate offered the advantage of abstracting SQL by utilizing HQL, so any database with a Java Database Connectivity (JDBC) driver can be used... A blackboard architecture (the Java Distributed Blackboard, JDB) was selected because of its opportunistic reasoning capabilities and implemented in Java for platform independence. Java was chosen for ease of...

  12. An Examination of Job Skills Posted on Internet Databases: Implications for Information Systems Degree Programs.

    ERIC Educational Resources Information Center

    Liu, Xia; Liu, Lai C.; Koong, Kai S.; Lu, June

    2003-01-01

    Analysis of 300 information technology job postings in two Internet databases identified the following skill categories: programming languages (Java, C/C++, and Visual Basic were most frequent); website development (57% sought SQL and HTML skills); databases (nearly 50% required Oracle); networks (only Windows NT or wide-area/local-area networks);…

  13. New data sources and derived products for the SRER digital spatial database

    Treesearch

    Craig Wissler; Deborah Angell

    2003-01-01

    The Santa Rita Experimental Range (SRER) digital database was developed to automate and preserve ecological data and increase their accessibility. The digital data holdings include a spatial database that is used to integrate ecological data in a known reference system and to support spatial analyses. Recently, the Advanced Resource Technology (ART) facility has added...

  14. Applying Cognitive Load Theory to the Redesign of a Conventional Database Systems Course

    ERIC Educational Resources Information Center

    Mason, Raina; Seton, Carolyn; Cooper, Graham

    2016-01-01

    Cognitive load theory (CLT) was used to redesign a Database Systems course for Information Technology students. The redesign was intended to address poor student performance and low satisfaction, and to provide a more relevant foundation in database design and use for subsequent studies and industry. The original course followed the conventional…

  15. Common Database Interface for Heterogeneous Software Engineering Tools.

    DTIC Science & Technology

    1987-12-01

    Subject terms: Database Management Systems; Programming (Computers); Computer Files; Information Transfer; Interfaces... Air Force Institute of Technology, Air University, in partial fulfillment of the requirements for the degree of Master of Science in Information Systems... Contents include: Literature; System 690 Configuration; Database Functions; Software Engineering Environments; Data Manager...

  16. Charting the Progress

    ERIC Educational Resources Information Center

    CURRENTS, 2010

    2010-01-01

    Advancement technology is reshaping the business of fundraising, alumni relations, communications, and marketing. Through all of these innovations, the backbone of advancement systems remains the constituent database. This article takes a look at advancement databases that track constituent data.

  17. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTIC ASSESSMENT

    EPA Science Inventory

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  18. Efficient data management tools for the heterogeneous big data warehouse

    NASA Astrophysics Data System (ADS)

    Alekseev, A. A.; Osipova, V. V.; Ivanov, M. A.; Klimentov, A.; Grigorieva, N. V.; Nalamwar, H. S.

    2016-09-01

    Traditional RDBMSs have served normalized data structures well for decades, but the technology is not optimal for data processing and analysis in data-intensive fields such as social networks, the oil and gas industry, and experiments at the Large Hadron Collider. Several challenges have been raised recently on the scalability of data-warehouse-like workloads against transactional schemas, in particular for the analysis of archived data or the aggregation of data for summary and accounting purposes. The paper evaluates new database technologies such as HBase, Cassandra, and MongoDB, commonly referred to as NoSQL databases, for handling messy, varied and large amounts of data. The evaluation is based on the performance, throughput and scalability of the above technologies for several scientific and industrial use cases. This paper outlines the technologies and architectures needed for processing Big Data, as well as describing the back-end application that implements data migration from an RDBMS to a NoSQL data warehouse, the NoSQL database organization, and how it could be useful for further data analytics.
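    The migration step the abstract mentions (moving rows from a normalized RDBMS schema into a NoSQL data warehouse) typically denormalizes joined relational rows into self-contained documents. A minimal sketch of that transformation, with hypothetical table and field names not taken from the paper:

```python
# Sketch: denormalize normalized relational rows into one document per entity,
# the typical shape of an RDBMS-to-NoSQL (document-store) migration.
# All table and field names here are hypothetical.

users = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
]
orders = [
    {"id": 10, "user_id": 1, "total": 25.0},
    {"id": 11, "user_id": 1, "total": 5.5},
    {"id": 12, "user_id": 2, "total": 12.0},
]

def to_documents(users, orders):
    """Embed each user's orders inside the user record (denormalization)."""
    by_user = {}
    for o in orders:
        by_user.setdefault(o["user_id"], []).append(
            {"id": o["id"], "total": o["total"]}
        )
    return [
        {"_id": u["id"], "name": u["name"], "orders": by_user.get(u["id"], [])}
        for u in users
    ]

docs = to_documents(users, orders)
```

    The design choice mirrors why document stores suit read-heavy analytics: each document carries everything needed to answer a per-entity query without joins, at the cost of duplicating data that the relational schema kept normalized.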

  19. Does filler database size influence identification accuracy?

    PubMed

    Bergold, Amanda N; Heaton, Paul

    2018-06-01

    Police departments increasingly use large photo databases to select lineup fillers using facial recognition software, but this technological shift's implications have been largely unexplored in eyewitness research. Database use, particularly if coupled with facial matching software, could enable lineup constructors to increase filler-suspect similarity and thus enhance eyewitness accuracy (Fitzgerald, Oriet, Price, & Charman, 2013). However, with a large pool of potential fillers, such technologies might theoretically produce lineup fillers too similar to the suspect (Fitzgerald, Oriet, & Price, 2015; Luus & Wells, 1991; Wells, Rydell, & Seelau, 1993). This research proposes a new factor-filler database size-as a lineup feature affecting eyewitness accuracy. In a facial recognition experiment, we select lineup fillers in a legally realistic manner using facial matching software applied to filler databases of 5,000, 25,000, and 125,000 photos, and find that larger databases are associated with a higher objective similarity rating between suspects and fillers and lower overall identification accuracy. In target present lineups, witnesses viewing lineups created from the larger databases were less likely to make correct identifications and more likely to select known innocent fillers. When the target was absent, database size was associated with a lower rate of correct rejections and a higher rate of filler identifications. Higher algorithmic similarity ratings were also associated with decreases in eyewitness identification accuracy. The results suggest that using facial matching software to select fillers from large photograph databases may reduce identification accuracy, and provides support for filler database size as a meaningful system variable. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
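    The core mechanism the study proposes (a larger filler pool lets matching software find fillers that score as more similar to the suspect) can be illustrated with a toy simulation using random similarity scores rather than real face embeddings; the function and numbers below are illustrative only:

```python
# Toy illustration (not the study's method): the top-k best-matching fillers
# drawn from a larger candidate pool are, on average, more similar to the
# suspect, because the maximum of more random draws tends to be larger.
import random

def best_filler_similarity(pool_size, k=5, seed=0):
    """Mean similarity score of the k best-matching fillers in a random pool."""
    rng = random.Random(seed)
    # Stand-in for an algorithmic suspect-filler similarity score in [0, 1).
    scores = [rng.random() for _ in range(pool_size)]
    return sum(sorted(scores, reverse=True)[:k]) / k

small = best_filler_similarity(5_000)    # database sizes used in the study
large = best_filler_similarity(125_000)
```

    With a fixed seed the larger pool contains the smaller pool's draws plus many more, so its top-k similarity can only be equal or higher; this is the statistical intuition behind treating filler database size as a system variable.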

  20. The effects of applying information technology on job empowerment dimensions.

    PubMed

    Ajami, Sima; Arab-Chadegani, Raziyeh

    2014-01-01

    Information Technology (IT) is known as a valuable tool for information dissemination. Today, information communication technology can be used as a powerful tool to improve employees' quality and efficiency. The increasing development of technology-based tools and the speed of their adaptation to human requirements have led to a new form of learning environment and creative, active and inclusive interaction. Today, information is one of the most important power resources in every organization; accordingly, acquiring information, especially central or strategic information, can help organizations build a power base and influence others. The aim of this study was to identify the most important criteria in job empowerment using IT and also the advantages of assessing empowerment. This study was a narrative review. The literature was searched in databases and journals (Springer, ProQuest, PubMed, ScienceDirect and the Scientific Information Database) with keywords including IT, empowerment and employees, searching in titles, keywords, abstracts and full texts. The preliminary search, conducted during July 2013, resulted in 85 articles, books and conference proceedings published between 1983 and 2013. After a careful analysis of the content of each paper, a total of 40 papers and books were selected based on their relevancy. According to the Ardalan model, IT plays a significant role in fast data collection, global and fast access to a broad range of health information, quick evaluation of information, better communication among health experts and greater awareness through access to various information sources. IT leads to better performance and higher efficiency in service provision, both of which increase satisfaction with fast, high-quality services.

  1. The effects of applying information technology on job empowerment dimensions

    PubMed Central

    Ajami, Sima; Arab-Chadegani, Raziyeh

    2014-01-01

    Information Technology (IT) is known as a valuable tool for information dissemination. Today, information communication technology can be used as a powerful tool to improve employees’ quality and efficiency. The increasing development of technology-based tools and the speed of their adaptation to human requirements have led to a new form of learning environment and creative, active and inclusive interaction. Today, information is one of the most important power resources in every organization; accordingly, acquiring information, especially central or strategic information, can help organizations build a power base and influence others. The aim of this study was to identify the most important criteria in job empowerment using IT and also the advantages of assessing empowerment. This study was a narrative review. The literature was searched in databases and journals (Springer, ProQuest, PubMed, ScienceDirect and the Scientific Information Database) with keywords including IT, empowerment and employees, searching in titles, keywords, abstracts and full texts. The preliminary search, conducted during July 2013, resulted in 85 articles, books and conference proceedings published between 1983 and 2013. After a careful analysis of the content of each paper, a total of 40 papers and books were selected based on their relevancy. According to the Ardalan model, IT plays a significant role in fast data collection, global and fast access to a broad range of health information, quick evaluation of information, better communication among health experts and greater awareness through access to various information sources. IT leads to better performance and higher efficiency in service provision, both of which increase satisfaction with fast, high-quality services. PMID:25250350

  2. [Health-related scientific and technological capabilities and university-industry research collaboration].

    PubMed

    Britto, Jorge; Vargas, Marco Antônio; Gadelha, Carlos Augusto Grabois; Costa, Laís Silveira

    2012-12-01

    To examine recent developments in health-related scientific capabilities, the impact of lines of incentives on reducing regional scientific imbalances, and university-industry research collaboration in Brazil. Data were obtained from the Conselho Nacional de Desenvolvimento Científico e Tecnológico (Brazilian National Council for Scientific and Technological Development) databases for the years 2000 to 2010. Indicators of resource mobilization, research network structuring, and knowledge transfer between science and industry initiatives were assessed. Based on the regional distribution map of health-related scientific and technological capabilities, patterns of scientific capability and science-industry collaboration were identified. There was relative spatial deconcentration of health research groups, and more than 6% of them worked in six knowledge areas: medicine, collective health, dentistry, veterinary medicine, ecology and physical education. Lines of incentives adopted from 2000 to 2009 contributed to reducing regional scientific imbalances and improving preexisting capabilities or, alternatively, encouraging spatial decentralization of these capabilities. Health-related scientific and technological capabilities remain highly spatially concentrated in Brazil, and incentive policies have contributed to reducing these imbalances to some extent.

  3. Application of hybrid life cycle approaches to emerging energy technologies--the case of wind power in the UK.

    PubMed

    Wiedmann, Thomas O; Suh, Sangwon; Feng, Kuishuang; Lenzen, Manfred; Acquaye, Adolf; Scott, Kate; Barrett, John R

    2011-07-01

    Future energy technologies will be key for a successful reduction of man-made greenhouse gas emissions. With demand for electricity projected to increase significantly in the future, climate policy goals of limiting the effects of global atmospheric warming can only be achieved if power generation processes are profoundly decarbonized. Energy models, however, have ignored the fact that upstream emissions are associated with any energy technology. In this work we explore methodological options for hybrid life cycle assessment (hybrid LCA) to account for the indirect greenhouse gas (GHG) emissions of energy technologies using wind power generation in the UK as a case study. We develop and compare two different approaches using a multiregion input-output modeling framework - Input-Output-based Hybrid LCA and Integrated Hybrid LCA. The latter utilizes the full-sized Ecoinvent process database. We discuss significance and reliability of the results and suggest ways to improve the accuracy of the calculations. The comparison of hybrid LCA methodologies provides valuable insight into the availability and robustness of approaches for informing energy and environmental policy.
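    The input-output core of the IO-based hybrid LCA described above rests on the Leontief relation: total sectoral output x satisfies x = Ax + y, where A holds the technical coefficients and y is final demand, and embodied emissions are f·x for direct emission intensities f. A sketch with purely illustrative numbers (not from the paper), solving the system by fixed-point iteration:

```python
# Sketch of the input-output step in an IO-based hybrid LCA (illustrative
# 2-sector numbers, not from the study): total output x solves x = A x + y,
# and embodied GHG emissions are the dot product of intensities f with x.

A = [[0.1, 0.2],   # technical coefficients: column j's purchases from row i
     [0.3, 0.1]]
y = [100.0, 50.0]  # final demand per sector
f = [0.5, 1.2]     # direct GHG intensity per unit of sectoral output

def leontief_output(A, y, iters=200):
    """Solve x = A x + y by fixed-point iteration (the power-series
    expansion of the Leontief inverse (I - A)^-1 applied to y)."""
    x = y[:]
    for _ in range(iters):
        x = [y[i] + sum(A[i][j] * x[j] for j in range(len(y)))
             for i in range(len(y))]
    return x

x = leontief_output(A, y)
total_emissions = sum(fi * xi for fi, xi in zip(f, x))
```

    The iteration converges because the column sums of A are below one; each pass adds one more tier of upstream (indirect) requirements, which is exactly the effect the paper argues energy models miss when they ignore upstream emissions.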

  4. Constantly evolving safety assessment protocols for GM foods.

    PubMed

    Sesikeran, B; Vasanthi, Siruguri

    2008-01-01

    The introduction of GM foods has led to the evolution of a food safety assessment paradigm that establishes the safety of a GM food relative to its conventional counterpart. The GM foods currently approved and marketed in several countries have undergone extensive safety testing under a structured safety assessment framework evolved by international organizations such as FAO, WHO, Codex and OECD. The major elements of safety assessment include molecular characterization of inserted genes and stability of the trait, the toxicity and allergenicity potential of the expressed substances, compositional analysis, the potential for gene transfer to gut microflora and unintentional effects of the genetic modification. As more numbers and types of food crops are brought under the genetic modification regime, the adequacy of existing safety assessment protocols for establishing the safety of these foods has been questioned. Such crops include GM crops with higher agronomic vigour, nutritional or health benefits achieved by modification of plant metabolic pathways, and those expressing bioactive substances and pharmaceuticals. The safety assessment challenge for these foods is the ability of the methods to detect unintentional effects with higher sensitivity and rigor. Development of databases on food compositions, toxicants and allergens is currently seen as an important aid to the development of safety protocols. With the changing global trends in genetic modification technology, the future challenge will be to develop GM crops with a minimum amount of inserted foreign DNA so as to reduce the burden of complex safety assessments while ensuring the safety and utility of the technology.

  5. Consulting report on the NASA technology utilization network system

    NASA Technical Reports Server (NTRS)

    Hlava, Marjorie M. K.

    1992-01-01

    The purposes of this consulting effort are: (1) to evaluate the existing management and production procedures and workflow as they each relate to the successful development, utilization, and implementation of the NASA Technology Utilization Network System (TUNS) database; (2) to identify, as requested by the NASA Project Monitor, the strengths, weaknesses, areas of bottlenecking, and previously unaddressed problem areas affecting TUNS; (3) to recommend changes or modifications of existing procedures as necessary in order to effect corrections for the overall benefit of NASA TUNS database production, implementation, and utilization; and (4) to recommend the addition of alternative procedures, routines, and activities that will consolidate and facilitate the production, implementation, and utilization of the NASA TUNS database.

  6. National health care providers' database (NHCPD) of Slovenia--information technology solution for health care planning and management.

    PubMed

    Albreht, T; Paulin, M

    1999-01-01

    The article describes the possibilities of planning of the health care providers' network enabled by the use of information technology. The cornerstone of such planning is the development and establishment of a quality database on health care providers, health care professionals and their employment statuses. Based on the analysis of information needs, a new database was developed for various users in health care delivery as well as for those in health insurance. The method of information engineering was used in the standard four steps of the information system construction, while the whole project was run in accordance with the principles of two internationally approved project management methods. Special attention was dedicated to a careful analysis of the users' requirements, and we believe the latter to be fulfilled to a very large degree. The new NHCPD is a relational database which is set up in two important state institutions, the National Institute of Public Health and the Health Insurance Institute of Slovenia. The former is responsible for updating the database, while the latter is responsible for the technological side as well as for the implementation of data security and protection. NHCPD will be interlinked with several other existing applications in the area of health care, public health and health insurance. Several important state institutions and professional chambers are users of the database in question, thus integrating various aspects of the health care system in Slovenia. The setting up of a completely revised health care providers' database in Slovenia is an important step in the development of a uniform and integrated information system that would support top decision-making processes at the national level.

  7. Receptivity of Librarians to Optical Information Technologies and Products.

    ERIC Educational Resources Information Center

    Eaton, Nancy

    1986-01-01

    Examines factors which may affect the receptivity of librarians to the use of optical disk technologies, including hardware and software issues, the content of currently available databases, and the integration of optical technologies into existing library services. (CLB)

  8. Kinect V2 Performance Assessment in Daily-Life Gestures: Cohort Study on Healthy Subjects for a Reference Database for Automated Instrumental Evaluations on Neurological Patients

    PubMed Central

    Malosio, Matteo; Molinari Tosatti, Lorenzo

    2017-01-01

    Background The increase in health care costs related to poststroke rehabilitation requires new sustainable and cost-effective strategies for promoting autonomous and dehospitalized motor training. In the Riprendo@Home and Future Home for Future Communities research projects, the promising approach of introducing low-cost technologies that promote home rehabilitation is exploited. In order to provide reliable evaluation of patients, a reference database of healthy people's performances is required, one that accounts for the variability of those performances. Methods 78 healthy subjects performed several repetitions of daily-life gestures, the reaching movement (RM) and the hand-to-mouth movement (HtMM), with both the dominant and nondominant upper limbs. Movements were recorded with a Kinect V2. A synthetic biomechanical protocol based on kinematic, dynamic, and motor control parameters was used to assess the motor performance of the healthy people. The investigation was conducted by clustering participants by limb dominancy (right/left), gender (male/female), and age (young/middle/senior) as sources of variability. Results Results showed that limb dominancy has minor relevance in affecting the RM and HtMM; gender has relevance in affecting the HtMM; age has a major effect on both the RM and HtMM. Conclusions An investigation of healthy subjects' upper limb performances during daily-life gestures was performed with the Kinect V2 sensor. Findings will form the basis of a database of normative data for the motor evaluation of neurological patients. PMID:29358893

  9. What do we know about managing Dupuytren's disease cost-effectively?

    PubMed

    Dritsaki, Melina; Rivero-Arias, Oliver; Gray, Alastair; Ball, Catherine; Nanchahal, Jagdeep

    2018-01-25

    Dupuytren's disease (DD) is a common and progressive fibroproliferative disorder of the palmar and digital fascia of the hand. Various treatments have been recommended for advanced disease or to retard the progression of early disease and to prevent deterioration of the finger contracture and quality of life. Recent studies have tried to evaluate the clinical and cost-effectiveness of therapies for DD, but there is currently no systematic assessment and appraisal of the economic evaluations. A systematic literature review was conducted, following PRISMA guidelines, to identify studies reporting economic evaluations of interventions for managing DD. Databases searched included Ovid MEDLINE/Embase (without time restriction), the National Health Service (NHS) Economic Evaluation Database (all years) and the National Institute for Health Research (NIHR) Journals Library Health Technology Assessment (HTA). Cost-effectiveness analyses of treating DD were identified and their quality was assessed using the CHEERS assessment tool for quality of reporting and the Phillips checklist for model evaluation. A total of 103 studies were screened, of which 4 met the study inclusion criteria. Two studies were from the US, one from the UK and one from Canada. They all assessed the same interventions for advanced DD, namely collagenase Clostridium histolyticum injection, percutaneous needle fasciotomy and partial fasciectomy. All studies conducted a cost-utility analysis; two implemented a decision-analytic model and two a Markov model approach. None was based on a single randomised controlled trial; rather, they synthesised evidence from various sources. Studies varied in their time horizon, sources of utility estimates and perspective of analysis. The overall quality of study reporting was good based on the CHEERS checklist. The quality of the model reporting in terms of model structure, data synthesis and model consistency varied across the included studies.
    Cost-effectiveness analyses for patients with advanced DD are limited and have applied different modelling approaches. Future studies should improve the way they are conducted and report their findings according to established guidance for conducting economic modelling of health care technologies. The protocol was registered (CRD42016032989; date 08/01/2016) with the PROSPERO international prospective register of systematic reviews.
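    A Markov model approach of the kind two of the reviewed studies implemented can be sketched as a cohort simulation that accumulates discounted costs and QALYs over yearly cycles. The states, transition probabilities, costs, utilities and discount rate below are hypothetical, not taken from any reviewed study:

```python
# Minimal Markov cohort model for a cost-utility analysis (illustrative
# numbers only: states, probabilities, costs and utilities are hypothetical).

P = {                      # annual transition probabilities between states
    "well":       {"well": 0.90, "recurrence": 0.08, "dead": 0.02},
    "recurrence": {"well": 0.20, "recurrence": 0.75, "dead": 0.05},
    "dead":       {"well": 0.00, "recurrence": 0.00, "dead": 1.00},
}
utility = {"well": 0.85, "recurrence": 0.60, "dead": 0.0}   # per-cycle QALY weight
cost =    {"well": 100.0, "recurrence": 1500.0, "dead": 0.0}  # per-cycle cost

def run_markov(P, utility, cost, cycles=10, discount=0.035):
    """Return discounted (QALYs, costs) for a cohort starting in 'well'."""
    dist = {s: 0.0 for s in P}
    dist["well"] = 1.0
    qalys = costs = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + discount) ** t           # discount factor for cycle t
        qalys += d * sum(dist[s] * utility[s] for s in dist)
        costs += d * sum(dist[s] * cost[s] for s in dist)
        # advance the cohort distribution one cycle
        dist = {s2: sum(dist[s1] * P[s1][s2] for s1 in dist) for s2 in P}
    return qalys, costs

qalys, costs = run_markov(P, utility, cost)
```

    Running the same model under two intervention arms (each with its own transition probabilities and costs) and dividing the cost difference by the QALY difference yields the incremental cost-effectiveness ratio such studies report.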

  10. Enabling heterogenous multi-scale database for emergency service functions through geoinformation technologies

    NASA Astrophysics Data System (ADS)

    Bhanumurthy, V.; Venugopala Rao, K.; Srinivasa Rao, S.; Ram Mohan Rao, K.; Chandra, P. Satya; Vidhyasagar, J.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Geographical Information Science (GIS) has now graduated from traditional desktop systems to Internet systems. Internet GIS is emerging as one of the most promising technologies for addressing Emergency Management. Web services with different privileges play an important role in disseminating emergency services to decision makers. A spatial database is one of the most important components in the successful implementation of Emergency Management. It contains spatial data in the form of raster and vector layers linked with non-spatial information. Comprehensive data are required to handle an emergency situation in its different phases. These database elements comprise core data, hazard-specific data, corresponding attribute data, and live data coming from remote locations. Core data sets are the minimum required data, including base, thematic and infrastructure layers, to handle disasters. Disaster-specific information is required to handle a particular disaster situation such as flood, cyclone, forest fire, earthquake, landslide or drought. In addition, Emergency Management requires many types of data with spatial and temporal attributes that should be made available to the key players in the right format at the right time. The vector database needs to be complemented with satellite imagery of the required resolution for visualisation and analysis in disaster management. Therefore, the database is interconnected and comprehensive to meet the requirements of Emergency Management. This kind of integrated, comprehensive and structured database with appropriate information is required to deliver the right information at the right time to the right people. However, building a spatial database for Emergency Management is a challenging task because of key issues such as availability of data, sharing policies, compatible geospatial standards and data interoperability.
    Therefore, to facilitate using, sharing, and integrating the spatial data, there is a need to define standards for building emergency database systems. These include aspects such as (i) data integration procedures, namely a standard coding scheme, schema, metadata format and spatial format; (ii) database organisation mechanisms covering data management, catalogues and data models; and (iii) database dissemination through a suitable environment as a standard service. The National Database for Emergency Management (NDEM) is such a comprehensive database for addressing disasters in India at the national level. This paper explains standards for integrating and organising the multi-scale, multi-source data, with effective emergency response through customized user interfaces for NDEM. It presents a standard procedure for building comprehensive emergency information systems that enable emergency-specific functions through geospatial technologies.
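    The integration standards listed above (a standard coding scheme, schema and metadata format per layer) can be pictured as a minimal catalogue record for each spatial dataset. Every field name below is a hypothetical illustration, not the actual NDEM schema:

```python
# Illustrative sketch of a catalogue entry of the kind such integration
# standards imply; all field names are hypothetical, not the NDEM schema.

def make_layer_record(layer_id, theme, hazard, scale, crs, source, updated):
    """Build and validate a minimal metadata record for a spatial layer."""
    record = {
        "layer_id": layer_id,  # standard coding scheme identifier
        "theme": theme,        # e.g. core / hazard-specific / live
        "hazard": hazard,      # hazard the layer supports, if any
        "scale": scale,        # map scale of capture
        "crs": crs,            # spatial reference standard
        "source": source,      # producing agency
        "updated": updated,    # temporal attribute for currency checks
    }
    missing = [k for k, v in record.items() if v in (None, "")]
    if missing:
        raise ValueError(f"incomplete metadata: {missing}")
    return record

rec = make_layer_record("IN-FLD-0001", "hazard", "flood",
                        "1:50000", "EPSG:4326", "NRSC", "2014-11-01")
```

    Enforcing a shared record shape like this at ingest time is what makes catalogue search, interoperability checks, and dissemination as a standard service possible across agencies.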

  11. Review of telehealth stuttering management.

    PubMed

    Lowe, Robyn; O'Brian, Sue; Onslow, Mark

    2013-01-01

    Telehealth is the use of communication technology to provide health care services by means other than typical in-clinic attendance models. Telehealth is increasingly used for the management of speech, language and communication disorders. The aim of this article is to review telehealth applications to stuttering management. We conducted a search of peer-reviewed literature for the past 20 years using the Institute for Scientific Information Web of Science database, PubMed: The Bibliographic Database and a search for articles by hand. Outcomes for telehealth stuttering treatment were generally positive, but there may be a compromise of treatment efficiency with telehealth treatment of young children. Our search found no studies dealing with stuttering assessment procedures using telehealth models. No economic analyses of this delivery model have been reported. This review highlights the need for continued research about telehealth for stuttering management. Evidence from research is needed to inform the efficacy of assessment procedures using telehealth methods as well as guide the development of improved treatment procedures. Clinical and technical guidelines are urgently needed to ensure that the evolving and continued use of telehealth to manage stuttering does not compromise the standards of care afforded with standard in-clinic models.

  12. Blending Technology with Camp Tradition: Technology Can Simplify Camp Operations.

    ERIC Educational Resources Information Center

    Salzman, Jeff

    2000-01-01

    Discusses uses of technology appropriate for camps, which are service organizations based on building relationships. Describes relationship marketing and how it can be enhanced through use of Web sites, interactive brochures, and client databases. Outlines other technology uses at camp: automated dispensing of medications, satellite tracking of…

  13. The New Library, A Hybrid Organization.

    ERIC Educational Resources Information Center

    Waaijers, Leo

    This paper discusses changes in technology in libraries over the last decade, beginning with an overview of the impact of databases, the Internet, and the World Wide Web on libraries. The integration of technology at Delft University of Technology (Netherlands) is described, including use of scanning technology, fax, and e-mail for document…

  14. Exergame technology and interactive interventions for elderly fall prevention: A systematic literature review.

    PubMed

    Choi, Sang D; Guo, Liangjie; Kang, Donghun; Xiong, Shuping

    2017-11-01

    Training balance and promoting physical activity in the elderly can contribute to fall prevention. Due to the low adherence to conventional physical therapy, fall interventions through exergame technologies are emerging. The purpose of this review is to synthesize the available research on exergame technology and interactive interventions for fall prevention in the older population. Twenty-five relevant papers retrieved from five major databases were critically reviewed and analyzed. Results showed that the most common exergaming device for fall intervention was the Nintendo Wii, followed by the Xbox Kinect. Even though the exergame intervention protocols and the outcome measures for assessing intervention effectiveness varied, the accumulated evidence revealed that exergame interventions improved physical or cognitive functions in the elderly. However, it remains inconclusive whether exergame-based interventions for elderly fall prevention are superior to conventional physical therapy, and the mechanism of the effect of exergaming on the elderly's balance ability is still unclear. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. The use of PDAs to collect baseline survey data: lessons learned from a pilot project in Bolivia.

    PubMed

    Escandon, I N; Searing, H; Goldberg, R; Duran, R; Arce, J Monterrey

    2008-01-01

    We compared the use of personal digital assistants (PDAs) against the use of standard paper questionnaires for collecting survey data. The evaluation consisted of qualitative approaches to document the process of introducing PDAs. Fieldwork was carried out during June-July 2005 at 12 sites in Bolivia. Data collectors reacted positively to the use of the PDAs and noted the advantages and disadvantages of paper and PDA data collection. A number of difficulties encountered in the use of PDA technology serve as a warning for investigators planning its adoption. Problems included incompatible data files (which impeded the ability to interpret data), an inadequate back-up protocol, and lack of a good 'fit' between the technology and the study. Ensuring the existence of a back-end database, developing an appropriate and adequate back-up protocol, and assessing whether a technology 'fits' the project are important factors in weighing the decision to collect data using PDAs.

  16. Liz Torres | NREL

    Science.gov Websites

    Staff profile page. Areas of expertise include customer service, event planning, word processing/desktop publishing, and database management; research interests include website design, database design, and computational science. Prior positions: Technology Consulting, Westminster, CO (2007-2012); Administrative Assistant, Source One Management, Denver, CO (2005…

  17. Evaluation of linking pavement related databases.

    DOT National Transportation Integrated Search

    2007-03-01

    In general, the objectives of this study were to identify and solve various issues in linking pavement performance-related databases. The detailed objectives were: to evaluate the state of the art in information technology for data integration and dat...

  18. Microcomputers in Libraries.

    ERIC Educational Resources Information Center

    Ertel, Monica M.

    1984-01-01

    This discussion of current microcomputer technologies available to libraries focuses on software applications in four major classifications: communications (online database searching); word processing; administration; and database management systems. Specific examples of library applications are given and six references are cited. (EJS)

  19. The health systems' priority setting criteria for selecting health technologies: A systematic review of the current evidence.

    PubMed

    Mobinizadeh, Mohammadreza; Raeissi, Pouran; Nasiripour, Amir Ashkan; Olyaeemanesh, Alireza; Tabibi, Seyed Jamaleddin

    2016-01-01

    In recent years, the use of health technologies to diagnose and treat diseases has grown considerably and at an accelerating pace. The proper use of these technologies may considerably help in the diagnosis and treatment of different diseases. On the other hand, unlimited and unrestricted entry of these technologies may result in demand induced by service providers. The aim of this study was to determine the appropriate criteria used in health technology priority-setting models around the world. Using MeSH terms and free text, we retrieved the relevant articles from the most appropriate medical databases (the Cochrane Library, PubMed and Scopus) through three separate search strategies up to March 2015. The inclusion criteria were as follows: 1) studies with specific criteria; 2) articles written in English; 3) articles addressing priority setting of health technologies. Data were analyzed qualitatively using a thematic synthesis technique. After screening the retrieved papers via the PRISMA framework, 40 of the 7,012 papers were included in the final phase. Criteria for selecting health technologies (in the pre-assessment and assessment phases) were categorized into six main themes: 1) health outcomes; 2) disease and target population; 3) technology alternatives; 4) economic aspects; 5) evidence; and 6) other factors. "Health effects/benefits" had the maximum frequency in health outcomes (8 studies); "disease severity" in disease and target population (12 studies); "the number of alternatives" in alternatives (2 studies); "cost-effectiveness" in economic aspects (15 studies); "quality of evidence" in evidence (4 studies); and "issues concerning the health system" in other factors (10 studies). 
The results revealed an increase in the number of studies on health technology priority setting around the world, and emphasized the need for a multi-criteria approach to appropriate decision making about healthcare technologies in health systems.

  20. Applying World Wide Web technology to the study of patients with rare diseases.

    PubMed

    de Groen, P C; Barry, J A; Schaller, W J

    1998-07-15

    Randomized, controlled trials of sporadic diseases are rarely conducted. Recent developments in communication technology, particularly the World Wide Web, allow efficient dissemination and exchange of information. However, software for the identification of patients with a rare disease and subsequent data entry and analysis in a secure Web database are currently not available. To study cholangiocarcinoma, a rare cancer of the bile ducts, we developed a computerized disease tracing system coupled with a database accessible on the Web. The tracing system scans computerized information systems on a daily basis and forwards demographic information on patients with bile duct abnormalities to an electronic mailbox. If informed consent is given, the patient's demographic and preexisting medical information available in medical database servers are electronically forwarded to a UNIX research database. Information from further patient-physician interactions and procedures is also entered into this database. The database is equipped with a Web user interface that allows data entry from various platforms (PC-compatible, Macintosh, and UNIX workstations) anywhere inside or outside our institution. To ensure patient confidentiality and data security, the database includes all security measures required for electronic medical records. The combination of a Web-based disease tracing system and a database has broad applications, particularly for the integration of clinical research within clinical practice and for the coordination of multicenter trials.
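
    The tracing system's core loop (a daily scan of new clinical records that forwards matches to an electronic mailbox for consent and follow-up) might be sketched as follows. The trigger terms and record fields are hypothetical; the actual system's matching rules and data model are not published in the abstract.

```python
from dataclasses import dataclass

@dataclass
class Record:
    patient_id: str
    finding: str

# Hypothetical trigger terms; the real system's matching criteria
# for bile duct abnormalities are not reproduced here.
TRIGGERS = {"bile duct stricture", "cholangiocarcinoma", "biliary dilatation"}

def daily_scan(records, mailbox):
    """Forward a demographic pointer for every record mentioning a bile
    duct abnormality to an electronic mailbox for consent and follow-up."""
    for r in records:
        if r.finding.lower() in TRIGGERS:
            mailbox.append(r.patient_id)

mailbox: list[str] = []
daily_scan([Record("p1", "Cholangiocarcinoma"),
            Record("p2", "normal study")], mailbox)
print(mailbox)  # -> ['p1']
```

    Forwarding only a patient identifier, and deferring all clinical detail until consent is given, mirrors the confidentiality ordering the abstract describes.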

  1. Measuring diet in the 21st century: use of new technologies.

    PubMed

    Cade, Janet E

    2017-08-01

    The advent of the internet and smartphone technology has allowed dietary assessment to reach the 21st century! The variety of foods available on the supermarket shelf is now greater than ever before. New approaches to measuring diet may help to reduce measurement error and advance our understanding of nutritional determinants of disease. This advance provides the potential to capture detailed dietary data on large numbers of individuals without the need for costly and time-consuming manual nutrition coding. The aim of the present paper is to review the need for new technologies to measure diet, with an overview of the tools available. Three main areas are addressed: (1) development of web-based tools to measure diet; (2) use of smartphone apps to self-monitor diet; (3) improving the quality of dietary assessment through development of an online library of tools. A practical example of the development of a web-based tool to assess diet, myfood24 (www.myfood24.org), is given, exploring its potential, limitations and challenges. The development of a new food composition database using back-of-pack information is described. Smartphone apps used to measure diet, with a focus on obesity, are reviewed. Many apps are unreliable in terms of tracking, and most are not evaluated. Accurate and consistent measurement of diet is needed for public health and epidemiology. The choice of the most appropriate dietary assessment method tends to rely on experience. The DIET@NET partnership has developed best-practice guidelines for the selection of dietary assessment tools, which aim to improve the quality, consistency and comparability of dietary data. These developments provide a step-change in our ability to reliably characterise food and nutrient intake in population studies. High-quality, validated systems will be important to fully realise the benefits of new technologies.

  2. Nailing Digital Jelly to a Virtual Tree: Tracking Emerging Technologies for Learning

    ERIC Educational Resources Information Center

    Serim, Ferdi; Schrock, Kathy

    2008-01-01

    Reliable information on emerging technologies for learning is as vital as it is difficult to come by. To meet this need, the International Society for Technology in Education organized the Emerging Technologies Task Force. Its goal is to create a database of contributions from educators highlighting their use of emerging technologies to support…

  3. Methods, procedures, and contextual characteristics of health technology assessment and health policy decision making: comparison of health technology assessment agencies in Germany, United Kingdom, France, and Sweden.

    PubMed

    Schwarzer, Ruth; Siebert, Uwe

    2009-07-01

    The objectives of this study were (i) to develop a systematic framework for describing and comparing different features of health technology assessment (HTA) agencies, (ii) to identify and describe similarities and differences between the agencies, and (iii) to draw conclusions for both producers and users of HTA in research, policy, and practice. We performed a systematic literature search, added information from HTA agencies, and developed a conceptual framework comprising eight main domains: organization, scope, processes, methods, dissemination, decision, implementation, and impact. We grouped relevant items of these domains in an evidence table and chose five HTA agencies to test our framework: DAHTA@DIMDI, HAS, IQWiG, NICE, and SBU. Item and domain similarity was assessed using the percentage of identical characteristics in pairwise comparisons across agencies. Results were interpreted across agencies by demonstrating similarities and differences. Based on 306 included documents, we identified 90 characteristics in eight main domains appropriate for our framework. After applying the framework to the five agencies, we found 40 percent similarity in "dissemination," 38 percent in "scope," 35 percent in "organization," 29 percent in "methods," 26 percent in "processes," 23 percent in "impact," 19 percent in "decision," and 17 percent in "implementation." We found considerably more differences than similarities in HTA features across agencies and countries. Our framework and comparison provide insight into, and clarify, the need for harmonization. Our findings could serve as a descriptive database facilitating communication between producers and users.
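
    The similarity measure described above (percentage of identical characteristics between two agencies, compared pairwise) can be sketched as follows. The agency names are taken from the abstract, but the characteristics and their values are invented purely for illustration; they are not the study's actual 90-item evidence table.

```python
from itertools import combinations

def pairwise_similarity(profiles):
    """Percentage of identical characteristics for each pair of agencies.

    `profiles` maps an agency name to a dict of characteristic -> value;
    only characteristics recorded for both agencies are compared.
    """
    results = {}
    for a, b in combinations(sorted(profiles), 2):
        shared = profiles[a].keys() & profiles[b].keys()
        if not shared:
            continue
        same = sum(profiles[a][k] == profiles[b][k] for k in shared)
        results[(a, b)] = 100.0 * same / len(shared)
    return results

# Toy characteristic values (invented, not the study's data).
profiles = {
    "NICE":  {"appraises_drugs": True,  "public_consultation": True,  "binding_decisions": True},
    "IQWiG": {"appraises_drugs": True,  "public_consultation": True,  "binding_decisions": False},
    "SBU":   {"appraises_drugs": True,  "public_consultation": False, "binding_decisions": False},
}
print(pairwise_similarity(profiles))
```

    Restricting each comparison to characteristics recorded for both agencies keeps the percentage meaningful when the evidence table has gaps.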

  4. GenomeRNAi: a database for cell-based RNAi phenotypes.

    PubMed

    Horn, Thomas; Arziman, Zeynep; Berger, Juerg; Boutros, Michael

    2007-01-01

    RNA interference (RNAi) has emerged as a powerful tool to generate loss-of-function phenotypes in a variety of organisms. Combined with the sequence information of almost completely annotated genomes, RNAi technologies have opened new avenues to conduct systematic genetic screens for every annotated gene in the genome. As increasingly large datasets of RNAi-induced phenotypes become available, an important challenge remains the systematic integration and annotation of functional information. Genome-wide RNAi screens have been performed both in Caenorhabditis elegans and Drosophila for a variety of phenotypes and several RNAi libraries have become available to assess phenotypes for almost every gene in the genome. These screens were performed using different types of assays from visible phenotypes to focused transcriptional readouts and provide a rich data source for functional annotation across different species. The GenomeRNAi database provides access to published RNAi phenotypes obtained from cell-based screens and maps them to their genomic locus, including possible non-specific regions. The database also gives access to sequence information of RNAi probes used in various screens. It can be searched by phenotype, by gene, by RNAi probe or by sequence and is accessible at http://rnai.dkfz.de.

  5. GenomeRNAi: a database for cell-based RNAi phenotypes

    PubMed Central

    Horn, Thomas; Arziman, Zeynep; Berger, Juerg; Boutros, Michael

    2007-01-01

    RNA interference (RNAi) has emerged as a powerful tool to generate loss-of-function phenotypes in a variety of organisms. Combined with the sequence information of almost completely annotated genomes, RNAi technologies have opened new avenues to conduct systematic genetic screens for every annotated gene in the genome. As increasingly large datasets of RNAi-induced phenotypes become available, an important challenge remains the systematic integration and annotation of functional information. Genome-wide RNAi screens have been performed both in Caenorhabditis elegans and Drosophila for a variety of phenotypes and several RNAi libraries have become available to assess phenotypes for almost every gene in the genome. These screens were performed using different types of assays from visible phenotypes to focused transcriptional readouts and provide a rich data source for functional annotation across different species. The GenomeRNAi database provides access to published RNAi phenotypes obtained from cell-based screens and maps them to their genomic locus, including possible non-specific regions. The database also gives access to sequence information of RNAi probes used in various screens. It can be searched by phenotype, by gene, by RNAi probe or by sequence and is accessible at http://rnai.dkfz.de. PMID:17135194

  6. Evolving the US Army Research Laboratory (ARL) Technical Communication Strategy

    DTIC Science & Technology

    2016-10-01

    …of added value and enhanced tech transfer, and strengthened relationships with academic and industry collaborators. In support of increasing ARL's…communication skills; and Prong 3: Promote a Stakeholder Database, to implement a stakeholder database (including names and preferences) and use a…Keywords: strategic planning, communications strategy, stakeholder database, workforce improvement, science and technology, S&T.

  7. NIST Gas Hydrate Research Database and Web Dissemination Channel.

    PubMed

    Kroenlein, K; Muzny, C D; Kazakov, A; Diky, V V; Chirico, R D; Frenkel, M; Sloan, E D

    2010-01-01

    To facilitate advances in the application of technologies pertaining to gas hydrates, a freely available data resource containing experimentally derived information about those materials was developed. This work was performed by the Thermodynamic Research Center (TRC), paralleling a highly successful database of thermodynamic and transport properties of molecular pure compounds and their mixtures. Population of the gas-hydrates database required development of guided data capture (GDC) software designed to convert experimental data and metadata into a well-organized electronic format, as well as a relational database schema to accommodate all types of numerical data and metadata within the scope of the project. To guarantee utility for the broad gas hydrate research community, TRC worked closely with the Committee on Data for Science and Technology (CODATA) task group for Data on Natural Gas Hydrates, an international data sharing effort, in developing a gas hydrate markup language (GHML). The fruits of these efforts are disseminated through the NIST Standard Reference Data Program [1] as the Clathrate Hydrate Physical Property Database (SRD #156). A web-based interface for this database, as well as scientific results from the Mallik 2002 Gas Hydrate Production Research Well Program [2], is deployed at http://gashydrates.nist.gov.
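
    A markup language like GHML serves to carry a measurement together with its metadata in a machine-readable record. A minimal sketch of such a record is below; the element names and units are illustrative guesses only, since the actual GHML schema is defined by the CODATA task group and NIST and is not reproduced here.

```python
import xml.etree.ElementTree as ET

# Build one hypothetical hydrate-measurement record.
# Element names are invented for illustration, not real GHML.
record = ET.Element("HydrateMeasurement")
ET.SubElement(record, "Guest").text = "methane"
prop = ET.SubElement(record, "Property", name="dissociation-pressure")
ET.SubElement(prop, "Temperature", unit="K").text = "273.2"
ET.SubElement(prop, "Pressure", unit="MPa").text = "2.65"
ET.SubElement(record, "Source").text = "journal article (metadata captured via GDC software)"

print(ET.tostring(record, encoding="unicode"))
```

    The point of the markup is that units and provenance travel with every number, which is what makes guided data capture and later relational storage reliable.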

  8. An Index to PGE-Ni-Cr Deposits and Occurrences in Selected Mineral-Occurrence Databases

    USGS Publications Warehouse

    Causey, J. Douglas; Galloway, John P.; Zientek, Michael L.

    2009-01-01

    Databases of mineral deposits and occurrences are essential to conducting assessments of undiscovered mineral resources. In the USGS's (U.S. Geological Survey) global assessment of undiscovered resources of copper, potash, and the platinum-group elements (PGE), only a few mineral deposit types will be evaluated. For example, only porphyry-copper and sediment-hosted copper deposits will be considered for the copper assessment. To support the global assessment, the USGS prepared comprehensive compilations of the occurrences of these two deposit types in order to develop grade and tonnage models and delineate permissive areas for undiscovered deposits of those types. This publication identifies previously published databases and database records that describe PGE, nickel, and chromium deposits and occurrences. Nickel and chromium were included in this overview because of the close association of PGE with nickel and chromium mineralization. Users of this database will need to refer to the original databases for detailed information about the deposits and occurrences. This information will be used to develop a current and comprehensive global database of PGE deposits and occurrences.

  9. Technologies to Support Community-Dwelling Persons With Dementia: A Position Paper on Issues Regarding Development, Usability, Effectiveness and Cost-Effectiveness, Deployment, and Ethics

    PubMed Central

    Innes, Anthea; Mountain, Gail; Robinson, Louise; van der Roest, Henriëtte; García-Casal, J Antonio; Gove, Dianne; Thyrian, Jochen René; Evans, Shirley; Dröes, Rose-Marie; Kelly, Fiona; Kurz, Alexander; Casey, Dympna; Szcześniak, Dorota; Dening, Tom; Craven, Michael P; Span, Marijke; Felzmann, Heike; Tsolaki, Magda; Franco-Martin, Manuel

    2017-01-01

    Background: With the expected increase in the numbers of persons with dementia, providing timely, adequate, and affordable care and support is challenging. Assistive and health technologies may be a valuable contribution in dementia care, but new challenges may emerge. Objective: The aim of our study was to review the state of the art of technologies for persons with dementia regarding issues on development, usability, effectiveness and cost-effectiveness, deployment, and ethics in 3 fields of application of technologies: (1) support with managing everyday life, (2) support with participating in pleasurable and meaningful activities, and (3) support with dementia health and social care provision. The study also aimed to identify gaps in the evidence and challenges for future research. Methods: Reviews of literature and expert opinions were used in our study. Literature searches were conducted on usability, effectiveness and cost-effectiveness, and ethics using PubMed, Embase, CINAHL, and PsycINFO databases with no time limit. Selection criteria in our selected technology fields were reviews in English for community-dwelling persons with dementia. Regarding deployment issues, searches were done in Health Technology Assessment databases. Results: According to our results, persons with dementia want to be included in the development of technologies; there is little research on the usability of assistive technologies; various benefits are reported but are mainly based on low-quality studies; barriers to deployment of technologies in dementia care were identified; and ethical issues were raised by researchers but often not studied. Many challenges remain, such as including the target group more often in development, performing more high-quality studies on usability and effectiveness and cost-effectiveness, creating and having access to high-quality datasets on existing technologies to enable adequate deployment of technologies in dementia care, and ensuring that ethical issues are considered an important topic for researchers to include in their evaluation of assistive technologies. Conclusions: Based on these findings, various actions are recommended for development, usability, effectiveness and cost-effectiveness, deployment, and ethics of assistive and health technologies across Europe. These include avoiding replication of technology development that is unhelpful or ineffective and focusing on how technologies succeed in addressing individual needs of persons with dementia. Furthermore, it is suggested to include these recommendations in national and international calls for funding and assistive technology research programs. Finally, practitioners, policy makers, care insurers, and care providers should work together with technology enterprises and researchers to prepare strategies for the implementation of assistive technologies in different care settings. This may help future generations of persons with dementia to utilize available and affordable technologies and, ultimately, to benefit from them. PMID:28582262

  10. Draft secure medical database standard.

    PubMed

    Pangalos, George

    2002-01-01

    Medical database security is a particularly important issue for all healthcare establishments. Medical information systems are intended to support a wide range of pertinent health issues today, for example: assuring the quality of care, supporting effective management of health services institutions, monitoring and containing the cost of care, implementing technology in care without violating social values, ensuring the equity and availability of care, and preserving humanity despite the proliferation of technology. In this context, medical database security aims primarily to support high availability, accuracy and consistency of the stored data, medical professional secrecy and confidentiality, and the protection of the privacy of the patient. These properties, though technical in nature, basically require that the system is actually helpful for medical care and not harmful to patients. These latter properties require in turn not only that fundamental ethical principles are not violated by employing database systems, but that they are effectively enforced by technical means. This document reviews existing and emerging work on the security of medical database systems. It presents in detail the problems and requirements related to medical database security, and addresses the problems of medical database security policies, secure design methodologies and implementation techniques. It also describes the current legal framework and regulatory requirements for medical database security, and examines in detail the issue of medical database security guidelines. Current national and international efforts in the area are studied, and an overview of research work in the area is given. The document also presents in detail the most complete set of security guidelines, to our knowledge, for the development and operation of medical database systems.

  11. Advanced technologies for scalable ATLAS conditions database access on the grid

    NASA Astrophysics Data System (ADS)

    Basset, R.; Canali, L.; Dimitrov, G.; Girone, M.; Hawkings, R.; Nevski, P.; Valassi, A.; Vaniachine, A.; Viegas, F.; Walker, R.; Wong, A.

    2010-04-01

    During massive data reprocessing operations, an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic workflows, ATLAS database scalability tests provided feedback for Conditions DB software optimization and allowed precise determination of the required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing, characterized by peak loads that can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent job rates. This was achieved through coordinated database stress tests performed in a series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of the database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that server overload conditions can be safely avoided in a production environment. Our analysis of server performance under stress tests indicates that Conditions DB data access is limited by disk I/O throughput. An unacceptable side effect of disk I/O saturation is a degradation of the WLCG 3D Services that update Conditions DB data at all ten ATLAS Tier-1 sites using Oracle Streams technology. To avoid such bottlenecks we prototyped and tested a novel approach to database peak-load avoidance in Grid computing. Our approach is based upon the proven idea of pilot job submission on the Grid: instead of the actual query, an ATLAS utility library first sends a pilot query to the database server.
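
    The pilot-query idea (probe the server with a cheap query before submitting the expensive one, and back off when the probe is slow) might be sketched as follows. The latency threshold and backoff policy are assumptions for illustration, not details of ATLAS's actual utility library.

```python
import random
import time

PILOT_LATENCY_LIMIT = 0.5   # seconds; assumed threshold, not ATLAS's value

def run_pilot(probe, max_retries=5, backoff=1.0):
    """Send a cheap probe query first; signal that the real query is safe
    only once the probe answers quickly, otherwise back off so the job
    does not add to the server's peak load."""
    for attempt in range(max_retries):
        start = time.monotonic()
        probe()
        latency = time.monotonic() - start
        if latency < PILOT_LATENCY_LIMIT:
            return True          # server healthy: safe to send real query
        # Exponential backoff with jitter before re-probing.
        time.sleep(backoff * (2 ** attempt) * random.uniform(0.5, 1.5))
    return False                 # server stayed overloaded; give up

# Usage sketch: a no-op stands in for a trivial probe such as "SELECT 1".
if run_pilot(lambda: None):
    print("pilot OK - submitting full conditions-data query")
```

    The jittered backoff matters: without it, many Grid jobs that probed an overloaded server at the same moment would all retry at the same moment, recreating the very peak the pilot query is meant to avoid.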

  12. A Collaborative Data Scientist Framework for both Primary and Secondary Education

    NASA Astrophysics Data System (ADS)

    Branch, B. D.

    2011-12-01

    The earth science data educational pipeline may be dependent on K-20 outcomes. Thus, a challenge for earth science and space informatics education, and for generational knowledge transfer, may be a non-existent or cost-prohibitive pedagogical earth science reality. Addressing it may require a technological infrastructure, a validated assessment system, and collaboration among stakeholders of primary and secondary education. Moreover, the K-20 paradigms may maintain separate science and technology preparation standards where fundamental informatics requires an integrated pedagogical approach. In simple terms, a collaborative earth science training program for a subset of disciplines may be a pragmatic means of formal data scientist training that is sustainable as technology evolves and data-sharing policy becomes a norm of data literacy. As the Global Earth Observation System of Systems (GEOSS) has a 10-year work plan, educational stakeholders may find funding avenues if governments can see earth science data training as a valuable job skill and societal need. This proposed framework suggests that ontological literacy, database management, storage management, and data-sharing capability are its fundamental informatics concepts, and that societal engagement is central to it. All STEM disciplines could adopt an integrated approach to mature such learning metrics in their matriculation and assessment systems. The NSF's EarthCube and Europe's WISE may represent best cases for implementing such a framework.

  13. Development and application of basis database for materials life cycle assessment in china

    NASA Astrophysics Data System (ADS)

    Li, Xiaoqing; Gong, Xianzheng; Liu, Yu

    2017-03-01

    As a data-intensive method, materials life cycle assessment (MLCA) requires high-quality environmental burden data as an important premise, and the reliability of the data directly influences the reliability of the assessment results and their applicability. Therefore, building a Chinese MLCA database provides the basic data and technical support needed for carrying out and improving LCA practice. Firstly, recent progress on databases related to materials life cycle assessment research and development is introduced. Secondly, in accordance with the requirements of the ISO 14040 series of standards, the database framework and the main datasets of the materials life cycle assessment are studied. Thirdly, an MLCA data platform based on big data is developed. Finally, future research directions are proposed and discussed.

  14. DSSTox and Chemical Information Technologies in Support of PredictiveToxicology

    EPA Science Inventory

    The EPA NCCT Distributed Structure-Searchable Toxicity (DSSTox) Database project initially focused on the curation and publication of high-quality, standardized, chemical structure-annotated toxicity databases for use in structure-activity relationship (SAR) modeling. In recent y...

  15. Teaching Historians with Databases.

    ERIC Educational Resources Information Center

    Burton, Vernon

    1993-01-01

    Asserts that, although pressures to publish have detracted from the quality of teaching at the college level, recent innovations in educational technology have created opportunities for instructional improvement. Describes the use of computer-assisted instruction and databases in college-level history courses. (CFR)

  16. SPECIES DATABASES AND THE BIOINFORMATICS REVOLUTION.

    EPA Science Inventory

    Biological databases are having a growth spurt. Much of this results from research in genetics and biodiversity, coupled with fast-paced developments in information technology. The revolution in bioinformatics, defined by Sugden and Pennisi (2000) as the "tools and techniques for...

  17. Evaluation of "shotgun" proteomics for identification of biological threat agents in complex environmental matrixes: experimental simulations.

    PubMed

    Verberkmoes, Nathan C; Hervey, W Judson; Shah, Manesh; Land, Miriam; Hauser, Loren; Larimer, Frank W; Van Berkel, Gary J; Goeringer, Douglas E

    2005-02-01

    There is currently a great need for rapid detection and positive identification of biological threat agents, as well as microbial species in general, directly from complex environmental samples. This need is most urgent in the area of homeland security, but also extends into the medical, environmental, and agricultural sciences. Mass-spectrometry-based analysis is one of the leading technologies in the field, with a diversity of methodologies for biothreat detection. Over the past few years, "shotgun" proteomics has become one method of choice for the rapid analysis of complex protein mixtures by mass spectrometry. Recently, it was demonstrated that this methodology is capable of distinguishing a target species against a large database of background species from a single-component sample or from dual-component mixtures at roughly equal concentrations. Here, we examine the potential of shotgun proteomics to analyze a target species in a background of four contaminant species. We tested the capability of a common commercial mass-spectrometry-based shotgun proteomics platform for the detection of the target species (Escherichia coli) at four different concentrations and four different analysis time points. We also tested the effect of database size on positive identification of the four microbes used in this study by testing a small (13-species) database and a large (261-species) database. The results clearly indicated that this technology could easily identify the target species at 20% in the background mixture in a 60, 120, 180, or 240 min analysis with the small database. The results also indicated that the target species could easily be identified at 20% or 6%, but could not be identified at 0.6% or 0.06%, in either a 240 min analysis or a 30 h analysis with the small database. 
With the large database, the effect on the target species was severe: detection above background was impossible at any concentration used in this study, though the three other microbes used in this study were clearly identified above background when analyzed with the large database. This study points to the potential application of this technology for biological threat agent detection, but highlights many areas of needed research before the technology will be useful on real-world samples.

  18. Mapping analysis and planning system for the John F. Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Hall, C. R.; Barkaszi, M. J.; Provancha, M. J.; Reddick, N. A.; Hinkle, C. R.; Engel, B. A.; Summerfield, B. R.

    1994-01-01

    Environmental management, impact assessment, research and monitoring are multidisciplinary activities which are ideally suited to incorporate a multi-media approach to environmental problem solving. Geographic information systems (GIS), simulation models, neural networks and expert-system software are some of the advancing technologies being used for data management, query, analysis and display. At the 140,000 acre John F. Kennedy Space Center, the Advanced Software Technology group has been supporting development and implementation of a program that integrates these and other rapidly evolving hardware and software capabilities into a comprehensive Mapping, Analysis and Planning System (MAPS) based in a workstation/local area network environment. An expert-system shell is being developed to link the various databases to guide users through the numerous stages of a facility siting and environmental assessment. The expert-system shell approach is appealing for its ease of data access by management-level decision makers while maintaining the involvement of the data specialists. This, as well as increased efficiency and accuracy in data analysis and report preparation, can benefit any organization involved in natural resources management.

  19. Forty years of improvements in European air quality: the role of EU policy-industry interplay

    NASA Astrophysics Data System (ADS)

    Crippa, M.; Janssens-Maenhout, G.; Dentener, F.; Guizzardi, D.; Sindelarova, K.; Muntean, M.; Van Dingenen, R.; Granier, C.

    2015-07-01

    The EDGAR (Emissions Database for Global Atmospheric Research) v4.3 global anthropogenic emissions inventory of several gaseous (SO2, NOx, CO, non-methane volatile organic compounds (NMVOCs) and NH3) and particulate (PM10, PM2.5, black and organic carbon (BC and OC)) air pollutants for the period 1970-2010 is used to develop retrospective air pollution emission scenarios to quantify the roles and contributions of changes in fuel consumption, technology, and end-of-pipe emission reduction measures, and their resulting impact on health and crop yields. This database presents changes in activity data, fuels and air pollution abatement technology for the past 4 decades, using international statistics and following guidelines for bottom-up emission inventories at the Tier 1 and Tier 2 levels with region-specific default values. With two further retrospective scenarios we assess (1) the impact of technology and end-of-pipe (EOP) reduction measures in the European Union (EU), by assuming a stagnation of technology with constant emission factors from 1970 onwards and no further abatement measures or improvement in European emissions standards, but fuel consumption evolving at its historical pace, and (2) the impact of increased fuel consumption, by assuming energy use frozen at 1970 levels but with historical technological development and end-of-pipe reductions. Our scenario analysis focuses on the three most important and most regulated sectors (power generation, the manufacturing industry and road transport), which are subject to multi-pollutant EU Air Quality regulations. If technology and European EOP reduction measures had stagnated at 1970 levels, EU air quality in 2010 would have suffered from 129 % higher SO2, 71 % higher NOx and 69 % higher PM2.5 emissions, demonstrating the large role of technology in reducing emissions in 2010. 
However, if fuel consumption had remained constant at 1970 levels, the EU would have benefited from current technology and emission control standards, with NOx emissions reduced by a further 13 %. Such further savings are not observed for SO2 and PM2.5: if the EU consumed the same amount of fuels as in 1970 but with current technology and emission control standards, the emissions of SO2 and PM2.5 would be 42 % and 10 % higher, respectively. This scenario shows the importance for air quality of abandoning heavy residual fuel oil and shifting fuel types (e.g., from coal to gas) in the EU. A reduced-form model, TM5-FASST (Fast Screening Scenario Tool based on the global chemical Transport Model 5), is applied to calculate regional and global levels of aerosol and ozone concentrations and to assess the impact of air quality improvements on human health and crop yield loss, showing substantial impacts of the export of EU technologies and standards to other world regions.
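The counterfactual scenarios above rest on the standard bottom-up inventory identity: emissions equal activity data times an emission factor, less whatever end-of-pipe (EOP) controls remove. A minimal sketch, using entirely hypothetical numbers rather than EDGAR values:

```python
# Illustrative Tier 1 bottom-up emission calculation, as used in inventories
# like EDGAR. All figures below are hypothetical, for demonstration only.

def emissions(activity, emission_factor, eop_reduction=0.0):
    """Annual emissions (kt) = fuel use (PJ) x emission factor (kt/PJ),
    reduced by the fraction removed by end-of-pipe controls."""
    return activity * emission_factor * (1.0 - eop_reduction)

# Hypothetical SO2 example for a power-generation sector:
activity_2010 = 500.0   # PJ of fuel burned in 2010
ef_1970 = 0.60          # kt SO2 per PJ with 1970 combustion technology
ef_2010 = 0.30          # kt SO2 per PJ with 2010 combustion technology
eop_2010 = 0.55         # fraction removed by 2010 flue-gas desulfurization

actual = emissions(activity_2010, ef_2010, eop_2010)
# Scenario (1): technology frozen at 1970 emission factors, no EOP controls,
# but historical (2010) fuel consumption.
frozen_tech = emissions(activity_2010, ef_1970, 0.0)

print(f"2010 emissions with 2010 technology: {actual:.1f} kt")
print(f"2010 emissions with technology frozen at 1970: {frozen_tech:.1f} kt")
print(f"Relative increase: {100 * (frozen_tech / actual - 1):.0f} %")
```

The same identity, evaluated per sector and per fuel with region-specific factors, is what the two retrospective scenarios vary: scenario (1) freezes the emission factors, scenario (2) freezes the activity data.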

  20. Oak Ridge Reservation Environmental Protection Rad Neshaps Radionuclide Inventory Web Database and Rad Neshaps Source and Dose Database

    DOE PAGES

    Scofield, Patricia A.; Smith, Linda Lenell; Johnson, David N.

    2017-07-01

    The U.S. Environmental Protection Agency promulgated national emission standards for emissions of radionuclides other than radon from US Department of Energy facilities in Chapter 40 of the Code of Federal Regulations (CFR) 61, Subpart H. This regulatory standard limits the annual effective dose that any member of the public can receive from Department of Energy facilities to 0.1 mSv. As defined in the preamble of the final rule, all of the facilities on the Oak Ridge Reservation, i.e., the Y–12 National Security Complex, Oak Ridge National Laboratory, East Tennessee Technology Park, and any other U.S. Department of Energy operations on Oak Ridge Reservation, combined, must meet the annual dose limit of 0.1 mSv. At Oak Ridge National Laboratory, there are monitored sources and numerous unmonitored sources. To maintain radiological source and inventory information for these unmonitored sources, e.g., laboratory hoods, equipment exhausts, and room exhausts not currently venting to monitored stacks on the Oak Ridge National Laboratory campus, the Environmental Protection Rad NESHAPs Inventory Web Database was developed. This database is updated annually and is used to compile emissions data for the annual Radionuclide National Emission Standards for Hazardous Air Pollutants (Rad NESHAPs) report required by 40 CFR 61.94. It also provides supporting documentation for facility compliance audits. In addition, a Rad NESHAPs source and dose database was developed to import the source and dose summary data from Clean Air Act Assessment Package—1988 computer model files. As a result, this database provides Oak Ridge Reservation and facility-specific source inventory; doses associated with each source and facility; and total doses for the Oak Ridge Reservation.

  1. Classifying the bacterial gut microbiota of termites and cockroaches: A curated phylogenetic reference database (DictDb).

    PubMed

    Mikaelyan, Aram; Köhler, Tim; Lampert, Niclas; Rohland, Jeffrey; Boga, Hamadi; Meuser, Katja; Brune, Andreas

    2015-10-01

    Recent developments in sequencing technology have given rise to a large number of studies that assess bacterial diversity and community structure in termite and cockroach guts based on large amplicon libraries of 16S rRNA genes. Although these studies have revealed important ecological and evolutionary patterns in the gut microbiota, classification of the short sequence reads is limited by the taxonomic depth and resolution of the reference databases used in the respective studies. Here, we present a curated reference database for accurate taxonomic analysis of the bacterial gut microbiota of dictyopteran insects. The Dictyopteran gut microbiota reference Database (DictDb) is based on the Silva database but was significantly expanded by the addition of clones from 11 mostly unexplored termite and cockroach groups, which increased the inventory of bacterial sequences from dictyopteran guts by 26%. The taxonomic depth and resolution of DictDb was significantly improved by a general revision of the taxonomic guide tree for all important lineages, including a detailed phylogenetic analysis of the Treponema and Alistipes complexes, the Fibrobacteres, and the TG3 phylum. The performance of this first documented version of DictDb (v. 3.0) using the revised taxonomic guide tree in the classification of short-read libraries obtained from termites and cockroaches was highly superior to that of the current Silva and RDP databases. DictDb uses an informative nomenclature that is consistent with the literature also for clades of uncultured bacteria and provides an invaluable tool for anyone exploring the gut community structure of termites and cockroaches. Copyright © 2015 Elsevier GmbH. All rights reserved.

  2. Development and application of a database of food ingredient fraud and economically motivated adulteration from 1980 to 2010.

    PubMed

    Moore, Jeffrey C; Spink, John; Lipp, Markus

    2012-04-01

    Food ingredient fraud and economically motivated adulteration are emerging risks, but a comprehensive compilation of information about known problematic ingredients and detection methods does not currently exist. The objectives of this research were to collect such information from publicly available articles in scholarly journals and general media, organize it into a database, and review and analyze the data to identify trends. The result is a database that will be published in the US Pharmacopeial Convention's Food Chemicals Codex, 8th edition; it includes 1305 records, of which 1000 contain analytical methods, collected from 677 references. Olive oil, milk, honey, and saffron were the most common targets for adulteration reported in scholarly journals, and potentially harmful issues identified include spices diluted with lead chromate and lead tetraoxide, substitution of Chinese star anise with toxic Japanese star anise, and melamine adulteration of high-protein-content foods. High-performance liquid chromatography and infrared spectroscopy were the most common analytical detection procedures, and chemometrics data analysis was used in a large number of reports. Future expansion of this database will include additional publicly available articles published before 1980 and in other languages, as well as data outside the public domain. The authors recommend in-depth analyses of individual incidents. This report describes the development and application of a database of food ingredient fraud issues from publicly available references. The database provides baseline information and data useful to governments, agencies, and individual companies assessing the risks of specific products produced in specific regions as well as products distributed and sold in other regions. In addition, the report describes current analytical technologies for detecting food fraud and identifies trends and developments. 
© 2012 US Pharmacopeia. Journal of Food Science © 2012 Institute of Food Technologists®.

  3. Oak Ridge Reservation Environmental Protection Rad Neshaps Radionuclide Inventory Web Database and Rad Neshaps Source and Dose Database.

    PubMed

    Scofield, Patricia A; Smith, Linda L; Johnson, David N

    2017-07-01

    The U.S. Environmental Protection Agency promulgated national emission standards for emissions of radionuclides other than radon from US Department of Energy facilities in Chapter 40 of the Code of Federal Regulations (CFR) 61, Subpart H. This regulatory standard limits the annual effective dose that any member of the public can receive from Department of Energy facilities to 0.1 mSv. As defined in the preamble of the final rule, all of the facilities on the Oak Ridge Reservation, i.e., the Y-12 National Security Complex, Oak Ridge National Laboratory, East Tennessee Technology Park, and any other U.S. Department of Energy operations on Oak Ridge Reservation, combined, must meet the annual dose limit of 0.1 mSv. At Oak Ridge National Laboratory, there are monitored sources and numerous unmonitored sources. To maintain radiological source and inventory information for these unmonitored sources, e.g., laboratory hoods, equipment exhausts, and room exhausts not currently venting to monitored stacks on the Oak Ridge National Laboratory campus, the Environmental Protection Rad NESHAPs Inventory Web Database was developed. This database is updated annually and is used to compile emissions data for the annual Radionuclide National Emission Standards for Hazardous Air Pollutants (Rad NESHAPs) report required by 40 CFR 61.94. It also provides supporting documentation for facility compliance audits. In addition, a Rad NESHAPs source and dose database was developed to import the source and dose summary data from Clean Air Act Assessment Package-1988 computer model files. This database provides Oak Ridge Reservation and facility-specific source inventory; doses associated with each source and facility; and total doses for the Oak Ridge Reservation.

  4. Oak Ridge Reservation Environmental Protection Rad Neshaps Radionuclide Inventory Web Database and Rad Neshaps Source and Dose Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scofield, Patricia A.; Smith, Linda Lenell; Johnson, David N.

    The U.S. Environmental Protection Agency promulgated national emission standards for emissions of radionuclides other than radon from US Department of Energy facilities in Chapter 40 of the Code of Federal Regulations (CFR) 61, Subpart H. This regulatory standard limits the annual effective dose that any member of the public can receive from Department of Energy facilities to 0.1 mSv. As defined in the preamble of the final rule, all of the facilities on the Oak Ridge Reservation, i.e., the Y–12 National Security Complex, Oak Ridge National Laboratory, East Tennessee Technology Park, and any other U.S. Department of Energy operations on Oak Ridge Reservation, combined, must meet the annual dose limit of 0.1 mSv. At Oak Ridge National Laboratory, there are monitored sources and numerous unmonitored sources. To maintain radiological source and inventory information for these unmonitored sources, e.g., laboratory hoods, equipment exhausts, and room exhausts not currently venting to monitored stacks on the Oak Ridge National Laboratory campus, the Environmental Protection Rad NESHAPs Inventory Web Database was developed. This database is updated annually and is used to compile emissions data for the annual Radionuclide National Emission Standards for Hazardous Air Pollutants (Rad NESHAPs) report required by 40 CFR 61.94. It also provides supporting documentation for facility compliance audits. In addition, a Rad NESHAPs source and dose database was developed to import the source and dose summary data from Clean Air Act Assessment Package—1988 computer model files. As a result, this database provides Oak Ridge Reservation and facility-specific source inventory; doses associated with each source and facility; and total doses for the Oak Ridge Reservation.

  5. The Use of Technology to Advance HIV Prevention for Couples.

    PubMed

    Mitchell, Jason W

    2015-12-01

    The majority of HIV prevention studies and programs have targeted individuals or operated at the community level. This has also been the standard approach when incorporating technology (e.g., web-based, smartphones) to help improve HIV prevention efforts. The tides have turned for both approaches: greater attention is now focusing on couple-based HIV prevention and using technology to help improve these efforts for maximizing reach and potential impact. To assess the extent that technology has been used to help advance HIV prevention with couples, a literature review was conducted using four databases and included studies that collected data from 2000 to early 2015. Results from this review suggest that technology has primarily been used to help advance HIV prevention with couples as a tool for (1) recruitment and data collection and (2) intervention development. Challenges and limitations of conducting research (e.g., validity of dyadic data) along with future directions for how technology (e.g., mHealth, wearable sensors) can be used to advance HIV prevention with couples are then discussed. Given the growing and near ubiquitous use of the Internet and smartphones, further efforts in the realm of mHealth (e.g., applications or "apps") and eHealth are needed to develop novel couple-focused HIV-preventive interventions.

  6. Quantitative Analysis of Technological Innovation in Urology.

    PubMed

    Bhatt, Nikita R; Davis, Niall F; Dalton, David M; McDermott, Ted; Flynn, Robert J; Thomas, Arun Z; Manecksha, Rustom P

    2018-01-01

    To assess major areas of technological innovation in urology in the last 20 years using patent and publication data. Patent and MEDLINE databases were searched between 1980 and 2012 electronically using the terms urology OR urological OR urologist AND "surgeon" OR "surgical" OR "surgery". The patent codes obtained were grouped in technology clusters, further analyzed with individual searches, and growth curves were plotted. Growth rates and patterns were analyzed, and patents were correlated with publications as a measure of scientific support and of clinical adoption. The initial search revealed 417 patents and 20,314 publications. The top 5 technology clusters in descending order were surgical instruments including urinary catheters, minimally invasive surgery (MIS), lasers, robotic surgery, and image guidance. MIS and robotic surgery were the most emergent clusters in the last 5 years. Publication and patent growth rates were closely correlated (Pearson coefficient 0.78, P <.01), but publication growth rate remained constantly higher than patent growth, suggesting validated scientific support for urologic innovation and adoption into clinical practice. Patent metrics identify emergent technological innovations and such trends are valuable to understand progress in the field of urology. New surgical technologies like robotic surgery and MIS showed exponential growth in the last decade with good scientific vigilance. Copyright © 2017 Elsevier Inc. All rights reserved.
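The correlation reported above is an ordinary Pearson coefficient between yearly patent and publication counts. A minimal sketch with synthetic yearly counts (not the study's data):

```python
# Pearson correlation between two annual count series, computed from first
# principles. The yearly counts below are hypothetical illustrations.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient: covariance over the product of
    standard deviations (computed here without the 1/n factors, which cancel)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical yearly counts for one technology cluster:
patents_per_year = [3, 5, 8, 12, 18, 25, 33, 44]
publications_per_year = [120, 180, 260, 390, 520, 700, 950, 1200]

r = pearson(patents_per_year, publications_per_year)
print(f"Pearson r = {r:.2f}")  # near 1 for these jointly growing series
```

In the study's framing, a high r with publication growth consistently outpacing patent growth is read as scientific support keeping pace with (and validating) the patented innovation.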

  7. Mobile health in China: a review of research and programs in medical care, health education, and public health.

    PubMed

    Corpman, David W

    2013-01-01

    There are nearly 1 billion mobile phone subscribers in China. Health care providers, telecommunications companies, technology firms, and Chinese governmental organizations use existing mobile technology and social networks to improve patient-provider communication, promote health education and awareness, add efficiency to administrative practices, and enhance public health campaigns. This review of mobile health in China summarizes existing clinical research and public health text messaging campaigns while highlighting potential future areas of research and program implementation. Databases and search engines served as the primary means of gathering relevant resources. Included material largely consists of scientific articles and official reports that met predefined inclusion criteria. This review includes 10 reports of controlled studies that assessed the use of mobile technology in health care settings and 17 official reports of public health awareness campaigns that used text messaging. All source material was published between 2006 and 2011. The controlled studies suggested that mobile technology interventions significantly improved an array of health care outcomes. However, additional efforts are needed to refine mobile health research and better understand the applicability of mobile technology in China's health care settings. A vast potential exists for the expansion of mobile health in China, especially as costs decrease and increasingly sophisticated technology becomes more widespread.

  8. The effectiveness and cost-effectiveness of donepezil, galantamine, rivastigmine and memantine for the treatment of Alzheimer's disease (review of Technology Appraisal No. 111): a systematic review and economic model.

    PubMed

    Bond, M; Rogers, G; Peters, J; Anderson, R; Hoyle, M; Miners, A; Moxham, T; Davis, S; Thokala, P; Wailoo, A; Jeffreys, M; Hyde, C

    2012-01-01

    Alzheimer’s disease (AD) is the most commonly occurring form of dementia. It is predominantly a disease of later life, affecting 5% of those over 65 in the UK. The aim was to review and update guidance to the NHS in England and Wales on the clinical effectiveness and cost-effectiveness of donepezil, galantamine, rivastigmine [acetylcholinesterase inhibitors (AChEIs)] and memantine within their licensed indications for the treatment of AD, which was issued in November 2006 (amended September 2007 and August 2009). Electronic databases were searched for systematic reviews and/or meta-analyses, randomised controlled trials (RCTs) and ongoing research in November 2009 and updated in March 2010; this updated search revealed no new includable studies. The databases searched included The Cochrane Library (2009 Issue 4, Cochrane Database of Systematic Reviews and Cochrane Central Register of Controlled Trials), MEDLINE, MEDLINE In-Process & Other Non-Indexed Citations, EMBASE, PsycINFO, EconLit, ISI Web of Science Databases--Science Citation Index, Conference Proceedings Citation Index, and BIOSIS; the Centre for Reviews and Dissemination (CRD) databases--NHS Economic Evaluation Database, Health Technology Assessment, and Database of Abstracts of Reviews of Effects. The clinical effectiveness systematic review was undertaken following the principles published by the NHS CRD. We included RCTs whose population was people with AD. The intervention and comparators depended on disease severity, measured by the Mini Mental State Examination (MMSE). Interventions: mild AD (MMSE 21-26)--donepezil, galantamine and rivastigmine; moderate AD (MMSE 10-20)--donepezil, galantamine, rivastigmine and memantine; severe AD (MMSE < 10)--memantine. Comparators: mild AD (MMSE 21-26)--placebo or best supportive care (BSC); moderate AD (MMSE 10-20)--donepezil, galantamine, rivastigmine, memantine, placebo or BSC; severe AD (MMSE < 10)--placebo or BSC. 
The outcomes were clinical, global, functional, behavioural, quality of life, adverse events, costs and cost-effectiveness. Where appropriate, data were pooled using pair-wise meta-analysis, multiple outcome measures, meta-regression and mixed-treatment comparisons. The decision model was based broadly on the structure of the three-state Markov model described in the previous technology assessment report, based upon time to institutionalisation, parameterised with updated estimates of effectiveness, costs and utilities. Notwithstanding the uncertainty of our results, we found in the base case that the AChEIs are probably cost saving at a willingness-to-pay (WTP) of £30,000 per quality-adjusted life-year (QALY) for people with mild-to-moderate AD. For this class of drugs, there is a > 99% probability that the AChEIs are more cost-effective than BSC. These analyses assume that the AChEIs have no effect on survival. For the AChEIs, in people with mild to moderate AD, the probabilistic sensitivity analyses suggested that donepezil is the most cost-effective, with a 28% probability of being the most cost-effective option at a WTP of £30,000 per QALY (27% at a WTP of £20,000 per QALY). In the deterministic results, donepezil dominates the other drugs and BSC, which, along with rivastigmine patches, are associated with greater costs and fewer QALYs. Thus, although galantamine has a slightly cheaper total cost than donepezil (£69,592 vs £69,624), the slightly greater QALY gains from donepezil (1.616 vs 1.617) are enough for donepezil to dominate galantamine. The probability that memantine is cost-effective in a moderate to severe cohort compared with BSC at a WTP of £30,000 per QALY is 38% (and 28% at a WTP of £20,000 per QALY). The deterministic ICER for memantine is £32,100 per QALY and the probabilistic ICER is £36,700 per QALY. Trials were of 6 months maximum follow-up, lacked reporting of key outcomes, provided no subgroup analyses and used insensitive measures. 
Searches were limited to English-language publications. The model does not include behavioural symptoms, and there is uncertainty about the model structure and parameters. The additional clinical effectiveness evidence identified continues to suggest clinical benefit from the AChEIs in alleviating AD symptoms, although there is debate about the magnitude of the effect. Although there is also new evidence on the effectiveness of memantine, it remains less supportive of this drug’s use than the evidence for AChEIs. The conclusions concerning cost-effectiveness are quite different from the previous assessment. This is because both the changes in effectiveness and costs between drug use and non-drug use underlying the ICERs are very small. This leads to highly uncertain results, which are very sensitive to change. RESEARCH PRIORITIES: RCTs to include mortality, time to institutionalisation and quality of life, powered for subgroup analysis. Funding: the National Institute for Health Research Health Technology Assessment programme.
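The cost-effectiveness logic running through this assessment reduces to the incremental cost-effectiveness ratio (ICER) compared against a willingness-to-pay (WTP) threshold. A sketch with hypothetical cost and QALY totals (the report states the resulting ICERs, not the underlying increments used here):

```python
# ICER and WTP decision rule, sketched with hypothetical totals.

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost per QALY gained by the new strategy vs its comparator."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

def cost_effective(icer_value, wtp):
    """A strategy is deemed cost-effective when its ICER falls below the WTP
    threshold (e.g. £20,000-£30,000 per QALY in NHS appraisals)."""
    return icer_value < wtp

# Hypothetical drug vs best supportive care (BSC):
drug_cost, drug_qalys = 72_000.0, 1.45   # lifetime cost (£) and QALYs
bsc_cost, bsc_qalys = 65_000.0, 1.20

ratio = icer(drug_cost, drug_qalys, bsc_cost, bsc_qalys)
print(f"ICER = £{ratio:,.0f} per QALY")               # £7,000 / 0.25 QALYs
print(f"Cost-effective at £30,000 WTP: {cost_effective(ratio, 30_000)}")
```

"Dominance", as used above for donepezil, is the special case where the new strategy is both cheaper and yields more QALYs, so no ratio needs to be computed.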

  9. An indoor positioning technology in the BLE mobile payment system

    NASA Astrophysics Data System (ADS)

    Han, Tiantian; Ding, Lei

    2017-05-01

    In a mobile payment system for large supermarkets, the core payment function is implemented over Bluetooth Low Energy (BLE); the same infrastructure can provide value-added services through indoor positioning. The technique collects Bluetooth RSSI values to build a fingerprint database of sampling points. APs obtain the RSSI of the customer's Bluetooth module, and the k-nearest-neighbor algorithm matches the measured values against the fingerprint database. This helps businesses locate customers within the mall and, combined with the settlement amount of each customer's purchases, analyze customer behavior. During signal collection, the distribution of RSSI at the sampling points is analyzed and the values are filtered. A laboratory implementation of the system demonstrates its feasibility.
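The fingerprint-matching step described above can be sketched as follows; the beacon layout, RSSI values, and choice of k are hypothetical, not taken from the paper:

```python
# RSSI fingerprint positioning with k-nearest neighbors (k-NN).
# Offline phase: a fingerprint database maps sampling-point coordinates (m)
# to the mean RSSI (dBm) observed from each of three BLE beacons there.
from math import dist

fingerprints = {
    (0.0, 0.0): [-45, -70, -80],
    (5.0, 0.0): [-60, -55, -75],
    (0.0, 5.0): [-62, -72, -58],
    (5.0, 5.0): [-75, -60, -52],
}

def locate(scan, k=2):
    """Online phase: rank sampling points by Euclidean distance between
    their stored fingerprint and the live RSSI scan, then estimate the
    position as the centroid of the k nearest points."""
    ranked = sorted(fingerprints, key=lambda p: dist(fingerprints[p], scan))
    nearest = ranked[:k]
    x = sum(p[0] for p in nearest) / k
    y = sum(p[1] for p in nearest) / k
    return (x, y)

print(locate([-58, -57, -74]))  # → (2.5, 0.0): between (5,0) and (0,0)
```

In practice the raw RSSI stream is filtered first (the abstract mentions analyzing the RSSI distribution at each sampling point), since multipath fading makes individual readings noisy.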

  10. CARDIOVASCULAR SCREENING OF YOUNG ATHLETES: A REVIEW OF ECONOMIC EVALUATIONS.

    PubMed

    Gerkens, Sophie; Van Brabandt, Hans; Desomer, Anja; Leonard, Christian; Neyt, Mattias

    2017-01-01

    Some experts have promoted preparticipation cardiovascular screening programs for young athletes and have claimed that such programs are cost-effective, without critically analyzing the studies supporting this claim. In this systematic review, a critical assessment of economic evaluations of these programs is performed to determine whether they really provide value for money. A systematic review of economic evaluations was performed on December 24, 2014. Web sites of health technology assessment agencies, the Cochrane Database of Systematic Reviews, the National Health Service Economic Evaluation Database of the Cochrane Library, EMBASE, Medline, PsycINFO, and EconLit were searched to retrieve (reviews of) economic evaluations. No language or time restrictions were imposed and predefined selection criteria were used. Selected studies were critically assessed applying a structured data extraction sheet. Five relevant economic evaluations were critically assessed. Results of these studies were mixed. However, those in favor of screening made incorrect methodological choices, the most important being the failure to include a no-screening alternative as comparator. Compared with no screening, the other strategies (history and physical examination, or history and physical examination plus electrocardiogram) were not considered cost-effective. Results of primary economic evaluations should not be blindly copied without critical assessment. Economic evaluations in this field lack the support of robust evidence. Negative consequences of screening (false positive findings, overtreatment) should also be taken into account and may cause more harm than good. A mass screening of young athletes for cardiovascular diseases does not provide value for money and should be discouraged.

  11. Assistive technology for ultrasound-guided central venous catheter placement.

    PubMed

    Ikhsan, Mohammad; Tan, Kok Kiong; Putra, Andi Sudjana

    2018-01-01

    This study evaluated the existing technology used to improve the safety and ease of ultrasound-guided central venous catheterization. Electronic database searches were conducted in Scopus, IEEE, Google Patents, and relevant conference databases (SPIE, MICCAI, and IEEE conferences) for related articles on assistive technology for ultrasound-guided central venous catheterization. A total of 89 articles were examined and pointed to several fields that are currently the focus of improvements to ultrasound-guided procedures. These include improving needle visualization, needle guides and localization technology, image processing algorithms to enhance and segment important features within the ultrasound image, robotic assistance using probe-mounted manipulators, and improving procedure ergonomics through in situ projections of important information. Probe-mounted robotic manipulators provide a promising avenue for assistive technology developed for freehand ultrasound-guided percutaneous procedures. However, there is currently a lack of clinical trials to validate the effectiveness of these devices.

  12. Technology in the Public Library: Results from the 1992 PLDS Survey of Technology.

    ERIC Educational Resources Information Center

    Fidler, Linda M.; Johnson, Debra Wilcox

    1994-01-01

    Discusses and compares the incorporation of technology by larger public libraries in Canada and the United States. Technology mentioned includes online public access catalogs; remote and local online database searching; microcomputers and software for public use; and fax, voice mail, and Telecommunication Devices for the Deaf and Teletype writer…

  13. The ADAMS interactive interpreter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rietscha, E.R.

    1990-12-17

    The ADAMS (Advanced DAta Management System) project is exploring next generation database technology. Database management does not follow the usual programming paradigm. Instead, the database dictionary provides an additional name space environment that should be interactively created and tested before writing application code. This document describes the implementation and operation of the ADAMS Interpreter, an interactive interface to the ADAMS data dictionary and runtime system. The Interpreter executes individual statements of the ADAMS Interface Language, providing a fast, interactive mechanism to define and access persistent databases. 5 refs.

  14. Beyond the electronic textbook model: software techniques to make on-line educational content dynamic.

    PubMed

    Frank, M S; Dreyer, K

    2001-06-01

    We describe a working software technology that enables educators to incorporate their expertise and teaching style into highly interactive and Socratic educational material for distribution on the world wide web. A graphically oriented interactive authoring system was developed to enable the computer novice to create and store within a database his or her domain expertise in the form of electronic knowledge. The authoring system supports and facilitates the input and integration of several types of content, including free-form, stylized text, miniature and full-sized images, audio, and interactive questions with immediate feedback. The system enables the choreography and sequencing of these entities for display within a web page as well as the sequencing of entire web pages within a case-based or thematic presentation. Images or segments of text can be hyperlinked with point-and-click to other entities such as adjunctive web pages, audio, or other images, cases, or electronic chapters. Miniature (thumbnail) images are automatically linked to their full-sized counterparts. The authoring system contains a graphically oriented word processor, an image editor, and capabilities to automatically invoke and use external image-editing software such as Photoshop. The system works in both local area network (LAN) and internet-centric environments. An internal metalanguage (invisible to the author but stored with the content) was invented to represent the choreographic directives that specify the interactive delivery of the content on the world wide web. A database schema was developed to objectify and store both this electronic knowledge and its associated choreographic metalanguage. 
A database engine was combined with page-rendering algorithms in order to retrieve content from the database and deliver it on the web in a Socratic style, assess the recipient's current fund of knowledge, and provide immediate feedback, thus simulating in-person interaction with a human expert. This technology enables the educator to choreograph a stylized, interactive delivery of his or her message using multimedia components assembled in virtually any order, spanning any number of web pages for a given case or theme. An educator can thus exercise precise influence on specific learning objectives, embody his or her personal teaching style within the content, and ultimately enhance its educational impact. The described technology amplifies the efforts of the educator and provides a more dynamic and enriching learning environment for web-based education.
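The record above describes a schema that stores content entities (text, images, audio, questions) separately from the "choreographic" directives that sequence them on a page. A minimal sketch of that idea, using SQLite and invented table and column names (the record does not disclose its actual schema):

```python
import sqlite3

# Hypothetical, minimal schema: one table of content entities and one table
# of choreography directives ordering those entities within a page.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE entity (
    id      INTEGER PRIMARY KEY,
    kind    TEXT NOT NULL,      -- 'text' | 'image' | 'audio' | 'question'
    payload TEXT NOT NULL       -- body text, file path, or question data
);
CREATE TABLE choreography (
    page      INTEGER NOT NULL, -- web page within the case or theme
    seq       INTEGER NOT NULL, -- display order within the page
    entity_id INTEGER NOT NULL REFERENCES entity(id),
    PRIMARY KEY (page, seq)
);
""")
conn.execute("INSERT INTO entity VALUES (1, 'text', 'Chest X-ray: note the opacity')")
conn.execute("INSERT INTO entity VALUES (2, 'image', 'cxr_thumb_001.jpg')")
conn.execute("INSERT INTO choreography VALUES (1, 1, 1)")
conn.execute("INSERT INTO choreography VALUES (1, 2, 2)")

# The page-rendering step retrieves entities in choreographed order.
rows = conn.execute("""
    SELECT e.kind, e.payload FROM choreography c
    JOIN entity e ON e.id = c.entity_id
    WHERE c.page = 1 ORDER BY c.seq
""").fetchall()
print(rows)
```

Keeping the ordering directives in their own table is what lets an author re-choreograph a presentation without touching the content itself.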

  15. National security and national competitiveness: Open source solutions; NASA requirements and capabilities

    NASA Technical Reports Server (NTRS)

    Cotter, Gladys A.

    1993-01-01

    Foreign competitors are challenging the world leadership of the U.S. aerospace industry, and increasingly tight budgets everywhere make international cooperation in aerospace science necessary. The NASA STI Program has as part of its mission to support NASA R&D, and to that end has developed a knowledge base of aerospace-related information known as the NASA Aerospace Database. The NASA STI Program is already involved in international cooperation with NATO/AGARD/TIP, CENDI, ICSU/ICSTI, and the U.S. Japan Committee on STI. With the new more open political climate, the perceived dearth of foreign information in the NASA Aerospace Database, and the development of the ESA database and DELURA, the German databases, the NASA STI Program is responding by sponsoring workshops on foreign acquisitions and by increasing its cooperation with international partners and with other U.S. agencies. The STI Program looks to the future of improved database access through networking and a GUI; new media; optical disk, video, and full text; and a Technology Focus Group that will keep the NASA STI Program current with technology.

  16. Technology-Mediated Interventions and Quality of Life for Persons Living with HIV/AIDS. A Systematic Review.

    PubMed

    Cho, Hwayoung; Iribarren, Sarah; Schnall, Rebecca

    2017-04-12

    As HIV/AIDS is now considered a chronic disease, quality of life (QoL) has become an important focus for researchers and healthcare providers. Technology-mediated interventions have demonstrated improved clinical effectiveness in outcomes, such as viral suppression, for persons living with HIV/AIDS (PLWH). However, the evidence to support the impact of these interventions on QoL is lacking. The aim of this paper was to assess the impact of technology-mediated interventions on QoL and to identify the instruments used to measure the QoL of PLWH. For this review we followed the PRISMA guidelines. A literature search was conducted in the PubMed, CINAHL, Cochrane, and EMBASE databases in April 2016. Inclusion criteria limited articles to those with technology-mediated interventions as compared to usual care; articles with the population defined as HIV-infected patients; and articles with QoL measured as a health outcome in randomized controlled trials. The Cochrane Collaboration Risk of Bias Tool was used to assess study quality. Of the 1,554 peer-reviewed articles returned in the searches, 10 met the inclusion criteria. This systematic review identified four types of technology-mediated interventions and two types of QoL instruments used to examine the impact of technology-mediated interventions on PLWH. Four studies of technology-mediated interventions resulted in improvement in QoL. Four studies considered QoL as a secondary outcome and resulted in a negative or neutral impact on QoL. Overall, four studies had a low risk of bias, one study had a moderate risk of bias, and the other five studies had a high risk of bias. The evidence to support the improvement of QoL using technology-mediated interventions is insufficient. This lack of research highlights the need for increased study of QoL as an outcome measure and the need for consistent measures to better understand the role of technology-mediated interventions in improving QoL for PLWH.

  17. Information management systems for pharmacogenomics.

    PubMed

    Thallinger, Gerhard G; Trajanoski, Slave; Stocker, Gernot; Trajanoski, Zlatko

    2002-09-01

    The value of high-throughput genomic research is dramatically enhanced by association with key patient data. These data are generally available but of disparate quality and not typically directly associated. A system that could bring these disparate data sources into a common resource connected with functional genomic data would be tremendously advantageous. However, the integration of clinical data and the accurate interpretation of the generated functional genomic data require the development of information management systems capable of effectively capturing the data, as well as tools to make the data accessible to the laboratory scientist or to the clinician. In this review, these challenges and current information technology solutions associated with the management, storage and analysis of high-throughput data are highlighted. It is suggested that the development of a pharmacogenomic data management system which integrates public and proprietary databases, clinical datasets, and data mining tools embedded in a high-performance computing environment should include the following components: parallel processing systems, storage technologies, network technologies, databases and database management systems (DBMS), and application services.

  18. Creating a histology-embryology free digital image database using high-end microscopy and computer techniques for on-line biomedical education.

    PubMed

    Silva-Lopes, Victor W; Monteiro-Leal, Luiz H

    2003-07-01

    The development of new technology and the possibility of fast information delivery by either Internet or Intranet connections are changing education. Microanatomy education depends basically on the correct interpretation of microscopy images by students. Modern microscopes coupled to computers enable the presentation of these images in digital form by creating image databases. However, access to this new technology is restricted to those living in cities and towns with an Information Technology (IT) infrastructure. This study describes the creation of a free Internet histology database composed of high-quality images and also presents an inexpensive way to supply it to a greater number of students through Internet/Intranet connections. By using state-of-the-art scientific instruments, we developed a Web page (http://www2.uerj.br/~micron/atlas/atlasenglish/index.htm) that, in association with a multimedia microscopy laboratory, is intended to help reduce the IT educational gap between developed and underdeveloped regions. Copyright 2003 Wiley-Liss, Inc.

  19. Handbook of automated data collection methods for the National Transit Database

    DOT National Transportation Integrated Search

    2003-10-01

    In recent years, with the increasing sophistication and capabilities of information processing technologies, there has been a renewed interest on the part of transit systems to tap the rich information potential of the National Transit Database (NTD)...

  20. Special issue on eHealth and mHealth: Challenges and future directions for assessment, treatment, and dissemination.

    PubMed

    Borrelli, Belinda; Ritterband, Lee M

    2015-12-01

    This special issue is intended to promote a discussion of eHealth and mHealth and its connection with health psychology. "eHealth" generally refers to the use of information technology, including the Internet, digital gaming, virtual reality, and robotics, in the promotion, prevention, treatment, and maintenance of health. "mHealth" refers to mobile and wireless applications, including text messaging, apps, wearable devices, remote sensing, and the use of social media such as Facebook and Twitter, in the delivery of health related services. This special issue includes 11 articles that begin to address the need for more rigorous methodology, valid assessment, innovative interventions, and increased access to evidence-based programs and interventions. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  1. Integrated Functional and Executional Modelling of Software Using Web-Based Databases

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Marietta, Roberta

    1998-01-01

    NASA's software subsystems undergo extensive modification and updates over their operational lifetimes. It is imperative that the modified software continue to satisfy safety goals. This report discusses the difficulties encountered in doing so and presents a solution based on integrated modelling of software, the use of automatic information extraction tools, web technology and databases. To appear as an article in the Journal of Database Management.

  2. Computer Science and Technology: Modeling and Measurement Techniques for Evaluation of Design Alternatives in the Implementation of Database Management Software. Final Report.

    ERIC Educational Resources Information Center

    Deutsch, Donald R.

    This report describes a research effort that was carried out over a period of several years to develop and demonstrate a methodology for evaluating proposed Database Management System designs. The major proposition addressed by this study is embodied in the thesis statement: Proposed database management system designs can be evaluated best through…

  3. Short Tandem Repeat DNA Internet Database

    National Institute of Standards and Technology Data Gateway

    SRD 130 Short Tandem Repeat DNA Internet Database (Web, free access)   Short Tandem Repeat DNA Internet Database is intended to benefit research and application of short tandem repeat DNA markers for human identity testing. Facts and sequence information on each STR system, population data, commonly used multiplex STR systems, PCR primers and conditions, and a review of various technologies for analysis of STR alleles have been included.

  4. Comparative analysis between academic and patent publications based on Fenton Technologies among China, Brazil, and the rest of the world.

    PubMed

    de Luna, Airton José; Santos, Douglas Alves

    2017-03-01

    Worldwide, year by year, Fenton's technologies have been highlighted in both the academic and patent spheres, partly because of their proven efficiency as environment-friendly technologies for the abatement of organic pollutants and partly because of growing interest in their industrial applications. Thus, aiming to understand the effective dynamic between the two worlds, academia versus patents, the present study performs a comparative analysis of publications on Fenton-based Technologies (FbT). Technological foresight techniques were adopted, focusing on patent and non-patent databases and employing the Web of Science (WoS) database as a prospecting tool. The main results for the last decade point to a strong increase in Fenton's technologies, in R&D as much as in patent applications, worldwide. Chinese universities and firms command the scenario. There is a significant gap between academic and patent output.

  5. Location-based technologies for supporting elderly pedestrian in "getting lost" events.

    PubMed

    Pulido Herrera, Edith

    2017-05-01

    Localization-based technologies promise to keep older adults with dementia safe and to support them and their caregivers during getting-lost events. This paper summarizes mainly technological contributions to supporting the target group in these events. Important aspects of the getting-lost phenomenon, such as its concept and associated ethical issues, are also briefly addressed. Papers were selected from scientific databases and gray literature. Since the topic is still in its infancy, other terms, e.g. wandering, were used to find contributions associated with getting lost. Trends in applying localization systems were identified as personal locators, perimeter systems, and assistance systems. The first type barely considers the older adult's opinion, while assistance systems may involve context awareness to improve support for both the elderly person and the caregiver. Since few studies report multidisciplinary work with a special focus on getting lost, there is no strong evidence of the real efficiency of localization systems, nor are there guidelines for designing systems for the target group. Further research on getting lost is required to obtain insights for developing customizable systems. Moreover, considering the conditions of the older adult might increase the impact of developments that combine localization technologies with artificial intelligence techniques. Implications for Rehabilitation: While there is no cure for dementias such as Alzheimer's, it is feasible to take advantage of technological developments to diminish their negative impact somewhat. For instance, location-based systems may provide information for early diagnosis of Alzheimer's disease by assessing navigational impairments in older adults. Assessing the latest supportive technologies and methodologies may provide insights for adopting strategies to properly manage getting-lost events. More user-centered designs will provide appropriate assistance to older adults. Namely, customizable systems could assist older adults in their daily walks with the aim of increasing their self-confidence, independence and autonomy.

  6. Effect of Tai Chi for the prevention or treatment of osteoporosis in elderly adults: protocol for a systematic review and meta-analysis.

    PubMed

    Mu, Wei-Qiang; Huang, Xia-Yu; Zhang, Jiang; Liu, Xiao-Cong; Huang, Mao-Mao

    2018-04-09

    Osteoporosis (OP) has been defined as a degenerative bone disease characterised by low bone mass and microstructural deterioration of bone tissue, leading to fragility and an increased risk of fractures, especially of the hip, spine and wrist. Exercise has been shown to benefit the maintenance of bone health and improvement of muscle strength, balance and coordination, thereby reducing the risk of falls and fractures. However, prior findings regarding the optimal types and regimens of exercise for treating low bone mineral density (BMD) in elderly people are not consistent. As an important component of traditional Chinese Qigong exercises, Tai Chi (TC) is an ancient art and science of healthcare derived from the martial arts. The objective of this study is to attempt to conduct a systematic review and meta-analysis of the existing studies on TC exercise as an intervention for the prevention or treatment of OP in elderly adults and to draw more useful conclusions regarding the safety and the effectiveness of TC in preventing or treating OP. Eight electronic databases (Science Citation Index, PubMed Database, Embase (Ovid) Database, the Cochrane Central Register of Controlled Trials, and Chinese databases, including Chinese BioMedical Database, China National Knowledge Infrastructure, Wanfang database and the Chongqing VIP Chinese Science and Technology Periodical Database) will be searched from the beginning of each database to 1 April 2018. Potential outcomes of interest will include rates of fractures or falls, BMD at the total hip and the total spine, bone formation biomarkers, bone resorption biomarkers, bone biomarkers, health-related quality of life and adverse events. Only randomised controlled trials comparing TC exercise against each other or non-intervention will be included. The Cochrane risk of bias assessment tool will be used for quality assessment. Ethical approval is not required as the study will be a review of existing studies. 
This review may help to elucidate whether TC exercise is effective for the prevention or treatment of OP in elderly adults. The findings of the study will be published in a peer-reviewed publication and will be disseminated electronically or in print. We will share the findings in the fourth quarter of 2018. CRD42018084950. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  7. Impact of endometriosis on in vitro fertilization outcomes: an evaluation of the Society for Assisted Reproductive Technologies Database.

    PubMed

    Senapati, Suneeta; Sammel, Mary D; Morse, Christopher; Barnhart, Kurt T

    2016-07-01

    To assess the impact of endometriosis, alone or in combination with other infertility diagnoses, on IVF outcomes. Population-based retrospective cohort study of cycles from the Society for Assisted Reproductive Technology Clinic Outcome Reporting System database. Not applicable. A total of 347,185 autologous fresh and frozen assisted reproductive technology cycles from the period 2008-2010. None. Oocyte yield, implantation rate, live birth rate. Although cycles of patients with endometriosis constituted 11% of the study sample, the majority (64%) reported a concomitant diagnosis, with male factor (42%), tubal factor (29%), and diminished ovarian reserve (22%) being the most common. Endometriosis, when isolated or with concomitant diagnoses, was associated with lower oocyte yield compared with those with unexplained infertility, tubal factor, and all other infertility diagnoses combined. Women with isolated endometriosis had similar or higher live birth rates compared with those in other diagnostic groups. However, women with endometriosis with concomitant diagnoses had lower implantation rates and live birth rates compared with unexplained infertility, tubal factor, and all other diagnostic groups. Endometriosis is associated with lower oocyte yield, lower implantation rates, and lower pregnancy rates after IVF. However, the association of endometriosis and IVF outcomes is confounded by other infertility diagnoses. Endometriosis, when associated with other alterations in the reproductive tract, has the lowest chance of live birth. In contrast, for the minority of women who have endometriosis in isolation, the live birth rate is similar or slightly higher compared with other infertility diagnoses. Copyright © 2016. Published by Elsevier Inc.

  8. The application of the geography census data in seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Yuan, Shen; Ying, Zhang

    2017-04-01

    Because the basic data in the Sichuan provincial earthquake emergency database are not kept up to date, there is a gap between post-earthquake disaster assessment results and the actual damage. In 2015, Sichuan completed its first province-wide geographical conditions census, covering topography, traffic networks, vegetation coverage, water areas, desert and bare ground, residential settlements and facilities, geographical units and geological hazards, as well as town planning, reconstruction and ecological restoration in the Lushan earthquake-stricken area. On this basis, combined with existing basic geographic information data and high-resolution imagery, and supplemented by remote-sensing image interpretation and geological survey, the distribution and changes of hazard-affected elements (land cover, roads, structures and infrastructure) in Lushan County were extracted and statistically analysed for the periods before 2013 and after 2015. At the same time, the geographical conditions census data were transformed into, and used to update, the earthquake emergency basic data by studying their data types, structures and relationships. Finally, intensity control points were obtained by fusing multi-source disaster information, including the changed hazard-affected body data and the coseismic displacement field of the Lushan magnitude-7.0 earthquake derived from the CORS network; the seismic influence field was then corrected and the earthquake disaster reassessed through the technology platform of the Sichuan earthquake relief headquarters. Comparison of the new assessment result, the original assessment result and the actual earthquake disaster loss shows that the revised evaluation is closer to the actual loss. In the future, normalized updates from geographical conditions census data to earthquake emergency basic data can ensure the timeliness of the emergency database and continually improve the accuracy of earthquake disaster assessment.

  9. Djeen (Database for Joomla!'s Extensible Engine): a research information management system for flexible multi-technology project administration.

    PubMed

    Stahl, Olivier; Duvergey, Hugo; Guille, Arnaud; Blondin, Fanny; Vecchio, Alexandre Del; Finetti, Pascal; Granjeaud, Samuel; Vigy, Oana; Bidaut, Ghislain

    2013-06-06

    With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. We developed Djeen (Database for Joomla!'s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application designed to streamline collaborative data storage and annotation. Its database model, kept simple, is compatible with most technologies and allows heterogeneous data to be stored and managed within the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy, and user permissions are finely grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material.

  10. Djeen (Database for Joomla!’s Extensible Engine): a research information management system for flexible multi-technology project administration

    PubMed Central

    2013-01-01

    Background With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings We developed Djeen (Database for Joomla!’s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application designed to streamline collaborative data storage and annotation. Its database model, kept simple, is compatible with most technologies and allows heterogeneous data to be stored and managed within the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Conclusion Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy, and user permissions are finely grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material. PMID:23742665
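The Djeen records above describe role-based permissions that are finely grained per project, user and group. A minimal sketch of that access-control idea (role names, the permission sets, and the grant table below are illustrative assumptions, not Djeen's actual model):

```python
# Hypothetical role -> permission mapping, for illustration only.
ROLE_PERMS = {
    "viewer":  {"read"},
    "editor":  {"read", "annotate"},
    "manager": {"read", "annotate", "grant"},
}

def allowed(grants, user, project, action):
    """Check a (user, project) grant; a role in one project does not
    leak into any other project in the hierarchy."""
    role = grants.get((user, project))
    return role is not None and action in ROLE_PERMS[role]

# Grants are scoped to a (user, project) pair.
grants = {
    ("alice", "proteomics-01"): "manager",
    ("bob",   "proteomics-01"): "viewer",
}

print(allowed(grants, "alice", "proteomics-01", "grant"))    # True
print(allowed(grants, "bob",   "proteomics-01", "annotate")) # False
print(allowed(grants, "bob",   "flow-cyto-02",  "read"))     # False: no grant
```

Scoping each grant to a project, rather than giving users global roles, is what makes the permissions "finely grained" in the sense the abstract describes.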

  11. Environmental impacts of lighting technologies - Life cycle assessment and sensitivity analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welz, Tobias; Hischier, Roland, E-mail: Roland.Hischier@empa.ch; Hilty, Lorenz M.

    2011-04-15

    With two regulations, 244/2009 and 245/2009, the European Commission recently put into practice the EuP Directive in the area of lighting devices, aiming to improve energy efficiency in the domestic lighting sector. This article presents a comprehensive life cycle assessment comparison of four different lighting technologies: the tungsten lamp, the halogen lamp, the conventional fluorescent lamp and the compact fluorescent lamp. Taking advantage of the most up-to-date life cycle inventory database available (ecoinvent data version 2.01), all life cycle phases were assessed and the sensitivity of the results to varying assumptions analysed: different qualities of compact fluorescent lamps (production phase), different electricity mixes (use phase), and end-of-life scenarios for WEEE recycling versus municipal solid waste incineration (disposal phase). A functional unit of 'one hour of lighting' was defined and the environmental burdens over the whole life cycle of all four lamp types were calculated, showing a clearly lower impact for the two gas-discharge lamps, i.e. the fluorescent and the compact fluorescent lamp. Differences in the product quality of the compact fluorescent lamps turn out to have only a very small effect on the overall environmental performance of this lamp type; a shortfall in the actual lifetime of this lamp type does not change the rank order of the results for the four lamp types examined here. It was also shown that the environmental break-even point of the gas-discharge lamps is reached long before the end of their expected life-span. All in all, it can be concluded that a change from today's tungsten lamp technology to a low-energy-consuming technology such as the compact fluorescent lamp results in a substantial environmental benefit.
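The break-even reasoning in the record above can be illustrated with a back-of-the-envelope calculation. All figures below are invented for illustration; the study's actual inventory data come from ecoinvent v2.01 and are not reproduced here:

```python
def cumulative_energy(production_mj, power_w, hours):
    """Total primary energy (MJ) after `hours` of use: production + use phase."""
    return production_mj + power_w * hours * 3600 / 1e6  # W*s -> MJ

def break_even_hours(prod_a, power_a, prod_b, power_b):
    """Hours at which lamp A (higher production burden, lower power)
    overtakes lamp B: solve prod_a + pa*t = prod_b + pb*t for t."""
    return (prod_a - prod_b) / ((power_b - power_a) * 3600 / 1e6)

# Hypothetical: CFL (15 W, 5 MJ to produce) vs tungsten lamp (60 W, 1 MJ).
t = break_even_hours(5.0, 15, 1.0, 60)
print(round(t, 1))  # ~24.7 hours, far below a typical CFL life of thousands of hours
```

Even with a deliberately exaggerated production burden for the gas-discharge lamp, the higher use-phase power of the tungsten lamp dominates after a few dozen hours, which is the shape of the result the study reports.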

  12. The Impact of Technology-Based Interventions on Informal Caregivers of Stroke Survivors: A Systematic Review.

    PubMed

    Aldehaim, Abdulkarim Yousef; Alotaibi, Faisal F; Uphold, Constance R; Dang, Stuti

    2016-03-01

    This article is a systematic review of the impact of technology-based interventions on outcomes for caregivers of stroke survivors. Literature was identified in the PubMed, PsycINFO, Scopus, and Cochrane databases for evidence on technology-based interventions for stroke survivors' caregivers. The search was restricted to English-language articles from 1970 to February 2015 that involved technology-based interventions. This review included studies that measured the impact of these types of approaches on depression as the primary outcome and on any of the following as secondary outcomes: problem-solving ability, burden, health status, social support, preparedness, and healthcare utilization by the care recipient. Telephone or face-to-face counseling sessions were not of interest for this review. The search strategy yielded five studies that met inclusion criteria: two randomized clinical trials and three pilot/preliminary studies, with diverse approaches and designs. Four studies assessed the primary outcome, two of which reported significant decreases in caregivers' depressive symptoms. Burden, problem-solving ability, health status, and social support were each measured by two studies, none of which revealed significant differences following the intervention. Only one study assessed caregivers' preparedness and showed improved posttest scores. Healthcare service use by the care recipient was assessed by one study, and the results indicated a significant reduction in emergency department visits and hospital re-admissions. Despite varied study designs and small sample sizes, available data suggest that an intervention that incorporates a theory-based model and is designed to target caregivers as early as possible is a promising strategy. Furthermore, there is a need to incorporate cost-benefit analyses in future studies.

  13. Service dogs and people with physical disabilities partnerships: a systematic review.

    PubMed

    Winkle, Melissa; Crowe, Terry K; Hendrix, Ingrid

    2012-03-01

    Occupational therapists have recognized the benefits that service dogs can provide to people with disabilities. There are many anecdotal publications extolling the benefits of working with service dogs, but few rigorous studies exist to provide evidence of the usefulness of this type of assistive technology. This systematic review evaluates the published research that supports the use of service dogs by people with mobility-related physical disabilities. Articles were identified by computerized search of the PubMed, CINAHL, PsycINFO, OT Seeker, Cochrane Database of Systematic Reviews, SportDiscus, Education Research Complete, Public Administration Abstracts, Web of Knowledge and Academic Search Premier databases, with no date range specified. The keywords used in the search included disabled persons, assistance dogs or service dogs, and mobility impairments. The reference lists of the research papers were checked, as was the personal citation database of the lead author. Twelve studies met the inclusion criteria, and while the findings are promising, they are inconclusive and limited by the level of evidence, which included one Level I, six Level III, four Level IV and one Level V study. All of the studies reviewed had research design quality concerns, including small sample sizes, poor descriptions of the interventions, outcome measures with minimal psychometrics and a lack of power calculations. Findings indicated three major themes: social/participation, functional and psychological outcomes, all of which are areas within the occupational therapy scope of practice. Occupational therapists may play a critical role in referral, assessment, assisting clients and consulting with training organizations before, during and after the service dog placement process. In order for health care professionals to have confidence in recommending this type of assistive technology, the evidence to support such decisions must be strengthened.
Copyright © 2011 John Wiley & Sons, Ltd.

  14. An offline-online Web-GIS Android application for fast data acquisition of landslide hazard and risk

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; Sudmeier-Rieux, Karen; Jaboyedoff, Michel; Derron, Marc-Henri; Devkota, Sanjaya

    2017-04-01

    Regional landslide assessments and mapping have been effectively pursued by research institutions, national and local governments, non-governmental organizations (NGOs), and other stakeholders for some time, and a wide range of methodologies and technologies have consequently been proposed. Land-use maps and hazard event inventories are mostly created from remote-sensing data and are subject to difficulties, such as accessibility and terrain, that need to be overcome. Likewise, landslide data acquisition with field navigation can improve the accuracy of databases and analyses. Open-source Web and mobile GIS tools can be used for improved ground-truthing of critical areas, sharpening the analysis of hazard patterns and triggering factors. This paper reviews the implementation and selected results of a secure mobile-mapping application called ROOMA (Rapid Offline-Online Mapping Application) for the rapid collection of landslide hazard and risk data. This prototype supports the quick creation of landslide inventory maps (LIMs) by collecting information on the type, features, volume, date, and patterns of landslides using open-source Web-GIS technologies such as Leaflet maps, Cordova, GeoServer, PostgreSQL as the relational DBMS (database management system), and PostGIS as its plug-in for spatial database management. The application comprises Leaflet maps coupled with satellite images as a base layer, drawing tools, geolocation (using GPS and the Internet), photo mapping, and event clustering. All features and information are recorded in a GeoJSON text file in the offline version (Android) and subsequently uploaded in the online mode (using any browser) when an Internet connection is available. Finally, the events can be accessed and edited after approval by an administrator and then visualized by the general public.
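The offline capture step described above, recording each landslide event as GeoJSON text for later upload, can be sketched as follows. The property names and values are assumptions for illustration, not ROOMA's actual schema:

```python
import json
from datetime import date

# One landslide event as a GeoJSON Feature: point geometry plus the kinds
# of attributes the paper mentions (type, volume, date, photo).
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [85.3240, 27.7172]},  # lon, lat
    "properties": {
        "event_type": "debris flow",
        "volume_m3": 1200,
        "observed": date(2016, 7, 14).isoformat(),
        "photo": "IMG_0042.jpg",
    },
}
collection = {"type": "FeatureCollection", "features": [feature]}

# Offline mode: serialize to a local GeoJSON text file on the device.
text = json.dumps(collection)

# Online mode: the same text can later be uploaded and loaded into a
# spatial database (e.g. PostGIS) once a connection is available.
parsed = json.loads(text)
print(parsed["features"][0]["properties"]["event_type"])
```

Because GeoJSON is plain text, the offline file survives connectivity gaps and can be synchronized verbatim, which is the core of the offline-online design the paper describes.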

  15. Technology and Microcomputers for an Information Centre/Special Library.

    ERIC Educational Resources Information Center

    Daehn, Ralph M.

    1984-01-01

    Discusses use of microcomputer hardware and software, telecommunications methods, and advanced library methods to create a specialized information center's database of literature relating to farm machinery and food processing. Systems and services (electronic messaging, serials control, database creation, cataloging, collections, circulation,…

  16. Implementing a Microcomputer Database Management System.

    ERIC Educational Resources Information Center

    Manock, John J.; Crater, K. Lynne

    1985-01-01

    Current issues in selecting, structuring, and implementing microcomputer database management systems in research administration offices are discussed, and their capabilities are illustrated with the system used by the University of North Carolina at Wilmington. Trends in microcomputer technology and their likely impact on research administration…

  17. Sustaining Indigenous Languages in Cyberspace.

    ERIC Educational Resources Information Center

    Cazden, Courtney B.

    This paper describes how certain types of electronic technologies, specifically CD-ROMs, computerized databases, and telecommunications networks, are being incorporated into language and culture revitalization projects in Alaska and around the Pacific. The paper presents two examples of CD-ROMs and computerized databases from Alaska, describing…

  18. Designs on a National Research Network.

    ERIC Educational Resources Information Center

    Walsh, John

    1988-01-01

    Discusses the addition of the National Aeronautics and Space Administration database to the National Science Foundation's NSFnet data communication network. Outlines the history of databases in the United States and enumerates proposed upgrades from a new Office of Science and Technology policy report. (TW)

  19. Spatial database for a global assessment of undiscovered copper resources: Chapter Z in Global mineral resource assessment

    USGS Publications Warehouse

    Dicken, Connie L.; Dunlap, Pamela; Parks, Heather L.; Hammarstrom, Jane M.; Zientek, Michael L.; Johnson, Kathleen M.

    2016-07-13

    As part of the first-ever U.S. Geological Survey global assessment of undiscovered copper resources, data common to several regional spatial databases published by the U.S. Geological Survey, including one report from Finland and one from Greenland, were standardized, updated, and compiled into a global copper resource database. This integrated collection of spatial databases provides location, geologic and mineral resource data, and source references for deposits, significant prospects, and areas permissive for undiscovered deposits of both porphyry copper and sediment-hosted copper. The copper resource database allows for efficient modeling on a global scale in a geographic information system (GIS) and is provided in an Esri ArcGIS file geodatabase format.

  20. Ab Initio Design of Potent Anti-MRSA Peptides based on Database Filtering Technology

    PubMed Central

    Mishra, Biswajit; Wang, Guangshun

    2012-01-01

    To meet the challenge of antibiotic resistance worldwide, a new generation of antimicrobials must be developed.1 This communication demonstrates the ab initio design of potent peptides against methicillin-resistant Staphylococcus aureus (MRSA). Our idea is that a peptide is very likely to be active when the most probable parameters are used in each step of the design. We derived the most probable parameters (e.g., amino acid composition, peptide hydrophobic content, and net charge) from the antimicrobial peptide database2 by developing a database filtering technology (DFT). Unlike classic cationic antimicrobial peptides, which usually have high cationicity, DFTamP1, the first anti-MRSA peptide designed using this technology, is a short peptide with high hydrophobicity but low cationicity. This molecular design made the peptide highly potent. Indeed, the peptide damaged the bacterial surface and killed community-associated MRSA USA300 within 60 minutes. Structural determination of DFTamP1 by NMR spectroscopy revealed a broad hydrophobic surface, providing a basis for its potency against MRSA, which is known to deploy positively charged moieties on its surface as a resistance mechanism. Combining our ab initio design with database screening3 led to yet another peptide with enhanced potency. Because of their simple composition, short length, stability to proteases, and membrane targeting, the designed peptides are attractive leads for developing novel anti-MRSA therapeutics. Our database-derived design concept can also be applied to the design of peptide mimetics to combat MRSA. PMID:22803960
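    The filtering idea in this abstract, deriving the most probable composition, hydrophobic content, and net charge from a peptide database, can be illustrated with a toy sketch. Everything here is an assumption for illustration only: the three example sequences, the hydrophobic residue set, the charge table, and the use of medians are not the paper's actual DFT procedure.

    ```python
    from collections import Counter

    HYDROPHOBIC = set("AILMFWVC")                 # illustrative hydrophobic residues
    CHARGE = {"K": 1, "R": 1, "H": 1, "D": -1, "E": -1}  # crude per-residue charges

    def most_probable_parameters(peptides):
        """Derive 'most probable' design parameters from a peptide set:
        the most frequent residues, the median hydrophobic fraction,
        and the median net charge (a toy analogue of the DFT idea)."""
        aa_counts = Counter("".join(peptides))
        hydro = sorted(sum(r in HYDROPHOBIC for r in p) / len(p) for p in peptides)
        charge = sorted(sum(CHARGE.get(r, 0) for r in p) for p in peptides)
        mid = len(peptides) // 2
        return aa_counts.most_common(3), hydro[mid], charge[mid]

    # Toy "database" of hypothetical antimicrobial peptide sequences.
    db = ["GLFDIIKKIAESF", "KWKLFKKIGAVLKVL", "GIGKFLHSAKKFGKAFVGEIMNS"]
    common, hydro_frac, net_charge = most_probable_parameters(db)
    ```

    A design step would then pick residues and lengths near these modal values; the actual DFT applies such filters successively across the full antimicrobial peptide database.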
