Sample records for comparative standardized-protocol database

  1. Feasibility and utility of applications of the common data model to multiple, disparate observational health databases

    PubMed Central

    Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B

    2015-01-01

    Objectives To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Materials and methods Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Results Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations were excluded due to identified data quality issues in the source system; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. Discussion The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources, using the protocol, showed consistency in the impact of inclusion criteria and identified differences in patient characteristics and coding practices across databases. Conclusion Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. PMID:25670757
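The vocabulary-mapping step this abstract quantifies (96% to 99% of condition records, 90% to 99% of drug records mapped) can be sketched as a lookup from source codes into standard concepts, with unmapped records counted as information loss. The table contents and field names below are invented for illustration; a real OMOP ETL uses the CDM's concept and mapping tables.

```python
# Sketch of source-to-standard-vocabulary mapping with a mapping-rate
# check, as described in the abstract above. All codes and concept IDs
# here are hypothetical placeholders, not real OMOP vocabulary content.

# Hypothetical source drug records keyed by source code
source_records = ["NDC:0001", "NDC:0002", "NDC:0003", "NDC:9999"]

# Hypothetical mapping from source codes to standard concept IDs
source_to_concept = {
    "NDC:0001": 19019073,
    "NDC:0002": 1125315,
    "NDC:0003": 1112807,
    # "NDC:9999" has no mapping -> counted as information loss
}

def map_records(records, mapping):
    """Return (mapped concept IDs, fraction successfully mapped)."""
    mapped = [mapping[r] for r in records if r in mapping]
    return mapped, len(mapped) / len(records)

mapped, rate = map_records(source_records, source_to_concept)
print(f"{rate:.0%} of drug records mapped")  # -> 75% of drug records mapped
```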

  2. Feasibility and utility of applications of the common data model to multiple, disparate observational health databases.

    PubMed

    Voss, Erica A; Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B

    2015-05-01

    To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations were excluded due to identified data quality issues in the source system; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources, using the protocol, showed consistency in the impact of inclusion criteria and identified differences in patient characteristics and coding practices across databases. Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  3. Establishment of a Universal Size Standard Strain for Use with the PulseNet Standardized Pulsed-Field Gel Electrophoresis Protocols: Converting the National Databases to the New Size Standard

    PubMed Central

    Hunter, Susan B.; Vauterin, Paul; Lambert-Fair, Mary Ann; Van Duyne, M. Susan; Kubota, Kristy; Graves, Lewis; Wrigley, Donna; Barrett, Timothy; Ribot, Efrain

    2005-01-01

    The PulseNet National Database, established by the Centers for Disease Control and Prevention in 1996, consists of pulsed-field gel electrophoresis (PFGE) patterns obtained from isolates of food-borne pathogens (currently Escherichia coli O157:H7, Salmonella, Shigella, and Listeria) and textual information about the isolates. Electronic images and accompanying text are submitted from over 60 U.S. public health and food regulatory agency laboratories. The PFGE patterns are generated according to highly standardized PFGE protocols. Normalization and accurate comparison of gel images require the use of a well-characterized size standard in at least three lanes of each gel. Originally, a well-characterized strain of each organism was chosen as the reference standard for that particular database. The increasing number of databases, difficulty in identifying an organism-specific standard for each database, the increased range of band sizes generated by the use of additional restriction endonucleases, and the maintenance of many different organism-specific strains encouraged us to search for a more versatile and universal DNA size marker. A Salmonella serotype Braenderup strain (H9812) was chosen as the universal size standard. This strain was subjected to rigorous testing in our laboratories to ensure that it met the desired criteria, including coverage of a wide range of DNA fragment sizes, even distribution of bands, and stability of the PFGE pattern. The strategy used to convert and compare data generated by the new and old reference standards is described. PMID:15750058
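The normalization step this abstract describes (sizing bands against a well-characterized standard lane) can be sketched as interpolation: PFGE migration distance is roughly linear in the logarithm of fragment size, so an unknown band's size is estimated between the flanking standard bands. The distances and sizes below are invented for illustration, not the H9812 band pattern.

```python
import math

# Sketch of gel normalization against a size-standard lane, as described
# above (all numeric values hypothetical). Migration distance is modeled
# as linear in log(fragment size), so an unknown band is sized by
# interpolating log-size between the two flanking standard bands.

# Standard lane: (migration distance in mm, known fragment size in kb)
STANDARD = [(10.0, 1135.0), (20.0, 668.9), (30.0, 398.4),
            (40.0, 244.4), (50.0, 138.9)]

def estimate_size(distance_mm):
    """Estimate fragment size (kb) for a band at the given distance."""
    pts = sorted(STANDARD)
    for (d0, s0), (d1, s1) in zip(pts, pts[1:]):
        if d0 <= distance_mm <= d1:
            frac = (distance_mm - d0) / (d1 - d0)
            log_s = math.log(s0) + frac * (math.log(s1) - math.log(s0))
            return math.exp(log_s)
    raise ValueError("distance outside the standard's range")

print(round(estimate_size(25.0), 1))  # geometric mean of the flanking sizes
```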

  4. The OAuth 2.0 Web Authorization Protocol for the Internet Addiction Bioinformatics (IABio) Database.

    PubMed

    Choi, Jeongseok; Kim, Jaekwon; Lee, Dong Kyun; Jang, Kwang Soo; Kim, Dai-Jin; Choi, In Young

    2016-03-01

    Internet addiction (IA) has become a widespread and problematic phenomenon as smart devices pervade society. Moreover, internet gaming disorder leads to increases in social expenditures for both individuals and nations alike. Although the prevention and treatment of IA are getting more important, the diagnosis of IA remains problematic. Understanding the neurobiological mechanism of behavioral addictions is essential for the development of specific and effective treatments. Although there are many databases related to other addictions, a database for IA has not been developed yet. In addition, bioinformatics databases, especially genetic databases, require a high level of security and should be designed based on medical information standards. In this respect, our study proposes the OAuth standard protocol for database access authorization. The proposed IA Bioinformatics (IABio) database system is based on internet user authentication, which is a guideline for medical information standards, and uses OAuth 2.0 for access control technology. This study designed and developed the system requirements and configuration. The OAuth 2.0 protocol is expected to establish the security of personal medical information and be applied to genomic research on IA.
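The OAuth 2.0 access-control flow the study adopts can be sketched in its two classic steps (RFC 6749 authorization code grant): redirect the user to the authorization server, then exchange the returned code for an access token. The endpoints and client credentials below are hypothetical placeholders, not IABio's actual configuration.

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Sketch of the OAuth 2.0 authorization-code grant (RFC 6749) used for
# database access authorization as described above. All URLs and
# credentials are invented placeholders.

AUTH_ENDPOINT = "https://auth.example.org/authorize"  # hypothetical
TOKEN_ENDPOINT = "https://auth.example.org/token"     # hypothetical

def authorization_url(client_id, redirect_uri, scope, state):
    """Step 1: URL the user agent is redirected to for consent."""
    params = {"response_type": "code", "client_id": client_id,
              "redirect_uri": redirect_uri, "scope": scope, "state": state}
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

def token_request(code, client_id, client_secret, redirect_uri):
    """Step 2: form body POSTed to the token endpoint to get a token."""
    return {"grant_type": "authorization_code", "code": code,
            "client_id": client_id, "client_secret": client_secret,
            "redirect_uri": redirect_uri}

url = authorization_url("iabio-client", "https://app.example.org/cb",
                        "genome.read", "xyz123")
print(url)
```

Only the resulting access token, never the user's credentials, is then presented to the database API, which is what makes the pattern suitable for sensitive genetic data.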

  5. Normative Databases for Imaging Instrumentation.

    PubMed

    Realini, Tony; Zangwill, Linda M; Flanagan, John G; Garway-Heath, David; Patella, Vincent M; Johnson, Chris A; Artes, Paul H; Gaddie, Ian B; Fingeret, Murray

    2015-08-01

    To describe the process by which imaging devices undergo reference database development and regulatory clearance. The limitations and potential improvements of reference (normative) data sets for ophthalmic imaging devices will be discussed. A symposium was held in July 2013 in which a series of speakers discussed issues related to the development of reference databases for imaging devices. Automated imaging has become widely accepted and used in glaucoma management. The ability of such instruments to discriminate healthy from glaucomatous optic nerves, and to detect glaucomatous progression over time is limited by the quality of reference databases associated with the available commercial devices. In the absence of standardized rules governing the development of reference databases, each manufacturer's database differs in size, eligibility criteria, and ethnic make-up, among other key features. The process for development of imaging reference databases may be improved by standardizing eligibility requirements and data collection protocols. Such standardization may also improve the degree to which results may be compared between commercial instruments.

  6. Normative Databases for Imaging Instrumentation

    PubMed Central

    Realini, Tony; Zangwill, Linda; Flanagan, John; Garway-Heath, David; Patella, Vincent Michael; Johnson, Chris; Artes, Paul; Ben Gaddie, I.; Fingeret, Murray

    2015-01-01

    Purpose To describe the process by which imaging devices undergo reference database development and regulatory clearance. The limitations and potential improvements of reference (normative) data sets for ophthalmic imaging devices will be discussed. Methods A symposium was held in July 2013 in which a series of speakers discussed issues related to the development of reference databases for imaging devices. Results Automated imaging has become widely accepted and used in glaucoma management. The ability of such instruments to discriminate healthy from glaucomatous optic nerves, and to detect glaucomatous progression over time is limited by the quality of reference databases associated with the available commercial devices. In the absence of standardized rules governing the development of reference databases, each manufacturer’s database differs in size, eligibility criteria, and ethnic make-up, among other key features. Conclusions The process for development of imaging reference databases may be improved by standardizing eligibility requirements and data collection protocols. Such standardization may also improve the degree to which results may be compared between commercial instruments. PMID:25265003

  7. Metabolomics Workbench: An international repository for metabolomics data and metadata, metabolite standards, protocols, tutorials and training, and analysis tools

    PubMed Central

    Sud, Manish; Fahy, Eoin; Cotter, Dawn; Azam, Kenan; Vadivelu, Ilango; Burant, Charles; Edison, Arthur; Fiehn, Oliver; Higashi, Richard; Nair, K. Sreekumaran; Sumner, Susan; Subramaniam, Shankar

    2016-01-01

    The Metabolomics Workbench, available at www.metabolomicsworkbench.org, is a public repository for metabolomics metadata and experimental data spanning various species and experimental platforms, metabolite standards, metabolite structures, protocols, tutorials, and training material and other educational resources. It provides a computational platform to integrate, analyze, track, deposit and disseminate large volumes of heterogeneous data from a wide variety of metabolomics studies including mass spectrometry (MS) and nuclear magnetic resonance spectrometry (NMR) data spanning over 20 different species covering all the major taxonomic categories including humans and other mammals, plants, insects, invertebrates and microorganisms. Additionally, a number of protocols are provided for a range of metabolite classes, sample types, and both MS and NMR-based studies, along with a metabolite structure database. The metabolites characterized in the studies available on the Metabolomics Workbench are linked to chemical structures in the metabolite structure database to facilitate comparative analysis across studies. The Metabolomics Workbench, part of the data coordinating effort of the National Institutes of Health (NIH) Common Fund's Metabolomics Program, provides data from the Common Fund's Metabolomics Resource Cores, metabolite standards, and analysis tools to the wider metabolomics community and seeks data depositions from metabolomics researchers across the world. PMID:26467476

  8. A Standardized Protocol for the Prospective Follow-Up of Cleft Lip and Palate Patients.

    PubMed

    Salimi, Negar; Aleksejūnienė, Jolanta; Yen, Edwin; Loo, Angelina

    2018-01-01

    To develop a standardized all-encompassing protocol for the assessment of cleft lip and palate patients with clinical and research implications. Electronic database searches were conducted and 13 major cleft centers worldwide were contacted in order to prepare for the development of the protocol. In preparation, the available evidence was reviewed and potential fistula-related risk determinants from 4 different domains were identified. No standardized protocol for the assessment of cleft patients could be found in any of the electronic database searches that were conducted. Interviews with representatives from several major centers revealed that the majority of centers do not have a standardized comprehensive strategy for the reporting and follow-up of cleft lip and palate patients. The protocol was developed and consisted of the following domains of determinants: (1) the sociodemographic domain, (2) the cleft defect domain, (3) the surgery domain, and (4) the fistula domain. The proposed protocol has the potential to enhance the quality of patient care by ensuring that multiple patient-related aspects are consistently reported. It may also facilitate future multicenter research, which could contribute to the reduction of fistula occurrence in cleft lip and palate patients.

  9. Validation of chronic obstructive pulmonary disease (COPD) diagnoses in healthcare databases: a systematic review protocol.

    PubMed

    Rimland, Joseph M; Abraha, Iosief; Luchetta, Maria Laura; Cozzolino, Francesco; Orso, Massimiliano; Cherubini, Antonio; Dell'Aquila, Giuseppina; Chiatti, Carlos; Ambrosio, Giuseppe; Montedori, Alessandro

    2016-06-01

    Healthcare databases are useful sources to investigate the epidemiology of chronic obstructive pulmonary disease (COPD), to assess longitudinal outcomes in patients with COPD, and to develop disease management strategies. However, in order to constitute a reliable source for research, healthcare databases need to be validated. The aim of this protocol is to perform the first systematic review of studies reporting the validation of codes related to COPD diagnoses in healthcare databases. MEDLINE, EMBASE, Web of Science and the Cochrane Library databases will be searched using appropriate search strategies. Studies that evaluated the validity of COPD codes (such as the International Classification of Diseases 9th Revision and 10th Revision system; the Read codes system or the International Classification of Primary Care) in healthcare databases will be included. Inclusion criteria will be: (1) the presence of a reference standard case definition for COPD; (2) the presence of at least one test measure (eg, sensitivity, positive predictive values, etc); and (3) the use of a healthcare database (including administrative claims databases, electronic healthcare databases or COPD registries) as a data source. Pairs of reviewers will independently abstract data using standardised forms and will assess quality using a checklist based on the Standards for Reporting of Diagnostic Accuracy (STARD) criteria. This systematic review protocol has been produced in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocol (PRISMA-P) 2015 statement. Ethics approval is not required. Results of this study will be submitted to a peer-reviewed journal for publication. The results from this systematic review will be used for outcome research on COPD and will serve as a guide to identify appropriate case definitions of COPD, and reference standards, for researchers involved in validating healthcare databases. CRD42015029204.
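The test measures the protocol above will abstract (sensitivity, positive predictive value, and so on) all derive from a 2x2 table comparing database codes against the reference-standard case definition. The counts below are invented for illustration.

```python
# Validation metrics for a coded diagnosis against a reference standard,
# as named in the protocol above. tp/fp/fn/tn counts are hypothetical.

def validation_metrics(tp, fp, fn, tn):
    """Standard 2x2-table accuracy measures for a database code."""
    return {
        "sensitivity": tp / (tp + fn),  # coded cases among true cases
        "specificity": tn / (tn + fp),  # uncoded among true non-cases
        "ppv": tp / (tp + fp),          # true cases among coded cases
        "npv": tn / (tn + fn),          # true non-cases among uncoded
    }

m = validation_metrics(tp=80, fp=20, fn=10, tn=890)
print({k: round(v, 3) for k, v in m.items()})
```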

  10. Metabolomics Workbench: An international repository for metabolomics data and metadata, metabolite standards, protocols, tutorials and training, and analysis tools.

    PubMed

    Sud, Manish; Fahy, Eoin; Cotter, Dawn; Azam, Kenan; Vadivelu, Ilango; Burant, Charles; Edison, Arthur; Fiehn, Oliver; Higashi, Richard; Nair, K Sreekumaran; Sumner, Susan; Subramaniam, Shankar

    2016-01-04

    The Metabolomics Workbench, available at www.metabolomicsworkbench.org, is a public repository for metabolomics metadata and experimental data spanning various species and experimental platforms, metabolite standards, metabolite structures, protocols, tutorials, and training material and other educational resources. It provides a computational platform to integrate, analyze, track, deposit and disseminate large volumes of heterogeneous data from a wide variety of metabolomics studies including mass spectrometry (MS) and nuclear magnetic resonance spectrometry (NMR) data spanning over 20 different species covering all the major taxonomic categories including humans and other mammals, plants, insects, invertebrates and microorganisms. Additionally, a number of protocols are provided for a range of metabolite classes, sample types, and both MS and NMR-based studies, along with a metabolite structure database. The metabolites characterized in the studies available on the Metabolomics Workbench are linked to chemical structures in the metabolite structure database to facilitate comparative analysis across studies. The Metabolomics Workbench, part of the data coordinating effort of the National Institutes of Health (NIH) Common Fund's Metabolomics Program, provides data from the Common Fund's Metabolomics Resource Cores, metabolite standards, and analysis tools to the wider metabolomics community and seeks data depositions from metabolomics researchers across the world. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Standardized food images: A photographing protocol and image database.

    PubMed

    Charbonnier, Lisette; van Meer, Floor; van der Laan, Laura N; Viergever, Max A; Smeets, Paul A M

    2016-01-01

    The regulation of food intake has gained much research interest because of the current obesity epidemic. For research purposes, food images are a good and convenient alternative for real food because many dietary decisions are made based on the sight of foods. Food pictures are assumed to elicit anticipatory responses similar to real foods because of learned associations between visual food characteristics and post-ingestive consequences. In contemporary food science, a wide variety of images are used which introduces between-study variability and hampers comparison and meta-analysis of results. Therefore, we created an easy-to-use photographing protocol which enables researchers to generate high resolution food images appropriate for their study objective and population. In addition, we provide a high quality standardized picture set which was characterized in seven European countries. With the use of this photographing protocol a large number of food images were created. Of these images, 80 were selected based on their recognizability in Scotland, Greece and The Netherlands. We collected image characteristics such as liking, perceived calories and/or perceived healthiness ratings from 449 adults and 191 children. The majority of the foods were recognized and liked at all sites. The differences in liking ratings, perceived calories and perceived healthiness between sites were minimal. Furthermore, perceived caloric content and healthiness ratings correlated strongly (r ≥ 0.8) with actual caloric content in both adults and children. The photographing protocol as well as the images and the data are freely available for research use on http://nutritionalneuroscience.eu/. By providing the research community with standardized images and the tools to create their own, comparability between studies will be improved and a head-start is made for a world-wide standardized food image database. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Bedside diagnosis of dysphagia: a systematic review.

    PubMed

    O'Horo, John C; Rogus-Pulia, Nicole; Garcia-Arguello, Lisbeth; Robbins, JoAnne; Safdar, Nasia

    2015-04-01

    Dysphagia is associated with aspiration, pneumonia, and malnutrition, but remains challenging to identify at the bedside. A variety of exam protocols and maneuvers are commonly used, but the efficacy of these maneuvers is highly variable. We conducted a comprehensive search of 7 databases, including MEDLINE, Embase, and Scopus, from each database's earliest inception through June 9, 2014. Studies reporting diagnostic performance of a bedside examination maneuver compared to a reference gold standard (videofluoroscopic swallow study or flexible endoscopic evaluation of swallowing with sensory testing) were included for analysis. From each study, data were abstracted based on the type of diagnostic method and reference standard study population and inclusion/exclusion characteristics, design, and prediction of aspiration. The search strategy identified 38 articles meeting inclusion criteria. Overall, most bedside examinations lacked sufficient sensitivity to be used for screening purposes across all patient populations examined. Individual studies found dysphonia assessments, abnormal pharyngeal sensation assessments, dual axis accelerometry, and 1 description of water swallow testing to be sensitive tools, but none were reported as consistently sensitive. A preponderance of identified studies was in poststroke adults, limiting the generalizability of results. No bedside screening protocol has been shown to provide adequate predictive value for presence of aspiration. Several individual exam maneuvers demonstrated reasonable sensitivity, but reproducibility and consistency of these protocols was not established. More research is needed to design an optimal protocol for dysphagia detection. © 2015 Society of Hospital Medicine.

  13. Evolution of a Patient Information Management System in a Local Area Network Environment at Loyola University of Chicago Medical Center

    PubMed Central

    Price, Ronald N; Chandrasekhar, Arcot J; Tamirisa, Balaji

    1990-01-01

    The Department of Medicine at Loyola University Medical Center (LUMC) of Chicago has implemented a local area network (LAN) based Patient Information Management System (PIMS) as part of its integrated departmental database management system. PIMS consists of related database applications encompassing demographic information, current medications, problem lists, clinical data, prior events, and on-line procedure results. Integration into the existing departmental database system permits PIMS to capture and manipulate data in other departmental applications. Standardization of clinical data is accomplished through three data tables that verify diagnosis codes, procedure codes and a standardized set of clinical data elements. The modularity of the system, coupled with standardized data formats, allowed the development of a Patient Information Protocol System (PIPS). PIPS, a user-definable protocol processor, provides physicians with individualized data entry or review screens customized for their specific research protocols or practice habits. Physician feedback indicates that the PIMS/PIPS combination enhances their ability to collect and review specific patient information by filtering large amounts of clinical data.

  14. Comment on "flexible protocol for quantum private query based on B92 protocol"

    NASA Astrophysics Data System (ADS)

    Chang, Yan; Zhang, Shi-Bin; Zhu, Jing-Min

    2017-03-01

    In a recent paper (Quantum Inf Process 13:805-813, 2014), a flexible quantum private query (QPQ) protocol based on the B92 protocol is presented. Here we point out that the B92-based QPQ protocol does not preserve database security when the channel is lossy; that is, the user (Alice) can learn more records from Bob's database than she has bought.

  15. EuroFlow standardization of flow cytometer instrument settings and immunophenotyping protocols

    PubMed Central

    Kalina, T; Flores-Montero, J; van der Velden, V H J; Martin-Ayuso, M; Böttcher, S; Ritgen, M; Almeida, J; Lhermitte, L; Asnafi, V; Mendonça, A; de Tute, R; Cullen, M; Sedek, L; Vidriales, M B; Pérez, J J; te Marvelde, J G; Mejstrikova, E; Hrusak, O; Szczepański, T; van Dongen, J J M; Orfao, A

    2012-01-01

    The EU-supported EuroFlow Consortium aimed at innovation and standardization of immunophenotyping for diagnosis and classification of hematological malignancies by introducing 8-color flow cytometry with fully standardized laboratory procedures and antibody panels, in order to achieve maximally comparable results among different laboratories. This required the selection of optimal combinations of compatible fluorochromes and the design and evaluation of adequate standard operating procedures (SOPs) for instrument setup, fluorescence compensation and sample preparation. Additionally, we developed software tools for the evaluation of individual antibody reagents and antibody panels. Each section describes what has been evaluated experimentally versus adopted based on existing data and experience. Multicentric evaluation demonstrated high levels of reproducibility based on strict implementation of the EuroFlow SOPs and antibody panels. Overall, the 6 years of extensive collaborative experiments and the analysis of hundreds of cell samples of patients and healthy controls in the EuroFlow centers have provided for the first time laboratory protocols and software tools for fully standardized 8-color flow cytometric immunophenotyping of normal and malignant leukocytes in bone marrow and blood; this has yielded highly comparable data sets, which can be integrated in a single database. PMID:22948490
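The fluorescence-compensation step standardized by the SOPs above can be sketched for a two-color case: each dye spills a known fraction of its signal into the other detector, and compensation multiplies the observed signal vector by the inverse of that spillover matrix. The spillover values below are invented for illustration.

```python
# Sketch of two-color fluorescence compensation (spillover values
# hypothetical). SPILLOVER[i][j] is the fraction of dye i's signal that
# is read by detector j; compensation applies the inverse matrix.

SPILLOVER = [[1.00, 0.15],   # dye 0: 15% spills into detector 1
             [0.10, 1.00]]   # dye 1: 10% spills into detector 0

def invert_2x2(m):
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def compensate(observed):
    """Recover true dye signals from observed detector signals."""
    inv = invert_2x2(SPILLOVER)
    return [observed[0] * inv[0][0] + observed[1] * inv[1][0],
            observed[0] * inv[0][1] + observed[1] * inv[1][1]]

# A pure dye-0 event of intensity 1000 is observed as [1000, 150];
# compensation recovers [1000, 0] up to floating-point rounding.
print([round(x, 6) for x in compensate([1000.0, 150.0])])
```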

  16. Practical Quantum Private Database Queries Based on Passive Round-Robin Differential Phase-shift Quantum Key Distribution.

    PubMed

    Li, Jian; Yang, Yu-Guang; Chen, Xiu-Bo; Zhou, Yi-Hua; Shi, Wei-Min

    2016-08-19

    A novel quantum private database query protocol is proposed, based on passive round-robin differential phase-shift quantum key distribution. Compared with previous quantum private database query protocols, the present protocol has the following unique merits: (i) the user Alice can obtain one and only one key bit, so that both the efficiency and security of the present protocol can be ensured, and (ii) it does not require changing the length difference of the two arms in a Mach-Zehnder interferometer, instead passively choosing two pulses to interfere, which makes it much simpler and more practical. The present protocol is also proved to be secure in terms of the user security and database security.
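The role of that single key bit can be illustrated by the classical retrieval step common to QKD-based private-query protocols (this is a generic sketch of that post-processing stage, not this paper's exact scheme): after the quantum stage, the database owner Bob knows the whole key while Alice knows exactly one bit of it, which is what limits her to a single record.

```python
import secrets

# Generic classical retrieval step of a QKD-based private database
# query (the quantum key distribution itself is out of scope here).
# Alice knows exactly one key bit, so she can decrypt exactly one record.

N = 8
database = [0, 1, 1, 0, 1, 0, 0, 1]              # Bob's one-bit records
key = [secrets.randbelow(2) for _ in range(N)]   # Bob knows all of it
j = 5                                            # Alice knows only key[j]

def query(wanted):
    # Alice announces a shift aligning her known key bit with the record
    # she wants; Bob encrypts every record with the shifted key.
    shift = (j - wanted) % N
    cipher = [db ^ key[(i + shift) % N] for i, db in enumerate(database)]
    # Alice can decrypt only position `wanted`, using key[j].
    return cipher[wanted] ^ key[j]

print(query(2), database[2])  # the two values always agree
```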

  17. Practical Quantum Private Database Queries Based on Passive Round-Robin Differential Phase-shift Quantum Key Distribution

    PubMed Central

    Li, Jian; Yang, Yu-Guang; Chen, Xiu-Bo; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    A novel quantum private database query protocol is proposed, based on passive round-robin differential phase-shift quantum key distribution. Compared with previous quantum private database query protocols, the present protocol has the following unique merits: (i) the user Alice can obtain one and only one key bit, so that both the efficiency and security of the present protocol can be ensured, and (ii) it does not require changing the length difference of the two arms in a Mach-Zehnder interferometer, instead passively choosing two pulses to interfere, which makes it much simpler and more practical. The present protocol is also proved to be secure in terms of the user security and database security. PMID:27539654

  18. Federated or cached searches: Providing expected performance from multiple invasive species databases

    NASA Astrophysics Data System (ADS)

    Graham, Jim; Jarnevich, Catherine S.; Simpson, Annie; Newman, Gregory J.; Stohlgren, Thomas J.

    2011-06-01

    Invasive species are a universal global problem, but the information to identify them, manage them, and prevent invasions is stored around the globe in a variety of formats. The Global Invasive Species Information Network is a consortium of organizations working toward providing seamless access to these disparate databases via the Internet. A distributed network of databases can be created using the Internet and a standard web service protocol. There are two options to provide this integration. First, federated searches are being proposed to allow users to search "deep" web documents such as databases for invasive species. A second method is to create a cache of data from the databases for searching. We compare these two methods and show that federated searches will not provide the performance and flexibility users require, and that a central cache of the data is needed to improve performance.
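The performance trade-off between the two strategies can be sketched with a toy cost model (all latencies invented): a federated search pays every source's network latency on every query, while a cache pays the harvesting cost once and answers subsequent queries locally.

```python
# Toy latency comparison of federated vs cached search across several
# remote databases, as discussed above. All numbers are hypothetical.

SOURCE_LATENCY_MS = [120, 300, 80, 450, 200]  # five remote databases
LOCAL_LOOKUP_MS = 5                           # query against the cache

def federated_cost(n_queries):
    # Each query fans out to all sources and waits for the slowest one.
    return n_queries * max(SOURCE_LATENCY_MS)

def cached_cost(n_queries):
    # One sequential harvest of every source, then local lookups only.
    return sum(SOURCE_LATENCY_MS) + n_queries * LOCAL_LOOKUP_MS

for n in (1, 100):
    print(n, federated_cost(n), cached_cost(n))
```

Under this model the cache loses on the very first query but wins decisively as query volume grows, matching the paper's conclusion; real federated search is worse still, since the slowest source may time out entirely.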

  19. Federated or cached searches: providing expected performance from multiple invasive species databases

    USGS Publications Warehouse

    Graham, Jim; Jarnevich, Catherine S.; Simpson, Annie; Newman, Gregory J.; Stohlgren, Thomas J.

    2011-01-01

    Invasive species are a universal global problem, but the information to identify them, manage them, and prevent invasions is stored around the globe in a variety of formats. The Global Invasive Species Information Network is a consortium of organizations working toward providing seamless access to these disparate databases via the Internet. A distributed network of databases can be created using the Internet and a standard web service protocol. There are two options to provide this integration. First, federated searches are being proposed to allow users to search “deep” web documents such as databases for invasive species. A second method is to create a cache of data from the databases for searching. We compare these two methods and show that federated searches will not provide the performance and flexibility users require, and that a central cache of the data is needed to improve performance.

  20. Exposure to benzodiazepines (anxiolytics, hypnotics and related drugs) in seven European electronic healthcare databases: a cross-national descriptive study from the PROTECT-EU Project.

    PubMed

    Huerta, Consuelo; Abbing-Karahagopian, Victoria; Requena, Gema; Oliva, Belén; Alvarez, Yolanda; Gardarsdottir, Helga; Miret, Montserrat; Schneider, Cornelia; Gil, Miguel; Souverein, Patrick C; De Bruin, Marie L; Slattery, Jim; De Groot, Mark C H; Hesse, Ulrik; Rottenkolber, Marietta; Schmiedl, Sven; Montero, Dolores; Bate, Andrew; Ruigomez, Ana; García-Rodríguez, Luis Alberto; Johansson, Saga; de Vries, Frank; Schlienger, Raymond G; Reynolds, Robert F; Klungel, Olaf H; de Abajo, Francisco José

    2016-03-01

    Studies on drug utilization usually do not allow direct cross-national comparisons because of differences in the respective applied methods. This study aimed to compare time trends in BZD prescribing by applying a common protocol and analysis plan in seven European electronic healthcare databases. Crude and standardized prevalence rates of drug prescribing from 2001-2009 were calculated in databases from Spain, the United Kingdom (UK), The Netherlands, Germany and Denmark. Prevalence was stratified by age, sex, BZD type (using ATC codes: BZD-anxiolytics, BZD-hypnotics, BZD-related drugs and clomethiazole), indication and number of prescriptions. Crude prevalence rates of BZD prescribing ranged from 570 to 1700 per 10,000 person-years over the study period. Standardization by age and sex did not substantially change the differences. Standardized prevalence rates increased in the Spanish (+13%) and UK databases (+2% and +8%) over the study period, while they decreased in the Dutch databases (-4% and -22%) and the German (-12%) and Danish (-26%) databases. Prevalence of anxiolytics outweighed that of hypnotics in the Spanish, Dutch and Bavarian databases, but the reverse was shown in the UK and Danish databases. Prevalence rates consistently increased with age and were two-fold higher in women than in men in all databases. A median of 18% of users received 10 or more prescriptions in 2008. Although similar methods were applied, the prevalence of BZD prescribing varied considerably across different populations. Clinical factors related to BZDs and characteristics of the databases may explain these differences. Copyright © 2015 John Wiley & Sons, Ltd.
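The age-and-sex standardization the study uses to make databases comparable can be sketched with direct standardization: each database's stratum-specific rates are weighted by a shared standard population, so differing population structures do not drive the comparison. All numbers below are invented for illustration.

```python
# Sketch of direct standardization of prevalence rates across databases,
# as applied in the study above. Rates and weights are hypothetical.

# Prevalence per 10,000 person-years by age band, one hypothetical database
age_specific_rates = {"18-44": 300, "45-64": 900, "65+": 2200}

# Shared standard-population weights (must sum to 1)
standard_weights = {"18-44": 0.5, "45-64": 0.3, "65+": 0.2}

def standardized_rate(rates, weights):
    """Directly standardized rate: weighted sum of stratum rates."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(rates[band] * weights[band] for band in rates)

print(standardized_rate(age_specific_rates, standard_weights))
```

Two databases with identical age-specific rates but different age pyramids get the same standardized rate, which is the point of the adjustment; the crude rates would still differ.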

  1. Accuracy of LightCycler(R) SeptiFast for the detection and identification of pathogens in the blood of patients with suspected sepsis: a systematic review protocol.

    PubMed

    Dark, Paul; Wilson, Claire; Blackwood, Bronagh; McAuley, Danny F; Perkins, Gavin D; McMullan, Ronan; Gates, Simon; Warhurst, Geoffrey

    2012-01-01

    Background There is growing interest in the potential utility of molecular diagnostics in improving the detection of life-threatening infection (sepsis). LightCycler® SeptiFast is a multipathogen probe-based real-time PCR system targeting DNA sequences of bacteria and fungi present in blood samples within a few hours. We report here the protocol of the first systematic review of published clinical diagnostic accuracy studies of this technology when compared with blood culture in the setting of suspected sepsis. Methods/design Data sources: the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA), the NHS Economic Evaluation Database (NHSEED), The Cochrane Library, MEDLINE, EMBASE, ISI Web of Science, BIOSIS Previews, MEDION and the Aggressive Research Intelligence Facility Database (ARIF). Study selection: diagnostic accuracy studies that compare the real-time PCR technology with standard culture results performed on a patient's blood sample during the management of sepsis. Data extraction: three reviewers, working independently, will determine the level of evidence, methodological quality and a standard data set relating to demographics and diagnostic accuracy metrics for each study. Statistical analysis/data synthesis: heterogeneity of studies will be investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in Receiver Operator Characteristic (ROC) space. Bivariate model method will be used to estimate summary sensitivity and specificity. The authors will investigate reporting biases using funnel plots based on effective sample size and regression tests of asymmetry. Subgroup analyses are planned for adults, children and infection setting (hospital vs community) if sufficient data are uncovered.
Dissemination Recommendations will be made to the Department of Health (as part of an open-access HTA report) as to whether the real-time PCR technology has sufficient clinical diagnostic accuracy potential to move forward to efficacy testing during the provision of routine clinical care. Registration PROSPERO-NIHR Prospective Register of Systematic Reviews (CRD42011001289).

  2. Enhancing user privacy in SARG04-based private database query protocols

    NASA Astrophysics Data System (ADS)

    Yu, Fang; Qiu, Daowen; Situ, Haozhen; Wang, Xiaoming; Long, Shun

    2015-11-01

    The well-known SARG04 protocol can be used in a private query application to generate an oblivious key. Using this key, the user can retrieve one out of N items from a database without revealing which one he/she is interested in. However, the existing SARG04-based private query protocols are vulnerable to attacks in which the database transmits faked data, since in its canonical form the SARG04 protocol lacks means for either party to defend against attacks from the other. Because such attacks can cause significant loss of user privacy, this paper proposes a variant of the SARG04 protocol with new mechanisms designed to help the user protect their privacy in private query applications. In the protocol, it is the user who starts the session with the database, trying to learn from it bits of a raw key in an oblivious way. An honesty test is used to detect a cheating database that has transmitted faked data. The whole private query protocol has O(N) communication complexity for conveying at least N encrypted items. Compared with the existing SARG04-based protocols, it is more efficient in communication per key bit learned.

  3. Validity of breast, lung and colorectal cancer diagnoses in administrative databases: a systematic review protocol.

    PubMed

    Abraha, Iosief; Giovannini, Gianni; Serraino, Diego; Fusco, Mario; Montedori, Alessandro

    2016-03-18

    Breast, lung and colorectal cancers constitute the most common cancers worldwide and their epidemiology, related health outcomes and quality indicators can be studied using administrative healthcare databases. To constitute a reliable source for research, administrative healthcare databases need to be validated. The aim of this protocol is to perform the first systematic review of studies reporting the validation of International Classification of Diseases 9th and 10th revision codes to identify breast, lung and colorectal cancer diagnoses in administrative healthcare databases. This review protocol has been developed according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocol (PRISMA-P) 2015 statement. We will search the following databases: MEDLINE, EMBASE, Web of Science and the Cochrane Library, using appropriate search strategies. We will include validation studies that used administrative data to identify breast, lung and colorectal cancer diagnoses or studies that evaluated the validity of breast, lung and colorectal cancer codes in administrative data. The following inclusion criteria will be used: (1) the presence of a reference standard case definition for the disease of interest; (2) the presence of at least one test measure (eg, sensitivity, positive predictive values, etc) and (3) the use of a data source from an administrative database. Pairs of reviewers will independently abstract data using standardised forms and will assess quality using a checklist based on the Standards for Reporting of Diagnostic Accuracy (STARD) criteria. Ethics approval is not required. We will submit results of this study to a peer-reviewed journal for publication.
The results will serve as a guide to identify appropriate case definitions and algorithms of breast, lung and colorectal cancers for researchers involved in validating administrative healthcare databases as well as for outcome research on these conditions that used administrative healthcare databases. CRD42015026881. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
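    The test measures named in the inclusion criteria (sensitivity, positive predictive value, etc.) all derive from a 2x2 comparison of administrative codes against the reference standard. A sketch with invented counts, not taken from any study in this review:

```python
# Validity measures for an administrative code, computed from a 2x2 table
# comparing code assignments against a reference-standard diagnosis.
# All counts below are illustrative.

def validity_measures(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true cases that were coded
        "specificity": tn / (tn + fp),   # non-cases correctly not coded
        "ppv": tp / (tp + fp),           # coded cases that are true cases
        "npv": tn / (tn + fn),           # uncoded records truly non-cases
    }

m = validity_measures(tp=90, fp=10, fn=30, tn=870)
print({k: round(v, 3) for k, v in m.items()})
```

    A validation study reports some subset of these measures; the review's inclusion criteria require at least one of them.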

  4. Guidelines for the collection of continuous stream water-temperature data in Alaska

    USGS Publications Warehouse

    Toohey, Ryan C.; Neal, Edward G.; Solin, Gary L.

    2014-01-01

    Objectives of stream monitoring programs differ considerably among many of the academic, Federal, state, tribal, and non-profit organizations in the state of Alaska. Broad inclusion of stream-temperature monitoring can provide an opportunity for collaboration in the development of a statewide stream-temperature database. Statewide and regional coordination could reduce overall monitoring cost, while providing better analyses at multiple spatial and temporal scales to improve resource decision-making. Increased adoption of standardized protocols and data-quality standards may allow for validation of historical modeling efforts with better projection calibration. For records of stream water temperature to be generally consistent, unbiased, and reproducible, data must be collected and analyzed according to documented protocols. Collection of water-temperature data requires definition of data-quality objectives, good site selection, proper selection of instrumentation, proper installation of sensors, periodic site visits to maintain sensors and download data, pre- and post-deployment verification against a NIST-certified thermometer, potential data corrections, and proper documentation, review, and approval. A study created to develop a quality-assurance project plan, data-quality objectives, and a database management plan that includes procedures for data archiving and dissemination could provide a means to standardize a statewide stream-temperature database in Alaska. Protocols can be modified depending on desired accuracy or specific needs of data collected. This document is intended to guide users in collecting time-series water-temperature data in Alaskan streams and draws extensively on the broader protocols already published by the U.S. Geological Survey.
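    The pre- and post-deployment verification against a NIST-certified thermometer supports one common form of data correction: interpolating the sensor bias across the deployment. This is a generic sketch of that idea under invented bias values, not the USGS correction procedure itself:

```python
# Linear drift correction between pre- and post-deployment checks.
# bias = sensor reading minus NIST reference; values are illustrative.

def drift_correct(readings, pre_bias, post_bias):
    """Linearly interpolate the sensor bias from the pre-deployment to
    the post-deployment verification and subtract it from each
    time-ordered reading."""
    n = len(readings)
    corrected = []
    for i, r in enumerate(readings):
        frac = i / (n - 1) if n > 1 else 0.0
        corrected.append(r - (pre_bias + (post_bias - pre_bias) * frac))
    return corrected

temps = [4.12, 7.85, 11.40, 9.02]  # deg C, time-ordered
print([round(t, 3) for t in drift_correct(temps, pre_bias=0.02, post_bias=0.10)])
```

    Whether a correction of this kind is applied, or the record is instead qualified or rejected, depends on the data-quality objectives the abstract calls for.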

  5. Navigating spatial and temporal complexity in developing a long-term land use database for an agricultural watershed

    USDA-ARS?s Scientific Manuscript database

    No comprehensive protocols exist for the collection, standardization, and storage of agronomic management information into a database that preserves privacy, maintains data uncertainty, and translates everyday decisions into quantitative values. This manuscript describes the development of a databas...

  6. Validity of peptic ulcer disease and upper gastrointestinal bleeding diagnoses in administrative databases: a systematic review protocol.

    PubMed

    Montedori, Alessandro; Abraha, Iosief; Chiatti, Carlos; Cozzolino, Francesco; Orso, Massimiliano; Luchetta, Maria Laura; Rimland, Joseph M; Ambrosio, Giuseppe

    2016-09-15

    Administrative healthcare databases are useful to investigate the epidemiology, health outcomes, quality indicators and healthcare utilisation concerning peptic ulcers and gastrointestinal bleeding, but the databases need to be validated in order to be a reliable source for research. The aim of this protocol is to perform the first systematic review of studies reporting the validation of International Classification of Diseases, 9th and 10th Revision (ICD-9 and ICD-10) codes for peptic ulcer and upper gastrointestinal bleeding diagnoses. MEDLINE, EMBASE, Web of Science and the Cochrane Library databases will be searched, using appropriate search strategies. We will include validation studies that used administrative data to identify peptic ulcer disease and upper gastrointestinal bleeding diagnoses or studies that evaluated the validity of peptic ulcer and upper gastrointestinal bleeding codes in administrative data. The following inclusion criteria will be used: (a) the presence of a reference standard case definition for the diseases of interest; (b) the presence of at least one test measure (eg, sensitivity, etc) and (c) the use of an administrative database as a source of data. Pairs of reviewers will independently abstract data using standardised forms and will evaluate quality using the checklist of the Standards for Reporting of Diagnostic Accuracy (STARD) criteria. This systematic review protocol has been produced in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analysis Protocol (PRISMA-P) 2015 statement. Ethics approval is not required given that this is a protocol for a systematic review. We will submit results of this study to a peer-reviewed journal for publication.
The results will serve as a guide for researchers validating administrative healthcare databases to determine appropriate case definitions for peptic ulcer disease and upper gastrointestinal bleeding, as well as to perform outcome research using administrative healthcare databases of these conditions. CRD42015029216. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  7. Thematic Accuracy Assessment of the 2011 National Land Cover Database (NLCD)

    EPA Science Inventory

    Accuracy assessment is a standard protocol of National Land Cover Database (NLCD) mapping. Here we report agreement statistics between map and reference labels for NLCD 2011, which includes land cover for ca. 2001, ca. 2006, and ca. 2011. The two main objectives were assessment o...

  8. Evaluating a NoSQL Alternative for Chilean Virtual Observatory Services

    NASA Astrophysics Data System (ADS)

    Antognini, J.; Araya, M.; Solar, M.; Valenzuela, C.; Lira, F.

    2015-09-01

    Currently, the standards and protocols for data access in the Virtual Observatory architecture (DAL) are generally implemented with relational databases based on SQL. In particular, the Astronomical Data Query Language (ADQL), the language used by the IVOA to represent queries to VO services, was created to satisfy the different data access protocols, such as Simple Cone Search. ADQL is based on SQL92 and has extra functionality implemented using PgSphere. An emerging alternative to SQL is the family of so-called NoSQL databases, which can be classified into several categories such as Column, Document, Key-Value, Graph, and Object stores, each recommended for different scenarios. Notable characteristics include schema-free design, easy replication support, simple APIs, and suitability for Big Data. The Chilean Virtual Observatory (ChiVO) is developing a functional prototype based on the IVOA architecture, considering the following relevant factors: performance, scalability, flexibility, complexity, and functionality. Currently, it is very difficult to compare these factors, due to a lack of alternatives. The objective of this paper is to compare NoSQL alternatives with SQL through the implementation of a REST Web API that satisfies ChiVO's needs: a SESAME-style name resolver for the data from ALMA. Therefore, we propose a test scenario by configuring a NoSQL database with data from different sources and evaluating the feasibility of creating a Simple Cone Search service and its performance. This comparison will help pave the way for the application of Big Data databases in the Virtual Observatory.
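    At its core, a Simple Cone Search service filters catalog entries by angular separation from a query position. A minimal sketch of that predicate as a document store might evaluate it per record; the catalog entries and field names here are invented:

```python
# Simple Cone Search as an angular-separation filter over documents.
# Catalog contents are invented for illustration.
import math

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (spherical law of cosines)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(dec1) * math.sin(dec2)
               + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.degrees(math.acos(min(1.0, max(-1.0, cos_sep))))

catalog = [  # documents as a NoSQL store might hold them
    {"name": "src-1", "ra": 83.82, "dec": -5.39},
    {"name": "src-2", "ra": 84.90, "dec": -5.40},
    {"name": "src-3", "ra": 10.00, "dec": 41.27},
]

def cone_search(docs, ra, dec, radius):
    return [d["name"] for d in docs
            if angular_sep_deg(d["ra"], d["dec"], ra, dec) <= radius]

print(cone_search(catalog, ra=83.82, dec=-5.39, radius=0.5))  # → ['src-1']
```

    In a relational implementation this predicate is what PgSphere supplies to ADQL; in a NoSQL store it must be evaluated in application code or by a geospatial index.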

  9. Executing Complexity-Increasing Queries in Relational (MySQL) and NoSQL (MongoDB and EXist) Size-Growing ISO/EN 13606 Standardized EHR Databases

    PubMed Central

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario

    2018-01-01

    This research presents a protocol to assess the computational complexity of querying relational and non-relational (NoSQL (not only Structured Query Language)) standardized electronic health record (EHR) medical information database systems (DBMS). It uses a set of three doubling-sized databases, i.e. databases storing 5000, 10,000 and 20,000 realistic standardized EHR extracts, in three different database management systems (DBMS): relational MySQL object-relational mapping (ORM), document-based NoSQL MongoDB, and native extensible markup language (XML) NoSQL eXist. The average response times to six complexity-increasing queries were computed, and the results showed a linear behavior in the NoSQL cases. In the NoSQL field, MongoDB presents a much flatter linear slope than eXist. NoSQL systems may also be more appropriate to maintain standardized medical information systems due to the special nature of the updating policies of medical information, which should not affect the consistency and efficiency of the data stored in NoSQL databases. One limitation of this protocol is the lack of direct results of improved relational systems such as archetype relational mapping (ARM) with the same data. However, the interpolation of doubling-size database results to those presented in the literature and other published results suggests that NoSQL systems might be more appropriate in many specific scenarios and problems to be solved. For example, NoSQL may be appropriate for document-based tasks such as EHR extracts used in clinical practice, or editing and visualization, or situations where the aim is not only to query medical information, but also to restore the EHR in exactly its original form. PMID:29608174
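    The protocol's core measurement is a mean response time per query per database size. A generic sketch of that harness; `run_query` is a stand-in for the real DBMS call, and the workload is invented to illustrate the size sweep:

```python
# Mean response time of a query, the measurement underlying the protocol.
import time
import statistics

def mean_response_time(run_query, repeats=10):
    times = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        run_query()
        times.append(time.perf_counter() - t0)
    return statistics.mean(times)

# Illustrative stand-in workload scaled by the nominal number of EHR
# extracts, mimicking the 5000 / 10,000 / 20,000 doubling-size sweep.
for n_extracts in (5000, 10000, 20000):
    t = mean_response_time(lambda: sum(range(n_extracts * 100)))
    print(f"{n_extracts} extracts: {t:.4f} s")
```

    Plotting these means against database size is what exposes the linear slopes the study compares across MySQL, MongoDB, and eXist.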

  10. Executing Complexity-Increasing Queries in Relational (MySQL) and NoSQL (MongoDB and EXist) Size-Growing ISO/EN 13606 Standardized EHR Databases.

    PubMed

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario

    2018-03-19

    This research presents a protocol to assess the computational complexity of querying relational and non-relational (NoSQL (not only Structured Query Language)) standardized electronic health record (EHR) medical information database systems (DBMS). It uses a set of three doubling-sized databases, i.e. databases storing 5000, 10,000 and 20,000 realistic standardized EHR extracts, in three different database management systems (DBMS): relational MySQL object-relational mapping (ORM), document-based NoSQL MongoDB, and native extensible markup language (XML) NoSQL eXist. The average response times to six complexity-increasing queries were computed, and the results showed a linear behavior in the NoSQL cases. In the NoSQL field, MongoDB presents a much flatter linear slope than eXist. NoSQL systems may also be more appropriate to maintain standardized medical information systems due to the special nature of the updating policies of medical information, which should not affect the consistency and efficiency of the data stored in NoSQL databases. One limitation of this protocol is the lack of direct results of improved relational systems such as archetype relational mapping (ARM) with the same data. However, the interpolation of doubling-size database results to those presented in the literature and other published results suggests that NoSQL systems might be more appropriate in many specific scenarios and problems to be solved. For example, NoSQL may be appropriate for document-based tasks such as EHR extracts used in clinical practice, or editing and visualization, or situations where the aim is not only to query medical information, but also to restore the EHR in exactly its original form.

  11. Generation of comprehensive thoracic oncology database--tool for translational research.

    PubMed

    Surati, Mosmi; Robinson, Matthew; Nandi, Suvobroto; Faoro, Leonardo; Demchuk, Carley; Kanteti, Rajani; Ferguson, Benjamin; Gangadhar, Tara; Hensing, Thomas; Hasina, Rifat; Husain, Aliya; Ferguson, Mark; Karrison, Theodore; Salgia, Ravi

    2011-01-22

    The Thoracic Oncology Program Database Project was created to serve as a comprehensive, verified, and accessible repository for well-annotated cancer specimens and clinical data to be available to researchers within the Thoracic Oncology Research Program. This database also captures a large volume of genomic and proteomic data obtained from various tumor tissue studies. A team of clinical and basic science researchers, a biostatistician, and a bioinformatics expert was convened to design the database. Variables of interest were clearly defined and their descriptions were written within a standard operating manual to ensure consistency of data annotation. Using a protocol for prospective tissue banking and another protocol for retrospective banking, tumor and normal tissue samples from patients consented to these protocols were collected. Clinical information such as demographics, cancer characterization, and treatment plans for these patients were abstracted and entered into an Access database. Proteomic and genomic data have been included in the database and have been linked to clinical information for patients described within the database. The data from each table were linked using the relationships function in Microsoft Access to allow the database manager to connect clinical and laboratory information during a query. The queried data can then be exported for statistical analysis and hypothesis generation.
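    The described linkage (clinical tables joined to laboratory tables through the relationships function, then exported for analysis) maps directly onto SQL joins. A minimal sqlite sketch with invented table and column names, standing in for the Access design:

```python
# Relational linkage of clinical and laboratory data, sketched in sqlite.
# Schema and values are invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE patients (patient_id INTEGER PRIMARY KEY, stage TEXT);
    CREATE TABLE proteomics (patient_id INTEGER, marker TEXT, level REAL);
    INSERT INTO patients VALUES (1, 'IIIA'), (2, 'IB');
    INSERT INTO proteomics VALUES (1, 'MET', 2.4), (2, 'MET', 0.9);
""")

# The query a database manager would run to connect clinical and
# laboratory information before exporting for statistical analysis.
rows = con.execute("""
    SELECT p.patient_id, p.stage, x.level
    FROM patients p JOIN proteomics x ON p.patient_id = x.patient_id
    WHERE x.marker = 'MET'
    ORDER BY p.patient_id
""").fetchall()
print(rows)  # → [(1, 'IIIA', 2.4), (2, 'IB', 0.9)]
```

    The shared `patient_id` key plays the role of the Access relationships described in the abstract; the fetched rows are what would be exported for hypothesis generation.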

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wohlgemuth, John; Silverman, Timothy; Miller, David C.

    This paper describes an effort to inspect and evaluate PV modules in order to determine what failure or degradation modes are occurring in field installations. This paper will report on the results of six site visits, including the Sacramento Municipal Utility District (SMUD) Hedge Array, Tucson Electric Power (TEP) Springerville, Central Florida Utility, Florida Solar Energy Center (FSEC), the TEP Solar Test Yard, and University of Toledo installations. The effort here makes use of a recently developed field inspection data collection protocol, and the results were input into a corresponding database. The results of this work have also been used to develop a draft of the IEC standard for climate and application specific accelerated stress testing beyond module qualification.

  13. A Quantum Private Query Protocol for Enhancing both User and Database Privacy

    NASA Astrophysics Data System (ADS)

    Zhou, Yi-Hua; Bai, Xue-Wei; Li, Lei-Lei; Shi, Wei-Min; Yang, Yu-Guang

    2018-01-01

    In order to protect the privacy of both the query user and the database, several QKD-based quantum private query (QPQ) protocols have been proposed. Unfortunately, some of them cannot perfectly resist internal attacks from the database, while others ensure better user privacy only at the cost of reduced database privacy. In this paper, a novel two-way QPQ protocol is proposed to ensure the privacy of both sides of the communication. In our protocol, the user prepares the initial quantum states and derives each key bit by comparing the initial quantum state with the outcome state returned from the database in ctrl or shift mode, instead of announcing two non-orthogonal qubits as other protocols do, which may leak part of the secret information. In this way, not only is database privacy ensured, but user privacy is also strengthened. Furthermore, our protocol is loss tolerant, cheat sensitive, and resistant to the JM attack. Supported by National Natural Science Foundation of China under Grant Nos. U1636106, 61572053, 61472048, 61602019, 61502016; Beijing Natural Science Foundation under Grant Nos. 4152038, 4162005; Basic Research Fund of Beijing University of Technology (No. X4007999201501); The Scientific Research Common Program of Beijing Municipal Commission of Education under Grant No. KM201510005016

  14. The Biological Macromolecule Crystallization Database and NASA Protein Crystal Growth Archive

    PubMed Central

    Gilliland, Gary L.; Tung, Michael; Ladner, Jane

    1996-01-01

    The NIST/NASA/CARB Biological Macromolecule Crystallization Database (BMCD), NIST Standard Reference Database 21, contains crystal data and crystallization conditions for biological macromolecules. The database entries include data abstracted from published crystallographic reports. Each entry consists of information describing the biological macromolecule crystallized and crystal data and the crystallization conditions for each crystal form. The BMCD serves as the NASA Protein Crystal Growth Archive in that it contains protocols and results of crystallization experiments undertaken in microgravity (space). These database entries report the results, whether successful or not, from NASA-sponsored protein crystal growth experiments in microgravity and from microgravity crystallization studies sponsored by other international organizations. The BMCD was designed as a tool to assist x-ray crystallographers in the development of protocols to crystallize biological macromolecules, those that have previously been crystallized, and those that have not been crystallized. PMID:11542472

  15. A novel protocol for dispatcher assisted CPR improves CPR quality and motivation among rescuers-A randomized controlled simulation study.

    PubMed

    Rasmussen, Stinne Eika; Nebsbjerg, Mette Amalie; Krogh, Lise Qvirin; Bjørnshave, Katrine; Krogh, Kristian; Povlsen, Jonas Agerlund; Riddervold, Ingunn Skogstad; Grøfte, Thorbjørn; Kirkegaard, Hans; Løfgren, Bo

    2017-01-01

    Emergency dispatchers use protocols to instruct bystanders in cardiopulmonary resuscitation (CPR). Studies changing one element in the dispatcher's protocol report improved CPR quality. Whether several changes interact is unknown, and the effect of combining multiple changes previously reported to improve CPR quality into one protocol remains to be investigated. We hypothesized that a novel dispatch protocol combining multiple beneficial elements improves CPR quality compared with a standard protocol. A novel dispatch protocol was designed, including wording on chest compressions, use of a metronome, regular encouragement and a 10-s rest each minute. In a simulated cardiac arrest scenario, laypersons were randomized to perform single-rescuer CPR guided by the novel or the standard protocol. The primary outcome was a composite endpoint of time to first compression, hand position, compression depth and rate, and hands-off time (maximum score: 22 points). Afterwards, participants answered a questionnaire evaluating the dispatcher assistance. The novel protocol (n=61) improved the CPR quality score compared with the standard protocol (n=64) (mean (SD): 18.6 (1.4) points vs. 17.5 (1.7) points, p<0.001). The novel protocol resulted in deeper chest compressions (mean (SD): 58 (12) mm vs. 52 (13) mm, p=0.02) and a higher rate of correct hand position (61% vs. 36%, p=0.01) compared with the standard protocol. In both protocols hands-off time was short. The novel protocol improved motivation among rescuers compared with the standard protocol (p=0.002). Participants guided by a standard dispatch protocol performed high-quality CPR. A novel bundle-of-care protocol improved the CPR quality score and motivation among rescuers. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  16. Accuracy of LightCycler® SeptiFast for the detection and identification of pathogens in the blood of patients with suspected sepsis: a systematic review protocol

    PubMed Central

    Wilson, Claire; Blackwood, Bronagh; McAuley, Danny F; Perkins, Gavin D; McMullan, Ronan; Gates, Simon; Warhurst, Geoffrey

    2012-01-01

    Background There is growing interest in the potential utility of molecular diagnostics in improving the detection of life-threatening infection (sepsis). LightCycler® SeptiFast is a multipathogen probe-based real-time PCR system targeting DNA sequences of bacteria and fungi present in blood samples within a few hours. We report here the protocol of the first systematic review of published clinical diagnostic accuracy studies of this technology when compared with blood culture in the setting of suspected sepsis. Methods/design Data sources: the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA), the NHS Economic Evaluation Database (NHSEED), The Cochrane Library, MEDLINE, EMBASE, ISI Web of Science, BIOSIS Previews, MEDION and the Aggressive Research Intelligence Facility Database (ARIF). Study selection: diagnostic accuracy studies that compare the real-time PCR technology with standard culture results performed on a patient's blood sample during the management of sepsis. Data extraction: three reviewers, working independently, will determine the level of evidence, methodological quality and a standard data set relating to demographics and diagnostic accuracy metrics for each study. Statistical analysis/data synthesis: heterogeneity of studies will be investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in Receiver Operator Characteristic (ROC) space. Bivariate model method will be used to estimate summary sensitivity and specificity. The authors will investigate reporting biases using funnel plots based on effective sample size and regression tests of asymmetry. Subgroup analyses are planned for adults, children and infection setting (hospital vs community) if sufficient data are uncovered. 
Dissemination Recommendations will be made to the Department of Health (as part of an open-access HTA report) as to whether the real-time PCR technology has sufficient clinical diagnostic accuracy potential to move forward to efficacy testing during the provision of routine clinical care. Registration PROSPERO—NIHR Prospective Register of Systematic Reviews (CRD42011001289). PMID:22240646

  17. A novel method for efficient archiving and retrieval of biomedical images using MPEG-7

    NASA Astrophysics Data System (ADS)

    Meyer, Joerg; Pahwa, Ash

    2004-10-01

    Digital archiving and efficient retrieval of radiological scans have become critical steps in contemporary medical diagnostics. Since more and more images and image sequences (single scans or video) from various modalities (CT/MRI/PET/digital X-ray) are now available in digital formats (e.g., DICOM-3), hospitals and radiology clinics need to implement efficient protocols capable of managing the enormous amounts of data generated daily in a typical clinical routine. We present a method that appears to be a viable way to eliminate the tedious step of manually annotating image and video material for database indexing. MPEG-7 is a new framework that standardizes the way images are characterized in terms of color, shape, and other abstract, content-related criteria. A set of standardized descriptors that are automatically generated from an image is used to compare an image to other images in a database, and to compute the distance between two images for a given application domain. Text-based database queries can be replaced with image-based queries using MPEG-7. Consequently, image queries can be conducted without any prior knowledge of the keys that were used as indices in the database. Since the decoding and matching steps are not part of the MPEG-7 standard, this method also enables searches that were not planned at the time the keys were generated.
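    Descriptor-based matching of the kind described reduces to comparing feature vectors; since MPEG-7 standardizes the descriptors but leaves the distance function to the application, the L1 metric on a toy color histogram below is purely an illustration, with all image names and values invented:

```python
# Nearest-neighbor retrieval over toy descriptor vectors, illustrating
# descriptor-based (rather than text-keyed) image search.

def l1_distance(desc_a, desc_b):
    """Sum of absolute bin differences between two descriptor vectors."""
    assert len(desc_a) == len(desc_b)
    return sum(abs(a - b) for a, b in zip(desc_a, desc_b))

# Toy 4-bin normalized color histograms for three stored images
database = {
    "ct_scan_017": [0.70, 0.20, 0.05, 0.05],
    "ct_scan_018": [0.68, 0.22, 0.06, 0.04],
    "xray_003":    [0.10, 0.15, 0.40, 0.35],
}
query = [0.69, 0.21, 0.05, 0.05]

best = min(database, key=lambda k: l1_distance(database[k], query))
print(best)  # → ct_scan_017
```

    The point of the scheme is that `query` is computed automatically from the query image itself, so no manually assigned index keys are needed.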

  18. Does an Otolaryngology-Specific Database Have Added Value? A Comparative Feasibility Analysis.

    PubMed

    Bellmunt, Angela M; Roberts, Rhonda; Lee, Walter T; Schulz, Kris; Pynnonen, Melissa A; Crowson, Matthew G; Witsell, David; Parham, Kourosh; Langman, Alan; Vambutas, Andrea; Ryan, Sheila E; Shin, Jennifer J

    2016-07-01

    There are multiple nationally representative databases that support epidemiologic and outcomes research, and it is unknown whether an otolaryngology-specific resource would prove indispensable or superfluous. Therefore, our objective was to determine the feasibility of analyses in the National Ambulatory Medical Care Survey (NAMCS) and National Hospital Ambulatory Medical Care Survey (NHAMCS) databases as compared with the otolaryngology-specific Creating Healthcare Excellence through Education and Research (CHEER) database. Parallel analyses in 2 data sets. Ambulatory visits in the United States. To test a fixed hypothesis that could be directly compared between data sets, we focused on a condition with expected prevalence high enough to substantiate availability in both. This query also encompassed a broad span of diagnoses to sample the breadth of available information. Specifically, we compared an assessment of suspected risk factors for sensorineural hearing loss in subjects 0 to 21 years of age, according to a predetermined protocol. We also assessed the feasibility of 6 additional diagnostic queries among all age groups. In the NAMCS/NHAMCS data set, the number of measured observations was not sufficient to support reliable numeric conclusions (percentage standard error among risk factors: 38.6-92.1). Analysis of the CHEER database demonstrated that age, sex, meningitis, and cytomegalovirus were statistically significant factors associated with pediatric sensorineural hearing loss (P < .01). Among the 6 additional diagnostic queries assessed, NAMCS/NHAMCS usage was also infeasible; the CHEER database contained 1585 to 212,521 more observations per annum. An otolaryngology-specific database has added utility when compared with already available national ambulatory databases. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2016.
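    The feasibility criterion in this comparison hinges on the percentage (relative) standard error of survey estimates. A sketch of that computation with invented estimates; the ~30% reliability threshold is a common survey-statistics convention, not a figure from this study:

```python
# Relative (percentage) standard error of a survey estimate; estimates
# with RSE above ~30% are conventionally flagged as unreliable.
# All values are illustrative.

def percent_standard_error(estimate, se):
    return 100.0 * se / estimate

for name, est, se in [("risk factor A", 0.042, 0.016),
                      ("risk factor B", 0.013, 0.011)]:
    rse = percent_standard_error(est, se)
    flag = "unreliable" if rse > 30 else "ok"
    print(f"{name}: RSE {rse:.1f}% ({flag})")
```

    The reported 38.6-92.1% range of percentage standard errors in NAMCS/NHAMCS is what rendered those estimates infeasible, while the larger CHEER sample kept its estimates below such thresholds.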

  19. The TERRA-PNW Dataset: A New Source for Standardized Plant Trait, Forest Carbon Cycling, and Soil Properties Measurements from the Pacific Northwest US, 2000-2014.

    NASA Astrophysics Data System (ADS)

    Berner, L. T.; Law, B. E.

    2015-12-01

    Plant traits include physiological, morphological, and biogeochemical characteristics that in combination determine a species' sensitivity to environmental conditions. Standardized, co-located, and geo-referenced species- and plot-level measurements are needed to address variation in species' sensitivity to climate change impacts and for ecosystem process model development, parameterization, and testing. We present a new database of plant trait, forest carbon cycling, and soil property measurements derived from multiple TERRA-PNW projects in the Pacific Northwest US, spanning 2000-2014. The database includes measurements from over 200 forest plots across Oregon and northern California, where the data were explicitly collected for scaling and modeling regional terrestrial carbon processes with models such as Biome-BGC and the Community Land Model. Some of the data are co-located at AmeriFlux sites in the region. The database currently contains leaf trait measurements (specific leaf area, leaf longevity, leaf carbon and nitrogen) from over 1,200 branch samples and 30 species, as well as plot-level biomass and productivity components, and soil carbon and nitrogen. Standardized protocols were used across projects, as summarized in an FAO protocols document. The database continues to expand and will include agricultural crops. The database will be hosted by the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). We hope that other regional databases will become publicly available to help enable Earth system models to simulate species-level sensitivity to climate at regional to global scales.

  20. GMOMETHODS: the European Union database of reference methods for GMO analysis.

    PubMed

    Bonfini, Laura; Van den Bulcke, Marc H; Mazzara, Marco; Ben, Enrico; Patak, Alexandre

    2012-01-01

    In order to provide reliable and harmonized information on methods for GMO (genetically modified organism) analysis we have published a database called "GMOMETHODS" that supplies information on PCR assays validated according to the principles and requirements of ISO 5725 and/or the International Union of Pure and Applied Chemistry protocol. In addition, the database contains methods that have been verified by the European Union Reference Laboratory for Genetically Modified Food and Feed in the context of compliance with a European Union legislative act. The web application provides search capabilities to retrieve primer and probe sequence information for the available methods. It further supplies core data required by analytical labs to carry out GM tests and comprises information on the applied reference material and plasmid standards. The GMOMETHODS database currently contains 118 different PCR methods allowing identification of 51 single GM events and 18 taxon-specific genes in a sample. It also provides screening assays for detection of eight different genetic elements commonly used for the development of GMOs. The application is referred to by the Biosafety Clearing House, a global mechanism set up by the Cartagena Protocol on Biosafety to facilitate the exchange of information on Living Modified Organisms. The publication of the GMOMETHODS database can be considered an important step toward worldwide standardization and harmonization in GMO analysis.

  1. Pressure ulcer risk assessment and prevention: a systematic comparative effectiveness review.

    PubMed

    Chou, Roger; Dana, Tracy; Bougatsos, Christina; Blazina, Ian; Starmer, Amy J; Reitel, Katie; Buckley, David I

    2013-07-02

    Pressure ulcers are associated with substantial health burdens but may be preventable. To review the clinical utility of pressure ulcer risk assessment instruments and the comparative effectiveness of preventive interventions in persons at higher risk. MEDLINE (1946 through November 2012), CINAHL, the Cochrane Library, grant databases, clinical trial registries, and reference lists. Randomized trials and observational studies on effects of using risk assessment on clinical outcomes and randomized trials of preventive interventions on clinical outcomes. Multiple investigators abstracted and checked study details and quality using predefined criteria. One good-quality trial found no evidence that use of a pressure ulcer risk assessment instrument, with or without a protocolized intervention strategy based on assessed risk, reduces risk for incident pressure ulcers compared with less standardized risk assessment based on nurses' clinical judgment. In higher-risk populations, 1 good-quality and 4 fair-quality randomized trials found that more advanced static support surfaces were associated with lower risk for pressure ulcers compared with standard mattresses (relative risk range, 0.20 to 0.60). Evidence on the effectiveness of low-air-loss and alternating-air mattresses was limited, with some trials showing no clear differences from advanced static support surfaces. Evidence on the effectiveness of nutritional supplementation, repositioning, and skin care interventions versus usual care was limited and had methodological shortcomings, precluding strong conclusions. Only English-language articles were included, publication bias could not be formally assessed, and most studies had methodological shortcomings. More advanced static support surfaces are more effective than standard mattresses for preventing ulcers in higher-risk populations. 
The effectiveness of formal risk assessment instruments and associated intervention protocols compared with less standardized assessment methods and the effectiveness of other preventive interventions compared with usual care have not been clearly established.

  2. Secure quantum private information retrieval using phase-encoded queries

    NASA Astrophysics Data System (ADS)

    Olejnik, Lukasz

    2011-08-01

    We propose a quantum solution to the classical private information retrieval (PIR) problem, which allows one to query a database in a private manner. The protocol offers privacy thresholds and allows the user to obtain information from a database in a way that offers the potential adversary, in this model the database owner, no possibility of deterministically establishing the query contents. This protocol may also be viewed as a solution to the symmetrically private information retrieval problem in that it can offer database security (inability for a querying user to steal its contents). Compared to classical solutions, the protocol offers substantial improvement in terms of communication complexity. In comparison with the recent quantum private queries [Phys. Rev. Lett. 100, 230502 (2008)] protocol, it is more efficient in terms of communication complexity and the number of rounds, while offering a clear privacy parameter. We discuss the security of the protocol and analyze its strengths and conclude that using this technique makes it challenging to obtain the unconditional (in the information-theoretic sense) privacy degree; nevertheless, in addition to being simple, the protocol still offers a privacy level. The oracle used in the protocol is inspired both by the classical computational PIR solutions as well as the Deutsch-Jozsa oracle.

  3. Secure quantum private information retrieval using phase-encoded queries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olejnik, Lukasz

    We propose a quantum solution to the classical private information retrieval (PIR) problem, which allows one to query a database in a private manner. The protocol offers privacy thresholds and allows the user to obtain information from a database in a way that offers the potential adversary, in this model the database owner, no possibility of deterministically establishing the query contents. This protocol may also be viewed as a solution to the symmetrically private information retrieval problem in that it can offer database security (inability for a querying user to steal its contents). Compared to classical solutions, the protocol offers substantial improvement in terms of communication complexity. In comparison with the recent quantum private queries [Phys. Rev. Lett. 100, 230502 (2008)] protocol, it is more efficient in terms of communication complexity and the number of rounds, while offering a clear privacy parameter. We discuss the security of the protocol and analyze its strengths and conclude that using this technique makes it challenging to obtain the unconditional (in the information-theoretic sense) privacy degree; nevertheless, in addition to being simple, the protocol still offers a privacy level. The oracle used in the protocol is inspired both by the classical computational PIR solutions as well as the Deutsch-Jozsa oracle.
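    As background to the oracle mentioned above, the classical PIR constructions that inspired it are not spelled out in the abstract; the standard two-server, information-theoretic PIR scheme can be sketched as follows (a minimal illustration, not the quantum protocol itself):

```python
import secrets

def xor_bits(db, subset):
    """Server side: XOR of the database bits at the requested indices."""
    acc = 0
    for i in subset:
        acc ^= db[i]
    return acc

def pir_read(db1, db2, i):
    """Client side: query two non-colluding replicas; neither server alone
    learns the target index i, since each sees a uniformly random subset."""
    n = len(db1)
    s1 = {j for j in range(n) if secrets.randbits(1)}  # uniformly random subset
    s2 = s1 ^ {i}                                      # flip membership of i only
    return xor_bits(db1, s1) ^ xor_bits(db2, s2)       # all bits cancel except bit i

db = [1, 0, 1, 1, 0, 0, 1, 0]
bit = pir_read(db, db, 3)  # same list passed twice stands in for two replicas
```

    Communication here is linear in the database size, which is precisely what computational and quantum PIR variants aim to improve.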

  4. Assessing operating characteristics of CAD algorithms in the absence of a gold standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roy Choudhury, Kingshuk; Paik, David S.; Yi, Chin A.

    2010-04-15

    Purpose: The authors examine potential bias when using a reference reader panel as "gold standard" for estimating operating characteristics of CAD algorithms for detecting lesions. As an alternative, the authors propose latent class analysis (LCA), which does not require an external gold standard to evaluate diagnostic accuracy. Methods: A binomial model for multiple reader detections using different diagnostic protocols was constructed, assuming conditional independence of readings given true lesion status. Operating characteristics of all protocols were estimated by maximum likelihood LCA. Reader panel and LCA based estimates were compared using data simulated from the binomial model for a range of operating characteristics. LCA was applied to 36 thin section thoracic computed tomography data sets from the Lung Image Database Consortium (LIDC): Free search markings of four radiologists were compared to markings from four different CAD assisted radiologists. For real data, bootstrap-based resampling methods, which accommodate dependence in reader detections, are proposed to test hypotheses of differences between detection protocols. Results: In simulation studies, reader panel based sensitivity estimates had an average relative bias (ARB) of -23% to -27%, significantly higher (p-value <0.0001) than LCA (ARB -2% to -6%). Specificity was well estimated by both reader panel (ARB -0.6% to -0.5%) and LCA (ARB 1.4% to 0.5%). Among the 1145 lesion candidates considered in the LIDC data, the LCA estimated sensitivity of reference readers (55%) was significantly lower (p-value 0.006) than that of CAD assisted readers (68%). Average false positives per patient for reference readers (0.95) was not significantly lower (p-value 0.28) than for CAD assisted readers (1.27). Conclusions: Whereas a gold standard based on a consensus of readers may substantially bias sensitivity estimates, LCA may be a significantly more accurate and consistent means for evaluating diagnostic accuracy.
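    The latent class analysis described above can be sketched as a self-contained two-class EM fit, assuming (as the paper's binomial model does) conditionally independent reader detections given true lesion status. The starting values and simulation parameters below are illustrative assumptions, not the paper's settings:

```python
import random

def lca_em(detections, n_readers, iters=300):
    """Two-class latent class model fit by EM: each candidate is truly a lesion
    (class 1) or not (class 0); readers mark it independently with probability
    sens (class 1) or fpr (class 0). detections[k] = number of readers who
    marked candidate k. Returns (prevalence, sensitivity, false-positive rate)."""
    pi, sens, fpr = 0.5, 0.8, 0.2  # initial guesses; sens > fpr breaks the label symmetry
    for _ in range(iters):
        # E-step: posterior probability that each candidate is a true lesion
        post = []
        for d in detections:
            l1 = pi * sens ** d * (1 - sens) ** (n_readers - d)
            l0 = (1 - pi) * fpr ** d * (1 - fpr) ** (n_readers - d)
            post.append(l1 / (l1 + l0))
        # M-step: re-estimate prevalence, sensitivity, false-positive rate
        total = sum(post)
        pi = total / len(post)
        sens = sum(p * d for p, d in zip(post, detections)) / (n_readers * total)
        fpr = sum((1 - p) * d for p, d in zip(post, detections)) / (n_readers * (len(post) - total))
    return pi, sens, fpr

# Synthetic check: 4 readers, true prevalence 0.4, sensitivity 0.9, FPR 0.1
random.seed(1)
dets = []
for _ in range(3000):
    rate = 0.9 if random.random() < 0.4 else 0.1
    dets.append(sum(random.random() < rate for _ in range(4)))
pi_hat, sens_hat, fpr_hat = lca_em(dets, 4)
```

    No external truth is supplied to `lca_em`; the operating characteristics are recovered from the pattern of reader agreement alone, which is the point of using LCA when no gold standard exists.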

  5. Migration of legacy mumps applications to relational database servers.

    PubMed

    O'Kane, K C

    2001-07-01

    An extended implementation of the Mumps language is described that facilitates vendor neutral migration of legacy Mumps applications to SQL-based relational database servers. Implemented as a compiler, this system translates Mumps programs to operating system independent, standard C code for subsequent compilation to fully stand-alone, binary executables. Added built-in functions and support modules extend the native hierarchical Mumps database with access to industry standard, networked, relational database management servers (RDBMS), thus freeing Mumps applications from dependence upon vendor specific, proprietary, unstandardized database models. Unlike Mumps systems that have added captive, proprietary RDBMS access, the programs generated by this development environment can be used with any RDBMS system that supports common network access protocols. Additional features include a built-in web server interface and the ability to interoperate directly with programs and functions written in other languages.
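    The abstract does not show the translation scheme itself. One common way to project a hierarchical Mumps global onto a relational table is a (name, subscripts, value) triple store, sketched here in Python with SQLite standing in for the networked RDBMS; the table layout and sample data are assumptions for illustration, not the paper's actual mapping:

```python
import sqlite3

def flatten_global(name, node, subs=()):
    """Yield (global_name, subscript_path, value) rows from a nested dict that
    mimics a Mumps global such as ^PATIENT(123,"NAME")="SMITH,JOHN"."""
    for key, val in node.items():
        path = subs + (str(key),)
        if isinstance(val, dict):
            yield from flatten_global(name, val, path)
        else:
            yield (name, ",".join(path), str(val))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE globals (name TEXT, subscripts TEXT, value TEXT)")
patient = {123: {"NAME": "SMITH,JOHN", "DOB": "1950-01-01"}}
conn.executemany("INSERT INTO globals VALUES (?, ?, ?)",
                 flatten_global("^PATIENT", patient))
row = conn.execute("SELECT value FROM globals "
                   "WHERE name = '^PATIENT' AND subscripts = '123,NAME'").fetchone()
```

    Because every global node becomes one row, any SQL-capable server reachable over a standard network protocol can hold the data, which is the vendor-neutrality the abstract emphasizes.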

  6. Similarity-based modeling in large-scale prediction of drug-drug interactions.

    PubMed

    Vilar, Santiago; Uriarte, Eugenio; Santana, Lourdes; Lorberbaum, Tal; Hripcsak, George; Friedman, Carol; Tatonetti, Nicholas P

    2014-09-01

    Drug-drug interactions (DDIs) are a major cause of adverse drug effects and a public health concern, as they increase hospital care expenses and reduce patients' quality of life. DDI detection is, therefore, an important objective in patient safety, one whose pursuit affects drug development and pharmacovigilance. In this article, we describe a protocol applicable on a large scale to predict novel DDIs based on similarity of drug interaction candidates to drugs involved in established DDIs. The method integrates a reference standard database of known DDIs with drug similarity information extracted from different sources, such as 2D and 3D molecular structure, interaction profile, target and side-effect similarities. The method is interpretable in that it generates drug interaction candidates that are traceable to pharmacological or clinical effects. We describe a protocol with applications in patient safety and preclinical toxicity screening. The time frame to implement this protocol is 5-7 h, with additional time potentially necessary, depending on the complexity of the reference standard DDI database and the similarity measures implemented.
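    A minimal sketch of the similarity-based candidate generation idea, assuming Tanimoto similarity over binary structural fingerprints. The drug names, fingerprints, and threshold below are hypothetical, and the published protocol integrates several additional similarity sources (3D structure, interaction profiles, targets, side effects):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two fingerprints given as sets of on-bits."""
    inter = len(fp_a & fp_b)
    return inter / (len(fp_a) + len(fp_b) - inter) if (fp_a or fp_b) else 0.0

def predict_ddis(query, known_ddis, fingerprints, threshold=0.5):
    """If the query drug resembles drug a of an established interaction (a, b),
    propose (query, b) as a candidate DDI, traceable to the evidence pair."""
    candidates = []
    for a, b in known_ddis:
        sim = tanimoto(fingerprints[query], fingerprints[a])
        if a != query and sim >= threshold:
            candidates.append((query, b, a, sim))  # (new pair..., evidence drug, score)
    return sorted(candidates, key=lambda c: c[3], reverse=True)

fingerprints = {"drugX": {1, 2, 3, 4}, "drugA": {1, 2, 3, 5}, "drugC": {8, 9}}
known_ddis = [("drugA", "drugB"), ("drugC", "drugD")]
candidates = predict_ddis("drugX", known_ddis, fingerprints)
```

    Keeping the evidence drug in each candidate tuple is what makes the prediction interpretable: every proposed interaction traces back to an established DDI and a similarity score.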

  7. Automated database-guided expert-supervised orientation for immunophenotypic diagnosis and classification of acute leukemia

    PubMed Central

    Lhermitte, L; Mejstrikova, E; van der Sluijs-Gelling, A J; Grigore, G E; Sedek, L; Bras, A E; Gaipa, G; Sobral da Costa, E; Novakova, M; Sonneveld, E; Buracchi, C; de Sá Bacelar, T; te Marvelde, J G; Trinquand, A; Asnafi, V; Szczepanski, T; Matarraz, S; Lopez, A; Vidriales, B; Bulsa, J; Hrusak, O; Kalina, T; Lecrevisse, Q; Martin Ayuso, M; Brüggemann, M; Verde, J; Fernandez, P; Burgos, L; Paiva, B; Pedreira, C E; van Dongen, J J M; Orfao, A; van der Velden, V H J

    2018-01-01

    Precise classification of acute leukemia (AL) is crucial for adequate treatment. EuroFlow has previously designed an AL orientation tube (ALOT) to guide towards the relevant classification panel (T-cell acute lymphoblastic leukemia (T-ALL), B-cell precursor (BCP)-ALL and/or acute myeloid leukemia (AML)) and final diagnosis. Now we built a reference database with 656 typical AL samples (145 T-ALL, 377 BCP-ALL, 134 AML), processed and analyzed via standardized protocols. Using principal component analysis (PCA)-based plots and automated classification algorithms for direct comparison of single-cells from individual patients against the database, another 783 cases were subsequently evaluated. Depending on the database-guided results, patients were categorized as: (i) typical T, B or Myeloid without or; (ii) with a transitional component to another lineage; (iii) atypical; or (iv) mixed-lineage. Using this automated algorithm, in 781/783 cases (99.7%) the right panel was selected, and data comparable to the final WHO-diagnosis was already provided in >93% of cases (85% T-ALL, 97% BCP-ALL, 95% AML and 87% mixed-phenotype AL patients), even without data on the full-characterization panels. Our results show that database-guided analysis facilitates standardized interpretation of ALOT results and allows accurate selection of the relevant classification panels, hence providing a solid basis for designing future WHO AL classifications. PMID:29089646

  8. Development and evaluation of a study design typology for human research.

    PubMed

    Carini, Simona; Pollock, Brad H; Lehmann, Harold P; Bakken, Suzanne; Barbour, Edward M; Gabriel, Davera; Hagler, Herbert K; Harper, Caryn R; Mollah, Shamim A; Nahm, Meredith; Nguyen, Hien H; Scheuermann, Richard H; Sim, Ida

    2009-11-14

    A systematic classification of study designs would be useful for researchers, systematic reviewers, readers, and research administrators, among others. As part of the Human Studies Database Project, we developed the Study Design Typology to standardize the classification of study designs in human research. We then performed a multiple observer masked evaluation of active research protocols in four institutions according to a standardized protocol. Thirty-five protocols were classified by three reviewers each into one of nine high-level study designs for interventional and observational research (e.g., N-of-1, Parallel Group, Case Crossover). Rater classification agreement was moderately high for the 35 protocols (Fleiss' kappa = 0.442) and higher still for the 23 quantitative studies (Fleiss' kappa = 0.463). We conclude that our typology shows initial promise for reliably distinguishing study design types for quantitative human research.
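    The agreement statistic reported above can be reproduced with a short Fleiss' kappa implementation (the standard formula, not code from the Human Studies Database Project; the example counts are illustrative):

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for ratings[i][j] = number of raters placing subject i
    in category j; every row must sum to the same rater count n."""
    N = len(ratings)
    n = sum(ratings[0])
    # observed agreement, averaged over subjects
    P_bar = sum((sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings) / N
    # chance agreement from the marginal category proportions
    p_j = [sum(row[j] for row in ratings) / (N * n) for j in range(len(ratings[0]))]
    P_e = sum(p * p for p in p_j)
    return (P_bar - P_e) / (1 - P_e)

# 3 raters, 2 categories: perfect agreement on both subjects gives kappa = 1
kappa = fleiss_kappa([[3, 0], [0, 3]])
```

    In the study's setting each of the 35 protocols would be one row, with 3 raters distributing their votes across the 9 design categories.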

  9. Relativistic quantum private database queries

    NASA Astrophysics Data System (ADS)

    Sun, Si-Jia; Yang, Yu-Guang; Zhang, Ming-Ou

    2015-04-01

    Recently, Jakobi et al. (Phys Rev A 83, 022301, 2011) suggested the first practical private database query protocol (J-protocol) based on the Scarani et al. (Phys Rev Lett 92, 057901, 2004) quantum key distribution protocol. Unfortunately, the J-protocol is just a cheat-sensitive private database query protocol. In this paper, we present an idealized relativistic quantum private database query protocol based on Minkowski causality and the properties of quantum information. Also, we prove that the protocol is secure in terms of the user security and the database security.

  10. Incremental yield of dysplasia detection in Barrett's esophagus using volumetric laser endomicroscopy with and without laser marking compared with a standardized random biopsy protocol.

    PubMed

    Alshelleh, Mohammad; Inamdar, Sumant; McKinley, Matthew; Stewart, Molly; Novak, Jeffrey S; Greenberg, Ronald E; Sultan, Keith; Devito, Bethany; Cheung, Mary; Cerulli, Maurice A; Miller, Larry S; Sejpal, Divyesh V; Vegesna, Anil K; Trindade, Arvind J

    2018-02-02

    Volumetric laser endomicroscopy (VLE) is a new wide-field advanced imaging technology for Barrett's esophagus (BE). No data exist on incremental yield of dysplasia detection. Our aim is to report the incremental yield of dysplasia detection in BE using VLE. This is a retrospective study from a prospectively maintained database from 2011 to 2017 comparing the dysplasia yield of 4 different surveillance strategies in an academic BE tertiary care referral center. The groups were (1) random biopsies (RB), (2) Seattle protocol random biopsies (SP), (3) VLE without laser marking (VLE), and (4) VLE with laser marking (VLEL). A total of 448 consecutive patients (79 RB, 95 SP, 168 VLE, and 106 VLEL) met the inclusion criteria. After adjusting for visible lesions, the total dysplasia yield was 5.7%, 19.6%, 24.8%, and 33.7%, respectively. When compared with just the SP group, the VLEL group had statistically higher rates of overall dysplasia yield (19.6% vs 33.7%, P = .03; odds ratio, 2.1, P = .03). Both the VLEL and VLE groups had statistically significant differences in neoplasia (high-grade dysplasia and intramucosal cancer) detection compared with the SP group (14% vs 1%, P = .001 and 11% vs 1%, P = .003). A surveillance strategy involving VLEL led to a statistically significant higher yield of dysplasia and neoplasia detection compared with a standard random biopsy protocol. These results support the use of VLEL for surveillance in BE in academic centers. Copyright © 2018 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  11. Gonadotrophin-releasing hormone antagonists for assisted conception.

    PubMed

    Al-Inany, H G; Abou-Setta, A M; Aboulghar, M

    2006-07-19

    Gonadotrophin-releasing hormone antagonists produce immediate suppression of gonadotrophin secretion; hence, they can be given after starting gonadotrophin administration, which has resulted in a dramatic reduction in the duration of the treatment cycle. Two different regimens have been described. The multiple-dose protocol involves the administration of 0.25 mg cetrorelix (or ganirelix) daily from day six to seven of stimulation, or when the leading follicle is 14 to 15 mm, until human chorionic gonadotrophin (HCG) administration, and the single-dose protocol involves a single administration of 3 mg cetrorelix on day seven to eight of stimulation. Assuming comparable clinical outcome, these benefits would justify a change from the standard long protocol of GnRH agonists to the new GnRH antagonist regimens. To evaluate the evidence regarding the efficacy of gonadotrophin-releasing hormone (GnRH) antagonists compared with the standard long protocol of GnRH agonists for controlled ovarian hyperstimulation in assisted conception. We searched the Cochrane Menstrual Disorders and Subfertility Group's Specialised Register, MEDLINE and EMBASE databases from 1987 to February 2006, and handsearched bibliographies of relevant publications and reviews, and abstracts of scientific meetings. We also contacted manufacturers in the field. Randomized controlled studies comparing different protocols of GnRH antagonists with GnRH agonists in assisted conception cycles were included in this review. Two authors independently assessed trial quality and extracted data. If relevant data were missing or unclear, the authors were consulted. Twenty-seven RCTs comparing the GnRH antagonist to the long protocol of GnRH agonist fulfilled the inclusion criteria. Clinical pregnancy rate was significantly lower in the antagonist group (OR 0.84, 95% CI 0.72 to 0.97). The ongoing pregnancy/live-birth rate was likewise significantly lower in the antagonist group (OR 0.82, 95% CI 0.69 to 0.98; P = 0.03). However, there was a statistically significant reduction in the incidence of severe OHSS with the antagonist protocol (RR 0.61, 95% CI 0.42 to 0.89; P = 0.01). In addition, interventions to prevent OHSS (e.g. coasting, cycle cancellation) were administered more frequently in the agonist group (OR 0.44, 95% CI 0.21 to 0.93; P = 0.03). The GnRH antagonist protocol is a short and simple protocol with good clinical outcome and a significant reduction in the incidence of severe ovarian hyperstimulation syndrome and the amount of gonadotrophins required, but the lower pregnancy rate compared with the GnRH agonist long protocol necessitates counseling subfertile couples before recommending a change from GnRH agonist to antagonist.
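    For reference, the relative risks and confidence intervals quoted in reviews like this one come from 2 x 2 event tables; a minimal sketch using the standard log-scale normal approximation (the event counts below are illustrative, not data from the review):

```python
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """Relative risk and ~95% CI for event counts a/n1 (treatment arm) vs
    b/n2 (control arm), via the usual log-scale normal approximation."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# 10/100 events in one arm vs 20/100 in the other (illustrative counts)
rr, lo, hi = relative_risk(10, 100, 20, 100)
```

    An interval that excludes 1.0 corresponds to a statistically significant difference at the chosen z level, which is how results such as RR 0.61 (95% CI 0.42 to 0.89) are read.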

  12. A standards-based clinical information system for HIV/AIDS.

    PubMed

    Stitt, F W

    1995-01-01

    To create a clinical data repository to interface the Veterans Administration (VA) Decentralized Hospital Computer Program (DHCP) and a departmental clinical information system for the management of HIV patients. This system supports record-keeping, decision-making, reporting, and analysis. The database development was designed to overcome two impediments to successful implementations of clinical databases: (i) lack of a standard reference data model; and (ii) lack of a universal standard for medical concept representation. Health Level Seven (HL7) is a standard protocol that specifies the implementation of interfaces between two computer applications (sender and receiver) from different vendors or sources for electronic data exchange in the health care environment. This eliminates or substantially reduces the custom interface programming and program maintenance that would otherwise be required. HL7 defines the data to be exchanged, the timing of the interchange, and the communication of errors to the application. The formats are generic in nature and must be configured to meet the needs of the two applications involved. The standard conceptually operates at the seventh level of the ISO model for Open Systems Interconnection (OSI). HL7 itself simply defines the data elements that are exchanged as abstract messages, and does not prescribe the exact bit stream of the messages that flow over the network. Lower level network software developed according to the OSI model may be used to encode and decode the actual bit stream. The OSI protocols are not universally implemented and, therefore, a set of encoding rules for defining the exact representation of a message must be specified. The VA has created an HL7 module to assist DHCP applications in exchanging health care information with other applications using the HL7 protocol. 
The DHCP HL7 module consists of a set of utility routines and files that provide a generic interface to the HL7 protocol for all DHCP applications. The VA's DHCP core modules are in standard use at 169 hospitals, and the role of the VA system in health care delivery has been discussed elsewhere. This development was performed at the Miami VA Medical Center Special Immunology Unit, where a database was created for an HIV patient registry in 1987. Over 2,300 patients have been entered into a database that supports a problem-oriented summary of the patient's clinical record. The interface to the VA DHCP was designed and implemented to capture information from the patient treatment file, pharmacy, laboratory, radiology, and other modules. We obtained a suite of programs for implementing the HL7 encoding rules from Columbia-Presbyterian Medical Center in New York, written in ANSI C. This toolkit isolates our application programs from the details of the HL7 encoding rules, and allows them to deal with abstract messages at the programming level. While HL7 has become a standard for healthcare message exchange, SQL (Structured Query Language) is the standard for database definition, data manipulation, and query. The target database (Stitt F.W. The Problem-Oriented Medical Synopsis: a patient-centered clinical information system. Proc 17 SCAMC. 1993:88-93) provides clinical workstation functionality. Medical concepts are encoded using a preferred terminology derived from over 15 sources that include the Unified Medical Language System and SNOMed International (Stitt F.W. The Problem-Oriented Medical Synopsis: coding, indexing, and classification sub-model. Proc 18 SCAMC, 1994: in press). The databases were modeled using the Information Engineering CASE tools, and were written using relational database utilities, including embedded SQL in C (ESQL/C). We linked ESQL/C programs to the HL7 toolkit to allow data to be inserted, deleted, or updated, under transaction control. 
A graphical format will be used to display the entity-rel
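    As background to the interface described above, an HL7 v2.x message header (MSH) segment is a pipe-delimited string; a minimal sketch of building and parsing one follows (the application names and control ID are hypothetical, and real interfaces populate many more fields):

```python
FIELD_SEP, ENCODING_CHARS = "|", "^~\\&"

def build_msh(sending_app, receiving_app, msg_type, control_id):
    """Build a minimal HL7 v2.x MSH segment. Empty strings hold the places of
    fields not set here (facilities, timestamp, security); MSH-11 'P' marks a
    production message and MSH-12 carries the version."""
    return FIELD_SEP.join(["MSH", ENCODING_CHARS, sending_app, "", receiving_app,
                           "", "", "", msg_type, control_id, "P", "2.3"])

def parse_segment(segment):
    """Split a segment back into fields; fields[8] is MSH-9 (message type),
    since MSH-1 is the field separator itself and does not appear as a token."""
    return segment.split(FIELD_SEP)

msh = build_msh("DHCP", "SYNOPSIS", "ADT^A01", "00001")
fields = parse_segment(msh)
```

    An encoding-rules toolkit such as the one the authors obtained plays exactly this role, letting application code work with abstract message fields rather than the delimiter-level representation.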

  13. Privacy-preserving search for chemical compound databases.

    PubMed

    Shimizu, Kana; Nuida, Koji; Arai, Hiromi; Mitsunari, Shigeo; Attrapadung, Nuttapong; Hamada, Michiaki; Tsuda, Koji; Hirokawa, Takatsugu; Sakuma, Jun; Hanaoka, Goichiro; Asai, Kiyoshi

    2015-01-01

    Searching for similar compounds in a database is the most important process for in-silico drug screening. Since a query compound is an important starting point for the new drug, a query holder, who is afraid of the query being monitored by the database server, usually downloads all the records in the database and uses them in a closed network. However, a serious dilemma arises when the database holder also wants to output no information except for the search results, and such a dilemma prevents the use of many important data resources. In order to overcome this dilemma, we developed a novel cryptographic protocol that enables database searching while keeping both the query holder's privacy and database holder's privacy. Generally, the application of cryptographic techniques to practical problems is difficult because versatile techniques are computationally expensive while computationally inexpensive techniques can perform only trivial computation tasks. In this study, our protocol is successfully built only from an additive-homomorphic cryptosystem, which allows only addition performed on encrypted values but is computationally efficient compared with versatile techniques such as general purpose multi-party computation. In an experiment searching ChEMBL, which consists of more than 1,200,000 compounds, the proposed method was 36,900 times faster in CPU time and 12,000 times as efficient in communication size compared with general purpose multi-party computation. We proposed a novel privacy-preserving protocol for searching chemical compound databases. The proposed method, easily scaling for large-scale databases, may help to accelerate drug discovery research by making full use of unused but valuable data that includes sensitive information.

  14. Privacy-preserving search for chemical compound databases

    PubMed Central

    2015-01-01

    Background Searching for similar compounds in a database is the most important process for in-silico drug screening. Since a query compound is an important starting point for the new drug, a query holder, who is afraid of the query being monitored by the database server, usually downloads all the records in the database and uses them in a closed network. However, a serious dilemma arises when the database holder also wants to output no information except for the search results, and such a dilemma prevents the use of many important data resources. Results In order to overcome this dilemma, we developed a novel cryptographic protocol that enables database searching while keeping both the query holder's privacy and database holder's privacy. Generally, the application of cryptographic techniques to practical problems is difficult because versatile techniques are computationally expensive while computationally inexpensive techniques can perform only trivial computation tasks. In this study, our protocol is successfully built only from an additive-homomorphic cryptosystem, which allows only addition performed on encrypted values but is computationally efficient compared with versatile techniques such as general purpose multi-party computation. In an experiment searching ChEMBL, which consists of more than 1,200,000 compounds, the proposed method was 36,900 times faster in CPU time and 12,000 times as efficient in communication size compared with general purpose multi-party computation. Conclusion We proposed a novel privacy-preserving protocol for searching chemical compound databases. The proposed method, easily scaling for large-scale databases, may help to accelerate drug discovery research by making full use of unused but valuable data that includes sensitive information. PMID:26678650
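    The additive-homomorphic property the protocol above is built on can be illustrated with a toy Paillier cryptosystem, in which multiplying two ciphertexts yields an encryption of the sum of their plaintexts. This is a generic sketch, not the authors' implementation; the hard-coded primes are far too small for real use, where moduli of 2048 bits or more are required:

```python
import secrets
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def paillier_keygen():
    # Toy primes (the 10,000th and 100,000th primes); illustration only.
    p, q = 104729, 1299709
    n = p * q
    lam = lcm(p - 1, q - 1)
    g = n + 1                 # standard choice of generator; simplifies decryption
    mu = pow(lam, -1, n)      # modular inverse of lambda, valid because g = n + 1
    return (n, g), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    r = secrets.randbelow(n - 1) + 1          # fresh randomness per ciphertext
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    x = pow(c, lam, n * n)
    return (x - 1) // n * mu % n              # L(x) = (x - 1) / n, then scale by mu

pk, sk = paillier_keygen()
c1, c2 = encrypt(pk, 12), encrypt(pk, 30)
c_sum = (c1 * c2) % (pk[0] ** 2)              # homomorphic addition of plaintexts
```

    The server can thus combine encrypted values it cannot read, which is what lets similarity computations run over a query the database holder never sees in the clear.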

  15. GnRH antagonist versus long agonist protocols in IVF: a systematic review and meta-analysis accounting for patient type.

    PubMed

    Lambalk, C B; Banga, F R; Huirne, J A; Toftager, M; Pinborg, A; Homburg, R; van der Veen, F; van Wely, M

    2017-09-01

    Most reviews of IVF ovarian stimulation protocols have insufficiently accounted for various patient populations, such as ovulatory women, women with polycystic ovary syndrome (PCOS) or women with poor ovarian response, and have included studies in which the agonist or antagonist was not the only variable between the compared study arms. The aim of the current study was to compare GnRH antagonist protocols versus standard long agonist protocols in couples undergoing IVF or ICSI, while accounting for various patient populations and treatment schedules. The Cochrane Menstrual Disorders and Subfertility Review Group specialized register of controlled trials and PubMed and Embase databases were searched from inception until June 2016. Eligible trials were those that compared GnRH antagonist protocols and standard long GnRH agonist protocols in couples undergoing IVF or ICSI. The primary outcome was ongoing pregnancy rate. Secondary outcomes were: live birth rate, clinical pregnancy rate, number of oocytes retrieved and safety with regard to ovarian hyperstimulation syndrome (OHSS). Separate comparisons were performed for the general IVF population, women with PCOS and women with poor ovarian response. Pre-planned subgroup analyses were performed for various antagonist treatment schedules. We included 50 studies. Of these, 34 studies reported on general IVF patients, 10 studies reported on PCOS patients and 6 studies reported on poor responders. In general IVF patients, ongoing pregnancy rate was significantly lower in the antagonist group compared with the agonist group (RR 0.89, 95% CI 0.82-0.96). In women with PCOS and in women with poor ovarian response, there was no evidence of a difference in ongoing pregnancy between the antagonist and agonist groups (RR 0.97, 95% CI 0.84-1.11 and RR 0.87, 95% CI 0.65-1.17, respectively). 
Subgroup analyses for various antagonist treatment schedules compared with the long GnRH agonist protocol showed a significantly lower ongoing pregnancy rate when oral hormonal programming pill (OHP) pretreatment was combined with a flexible protocol (RR 0.74, 95% CI 0.59-0.91), whereas without OHP the RR was 0.84 (95% CI 0.71-1.00). Subgroup analysis for the fixed antagonist schedule demonstrated no evidence of a significant difference with or without OHP (RR 0.94, 95% CI 0.79-1.12 and RR 0.94, 95% CI 0.83-1.05, respectively). Antagonists resulted in significantly lower OHSS rates both in general IVF patients and in women with PCOS (RR 0.63, 95% CI 0.50-0.81 and RR 0.53, 95% CI 0.30-0.95, respectively). No data on OHSS were available from trials in poor responders. In a general IVF population, GnRH antagonists are associated with lower ongoing pregnancy rates than long protocol agonists, but also with lower OHSS rates. Within this population, antagonist treatment prevents one case of OHSS in 40 patients but results in one less ongoing pregnancy out of every 28 women treated. Thus standard use of the long GnRH agonist treatment is perhaps still the approach of choice for prevention of premature luteinization. In couples with PCOS and poor responders, GnRH antagonists do not seem to compromise ongoing pregnancy rates and are associated with less OHSS and therefore could be considered as standard treatment. © The Author 2017. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com
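The trade-off quoted in this abstract (one OHSS case prevented per 40 patients treated, one ongoing pregnancy lost per 28) follows from the risk ratios once a baseline event rate is fixed. A minimal sketch, assuming illustrative control-group rates of roughly 6.8% for OHSS and 32.5% for ongoing pregnancy (these baselines are not given in the abstract):

```python
# Number needed to treat/harm from a risk ratio and an assumed control event rate.
# The control rates below are illustrative assumptions, not values from the abstract.

def nnt_from_rr(control_rate: float, rr: float) -> int:
    """NNT = 1 / |absolute risk difference|, rounded to the nearest patient."""
    risk_difference = control_rate * abs(1.0 - rr)
    return round(1.0 / risk_difference)

# OHSS in the general IVF population: RR 0.63 (antagonist vs. agonist)
nnt_ohss = nnt_from_rr(control_rate=0.068, rr=0.63)       # -> 40
# Ongoing pregnancy: RR 0.89 (antagonist vs. agonist)
nnh_pregnancy = nnt_from_rr(control_rate=0.325, rr=0.89)  # -> 28

print(nnt_ohss, nnh_pregnancy)
```

With those assumed baselines the computation reproduces the review's quoted 1-in-40 and 1-in-28 figures; different baselines would shift both numbers.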

  16. [Automated anesthesia record systems].

    PubMed

    Heinrichs, W; Mönk, S; Eberle, B

    1997-07-01

The introduction of electronic anaesthesia documentation systems was attempted as early as 1979, although their efficient application has become reality only in the past few years. The advantages of the electronic protocol are apparent: continuous high-quality documentation, comparability of data due to the availability of a database, reduction in the workload of the anaesthetist and availability of additional data. Disadvantages of the electronic protocol have also been discussed in the literature. By going through the process of entering data on the course of the anaesthetic procedure on the protocol sheet, the information is mentally absorbed and evaluated by the anaesthetist. This information may, however, be lost when the data are recorded fully automatically, without active involvement on the part of the anaesthetist. Recent publications state that by using intelligent alarms and/or integrated displays, manual record keeping is no longer necessary for anaesthesia vigilance. The technical design of automated anaesthesia records depends on an integration of network technology into the hospital. It will be appropriate to connect the systems to the Internet, but safety requirements have to be followed strictly. Concerning the database, client-server architecture as well as language standards such as SQL should be used. Object-oriented databases will be available in the near future. Another future goal of automated anaesthesia record systems will be the use of knowledge-based technologies within these systems. Drug interactions, disease-related anaesthetic techniques and other information sources can be integrated. At this time, almost none of the commercially available systems has matured to a point where their purchase can be recommended without reservation. There is still a lack of standards for the subsequent exchange of data, and a solution to a number of ergonomic problems still remains to be found. Nevertheless, electronic anaesthesia protocols will be required in the near future. The advantages of accurate documentation and quality control, given careful planning, outweigh cost considerations by far.

  17. Do we need 3D tube current modulation information for accurate organ dosimetry in chest CT? Protocols dose comparisons.

    PubMed

    Lopez-Rendon, Xochitl; Zhang, Guozhi; Coudyzer, Walter; Develter, Wim; Bosmans, Hilde; Zanca, Federica

    2017-11-01

To compare the lung and breast dose associated with three chest protocols: standard, organ-based tube current modulation (OBTCM) and fast-speed scanning; and to estimate the error associated with organ dose when modelling the longitudinal (z-) TCM versus the 3D-TCM in Monte Carlo (MC) simulations for these three protocols. Five adult and three paediatric cadavers with different BMI were scanned. The CTDIvol of the OBTCM and fast-speed protocols was matched to the patient-specific CTDIvol of the standard protocol. Lung and breast doses were estimated using MC simulations with both the z- and 3D-TCM and compared between protocols. The fast-speed scanning protocol delivered the highest doses. A slight reduction in breast dose (up to 5.1%) was observed for two of the three female cadavers with the OBTCM in comparison to the standard protocol. For both adult and paediatric cadavers, using the z-TCM data only for organ dose estimation resulted in accuracy within 10.0% for the standard and fast-speed protocols, while relative dose differences were up to 15.3% for the OBTCM protocol. At identical CTDIvol values, the standard protocol delivered the lowest overall doses. Only for the OBTCM protocol is the 3D-TCM needed if accurate (<10.0%) organ dosimetry is desired. • The z-TCM information is sufficient for accurate dosimetry for standard protocols. • The z-TCM information is sufficient for accurate dosimetry for fast-speed scanning protocols. • For organ-based TCM schemes, the 3D-TCM information is necessary for accurate dosimetry. • At identical CTDIvol, the fast-speed scanning protocol delivered the highest doses. • Lung dose was higher with XCare than with the standard protocol at identical CTDIvol.
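The 10.0% accuracy criterion in this abstract amounts to checking the relative deviation of the z-TCM organ dose estimate from the full 3D-TCM reference. A minimal sketch, with hypothetical dose values (the abstract reports only the resulting percentages):

```python
# Relative difference between an organ dose simulated with z-only TCM and the
# full 3D-TCM reference. The dose values below are hypothetical.

def relative_difference_pct(dose_z_tcm: float, dose_3d_tcm: float) -> float:
    """Percent deviation of the z-TCM estimate from the 3D-TCM reference."""
    return abs(dose_z_tcm - dose_3d_tcm) / dose_3d_tcm * 100.0

# A z-TCM estimate within 10% of the 3D-TCM reference is treated as acceptable here.
ACCEPTABLE_PCT = 10.0

dose_3d = 12.0  # mGy, hypothetical 3D-TCM lung dose
dose_z = 13.0   # mGy, hypothetical z-TCM estimate
deviation = relative_difference_pct(dose_z, dose_3d)
print(f"{deviation:.1f}% -> {'ok' if deviation <= ACCEPTABLE_PCT else 'needs 3D-TCM'}")
```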

  18. Red light photodynamic therapy for actinic keratosis using 37 J/cm2: Fractionated irradiation with 12.3 mW/cm2 after 30 minutes incubation time compared to standard continuous irradiation with 75 mW/cm2 after 3 hours incubation time using a mathematical modeling.

    PubMed

    Vignion-Dewalle, Anne-Sophie; Baert, Gregory; Devos, Laura; Thecua, Elise; Vicentini, Claire; Mortier, Laurent; Mordon, Serge

    2017-09-01

Photodynamic therapy (PDT) is an emerging treatment modality for various diseases, especially for dermatological conditions. Although the standard PDT protocol for the treatment of actinic keratoses in Europe has been shown to be effective, treatment-associated pain is often observed in patients. Different modifications to this protocol, intended to decrease pain, have been investigated. A decrease in fluence rate seems to be a promising solution. Moreover, it has been suggested that light fractionation significantly increases the efficacy of PDT. Based on a flexible light-emitting textile, the FLEXITHERALIGHT device specifically provides fractionated illumination at a fluence rate more than six times lower than that of the standard protocol. In a recently completed clinical trial of PDT for the treatment of actinic keratosis, the non-inferiority of a protocol involving illumination with the FLEXITHERALIGHT device after a short incubation time, referred to as the FLEXITHERALIGHT protocol, was assessed compared to the standard protocol. In this paper, we propose a comparison of the two above-mentioned 635 nm red light protocols with 37 J/cm2 in the PDT treatment of actinic keratosis: the standard protocol and the FLEXITHERALIGHT one, through mathematical modeling. This mathematical modeling, which slightly differs from the one we have already published, enables the local damage induced by the therapy to be estimated. The comparison performed in terms of the local damage induced by the therapy demonstrates that the FLEXITHERALIGHT protocol, with lower fluence rate, light fractionation and shorter incubation time, is somewhat less efficient than the standard protocol. Nevertheless, from the clinical trial results, the FLEXITHERALIGHT protocol results in non-inferior response rates compared to the standard protocol. This finding raises the question of whether the PDT local damage achieved by the FLEXITHERALIGHT protocol (respectively, the standard protocol) is sufficient (respectively, excessive) to destroy actinic keratosis cells. Lasers Surg. Med. 49:686-697, 2017. © 2017 Wiley Periodicals, Inc.
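Both protocols deliver the same fluence of 37 J/cm2; only the fluence rate and timing differ. A quick check of the light-on times implied by the quoted fluence rates (ignoring the dark pauses introduced by fractionation and any device-specific details):

```python
# Illumination time needed to reach a target fluence at a given fluence rate.
# fluence (J/cm^2) = fluence rate (W/cm^2) x time (s)

def illumination_time_s(fluence_j_per_cm2: float, rate_mw_per_cm2: float) -> float:
    return fluence_j_per_cm2 / (rate_mw_per_cm2 / 1000.0)

standard = illumination_time_s(37.0, 75.0)         # ~493 s, roughly 8 minutes
flexitheralight = illumination_time_s(37.0, 12.3)  # ~3008 s, roughly 50 minutes of light on

print(f"standard: {standard:.0f} s, FLEXITHERALIGHT: {flexitheralight:.0f} s")
```

The roughly sixfold longer illumination at the lower rate is consistent with the "more than six times lower" fluence rate stated in the abstract.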

  19. Building Capacity for a Long-Term, in-Situ, National-Scale Phenology Monitoring Network: Successes, Challenges and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Browning, D. M.

    2014-12-01

    The USA National Phenology Network (USA-NPN; www.usanpn.org) is a national-scale science and monitoring initiative focused on phenology - the study of seasonal life-cycle events such as leafing, flowering, reproduction, and migration - as a tool to understand the response of biodiversity to environmental variation and change. USA-NPN provides a hierarchical, national monitoring framework that enables other organizations to leverage the capacity of the Network for their own applications - minimizing investment and duplication of effort - while promoting interoperability. Network participants can leverage: (1) Standardized monitoring protocols that have been broadly vetted, tested and published; (2) A centralized National Phenology Database (NPDb) for maintaining, archiving and replicating data, with standard metadata, terms-of-use, web-services, and documentation of QA/QC, plus tools for discovery, visualization and download of raw data and derived data products; and/or (3) A national in-situ, multi-taxa phenological monitoring system, Nature's Notebook, which enables participants to observe and record phenology of plants and animals - based on the protocols and information management system (IMS) described above - via either web or mobile applications. The protocols, NPDb and IMS, and Nature's Notebook represent a hierarchy of opportunities for involvement by a broad range of interested stakeholders, from individuals to agencies. For example, some organizations have adopted (e.g., the National Ecological Observatory Network or NEON) -- or are considering adopting (e.g., the Long-Term Agroecosystems Network or LTAR) -- the USA-NPN standardized protocols, but will develop their own database and IMS with web services to promote sharing of data with the NPDb. 
Other organizations (e.g., the Inventory and Monitoring Programs of the National Wildlife Refuge System and the National Park Service) have elected to use Nature's Notebook to support their phenological monitoring programs. We highlight the challenges and benefits of integrating phenology monitoring within existing and emerging national monitoring networks, and showcase opportunities that exist when standardized protocols are adopted and implemented to promote data interoperability and sharing.

  20. Additional considerations are required when preparing a protocol for a systematic review with multiple interventions.

    PubMed

    Chaimani, Anna; Caldwell, Deborah M; Li, Tianjing; Higgins, Julian P T; Salanti, Georgia

    2017-03-01

    The number of systematic reviews that aim to compare multiple interventions using network meta-analysis is increasing. In this study, we highlight aspects of a standard systematic review protocol that may need modification when multiple interventions are to be compared. We take the protocol format suggested by Cochrane for a standard systematic review as our reference and compare the considerations for a pairwise review with those required for a valid comparison of multiple interventions. We suggest new sections for protocols of systematic reviews including network meta-analyses with a focus on how to evaluate their assumptions. We provide example text from published protocols to exemplify the considerations. Standard systematic review protocols for pairwise meta-analyses need extensions to accommodate the increased complexity of network meta-analysis. Our suggested modifications are widely applicable to both Cochrane and non-Cochrane systematic reviews involving network meta-analyses. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Effect of a Standardized Protocol of Antibiotic Therapy on Surgical Site Infection after Laparoscopic Surgery for Complicated Appendicitis.

    PubMed

    Park, Hyoung-Chul; Kim, Min Jeong; Lee, Bong Hwa

Although it is accepted that complicated appendicitis requires antibiotic therapy to prevent post-operative surgical infections, consensus protocols on the duration and regimens of treatment are not well established. This study aimed to compare the outcome of post-operative infectious complications in patients receiving the old non-standardized and the new standardized antibiotic protocols, involving 10 or 5 days of treatment, respectively. We enrolled 1,343 patients who underwent laparoscopic surgery for complicated appendicitis between January 2009 and December 2014. At the beginning of the new protocol, the patients were divided into two groups: 10 days of various antibiotic regimens (between January 2009 and June 2012, called the non-standardized protocol; n = 730) and 5 days of a cefuroxime and metronidazole regimen (between July 2012 and December 2014, called the standardized protocol; n = 613). We compared the clinical outcomes, including surgical site infection (SSI) (superficial and deep organ/space infections), in the two groups. The standardized protocol group had a slightly shorter operative time (67 vs. 69 min), a shorter hospital stay (5 vs. 5.4 d), and lower medical cost (US$1,564 vs. US$1,654). Otherwise, there was no difference between the groups. No differences were found between the non-standardized and standardized protocol groups with regard to the rate of superficial infection (10.3% vs. 12.7%; p = 0.488) or deep organ/space infection (2.3% vs. 2.1%; p = 0.797). In patients undergoing laparoscopic surgery for complicated appendicitis, 5 days of cefuroxime and metronidazole did not lead to more SSIs, and it decreased the medical costs compared with non-standardized antibiotic regimens.

  2. Natural Language Processing-Enabled and Conventional Data Capture Methods for Input to Electronic Health Records: A Comparative Usability Study.

    PubMed

    Kaufman, David R; Sheehan, Barbara; Stetson, Peter; Bhatt, Ashish R; Field, Adele I; Patel, Chirag; Maisel, James Mark

    2016-10-28

The process of documentation in electronic health records (EHRs) is known to be time consuming, inefficient, and cumbersome. The use of dictation coupled with manual transcription has become an increasingly common practice. In recent years, natural language processing (NLP)-enabled data capture has become a viable alternative for data entry. It enables the clinician to maintain control of the process and potentially reduce the documentation burden. The question remains how this NLP-enabled workflow will impact EHR usability and whether it can meet the structured data and other EHR requirements while enhancing the user's experience. The objective of this study is to evaluate the comparative effectiveness of an NLP-enabled data capture method using dictation and data extraction from transcribed documents (NLP Entry) in terms of documentation time, documentation quality, and usability versus standard EHR keyboard-and-mouse data entry. This formative study investigated the results of using 4 combinations of NLP Entry and Standard Entry methods ("protocols") of EHR data capture. We compared a novel dictation-based protocol using MediSapien NLP (NLP-NLP) for structured data capture against a standard structured data capture protocol (Standard-Standard) as well as 2 novel hybrid protocols (NLP-Standard and Standard-NLP). The 31 participants included neurologists, cardiologists, and nephrologists. Participants generated 4 consultation or admission notes using 4 documentation protocols. We recorded the time on task, documentation quality (using the Physician Documentation Quality Instrument, PDQI-9), and usability of the documentation processes. A total of 118 notes were documented across the 3 subject areas. 
The NLP-NLP protocol required a median of 5.2 minutes per cardiology note, 7.3 minutes per nephrology note, and 8.5 minutes per neurology note compared with 16.9, 20.7, and 21.2 minutes, respectively, using the Standard-Standard protocol and 13.8, 21.3, and 18.7 minutes using the Standard-NLP protocol (1 of 2 hybrid methods). Using 8 out of 9 characteristics measured by the PDQI-9 instrument, the NLP-NLP protocol received a median quality score sum of 24.5; the Standard-Standard protocol received a median sum of 29; and the Standard-NLP protocol received a median sum of 29.5. The mean total score of the usability measure was 36.7 when the participants used the NLP-NLP protocol compared with 30.3 when they used the Standard-Standard protocol. In this study, the feasibility of an approach to EHR data capture involving the application of NLP to transcribed dictation was demonstrated. This novel dictation-based approach has the potential to reduce the time required for documentation and improve usability while maintaining documentation quality. Future research will evaluate the NLP-based EHR data capture approach in a clinical setting. It is reasonable to assert that EHRs will increasingly use NLP-enabled data entry tools such as MediSapien NLP because they hold promise for enhancing the documentation process and end-user experience. ©David R. Kaufman, Barbara Sheehan, Peter Stetson, Ashish R. Bhatt, Adele I. Field, Chirag Patel, James Mark Maisel. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 28.10.2016.

  3. National protocol framework for the inventory and monitoring of bees

    USGS Publications Warehouse

    Droege, Sam; Engler, Joseph D.; Sellers, Elizabeth A.; Lee O'Brien,

    2016-01-01

    This national protocol framework is a standardized tool for the inventory and monitoring of the approximately 4,200 species of native and non-native bee species that may be found within the National Wildlife Refuge System (NWRS) administered by the U.S. Fish and Wildlife Service (USFWS). However, this protocol framework may also be used by other organizations and individuals to monitor bees in any given habitat or location. Our goal is to provide USFWS stations within the NWRS (NWRS stations are land units managed by the USFWS such as national wildlife refuges, national fish hatcheries, wetland management districts, conservation areas, leased lands, etc.) with techniques for developing an initial baseline inventory of what bee species are present on their lands and to provide an inexpensive, simple technique for monitoring bees continuously and for monitoring and evaluating long-term population trends and management impacts. The latter long-term monitoring technique requires a minimal time burden for the individual station, yet can provide a good statistical sample of changing populations that can be investigated at the station, regional, and national levels within the USFWS’ jurisdiction, and compared to other sites within the United States and Canada. This protocol framework was developed in cooperation with the United States Geological Survey (USGS), the USFWS, and a worldwide network of bee researchers who have investigated the techniques and methods for capturing bees and tracking population changes. The protocol framework evolved from field and lab-based investigations at the USGS Bee Inventory and Monitoring Laboratory at the Patuxent Wildlife Research Center in Beltsville, Maryland starting in 2002 and was refined by a large number of USFWS, academic, and state groups. It includes a Protocol Introduction and a set of 8 Standard Operating Procedures or SOPs and adheres to national standards of protocol content and organization. 
The Protocol Narrative describes the history and need for the protocol framework and summarizes the basic elements of objectives, sampling design, field methods, training, data management, analysis, and reporting. The SOPs provide more detail and specific instructions for implementing the protocol framework. A central database for managing all the resulting data is under development. We welcome use of this protocol framework by our partners, as appropriate for their bee inventory and monitoring objectives.

  4. Computing and Communications Infrastructure for Network-Centric Warfare: Exploiting COTS, Assuring Performance

    DTIC Science & Technology

    2004-06-01

remote databases, has seen little vendor acceptance. Each database (Oracle, DB2, MySQL, etc.) has its own client-server protocol. Therefore each...existing standards – SQL, X.500/LDAP, FTP, etc. • View information dissemination as selective replication – State-oriented vs. message-oriented...allowing the application to start. The resource management system would serve as a broker to the resources, making sure that resources are not

  5. Impact of database quality in knowledge-based treatment planning for prostate cancer.

    PubMed

    Wall, Phillip D H; Carver, Robert L; Fontenot, Jonas D

    2018-03-13

This article investigates dose-volume prediction improvements in a common knowledge-based planning (KBP) method using a Pareto plan database compared with using a conventional, clinical plan database. Two plan databases were created using retrospective, anonymized data of 124 volumetric modulated arc therapy (VMAT) prostate cancer patients. The clinical plan database (CPD) contained planning data from each patient's clinically treated VMAT plan, which were manually optimized by various planners. The multicriteria optimization database (MCOD) contained Pareto-optimal plan data from VMAT plans created using a standardized multicriteria optimization protocol. Overlap volume histograms, incorporating fractional organ-at-risk volumes only within the treatment fields, were computed for each patient and used to match new patient anatomy to similar database patients. For each database patient, CPD and MCOD KBP predictions were generated for D10, D30, D50, D65, and D80 of the bladder and rectum in a leave-one-out manner. Prediction achievability was evaluated through a replanning study on a subset of 31 randomly selected database patients using the best KBP predictions, regardless of plan database origin, as planning goals. MCOD predictions were significantly lower than CPD predictions for all 5 bladder dose-volumes and rectum D50 (P = .004) and D65 (P < .001), whereas CPD predictions for rectum D10 (P = .005) and D30 (P < .001) were significantly less than MCOD predictions. KBP predictions were statistically achievable in the replans for all predicted dose-volumes, excluding D10 of the bladder (P = .03) and rectum (P = .04). Compared with clinical plans, replans showed significant average reductions in Dmean for the bladder (7.8 Gy; P < .001) and rectum (9.4 Gy; P < .001), while maintaining statistically similar planning target volume, femoral head, and penile bulb dose. 
KBP dose-volume predictions derived from Pareto plans were more optimal overall than those resulting from manually optimized clinical plans, which significantly improved KBP-assisted plan quality. This work investigates how the plan quality of knowledge databases affects the performance and achievability of dose-volume predictions from a common knowledge-based planning approach for prostate cancer. Bladder and rectum dose-volume predictions derived from a database of standardized Pareto-optimal plans were compared with those derived from clinical plans manually designed by various planners. Dose-volume predictions from the Pareto plan database were significantly lower overall than those from the clinical plan database, without compromising achievability. Copyright © 2018 Elsevier Inc. All rights reserved.
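The matching step described in this abstract (find the database patient whose in-field overlap volume histogram most resembles the new patient's anatomy, then read off the dose-volume metrics that plan achieved) can be sketched as follows; the OVH curves and D50 values are hypothetical stand-ins, not data from the study:

```python
# Minimal sketch of OVH-based knowledge-based planning prediction.
# Each database patient is summarized by an overlap volume histogram (fractional
# OAR volume at fixed distances from the target) plus the dose-volume metrics the
# clinical or Pareto plan actually achieved. All numbers are hypothetical.

def ovh_distance(ovh_a: list[float], ovh_b: list[float]) -> float:
    """Sum of squared differences between two OVH curves sampled at the same distances."""
    return sum((a - b) ** 2 for a, b in zip(ovh_a, ovh_b))

database = [
    {"ovh": [1.0, 0.8, 0.5, 0.2], "D50": 38.0},  # bladder D50 in Gy, hypothetical
    {"ovh": [1.0, 0.6, 0.3, 0.1], "D50": 31.0},
    {"ovh": [1.0, 0.9, 0.7, 0.4], "D50": 45.0},
]

new_patient_ovh = [1.0, 0.62, 0.32, 0.12]

# Predict the new patient's achievable D50 from the most anatomically similar plan.
best_match = min(database, key=lambda p: ovh_distance(p["ovh"], new_patient_ovh))
print(f"predicted bladder D50: {best_match['D50']} Gy")
```

The study's point is then that the same matching machinery yields lower (more optimal) predictions when the `D50`-style entries come from Pareto-optimal plans rather than manually optimized clinical ones.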

  6. Natural Language Processing–Enabled and Conventional Data Capture Methods for Input to Electronic Health Records: A Comparative Usability Study

    PubMed Central

    Sheehan, Barbara; Stetson, Peter; Bhatt, Ashish R; Field, Adele I; Patel, Chirag; Maisel, James Mark

    2016-01-01

Background The process of documentation in electronic health records (EHRs) is known to be time consuming, inefficient, and cumbersome. The use of dictation coupled with manual transcription has become an increasingly common practice. In recent years, natural language processing (NLP)–enabled data capture has become a viable alternative for data entry. It enables the clinician to maintain control of the process and potentially reduce the documentation burden. The question remains how this NLP-enabled workflow will impact EHR usability and whether it can meet the structured data and other EHR requirements while enhancing the user’s experience. Objective The objective of this study is to evaluate the comparative effectiveness of an NLP-enabled data capture method using dictation and data extraction from transcribed documents (NLP Entry) in terms of documentation time, documentation quality, and usability versus standard EHR keyboard-and-mouse data entry. Methods This formative study investigated the results of using 4 combinations of NLP Entry and Standard Entry methods (“protocols”) of EHR data capture. We compared a novel dictation-based protocol using MediSapien NLP (NLP-NLP) for structured data capture against a standard structured data capture protocol (Standard-Standard) as well as 2 novel hybrid protocols (NLP-Standard and Standard-NLP). The 31 participants included neurologists, cardiologists, and nephrologists. Participants generated 4 consultation or admission notes using 4 documentation protocols. We recorded the time on task, documentation quality (using the Physician Documentation Quality Instrument, PDQI-9), and usability of the documentation processes. Results A total of 118 notes were documented across the 3 subject areas. 
The NLP-NLP protocol required a median of 5.2 minutes per cardiology note, 7.3 minutes per nephrology note, and 8.5 minutes per neurology note compared with 16.9, 20.7, and 21.2 minutes, respectively, using the Standard-Standard protocol and 13.8, 21.3, and 18.7 minutes using the Standard-NLP protocol (1 of 2 hybrid methods). Using 8 out of 9 characteristics measured by the PDQI-9 instrument, the NLP-NLP protocol received a median quality score sum of 24.5; the Standard-Standard protocol received a median sum of 29; and the Standard-NLP protocol received a median sum of 29.5. The mean total score of the usability measure was 36.7 when the participants used the NLP-NLP protocol compared with 30.3 when they used the Standard-Standard protocol. Conclusions In this study, the feasibility of an approach to EHR data capture involving the application of NLP to transcribed dictation was demonstrated. This novel dictation-based approach has the potential to reduce the time required for documentation and improve usability while maintaining documentation quality. Future research will evaluate the NLP-based EHR data capture approach in a clinical setting. It is reasonable to assert that EHRs will increasingly use NLP-enabled data entry tools such as MediSapien NLP because they hold promise for enhancing the documentation process and end-user experience. PMID:27793791

  7. Prototype development and implementation of picture archiving and communications systems based on ISO-OSI standard

    NASA Astrophysics Data System (ADS)

    Martinez, Ralph; Nam, Jiseung

    1992-07-01

Picture Archiving and Communication Systems (PACS) is an integration of digital image formation in a hospital, which encompasses various imaging equipment, image viewing workstations, image databases, and a high-speed network. The integration requires a standardization of communication protocols to connect devices from different vendors. The American College of Radiology and National Electrical Manufacturers Association (ACR-NEMA) standard Version 2.0 provides a point-to-point hardware interface, a set of software commands, and a consistent set of data formats for PACS. However, it is inadequate for PACS networking environments because of its point-to-point nature and its inflexibility to allow other services and protocols in the future. Based on previous experience of PACS developments at The University of Arizona, a new communication protocol for PACS networks and an approach were proposed to ACR-NEMA Working Group VI. The defined PACS protocol is intended to facilitate the development of PACSs capable of interfacing with other hospital information systems. It is also intended to allow the creation of diagnostic information databases which can be interrogated by a variety of distributed devices. A particularly important goal is to support communications in a multivendor environment. The new protocol specifications are defined primarily as a combination of the International Organization for Standardization/Open Systems Interconnection (ISO/OSI) and TCP/IP protocols and the data format portion of the ACR-NEMA standard. This paper addresses the specification and implementation of the ISO-based protocol in a PACS prototype. The protocol specification, which covers the Presentation, Session, Transport, and Network layers, is summarized briefly. The protocol implementation is discussed based on our implementation efforts in the UNIX operating system environment. 
At the same time, results of a performance comparison between the ISO and TCP/IP implementations are presented to demonstrate the implementation of the defined protocol. The performance analysis was done by prototyping PACS on the available platforms: MicroVAX II, DECstation and SUN workstation.

  8. Genevar: a database and Java application for the analysis and visualization of SNP-gene associations in eQTL studies.

    PubMed

    Yang, Tsun-Po; Beazley, Claude; Montgomery, Stephen B; Dimas, Antigone S; Gutierrez-Arcelus, Maria; Stranger, Barbara E; Deloukas, Panos; Dermitzakis, Emmanouil T

    2010-10-01

    Genevar (GENe Expression VARiation) is a database and Java tool designed to integrate multiple datasets, and provides analysis and visualization of associations between sequence variation and gene expression. Genevar allows researchers to investigate expression quantitative trait loci (eQTL) associations within a gene locus of interest in real time. The database and application can be installed on a standard computer in database mode and, in addition, on a server to share discoveries among affiliations or the broader community over the Internet via web services protocols. http://www.sanger.ac.uk/resources/software/genevar.

  9. REFOLDdb: a new and sustainable gateway to experimental protocols for protein refolding.

    PubMed

    Mizutani, Hisashi; Sugawara, Hideaki; Buckle, Ashley M; Sangawa, Takeshi; Miyazono, Ken-Ichi; Ohtsuka, Jun; Nagata, Koji; Shojima, Tomoki; Nosaki, Shohei; Xu, Yuqun; Wang, Delong; Hu, Xiao; Tanokura, Masaru; Yura, Kei

    2017-04-24

More than 7000 papers related to "protein refolding" have been published to date, with approximately 300 reports each year during the last decade. Whilst some of these papers provide experimental protocols for protein refolding, a survey in the structural life science communities showed a necessity for a comprehensive database of refolding techniques. We therefore have developed a new resource, "REFOLDdb", that collects refolding techniques into a single, searchable repository to help researchers develop refolding protocols for proteins of interest. We based our resource on the existing REFOLD database, which has not been updated since 2009. We redesigned the data format to be more concise, allowing more consistent representation among data entries than in the original REFOLD database. The remodeled data architecture enhances search efficiency and improves the sustainability of the database. After an exhaustive literature search, we added experimental refolding protocols from reports published from 2009 to early 2017. In addition to this new data, we fully converted and integrated existing REFOLD data into our new resource. REFOLDdb contains 1877 entries as of March 17, 2017, and is freely available at http://p4d-info.nig.ac.jp/refolddb/. REFOLDdb is a unique database for the life sciences research community, providing annotated information for designing new refolding protocols and customizing existing methodologies. We envisage that this resource will find wide utility across broad disciplines that rely on the production of pure, active, recombinant proteins. Furthermore, the database also provides a useful overview of recent trends and statistics in refolding technology development.

  10. Bedside Diagnosis of Dysphagia: A Systematic Review

    PubMed Central

    O’Horo, John C.; Rogus-Pulia, Nicole; Garcia-Arguello, Lisbeth; Robbins, JoAnne; Safdar, Nasia

    2015-01-01

    Background Dysphagia is associated with aspiration, pneumonia, and malnutrition, but remains challenging to identify at the bedside. A variety of exam protocols and maneuvers are commonly used, but the efficacy of these maneuvers is highly variable. Methods We conducted a comprehensive search of seven databases, including MEDLINE, EMBASE, and Scopus, from each database’s earliest inception through June 5th, 2013. Studies reporting the diagnostic performance of a bedside examination maneuver compared to a reference gold standard (videofluoroscopic swallow study [VFSS] or flexible endoscopic evaluation of swallowing with sensory testing [FEEST]) were included for analysis. From each study, data were abstracted on the type of diagnostic method and reference standard, study population and inclusion/exclusion characteristics, design, and prediction of aspiration. Results The search strategy identified 38 articles meeting inclusion criteria. Overall, most bedside examinations lacked sufficient sensitivity to be used for screening purposes across all patient populations examined. Individual studies found dysphonia assessments, abnormal pharyngeal sensation assessments, dual-axis accelerometry, and one description of water swallow testing to be sensitive tools, but none were reported as consistently sensitive. A preponderance of the identified studies was in post-stroke adults, limiting the generalizability of results. Conclusions No bedside screening protocol has been shown to provide adequate predictive value for the presence of aspiration. Several individual exam maneuvers demonstrated reasonable sensitivity, but the reproducibility and consistency of these protocols were not established. More research is needed to design an optimal protocol for dysphagia detection. PMID:25581840

  11. Deployment of Directory Service for IEEE N Bus Test System Information

    NASA Astrophysics Data System (ADS)

    Barman, Amal; Sil, Jaya

    2008-10-01

    Exchanging information over the Internet and intranets has become a de facto standard in computer applications, among various users and organizations. Distributed system studies, e-governance, and similar activities require transparent information exchange between applications, constituencies, manufacturers, and vendors. To serve these purposes, a database system is needed for storing system data and other relevant information. A directory service, which is a specialized database with an associated access protocol, could be a single solution, since it runs over TCP/IP, is supported by all POSIX-compliant platforms, and is based on open standards. This paper describes a way to deploy a directory service to store IEEE n-bus test system data and to integrate a load-flow program with it.
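As a rough illustration of how bus records might be expressed as directory entries, the sketch below serializes one bus into LDIF, the standard textual format for LDAP data. The directory layout (`ou=buses,dc=example,dc=org`), the attribute choices, and the use of `extensibleObject` are assumptions for illustration, not details from the paper:

```python
def bus_to_ldif(bus_id, voltage_pu, angle_deg, base="ou=buses,dc=example,dc=org"):
    """Render one bus record as an LDIF entry (schema is hypothetical)."""
    lines = [
        f"dn: cn=bus{bus_id},{base}",
        "objectClass: top",
        "objectClass: extensibleObject",  # placeholder; a real deployment would define a proper schema
        f"cn: bus{bus_id}",
        f"description: voltage={voltage_pu} p.u., angle={angle_deg} deg",
    ]
    return "\n".join(lines) + "\n"

# one hypothetical bus from an IEEE test case
entry = bus_to_ldif(1, 1.06, 0.0)
```

Such entries could then be loaded into any LDAP server with `ldapadd` and queried by a load-flow program over the standard protocol.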

  12. Privacy-preserving record linkage using Bloom filters

    PubMed Central

    2009-01-01

    Background Combining multiple databases with disjunctive or additional information on the same person is occurring increasingly throughout research. If unique identification numbers for these individuals are not available, probabilistic record linkage is used for the identification of matching record pairs. In many applications, identifiers have to be encrypted due to privacy concerns. Methods A new protocol for privacy-preserving record linkage with encrypted identifiers allowing for errors in identifiers has been developed. The protocol is based on Bloom filters on q-grams of identifiers. Results Tests on simulated and actual databases yield linkage results comparable to non-encrypted identifiers and superior to results from phonetic encodings. Conclusion We proposed a protocol for privacy-preserving record linkage with encrypted identifiers allowing for errors in identifiers. Since the protocol can be easily enhanced and has a low computational burden, the protocol might be useful for many applications requiring privacy-preserving record linkage. PMID:19706187

  13. Privacy-preserving record linkage using Bloom filters.

    PubMed

    Schnell, Rainer; Bachteler, Tobias; Reiher, Jörg

    2009-08-25

    Combining multiple databases with disjunctive or additional information on the same person is occurring increasingly throughout research. If unique identification numbers for these individuals are not available, probabilistic record linkage is used for the identification of matching record pairs. In many applications, identifiers have to be encrypted due to privacy concerns. A new protocol for privacy-preserving record linkage with encrypted identifiers allowing for errors in identifiers has been developed. The protocol is based on Bloom filters on q-grams of identifiers. Tests on simulated and actual databases yield linkage results comparable to non-encrypted identifiers and superior to results from phonetic encodings. We proposed a protocol for privacy-preserving record linkage with encrypted identifiers allowing for errors in identifiers. Since the protocol can be easily enhanced and has a low computational burden, the protocol might be useful for many applications requiring privacy-preserving record linkage.
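The Bloom-filter encoding described in this record can be sketched in a few lines: each identifier is split into q-grams, each q-gram is hashed several times with a keyed hash, and the resulting bit vectors are compared with a set-similarity coefficient such as the Dice coefficient, so that small spelling errors still yield high similarity. The filter length, number of hash functions, shared key, and sample names below are illustrative choices, not values from the paper:

```python
import hashlib

def qgrams(s, q=2):
    """Split a string into padded q-grams (bigrams by default)."""
    s = f" {s.lower()} "
    return {s[i:i + q] for i in range(len(s) - q + 1)}

def bloom(name, m=100, k=4, secret=b"shared-key"):
    """Encode a name's q-grams into an m-bit Bloom filter with k keyed hashes."""
    bits = [0] * m
    for gram in qgrams(name):
        for i in range(k):
            h = hashlib.sha256(secret + str(i).encode() + gram.encode())
            bits[int(h.hexdigest(), 16) % m] = 1
    return bits

def dice(a, b):
    """Dice coefficient of two bit vectors: 2|A∩B| / (|A| + |B|)."""
    inter = sum(x & y for x, y in zip(a, b))
    return 2 * inter / (sum(a) + sum(b))

# a typo variant stays far more similar than an unrelated name
typo_sim = dice(bloom("schneider"), bloom("schnieder"))
other_sim = dice(bloom("schneider"), bloom("mueller"))
```

Because only the shared key holder can reproduce the hashes, the filters can be exchanged between data custodians without revealing the plaintext identifiers.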

  14. A bibliometric analysis of systematic reviews on vaccines and immunisation.

    PubMed

    Fernandes, Silke; Jit, Mark; Bozzani, Fiammetta; Griffiths, Ulla K; Scott, J Anthony G; Burchett, Helen E D

    2018-04-19

    SYSVAC is an online bibliographic database of systematic reviews and systematic review protocols on vaccines and immunisation, compiled by the London School of Hygiene & Tropical Medicine and hosted by the World Health Organization (WHO) through their National Immunization Technical Advisory Groups (NITAG) resource centre (www.nitag-resource.org). Here the development of the database and a bibliometric review of its content are presented, describing trends in the publication of policy-relevant systematic reviews on vaccines and immunisation from 2008 to 2016. Searches were conducted in seven scientific databases according to a standardized search protocol, initially in 2014, with the most recent update in January 2017. Abstracts and titles were screened according to specific inclusion criteria. All included publications were coded into relevant categories based on a standardized protocol and subsequently analysed for trends in time, topic, area of focus, population, and geographic location. After screening for inclusion criteria, 1285 systematic reviews were included in the database. While in 2008 there were only 34 systematic reviews on a vaccine-related topic, this increased to 322 in 2016. The most frequent pathogens/diseases studied were influenza, human papillomavirus, and pneumococcus. There were several areas of duplication and overlap. As more systematic reviews are published, it becomes increasingly time-consuming for decision-makers to identify relevant information among the ever-increasing volume available. The risk of duplication also increases, particularly given the current lack of coordination of systematic reviews on vaccine-related questions, both in terms of their commissioning and their execution. The SYSVAC database offers an accessible catalogue of vaccine-relevant systematic reviews with, where possible, access to or a link to the full text. SYSVAC provides a freely searchable platform to identify existing vaccine-policy-relevant systematic reviews. Systematic reviews will still need to be assessed for relevance to each specific question and for quality. Copyright © 2018. Published by Elsevier Ltd.

  15. Identification of a New Isoindole-2-yl Scaffold as a Qo and Qi Dual Inhibitor of Cytochrome bc1 Complex: Virtual Screening, Synthesis, and Biochemical Assay.

    PubMed

    Azizian, Homa; Bagherzadeh, Kowsar; Shahbazi, Sophia; Sharifi, Niusha; Amanlou, Massoud

    2017-09-18

    Respiratory chain ubiquinol-cytochrome (cyt) c oxidoreductase (cyt bc1 or complex III) has been demonstrated to be a promising target for numerous antibiotic and fungicide applications. In this study, a virtual screening of the NCI diversity database was carried out in order to find novel Qo/Qi cyt bc1 complex inhibitors. Structure-based virtual screening and molecular docking methodology were employed to further screen compounds with inhibitory activity against the cyt bc1 complex, after an extensive reliability-validation protocol using a cross-docking method and identification of the best-performing scoring functions. Subsequently, the application of a rational filtering procedure over the target database resulted in the elucidation of a novel class of potent cyt bc1 complex inhibitors with binding energies and biological activities comparable to those of the standard inhibitor, antimycin.

  16. A literature review: polypharmacy protocol for primary care.

    PubMed

    Skinner, Mary

    2015-01-01

    The purpose of this literature review is to critically evaluate published protocols on polypharmacy in adults ages 65 and older that are currently used in primary care settings and that may potentially lead to fewer adverse drug events. A review of OVID, CINAHL, EBSCO, Cochrane Library, Medline, and PubMed databases was completed using the following key words: protocol, guideline, geriatrics, elderly, older adult, polypharmacy, and primary care. Inclusion criteria were: articles in medical, nursing, and pharmacology journals with an intervention, protocol, or guideline addressing polypharmacy that led to fewer adverse drug events. Qualitative and quantitative studies were included. Exclusion criteria were: publications prior to the year 1992. A gap exists in the literature. No standardized protocol for addressing polypharmacy in the primary care setting was found. Mnemonics, algorithms, clinical practice guidelines, and clinical strategies for addressing polypharmacy in a variety of health care settings were found throughout the literature. Several screening instruments for use in primary care to assess potentially inappropriate prescription of medications in the elderly, such as the Beers Criteria and the STOPP screening tool, were identified. However, these screening instruments were not included in a standardized protocol to manage polypharmacy in primary care. Polypharmacy in the elderly is a critical problem that may result in adverse drug events such as falls, hospitalizations, and increased expenditures for both the patient and the health care system. No standardized protocols to address polypharmacy specific to the primary care setting were identified in this review of the literature. 
Given the growing population of elderly people in this country and the high number of medications they consume, it is critical to focus on utilizing a standardized protocol to address the potential harm of polypharmacy in the primary care setting and to evaluate its effects on patient outcomes. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. A privacy preserving protocol for tracking participants in phase I clinical trials.

    PubMed

    El Emam, Khaled; Farah, Hanna; Samet, Saeed; Essex, Aleksander; Jonker, Elizabeth; Kantarcioglu, Murat; Earle, Craig C

    2015-10-01

    Some phase 1 clinical trials offer strong financial incentives for healthy individuals to participate in their studies. There is evidence that some individuals enroll in multiple trials concurrently. This creates safety risks and introduces data quality problems into the trials. Our objective was to construct a privacy-preserving protocol to track phase 1 participants in order to detect concurrent enrollment. A protocol using secure probabilistic querying against a database of trial participants, allowing for screening during telephone interviews and on-site enrollment, was developed. The match variables consisted of demographic information. The accuracy (sensitivity, precision, and negative predictive value) of the matching and its computational performance in seconds were measured under simulated environments. Accuracy was also compared to that of non-secure matching methods. The protocol's performance scales linearly with the database size. At the largest database size of 20,000 participants, a query takes under 20 seconds on a 64-core machine. Sensitivity, precision, and negative predictive value of the queries were consistently at or above 0.9, and were very similar to non-secure versions of the protocol. The protocol provides a reasonable solution to the concurrent enrollment problem in phase 1 clinical trials, and is able to ensure that personal information about participants is kept secure. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
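The accuracy measures reported for the matching protocol are standard confusion-matrix quantities. A minimal sketch, with entirely hypothetical counts rather than figures from the study, shows how they are computed:

```python
def accuracy_metrics(tp, fp, tn, fn):
    """Confusion-matrix accuracy measures for record matching."""
    sensitivity = tp / (tp + fn)  # fraction of true concurrent enrollees detected
    precision = tp / (tp + fp)    # fraction of flagged matches that are real
    npv = tn / (tn + fn)          # fraction of non-flags that are truly non-matches
    return sensitivity, precision, npv

# hypothetical counts from one simulated screening run
sens, prec, npv = accuracy_metrics(tp=90, fp=8, tn=19000, fn=10)
```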

  18. Integrative neuroscience: the role of a standardized database.

    PubMed

    Gordon, E; Cooper, N; Rennie, C; Hermens, D; Williams, L M

    2005-04-01

    Most brain-related databases bring together specialized information, with a growing number that include neuroimaging measures. This article outlines the potential use of, and insights from, the first entirely standardized and centralized database, which integrates information from neuroimaging measures (EEG, event-related potentials (ERP), structural/functional MRI), arousal (skin conductance responses (SCRs), heart rate, respiration), neuropsychological and personality tests, genomics, and demographics: the Brain Resource International Database. It comprises data from over 2000 "normative" subjects and a growing number of patients with neurological and psychiatric illnesses, acquired from over 50 laboratories (in the U.S.A., United Kingdom, Holland, South Africa, Israel, and Australia), all with identical equipment and experimental procedures. Three primary goals of this database are to quantify individual differences in normative brain function, to compare an individual's performance to their database peers, and to provide a robust normative framework for clinical assessment and treatment prediction. We present three example demonstrations in relation to these goals. First, we show how consistent age differences may be quantified when large subject numbers are available, using EEG and ERP data from nearly 2000 stringently screened normative subjects. Second, the use of a normalization technique provides a means to compare clinical subjects (50 ADHD subjects in this study) to the normative database with the effects of age and gender taken into account. Third, we show how a profile of EEG/ERP and autonomic measures potentially provides a means to predict treatment response in ADHD subjects. The example data consist of EEG under eyes-open and eyes-closed conditions and ERP data for auditory oddball, working memory, and Go-NoGo paradigms. 
Autonomic measures of skin conductance (tonic skin conductance level, SCL, and phasic skin conductance responses, SCRs) were acquired simultaneously with central EEG/ERP measures. The findings show that the power of large samples, tested using standardized protocols, allows for the quantification of individual differences that can subsequently be used to control such variation and to enhance the sensitivity and specificity of comparisons between normative and clinical groups. In terms of broader significance, the combination of size and multidimensional measures tapping the brain's core cognitive competencies, may provide a normative and evidence-based framework for individually-based assessments in "Personalized Medicine."
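The normalization technique is not specified in this abstract; one common approach, sketched here purely as a hypothetical illustration, expresses a subject's measure as a z-score against age- and sex-matched peers drawn from the normative database. All records, field names, and the age window below are invented:

```python
from statistics import mean, stdev

# hypothetical normative records: (age, sex, measure)
normative = [
    (25, "F", 10.2), (27, "F", 9.8), (24, "F", 10.5),
    (26, "F", 10.0), (60, "M", 12.1), (62, "M", 12.4),
]

def peers(records, age, sex, window=5):
    """Select normative peers of the same sex within an age window."""
    return [m for a, s, m in records if s == sex and abs(a - age) <= window]

def z_score(records, age, sex, value):
    """Express a subject's value relative to their database peers."""
    ref = peers(records, age, sex)
    return (value - mean(ref)) / stdev(ref)

# a 26-year-old female subject scoring 11.0 on the hypothetical measure
z = z_score(normative, age=26, sex="F", value=11.0)
```

A z-score far from zero flags the subject as atypical relative to demographically comparable normative peers, which is the sense in which age and gender effects are "taken into account".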

  19. Genevar: a database and Java application for the analysis and visualization of SNP-gene associations in eQTL studies

    PubMed Central

    Yang, Tsun-Po; Beazley, Claude; Montgomery, Stephen B.; Dimas, Antigone S.; Gutierrez-Arcelus, Maria; Stranger, Barbara E.; Deloukas, Panos; Dermitzakis, Emmanouil T.

    2010-01-01

    Summary: Genevar (GENe Expression VARiation) is a database and Java tool designed to integrate multiple datasets, and provides analysis and visualization of associations between sequence variation and gene expression. Genevar allows researchers to investigate expression quantitative trait loci (eQTL) associations within a gene locus of interest in real time. The database and application can be installed on a standard computer in database mode and, in addition, on a server to share discoveries among affiliations or the broader community over the Internet via web services protocols. Availability: http://www.sanger.ac.uk/resources/software/genevar Contact: emmanouil.dermitzakis@unige.ch PMID:20702402

  20. Informatics in radiology: use of CouchDB for document-based storage of DICOM objects.

    PubMed

    Rascovsky, Simón J; Delgado, Jorge A; Sanz, Alexander; Calvo, Víctor D; Castrillón, Gabriel

    2012-01-01

    Picture archiving and communication systems traditionally have depended on schema-based Structured Query Language (SQL) databases for imaging data management. To optimize database size and performance, many such systems store a reduced set of Digital Imaging and Communications in Medicine (DICOM) metadata, discarding informational content that might be needed in the future. As an alternative to traditional database systems, document-based key-value stores recently have gained popularity. These systems store documents containing key-value pairs that facilitate data searches without predefined schemas. Document-based key-value stores are especially suited to archive DICOM objects because DICOM metadata are highly heterogeneous collections of tag-value pairs conveying specific information about imaging modalities, acquisition protocols, and vendor-supported postprocessing options. The authors used an open-source document-based database management system (Apache CouchDB) to create and test two such databases; CouchDB was selected for its overall ease of use, capability for managing attachments, and reliance on HTTP and Representational State Transfer standards for accessing and retrieving data. A large database was created first in which the DICOM metadata from 5880 anonymized magnetic resonance imaging studies (1,949,753 images) were loaded by using a Ruby script. To provide the usual DICOM query functionality, several predefined "views" (standard queries) were created by using JavaScript. For performance comparison, the same queries were executed in both the CouchDB database and a SQL-based DICOM archive. The capabilities of CouchDB for attachment management and database replication were separately assessed in tests of a similar, smaller database. 
Results showed that CouchDB allowed efficient storage and interrogation of all DICOM objects; with the use of information retrieval algorithms such as map-reduce, all the DICOM metadata stored in the large database were searchable with only a minimal increase in retrieval time over that with the traditional database management system. Results also indicated possible uses for document-based databases in data mining applications such as dose monitoring, quality assurance, and protocol optimization. RSNA, 2012
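CouchDB views are ordinarily written as JavaScript map and reduce functions inside design documents; the sketch below emulates that map-reduce pattern in Python over schema-free DICOM-like documents, purely to illustrate how heterogeneous tag-value metadata can be queried without a predefined schema. The documents and tag names are hypothetical:

```python
# Each document is a free-form dict of DICOM tag-value pairs;
# documents need not share a schema.
docs = [
    {"Modality": "MR", "StudyDate": "20110102", "SeriesDescription": "T1 AX"},
    {"Modality": "MR", "StudyDate": "20110105", "SeriesDescription": "T2 AX"},
    {"Modality": "CT", "StudyDate": "20110103"},  # no SeriesDescription tag
]

def map_by_modality(doc):
    """Emit (key, value) pairs, in the style of a CouchDB map function."""
    if "Modality" in doc:
        yield doc["Modality"], 1

def reduce_count(values):
    """Combine the emitted values for one key, like a CouchDB reduce."""
    return sum(values)

def run_view(documents, map_fn, reduce_fn):
    """Apply the map function to every document, then reduce per key."""
    keyed = {}
    for doc in documents:
        for key, value in map_fn(doc):
            keyed.setdefault(key, []).append(value)
    return {key: reduce_fn(vals) for key, vals in keyed.items()}

counts = run_view(docs, map_by_modality, reduce_count)
```

In CouchDB itself, the view result is precomputed and incrementally maintained, which is why querying the large DICOM database added only minimal retrieval time.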

  1. Prospective Evaluation of a Matrix-Assisted Laser Desorption Ionization–Time of Flight Mass Spectrometry System in a Hospital Clinical Microbiology Laboratory for Identification of Bacteria and Yeasts: a Bench-by-Bench Study for Assessing the Impact on Time to Identification and Cost-Effectiveness

    PubMed Central

    Tan, K. E.; Ellis, B. C.; Lee, R.; Stamper, P. D.; Zhang, S. X.

    2012-01-01

    Matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) has been found to be an accurate, rapid, and inexpensive method for the identification of bacteria and yeasts. Previous evaluations have compared the accuracy, time to identification, and costs of the MALDI-TOF MS method against standard identification systems or commercial panels. In this prospective study, we compared a protocol incorporating MALDI-TOF MS (MALDI protocol) with the current standard identification protocols (standard protocol) to determine the performance in actual practice using a specimen-based, bench-by-bench approach. The potential impact on time to identification (TTI) and costs had MALDI-TOF MS been the first-line identification method was quantitated. The MALDI protocol includes supplementary tests, notably for Streptococcus pneumoniae and Shigella, and indications for repeat MALDI-TOF MS attempts, often not measured in previous studies. A total of 952 isolates (824 bacterial isolates and 128 yeast isolates) recovered from 2,214 specimens were assessed using the MALDI protocol. Compared with standard protocols, the MALDI protocol provided identifications 1.45 days earlier on average (P < 0.001). In our laboratory, we anticipate that the incorporation of the MALDI protocol can reduce reagent and labor costs of identification by $102,424 or 56.9% within 12 months. The model included the fixed annual costs of the MALDI-TOF MS, such as the cost of protein standards and instrument maintenance, and the annual prevalence of organisms encountered in our laboratory. This comprehensive cost analysis model can be generalized to other moderate- to high-volume laboratories. PMID:22855510

  2. Prospective evaluation of a matrix-assisted laser desorption ionization-time of flight mass spectrometry system in a hospital clinical microbiology laboratory for identification of bacteria and yeasts: a bench-by-bench study for assessing the impact on time to identification and cost-effectiveness.

    PubMed

    Tan, K E; Ellis, B C; Lee, R; Stamper, P D; Zhang, S X; Carroll, K C

    2012-10-01

    Matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) has been found to be an accurate, rapid, and inexpensive method for the identification of bacteria and yeasts. Previous evaluations have compared the accuracy, time to identification, and costs of the MALDI-TOF MS method against standard identification systems or commercial panels. In this prospective study, we compared a protocol incorporating MALDI-TOF MS (MALDI protocol) with the current standard identification protocols (standard protocol) to determine the performance in actual practice using a specimen-based, bench-by-bench approach. The potential impact on time to identification (TTI) and costs had MALDI-TOF MS been the first-line identification method was quantitated. The MALDI protocol includes supplementary tests, notably for Streptococcus pneumoniae and Shigella, and indications for repeat MALDI-TOF MS attempts, often not measured in previous studies. A total of 952 isolates (824 bacterial isolates and 128 yeast isolates) recovered from 2,214 specimens were assessed using the MALDI protocol. Compared with standard protocols, the MALDI protocol provided identifications 1.45 days earlier on average (P < 0.001). In our laboratory, we anticipate that the incorporation of the MALDI protocol can reduce reagent and labor costs of identification by $102,424 or 56.9% within 12 months. The model included the fixed annual costs of the MALDI-TOF MS, such as the cost of protein standards and instrument maintenance, and the annual prevalence of organisms encountered in our laboratory. This comprehensive cost analysis model can be generalized to other moderate- to high-volume laboratories.

  3. Technical note: comparability of Hrdlička's Catalog of Crania data based on measurement landmark definitions.

    PubMed

    Stojanowski, Christopher M; Euber, Julie K

    2011-09-01

    Archival sources of data are critical anthropological resources that inform inferences about human biology and evolutionary history. Craniometric data are one of the most widely available sources of information on human population history because craniometrics were critical in early 20th century debates about race and biological variation. As such, extensive databases of raw craniometric data were published at the same time that the field was working to standardize measurement protocols. Hrdlička published between 10 and 16 raw craniometric variables for over 8,000 individuals in a series of seven catalogs throughout his career. With a New World emphasis, Hrdlička's data complement those of Howells (1973, 1989), and the two databases have been combined in the past. In this note we verify the consistency of Hrdlička's measurement protocol throughout the Catalog series and compare these definitions to those used by Howells. We conclude that 12 measurements are comparable throughout the Catalogs, with five of these equivalent to Howells' measurements: maximum cranial breadth (XCB), basion-bregma height (BBH), maximum bizygomatic breadth (ZYB), nasal breadth (NLB), and breadth of the upper alveolar arch (MAB). Most of Hrdlička's measurements are not strictly comparable to those of Howells, thus limiting the utility of combined datasets for multivariate analysis. Four measurements are inconsistently defined by Hrdlička and we recommend not using these data: nasal height, orbit breadth, orbit height, and menton-nasion height. This note highlights Hrdlička's tireless efforts at data collection and re-emphasizes observer error as a legitimate concern in craniometry as the field shifts to digital morphometric data acquisition. © 2011 Wiley-Liss, Inc.

  4. The Native Plant Propagation Protocol Database: 16 years of sharing information

    Treesearch

    R. Kasten Dumroese; Thomas D. Landis

    2016-01-01

    The Native Plant Propagation Protocol Database was launched in 2001 to provide an online mechanism for sharing information about growing native plants. It relies on plant propagators to upload their protocols (detailed directions for growing particular native plants) so that others may benefit from their experience. Currently the database has nearly 3000 protocols and...

  5. Now That We've Found the "Hidden Web," What Can We Do with It?

    ERIC Educational Resources Information Center

    Cole, Timothy W.; Kaczmarek, Joanne; Marty, Paul F.; Prom, Christopher J.; Sandore, Beth; Shreeves, Sarah

    The Open Archives Initiative (OAI) Protocol for Metadata Harvesting (PMH) is designed to facilitate discovery of the "hidden web" of scholarly information, such as that contained in databases, finding aids, and XML documents. OAI-PMH supports standardized exchange of metadata describing items in disparate collections, such as those…
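OAI-PMH requests are plain HTTP GETs whose query parameters name a protocol verb. A minimal sketch of building a ListRecords request with the standard library follows; the repository endpoint and set name are hypothetical, though the `verb` and `metadataPrefix=oai_dc` parameters are part of the protocol (every repository must support Dublin Core):

```python
from urllib.parse import urlencode

def oai_request(base_url, verb, **params):
    """Build an OAI-PMH GET request URL for the given protocol verb."""
    query = {"verb": verb, **params}
    return f"{base_url}?{urlencode(query)}"

# hypothetical repository endpoint; "set" enables selective harvesting
url = oai_request(
    "http://example.org/oai",
    "ListRecords",
    metadataPrefix="oai_dc",
    set="finding-aids",
)
```

A harvester would fetch this URL, parse the XML response, and follow any resumption token the repository returns to page through large result sets.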

  6. BioBarcode: a general DNA barcoding database and server platform for Asian biodiversity resources.

    PubMed

    Lim, Jeongheui; Kim, Sang-Yoon; Kim, Sungmin; Eo, Hae-Seok; Kim, Chang-Bae; Paek, Woon Kee; Kim, Won; Bhak, Jong

    2009-12-03

    DNA barcoding provides a rapid, accurate, and standardized method for species-level identification using short DNA sequences. Such a standardized identification method is useful for mapping all the species on Earth, particularly now that DNA sequencing technology is cheaply available. There are many nations in Asia with biodiversity resources that need to be mapped and registered in databases. We have built a general DNA barcode data processing system, BioBarcode, with open-source software, as a general-purpose database and server platform. It uses the MySQL 5.0 RDBMS, BLAST2, and the Apache httpd server. An exemplary BioBarcode database has around 11,300 specimen entries (including GenBank data) and registers biological species to map their genetic relationships. The BioBarcode database contains a chromatogram viewer which improves performance in DNA sequence analyses. Asia has a very high degree of biodiversity, and the BioBarcode database server system aims to provide an efficient bioinformatics protocol that can be freely used by Asian researchers and research organizations interested in DNA barcoding. BioBarcode promotes the rapid acquisition of biological species DNA sequence data that meet global standards by providing specialized services, and provides useful tools that will make barcoding cheaper and faster in the biodiversity community, such as standardization, deposition, management, and analysis of DNA barcode data. The system can be downloaded upon request, and an exemplary server has been constructed with which to build an Asian biodiversity system: http://www.asianbarcode.org.

  7. Accelerated rehabilitation compared with a standard protocol after distal radial fractures treated with volar open reduction and internal fixation: a prospective, randomized, controlled study.

    PubMed

    Brehmer, Jess L; Husband, Jeffrey B

    2014-10-01

    There are relatively few studies in the literature that specifically evaluate accelerated rehabilitation protocols for distal radial fractures treated with open reduction and internal fixation (ORIF). The purpose of this study was to compare the early postoperative outcomes (at zero to twelve weeks postoperatively) of patients enrolled in an accelerated rehabilitation protocol with those of patients enrolled in a standard rehabilitation protocol following ORIF for a distal radial fracture. We hypothesized that patients with accelerated rehabilitation after volar ORIF for a distal radial fracture would have an earlier return to function compared with patients who followed a standard protocol. From November 2007 to November 2010, eighty-one patients with an unstable distal radial fracture were prospectively randomized to follow either an accelerated or a standard rehabilitation protocol after undergoing ORIF with a volar plate for a distal radial fracture. Both groups began with gentle active range of motion at three to five days postoperatively. At two weeks, the accelerated group initiated wrist/forearm passive range of motion and strengthening exercises, whereas the standard group initiated passive range of motion and strengthening at six weeks postoperatively. Patients were assessed at three to five days, two weeks, three weeks, four weeks, six weeks, eight weeks, twelve weeks, and six months postoperatively. Outcomes included Disabilities of the Arm, Shoulder and Hand (DASH) scores (primary outcome) and measurements of wrist flexion/extension, supination, pronation, grip strength, and palmar pinch. The patients in the accelerated group had better mobility, strength, and DASH scores at the early postoperative time points (zero to eight weeks postoperatively) compared with the patients in the standard rehabilitation group. The difference between the groups was both clinically relevant and statistically significant. 
Patients who follow an accelerated rehabilitation protocol that emphasizes motion immediately postoperatively and initiates strengthening at two weeks after volar ORIF of a distal radial fracture have an earlier return to function than patients who follow a more standard rehabilitation protocol. Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence. Copyright © 2014 by The Journal of Bone and Joint Surgery, Incorporated.

  8. Evaluation of Protocol Uniformity Concerning Laparoscopic Cholecystectomy in The Netherlands

    PubMed Central

    Goossens, Richard H. M.; van Eijk, Daan J.; Lange, Johan F.

    2008-01-01

    Background Iatrogenic bile duct injury remains a current complication of laparoscopic cholecystectomy. One uniform and standardized protocol, based on the “critical view of safety” concept of Strasberg, should reduce the incidence of this complication. Furthermore, owing to the rapid development of minimally invasive surgery, technicians are becoming more frequently involved. To improve communication between the operating team and technicians, standardized actions should also be defined. The aim of this study was to compare existing protocols for laparoscopic cholecystectomy from various Dutch hospitals. Methods Fifteen Dutch hospitals were contacted for evaluation of their protocols for laparoscopic cholecystectomy. All evaluated protocols were divided into six steps and were compared accordingly. Results In total, 13 hospitals responded—5 academic hospitals, 5 teaching hospitals, 3 community hospitals—of which 10 protocols were usable for comparison. Concerning the trocar positions, only minor differences were found. The concept of “critical view of safety” was represented in just one protocol. Furthermore, the order of clipping and cutting the cystic artery and duct differed. Descriptions of instruments and apparatus were also inconsistent. Conclusions Present protocols differ too much to define a universal procedure among surgeons in The Netherlands. The authors propose one (inter)national standardized protocol, including standardized actions. This uniform standardized protocol has to be officially released and recommended by national scientific associations (e.g., the Dutch Society of Surgery) or international societies (e.g., European Association for Endoscopic Surgery and Society of American Gastrointestinal and Endoscopic Surgeons). The aim is to improve patient safety and professional communication, which are necessary for new developments. PMID:18224485

  9. The Ontological Perspectives of the Semantic Web and the Metadata Harvesting Protocol: Applications of Metadata for Improving Web Search.

    ERIC Educational Resources Information Center

    Fast, Karl V.; Campbell, D. Grant

    2001-01-01

    Compares the implied ontological frameworks of the Open Archives Initiative Protocol for Metadata Harvesting and the World Wide Web Consortium's Semantic Web. Discusses current search engine technology, semantic markup, indexing principles of special libraries and online databases, and componentization and the distinction between data and…

  10. Can different primary care databases produce comparable estimates of burden of disease: results of a study exploring venous leg ulceration.

    PubMed

    Petherick, Emily S; Pickett, Kate E; Cullum, Nicky A

    2015-08-01

    Primary care databases from the UK have been widely used to produce evidence on the epidemiology and health service usage of a wide range of conditions. To date there have been few evaluations of the comparability of estimates between different sources of these data. To estimate the comparability of two widely used primary care databases, the Health Improvement Network Database (THIN) and the General Practice Research Database (GPRD), using venous leg ulceration as an exemplar condition. Cross-database prospective cohort comparison of the GPRD and THIN databases using data from 1998 to 2006. A data set was extracted from both databases containing all cases of persons aged 20 years or greater with a database diagnosis of venous leg ulceration recorded for the period 1998-2006. Annual rates of incidence and prevalence of venous leg ulceration were calculated within each database, standardized to the European standard population, and compared using standardized rate ratios. Comparable estimates of venous leg ulcer incidence from the GPRD and THIN databases could be obtained using data from 2000 to 2006, and of prevalence using data from 2001 to 2006. Recent data collected by these two databases are more likely to produce comparable results of the burden of venous leg ulceration. These results require confirmation in other disease areas to enable researchers to have confidence in the comparability of findings from these two widely used primary care research resources. © The Author 2015. Published by Oxford University Press. All rights reserved.
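    The standardized rate ratio used in this record follows from direct standardization: weight each database's age-specific rates by a common standard population, then take the ratio. A minimal Python sketch with illustrative age bands, rates, and weights (not the actual European Standard Population weights or THIN/GPRD data):

```python
# Direct standardization with invented numbers; only the method is real.

def directly_standardized_rate(age_rates, standard_weights):
    """Weighted sum of age-band rates; weights must sum to 1."""
    assert abs(sum(standard_weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(age_rates, standard_weights))

# Hypothetical incidence per 1000 person-years in bands 20-44, 45-64, 65+
thin_rates = [0.10, 0.50, 2.00]
gprd_rates = [0.12, 0.55, 1.80]
weights = [0.5, 0.3, 0.2]          # illustrative standard-population shares

srr = (directly_standardized_rate(thin_rates, weights)
       / directly_standardized_rate(gprd_rates, weights))
# An SRR near 1 indicates the two databases yield comparable burden estimates.
```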

  11. The use of PDAs to collect baseline survey data: lessons learned from a pilot project in Bolivia.

    PubMed

    Escandon, I N; Searing, H; Goldberg, R; Duran, R; Arce, J Monterrey

    2008-01-01

    We compared the use of personal digital assistants (PDAs) against the use of standard paper questionnaires for collecting survey data. The evaluation consisted of qualitative approaches to document the process of introducing PDAs. Fieldwork was carried out during June-July 2005 at 12 sites in Bolivia. Data collectors reacted positively to the use of the PDAs and noted the advantages and disadvantages of paper and PDA data collection. A number of difficulties encountered in the use of PDA technology serve as a warning for investigators planning its adoption. Problems included incompatible data files (which impeded the ability to interpret data), an inadequate back-up protocol, and lack of a good 'fit' between the technology and the study. Ensuring the existence of a back-end database, developing an appropriate and adequate back-up protocol, and assessing whether a technology 'fits' the project are important factors in weighing the decision to collect data using PDAs.

  12. The effects of lasers on bond strength to ceramic materials: A systematic review and meta-analysis.

    PubMed

    García-Sanz, Verónica; Paredes-Gallardo, Vanessa; Mendoza-Yero, Omel; Carbonell-Leal, Miguel; Albaladejo, Alberto; Montiel-Company, José María; Bellot-Arcís, Carlos

    2018-01-01

    Lasers have recently been introduced as an alternative means of conditioning dental ceramic surfaces in order to enhance their adhesive strength to cements and other materials. The present systematic review and meta-analysis aimed to review and quantitatively analyze the available literature in order to determine which bond protocols and laser types are the most effective. A search was conducted in the Pubmed, Embase and Scopus databases for papers published up to April 2017. PRISMA guidelines for systematic review and meta-analysis were followed. Fifty-two papers were eligible for inclusion in the review. Twenty-five studies were synthesized quantitatively. Lasers were found to increase bond strength of ceramic surfaces to resin cements and composites when compared with control specimens (p-value < 0.01), whereas no significant differences were found in comparison with air-particle abraded surfaces. High variability can be observed in adhesion values between different analyses, pointing to a need to standardize study protocols and to determine the optimal parameters for each laser type.
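    A quantitative synthesis of this kind typically pools per-study effects with inverse-variance weights. A hedged fixed-effect sketch with invented bond-strength differences (the review itself may well have used a random-effects model):

```python
import math

def pooled_effect_fixed(effects, ses):
    """Inverse-variance fixed-effect pooled estimate and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return est, math.sqrt(1.0 / sum(weights))

# Hypothetical laser-vs-control mean differences in bond strength (MPa)
effects = [4.0, 5.5, 3.2]
ses = [1.0, 1.5, 0.8]
est, se = pooled_effect_fixed(effects, ses)
z = est / se                      # |z| > 1.96 corresponds to p < 0.05
```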

  13. Performing private database queries in a real-world environment using a quantum protocol.

    PubMed

    Chan, Philip; Lucio-Martinez, Itzel; Mo, Xiaofan; Simon, Christoph; Tittel, Wolfgang

    2014-06-10

    In the well-studied cryptographic primitive 1-out-of-N oblivious transfer, a user retrieves a single element from a database of size N without the database learning which element was retrieved. While it has previously been shown that a secure implementation of 1-out-of-N oblivious transfer is impossible against arbitrarily powerful adversaries, recent research has revealed an interesting class of private query protocols based on quantum mechanics in a cheat sensitive model. Specifically, a practical protocol does not need to guarantee that the database provider cannot learn what element was retrieved if doing so carries the risk of detection. The latter is sufficient motivation to keep a database provider honest. However, none of the previously proposed protocols could cope with noisy channels. Here we present a fault-tolerant private query protocol, in which the novel error correction procedure is integral to the security of the protocol. Furthermore, we present a proof-of-concept demonstration of the protocol over a deployed fibre.

  14. Performing private database queries in a real-world environment using a quantum protocol

    PubMed Central

    Chan, Philip; Lucio-Martinez, Itzel; Mo, Xiaofan; Simon, Christoph; Tittel, Wolfgang

    2014-01-01

    In the well-studied cryptographic primitive 1-out-of-N oblivious transfer, a user retrieves a single element from a database of size N without the database learning which element was retrieved. While it has previously been shown that a secure implementation of 1-out-of-N oblivious transfer is impossible against arbitrarily powerful adversaries, recent research has revealed an interesting class of private query protocols based on quantum mechanics in a cheat sensitive model. Specifically, a practical protocol does not need to guarantee that the database provider cannot learn what element was retrieved if doing so carries the risk of detection. The latter is sufficient motivation to keep a database provider honest. However, none of the previously proposed protocols could cope with noisy channels. Here we present a fault-tolerant private query protocol, in which the novel error correction procedure is integral to the security of the protocol. Furthermore, we present a proof-of-concept demonstration of the protocol over a deployed fibre. PMID:24913129

  15. The Impact of Mechanical and Restricted Kinematic Alignment on Knee Anatomy in Total Knee Arthroplasty.

    PubMed

    Almaawi, Abdulaziz M; Hutt, Jonathan R B; Masse, Vincent; Lavigne, Martin; Vendittoli, Pascal-Andre

    2017-07-01

    Total knee arthroplasty (TKA), aiming at neutral mechanical alignment (MA), inevitably modifies the patient's native knee anatomy. Another option is kinematic alignment (KA), which aims to restore the original anatomy of the knee. The aim of this study was to evaluate the variations in lower limb anatomy of a patient population scheduled for TKA, and to assess the use of a restricted KA TKA protocol and compare the resulting anatomic modifications with the standard MA technique. A total of 4884 knee computed tomography scans were analyzed from a database of patients undergoing TKA with patient-specific instrumentation. The lateral distal femoral angle (LDFA), medial proximal tibial angle (MPTA), and hip-knee-ankle angle (HKA) were measured. Bone resections were compared using a standard MA and a restricted KA aiming for independent tibial and femoral cuts of maximum ±5° deviation from the coronal mechanical axis and a resulting overall coronal HKA within ±3° of neutral. The mean preoperative MPTA was 2.9° varus, LDFA was 2.7° valgus, and overall HKA was 0.1° varus. Using our protocol, 2475 knees (51%) could have undergone KA without adjustment. To include 4062 cases (83%), mean corrections of 0.5° for MPTA and 0.3° for LDFA were needed, significantly less than with MA (3.3° for MPTA and 3.2° for LDFA; P < .001). The range of knee anatomy in patients scheduled for TKA is wide. MA leads to greater modifications of knee joint anatomy. To avoid reproducing extreme anatomy, the proposed restricted KA protocol provides an interesting hybrid option between MA and true KA. Copyright © 2017 Elsevier Inc. All rights reserved.
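    The restricted KA rule reported here (independent cuts within ±5° of the mechanical axis, overall HKA within ±3° of neutral) can be expressed as a clamping procedure. A toy sketch; the sign convention, the additive HKA model, and the even split of any residual correction are assumptions for illustration, not the authors' algorithm:

```python
def clamp(x, lim):
    """Limit x to the interval [-lim, lim]."""
    return max(-lim, min(lim, x))

def restricted_ka(tibial_dev, femoral_dev, cut_limit=5.0, hka_limit=3.0):
    """Plan cuts from signed coronal deviations (degrees from neutral),
    assuming the overall HKA deviation is simply their sum."""
    tib = clamp(tibial_dev, cut_limit)
    fem = clamp(femoral_dev, cut_limit)
    excess = (tib + fem) - clamp(tib + fem, hka_limit)
    tib -= excess / 2             # split any residual correction evenly
    fem -= excess / 2
    correction = abs(tibial_dev - tib) + abs(femoral_dev - fem)
    return tib, fem, correction
```

A knee already inside both limits needs no correction at all, mirroring the 51% of knees the study could align without adjustment.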

  16. The risk of acute liver injury associated with the use of antibiotics--evaluating robustness of results in the pharmacoepidemiological research on outcomes of therapeutics by a European consortium (PROTECT) project.

    PubMed

    Udo, Renate; Tcherny-Lessenot, Stéphanie; Brauer, Ruth; Dolin, Paul; Irvine, David; Wang, Yunxun; Auclert, Laurent; Juhaeri, Juhaeri; Kurz, Xavier; Abenhaim, Lucien; Grimaldi, Lamiae; De Bruin, Marie L

    2016-03-01

    To examine the robustness of findings of case-control studies on the association between acute liver injury (ALI) and antibiotic use in the following different situations: (i) Replication of a protocol in different databases, with different data types, as well as replication in the same database, but performed by a different research team. (ii) Varying algorithms to identify cases, with and without manual case validation. (iii) Different exposure windows for time at risk. Five case-control studies in four different databases were performed with a common study protocol as starting point to harmonize study outcome definitions, exposure definitions and statistical analyses. All five studies showed an increased risk of ALI associated with antibiotic use ranging from OR 2.6 (95% CI 1.3-5.4) to 7.7 (95% CI 2.0-29.3). Comparable trends could be observed in the five studies: (i) without manual validation the use of the narrowest definition for ALI showed higher risk estimates, (ii) narrow and broad algorithm definitions followed by manual validation of cases resulted in similar risk estimates, and (iii) the use of a larger window (30 days vs 14 days) to define time at risk led to a decrease in risk estimates. Reproduction of a study using a predefined protocol in different database settings is feasible, although assumptions had to be made and amendments in the protocol were inevitable. Despite differences, the strength of association was comparable between the studies. In addition, the impact of varying outcome definitions and time windows showed similar trends within the data sources. Copyright © 2015 John Wiley & Sons, Ltd.
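    The odds ratios with 95% confidence intervals reported across the five studies are standard 2×2 case-control computations (Woolf's logit method). A sketch with hypothetical counts chosen so the point estimate matches the lowest reported OR of 2.6:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI from a 2x2 table:
    a/b = exposed/unexposed cases, c/d = exposed/unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(20, 10, 100, 130)   # invented counts
```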

  17. Abbreviated Combined MR Protocol: A New Faster Strategy for Characterizing Breast Lesions.

    PubMed

    Moschetta, Marco; Telegrafo, Michele; Rella, Leonarda; Stabile Ianora, Amato Antonio; Angelelli, Giuseppe

    2016-06-01

    The use of an abbreviated magnetic resonance (MR) protocol has been recently proposed for cancer screening. The aim of our study is to evaluate the diagnostic accuracy of an abbreviated MR protocol combining short TI inversion recovery (STIR), turbo-spin-echo (TSE)-T2 sequences, a pre-contrast T1, and a single intermediate (3 minutes after contrast injection) post-contrast T1 sequence for characterizing breast lesions. A total of 470 patients underwent breast MR examination for screening, problem solving, or preoperative staging. Two experienced radiologists evaluated both standard and abbreviated protocols in consensus. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and diagnostic accuracy for both protocols were calculated (with the histological findings and 6-month ultrasound follow-up as the reference standard) and compared with the McNemar test. The post-processing and interpretation times for the MR images were compared with the paired t test. In 177 of 470 (38%) patients, the MR sequences detected 185 breast lesions. Standard and abbreviated protocols obtained sensitivity, specificity, diagnostic accuracy, PPV, and NPV values respectively of 92%, 92%, 92%, 68%, and 98% and of 89%, 91%, 91%, 64%, and 98%, with no statistically significant difference between the two protocols. The mean post-processing and interpretation times were, respectively, 7 ± 1 minutes and 6 ± 3.2 minutes for the standard protocol and 1 ± 1.2 minutes and 2 ± 1.2 minutes for the abbreviated protocol, with a statistically significant difference (P < .01). An abbreviated combined MR protocol represents a time-saving tool for radiologists and patients, with the same diagnostic potential as the standard protocol in patients undergoing breast MRI for screening, problem solving, or preoperative staging. Copyright © 2016 Elsevier Inc. All rights reserved.
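    The sensitivity, specificity, PPV, NPV, and accuracy figures compared here all derive from a 2×2 confusion matrix against the reference standard. A sketch with hypothetical counts chosen to reproduce roughly the standard protocol's 92/92/68/98 profile:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV, accuracy from a confusion matrix."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    acc = (tp + tn) / (tp + fp + fn + tn)
    return sens, spec, ppv, npv, acc

# Invented counts, not the study's data
sens, spec, ppv, npv, acc = diagnostic_metrics(tp=46, fp=22, fn=4, tn=253)
```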

  18. BioBarcode: a general DNA barcoding database and server platform for Asian biodiversity resources

    PubMed Central

    2009-01-01

    Background DNA barcoding provides a rapid, accurate, and standardized method for species-level identification using short DNA sequences. Such a standardized identification method is useful for mapping all the species on Earth, particularly when DNA sequencing technology is cheaply available. There are many nations in Asia with many biodiversity resources that need to be mapped and registered in databases. Results We have built a general DNA barcode data processing system, BioBarcode, with open source software - which is a general purpose database and server. It uses mySQL RDBMS 5.0, BLAST2, and Apache httpd server. An exemplary database of BioBarcode has around 11,300 specimen entries (including GenBank data) and registers the biological species to map their genetic relationships. The BioBarcode database contains a chromatogram viewer which improves the performance in DNA sequence analyses. Conclusion Asia has a very high degree of biodiversity and the BioBarcode database server system aims to provide an efficient bioinformatics protocol that can be freely used by Asian researchers and research organizations interested in DNA barcoding. The BioBarcode promotes the rapid acquisition of biological species DNA sequence data that meet global standards by providing specialized services, and provides useful tools that will make barcoding cheaper and faster in the biodiversity community such as standardization, depository, management, and analysis of DNA barcode data. The system can be downloaded upon request, and an exemplary server has been constructed with which to build an Asian biodiversity system http://www.asianbarcode.org. PMID:19958506
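    Species-level identification from a barcode reduces to finding the reference sequence with the highest similarity above a threshold (in practice via BLAST; here a toy positional identity stands in). Species names, sequences, and the 97% threshold are all illustrative:

```python
def identity(seq1, seq2):
    """Fraction of matching positions (toy stand-in for a BLAST alignment)."""
    n = min(len(seq1), len(seq2))
    return sum(a == b for a, b in zip(seq1, seq2)) / n

def assign_species(query, reference_db, threshold=0.97):
    """Best-matching species if its identity clears the threshold, else None."""
    best_sp, best_id = max(
        ((sp, identity(query, ref)) for sp, ref in reference_db.items()),
        key=lambda t: t[1])
    return best_sp if best_id >= threshold else None

reference_db = {"Species A": "ACGTACGTAC", "Species B": "ACGTTTGTAC"}
```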

  19. Comparison of EPA Method 1615 RT-qPCR Assays in Standard and Kit Format

    EPA Science Inventory

    EPA Method 1615 contains protocols for measuring enterovirus and norovirus by reverse transcription quantitative polymerase chain reaction. A commercial kit based upon these protocols was designed and compared to the method's standard approach. Reagent grade, secondary effluent, ...

  20. Multiprofessional electronic protocol in ophthalmology with emphasis on strabismus.

    PubMed

    Ribeiro, Christie Graf; Moreira, Ana Tereza Ramos; Pinto, José Simão DE Paula; Malafaia, Osvaldo

    2016-01-01

    To create and validate an electronic database in ophthalmology focused on strabismus, to computerize this database in the form of systematic data collection software named the Electronic Protocol, and to incorporate this protocol into the Integrated System of Electronic Protocols (SINPE(c)). This is a descriptive study, with the methodology divided into three phases: (1) development of a theoretical ophthalmologic database with emphasis on strabismus; (2) computerization of this theoretical database using SINPE(c); and (3) interpretation of the information, with demonstration of results, to validate the protocol. We entered data from the charts of fifty patients with known strabismus through the Electronic Protocol for testing and validation. The new electronic protocol was able to store information regarding patient history, physical examination, laboratory exams, imaging results, diagnosis, and treatment of patients with ophthalmologic diseases, with emphasis on strabismus. We included 2,141 items in this master protocol and created 20 new specific electronic protocols for strabismus, each with its own specifics. Validation was achieved through correlation and corroboration of the symptoms and confirmed diagnoses of the fifty included patients with the diagnostic criteria of the twenty new strabismus protocols. A new, validated electronic database focusing on ophthalmology, with emphasis on strabismus, was successfully created through the standardized collection of information and computerization of the database using proprietary software. This protocol is ready for deployment to facilitate data collection, sorting, and application for practitioners and researchers in numerous specialties.

  1. Standardization of the Food Composition Database Used in the Latin American Nutrition and Health Study (ELANS).

    PubMed

    Kovalskys, Irina; Fisberg, Mauro; Gómez, Georgina; Rigotti, Attilio; Cortés, Lilia Yadira; Yépez, Martha Cecilia; Pareja, Rossina G; Herrera-Cuenca, Marianella; Zimberg, Ioná Z; Tucker, Katherine L; Koletzko, Berthold; Pratt, Michael

    2015-09-16

    Between-country comparisons of estimated dietary intake are particularly prone to error when different food composition tables are used. The objective of this study was to describe our procedures and rationale for the selection and adaptation of available food composition data into a single database to enable cross-country nutritional intake comparisons. The Latin American Study of Nutrition and Health (ELANS) is a multicenter cross-sectional study of representative samples from eight Latin American countries. A standard study protocol was designed to investigate the dietary intake of the 9000 enrolled participants. Two 24-h recalls using the Multiple Pass Method were administered to individuals in all countries. Data from 24-h dietary recalls were entered into the Nutrition Data System for Research (NDS-R) program after a harmonization process between countries to include local foods and appropriately adapt the NDS-R database. A standardized food-matching procedure, establishing nutritional equivalency between local foods reported by study participants and foods available in the NDS-R database, was strictly conducted by each country. Standardization of food and nutrient assessments has the potential to minimize systematic and random errors in nutrient intake estimations in the ELANS project. This study is expected to result in a unique dataset for Latin America, enabling cross-country comparisons of energy, macro- and micro-nutrient intake within this region.
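    The food-matching step amounts to mapping each locally reported food onto its nutritional equivalent in the reference database before nutrients are looked up. A minimal sketch; the food names and kcal values are invented, not NDS-R entries:

```python
# Hypothetical local-food synonym table and reference nutrient table
local_to_reference = {
    "arepa": "corn flatbread",
    "pan de queso": "cheese bread",
}
reference_kcal_per_100g = {
    "corn flatbread": 219,
    "cheese bread": 300,
}

def energy_per_100g(reported_food):
    """Resolve a reported food to its reference equivalent, then look it up."""
    ref = local_to_reference.get(reported_food, reported_food)
    return reference_kcal_per_100g.get(ref)
```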

  2. Standardization of the Food Composition Database Used in the Latin American Nutrition and Health Study (ELANS)

    PubMed Central

    Kovalskys, Irina; Fisberg, Mauro; Gómez, Georgina; Rigotti, Attilio; Cortés, Lilia Yadira; Yépez, Martha Cecilia; Pareja, Rossina G.; Herrera-Cuenca, Marianella; Zimberg, Ioná Z.; Tucker, Katherine L.; Koletzko, Berthold; Pratt, Michael

    2015-01-01

    Between-country comparisons of estimated dietary intake are particularly prone to error when different food composition tables are used. The objective of this study was to describe our procedures and rationale for the selection and adaptation of available food composition data into a single database to enable cross-country nutritional intake comparisons. The Latin American Study of Nutrition and Health (ELANS) is a multicenter cross-sectional study of representative samples from eight Latin American countries. A standard study protocol was designed to investigate the dietary intake of the 9000 enrolled participants. Two 24-h recalls using the Multiple Pass Method were administered to individuals in all countries. Data from 24-h dietary recalls were entered into the Nutrition Data System for Research (NDS-R) program after a harmonization process between countries to include local foods and appropriately adapt the NDS-R database. A standardized food-matching procedure, establishing nutritional equivalency between local foods reported by study participants and foods available in the NDS-R database, was strictly conducted by each country. Standardization of food and nutrient assessments has the potential to minimize systematic and random errors in nutrient intake estimations in the ELANS project. This study is expected to result in a unique dataset for Latin America, enabling cross-country comparisons of energy, macro- and micro-nutrient intake within this region. PMID:26389952

  3. Seaworthy Quantum Key Distribution Design and Validation (SEAKEY)

    DTIC Science & Technology

    2014-10-30

    to single photon detection, at comparable detection efficiencies. On the other hand, error-correction codes are better developed for small-alphabet...protocol is several orders of magnitude better than the Shapiro protocol, which needs entangled states. The bits/mode performance achieved by our...putting together a software tool implemented in MATLAB, which talks to the MODTRAN database via an intermediate numerical dump of transmission data

  4. Crystallography Open Database (COD): an open-access collection of crystal structures and platform for world-wide collaboration

    PubMed Central

    Gražulis, Saulius; Daškevič, Adriana; Merkys, Andrius; Chateigner, Daniel; Lutterotti, Luca; Quirós, Miguel; Serebryanaya, Nadezhda R.; Moeck, Peter; Downs, Robert T.; Le Bail, Armel

    2012-01-01

    Using an open-access distribution model, the Crystallography Open Database (COD, http://www.crystallography.net) collects all known ‘small molecule / small to medium sized unit cell’ crystal structures and makes them available freely on the Internet. As of today, the COD has aggregated ∼150 000 structures, offering basic search capabilities and the possibility to download the whole database, or parts thereof using a variety of standard open communication protocols. A newly developed website provides capabilities for all registered users to deposit published and so far unpublished structures as personal communications or pre-publication depositions. Such a setup enables extension of the COD database by many users simultaneously. This increases the possibilities for growth of the COD database, and is the first step towards establishing a world wide Internet-based collaborative platform dedicated to the collection and curation of structural knowledge. PMID:22070882

  5. TU-G-BRD-04: A Round Robin Dosimetry Intercomparison of Gamma Stereotactic Radiosurgery Calibration Protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drzymala, R; Alvarez, P; Bednarz, G

    2015-06-15

    Purpose: The purpose of this multi-institutional study was to compare two new gamma stereotactic radiosurgery (GSRS) dosimetry protocols to existing calibration methods. The ultimate goal was to guide AAPM Task Group 178 in recommending a standard GSRS dosimetry protocol. Methods: Nine centers (ten GSRS units) participated in the study. Each institution made eight sets of dose rate measurements: six with two different ionization chambers in three different 160mm-diameter spherical phantoms (ABS plastic, Solid Water and liquid water), and two using the same ionization chambers with a custom in-air positioning jig. Absolute dose rates were calculated using a newly proposed formalism by the IAEA working group for small and non-standard radiation fields and with a new air-kerma based protocol. The new IAEA protocol requires an in-water ionization chamber calibration and uses previously reported Monte-Carlo generated factors to account for the material composition of the phantom, the type of ionization chamber, and the unique GSRS beam configuration. Results obtained with the new dose calibration protocols were compared to dose rates determined by the AAPM TG-21 and TG-51 protocols, with TG-21 considered as the standard. Results: Averaged over all institutions, ionization chambers and phantoms, the mean dose rate determined with the new IAEA protocol relative to that determined with TG-21 in the ABS phantom was 1.000 with a standard deviation of 0.008. For TG-51, the average ratio was 0.991 with a standard deviation of 0.013, and for the new in-air formalism it was 1.008 with a standard deviation of 0.012. Conclusion: Average results with both of the new protocols agreed with TG-21 to within one standard deviation. TG-51, which does not take into account the unique GSRS beam configuration or phantom material, was not expected to perform as well as the new protocols. The new IAEA protocol showed remarkably good agreement with TG-21.
    Conflict of Interests: Paula Petti, Josef Novotny, Gennady Neyman and Steve Goetsch are consultants for Elekta Instrument A/B; Elekta Instrument AB, PTW Freiburg GmbH, Standard Imaging, Inc., and The Phantom Laboratory, Inc. loaned equipment for use in these experiments; The University of Wisconsin Accredited Dosimetry Calibration Laboratory provided calibration services.
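    The headline comparison (mean protocol-to-TG-21 dose-rate ratio and its standard deviation across measurements) is a plain summary statistic. A sketch with invented per-measurement ratios:

```python
import math

def mean_sd(values):
    """Sample mean and (n-1)-denominator standard deviation."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / (len(values) - 1)
    return m, math.sqrt(var)

# Hypothetical ratios of new-protocol to TG-21 dose rates
ratios = [0.992, 1.004, 1.001, 0.998, 1.006, 0.999]
m, sd = mean_sd(ratios)
agrees = abs(m - 1.0) <= sd      # "agreement within one standard deviation"
```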

  6. Device Oriented Project Controller

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dalesio, Leo; Kraimer, Martin

    2013-11-20

    This proposal is directed at the issue of developing control systems for very large HEP projects. A de-facto standard in accelerator control is the Experimental Physics and Industrial Control System (EPICS), which has been applied successfully to many physics projects. EPICS is a channel based system that requires that each channel of each device be configured and controlled. In Phase I, the feasibility of a device oriented extension to the distributed channel database was demonstrated by prototyping a device-aware version of an EPICS I/O controller that functions with the current version of the channel access communication protocol. Extensions have been made to the grammar to define the database. Only a multi-stage position controller with limit switches was developed in the demonstration, but the grammar should support a full range of functional record types. In Phase II, a full set of record types will be developed to support all existing record types, a set of process control functions for closed loop control, and support for experimental beam line control. A tool to configure these records will be developed. A communication protocol will be developed or extensions will be made to Channel Access to support introspection of components of a device. Performance benchmarks will be made on both the communication protocol and the database. After these records and performance tests are under way, a second version of the grammar will be undertaken.
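    The channel-versus-device distinction described here can be illustrated by grouping the channels of one instrument behind a single record. A sketch in Python rather than the EPICS database grammar; the class and field names are invented:

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Device:
    """One device record aggregating channels EPICS would expose one by one."""
    name: str
    channels: Dict[str, float] = field(default_factory=dict)

    def put(self, channel: str, value: float) -> None:
        self.channels[channel] = value

    def get(self, channel: str) -> float:
        return self.channels[channel]

# A multi-stage positioner with limit switches, addressed as one device
positioner = Device("stage1")
positioner.put("setpoint", 12.5)
positioner.put("limit_hi", 0.0)
```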

  7. Towards a five-minute comprehensive cardiac MR examination using highly accelerated parallel imaging with a 32-element coil array: feasibility and initial comparative evaluation.

    PubMed

    Xu, Jian; Kim, Daniel; Otazo, Ricardo; Srichai, Monvadi B; Lim, Ruth P; Axel, Leon; McGorty, Kelly Anne; Niendorf, Thoralf; Sodickson, Daniel K

    2013-07-01

    To evaluate the feasibility and perform initial comparative evaluations of a 5-minute comprehensive whole-heart magnetic resonance imaging (MRI) protocol with four image acquisition types: perfusion (PERF), function (CINE), coronary artery imaging (CAI), and late gadolinium enhancement (LGE). This study protocol was Health Insurance Portability and Accountability Act (HIPAA)-compliant and Institutional Review Board-approved. A 5-minute comprehensive whole-heart MRI examination protocol (Accelerated) using 6-8-fold-accelerated volumetric parallel imaging was incorporated into and compared with a standard 2D clinical routine protocol (Standard). Following informed consent, 20 patients were imaged with both protocols. Datasets were reviewed for image quality using a 5-point Likert scale (0 = non-diagnostic, 4 = excellent) in blinded fashion by two readers. Good image quality with full whole-heart coverage was achieved using the accelerated protocol, particularly for CAI, although significant degradations in quality, as compared with traditional lengthy examinations, were observed for the other image types. Mean total scan time was significantly lower for the Accelerated than for the Standard protocol (1.82 ± 0.05 min vs. 28.99 ± 4.59 min, P < 0.05). Overall image quality for the Standard vs. Accelerated protocol was 3.67 ± 0.29 vs. 1.5 ± 0.51 (P < 0.005) for PERF, 3.48 ± 0.64 vs. 2.6 ± 0.68 (P < 0.005) for CINE, 2.35 ± 1.01 vs. 2.48 ± 0.68 (P = 0.75) for CAI, and 3.67 ± 0.42 vs. 2.67 ± 0.84 (P < 0.005) for LGE. Diagnostic image quality for Standard vs. Accelerated protocols was 20/20 (100%) vs. 10/20 (50%) for PERF, 20/20 (100%) vs. 18/20 (90%) for CINE, 18/20 (90%) vs. 18/20 (90%) for CAI, and 20/20 (100%) vs. 18/20 (90%) for LGE. 
This study demonstrates the technical feasibility and promising image quality of 5-minute comprehensive whole-heart cardiac examinations, with simplified scan prescription and high spatial and temporal resolution enabled by highly parallel imaging technology. The study also highlights technical hurdles that remain to be addressed. Although image quality remained diagnostic for most scan types, the reduced image quality of PERF, CINE, and LGE scans in the Accelerated protocol remain a concern. Copyright © 2012 Wiley Periodicals, Inc.

  8. The role of MRI in axillary lymph node imaging in breast cancer patients: a systematic review.

    PubMed

    Kuijs, V J L; Moossdorff, M; Schipper, R J; Beets-Tan, R G H; Heuts, E M; Keymeulen, K B M I; Smidt, M L; Lobbes, M B I

    2015-04-01

    To assess whether MRI can exclude axillary lymph node metastasis, potentially replacing sentinel lymph node biopsy (SLNB), and consequently eliminating the risk of SLNB-associated morbidity. PubMed, Cochrane, Medline and Embase databases were searched for relevant publications up to July 2014. Studies were selected based on predefined inclusion and exclusion criteria and independently assessed by two reviewers using a standardised extraction form. Sixteen eligible studies were selected from 1,372 publications identified by the search. A dedicated axillary protocol [sensitivity 84.7 %, negative predictive value (NPV) 95.0 %] was superior to a standard protocol covering both the breast and axilla simultaneously (sensitivity 82.0 %, NPV 82.6 %). Dynamic, contrast-enhanced MRI had a lower median sensitivity (60.0 %) and NPV (80.0 %) compared to non-enhanced T1w/T2w sequences (88.4 %, 94.7 %), diffusion-weighted imaging (84.2 %, 90.6 %) and ultrasmall superparamagnetic iron oxide (USPIO)-enhanced T2*w sequences (83.0 %, 95.9 %). The most promising results seem to be achievable when using non-enhanced T1w/T2w and USPIO-enhanced T2*w sequences in combination with a dedicated axillary protocol (sensitivity 84.7 % and NPV 95.0 %). The diagnostic performance of some MRI protocols for excluding axillary lymph node metastases approaches the NPV needed to replace SLNB. However, current observations are based on studies with heterogeneous study designs and limited populations. • Some axillary MRI protocols approach the NPV of an SLNB procedure. • Dedicated axillary MRI is more accurate than protocols also covering the breast. • T1w/T2w protocols combined with USPIO-enhanced sequences are the most promising sequences.
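The claim that some protocols "approach the NPV needed to replace SLNB" follows from standard diagnostic-test algebra: NPV depends on sensitivity, specificity, and the prevalence of nodal metastasis in the imaged population. A minimal sketch (the prevalence value in the usage example is an illustrative assumption, not a figure from the review):

```python
def npv(sensitivity, specificity, prevalence):
    """Negative predictive value of a test, given its sensitivity and
    specificity and the disease prevalence in the tested population.
    NPV = TN / (TN + FN) expressed in population fractions."""
    tn = specificity * (1.0 - prevalence)          # true-negative fraction
    fn = (1.0 - sensitivity) * prevalence          # false-negative fraction
    return tn / (tn + fn)

# Illustrative use: a protocol with 84.7% sensitivity and 90% specificity
# (specificity and prevalence assumed here) in a population with 30% nodal
# involvement.
example_npv = npv(0.847, 0.90, 0.30)
```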

  9. Normalization of cortical thickness measurements across different T1 magnetic resonance imaging protocols by novel W-Score standardization.

    PubMed

    Chung, Jinyong; Yoo, Kwangsun; Lee, Peter; Kim, Chan Mi; Roh, Jee Hoon; Park, Ji Eun; Kim, Sang Joon; Seo, Sang Won; Shin, Jeong-Hyeon; Seong, Joon-Kyung; Jeong, Yong

    2017-10-01

    The use of different 3D T1-weighted magnetic resonance (T1 MR) imaging protocols induces image incompatibility across multicenter studies, negating the many advantages of multicenter studies. A few methods have been developed to address this problem, but significant image incompatibility still remains. Thus, we developed a novel and convenient method to improve image compatibility. W-score standardization creates quality reference values by using a healthy group to obtain normalized disease values. We developed a protocol-specific w-score standardization to control the protocol effect, which is applied to each protocol separately. We used three data sets. In dataset 1, brain T1 MR images of normal controls (NC) and patients with Alzheimer's disease (AD) from two centers, acquired with different T1 MR protocols, were used (Protocol 1 and 2, n = 45/group). In dataset 2, data from six subjects, who underwent MRI with two different protocols (Protocol 1 and 2), were used with different repetition times, echo times, and slice thicknesses. In dataset 3, T1 MR images from a large number of healthy normal controls (Protocol 1: n = 148, Protocol 2: n = 343) were collected for w-score standardization. The protocol effect and disease effect on subjects' cortical thickness were analyzed before and after the application of protocol-specific w-score standardization. As expected, different protocols resulted in differing cortical thickness measurements in both NC and AD subjects. Different measurements were obtained for the same subject when imaged with different protocols. Multivariate pattern difference between measurements was observed between the protocols. Classification accuracy between two protocols was nearly 90%. After applying protocol-specific w-score standardization, the differences between the protocols substantially decreased. 
Most importantly, protocol-specific w-score standardization reduced both univariate and multivariate differences in the images while maintaining the AD disease effect. Compared to conventional regression methods, our method showed the best performance in terms of controlling the protocol effect while preserving disease information. Protocol-specific w-score standardization effectively resolved the concerns of conventional regression methods. It showed the best performance for improving the compatibility of a T1 MR post-processed feature, cortical thickness. Copyright © 2017 Elsevier Inc. All rights reserved.
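The protocol-specific w-score idea above can be sketched in a few lines: fit a covariate model on the healthy controls scanned with one protocol, then express each subject's cortical thickness as a standardized residual of that model. The linear model and covariate choice here are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np

def wscore(raw, covariates, ref_raw, ref_covariates):
    """Protocol-specific w-score: regress the measure (e.g. cortical
    thickness) on covariates (e.g. age) in the healthy reference group of
    ONE protocol, then standardize each subject's residual by the
    reference residual SD. Applied separately per protocol."""
    X = np.column_stack([np.ones(len(ref_covariates)), ref_covariates])
    beta, *_ = np.linalg.lstsq(X, ref_raw, rcond=None)
    resid_sd = np.std(ref_raw - X @ beta, ddof=X.shape[1])
    Xs = np.column_stack([np.ones(len(covariates)), covariates])
    return (raw - Xs @ beta) / resid_sd
```

A subject whose thickness lies exactly on the reference regression line gets a w-score of 0, regardless of which protocol's reference group was used; this is what makes w-scores comparable across protocols.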

  10. Evaluation of Vitamin D Standardization Program protocols for standardizing serum 25-hydroxyvitamin D data: a case study of the program's potential for national nutrition and health surveys

    PubMed Central

    Cashman, Kevin D; Kiely, Mairead; Kinsella, Michael; Durazo-Arvizu, Ramón A; Tian, Lu; Zhang, Yue; Lucey, Alice; Flynn, Albert; Gibney, Michael J; Vesper, Hubert W; Phinney, Karen W; Coates, Paul M; Picciano, Mary F; Sempos, Christopher T

    2013-01-01

    Background: The Vitamin D Standardization Program (VDSP) has developed protocols for standardizing procedures of 25-hydroxyvitamin D [25(OH)D] measurement in National Health/Nutrition Surveys to promote 25(OH)D measurements that are accurate and comparable over time, location, and laboratory procedure to improve public health practice. Objective: We applied VDSP protocols to existing ELISA-derived serum 25(OH)D data from the Irish National Adult Nutrition Survey (NANS) as a case-study survey and evaluated their effectiveness by comparison of the protocol-projected estimates with those from a reanalysis of survey serums by using liquid chromatography–tandem mass spectrometry (LC–tandem MS). Design: The VDSP reference system and protocols were applied to ELISA-based serum 25(OH)D data from the representative NANS sample (n = 1118). A reanalysis of 99 stored serums by using standardized LC–tandem MS and resulting regression equations yielded predicted standardized serum 25(OH)D values, which were then compared with LC–tandem MS reanalyzed values for all serums. Results: Year-round prevalence rates for serum 25(OH)D concentrations <30, <40, and <50 nmol/L were 6.5%, 21.9%, and 40.0%, respectively, via original ELISA measurements and 11.4%, 25.3%, and 43.7%, respectively, when VDSP protocols were applied. Differences in estimates at <30- and <40-nmol/L thresholds, but not at the <50-nmol/L threshold, were significant (P < 0.05). A reanalysis of all serums by using LC–tandem MS confirmed prevalence estimates as 11.2%, 27.2%, and 45.0%, respectively. Prevalences of serum 25(OH)D concentrations >125 nmol/L were 1.2%, 0.3%, and 0.6% by means of ELISA, VDSP protocols, and LC–tandem MS, respectively. Conclusion: VDSP protocols hold a major potential for national nutrition and health surveys in terms of the standardization of serum 25(OH)D data. PMID:23615829
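The VDSP-style standardization used above amounts to a calibration regression from the original ELISA values to LC–tandem MS-equivalent values (fitted on the re-assayed subset), followed by prevalence estimation at fixed thresholds. A hedged sketch; the slope and intercept are hypothetical placeholders, not the survey's fitted coefficients:

```python
import numpy as np

def standardize(elisa_values, slope, intercept):
    """Predict standardized (LC-MS/MS-equivalent) 25(OH)D from ELISA
    measurements via a calibration regression fitted on re-assayed
    serums. slope/intercept are placeholders for illustration."""
    return intercept + slope * np.asarray(elisa_values, dtype=float)

def prevalence_below(values, thresholds=(30, 40, 50)):
    """Fraction of the sample below each serum 25(OH)D threshold
    (nmol/L), as in the survey's prevalence estimates."""
    v = np.asarray(values, dtype=float)
    return {t: float(np.mean(v < t)) for t in thresholds}
```

Applying `prevalence_below` before and after `standardize` reproduces the kind of before/after comparison reported in the Results.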

  11. An approach to standardization of urine sediment analysis via suggestion of a common manual protocol.

    PubMed

    Ko, Dae-Hyun; Ji, Misuk; Kim, Sollip; Cho, Eun-Jung; Lee, Woochang; Yun, Yeo-Min; Chun, Sail; Min, Won-Ki

    2016-01-01

    The results of urine sediment analysis have been reported semiquantitatively. However, as recent guidelines recommend quantitative reporting of urine sediment, and with the development of automated urine sediment analyzers, there is an increasing need for quantitative analysis of urine sediment. Here, we developed a protocol for urine sediment analysis and quantified the results. Based on questionnaires, various reports, guidelines, and experimental results, we developed a protocol for urine sediment analysis. The results of this new protocol were compared with those obtained with a standardized chamber and an automated sediment analyzer. Reference intervals were also estimated using the new protocol. We developed a protocol with centrifugation at 400 g for 5 min, with an average concentration factor of 30. The correlations between the quantitative results of the new protocol, the standardized chamber, and the automated sediment analyzer were generally good. The conversion factor derived from the new protocol showed a better fit with the results of manual counting than the default conversion factor in the automated sediment analyzer. We developed a protocol for manual urine sediment analysis to quantitatively report the results. This protocol may provide a means for the standardization of urine sediment analysis.
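The quantitative reporting described above hinges on the concentration factor (initial urine volume divided by the sediment volume left after decanting; 30 on average in this protocol). A minimal sketch of the back-conversion from a chamber count on concentrated sediment to cells per microliter of the original urine; the chamber volume is an illustrative assumption:

```python
def cells_per_ul_urine(cells_counted, chamber_volume_ul, concentration_factor=30):
    """Convert a chamber count performed on concentrated sediment back to
    cells per microliter of uncentrifuged urine. The default concentration
    factor of 30 matches the protocol in the abstract; the chamber volume
    argument is an assumption for illustration."""
    per_ul_sediment = cells_counted / chamber_volume_ul
    return per_ul_sediment / concentration_factor
```

For example, 300 cells counted in 1 µL of 30-fold-concentrated sediment correspond to 10 cells/µL of native urine.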

  12. TCP Performance Enhancement Over Iridium

    NASA Technical Reports Server (NTRS)

    Torgerson, Leigh; Hutcherson, Joseph; McKelvey, James

    2007-01-01

    In support of iNET maturation, NASA-JPL has collaborated with NASA-Dryden to develop, test and demonstrate an over-the-horizon vehicle-to-ground networking capability, using Iridium as the vehicle-to-ground communications link for relaying critical vehicle telemetry. To ensure reliability concerns are met, the Space Communications Protocol Standards (SCPS) transport protocol was investigated for its performance characteristics in this environment. In particular, the SCPS-TP software performance was compared to that of the standard Transmission Control Protocol (TCP) over the Internet Protocol (IP). This paper will report on the results of this work.

  13. Comparing Short Dental Implants to Standard Dental Implants: Protocol for a Systematic Review.

    PubMed

    Rokn, Amir Reza; Keshtkar, Abbasali; Monzavi, Abbas; Hashemi, Kazem; Bitaraf, Tahereh

    2018-01-18

    Short dental implants have been proposed as a simpler, cheaper, and faster alternative for the rehabilitation of atrophic edentulous areas to avoid the disadvantages of surgical techniques for increasing bone volume. This review will compare short implants (4 to 8 mm) to standard implants (larger than 8 mm) in edentulous jaws, evaluating them on the basis of marginal bone loss (MBL), survival rate, complications, and prosthesis failure. We will electronically search for randomized controlled trials comparing short dental implants to standard dental implants in the following databases: PubMed, Web of Science, EMBASE, Scopus, the Cochrane Central Register of Controlled Trials, and ClinicalTrials.gov with English language restrictions. We will manually search the reference lists of relevant reviews and the included articles in this review. The following journals will also be searched: European Journal of Oral Implantology, Clinical Oral Implants Research, and Clinical Implant Dentistry and Related Research. Two reviewers will independently perform the study selection, data extraction and quality assessment (using the Cochrane Collaboration tool) of included studies. All meta-analysis procedures, including effect size combination, sub-group analysis, meta-regression, and assessment of publication or reporting bias, will be performed using Stata (StataCorp, Texas) version 12.1. Short implant effectiveness will be assessed using the mean difference of MBL in terms of weighted mean difference (WMD) and standardized mean difference (SMD) using Cohen's method. The combined effect size measures in addition to the related 95% confidence intervals will be estimated by a fixed effect model. The heterogeneity of the related effect size will be assessed using a Q Cochrane test and I2 measure. The MBL will be presented by a standardized mean difference with a 95% confidence interval. 
The survival rate of implants, prosthesis failures, and complications will be reported using risk ratios with 95% confidence intervals (P<.05). The present protocol illustrates an appropriate method to perform the systematic review and ensures transparency for the completed review. The results will be published in a peer-reviewed journal and social networks. In addition, ethics approval is not considered necessary. PROSPERO registration number: CRD42016048363; https://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42016048363 (Archived by WebCite at http://www.webcitation.org/6wZ7Fntry). ©Amir Reza Rokn, Abbasali Keshtkar, Abbas Monzavi, Kazem Hashemi, Tahereh Bitaraf. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 18.01.2018.
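The planned analysis (inverse-variance fixed-effect pooling of SMDs, a Cochran Q test, and the I² measure) can be sketched directly from the standard definitions; this is illustrative, not the authors' Stata code:

```python
import math

def fixed_effect(effects, variances):
    """Inverse-variance fixed-effect pooling of effect sizes (e.g. SMDs),
    returning the pooled estimate, its 95% CI, Cochran's Q statistic,
    and the I^2 heterogeneity percentage."""
    w = [1.0 / v for v in variances]               # inverse-variance weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))                   # SE of the pooled estimate
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, ci, q, i2
```

Identical study effects give Q = 0 and I² = 0%; widely divergent effects drive I² toward 100%, which is the heterogeneity check the protocol describes.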

  14. Determining contrast medium dose and rate on basis of lean body weight: does this strategy improve patient-to-patient uniformity of hepatic enhancement during multi-detector row CT?

    PubMed

    Ho, Lisa M; Nelson, Rendon C; Delong, David M

    2007-05-01

    To prospectively evaluate the use of lean body weight (LBW) as the main determinant of the volume and rate of contrast material administration during multi-detector row computed tomography of the liver. This HIPAA-compliant study had institutional review board approval. All patients gave written informed consent. Four protocols were compared. Standard protocol involved 125 mL of iopamidol injected at 4 mL/sec. Total body weight (TBW) protocol involved 0.7 g iodine per kilogram of TBW. Calculated LBW and measured LBW protocols involved 0.86 g of iodine per kilogram and 0.92 g of iodine per kilogram calculated or measured LBW for men and women, respectively. Injection rate used for the three experimental protocols was determined proportionally on the basis of the calculated volume of contrast material. Postcontrast attenuation measurements during portal venous phase were obtained in liver, portal vein, and aorta for each group and were summed for each patient. Patient-to-patient enhancement variability in same group was measured with Levene test. Two-tailed t test was used to compare the three experimental protocols with the standard protocol. Data analysis was performed in 101 patients (25 or 26 patients per group), including 56 men and 45 women (mean age, 53 years). Average summed attenuation values for standard, TBW, calculated LBW, and measured LBW protocols were 419 HU +/- 50 (standard deviation), 443 HU +/- 51, 433 HU +/- 50, and 426 HU +/- 33, respectively (P = not significant for all). Levene test results for summed attenuation data for standard, TBW, calculated LBW, and measured LBW protocols were 40 +/- 29, 38 +/- 33 (P = .83), 35 +/- 35 (P = .56), and 26 +/- 19 (P = .05), respectively. By excluding highly variable but poorly perfused adipose tissue from calculation of contrast medium dose, the measured LBW protocol may lessen patient-to-patient enhancement variability while maintaining satisfactory hepatic and vascular enhancement.
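The dosing rules compared in this study reduce to simple per-kilogram arithmetic using the iodine factors stated in the abstract (0.7 g I/kg TBW; 0.86 and 0.92 g I/kg LBW for men and women, respectively). A sketch; the function name and interface are illustrative, not from the paper:

```python
def contrast_dose_grams_iodine(weight_kg, protocol, sex=None):
    """Iodine dose (grams) per the abstract's weight-based protocols.
    `weight_kg` is total body weight for "TBW" and lean body weight
    (calculated or measured) for "LBW"."""
    if protocol == "TBW":
        return 0.7 * weight_kg
    if protocol == "LBW":
        factor = 0.86 if sex == "M" else 0.92  # men vs. women, per abstract
        return factor * weight_kg
    raise ValueError("unknown protocol: " + repr(protocol))
```

The injection rate in the experimental arms then scales proportionally with the computed contrast volume, as the abstract notes.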

  15. Methodological challenges in international performance measurement using patient-level administrative data.

    PubMed

    Kiivet, Raul; Sund, Reijo; Linna, Miika; Silverman, Barbara; Pisarev, Heti; Friedman, Nurit

    2013-09-01

    We conducted this case study in order to test how health system performance could be compared using existing national administrative health databases containing individual-level data. In this comparative analysis we used national data sets from three countries (Estonia, Israel and Finland) to follow the medical history, treatment outcomes and resource use of patients with a chronic disease (diabetes) for 8 years after medical treatment was initiated. This study showed that several clinically important aspects of quality of care, as well as health policy issues of cost-effectiveness and efficiency of health systems, can be assessed using national administrative health data systems, provided that they collect person-level health service data. We developed a structured study protocol and detailed data specifications to generate standardized data sets in each country for long-term follow-up of an incident cohort of diabetic persons, as well as shared analysis programs to produce performance measures from the standardized data sets. This stepwise, decentralized approach and the use of anonymous person-level data allowed us to mitigate legal, ownership, confidentiality and privacy concerns and to create internationally comparable data with a level of detail seldom achieved before. For example, our preliminary performance comparisons indicate that higher mortality among relatively young diabetes patients in Estonia may be related to considerably higher rates of cardiovascular complications and lower use of statins. Modern administrative person-level health service databases contain data sufficiently rich in detail to assess the performance of health systems in the management of chronic diseases. This paper presents and discusses the methodological challenges and the way the problems were solved or avoided to enhance the representativeness and comparability of results. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  16. SLOWLY REPEATED EVOKED PAIN (SREP) AS A MARKER OF CENTRAL SENSITIZATION IN FIBROMYALGIA: DIAGNOSTIC ACCURACY AND RELIABILITY IN COMPARISON WITH TEMPORAL SUMMATION OF PAIN.

    PubMed

    de la Coba, Pablo; Bruehl, Stephen; Gálvez-Sánchez, Carmen María; Reyes Del Paso, Gustavo A

    2018-05-01

    This study examined the diagnostic accuracy and test-retest reliability of a novel dynamic evoked pain protocol (slowly repeated evoked pain; SREP) compared to temporal summation of pain (TSP), a standard index of central sensitization. Thirty-five fibromyalgia (FM) and 30 rheumatoid arthritis (RA) patients completed, in pseudorandomized order, a standard mechanical TSP protocol (10 stimuli of 1s duration at the thenar eminence using a 300g monofilament with 1s interstimulus interval) and the SREP protocol (9 suprathreshold pressure stimuli of 5s duration applied to the fingernail with a 30s interstimulus interval). In order to evaluate reliability for both protocols, they were repeated in a second session 4-7 days later. Evidence for significant pain sensitization over trials (increasing pain intensity ratings) was observed for SREP in FM (p<.001) but not in RA (p=.35), whereas significant sensitization was observed in both diagnostic groups for the TSP protocol (p's<.008). Compared to TSP, SREP demonstrated higher overall diagnostic accuracy (87.7% vs. 64.6%), greater sensitivity (0.89 vs. 0.57), and greater specificity (0.87 vs. 0.73) in discriminating between FM and RA patients. Test-retest reliability of SREP sensitization was good in FM (ICC: 0.80) and moderate in RA (ICC: 0.68). SREP seems to be a dynamic evoked pain index tapping into pain sensitization that allows for greater diagnostic accuracy in identifying FM patients compared to a standard TSP protocol. Further research is needed to study mechanisms underlying SREP and the potential utility of adding SREP to standard pain evaluation protocols.
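The accuracy figures above follow from a standard 2×2 classification table with FM as the positive class and RA as the negative class. A sketch with counts chosen to be consistent with the reported rates (the exact cell counts are an assumption, since the paper reports only the derived rates):

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, and overall accuracy from a 2x2 table.
    Here: tp/fn = FM patients correctly/incorrectly classified,
    tn/fp = RA patients correctly/incorrectly classified."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fn + tn + fp)
    return sens, spec, acc

# Hypothetical counts for 35 FM and 30 RA patients that reproduce the
# reported SREP figures (sensitivity ~0.89, specificity ~0.87, accuracy
# ~87.7%): 31 true positives, 4 false negatives, 26 true negatives,
# 4 false positives.
sens, spec, acc = diagnostic_metrics(31, 4, 26, 4)
```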

  17. Evaluation of a Modified Pamidronate Protocol for the Treatment of Osteogenesis Imperfecta.

    PubMed

    Palomo, Telma; Andrade, Maria C; Peters, Barbara S E; Reis, Fernanda A; Carvalhaes, João Tomás A; Glorieux, Francis H; Rauch, Frank; Lazaretti-Castro, Marise

    2016-01-01

    Intravenous pamidronate is widely used to treat children with osteogenesis imperfecta (OI). In a well-studied protocol ('standard protocol'), pamidronate is given at a daily dose of 1 mg per kg body weight over 4 h on 3 successive days; infusion cycles are repeated every 4 months. Here, we evaluated renal safety of a simpler protocol for intravenous pamidronate infusions (2 mg per kg body weight given in a single infusion over 2 h, repeated every 4 months; 'modified protocol'). Results of 18 patients with OI types I, III, or IV treated with the modified protocol for 12 months were compared to 18 historic controls, treated with standard protocol. In the modified protocol, mild transient post-infusion increases in serum creatinine were found during each infusion but after 12 months serum creatinine remained similar from baseline [0.40 mg/dl (SD: 0.13)] to the end of the study [0.41 mg/dl (SD: 0.11)] (P = 0.79). The two protocols led to similar changes in serum creatinine during the first pamidronate infusion [modified protocol: +2% (SD: 21%); standard protocol: -3% (SD: 8%); P = 0.32]. Areal lumbar spine bone mineral density Z-scores increased from -2.7 (SD: 1.5) to -1.8 (SD: 1.4) with the modified protocol, and from -4.1 (SD: 1.4) to -3.1 (SD: 1.1) with standard protocol (P = 0.68 for group differences in bone density Z-score changes). The modified pamidronate protocol is safe and may have similar effects on bone density as the standard pamidronate protocol. More studies are needed with longer follow-up to prove anti-fracture efficacy.

  18. Apples to apples: the origin and magnitude of differences in asbestos cancer risk estimates derived using varying protocols.

    PubMed

    Berman, D Wayne

    2011-08-01

    Given that new protocols for assessing asbestos-related cancer risk have recently been published, questions arise concerning how they compare to the "IRIS" protocol currently used by regulators. The newest protocols incorporate findings from 20 additional years of literature. Thus, differences between the IRIS and newer Berman and Crump protocols are examined to evaluate whether these protocols can be reconciled. Risks estimated by applying these protocols to real exposure data from both laboratory and field studies are also compared to assess the relative health protectiveness of each protocol. The reliability of the risks estimated using the two protocols is compared by evaluating the degree to which each potentially reproduces the known epidemiology study risks. Results indicate that the IRIS and Berman and Crump protocols can be reconciled; while environment-specific variation within fiber type is apparently due primarily to size effects (not addressed by IRIS), the 10-fold (average) difference between amphibole asbestos risks estimated using each protocol is attributable to an arbitrary selection of the lowest of available mesothelioma potency factors in the IRIS protocol. Thus, the IRIS protocol may substantially underestimate risk when exposure is primarily to amphibole asbestos. Moreover, while the Berman and Crump protocol is more reliable than the IRIS protocol overall (especially for predicting amphibole risk), evidence is presented suggesting a new fiber-size-related adjustment to the Berman and Crump protocol may ultimately succeed in reconciling the entire epidemiology database. However, additional data need to be developed before the performance of the adjusted protocol can be fully validated. © 2011 Society for Risk Analysis.

  19. Knowledge translation interventions for critically ill patients: a systematic review*.

    PubMed

    Sinuff, Tasnim; Muscedere, John; Adhikari, Neill K J; Stelfox, Henry T; Dodek, Peter; Heyland, Daren K; Rubenfeld, Gordon D; Cook, Deborah J; Pinto, Ruxandra; Manoharan, Venika; Currie, Jan; Cahill, Naomi; Friedrich, Jan O; Amaral, Andre; Piquette, Dominique; Scales, Damon C; Dhanani, Sonny; Garland, Allan

    2013-11-01

    We systematically reviewed ICU-based knowledge translation studies to assess the impact of knowledge translation interventions on processes and outcomes of care. We searched electronic databases (to July, 2010) without language restrictions and hand-searched reference lists of relevant studies and reviews. Two reviewers independently identified randomized controlled trials and observational studies comparing any ICU-based knowledge translation intervention (e.g., protocols, guidelines, and audit and feedback) to management without a knowledge translation intervention. We focused on clinical topics that were addressed in greater than or equal to five studies. Pairs of reviewers abstracted data on the clinical topic, knowledge translation intervention(s), process of care measures, and patient outcomes. For each individual or combination of knowledge translation intervention(s) addressed in greater than or equal to three studies, we summarized each study using median risk ratio for dichotomous and standardized mean difference for continuous process measures. We used random-effects models. Anticipating a small number of randomized controlled trials, our primary meta-analyses included randomized controlled trials and observational studies. In separate sensitivity analyses, we excluded randomized controlled trials and collapsed protocols, guidelines, and bundles into one category of intervention. We conducted meta-analyses for clinical outcomes (ICU and hospital mortality, ventilator-associated pneumonia, duration of mechanical ventilation, and ICU length of stay) related to interventions that were associated with improvements in processes of care. From 11,742 publications, we included 119 investigations (seven randomized controlled trials, 112 observational studies) on nine clinical topics. 
Interventions that included protocols with or without education improved continuous process measures (seven observational studies and one randomized controlled trial; standardized mean difference [95% CI]: 0.26 [0.1, 0.42]; p = 0.001 and four observational studies and one randomized controlled trial; 0.83 [0.37, 1.29]; p = 0.0004, respectively). Heterogeneity among studies within topics ranged from low to extreme. The exclusion of randomized controlled trials did not change our results. Single-intervention and lower-quality studies had higher standardized mean differences compared to multiple-intervention and higher-quality studies (p = 0.013 and 0.016, respectively). There were no associated improvements in clinical outcomes. Knowledge translation interventions in the ICU that include protocols with or without education are associated with the greatest improvements in processes of critical care.

  20. Private database queries based on counterfactual quantum key distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Jia-Li; Guo, Fen-Zhuo; Gao, Fei; Liu, Bin; Wen, Qiao-Yan

    2013-08-01

    Based on the fundamental concept of quantum counterfactuality, we propose a protocol to achieve quantum private database queries, which is a theoretical study of how counterfactuality can be employed beyond counterfactual quantum key distribution (QKD). By adding crucial detecting apparatus to the device of QKD, the privacy of both the distrustful user and the database owner can be guaranteed. Furthermore, the proposed private-database-query protocol makes full use of the low efficiency in the counterfactual QKD, and by adjusting the relevant parameters, the protocol obtains excellent flexibility and extensibility.

  1. Examining database persistence of ISO/EN 13606 standardized electronic health record extracts: relational vs. NoSQL approaches.

    PubMed

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Lozano-Rubí, Raimundo; Serrano-Balazote, Pablo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario

    2017-08-18

    The objective of this research is to compare the relational and non-relational (NoSQL) database systems approaches in order to store, recover, query and persist standardized medical information in the form of ISO/EN 13606 normalized Electronic Health Record XML extracts, both in isolation and concurrently. NoSQL database systems have recently attracted much attention, but few studies in the literature address their direct comparison with relational databases when applied to build the persistence layer of a standardized medical information system. One relational and two NoSQL databases (one document-based and one native XML database) of three different sizes have been created in order to evaluate and compare the response times (algorithmic complexity) of six queries of growing complexity, which have been performed on them. Similar appropriate results available in the literature have also been considered. Both the relational and the NoSQL database systems show almost linear query-execution complexity. However, the linear slopes differ greatly, that of the relational system being much steeper than those of the two NoSQL systems. Document-based NoSQL databases perform better in concurrency than in isolation, and also better than relational databases in concurrency. Non-relational NoSQL databases seem to be more appropriate than standard relational SQL databases when database size is extremely high (secondary use, research applications). Document-based NoSQL databases perform in general better than native XML NoSQL databases. Visualization and editing of EHR extracts are also document-based tasks better suited to NoSQL database systems. However, the appropriate database solution depends largely on each particular situation and specific problem.

  2. Dedicated dental volumetric and total body multislice computed tomography: a comparison of image quality and radiation dose

    NASA Astrophysics Data System (ADS)

    Strocchi, Sabina; Colli, Vittoria; Novario, Raffaele; Carrafiello, Gianpaolo; Giorgianni, Andrea; Macchi, Aldo; Fugazzola, Carlo; Conte, Leopoldo

    2007-03-01

    The aim of this work is to compare the performance of a Xoran Technologies i-CAT Cone Beam CT for dental applications with that of a standard total body multislice CT (Toshiba Aquilion 64 multislice) used for dental examinations. Image quality and doses to patients have been compared for the three main i-CAT protocols, the Toshiba standard protocol and a Toshiba modified protocol. Images of two phantoms have been acquired: a standard CT quality control phantom and an Alderson Rando ® anthropomorphic phantom. Image noise, Signal to Noise Ratio (SNR), Contrast to Noise Ratio (CNR) and geometric accuracy have been considered. Clinical image quality was assessed. Effective dose and doses to main head and neck organs were evaluated by means of thermoluminescent dosimeters (TLD-100) placed in the anthropomorphic phantom. A Quality Index (QI), defined as the ratio of squared CNR to effective dose, has been evaluated. The evaluated effective doses range from 0.06 mSv (i-CAT 10 s protocol) to 2.37 mSv (Toshiba standard protocol). The Toshiba modified protocol (halved tube current, higher pitch value) imparts a lower effective dose (0.99 mSv). The conventional CT device provides lower image noise and better SNR, but clinical effectiveness similar to that of the dedicated dental CT (comparable CNR and clinical judgment). Consequently, QI values are much higher for the dedicated dental scanner. No geometric distortion has been observed with either device. In conclusion, dental volumetric CT supplies image quality adequate for clinical purposes, at doses considerably lower than those imparted by a conventional CT device.
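The Quality Index defined above is a one-line computation, squared contrast-to-noise ratio per unit effective dose, which is why the low-dose dental scanner scores so much higher at comparable CNR (units as in the abstract):

```python
def quality_index(cnr, effective_dose_msv):
    """Quality Index as defined in the abstract: squared contrast-to-noise
    ratio divided by effective dose (mSv). Higher is better: equal CNR at
    lower dose yields a higher QI."""
    return cnr ** 2 / effective_dose_msv
```

For instance, at equal CNR, a scanner imparting 0.06 mSv scores roughly 40 times higher than one imparting 2.37 mSv.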

  3. Multisite Semiautomated Clinical Data Repository for Duplication 15q Syndrome: Study Protocol and Early Uses.

    PubMed

    Ajayi, Oluwaseun Jessica; Smith, Ebony Jeannae; Viangteeravat, Teeradache; Huang, Eunice Y; Nagisetty, Naga Satya V Rao; Urraca, Nora; Lusk, Laina; Finucane, Brenda; Arkilo, Dimitrios; Young, Jennifer; Jeste, Shafali; Thibert, Ronald; Reiter, Lawrence T

    2017-10-18

    Chromosome 15q11.2-q13.1 duplication syndrome (Dup15q syndrome) is a rare disorder caused by duplications of chromosome 15q11.2-q13.1, resulting in a wide range of developmental disabilities in affected individuals. The Dup15q Alliance is an organization that provides family support and promotes research to improve the quality of life of patients living with Dup15q syndrome. Because of the low prevalence of this condition, the establishment of a single research repository would have been difficult and more time consuming without collaboration across multiple institutions. The goal of this project is to establish a national deidentified database with clinical and survey information on individuals diagnosed with Dup15q syndrome. The development of a multiclinic site repository for clinical and survey data on individuals with Dup15q syndrome was initiated and supported by the Dup15q Alliance. Using collaborative workflows, communication protocols, and stakeholder engagement tools, a comprehensive database of patient-centered information was built. We successfully established a self-report-populated, centralized repository for Dup15q syndrome research. This repository also resulted in the development of standardized instruments that can be used for other studies relating to developmental disorders. Standardizing the data collection instruments allows us to integrate our data with other national databases, such as the National Database for Autism Research. A substantial portion of the data collected from the questionnaires was facilitated through direct engagement of participants and their families. This allowed for a more complete set of information to be collected with a minimal turnaround time. We developed a repository that can efficiently be mined for shared clinical phenotypes observed at multiple clinic sites and used as a springboard for future clinical and basic research studies. 
©Oluwaseun Jessica Ajayi, Ebony Jeannae Smith, Teeradache Viangteeravat, Eunice Y Huang, Naga Satya V Rao Nagisetty, Nora Urraca, Laina Lusk, Brenda Finucane, Dimitrios Arkilo, Jennifer Young, Shafali Jeste, Ronald Thibert, The Dup15q Alliance, Lawrence T Reiter. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 18.10.2017.

  4. The effects of lasers on bond strength to ceramic materials: A systematic review and meta-analysis

    PubMed Central

    García-Sanz, Verónica; Mendoza-Yero, Omel; Carbonell-Leal, Miguel; Albaladejo, Alberto; Montiel-Company, José María; Bellot-Arcís, Carlos

    2018-01-01

    Lasers have recently been introduced as an alternative means of conditioning dental ceramic surfaces in order to enhance their adhesive strength to cements and other materials. The present systematic review and meta-analysis aimed to review and quantitatively analyze the available literature in order to determine which bond protocols and laser types are the most effective. A search was conducted in the Pubmed, Embase and Scopus databases for papers published up to April 2017. PRISMA guidelines for systematic review and meta-analysis were followed. Fifty-two papers were eligible for inclusion in the review. Twenty-five studies were synthesized quantitatively. Lasers were found to increase bond strength of ceramic surfaces to resin cements and composites when compared with control specimens (p-value < 0.01), whereas no significant differences were found in comparison with air-particle abraded surfaces. High variability can be observed in adhesion values between different analyses, pointing to a need to standardize study protocols and to determine the optimal parameters for each laser type. PMID:29293633

  5. System for Configuring Modular Telemetry Transponders

    NASA Technical Reports Server (NTRS)

    Varnavas, Kosta A. (Inventor); Sims, William Herbert, III (Inventor)

    2014-01-01

    A system for configuring telemetry transponder cards uses a database of error checking protocol data structures, each containing data to implement at least one CCSDS protocol algorithm. Using a user interface, a user selects at least one telemetry specific error checking protocol from the database. A compiler configures an FPGA with the data from the data structures to implement the error checking protocol.
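
    A software analogue of this flow can be sketched as a registry of error-checking routines from which one is selected at configuration time. The registry layout and function names below are illustrative assumptions, not the patent's design; a CRC-16 of this form (polynomial 0x1021, initial value 0xFFFF) is shown because it is the kind of check sequence specified for CCSDS telemetry transfer frames.

```python
# Sketch: a "database" of selectable error-checking protocols, mirroring
# the select-then-configure flow described above. Names are illustrative.

def crc16_ccitt_false(data: bytes) -> int:
    """CRC-16, polynomial 0x1021, initial value 0xFFFF (the form used for
    the CCSDS telemetry frame error control field)."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

# The selectable "database" of error-checking protocol implementations
PROTOCOLS = {"ccsds-crc16": crc16_ccitt_false}

def configure(protocol_name: str):
    # Stands in for the user-interface selection step; the real system
    # hands the chosen data structure to an FPGA compiler instead.
    return PROTOCOLS[protocol_name]

checker = configure("ccsds-crc16")
print(hex(checker(b"123456789")))  # 0x29b1, the standard check value
```

    In the patented system the selected data structure parameterizes FPGA logic rather than a software routine, but the lookup-then-configure shape is the same.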

  6. Dual-energy CT and ceramic or titanium prostheses material reduce CT artifacts and provide superior image quality of total knee arthroplasty.

    PubMed

    Kasparek, Maximilian F; Töpker, Michael; Lazar, Mathias; Weber, Michael; Kasparek, Michael; Mang, Thomas; Apfaltrer, Paul; Kubista, Bernd; Windhager, Reinhard; Ringl, Helmut

    2018-06-07

    To evaluate the influence of different scan parameters for single-energy CT and dual-energy CT, as well as the impact of different materials used in a TKA prosthesis, on image quality and the extent of metal artifacts. Eight pairs of TKA prostheses from different vendors were examined in a phantom set-up. Each pair consisted of a conventional CoCr prosthesis and the corresponding anti-allergic prosthesis (full titanium, ceramic, or ceramic-coated) from the same vendor. Nine different (seven dual-energy CT and two single-energy CT) scan protocols with different characteristics were used to determine the most suitable CT protocol for TKA imaging. Quantitative image analysis included assessment of blooming artifacts (metal implants appear thicker on CT than they are; given as virtual growth in mm in this paper) and streak artifacts (thick dark lines around metal). Qualitative image analysis was used to investigate the bone-prosthesis interface. The full titanium prosthesis and full ceramic knee showed significantly fewer blooming artifacts compared to the standard CoCr prosthesis (mean virtual growth 0.6-2.2 mm compared to 2.9-4.6 mm, p < 0.001). Dual-energy CT protocols showed less blooming (range 3.3-3.8 mm) compared to single-energy protocols (4.6-5.5 mm). The full titanium and full ceramic prostheses showed significantly fewer streak artifacts (mean standard deviation 77-86 Hounsfield units (HU)) compared to the standard CoCr prosthesis (277-334 HU, p < 0.001). All dual-energy CT protocols had fewer metal streak artifacts (215-296 HU) compared to single-energy CT protocols (392-497 HU). Full titanium and ceramic prostheses were ranked superior with regard to image quality at the bone/prosthesis interface compared to a standard CoCr prosthesis, and all dual-energy CT protocols were ranked better than single-energy protocols. 
Dual-energy CT and ceramic or titanium prostheses reduce CT artifacts and provide superior image quality of total knee arthroplasty at the bone/prosthesis interface. These findings support the use of dual-energy CT as a solid imaging base for clinical decision-making and the use of full-titanium or ceramic prostheses to allow for better CT visualization of the bone-prosthesis interface.

  7. Assessing and Improving the Quality of Food Composition Databases for Nutrition and Health Applications in Europe: The Contribution of EuroFIR

    PubMed Central

    Finglas, Paul M.; Berry, Rachel; Astley, Siân

    2014-01-01

    Food composition databases (FCDBs) form an integral part of nutrition and health research, patient treatment, manufacturing processes, and consumer information. FCDBs have traditionally been compiled at a national level; therefore, until recently, there was limited standardization of procedures across different data sets. Digital technologies now allow FCDB users to access a variety of information from different sources, which has emphasized the need for greater harmonization. The European Food Information Resource (EuroFIR) Network of Excellence and Nexus projects (2005–2013) have been instrumental in addressing differences in FCDBs and in producing standardized protocols and quality schemes to compile and manage them. A formal, recognized European standard for food composition data has been prepared, which will further assist in the production of comparable data. Quality schemes need to address the composition data; the methods of sampling, analysis, and calculation; and the documentation of processes. The EuroFIR data exchange platform provides a wealth of resources for composition compilers and end users and continues to develop new and innovative tools and methodologies. EuroFIR is also working in collaboration with the European Food Safety Authority and as a partner in several European projects. Through such collaborations, EuroFIR will continue to develop FCDB harmonization and to use new technologies to ensure sustainable future initiatives in the food composition activities that underpin food and health research in Europe. PMID:25469406

  8. QKD-based quantum private query without a failure probability

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Gao, Fei; Huang, Wei; Wen, QiaoYan

    2015-10-01

    In this paper, we present a quantum-key-distribution (QKD)-based quantum private query (QPQ) protocol utilizing single-photon signals of multiple optical pulses. It maintains the advantages of QKD-based QPQ, i.e., it is easy to implement and loss tolerant. In addition, unlike in previous QKD-based QPQ protocols, in our protocol the number of items an honest user will obtain is always one and the failure probability is always zero. This characteristic not only improves the stability (in the sense that, ignoring noise and attacks, the protocol would always succeed), but also benefits the privacy of the database (since the database will no longer reveal additional secrets to honest users). Furthermore, for the user's privacy, the proposed protocol is cheat sensitive, and for the security of the database, we obtain a theoretical upper bound on the leaked information of the database.

  9. Quantum Private Queries

    NASA Astrophysics Data System (ADS)

    Giovannetti, Vittorio; Lloyd, Seth; Maccone, Lorenzo

    2008-06-01

    We propose a cheat sensitive quantum protocol to perform a private search on a classical database which is efficient in terms of communication complexity. It allows a user to retrieve an item from the database provider without revealing which item he or she retrieved: if the provider tries to obtain information on the query, the person querying the database can find it out. The protocol also ensures perfect data privacy of the database: the information that the user can retrieve in a single query is bounded and does not depend on the size of the database. With respect to the known (quantum and classical) strategies for private information retrieval, our protocol displays an exponential reduction in communication complexity and in running-time computational complexity.

  10. The making of a pan-European organ transplant registry.

    PubMed

    Smits, Jacqueline M; Niesing, Jan; Breidenbach, Thomas; Collett, Dave

    2013-03-01

    A European patient registry to track the outcomes of organ transplant recipients does not exist. As knowledge gleaned from large registries has already led to the creation of standards of care that gained widespread support from patients and healthcare providers, the European Union initiated a project that would enable the creation of a European Registry linking currently existing national databases. This report contains a description of all functional, technical, and legal prerequisites, which upon fulfillment should allow for the seamless sharing of national longitudinal data across temporal, geographical, and subspecialty boundaries. To create a platform that can effortlessly link multiple databases and maintain the integrity of the existing national databases crucial elements were described during the project. These elements are: (i) use of a common dictionary, (ii) use of a common database and refined data uploading technology, (iii) use of standard methodology to allow uniform protocol driven and meaningful long-term follow-up analyses, (iv) use of a quality assurance mechanism to guarantee completeness and accuracy of the data collected, and (v) establishment of a solid legal framework that allows for safe data exchange. © 2012 The Authors Transplant International © 2012 European Society for Organ Transplantation. Published by Blackwell Publishing Ltd.

  11. Portfolio of prospective clinical trials including brachytherapy: an analysis of the ClinicalTrials.gov database.

    PubMed

    Cihoric, Nikola; Tsikkinis, Alexandros; Miguelez, Cristina Gutierrez; Strnad, Vratislav; Soldatovic, Ivan; Ghadjar, Pirus; Jeremic, Branislav; Dal Pra, Alan; Aebersold, Daniel M; Lössl, Kristina

    2016-03-22

    To evaluate the current status of prospective interventional clinical trials that include brachytherapy (BT) procedures. The records of 175,538 (100%) clinical trials registered at ClinicalTrials.gov were downloaded in September 2014 and a database was established. Trials using BT as an intervention were identified for further analyses. The selected trials were manually categorized according to indication(s), BT source, applied dose rate, primary sponsor type, location, protocol initiator and funding source. We analyzed trials across 8 available trial protocol elements registered within the database. In total, 245 clinical trials were identified: 147 with BT as the primary investigated treatment modality and 98 that included BT as an optional treatment component or as part of the standard treatment. Academic centers were the most frequent protocol initiators in trials where BT was the primary investigational treatment modality (p < 0.01). High dose rate (HDR) BT was the most frequently investigated dose rate (46.3%), followed by low dose rate (LDR) (42.0%). Prostate was the most frequently investigated tumor entity in trials with BT as the primary treatment modality (40.1%), followed by breast cancer (17.0%). BT was rarely the primary investigated treatment modality for cervical cancer (6.8%). Most clinical trials using BT are in early phases, investigator-initiated and with low accrual numbers. Current investigational activities that include BT mainly focus on prostate and breast cancers. Important questions concerning the optimal usage of BT will not be answered in the near future.
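
    The categorization step described here (identify BT trials among all registered records, then split primary from optional use) can be sketched as a simple filter. The record fields and keyword matching below are assumptions for illustration, not the authors' actual pipeline.

```python
# Sketch of the categorization step: pick out brachytherapy (BT) trials
# from registered records, then split primary from optional use.
# Record fields and the keyword match are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Trial:
    trial_id: str
    interventions: list
    primary_modality: str

trials = [
    Trial("NCT-A", ["HDR brachytherapy"], "brachytherapy"),
    Trial("NCT-B", ["surgery", "LDR brachytherapy"], "surgery"),
    Trial("NCT-C", ["chemotherapy"], "chemotherapy"),
]

bt = [t for t in trials if any("brachytherapy" in i.lower() for i in t.interventions)]
primary = [t for t in bt if t.primary_modality == "brachytherapy"]
optional = [t for t in bt if t.primary_modality != "brachytherapy"]
print(len(bt), len(primary), len(optional))  # → 2 1 1
```

    The same split (245 BT trials, 147 primary, 98 optional) is what the abstract reports at registry scale.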

  12. Human Connectome Project Informatics: quality control, database services, and data visualization

    PubMed Central

    Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.

    2013-01-01

    The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591

  13. A Weak Value Based QKD Protocol Robust Against Detector Attacks

    NASA Astrophysics Data System (ADS)

    Troupe, James

    2015-03-01

    We propose a variation of the BB84 quantum key distribution protocol that utilizes the properties of weak values to ensure the validity of the quantum bit error rate estimates used to detect an eavesdropper. The protocol is shown theoretically to be secure against recently demonstrated attacks utilizing detector blinding and control and should also be robust against all detector-based hacking. Importantly, the new protocol promises to achieve this additional security without negatively impacting the secure key generation rate as compared to that originally promised by the standard BB84 scheme. Implementation of the weak measurements needed by the protocol should be very feasible using standard quantum optical techniques.

  14. MRI of penile fracture: what should be a tailored protocol in emergency?

    PubMed

    Esposito, Andrea Alessandro; Giannitto, Caterina; Muzzupappa, Claudia; Maccagnoni, Sara; Gadda, Franco; Albo, Giancarlo; Biondetti, Pietro Raimondo

    2016-09-01

    To conduct a review of the literature summarizing existing MRI protocols for penile trauma and to suggest a tailored protocol that reduces the cost and time of examination. A systematic search was performed in the Medline, Embase, Cochrane Library, and Cinahl databases from 1995 to 2015 to identify studies evaluating penile trauma with MRI examination. Studies were included if they described the MRI protocol with at least the sequences and orthogonal planes used. We chose a systematic approach for data extraction and descriptive synthesis. 12 articles were included in our study: 2 were case reports, 3 were clinical series, and 7 were reviews; no clinical trials were found. There is no unanimous consensus among the authors. Summarizing the data, the most used protocol is characterized by T2 sequences in three orthogonal planes plus T1 sequences in one plane (either axial or sagittal) without contrast medium injection. There is a lack of a standard protocol. A tailored protocol to answer the diagnostic question, reducing the cost and time of examination, is characterized by T2 sequences in three orthogonal planes plus at least one T1 sequence (in either the axial or sagittal plane).

  15. Outcomes of Optimized over Standard Protocol of Rabbit Antithymocyte Globulin for Severe Aplastic Anemia: A Single-Center Experience

    PubMed Central

    Ge, Meili; Shao, Yingqi; Huang, Jinbo; Huang, Zhendong; Zhang, Jing; Nie, Neng; Zheng, Yizhou

    2013-01-01

    Background Previous reports showed that the outcome of rabbit antithymocyte globulin (rATG) as first-line therapy for severe aplastic anemia (SAA) was not satisfactory. We explored a modified schedule of administration of rATG. Design and Methods Outcomes of a cohort of 175 SAA patients, including 51 patients administered the standard protocol (3.55 mg/kg/d for 5 days) and 124 given the optimized protocol (1.97 mg/kg/d for 9 days) of rATG plus cyclosporine (CSA), were analyzed retrospectively. Results Of all 175 patients, response rates at 3 and 6 months were 36.6% and 56.0%, respectively. The 51 cases who received the standard protocol had poor responses at 3 (25.5%) and 6 months (41.2%), whereas the 124 patients who received the optimized protocol had better responses at 3 (41.1%, P = 0.14) and 6 months (62.1%, P = 0.01). Higher incidences of infection (57.1% versus 37.9%, P = 0.02) and early mortality (17.9% versus 0.8%, P<0.001) occurred in patients who received the standard protocol compared with the optimized protocol. The 5-year overall survival favored the optimized over the standard rATG protocol (76.0% versus 50.3%, P<0.001). By multivariate analysis, optimized protocol (RR = 2.21, P = 0.04), response at 3 months (RR = 10.31, P = 0.03) and a shorter interval (<23 days) between diagnosis and the initial dose of rATG (RR = 5.35, P = 0.002) were independent favorable predictors of overall survival. Conclusions The optimized rather than the standard rATG protocol in combination with CSA remained efficacious as a first-line immunosuppressive regimen for SAA. PMID:23554855

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nielsen, Yousef W., E-mail: yujwni01@heh.regionh.d; Eiberg, Jonas P., E-mail: Eiberg@dadlnet.d; Logager, Vibeke B., E-mail: viloe@heh.regionh.d

    The purpose of this study was to determine the diagnostic performance of 3T whole-body magnetic resonance angiography (WB-MRA) using a hybrid protocol in comparison with a standard protocol in patients with peripheral arterial disease (PAD). In 26 consecutive patients with PAD two different protocols were used for WB-MRA: a standard sequential protocol (n = 13) and a hybrid protocol (n = 13). WB-MRA was performed using a gradient echo sequence, body coil for signal reception, and gadoterate meglumine as contrast agent (0.3 mmol/kg body weight). Two blinded observers evaluated all WB-MRA examinations with regard to presence of stenoses, as well as diagnostic quality and degree of venous contamination in each of the four stations used in WB-MRA. Digital subtraction angiography served as the method of reference. Sensitivity for detecting significant arterial disease (luminal narrowing ≥ 50%) using standard-protocol WB-MRA for the two observers was 0.63 (95% CI: 0.51-0.73) and 0.66 (0.58-0.78). Specificities were 0.94 (0.91-0.97) and 0.96 (0.92-0.98), respectively. In the hybrid-protocol WB-MRA sensitivities were 0.75 (0.64-0.84) and 0.70 (0.58-0.8), respectively. Specificities were 0.93 (0.88-0.96) and 0.95 (0.91-0.97). Interobserver agreement was good using both the standard and the hybrid protocol, with κ = 0.62 (0.44-0.67) and κ = 0.70 (0.59-0.79), respectively. WB-MRA quality scores were significantly higher in the lower leg using the hybrid protocol compared to standard protocol (p = 0.003 and p = 0.03, observers 1 and 2). Distal venous contamination scores were significantly lower with the hybrid protocol (p = 0.02 and p = 0.01, observers 1 and 2). In conclusion, hybrid-protocol WB-MRA shows a better diagnostic performance than standard-protocol WB-MRA at 3 T in patients with PAD.
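
    The per-observer sensitivities and specificities with 95% confidence intervals reported above come from segment-level 2×2 counts against the DSA reference. A minimal sketch with hypothetical counts (the abstract gives only the resulting estimates), using the Wilson score interval as one common CI choice; the paper's exact CI method is not stated here.

```python
# Sketch: sensitivity/specificity with a 95% Wilson score interval from
# hypothetical segment-level counts versus the DSA reference standard.
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (one common
    choice of CI method, assumed here for illustration)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical counts for one observer: true/false positives and negatives
tp, fn, tn, fp = 63, 37, 470, 30
sens = tp / (tp + fn)   # 0.63
spec = tn / (tn + fp)   # 0.94
lo, hi = wilson_ci(tp, tp + fn)
print(f"sensitivity {sens:.2f} (95% CI {lo:.2f}-{hi:.2f}), specificity {spec:.2f}")
```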

  17. Comparing inactivation protocols of Yersinia organisms for identification with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Couderc, Carine; Nappez, Claude; Drancourt, Michel

    2012-03-30

    It is recommended that harmful Biosafety Level 3 (BSL-3) bacteria be inactivated prior to identification by mass spectrometry, yet the optimal inactivation protocol has not been defined. Here, we compare trifluoroacetic acid inactivation (protocol A) with ethanol inactivation (protocol B) of Yersinia organisms prior to identification by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS). The total number of peaks detected was 10.5 ± 1.7 for protocol A and 15.7 ± 4.2 for protocol B (P < 0.001, ANOVA test). The signal-to-noise ratio for the m/z 6049 peak present in all of the tested Yersinia isolates was 9.7 ± 3.1 for protocol A and 18.1 ± 4.6 for protocol B (P < 0.001). Compared with spectra in our local database containing 48 Yersinia spp., including 20 strains of Y. pestis, the identification score was 1.79 ± 0.2 for protocol A and 1.97 ± 0.19 for protocol B (P = 0.0024). Our observations indicate that for the identification of Yersinia organisms, ethanol inactivation yielded MALDI-TOF-MS spectra of significantly higher quality than spectra derived from trifluoroacetic acid inactivation. Combined with previously published data, our results permit the updating of protocols for inactivating BSL-3 bacteria. Copyright © 2012 John Wiley & Sons, Ltd.
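
    The comparison metrics in this record are per-protocol summary statistics (mean ± SD) over isolates. A few lines reproduce that summary step; the per-isolate peak counts below are hypothetical, not the study's raw spectra.

```python
# Sketch: mean ± SD of detected peaks per inactivation protocol.
# Per-isolate counts are hypothetical stand-ins for the raw data.
from statistics import mean, stdev

peaks = {
    "A (trifluoroacetic acid)": [9, 10, 12, 11, 8, 13, 10, 11],
    "B (ethanol)": [14, 18, 21, 12, 16, 15, 19, 11],
}

for name, counts in peaks.items():
    print(f"protocol {name}: {mean(counts):.1f} ± {stdev(counts):.1f} peaks")
```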

  18. Building a biomedical cyberinfrastructure for collaborative research.

    PubMed

    Schad, Peter A; Mobley, Lee Rivers; Hamilton, Carol M

    2011-05-01

    For the potential power of genome-wide association studies (GWAS) and translational medicine to be realized, the biomedical research community must adopt standard measures, vocabularies, and systems to establish an extensible biomedical cyberinfrastructure. Incorporating standard measures will greatly facilitate combining and comparing studies via meta-analysis. Incorporating consensus-based and well-established measures into various studies should reduce the variability across studies due to attributes of measurement, making findings across studies more comparable. This article describes two well-established consensus-based approaches to identifying standard measures and systems: PhenX (consensus measures for phenotypes and eXposures), and the Open Geospatial Consortium (OGC). NIH support for these efforts has produced the PhenX Toolkit, an assembled catalog of standard measures for use in GWAS and other large-scale genomic research efforts, and the RTI Spatial Impact Factor Database (SIFD), a comprehensive repository of geo-referenced variables and extensive meta-data that conforms to OGC standards. The need for coordinated development of cyberinfrastructure to support measures and systems that enhance collaboration and data interoperability is clear; this paper includes a discussion of standard protocols for ensuring data compatibility and interoperability. Adopting a cyberinfrastructure that includes standard measures and vocabularies, and open-source systems architecture, such as the two well-established systems discussed here, will enhance the potential of future biomedical and translational research. Establishing and maintaining the cyberinfrastructure will require a fundamental change in the way researchers think about study design, collaboration, and data storage and analysis. Copyright © 2011 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  19. Building a Biomedical Cyberinfrastructure for Collaborative Research

    PubMed Central

    Schad, Peter A.; Mobley, Lee Rivers; Hamilton, Carol M.

    2018-01-01

    For the potential power of genome-wide association studies (GWAS) and translational medicine to be realized, the biomedical research community must adopt standard measures, vocabularies, and systems to establish an extensible biomedical cyberinfrastructure. Incorporating standard measures will greatly facilitate combining and comparing studies via meta-analysis, which is a means for deriving larger populations, needed for increased statistical power to detect less apparent and more complex associations (gene-environment interactions and polygenic gene-gene interactions). Incorporating consensus-based and well-established measures into various studies should reduce the variability across studies due to attributes of measurement, making findings across studies more comparable. This article describes two consensus-based approaches to establishing standard measures and systems: PhenX (consensus measures for Phenotypes and eXposures), and the Open Geospatial Consortium (OGC). National Institutes of Health support for these efforts has produced the PhenX Toolkit, an assembled catalog of standard measures for use in GWAS and other large-scale genomic research efforts, and the RTI Spatial Impact Factor Database (SIFD), a comprehensive repository of georeferenced variables and extensive metadata that conforms to OGC standards. The need for coordinated development of cyberinfrastructure to support collaboration and data interoperability is clear, and we discuss standard protocols for ensuring data compatibility and interoperability. Adopting a cyberinfrastructure that includes standard measures, vocabularies, and open-source systems architecture will enhance the potential of future biomedical and translational research. Establishing and maintaining the cyberinfrastructure will require a fundamental change in the way researchers think about study design, collaboration, and data storage and analysis. PMID:21521587

  20. Practicing Surgeons Lead in Quality Care, Safety, and Cost Control

    PubMed Central

    Shively, Eugene H.; Heine, Michael J.; Schell, Robert H.; Sharpe, J Neal; Garrison, R Neal; Vallance, Steven R.; DeSimone, Kenneth J.S.; Polk, Hiram C.

    2004-01-01

    Objective: To report the experiences of 66 surgical specialists from 15 different hospitals who performed 43 CPT-based procedures more than 16,000 times. Summary Background Data: Surgeons are under increasing pressure to demonstrate patient safety data as quantitated by objective and subjective outcomes that meet or exceed the standards of benchmark institutions or databases. Methods: Data from 66 surgical specialists on 43 CPT-based procedures were accessioned over a 4-year period. The hospitals vary from a small 30-bed hospital to large teaching hospitals. All reported deaths and complications were verified from hospital and office records and compared with benchmarks. Results: Over a 4-year inclusive period (1999–2002), 16,028 elective operations were accessioned. There was an overall complication rate of 1.4% and a death rate of 0.05%. A system has been developed for tracking outcomes. A wide range of improvements have been identified. These include the following: 1) improved classification of indications for systemic prophylactic antibiotic use and reduction in the variety of drugs used, 2) shortened length of stay for standard procedures in different surgical specialties, 3) adherence to strict indicators for selected operative procedures, 4) less use of costly diagnostic procedures, 5) decreased use of expensive home health services, 6) decreased use of very expensive drugs, 7) identification of the unnecessary expense of disposable laparoscopic devices, 8) development of a method to compare a one-surgeon hospital with his peers, and 9) development of unique protocols for interaction of anesthesia and surgery. The system also provides a very good basis for confirmation of patient safety and improvement therein. Conclusions: Since 1998, Quality Surgical Solutions, PLLC, has developed simple physician-authored protocols for delivering high-quality and cost-effective surgery that measure up to benchmark institutions. 
We have discovered wide areas for improvements in surgery by adherence to simple protocols, minimizing death and complications and clarifying cost issues. PMID:15166954

  1. International Image Concordance Study to Compare a Point of Care Tampon Colposcope to a Standard-of-Care Colposcope

    PubMed Central

    Mueller, Jenna L.; Asma, Elizabeth; Lam, Christopher T.; Krieger, Marlee S.; Gallagher, Jennifer E.; Erkanli, Alaattin; Hariprasad, Roopa; Malliga, J.S.; Muasher, Lisa C.; Mchome, Bariki; Oneko, Olola; Taylor, Peyton; Venegas, Gino; Wanyoro, Anthony; Mehrotra, Ravi; Schmitt, John W.; Ramanujam, Nimmi

    2017-01-01

    Objective Barriers to cervical cancer screening in low-resource settings include lack of accessible high-quality services, high cost, and the need for multiple visits. To address these challenges, we developed a low-cost intravaginal optical cervical imaging device, the Point of Care Tampon (POCkeT) colposcope, and evaluated whether its performance is comparable to a standard-of-care colposcope. Methods There were two protocols, which included 44 and 18 patients, respectively. For the first protocol, white light cervical images were collected in vivo, blinded by device, and sent electronically to 8 physicians from high-, middle-, and low-income countries. For the second protocol, green light images were also collected and sent electronically to the highest performing physician from the first protocol, who has experience in both a high- and a low-income country. For each image, physicians completed a survey assessing cervix characteristics and severity of precancerous lesions. Corresponding pathology was obtained for all image pairs. Results For the first protocol, average percent agreement between devices was 70% across all physicians. POCkeT and standard-of-care colposcope images had 37% and 51% agreement, respectively, with pathology for high-grade squamous intraepithelial lesions (HSILs). Investigation of HSIL POCkeT images revealed decreased visibility of vascularization and lack of contrast in lesion margins. After changes were made for the second protocol, the two devices achieved similar agreement with pathology for HSIL lesions (55%). Conclusions Based on this exploratory study, physician interpretation of cervix images acquired using a portable, low-cost POCkeT colposcope was comparable to a standard-of-care colposcope. PMID:28263237
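
    Percent agreement, the headline metric of this study, is simply the fraction of cases on which two sets of categorical readings concur. A minimal sketch; the diagnostic labels below are made up for illustration, not the study's data.

```python
# Percent agreement between two sets of categorical readings
# (device vs device, or device vs pathology). Labels are illustrative.
def percent_agreement(a, b):
    assert len(a) == len(b), "reading lists must be paired"
    return 100 * sum(x == y for x, y in zip(a, b)) / len(a)

pocket_reads = ["HSIL", "normal", "LSIL", "HSIL", "normal"]
standard_reads = ["HSIL", "normal", "HSIL", "HSIL", "LSIL"]
print(percent_agreement(pocket_reads, standard_reads))  # → 60.0
```

    Note that raw percent agreement does not correct for chance agreement; a chance-corrected statistic such as Cohen's κ is the usual complement when class prevalence is skewed.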

  2. SCPortalen: human and mouse single-cell centric database

    PubMed Central

    Noguchi, Shuhei; Böttcher, Michael; Hasegawa, Akira; Kouno, Tsukasa; Kato, Sachi; Tada, Yuhki; Ura, Hiroki; Abe, Kuniya; Shin, Jay W; Plessy, Charles; Carninci, Piero

    2018-01-01

    Published single-cell datasets are rich resources for investigators who want to address questions not originally asked by the creators of the datasets. The single-cell datasets might be obtained by different protocols and diverse analysis strategies. The main challenge in utilizing such single-cell data is how to make the various large-scale datasets comparable and reusable in a different context. To address this issue, we developed the single-cell centric database ‘SCPortalen’ (http://single-cell.clst.riken.jp/). The current version of the database covers human and mouse single-cell transcriptomics datasets that are publicly available from the INSDC sites. The original metadata was manually curated and single-cell samples were annotated with standard ontology terms. Following that, common quality assessment procedures were conducted to check the quality of the raw sequences. Furthermore, primary data processing of the raw data, followed by advanced analyses and interpretation, has been performed from scratch using our pipeline. In addition to the transcriptomics data, SCPortalen provides access to single-cell image files whenever available. The target users of SCPortalen are all researchers interested in specific cell types or population heterogeneity. Through the web interface of SCPortalen users are easily able to search, explore and download the single-cell datasets of their interest. PMID:29045713

  3. Sodium content of foods contributing to sodium intake: A comparison between selected foods from the CDC Packaged Food Database and the USDA National Nutrient Database for Standard Reference

    USDA-ARS?s Scientific Manuscript database

    The sodium concentration (mg/100g) for 23 of 125 Sentinel Foods were identified in the 2009 CDC Packaged Food Database (PFD) and compared with data in the USDA’s 2013 Standard Reference 26 (SR 26) database. Sentinel Foods are foods and beverages identified by USDA to be monitored as primary indicat...

  4. Secure quantum communication using classical correlated channel

    NASA Astrophysics Data System (ADS)

    Costa, D.; de Almeida, N. G.; Villas-Boas, C. J.

    2016-10-01

    We propose a secure protocol to send quantum information from one party to another without a quantum channel. In our protocol, which resembles quantum teleportation, a sender (Alice) and a receiver (Bob) share classical correlated states instead of EPR ones, with Alice performing measurements in two different bases and then communicating her results to Bob through a classical channel. Our secure quantum communication protocol requires the same amount of classical bits as the standard quantum teleportation protocol. In our scheme, as in the usual quantum teleportation protocol, once the classical channel is established in a secure way, a spy (Eve) will never be able to recover the information of the unknown quantum state, even if she is aware of Alice's measurement results. Security, advantages, and limitations of our protocol are discussed and compared with the standard quantum teleportation protocol.

  5. Clinical decision support tools: personal digital assistant versus online dietary supplement databases.

    PubMed

    Clauson, Kevin A; Polen, Hyla H; Peak, Amy S; Marsh, Wallace A; DiScala, Sandra L

    2008-11-01

    Clinical decision support tools (CDSTs) on personal digital assistants (PDAs) and online databases assist healthcare practitioners who make decisions about dietary supplements. To assess and compare the content of PDA dietary supplement databases and their online counterparts used as CDSTs. A total of 102 question-and-answer pairs were developed within 10 weighted categories of the most clinically relevant aspects of dietary supplement therapy. PDA versions of AltMedDex, Lexi-Natural, Natural Medicines Comprehensive Database, and Natural Standard and their online counterparts were assessed by scope (percent of correct answers present), completeness (3-point scale), ease of use, and a composite score integrating all 3 criteria. Descriptive and inferential statistics, including a χ² test, Scheffé's multiple comparison test, McNemar's test, and the Wilcoxon signed rank test, were used to analyze data. The scope scores for PDA databases were: Natural Medicines Comprehensive Database 84.3%, Natural Standard 58.8%, Lexi-Natural 50.0%, and AltMedDex 36.3%, with Natural Medicines Comprehensive Database statistically superior (p < 0.01). Completeness scores were: Natural Medicines Comprehensive Database 78.4%, Natural Standard 51.0%, Lexi-Natural 43.5%, and AltMedDex 29.7%. Lexi-Natural was superior in ease of use (p < 0.01). Composite scores for PDA databases were: Natural Medicines Comprehensive Database 79.3, Natural Standard 53.0, Lexi-Natural 48.0, and AltMedDex 32.5, with Natural Medicines Comprehensive Database superior (p < 0.01). There was no difference between the scope for PDA and online database pairs with Lexi-Natural (50.0% and 53.9%, respectively) or Natural Medicines Comprehensive Database (84.3% and 84.3%, respectively) (p > 0.05), whereas differences existed for AltMedDex (36.3% vs 74.5%, respectively) and Natural Standard (58.8% vs 80.4%, respectively) (p < 0.01).
For composite scores, AltMedDex and Natural Standard online were better than their PDA counterparts (p < 0.01). Natural Medicines Comprehensive Database achieved significantly higher scope, completeness, and composite scores compared with other dietary supplement PDA CDSTs in this study. There was no difference between the PDA and online databases for Lexi-Natural and Natural Medicines Comprehensive Database, whereas online versions of AltMedDex and Natural Standard were significantly better than their PDA counterparts.
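
    The composite scoring idea in the record above (scope, completeness, and ease of use folded into one number) can be sketched in a few lines. The weights below are hypothetical; the abstract does not state how the study weighted its criteria, so this is an illustration of the approach, not the authors' formula.

```python
# Hedged sketch: weighted composite score for a clinical decision support
# database, combining scope, completeness, and ease of use (each 0-100).
# The weights are invented for illustration; the study's weighting is not
# given in the abstract.

def composite_score(scope_pct, completeness_pct, ease_pct,
                    weights=(0.5, 0.3, 0.2)):
    """Weighted average of the three assessment criteria."""
    w_scope, w_comp, w_ease = weights
    return w_scope * scope_pct + w_comp * completeness_pct + w_ease * ease_pct

# Illustrative call: scope and completeness taken from the abstract's
# Natural Medicines Comprehensive Database figures; the ease value is invented.
print(composite_score(84.3, 78.4, 70.0))
```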

  6. CT-based attenuation correction and resolution compensation for I-123 IMP brain SPECT normal database: a multicenter phantom study.

    PubMed

    Inui, Yoshitaka; Ichihara, Takashi; Uno, Masaki; Ishiguro, Masanobu; Ito, Kengo; Kato, Katsuhiko; Sakuma, Hajime; Okazawa, Hidehiko; Toyama, Hiroshi

    2018-06-01

    Statistical image analysis of brain SPECT images has improved diagnostic accuracy for brain disorders. However, the results of statistical analysis vary depending on the institution even when they use a common normal database (NDB), due to different intrinsic spatial resolutions or correction methods. The present study aimed to evaluate the correction of spatial resolution differences between equipment and examine the differences in skull bone attenuation to construct a common NDB for use in multicenter settings. The proposed acquisition and processing protocols were those routinely used at each participating center with additional triple energy window (TEW) scatter correction (SC) and computed tomography (CT) based attenuation correction (CTAC). A multicenter phantom study was conducted on six imaging systems in five centers, with either single photon emission computed tomography (SPECT) or SPECT/CT, and two brain phantoms. The gray/white matter I-123 activity ratio in the brain phantoms was 4, and they were enclosed in either an artificial adult male skull, 1300 Hounsfield units (HU), a female skull, 850 HU, or an acrylic cover. The cut-off frequency of the Butterworth filters was adjusted so that the spatial resolution was unified to a 17.9 mm full width at half maximum (FWHM), that of the lowest resolution system. The gray-to-white matter count ratios were measured from SPECT images and compared with the actual activity ratio. In addition, mean, standard deviation and coefficient of variation images were calculated after normalization and anatomical standardization to evaluate the variability of the NDB. The gray-to-white matter count ratio error without SC and attenuation correction (AC) was significantly larger for higher bone densities (p < 0.05). The count ratio error with TEW and CTAC was approximately 5% regardless of bone density. 
After adjustment of the spatial resolution in the SPECT images, the variability of the NDB decreased and was comparable to that of the NDB without correction. The proposed protocol showed potential for constructing an appropriate common NDB from SPECT images with SC, AC and spatial resolution compensation.

  7. Toward a public toxicogenomics capability for supporting predictive toxicology: survey of current resources and chemical indexing of experiments in GEO and ArrayExpress.

    PubMed

    Williams-Devane, ClarLynda R; Wolf, Maritja A; Richard, Ann M

    2009-06-01

    A publicly available toxicogenomics capability for supporting predictive toxicology and meta-analysis depends on availability of gene expression data for chemical treatment scenarios, the ability to locate and aggregate such information by chemical, and broad data coverage within chemical, genomics, and toxicological information domains. This capability also depends on common genomics standards, protocol description, and functional linkages of diverse public Internet data resources. We present a survey of public genomics resources from these vantage points and conclude that, despite progress in many areas, the current state of the majority of public microarray databases is inadequate for supporting these objectives, particularly with regard to chemical indexing. To begin to address these inadequacies, we focus chemical annotation efforts on experimental content contained in the two primary public genomic resources: ArrayExpress and Gene Expression Omnibus. Automated scripts and extensive manual review were employed to transform free-text experiment descriptions into a standardized, chemically indexed inventory of experiments in both resources. These files, which include top-level summary annotations, allow for identification of current chemical-associated experimental content, as well as chemical-exposure-related (or "Treatment") content of greatest potential value to toxicogenomics investigation. With these chemical-index files, it is possible for the first time to assess the breadth and overlap of chemical study space represented in these databases, and to begin to assess the sufficiency of data with shared protocols for chemical similarity inferences. Chemical indexing of public genomics databases is a first important step toward integrating chemical, toxicological and genomics data into predictive toxicology.
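
    The chemical-indexing step described above (automated scripts transforming free-text experiment descriptions into a standardized, chemically indexed inventory) can be sketched as a synonym-table lookup. The table, tokenizer, and function names below are invented for illustration and are not the authors' scripts.

```python
import re

# Hedged sketch of chemical indexing: scan a free-text experiment description
# for known chemical names or synonyms and map each hit to a preferred name.
# The synonym table here is a tiny invented example.

SYNONYMS = {
    "acetaminophen": "acetaminophen",
    "paracetamol": "acetaminophen",  # synonym mapped to the preferred name
    "tcdd": "2,3,7,8-tetrachlorodibenzo-p-dioxin",
}

def index_experiment(description):
    """Return the set of standardized chemical names found in a description."""
    found = set()
    for token in re.findall(r"[a-z0-9,\-]+", description.lower()):
        if token in SYNONYMS:
            found.add(SYNONYMS[token])
    return found

print(index_experiment("Liver expression after paracetamol treatment"))
```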

  8. A protocol for a systematic literature review: comparing the impact of seasonal and meteorological parameters on acute respiratory infections in Indigenous and non-Indigenous peoples.

    PubMed

    Bishop-Williams, Katherine E; Sargeant, Jan M; Berrang-Ford, Lea; Edge, Victoria L; Cunsolo, Ashlee; Harper, Sherilee L

    2017-01-26

    Acute respiratory infections (ARI) are a leading cause of morbidity and mortality globally, and are often linked to seasonal and/or meteorological conditions. Globally, Indigenous peoples may experience a different burden of ARI compared to non-Indigenous peoples. This protocol outlines our process for conducting a systematic review to investigate whether associations between ARI and seasonal or meteorological parameters differ between Indigenous and non-Indigenous groups residing in the same geographical region. A search string will be used to search PubMed ® , CAB Abstracts/CAB Direct © , and Science Citation Index ® aggregator databases. Articles will be screened using inclusion/exclusion criteria applied first at the title and abstract level, and then at the full article level by two independent reviewers. Articles maintained after full article screening will undergo risk of bias assessment and data will be extracted. Heterogeneity tests, meta-analysis, and forest and funnel plots will be used to synthesize the results of eligible studies. This protocol paper describes our systematic review methods to identify and analyze relevant ARI, season, and meteorological literature with robust reporting. The results are intended to improve our understanding of potential associations between seasonal and meteorological parameters and ARI and, if identified, whether this association varies by place, population, or other characteristics. The protocol is registered in the PROSPERO database (#38051).

  9. High efficiency endocrine operation protocol: From design to implementation.

    PubMed

    Mascarella, Marco A; Lahrichi, Nadia; Cloutier, Fabienne; Kleiman, Simcha; Payne, Richard J; Rosenberg, Lawrence

    2016-10-01

    We developed a high efficiency endocrine operative protocol based on a mathematical programming approach, process reengineering, and value-stream mapping to increase the number of operations completed per day without increasing operating room time at a tertiary-care, academic center. In a case-control study using this protocol, 72 patients undergoing endocrine operation during high efficiency days were age-, sex-, and procedure-matched to 72 patients undergoing operation during standard days. The demographic profile, operative times, and perioperative complications were noted. The average numbers of cases per 8-hour workday in the high efficiency and standard operating rooms were 7 and 5, respectively. Mean procedure times in both groups were similar. The turnaround time (mean ± standard deviation) in the high efficiency group was 8.5 (±2.7) minutes as compared with 15.4 (±4.9) minutes in the standard group (P < .001). Transient postoperative hypocalcemia occurred in 6.9% (5/72) and 8.3% (6/72) of the high efficiency and standard groups, respectively (P = .99). In this study, patients undergoing high efficiency endocrine operation had similar procedure times and perioperative complications compared with the standard group. The proposed high efficiency protocol seems to better utilize operative time and decrease the backlog of patients waiting for endocrine operation in a country with a universal national health care program. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Effectiveness of standardized approach versus usual care on physiotherapy treatment for patients submitted to alveolar bone graft: a pilot study.

    PubMed

    Vidotto, Laís Silva; Bigliassi, Marcelo; Alencar, Tatiane Romanini Rodrigues; Silva, Thaísa Maria Santos; Probst, Vanessa Suziane

    2015-07-01

    To compare the acute effects of a standardized physiotherapy protocol versus a typical non-standardized physiotherapy protocol on pain and performance of patients undergoing alveolar bone graft (ABG). Sixteen patients (9 males; 12 [11-13] years) with cleft lip and palate undergoing ABG were allocated into two groups: (1) experimental group--EG (standardized physiotherapy protocol); and (2) control group--CG (typical, non-standardized physiotherapy treatment). Range of motion, muscle strength, gait speed, and pain level were assessed prior to surgical intervention (PRE), as well as on the first, second, and third post-operative days (1st, 2nd, and 3rd PO, respectively). Recovery with respect to range of motion of hip flexion was more pronounced in the EG (64.6 ± 11.0°) in comparison to the CG (48.5 ± 17.7° on the 3rd PO; p < 0.05). In addition, less pain was observed in the EG (0 [0-0.2] versus 2 [0.7-3] in the CG on the 3rd PO; p < 0.05). A standardized physiotherapy protocol appears to be better than a non-standardized physiotherapy protocol for acute improvement of range of motion of hip flexion and for reducing pain in patients undergoing ABG.

  11. A Public-Use, Full-Screen Interface for SPIRES Databases.

    ERIC Educational Resources Information Center

    Kriz, Harry M.

    This paper describes the techniques for implementing a full-screen, custom SPIRES interface for a public-use library database. The database-independent protocol that controls the system is described in detail. Source code for an entire working application using this interface is included. The protocol, with less than 170 lines of procedural code,…

  12. Ultra-low-dose computed tomographic angiography with model-based iterative reconstruction compared with standard-dose imaging after endovascular aneurysm repair: a prospective pilot study.

    PubMed

    Naidu, Sailen G; Kriegshauser, J Scott; Paden, Robert G; He, Miao; Wu, Qing; Hara, Amy K

    2014-12-01

    An ultra-low-dose radiation protocol reconstructed with model-based iterative reconstruction was compared with our standard-dose protocol. This prospective study evaluated 20 men undergoing surveillance-enhanced computed tomography after endovascular aneurysm repair. All patients underwent standard-dose and ultra-low-dose venous phase imaging; images were compared after reconstruction with filtered back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction. Objective measures of aortic contrast attenuation and image noise were averaged. Images were subjectively assessed (1 = worst, 5 = best) for diagnostic confidence, image noise, and vessel sharpness. Aneurysm sac diameter and endoleak detection were compared. Quantitative image noise was 26% less with ultra-low-dose model-based iterative reconstruction than with standard-dose adaptive statistical iterative reconstruction and 58% less than with ultra-low-dose adaptive statistical iterative reconstruction. Average subjective noise scores were not different between ultra-low-dose model-based iterative reconstruction and standard-dose adaptive statistical iterative reconstruction (3.8 vs. 4.0, P = .25). Subjective scores for diagnostic confidence were better with standard-dose adaptive statistical iterative reconstruction than with ultra-low-dose model-based iterative reconstruction (4.4 vs. 4.0, P = .002). Vessel sharpness was decreased with ultra-low-dose model-based iterative reconstruction compared with standard-dose adaptive statistical iterative reconstruction (3.3 vs. 4.1, P < .0001). Ultra-low-dose model-based iterative reconstruction and standard-dose adaptive statistical iterative reconstruction aneurysm sac diameters were not significantly different (4.9 vs. 4.9 cm); concordance for the presence of endoleak was 100% (P < .001). 
Compared with a standard-dose technique, an ultra-low-dose model-based iterative reconstruction protocol provides comparable image quality and diagnostic assessment at a 73% lower radiation dose.

  13. How to benchmark methods for structure-based virtual screening of large compound libraries.

    PubMed

    Christofferson, Andrew J; Huang, Niu

    2012-01-01

    Structure-based virtual screening is a useful computational technique for ligand discovery. To systematically evaluate different docking approaches, it is important to have a consistent benchmarking protocol that is both relevant and unbiased. Here, we describe the design of a benchmarking data set for docking screen assessment, a standard docking screening process, and the analysis and presentation of the enrichment of annotated ligands among a background decoy database.
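
    The enrichment analysis this record describes can be sketched as a small computation: rank the screened library by docking score, then compare the hit rate among the top-ranked fraction against the hit rate over the whole database. The function below is a minimal sketch of the standard enrichment-factor definition, not code from the chapter itself.

```python
# Hedged sketch: enrichment factor (EF) for a ranked virtual screening list.
# EF = (fraction of annotated ligands in the top x%) / (fraction in the whole
# database); EF > 1 means the screen ranks true ligands above decoys.

def enrichment_factor(ranked_labels, fraction=0.01):
    """ranked_labels: list of bools (True = annotated ligand), best score first."""
    n_total = len(ranked_labels)
    n_top = max(1, int(n_total * fraction))
    actives_total = sum(ranked_labels)
    actives_top = sum(ranked_labels[:n_top])
    return (actives_top / n_top) / (actives_total / n_total)

# Toy ranking: 2 of 4 actives land in the top 10% of a 40-compound list,
# so the screen is 5x better than random at that cutoff.
labels = [True, False, True, False] + [False] * 34 + [True, True]
print(enrichment_factor(labels, fraction=0.10))
```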

  14. Standard Port-Visit Cost Forecasting Model for U.S. Navy Husbanding Contracts

    DTIC Science & Technology

    2009-12-01

    Protocol (HTTP) server. 2. MySQL. An open-source database. 3. PHP. A common scripting language used for Web development. E. IMPLEMENTATION OF...Inc. (2009). MySQL Community Server (Version 5.1) [Software]. Available from http://dev.mysql.com/downloads/ The PHP Group (2009). PHP (Version...Logistics Services; MySQL, My Structured Query Language; NAVSUP, Navy Supply Systems Command; NC, Non-Contract Items; NPS, Naval Postgraduate

  15. Patient doses from CT examinations in Turkey.

    PubMed

    Ataç, Gökçe Kaan; Parmaksız, Aydın; İnal, Tolga; Bulur, Emine; Bulgurlu, Figen; Öncü, Tolga; Gündoğdu, Sadi

    2015-01-01

    We aimed to establish the first diagnostic reference levels (DRLs) for computed tomography (CT) examinations in adult and pediatric patients in Turkey and compare these with international DRLs. CT performance information and examination parameters (for head, chest, high-resolution CT of the chest [HRCT-chest], abdominal, and pelvic protocols) from 1607 hospitals were collected via a survey. Dose length products and effective doses for standard patient sizes were calculated from the reported volume CT dose index (CTDIvol). The median number of protocols reported from the 167 responding hospitals (10% response rate) was 102 across five different age groups. Third quartile CTDIvol values for adult pelvic and all pediatric body protocols were higher than the European Commission standards but were comparable to studies conducted in other countries. The radiation dose indicators for adult patients were similar to those reported in the literature, except for those associated with head protocols. CT protocol optimization is necessary for adult head and pediatric chest, HRCT-chest, abdominal, and pelvic protocols. The findings from this study are recommended for use as national DRLs in Turkey.

  16. Potential Projective Material on the Rorschach: Comparing Comprehensive System Protocols to Their Modeled R-Optimized Administration Counterparts.

    PubMed

    Pianowski, Giselle; Meyer, Gregory J; Villemor-Amaral, Anna Elisa de

    2016-01-01

    Exner (1989) and Weiner (2003) identified 3 types of Rorschach codes that are most likely to contain personally relevant projective material: Distortions, Movement, and Embellishments. We examine how often these types of codes occur in normative data and whether their frequency changes for the 1st, 2nd, 3rd, 4th, or last response to a card. We also examine the impact on these variables of the Rorschach Performance Assessment System's (R-PAS) statistical modeling procedures that convert the distribution of responses (R) from Comprehensive System (CS) administered protocols to match the distribution of R found in protocols obtained using R-optimized administration guidelines. In 2 normative reference databases, the results indicated that about 40% of responses (M = 39.25) have 1 type of code, 15% have 2 types, and 1.5% have all 3 types, with frequencies not changing by response number. In addition, there were no mean differences in the original CS and R-optimized modeled records (M Cohen's d = -0.04 in both databases). When considered alongside findings showing minimal differences between the protocols of people randomly assigned to CS or R-optimized administration, the data suggest R-optimized administration should not alter the extent to which potential projective material is present in a Rorschach protocol.

  17. Improving post-stroke dysphagia outcomes through a standardized and multidisciplinary protocol: an exploratory cohort study.

    PubMed

    Gandolfi, Marialuisa; Smania, Nicola; Bisoffi, Giulia; Squaquara, Teresa; Zuccher, Paola; Mazzucco, Sara

    2014-12-01

    Stroke is a major cause of dysphagia. Few studies to date have reported on standardized multidisciplinary protocolized approaches to the management of post-stroke dysphagia. The aim of this retrospective cohort study was to evaluate the impact of a standardized multidisciplinary protocol on clinical outcomes in patients with post-stroke dysphagia. We performed retrospective chart reviews of patients with post-stroke dysphagia admitted to the neurological ward of Verona University Hospital from 2004 to 2008. Outcomes after usual treatment for dysphagia (T- group) were compared versus outcomes after treatment under a standardized diagnostic and rehabilitative multidisciplinary protocol (T+ group). Outcome measures were death, pneumonia on X-ray, need for respiratory support, and proportion of patients on tube feeding at discharge. Of the 378 patients admitted with stroke, 84 had dysphagia and were enrolled in the study. A significantly lower risk of in-hospital death (odds ratio [OR] 0.20 [0.53-0.78]), pneumonia (OR 0.33 [0.10-1.03]), need for respiratory support (OR 0.48 [0.14-1.66]), and tube feeding at discharge (OR 0.30 [0.09-0.91]) was recorded for the T+ group (N = 39) as compared to the T- group (N = 45). The adjusted OR showed no difference between the two groups for in-hospital death and tube feeding at discharge. Use of a standardized multidisciplinary protocolized approach to the management of post-stroke dysphagia may significantly reduce rates of aspiration pneumonia, in-hospital mortality, and tube feeding in dysphagic stroke survivors. Consistent with the study's exploratory purposes, our findings suggest that the multidisciplinary protocol applied in this study offers an effective model of management of post-stroke dysphagia.

  18. Ligand efficiency based approach for efficient virtual screening of compound libraries.

    PubMed

    Ke, Yi-Yu; Coumar, Mohane Selvaraj; Shiao, Hui-Yi; Wang, Wen-Chieh; Chen, Chieh-Wen; Song, Jen-Shin; Chen, Chun-Hwa; Lin, Wen-Hsing; Wu, Szu-Huei; Hsu, John T A; Chang, Chung-Ming; Hsieh, Hsing-Pang

    2014-08-18

    Here we report for the first time the use of fit quality (FQ), a ligand efficiency (LE) based measure, for virtual screening (VS) of compound libraries. The LE based VS protocol was used to screen an in-house database of 125,000 compounds to identify aurora kinase A inhibitors. First, 20 known aurora kinase inhibitors were docked to the aurora kinase A crystal structure (PDB ID: 2W1C), and the conformations of the docked ligands were used to create a pharmacophore (PH) model. The PH model was used to screen the database compounds and rank (PH rank) them based on the predicted IC50 values. Next, LE_Scale, a weight-dependent LE function, was derived from 294 known aurora kinase inhibitors. Using the fit quality (FQ = LE/LE_Scale) score derived from the LE_Scale function, the database compounds were reranked (PH_FQ rank) and the top 151 (0.12% of the database) compounds were assessed for aurora kinase A inhibition biochemically. This VS protocol led to the identification of 7 novel hits, with compound 5 showing aurora kinase A IC50 = 1.29 μM. Furthermore, testing of 5 against a panel of 31 kinases revealed that it is selective toward aurora kinases A and B, with <50% inhibition of the other kinases at 10 μM concentration, and is a suitable candidate for further development. Incorporation of the FQ score in the VS protocol not only helped identify a novel aurora kinase inhibitor, 5, but also increased the hit rate of the VS protocol by improving the enrichment factor (EF) for FQ based screening (EF = 828) compared to PH based screening (EF = 237) alone. The LE based VS protocol disclosed here could be applied to other targets for hit identification in an efficient manner. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
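
    The fit-quality measure in this record combines ligand efficiency with a size-dependent reference curve, FQ = LE/LE_Scale. The sketch below assumes the commonly used LE ≈ 1.37·pIC50/HA and a previously published LE_Scale fit (Reynolds et al.); the paper derived its own LE_Scale from 294 aurora kinase inhibitors, so the coefficients and the 30-heavy-atom example here are stand-ins, not the authors' values.

```python
import math

# Hedged sketch of fit quality FQ = LE / LE_Scale. The LE_Scale coefficients
# follow a published heavy-atom-dependent fit (an assumption here); the paper
# fitted its own curve to known aurora kinase inhibitors.

def ligand_efficiency(ic50_molar, heavy_atoms):
    """LE ~ 1.37 * pIC50 / heavy-atom count (kcal/mol per heavy atom)."""
    return 1.37 * (-math.log10(ic50_molar)) / heavy_atoms

def le_scale(heavy_atoms):
    """Size-dependent reference LE curve (assumed coefficients)."""
    ha = heavy_atoms
    return 0.0715 + 7.5328 / ha + 25.7079 / ha**2 - 361.4722 / ha**3

def fit_quality(ic50_molar, heavy_atoms):
    return ligand_efficiency(ic50_molar, heavy_atoms) / le_scale(heavy_atoms)

# Example: a 1.29 uM inhibitor (the paper's compound 5) with an assumed
# heavy-atom count of 30.
print(fit_quality(1.29e-6, 30))
```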

  19. A portal for the ocean biogeographic information system

    USGS Publications Warehouse

    Zhang, Yunqing; Grassle, J. F.

    2002-01-01

    Since its inception in 1999 the Ocean Biogeographic Information System (OBIS) has developed into an international science program as well as a globally distributed network of biogeographic databases. An OBIS portal at Rutgers University provides the links and functional interoperability among member database systems. Protocols and standards have been established to support effective communication between the portal and these functional units. The portal provides distributed data searching, a taxonomy name service, a GIS with access to relevant environmental data, biological modeling, and education modules for mariners, students, environmental managers, and scientists. The portal will integrate Census of Marine Life field projects, national data archives, and other functional modules, and provides for network-wide analyses and modeling tools.

  20. Comparison between TG-51 and TG-21: Calibration of photon and electron beams in water using cylindrical chambers.

    PubMed

    Cho, S H; Lowenstein, J R; Balter, P A; Wells, N H; Hanson, W F

    2000-01-01

    A new calibration protocol, developed by the AAPM Task Group 51 (TG-51) to replace the TG-21 protocol, is based on an absorbed-dose to water standard and calibration factor (N(D,w)), while the TG-21 protocol is based on an exposure (or air-kerma) standard and calibration factor (N(x)). Because of differences between these standards and the two protocols, the results of clinical reference dosimetry based on TG-51 may be somewhat different from those based on TG-21. The Radiological Physics Center has conducted a systematic comparison between the two protocols, in which photon and electron beam outputs following both protocols were compared under identical conditions. Cylindrical chambers used in this study were selected from the list given in the TG-51 report, covering the majority of current manufacturers. Measured ratios between absorbed-dose and air-kerma calibration factors, derived from the standards traceable to the NIST, were compared with calculated values using the TG-21 protocol. The comparison suggests that there is roughly a 1% discrepancy between measured and calculated ratios. This discrepancy may provide a reasonable measure of possible changes between the absorbed-dose to water determined by TG-51 and that determined by TG-21 for photon beam calibrations. The typical change in a 6 MV photon beam calibration following the implementation of the TG-51 protocol was about 1%, regardless of the chamber used, and the change was somewhat smaller for an 18 MV photon beam. On the other hand, the results for 9 and 16 MeV electron beams show larger changes up to 2%, perhaps because of the updated electron stopping power data used for the TG-51 protocol, in addition to the inherent 1% discrepancy presented in the calibration factors. The results also indicate that the changes may be dependent on the electron energy.

  1. Wireless access to a pharmaceutical database: a demonstrator for data driven Wireless Application Protocol (WAP) applications in medical information processing.

    PubMed

    Schacht Hansen, M; Dørup, J

    2001-01-01

    The Wireless Application Protocol technology implemented in newer mobile phones has built-in facilities for handling much of the information processing needed in clinical work. To test a practical approach we ported a relational database of the Danish pharmaceutical catalogue to Wireless Application Protocol using open source freeware at all steps. We used Apache 1.3 web software on a Linux server. Data containing the Danish pharmaceutical catalogue were imported from an ASCII file into a MySQL 3.22.32 database using a Practical Extraction and Report Language script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. A free, open source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open source freeware. The project opens perspectives for a further integration of Wireless Application Protocol phone functions in clinical information processing: Global System for Mobile communication telephony for bilateral communication, asynchronous unilateral communication via e-mail and Short Message Service, built-in calculator, calendar, personal organizer, phone number catalogue and Dictaphone function via answering machine technology. An independent Wireless Application Protocol gateway may be placed within hospital firewalls, which may be an advantage with respect to security. 
However, if Wireless Application Protocol phones are to become effective tools for physicians, special attention must be paid to the limitations of the devices. Input tools of Wireless Application Protocol phones should be improved, for instance by increased use of speech control.
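
    The import-and-lookup flow in the record above (ASCII catalogue imported into a relational database, then brand-name search) can be mimicked in a few lines. The original stack was a Perl import script, MySQL, and a PHP3 front end; the sketch below substitutes Python's built-in sqlite3, with invented table and column names, purely to illustrate the data flow.

```python
import sqlite3

# Hedged stand-in for the record's import-and-lookup flow. Table layout and
# the two catalogue rows are invented; the real system used 35 interrelated
# MySQL tables fed from an ASCII file.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE drug (brand TEXT PRIMARY KEY, substance TEXT, info TEXT)")

# Stand-in for the ASCII catalogue import step.
catalogue = [
    ("Panodil", "paracetamol", "analgesic/antipyretic"),
    ("Ibumetin", "ibuprofen", "NSAID"),
]
conn.executemany("INSERT INTO drug VALUES (?, ?, ?)", catalogue)

def lookup(brand):
    """Mimic the 'search for a brand name' access path."""
    return conn.execute(
        "SELECT substance, info FROM drug WHERE brand = ?", (brand,)
    ).fetchone()

print(lookup("Panodil"))  # -> ('paracetamol', 'analgesic/antipyretic')
```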

  2. Wireless access to a pharmaceutical database: A demonstrator for data driven Wireless Application Protocol applications in medical information processing

    PubMed Central

    Hansen, Michael Schacht

    2001-01-01

    Background The Wireless Application Protocol technology implemented in newer mobile phones has built-in facilities for handling much of the information processing needed in clinical work. Objectives To test a practical approach we ported a relational database of the Danish pharmaceutical catalogue to Wireless Application Protocol using open source freeware at all steps. Methods We used Apache 1.3 web software on a Linux server. Data containing the Danish pharmaceutical catalogue were imported from an ASCII file into a MySQL 3.22.32 database using a Practical Extraction and Report Language script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. Results A free, open source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. Conclusions We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open source freeware. The project opens perspectives for a further integration of Wireless Application Protocol phone functions in clinical information processing: Global System for Mobile communication telephony for bilateral communication, asynchronous unilateral communication via e-mail and Short Message Service, built-in calculator, calendar, personal organizer, phone number catalogue and Dictaphone function via answering machine technology. 
An independent Wireless Application Protocol gateway may be placed within hospital firewalls, which may be an advantage with respect to security. However, if Wireless Application Protocol phones are to become effective tools for physicians, special attention must be paid to the limitations of the devices. Input tools of Wireless Application Protocol phones should be improved, for instance by increased use of speech control. PMID:11720946
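The data-driven design described in this record (a relational catalogue queried by brand name, rendered server-side) can be sketched in a few lines. This is a minimal illustration only, not the study's actual implementation: the real catalogue used 35 interrelated MySQL tables queried from PHP3, and every table, column and drug name below is hypothetical.

```python
import sqlite3

# Toy two-table stand-in for the catalogue; schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE drug (id INTEGER PRIMARY KEY, brand_name TEXT,
                       therapeutic_group TEXT);
    CREATE TABLE substance (drug_id INTEGER, name TEXT);
    INSERT INTO drug VALUES (1, 'ExampleCillin', 'Antibiotics');
    INSERT INTO substance VALUES (1, 'amoxicillin');
""")

def find_brand(prefix):
    """Search by brand-name prefix, one of the two access paths the study
    describes (the other being browsing by therapeutic group)."""
    cur = conn.execute(
        "SELECT d.brand_name, s.name FROM drug d "
        "JOIN substance s ON s.drug_id = d.id "
        "WHERE d.brand_name LIKE ?", (prefix + "%",))
    return cur.fetchall()

print(find_brand("Example"))  # [('ExampleCillin', 'amoxicillin')]
```

The server-side lookup is the whole trick: the phone only ever receives a small rendered card, so the limited device never touches the database directly.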

  3. Enabling comparative modeling of closely related genomes: Example genus Brucella

    DOE PAGES

    Faria, José P.; Edirisinghe, Janaka N.; Davis, James J.; ...

    2014-03-08

For many scientific applications, it is highly desirable to be able to compare metabolic models of closely related genomes. In this study, we attempt to raise awareness of the fact that taking annotated genomes from public repositories and using them for metabolic model reconstructions is far from trivial due to annotation inconsistencies. We propose a protocol for comparative analysis of metabolic models of closely related genomes, using fifteen strains of the genus Brucella, which contains pathogens of both humans and livestock. This study led to the identification and subsequent correction of inconsistent annotations in the SEED database, as well as the identification of 31 biochemical reactions that are common to Brucella but are not originally identified by automated metabolic reconstructions. We are currently implementing this protocol for improving automated annotations within the SEED database, and these improvements have been propagated into PATRIC, Model-SEED, KBase and RAST. This method is an enabling step for the future creation of consistent annotation systems and high-quality model reconstructions that will support the prediction of accurate phenotypes such as pathogenicity, media requirements or type of respiration.

  5. Montreal Archive of Sleep Studies: an open-access resource for instrument benchmarking and exploratory research.

    PubMed

    O'Reilly, Christian; Gosselin, Nadia; Carrier, Julie; Nielsen, Tore

    2014-12-01

Manual processing of sleep recordings is extremely time-consuming. Efforts to automate this process have shown promising results, but automatic systems are generally evaluated on private databases, not allowing accurate cross-validation with other systems. Lacking a common benchmark, the relative performances of different systems cannot easily be compared and advances are compromised. To address this fundamental methodological impediment to sleep study, we propose an open-access database of polysomnographic biosignals. To build this database, whole-night recordings from 200 participants [97 males (aged 42.9 ± 19.8 years) and 103 females (aged 38.3 ± 18.9 years); age range: 18-76 years] were pooled from eight different research protocols performed in three different hospital-based sleep laboratories. All recordings feature a sampling frequency of 256 Hz and an electroencephalography (EEG) montage of 4-20 channels plus standard electro-oculography (EOG), electromyography (EMG), electrocardiography (ECG) and respiratory signals. Access to the database can be obtained through the Montreal Archive of Sleep Studies (MASS) website (http://www.ceams-carsm.ca/en/MASS), and requires only affiliation with a research institution and prior approval by the applicant's local ethical review board. Providing the research community with access to this free and open sleep database is expected to facilitate the development and cross-validation of sleep analysis automation systems. It is also expected that such a shared resource will be a catalyst for cross-centre collaborations on difficult topics such as improving inter-rater agreement on sleep stage scoring. © 2014 European Sleep Research Society.

  6. LANES - LOCAL AREA NETWORK EXTENSIBLE SIMULATOR

    NASA Technical Reports Server (NTRS)

    Gibson, J.

    1994-01-01

The Local Area Network Extensible Simulator (LANES) provides a method for simulating the performance of high speed local area network (LAN) technology. LANES was developed as a design and analysis tool for networking on board the Space Station. The load, network, link and physical layers of a layered network architecture are all modeled. LANES models two different lower-layer protocols: the Fiber Distributed Data Interface (FDDI) and Star*Bus. The load and network layers are included in the model as a means of introducing upper-layer processing delays associated with message transmission; they do not model any particular protocols. FDDI is an American National Standard and an International Organization for Standardization (ISO) draft standard for a 100 megabit-per-second fiber-optic token ring. Specifications for the LANES model of FDDI are taken from the Draft Proposed American National Standard FDDI Token Ring Media Access Control (MAC), document number X3T9.5/83-16 Rev. 10, February 28, 1986. This is a mature document describing the FDDI media-access-control protocol. Star*Bus, also known as the Fiber Optic Demonstration System, is a protocol for a 100 megabit-per-second fiber-optic star-topology LAN. This protocol, along with a hardware prototype, was developed by Sperry Corporation under contract to NASA Goddard Space Flight Center as a candidate LAN protocol for the Space Station. LANES can be used to analyze performance of a networking system based on either FDDI or Star*Bus under a variety of loading conditions. Delays due to upper-layer processing can easily be nullified, allowing analysis of FDDI or Star*Bus as stand-alone protocols. LANES is a parameter-driven simulation; it provides considerable flexibility in specifying both protocol and run-time parameters. Code has been optimized for fast execution and detailed tracing facilities have been included. LANES was written in FORTRAN 77 for implementation on a DEC VAX under VMS 4.6.
It consists of two programs, a simulation program and a user-interface program. The simulation program requires the SLAM II simulation library from Pritsker and Associates, W. Lafayette IN; the user interface is implemented using the Ingres database manager from Relational Technology, Inc. Information about running the simulation program without the user-interface program is contained in the documentation. The memory requirement is 129,024 bytes. LANES was developed in 1988.

  7. The "ComPAS Trial" combined treatment model for acute malnutrition: study protocol for the economic evaluation.

    PubMed

    Lelijveld, Natasha; Bailey, Jeanette; Mayberry, Amy; Trenouth, Lani; N'Diaye, Dieynaba S; Haghparast-Bidgoli, Hassan; Puett, Chloe

    2018-04-24

Acute malnutrition is currently divided into severe (SAM) and moderate (MAM) based on level of wasting. SAM and MAM currently have separate treatment protocols and products, managed by separate international agencies. For SAM, the dose of treatment is allocated by the child's weight. A combined and simplified protocol for SAM and MAM, with a standardised dose of ready-to-use therapeutic food (RUTF), is being trialled for non-inferior recovery rates and may be more cost-effective than the current standard protocols for treating SAM and MAM. This is the protocol for the economic evaluation of the ComPAS trial, a cluster-randomised controlled, non-inferiority trial that compares a novel combined protocol for treating uncomplicated acute malnutrition with the current standard protocol in South Sudan and Kenya. We will calculate the total economic costs of both protocols from a societal perspective, using accounting data, interviews and survey questionnaires. The incremental cost of implementing the combined protocol will be estimated, and all costs and outcomes will be presented as a cost-consequence analysis. Incremental cost-effectiveness ratios will be calculated for the primary and secondary outcomes, if statistically significant. We hypothesise that implementing the combined protocol will be cost-effective due to streamlined logistics at clinic level, reduced length of treatment, especially for MAM, and reduced dosages of RUTF. The findings of this economic evaluation will be important for policymakers, especially given the hypothesised non-inferiority of the main health outcomes. The publication of this protocol aims to improve rigour of conduct and transparency of data collection and analysis. It is also intended to promote inclusion of economic evaluation in other nutrition intervention studies, especially for MAM, and improve comparability with other studies. ISRCTN 30393230, date: 16/03/2017.

  8. Practical quantum private query with better performance in resisting joint-measurement attack

    NASA Astrophysics Data System (ADS)

    Wei, Chun-Yan; Wang, Tian-Yin; Gao, Fei

    2016-04-01

As a kind of practical protocol, quantum-key-distribution (QKD)-based quantum private queries (QPQs) have drawn lots of attention. However, joint-measurement (JM) attack poses a noticeable threat to the database security in such protocols. That is, by JM attack a malicious user can illegally elicit many more items from the database than the average amount an honest one can obtain. Taking Jacobi et al.'s protocol as an example, by JM attack a malicious user can obtain as many as 500 bits, instead of the expected 2.44 bits, from a 10^4-bit database in one query. It is a noticeable security flaw in theory, and would also arise in application with the development of quantum memories. To solve this problem, we propose a QPQ protocol based on a two-way QKD scheme, which behaves much better in resisting JM attack. Concretely, the user Alice cannot get more database items by conducting JM attack on the qubits because she has to send them back to Bob (the database holder) before knowing which of them should be jointly measured. Furthermore, JM attack by both Alice and Bob would be detected with certain probability, which is quite different from previous protocols. Moreover, our protocol retains the good characteristics of QKD-based QPQs, e.g., it is loss tolerant and robust against quantum memory attack.
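The 2.44-bit figure quoted in this abstract can be checked with a back-of-envelope calculation. The assumptions below (each raw-key bit conclusively known with probability 1/4; the raw key cut into k substrings that are bitwise XORed, so a final-key bit is known only when all k underlying raw bits are) follow the usual description of Jacobi et al.'s post-processing, and k = 6 is chosen here because it reproduces the quoted expectation.

```python
# Expected conclusively-known bits for an honest user in a QKD-based QPQ.
# Parameters are assumptions matching the figures quoted in the abstract.
N = 10_000      # database size in bits (10^4)
p_raw = 1 / 4   # probability a raw-key bit is conclusively known
k = 6           # substrings XORed; a final bit needs all k raw bits known

expected_known = N * p_raw ** k
print(f"{expected_known:.2f}")  # 2.44
```

A JM attacker evades exactly this dilution: by measuring the k carriers jointly instead of independently, the known-bit probability no longer shrinks as p_raw**k, which is how 500 bits instead of 2.44 become obtainable.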

  9. Exosome-like vesicles in uterine aspirates: a comparison of ultracentrifugation-based isolation protocols.

    PubMed

    Campoy, Irene; Lanau, Lucia; Altadill, Tatiana; Sequeiros, Tamara; Cabrera, Silvia; Cubo-Abert, Montserrat; Pérez-Benavente, Assumpción; Garcia, Angel; Borrós, Salvador; Santamaria, Anna; Ponce, Jordi; Matias-Guiu, Xavier; Reventós, Jaume; Gil-Moreno, Antonio; Rigau, Marina; Colas, Eva

    2016-06-18

Uterine aspirates are used in the diagnostic process of endometrial disorders, yet further applications could emerge if their complex milieu were simplified. Exosome-like vesicles isolated from uterine aspirates could become an attractive source of biomarkers, but there is a need to standardize isolation protocols. The objective of the study was to determine whether exosome-like vesicles exist in the fluid fraction of uterine aspirates and to compare protocols for their isolation, characterization, and analysis. We collected uterine aspirates from 39 pre-menopausal women suffering from benign gynecological diseases. The fluid fractions of 27 of those aspirates were pooled and split into equal volumes to evaluate three differential centrifugation-based procedures: (1) a standard protocol, (2) a filtration protocol, and (3) a sucrose cushion protocol. Characterization of isolated vesicles was assessed by electron microscopy, nanoparticle tracking analysis and immunoblot. Specifically for RNA material, we evaluated the effect of sonication and RNase A treatment at different steps of the protocol. We finally confirmed the efficiency of the selected methods in non-pooled samples. All protocols were useful to isolate exosome-like vesicles. However, the Standard procedure was the best performing protocol to isolate exosome-like vesicles from uterine aspirates: nanoparticle tracking analysis revealed a higher concentration of vesicles with a mode of 135 ± 5 nm, and immunoblot showed a higher expression of exosome-related markers (CD9, CD63, and CD81), thus verifying an enrichment in this type of vesicles. RNA contained in exosome-like vesicles was successfully extracted without sonication, with exogenous nucleic acids digested by RNase A, allowing the analysis of the specific inner cargo by Real-Time qPCR. We confirmed the existence of exosome-like vesicles in the fluid fraction of uterine aspirates.
They were successfully isolated by differential centrifugation, giving sufficient proteomic and transcriptomic material for further analyses. The Standard protocol was the best performing procedure, since the other two tested protocols improved neither the yield nor the purity of exosome-like vesicles. This study contributes to establishing the basis for future comparative studies to foster the field of biomarker research in gynecology.

  10. Who needs inpatient detox? Development and implementation of a hospitalist protocol for the evaluation of patients for alcohol detoxification.

    PubMed

    Stephens, John R; Liles, E Allen; Dancel, Ria; Gilchrist, Michael; Kirsch, Jonathan; DeWalt, Darren A

    2014-04-01

Clinicians caring for patients seeking alcohol detoxification face many challenges, including a lack of evidence-based guidelines for treatment and high recidivism rates. To develop a standardized protocol for determining which alcohol-dependent patients seeking detoxification need inpatient versus outpatient treatment, and to study the protocol's implementation. Review of best evidence by an ad hoc task force and subsequent creation of a standardized protocol. Prospective observational evaluation of initial protocol implementation. Patients presenting for alcohol detoxification. Development and implementation of a protocol for evaluation and treatment of patients requesting alcohol detoxification. Number of admissions per month with a primary alcohol-related diagnosis (DRG), 30-day readmission rate, and length of stay, all measured before and after protocol implementation. We identified one randomized clinical trial and three cohort studies to inform the choice of inpatient versus outpatient detoxification, along with one prior protocol in this population, and combined that data with clinical experience to create an institutional protocol. After implementation, the average number of alcohol-related admissions was 15.9 per month, compared with 18.9 per month before implementation (p = 0.037). There was no difference in readmission rate or length of stay. Creation and utilization of a protocol led to standardization of care for patients requesting detoxification from alcohol. Initial evaluation of protocol implementation showed a decrease in number of admissions.

  11. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA).

    PubMed

    Muharemovic, O; Troelsen, A; Thomsen, M G; Kallemose, T; Gosvig, K K

    2018-05-01

Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time and the number of exposure repetitions. Forty patients undergoing primary total hip arthroplasty were equally randomized to either a case or a control group. Radiographers in the case group were assisted by personalized patient protocols containing information about each patient's post-operative RSA imaging. Radiographers in the control group used a standard RSA protocol. At three months, radiographers in the case group significantly reduced (p < 0.001) the number of exposures by 1.6, examination time by 19.2 min, and the distance between the centrum of the prosthesis and the centrum of the calibration field by 34.1 mm when compared to post-operative (baseline) results. At twelve months, the case group significantly reduced (p < 0.001) the number of exposures by two, examination time by 22.5 min, and the centrum of prosthesis to centrum of calibration field distance by 43.1 mm when compared to baseline results. No significant improvements were found in the control group at any time point. There is strong evidence that personalized RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personal patient protocols as an RSA standard will contribute to the reduction of examination time, thus ensuring a cost benefit for the department and patient safety. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.

  12. Summary Report Panel 1: The Need for Protocols and Standards in Research on Underwater Noise Impacts on Marine Life.

    PubMed

    Erbe, Christine; Ainslie, Michael A; de Jong, Christ A F; Racca, Roberto; Stocker, Michael

    2016-01-01

    As concern about anthropogenic noise and its impacts on marine fauna is increasing around the globe, data are being compared across populations, species, noise sources, geographic regions, and time. However, much of the raw and processed data are not comparable due to differences in measurement methodology, analysis and reporting, and a lack of metadata. Common protocols and more formal, international standards are needed to ensure the effectiveness of research, conservation, regulation and practice, and unambiguous communication of information and ideas. Developing standards takes time and effort, is largely driven by a few expert volunteers, and would benefit from stakeholders' contribution and support.

  13. Resident choice and the survey process: the need for standardized observation and transparency.

    PubMed

    Schnelle, John F; Bertrand, Rosanna; Hurd, Donna; White, Alan; Squires, David; Feuerberg, Marvin; Hickey, Kelly; Simmons, Sandra F

    2009-08-01

    To describe a standardized observation protocol to determine if nursing home (NH) staff offer choice to residents during 3 morning activities of daily living (ADL) and compare the observational data with deficiency statements cited by state survey staff. Morning ADL care was observed in 20 NHs in 5 states by research staff using a standardized observation protocol. The number of observations in which choice was not offered was documented for 3 morning ADL care activities and compared with deficiency statements made by surveyors. Staff failed to offer choice during morning ADL care delivery for at least 1 of 3 ADL care activities in all 20 NHs. Observational data showed residents were not offered choice about when to get out of bed (11%), what to wear (25%), and breakfast dining location (39%). In comparison, survey staff issued only 2 deficiencies in all 20 NHs relevant to choice in the targeted ADL care activities, and neither deficiency was based on observational data. Survey interpretative guidelines instruct surveyors to observe if residents are offered choice during daily care provision, but standardized observation protocols are not provided to surveyors to make this determination. The use of a standardized observation protocol in the survey process similar to that used by research staff in this study would improve the accuracy and transparency of the survey process.

  14. Effect of ultra-low doses, ASIR and MBIR on density and noise levels of MDCT images of dental implant sites.

    PubMed

    Widmann, Gerlig; Al-Shawaf, Reema; Schullian, Peter; Al-Sadhan, Ra'ed; Hörmann, Romed; Al-Ekrish, Asma'a A

    2017-05-01

Differences in noise and density values in MDCT images obtained using ultra-low doses with FBP, ASIR, and MBIR may possibly affect implant site density analysis. The aim of this study was to compare density and noise measurements recorded from dental implant sites using ultra-low doses combined with FBP, ASIR, and MBIR. Cadavers were scanned using a standard protocol and four low-dose protocols. Scans were reconstructed using FBP, ASIR-50, ASIR-100, and MBIR, and either a bone or standard reconstruction kernel. Density (mean Hounsfield units [HUs]) of alveolar bone and noise levels (mean standard deviation of HUs) were recorded from all datasets, and measurements were compared by paired t-tests and two-way ANOVA with repeated measures. Significant differences in density and noise were found between the reference dose/FBP protocol and almost all test combinations. Maximum mean differences in HU were 178.35 (bone kernel) and 273.74 (standard kernel), and in noise, were 243.73 (bone kernel) and 153.88 (standard kernel). Decreasing radiation dose increased density and noise regardless of reconstruction technique and kernel. The effect of reconstruction technique on density and noise depends on the reconstruction kernel used. • Ultra-low-dose MDCT protocols allowed more than 90 % reductions in dose. • Decreasing the dose generally increased density and noise. • Effect of IRT on density and noise varies with reconstruction kernel. • Accuracy of low-dose protocols for interpretation of bony anatomy not known. • Effect of low doses on accuracy of computer-aided design models unknown.

  15. Ontology-based geospatial data query and integration

    USGS Publications Warehouse

    Zhao, T.; Zhang, C.; Wei, M.; Peng, Z.-R.

    2008-01-01

Geospatial data sharing is an increasingly important subject as large amounts of data are produced by a variety of sources, stored in incompatible formats, and accessible through different GIS applications. Past efforts to enable sharing have produced standardized data formats such as GML and data access protocols such as Web Feature Service (WFS). While these standards help enable client applications to gain access to heterogeneous data stored in different formats from diverse sources, the usability of the access is limited due to the lack of data semantics encoded in the WFS feature types. Past research has used ontology languages to describe the semantics of geospatial data, but ontology-based queries cannot be applied directly to legacy data stored in databases or shapefiles, or to feature data in WFS services. This paper presents a method to enable ontology queries on spatial data available from WFS services and on data stored in databases. We do not create ontology instances explicitly and thus avoid the problems of data replication. Instead, user queries are rewritten to WFS getFeature requests and SQL queries to databases. The method also has the benefit of being able to utilize existing tools of databases, WFS, and GML while enabling queries based on ontology semantics. © 2008 Springer-Verlag Berlin Heidelberg.
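The rewriting approach described in this abstract can be sketched as a simple mapping from ontology terms to relational tables and columns; the mapping structure and every name below are illustrative assumptions, not the paper's actual implementation (which also targeted WFS getFeature requests and GML).

```python
# Minimal sketch of ontology-to-SQL query rewriting: instead of
# materializing ontology instances, a semantic query is translated into
# plain SQL via a class/property-to-table/column mapping. All names here
# are invented for illustration.
MAPPING = {
    "geo:Road": {
        "table": "roads",
        "props": {"geo:name": "road_name", "geo:lanes": "lane_count"},
    },
}

def rewrite(onto_class, prop, value):
    """Rewrite 'find onto_class where prop = value' as parameterized SQL."""
    m = MAPPING[onto_class]
    return (f"SELECT * FROM {m['table']} WHERE {m['props'][prop]} = ?",
            (value,))

sql, params = rewrite("geo:Road", "geo:lanes", 4)
print(sql)     # SELECT * FROM roads WHERE lane_count = ?
print(params)  # (4,)
```

Because the data stay in place and only the query is translated, the legacy database (or WFS service) keeps serving as-is, which is the replication-avoidance point the abstract makes.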

  16. Implementation of the Spanish ERAS program in bariatric surgery.

    PubMed

    Ruiz-Tovar, Jaime; Muñoz, José Luis; Royo, Pablo; Duran, Manuel; Redondo, Elisabeth; Ramirez, Jose Manuel

    2018-03-08

The essence of Enhanced Recovery After Surgery (ERAS) programs is the multimodal approach, and many authors have demonstrated the safety and feasibility of fast-track bariatric surgery. According to this concept, a multidisciplinary ERAS program for bariatric surgery has been developed by the Spanish Fast Track Group (ERAS Spain). The aim of this study was to analyze the initial implementation of this Spanish National ERAS protocol in bariatric surgery, comparing it with a historical cohort receiving standard care. A multicentre prospective study was performed, including 233 consecutive patients undergoing bariatric surgery during 2015 following the ERAS protocol. It was compared with a historical cohort of 286 patients, who underwent bariatric surgery at the same institutions between 2013 and 2014 following standard care. Compliance with the protocol, morbidity, mortality, hospital stay and readmission were evaluated. Bariatric techniques performed were Roux-en-Y gastric bypass and sleeve gastrectomy. There were no significant differences in complications, mortality and readmission. Postoperative pain and hospital stay were significantly lower in the ERAS group. Overall compliance with the protocol was 80%. The Spanish National ERAS protocol is safe, obtaining results similar to standard care in terms of complications, reoperations, mortality and readmissions. It is associated with less postoperative pain and earlier hospital discharge.

  17. Deep COI sequencing of standardized benthic samples unveils overlooked diversity of Jordanian coral reefs in the northern Red Sea.

    PubMed

    Al-Rshaidat, Mamoon M D; Snider, Allison; Rosebraugh, Sydney; Devine, Amanda M; Devine, Thomas D; Plaisance, Laetitia; Knowlton, Nancy; Leray, Matthieu

    2016-09-01

    High-throughput sequencing (HTS) of DNA barcodes (metabarcoding), particularly when combined with standardized sampling protocols, is one of the most promising approaches for censusing overlooked cryptic invertebrate communities. We present biodiversity estimates based on sequencing of the cytochrome c oxidase subunit 1 (COI) gene for coral reefs of the Gulf of Aqaba, a semi-enclosed system in the northern Red Sea. Samples were obtained from standardized sampling devices (Autonomous Reef Monitoring Structures (ARMS)) deployed for 18 months. DNA barcoding of non-sessile specimens >2 mm revealed 83 OTUs in six phyla, of which only 25% matched a reference sequence in public databases. Metabarcoding of the 2 mm - 500 μm and sessile bulk fractions revealed 1197 OTUs in 15 animal phyla, of which only 4.9% matched reference barcodes. These results highlight the scarcity of COI data for cryptobenthic organisms of the Red Sea. Compared with data obtained using similar methods, our results suggest that Gulf of Aqaba reefs are less diverse than two Pacific coral reefs but much more diverse than an Atlantic oyster reef at a similar latitude. The standardized approaches used here show promise for establishing baseline data on biodiversity, monitoring the impacts of environmental change, and quantifying patterns of diversity at regional and global scales.

  18. Time Trends of Period Prevalence Rates of Patients with Inhaled Long-Acting Beta-2-Agonists-Containing Prescriptions: A European Comparative Database Study

    PubMed Central

    Rottenkolber, Marietta; Voogd, Eef; van Dijk, Liset; Primatesta, Paola; Becker, Claudia; Schlienger, Raymond; de Groot, Mark C. H.; Alvarez, Yolanda; Durand, Julie; Slattery, Jim; Afonso, Ana; Requena, Gema; Gil, Miguel; Alvarez, Arturo; Hesse, Ulrik; Gerlach, Roman; Hasford, Joerg; Fischer, Rainald; Klungel, Olaf H.; Schmiedl, Sven

    2015-01-01

    Background Inhaled, long-acting beta-2-adrenoceptor agonists (LABA) have well-established roles in asthma and/or COPD treatment. Drug utilisation patterns for LABA have been described, but few studies have directly compared LABA use in different countries. We aimed to compare the prevalence of LABA-containing prescriptions in five European countries using a standardised methodology. Methods A common study protocol was applied to seven European healthcare record databases (Denmark, Germany, Spain, the Netherlands (2), and the UK (2)) to calculate crude and age- and sex-standardised annual period prevalence rates (PPRs) of LABA-containing prescriptions from 2002–2009. Annual PPRs were stratified by sex, age, and indication (asthma, COPD, asthma and COPD). Results From 2002–2009, age- and sex-standardised PPRs of patients with LABA-containing medications increased in all databases (58.2%–185.1%). Highest PPRs were found in men ≥ 80 years old and women 70–79 years old. Regarding the three indications, the highest age- and sex-standardised PPRs in all databases were found in patients with “asthma and COPD” but with large inter-country variation. In those with asthma or COPD, lower PPRs and smaller inter-country variations were found. For all three indications, PPRs for LABA-containing prescriptions increased with age. Conclusions Using a standardised protocol that allowed direct inter-country comparisons, we found highest rates of LABA-containing prescriptions in elderly patients and distinct differences in the increased utilisation of LABA-containing prescriptions within the study period throughout the five European countries. PMID:25706152
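Direct age- and sex-standardization of a period prevalence rate, as used in this record's common protocol, is a weighted sum of stratum-specific rates with weights taken from a shared standard population. The strata, counts and weights below are toy numbers for illustration only, not data from the study.

```python
# Illustrative direct standardization of an annual period prevalence rate
# (PPR). Each stratum is (patients_with_LABA_prescription,
# persons_in_stratum, weight_in_standard_population); all values invented.
strata = [
    (30, 10_000, 0.55),  # e.g. women aged 70-79 (hypothetical)
    (50, 8_000, 0.45),   # e.g. men aged >= 80 (hypothetical)
]

def standardized_ppr(strata, per=1000):
    """Age/sex-standardized period prevalence rate per `per` persons:
    weighted sum of stratum-specific rates cases/n."""
    return per * sum(w * cases / n for cases, n, w in strata)

print(standardized_ppr(strata))  # ~4.46 per 1000 persons
```

Weighting every database's strata against the same standard population is what makes the resulting PPRs directly comparable across countries, which is the methodological point of the study.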

  19. Identification of genus Acinetobacter: Standardization of in-house PCR and its comparison with conventional phenotypic methods.

    PubMed

    Kulkarni, Sughosh S; Madalgi, Radhika; Ajantha, Ganavalli S; Kulkarni, Raghavendra D

    2017-01-01

Acinetobacter is grouped under nonfermenting Gram-negative bacilli. It is increasingly isolated from pathological samples. The ability of this genus to acquire drug resistance and spread in the hospital settings is posing a grave problem in healthcare. Specific treatment protocols are advocated for Acinetobacter infections. Hence, rapid identification and drug susceptibility profiling are critical in the management of these infections. To standardize an in-house polymerase chain reaction (PCR) for identification of genus Acinetobacter and to compare PCR with two protocols for its phenotypic identification. A total of 96 clinical isolates of Acinetobacter were included in the study. An in-house PCR for genus level identification of Acinetobacter was standardized. All the isolates were phenotypically identified by two protocols. The results of PCR and phenotypic identification protocols were compared. The in-house PCR standardized was highly sensitive and specific for the genus Acinetobacter. There was 100% agreement between the phenotypic and molecular identification of the genus. The preliminary identification tests routinely used in clinical laboratories were also in complete agreement with phenotypic and molecular identification. The in-house PCR for genus level identification is specific and sensitive. However, it may not be essential for routine identification as the preliminary phenotypic identification tests used in the clinical laboratory reliably identify the genus Acinetobacter.

  20. Improving preterm infant outcomes: implementing an evidence-based oral feeding advancement protocol in the neonatal intensive care unit.

    PubMed

    Kish, Mary Z

    2014-10-01

The ability of a preterm infant to exclusively oral feed is a necessary standard for discharge readiness from the neonatal intensive care unit (NICU). Many of the interventions related to oral feeding advancement currently employed for preterm infants in the NICU are based on individual nursing observations and judgment. Studies involving standardized feeding protocols for oral feeding advancement have been shown to decrease variability in feeding practices, facilitate shortened transition times from gavage to oral feedings, improve bottle feeding performance, and significantly decrease the length of stay (LOS) in the NICU. This project critically evaluated the implementation of an oral feeding advancement protocol in a 74-bed level III NICU in an attempt to standardize the process of advancing oral feedings in medically stable preterm infants. A comprehensive review of the literature identified key features for successful oral feeding in preterm infants. Strong levels of evidence suggested an association between both nonnutritive sucking (NNS) opportunities and standardized feeding advancement protocols with successful oral feeding in preterm infants. These findings prompted a pilot practice change using a feeding advancement protocol consisting of NNS and standardized oral feeding advancement opportunities. Time to exclusive oral feedings and LOS were compared pre- and post-protocol implementation over a more than 2-month evaluation period. Infants using NNS and the standardized oral feeding advancement protocol had an observed reduction in time to exclusive oral feedings and LOS, although statistical significance was not achieved.

  1. The Icelandic 16-electrode electrohysterogram database

    PubMed Central

    Alexandersson, Asgeir; Steingrimsdottir, Thora; Terrien, Jeremy; Marque, Catherine; Karlsson, Brynjar

    2015-01-01

    External recordings of the electrohysterogram (EHG) can provide new knowledge on uterine electrical activity associated with contractions. Better understanding of the mechanisms underlying labor can contribute to preventing preterm birth, which is the main cause of mortality and morbidity in newborns. Promising results using the EHG for labor prediction and other uses in obstetric care are the drivers of this work. This paper presents a database of 122 4-by-4 electrode EHG recordings performed on 45 pregnant women using a standardized recording protocol and a placement guide system. The recordings were performed in Iceland between 2008 and 2010. Of the 45 participants, 32 were measured repeatedly during the same pregnancy and participated in two to seven recordings. Recordings were performed in the third trimester (112 recordings) and during labor (10 recordings). The database includes simultaneously recorded tocographs, annotations of events and obstetric information on participants. The publication of this database enables independent and novel analysis of multi-electrode EHG by researchers in the field and, hopefully, development towards new life-saving technology. PMID:25984349

  2. The Biomolecular Interaction Network Database and related tools 2005 update

    PubMed Central

    Alfarano, C.; Andrade, C. E.; Anthony, K.; Bahroos, N.; Bajec, M.; Bantoft, K.; Betel, D.; Bobechko, B.; Boutilier, K.; Burgess, E.; Buzadzija, K.; Cavero, R.; D'Abreo, C.; Donaldson, I.; Dorairajoo, D.; Dumontier, M. J.; Dumontier, M. R.; Earles, V.; Farrall, R.; Feldman, H.; Garderman, E.; Gong, Y.; Gonzaga, R.; Grytsan, V.; Gryz, E.; Gu, V.; Haldorsen, E.; Halupa, A.; Haw, R.; Hrvojic, A.; Hurrell, L.; Isserlin, R.; Jack, F.; Juma, F.; Khan, A.; Kon, T.; Konopinsky, S.; Le, V.; Lee, E.; Ling, S.; Magidin, M.; Moniakis, J.; Montojo, J.; Moore, S.; Muskat, B.; Ng, I.; Paraiso, J. P.; Parker, B.; Pintilie, G.; Pirone, R.; Salama, J. J.; Sgro, S.; Shan, T.; Shu, Y.; Siew, J.; Skinner, D.; Snyder, K.; Stasiuk, R.; Strumpf, D.; Tuekam, B.; Tao, S.; Wang, Z.; White, M.; Willis, R.; Wolting, C.; Wong, S.; Wrong, A.; Xin, C.; Yao, R.; Yates, B.; Zhang, S.; Zheng, K.; Pawson, T.; Ouellette, B. F. F.; Hogue, C. W. V.

    2005-01-01

    The Biomolecular Interaction Network Database (BIND) (http://bind.ca) archives biomolecular interaction, reaction, complex and pathway information. Our aim is to curate the details about molecular interactions that arise from published experimental research and to provide this information, as well as tools to enable data analysis, freely to researchers worldwide. BIND data are curated into a comprehensive machine-readable archive of computable information that provides users with methods to discover interactions and molecular mechanisms. BIND has worked to develop new methods for visualization that amplify the underlying annotation of genes and proteins to facilitate the study of molecular interaction networks. BIND has maintained an open database policy since its inception in 1999. Data growth has proceeded at a tremendous rate, now approaching 100 000 records. New services provided include a new BIND Query and Submission interface, a Simple Object Access Protocol (SOAP) service and the Small Molecule Interaction Database (http://smid.blueprint.org), which allows users to determine probable small molecule binding sites of new sequences and examine conserved binding residues. PMID:15608229

  3. The Icelandic 16-electrode electrohysterogram database

    NASA Astrophysics Data System (ADS)

    Alexandersson, Asgeir; Steingrimsdottir, Thora; Terrien, Jeremy; Marque, Catherine; Karlsson, Brynjar

    2015-04-01

    External recordings of the electrohysterogram (EHG) can provide new knowledge on uterine electrical activity associated with contractions. Better understanding of the mechanisms underlying labor can contribute to preventing preterm birth, which is the main cause of mortality and morbidity in newborns. Promising results using the EHG for labor prediction and other uses in obstetric care are the drivers of this work. This paper presents a database of 122 4-by-4 electrode EHG recordings performed on 45 pregnant women using a standardized recording protocol and a placement guide system. The recordings were performed in Iceland between 2008 and 2010. Of the 45 participants, 32 were measured repeatedly during the same pregnancy and participated in two to seven recordings. Recordings were performed in the third trimester (112 recordings) and during labor (10 recordings). The database includes simultaneously recorded tocographs, annotations of events and obstetric information on participants. The publication of this database enables independent and novel analysis of multi-electrode EHG by researchers in the field and, hopefully, development towards new life-saving technology.

  4. Self-collected versus clinician-collected sampling for sexually transmitted infections: a systematic review and meta-analysis protocol.

    PubMed

    Taylor, Darlene; Lunny, Carole; Wong, Tom; Gilbert, Mark; Li, Neville; Lester, Richard; Krajden, Mel; Hoang, Linda; Ogilvie, Gina

    2013-10-10

    Three meta-analyses and one systematic review have been conducted on the question of whether self-collected specimens are as accurate as clinician-collected specimens for STI screening. However, these reviews predate 2007 and did not analyze rectal or pharyngeal collection sites. Currently, there is no consensus on which sampling method is the most effective for the diagnosis of genital chlamydia (CT), gonorrhea (GC) or human papillomavirus (HPV) infection. Our meta-analysis aims to be comprehensive in that it will examine the evidence on whether self-collected vaginal, urine, pharyngeal and rectal specimens provide as accurate a clinical diagnosis as clinician-collected samples (the reference standard). Eligible studies include both randomized and non-randomized controlled trials, pre- and post-test designs, and controlled observational studies. The databases that will be searched include the Cochrane Database of Systematic Reviews, Web of Science, Database of Abstracts of Reviews of Effects (DARE), EMBASE and PubMed/Medline. Data will be abstracted independently by two reviewers using a standardized pre-tested data abstraction form. Heterogeneity will be assessed using the Cochran Q test. Sensitivity and specificity estimates with 95% confidence intervals as well as negative and positive likelihood ratios will be pooled and weighted using random-effects meta-analysis, if appropriate. A hierarchical summary receiver operating characteristic curve for self-collected specimens will be generated. This synthesis involves a meta-analysis of self-collected samples (urine, vaginal, pharyngeal and rectal swabs) versus clinician-collected samples for the diagnosis of CT, GC and HPV, the most prevalent STIs. Our systematic review will allow patients, clinicians and researchers to determine the diagnostic accuracy of specimens collected by patients compared to those collected by clinicians in the detection of chlamydia, gonorrhea and HPV.
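
    The planned random-effects pooling of sensitivity estimates can be sketched with the DerSimonian-Laird method on the logit scale. This is a standard choice but an assumption here (the protocol does not fix the exact estimator), and the per-study counts below are purely hypothetical:

```python
import math

def pooled_sensitivity_dl(tp_fn_pairs):
    """Random-effects (DerSimonian-Laird) pooling of sensitivities.

    tp_fn_pairs: list of (true_positives, false_negatives) per study.
    Pools on the logit scale with inverse-variance weights; returns the
    pooled sensitivity with its 95% confidence interval.
    """
    # logit-transform each study's sensitivity (0.5 continuity correction)
    ys, vs = [], []
    for tp, fn in tp_fn_pairs:
        a, b = tp + 0.5, fn + 0.5
        ys.append(math.log(a / b))
        vs.append(1.0 / a + 1.0 / b)

    # fixed-effect weights, Cochran Q, and DL between-study variance tau^2
    w = [1.0 / v for v in vs]
    sw = sum(w)
    y_fixed = sum(wi * yi for wi, yi in zip(w, ys)) / sw
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, ys))
    df = len(ys) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)

    # random-effects weights incorporate tau^2
    w_re = [1.0 / (v + tau2) for v in vs]
    sw_re = sum(w_re)
    y_re = sum(wi * yi for wi, yi in zip(w_re, ys)) / sw_re
    se = math.sqrt(1.0 / sw_re)

    inv_logit = lambda x: 1.0 / (1.0 + math.exp(-x))
    return (inv_logit(y_re),
            inv_logit(y_re - 1.96 * se),
            inv_logit(y_re + 1.96 * se))

# hypothetical (TP, FN) counts for three studies of self-collected sampling
est, lo, hi = pooled_sensitivity_dl([(90, 10), (45, 8), (120, 20)])
```

    The same machinery applies to specificity by passing (true_negative, false_positive) pairs instead.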

  5. Proposal for the Development of a Standardized Protocol for Assessing the Economic Costs of HIV Prevention Interventions

    PubMed Central

    Pinkerton, Steven D.; Pearson, Cynthia R.; Eachus, Susan R.; Berg, Karina M.; Grimes, Richard M.

    2008-01-01

    Maximizing our economic investment in HIV prevention requires balancing the costs of candidate interventions against their effects and selecting the most cost-effective interventions for implementation. However, many HIV prevention intervention trials do not collect cost information, and those that do use a variety of cost data collection methods and analysis techniques. Standardized cost data collection procedures, instrumentation, and analysis techniques are needed to facilitate the task of assessing intervention costs and to ensure comparability across intervention trials. This article describes the basic elements of a standardized cost data collection and analysis protocol and outlines a computer-based approach to implementing this protocol. Ultimately, the development of such a protocol would require contributions and “buy-in” from a diverse range of stakeholders, including HIV prevention researchers, cost-effectiveness analysts, community collaborators, public health decision makers, and funding agencies. PMID:18301128

  6. Recent updates and developments to plant genome size databases

    PubMed Central

    Garcia, Sònia; Leitch, Ilia J.; Anadon-Rosell, Alba; Canela, Miguel Á.; Gálvez, Francisco; Garnatje, Teresa; Gras, Airy; Hidalgo, Oriane; Johnston, Emmeline; Mas de Xaxars, Gemma; Pellicer, Jaume; Siljak-Yakovlev, Sonja; Vallès, Joan; Vitales, Daniel; Bennett, Michael D.

    2014-01-01

    Two plant genome size databases have recently been updated and/or extended: the Plant DNA C-values database (http://data.kew.org/cvalues) and GSAD, the Genome Size in Asteraceae database (http://www.asteraceaegenomesize.com). While the first provides information on nuclear DNA contents across land plants and some algal groups, the second is focused on one of the largest and most economically important angiosperm families, Asteraceae. Genome size data have numerous applications: they can be used in comparative studies on genome evolution, or as a tool to appraise the cost of whole-genome sequencing programs. The growing interest in genome size and the increasing rate of data accumulation have necessitated the continued update of these databases. Currently, the Plant DNA C-values database (Release 6.0, Dec. 2012) contains data for 8510 species, while GSAD (Release 2.0, June 2013) has 1219 species, representing increases of 17% and 51%, respectively, in the number of species with genome size data compared with previous releases. Here we provide overviews of the most recent releases of each database and outline new features of GSAD. The latter include (i) a tool to visually compare genome size data between species, (ii) the option to export data and (iii) a webpage containing information about flow cytometry protocols. PMID:24288377

  7. Comparison of Bruce treadmill exercise test protocols: is ramped Bruce equal or superior to standard Bruce in producing clinically valid studies for patients presenting for evaluation of cardiac ischemia or arrhythmia with body mass index equal to or greater than 30?

    PubMed

    Bires, Angela Macci; Lawson, Dori; Wasser, Thomas E; Raber-Baer, Donna

    2013-12-01

    Clinically valid cardiac evaluation via treadmill stress testing requires patients to achieve specific target heart rates and to successfully complete the cardiac examination. A comparison of the standard Bruce protocol and the ramped Bruce protocol was performed using data collected over a 1-y period from a targeted patient population with a body mass index (BMI) equal to or greater than 30 to determine which treadmill protocol provided more successful examination results. The functional capacity, metabolic equivalent units achieved, pressure rate product, and total time on the treadmill measured for the obese patients were clinically valid and comparable to those of normal-weight and overweight patients (P < 0.001). Data from each protocol demonstrated that the ramped Bruce protocol achieved more consistent results across all BMI groups in reaching 80%-85% of the age-predicted maximum heart rate. This study did not adequately establish that the ramped Bruce protocol was superior to the standard Bruce protocol for the examination of patients with a BMI equal to or greater than 30.
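
    The 80%-85% target heart rate band used as the success criterion is conventionally derived from the age-predicted maximum heart rate. A minimal sketch, assuming the common 220-minus-age estimate (the abstract does not state which formula the study used):

```python
def target_hr_range(age, low=0.80, high=0.85):
    """Return the (low, high) target heart rate band in beats/min.

    Assumes the common 220-minus-age estimate of maximum heart rate;
    other estimates (e.g. Tanaka's 208 - 0.7*age) would shift the band.
    """
    hr_max = 220 - age
    return round(low * hr_max), round(high * hr_max)

# a 50-year-old patient would need to reach 136-144 beats/min
band = target_hr_range(50)
```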

  8. An engineering database management system for spacecraft operations

    NASA Technical Reports Server (NTRS)

    Cipollone, Gregorio; Mckay, Michael H.; Paris, Joseph

    1993-01-01

    Studies at ESOC have demonstrated the feasibility of a flexible and powerful Engineering Database Management System (EDMS) in support of spacecraft operations documentation. The objectives set out were three-fold: first, an analysis of the problems encountered by the operations team in obtaining and managing operations documents; secondly, the definition of a concept for operations documentation and the implementation of a prototype to prove the feasibility of the concept; and thirdly, the definition of standards and protocols required for the exchange of data between the top-level partners in a satellite project. The EDMS prototype was populated with ERS-1 satellite design data and has been used by the operations team at ESOC to gather operational experience. An operational EDMS would be implemented at the satellite prime contractor's site as a common database for all technical information surrounding a project and would be accessible to the co-contractors' and ESA teams.

  9. Evaluation of a continuous-rotation, high-speed scanning protocol for micro-computed tomography.

    PubMed

    Kerl, Hans Ulrich; Isaza, Cristina T; Boll, Hanne; Schambach, Sebastian J; Nolte, Ingo S; Groden, Christoph; Brockmann, Marc A

    2011-01-01

    Micro-computed tomography is used frequently in preclinical in vivo research. Limiting factors are radiation dose and long scan times. The purpose of the study was to compare a standard step-and-shoot to a continuous-rotation, high-speed scanning protocol. Micro-computed tomography of a lead grid phantom and a rat femur was performed using a step-and-shoot and a continuous-rotation protocol. Detail discriminability and image quality were assessed by 3 radiologists. The signal-to-noise ratio and the modulation transfer function were calculated, and volumetric analyses of the femur were performed. The radiation dose of the scan protocols was measured using thermoluminescence dosimeters. The 40-second continuous-rotation protocol allowed detail discriminability comparable to the step-and-shoot protocol at significantly lower radiation doses. No marked differences in volumetric or qualitative analyses were observed. Continuous-rotation micro-computed tomography significantly reduces scanning time and radiation dose without relevantly reducing image quality compared with a standard step-and-shoot protocol.

  10. Optimizing radiation exposure in screening of body packing: image quality and diagnostic acceptability of an 80 kVp protocol with automated tube current modulation.

    PubMed

    Aissa, Joel; Boos, Johannes; Rubbert, Christian; Caspers, Julian; Schleich, Christoph; Thomas, Christoph; Kröpil, Patric; Antoch, Gerald; Miese, Falk

    2017-06-01

    The aim of this study was to evaluate the objective and subjective image quality of a novel computed tomography (CT) protocol with reduced radiation dose for body packing, using 80 kVp and automated tube current modulation (ATCM), compared to a standard body packing CT protocol. Eighty individuals examined between March 2012 and July 2015 on suspicion of ingested drug packets were retrospectively included in this study. Thirty-one CT examinations were performed using ATCM and a fixed tube voltage of 80 kVp (group A). Forty-nine CT examinations were performed using a standard protocol with a tube voltage of 120 kVp and a fixed tube current-time product of 40 mAs (group B). Subjective and objective image quality and visibility of drug packets were assessed. Radiation exposure of both protocols was compared. Contrast-to-noise ratio (group A: 0.56 ± 0.36; group B: 1.13 ± 0.91) and signal-to-noise ratio (group A: 3.69 ± 0.98; group B: 7.08 ± 2.67) were significantly lower for group A compared to group B (p < 0.001). Subjectively, image quality was decreased for group A compared to group B (2.5 ± 0.8 vs. 1.2 ± 0.4; p < 0.001). Attenuation of body packets was higher with the new protocol (group A: 362.2 ± 70.3 Hounsfield units (HU); group B: 210.6 ± 60.2 HU; p = 0.005). Volumetric computed tomography dose index (CTDIvol) and dose-length product (DLP) were significantly lower in group A (CTDIvol 2.2 ± 0.9 mGy, DLP 105.7 ± 52.3 mGycm) as compared to group B (CTDIvol 2.7 ± 0.1 mGy, DLP 126.0 ± 9.7 mGycm; p = 0.002 and p = 0.01). The novel 80 kVp CT protocol with ATCM leads to a significant dose reduction compared to a standard CT body packing protocol while maintaining diagnostic image quality, and cocaine body packets were reliably detected owing to their high attenuation.
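
    Signal- and contrast-to-noise ratios like those reported above are computed from region-of-interest (ROI) statistics. One common definition is sketched below; this is an assumption, since the abstract does not state the formulas or ROIs the authors used, and the example attenuation and noise values are hypothetical:

```python
def snr(mean_roi_hu, sd_noise_hu):
    """Signal-to-noise ratio: mean ROI attenuation over image noise (SD)."""
    return mean_roi_hu / sd_noise_hu

def cnr(mean_roi_hu, mean_background_hu, sd_noise_hu):
    """Contrast-to-noise ratio: ROI/background contrast over image noise."""
    return abs(mean_roi_hu - mean_background_hu) / sd_noise_hu

# hypothetical values: a 362 HU packet against 40 HU bowel content,
# with 90 HU image noise at 80 kVp
example_cnr = cnr(362.0, 40.0, 90.0)
```

    Under this definition, the lower CNR at 80 kVp reflects the higher image noise of the low-dose protocol, even though the packet contrast itself increased.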

  11. C-arm flat-panel CT arthrography of the shoulder: Radiation dose considerations and preliminary data on diagnostic performance.

    PubMed

    Guggenberger, Roman; Ulbrich, Erika J; Dietrich, Tobias J; Scholz, Rosemarie; Kaelin, Pascal; Köhler, Christoph; Elsässer, Thilo; Le Corroller, Thomas; Pfammatter, Thomas; Alkadhi, Hatem; Andreisek, Gustav

    2017-02-01

    To investigate radiation dose and diagnostic performance of C-arm flat-panel CT (FPCT) versus standard multi-detector CT (MDCT) shoulder arthrography using MRI arthrography as the reference standard. Radiation dose of two different FPCT acquisitions (5 and 20 s) and standard MDCT of the shoulder was assessed using phantoms and thermoluminescence dosimetry. FPCT arthrographies were performed in 34 patients (mean age 44 ± 15 years). Different joint structures were quantitatively and qualitatively assessed by two independent radiologists. Inter-reader agreement and diagnostic performance were calculated. Effective radiation dose was markedly lower in FPCT 5 s (0.6 mSv) compared to MDCT (1.7 mSv) and FPCT 20 s (3.4 mSv). Contrast-to-noise ratios (CNRs) were significantly (p < 0.05) higher in the FPCT 20-s versus the 5-s protocol. Inter-reader agreement of qualitative ratings ranged from κ = 0.47 to 1.0. Sensitivities for cartilage and rotator cuff pathologies were low for the FPCT 5-s protocol (40% and 20%) and moderate for the FPCT 20-s protocol (75% and 73%). FPCT showed high sensitivity (81-86% and 89-99%) for bone and acromioclavicular-joint pathologies. Using a 5-s protocol, FPCT shoulder arthrography provides a lower radiation dose compared to MDCT but poor sensitivity for cartilage and rotator cuff pathologies. The FPCT 20-s protocol is moderately sensitive for cartilage and rotator cuff tendon pathology with a markedly higher radiation dose compared to MDCT. • FPCT shoulder arthrography is feasible with fluoroscopy and CT in one workflow. • A 5-s FPCT protocol applies a lower radiation dose than MDCT. • A 20-s FPCT protocol is moderately sensitive for cartilage and tendon pathology.

  12. Apneic Oxygenation May Not Prevent Severe Hypoxemia During Rapid Sequence Intubation: A Retrospective Helicopter Emergency Medical Service Study.

    PubMed

    Riyapan, Sattha; Lubin, Jeffrey

    This study sought to determine the effectiveness of apneic oxygenation in preventing hypoxemia during prehospital rapid sequence intubation (RSI). We performed a case-cohort study using a pre-existing database of intubation management by a single helicopter emergency medical service between July 2013 and June 2015. Apneic oxygenation using a high-flow nasal cannula (15 L/min) was introduced to the standard RSI protocol in July 2014. Severe hypoxemia was defined as an incidence of oxygen saturation less than 90%. We compared patients who received apneic oxygenation during RSI with patients who did not using the Fisher exact test. Ninety-three patients were identified from the database; 29 (31.2%) received apneic oxygenation. Nineteen patients had an incidence of severe hypoxemia during RSI (20.43%; 95% confidence interval, 12.77%-30.05%). There was no statistically significant difference in the rate of severe hypoxemia between the apneic oxygenation group and the control group (17.2% vs. 21.9%, P = .78). In this study, patients who received apneic oxygenation did not show a statistically significant difference in severe hypoxemia during RSI.
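
    The Fisher exact comparison reported above (17.2% vs. 21.9%, P = .78) can be sketched in pure Python. The 2x2 cell counts below (5/29 vs. 14/64) are back-calculated from the reported group sizes and percentages, so they are an inference rather than figures taken from the paper:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one.
    """
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2
    denom = comb(n, col1)

    def p_table(x):  # probability of the table with x in the top-left cell
        return comb(row1, x) * comb(row2, col1 - x) / denom

    p_obs = p_table(a)
    lo = max(0, col1 - row2)
    hi = min(row1, col1)
    # tiny tolerance guards against floating-point ties
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# hypoxemia / no-hypoxemia counts: apneic oxygenation 5/24, control 14/50
p = fisher_exact_two_sided(5, 24, 14, 50)
```

    With these counts the p-value is far above 0.05, matching the study's conclusion of no statistically significant difference.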

  13. Establishment and maintenance of a standardized glioma tissue bank: Huashan experience.

    PubMed

    Aibaidula, Abudumijiti; Lu, Jun-feng; Wu, Jin-song; Zou, He-jian; Chen, Hong; Wang, Yu-qian; Qin, Zhi-yong; Yao, Yu; Gong, Ye; Che, Xiao-ming; Zhong, Ping; Li, Shi-qi; Bao, Wei-min; Mao, Ying; Zhou, Liang-fu

    2015-06-01

    Cerebral glioma is the most common brain tumor as well as one of the top ten malignant tumors in human beings. In spite of the great progress in chemotherapy, radiotherapy and surgical strategies during the past decades, mortality and morbidity remain high. One of the major challenges is to explore the pathogenesis and invasion of glioma at various "omics" levels (such as proteomics or genomics) and the clinical implications of biomarkers for diagnosis, prognosis or treatment of glioma patients. Establishment of a standardized tissue bank with high-quality biospecimens annotated with clinical information is pivotal to the solution of these questions as well as to the drug development process and translational research on glioma. Therefore, based on previous experience of tissue banks, standardized protocols for sample collection and storage were developed. We also developed two systems for glioma patient and sample management, a local database for medical records and a local image database for medical images. For future set-up of a regional biobank network in Shanghai, we also founded a centralized database for medical records. Hence we established a standardized glioma tissue bank with sufficient clinical data and medical images in Huashan Hospital. By September 2013, tissue samples from 1,326 cases had been collected. Histological diagnosis revealed that 73% were astrocytic tumors, 17% were oligodendroglial tumors, 2% were oligoastrocytic tumors, 4% were ependymal tumors and 4% were other central nervous system neoplasms.

  14. Determination of absorbed dose to water for high-energy photon and electron beams-comparison of the standards DIN 6800-2 (1997), IAEA TRS 398 (2000) and DIN 6800-2 (2006)

    PubMed Central

    Zakaria, Golam Abu; Schuette, Wilhelm

    2007-01-01

    For the determination of the absorbed dose to water for high-energy photon and electron beams, the IAEA code of practice TRS-398 (2000) is applied internationally. In Germany, the German dosimetry protocol DIN 6800-2 (1997) is used. Recently, the DIN standard was revised and published as the Draft National Standard DIN 6800-2 (2006); it has largely adopted the methodology and dosimetric data of the code of practice. This paper compares these three dosimetry protocols systematically and identifies similarities as well as differences. The investigation was done with 6 and 18 MV photon as well as 5 to 21 MeV electron beams. While only cylindrical chambers were used for photon beams, measurements of electron beams were performed using cylindrical as well as plane-parallel chambers. The discrepancies in the determination of absorbed dose to water between the three protocols were 0.4% for photon beams and 1.5% for electron beams. Comparative measurements showed a deviation of less than 0.5% between our measurements following protocol DIN 6800-2 (2006) and a TLD inter-comparison procedure in an external audit. PMID:21217912

  15. Determination of absorbed dose to water for high-energy photon and electron beams-comparison of the standards DIN 6800-2 (1997), IAEA TRS 398 (2000) and DIN 6800-2 (2006).

    PubMed

    Zakaria, Golam Abu; Schuette, Wilhelm

    2007-01-01

    For the determination of the absorbed dose to water for high-energy photon and electron beams, the IAEA code of practice TRS-398 (2000) is applied internationally. In Germany, the German dosimetry protocol DIN 6800-2 (1997) is used. Recently, the DIN standard was revised and published as the Draft National Standard DIN 6800-2 (2006); it has largely adopted the methodology and dosimetric data of the code of practice. This paper compares these three dosimetry protocols systematically and identifies similarities as well as differences. The investigation was done with 6 and 18 MV photon as well as 5 to 21 MeV electron beams. While only cylindrical chambers were used for photon beams, measurements of electron beams were performed using cylindrical as well as plane-parallel chambers. The discrepancies in the determination of absorbed dose to water between the three protocols were 0.4% for photon beams and 1.5% for electron beams. Comparative measurements showed a deviation of less than 0.5% between our measurements following protocol DIN 6800-2 (2006) and a TLD inter-comparison procedure in an external audit.

  16. Comparison of eye lens dose on neuroimaging protocols between 16- and 64-section multidetector CT: achieving the lowest possible dose.

    PubMed

    Tan, J S P; Tan, K-L; Lee, J C L; Wan, C-M; Leong, J-L; Chan, L-L

    2009-02-01

    To our knowledge, there has been no study that compares the radiation dose delivered to the eye lens by 16- and 64-section multidetector CT (MDCT) for standard clinical neuroimaging protocols. Our aim was to assess radiation-dose differences between 16- and 64-section MDCT from the same manufacturer, by using near-identical neuroimaging protocols. Three cadaveric heads were scanned on 16- and 64-section MDCT by using standard neuroimaging CT protocols. Eye lens dose was measured by using thermoluminescent dosimeters (TLD), and each scanning was repeated to reduce random error. The dose-length product, volume CT dose index (CTDI(vol)), and TLD readings for each imaging protocol were averaged and compared between scanners and protocols by using the paired Student t test. Statistical significance was defined at P < .05. The radiation dose delivered and eye lens doses were lower by 28.1%-45.7% (P < .001) on the 64-section MDCT for near-identical imaging protocols. On the 16-section MDCT, lens dose reduction was greatest (81.1%) with a tilted axial mode, compared with a nontilted helical mode for CT brain scans. Among the protocols studied, CT of the temporal bone delivered the greatest radiation dose to the eye lens. Eye lens radiation doses delivered by the 64-section MDCT are significantly lower, partly due to improvements in automatic tube current modulation technology. However, where applicable, protection of the eyes from the radiation beam by either repositioning the head or tilting the gantry remains the best way to reduce eye lens dose.

  17. The Role of Data Archives in Synoptic Solar Physics

    NASA Astrophysics Data System (ADS)

    Reardon, Kevin

    The detailed study of solar cycle variations requires analysis of recorded datasets spanning many years of observations, that is, a data archive. The use of digital data, combined with powerful database server software, gives such archives new capabilities to provide, quickly and flexibly, selected pieces of information to scientists. Use of standardized protocols will allow multiple databases, independently maintained, to be seamlessly joined, enabling complex searches spanning multiple archives. These data archives also benefit from being developed in parallel with the telescope itself, which helps to assure data integrity and to provide close integration between the telescope and archive. Development of archives that can guarantee long-term data availability and strong compatibility with other projects makes solar-cycle studies easier to plan and realize.

  18. Evaluation of refractive correction for standard automated perimetry in eyes wearing multifocal contact lenses

    PubMed Central

    Hirasawa, Kazunori; Ito, Hikaru; Ohori, Yukari; Takano, Yui; Shoji, Nobuyuki

    2017-01-01

    AIM: To evaluate the refractive correction for standard automated perimetry (SAP) in eyes with refractive multifocal contact lenses (CL) in healthy young participants. METHODS: Twenty-nine eyes of 29 participants were included. Accommodation was paralyzed in all participants with 1% cyclopentolate hydrochloride. SAP was performed using the Humphrey SITA-standard 24-2 and 10-2 protocol under three refractive conditions: monofocal CL corrected for near distance (baseline); multifocal CL corrected for distance (mCL-D); and mCL-D corrected for near vision using a spectacle lens (mCL-N). Primary outcome measures were the foveal threshold, mean deviation (MD), and pattern standard deviation (PSD). RESULTS: The foveal threshold of mCL-N with both the 24-2 and 10-2 protocols significantly decreased by 2.2-2.5 dB (P<0.001), while that of mCL-D with the 24-2 protocol significantly decreased by 1.5 dB (P=0.0427), as compared with that of baseline. Although there was no significant difference between the MD of baseline and mCL-D with the 24-2 and 10-2 protocols, the MD of mCL-N was significantly decreased by 1.0-1.3 dB (P<0.001) as compared with that of both baseline and mCL-D, with both 24-2 and 10-2 protocols. There was no significant difference in the PSD among the three refractive conditions with both the 24-2 and 10-2 protocols. CONCLUSION: Despite the induced mydriasis and the optical design of the multifocal lens used in this study, our results indicated that, when the dome-shaped visual field test is performed with eyes with large pupils and wearing refractive multifocal CLs, distance correction without additional near correction is to be recommended. PMID:29062776

  19. Evaluation of refractive correction for standard automated perimetry in eyes wearing multifocal contact lenses.

    PubMed

    Hirasawa, Kazunori; Ito, Hikaru; Ohori, Yukari; Takano, Yui; Shoji, Nobuyuki

    2017-01-01

    To evaluate the refractive correction for standard automated perimetry (SAP) in eyes with refractive multifocal contact lenses (CL) in healthy young participants. Twenty-nine eyes of 29 participants were included. Accommodation was paralyzed in all participants with 1% cyclopentolate hydrochloride. SAP was performed using the Humphrey SITA-standard 24-2 and 10-2 protocol under three refractive conditions: monofocal CL corrected for near distance (baseline); multifocal CL corrected for distance (mCL-D); and mCL-D corrected for near vision using a spectacle lens (mCL-N). Primary outcome measures were the foveal threshold, mean deviation (MD), and pattern standard deviation (PSD). The foveal threshold of mCL-N with both the 24-2 and 10-2 protocols significantly decreased by 2.2-2.5 dB (P<0.001), while that of mCL-D with the 24-2 protocol significantly decreased by 1.5 dB (P=0.0427), as compared with that of baseline. Although there was no significant difference between the MD of baseline and mCL-D with the 24-2 and 10-2 protocols, the MD of mCL-N was significantly decreased by 1.0-1.3 dB (P<0.001) as compared with that of both baseline and mCL-D, with both 24-2 and 10-2 protocols. There was no significant difference in the PSD among the three refractive conditions with both the 24-2 and 10-2 protocols. Despite the induced mydriasis and the optical design of the multifocal lens used in this study, our results indicated that, when the dome-shaped visual field test is performed with eyes with large pupils and wearing refractive multifocal CLs, distance correction without additional near correction is to be recommended.

  20. Robust QKD-based private database queries based on alternative sequences of single-qubit measurements

    NASA Astrophysics Data System (ADS)

    Yang, YuGuang; Liu, ZhiChao; Chen, XiuBo; Zhou, YiHua; Shi, WeiMin

    2017-12-01

    In existing QKD-based quantum private query (QPQ) protocols, quantum channel noise may cause the user to obtain a wrong answer and thus misunderstand the database holder. In addition, an outside attacker may conceal his attack by exploiting the channel noise. We propose a new, robust QPQ protocol based on four-qubit decoherence-free (DF) states. In contrast to existing QPQ protocols against channel noise, only an alternative fixed sequence of single-qubit measurements is needed by the user (Alice) to measure the received DF states. This property makes it easy to implement the proposed protocol with current technologies. Moreover, to retain the advantage of flexible database queries, we reconstruct Alice's measurement operators so that Alice needs only conditioned sequences of single-qubit measurements.

  1. Optimizing the high-resolution manometry (HRM) study protocol.

    PubMed

    Patel, A; Ding, A; Mirza, F; Gyawali, C P

    2015-02-01

    Intolerance of the esophageal manometry catheter may prolong high-resolution manometry (HRM) studies and increase patient distress. We assessed the impact of obtaining the landmark phase at the end of the study, when the patient has acclimatized to the HRM catheter. 366 patients (mean age 55.4 ± 0.8 years, 62.0% female) undergoing esophageal HRM over a 1-year period were studied. The standard protocol consisted of the landmark phase, ten 5-mL water swallows 20-30 s apart, and multiple rapid swallows in which four to six 2-mL swallows were administered in rapid succession. The modified protocol consisted of the landmark phase at the end of the study, after test swallows. Study duration, technical characteristics, indications, and motor findings were compared between standard and modified protocols. Of the 366 patients, 89.6% underwent the standard protocol (study duration 12.9 ± 0.3 min). In the 10.4% with poor catheter tolerance undergoing the modified protocol, study duration was significantly longer (15.6 ± 1.0 min, p = 0.004) despite similar duration of study maneuvers. Only elevated upper esophageal sphincter basal pressures at the beginning of the study segregated modified-protocol patients. The 95th-percentile time to landmark phase in the standard-protocol patients was 6.1 min; as many as 31.4% of modified-protocol patients could not complete their first study maneuver within this period (p = 0.0003). Interpretation was not impacted by shifting the landmark phase to the end of the study. Modification of the HRM study protocol with the landmark phase obtained at the end of the study optimizes study duration without compromising quality. © 2014 John Wiley & Sons Ltd.

  2. Innovative Multimodal Physical Therapy Reduces Incidence of Repeat Manipulation under Anesthesia in Post-Total Knee Arthroplasty Patients Who Had an Initial Manipulation under Anesthesia.

    PubMed

    Chughtai, Morad; McGinn, Tanner; Bhave, Anil; Khan, Sabahat; Vashist, Megha; Khlopas, Anton; Mont, Michael A

    2016-11-01

    Manipulation under anesthesia (MUA) is performed for knee stiffness following a total knee arthroplasty (TKA) when nonoperative treatments fail. It is important to develop an optimal outpatient physical therapy protocol following an MUA to avoid a repeat procedure. The purpose of this study was to evaluate and compare: (1) range of motion and (2) the rate of repeat MUA in patients who underwent either innovative multimodal physical therapy (IMMPT) or standard-of-care physical therapy (standard) following an MUA after a TKA. We performed a retrospective database study of patients who underwent an MUA following a TKA between January 2013 and December 2014 (N = 57). There were 16 (28%) men and 41 (72%) women who had a mean age of 59 years (range, 32-81 years). The patients were stratified into those who underwent IMMPT (n = 22) and those who underwent standard physical therapy (n = 35). The 6-month range of motion and rate of repeat manipulation between the two cohorts were analyzed using Student's t-test and chi-square tests. In addition, we performed a Kaplan-Meier analysis of time to repeat MUA. The IMMPT cohort had a statistically significantly higher proportion of TKAs with an optimal range of motion as compared with the standard cohort. There was a statistically significantly lower proportion of patients who underwent a repeat MUA in the IMMPT cohort as compared with the standard cohort. There was also a significantly lower incidence of, and longer time to, repeat MUA in the IMMPT cohort in the Kaplan-Meier analysis. The group who underwent IMMPT utilizing Astym therapy had a significantly higher proportion of patients with optimal range of motion, which implies the potential efficacy of this regimen to improve range of motion.
Furthermore, the IMMPT cohort had a significantly lower proportion of repeat manipulations as compared with the standard cohort, which implies that an IMMPT approach could potentially reduce the need for a repeat MUA. These findings warrant further investigation into the outcomes of different rehabilitation approaches.

  3. Nonsurgical Strategies in Patients With NET Liver Metastases: A Protocol of Four Systematic Reviews.

    PubMed

    Limani, Perparim; Tschuor, Christoph; Gort, Laura; Balmer, Bettina; Gu, Alexander; Ceresa, Christos; Raptis, Dimitri Aristotle; Lesurtel, Mickael; Puhan, Milo; Breitenstein, Stefan

    2014-03-07

    Patients diagnosed with neuroendocrine tumors (NETs) with hepatic metastases generally have a worse prognosis than patients with nonmetastasized NETs. Due to tumor location and distant metastases, a surgical approach is often not possible and nonsurgical therapeutic strategies may apply. The aim of these systematic reviews is to evaluate the role of nonsurgical therapy options for patients with nonresectable liver metastases of NETs. An objective group of librarians will provide an electronic search strategy to examine the MEDLINE, EMBASE, and The Cochrane Library (Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, Cochrane Central Register of Controlled Trials [CENTRAL]) databases. There will be no restrictions concerning language or publication date. The qualitative and quantitative synthesis of the systematic reviews will be conducted with randomized controlled trials (RCTs), prospective and retrospective comparative cohort studies, and case-control studies. Case series will be collected in a separate database and used only for descriptive purposes. This study is ongoing and presents a protocol of four systematic reviews to assess the role of nonsurgical treatment options in patients with neuroendocrine liver metastases. These systematic reviews, performed according to this protocol, will assess the value of noninvasive therapy options for patients with nonresectable liver metastases of NETs in combination with invasive techniques, such as percutaneous liver-directed techniques and local ablation techniques.
International Prospective Register of Systematic Reviews (PROSPERO): CRD42012002657; http://www.metaxis.com/PROSPERO/full_doc.asp?RecordID=2657 (Archived by WebCite at http://www.webcitation.org/6NDlYi37O); CRD42012002658; http://www.metaxis.com/PROSPERO/full_doc.asp?RecordID=2658 (Archived by WebCite at http://www.webcitation.org/6NDlfWSuD); CRD42012002659; www.metaxis.com/PROSPERO/full_doc.asp?RecordID=2659 (Archived by WebCite at http://www.webcitation.org/6NDlmWAFM); and CRD42012002660; http://www.metaxis.com/PROSPERO/full_doc.asp?RecordID=2660 (Archived by WebCite at http://www.webcitation.org/6NDmnylzp).

  4. Comparison and analysis of reoperations in two different treatment protocols for trochanteric hip fractures - postoperative technical complications with dynamic hip screw, intramedullary nail and Medoff sliding plate.

    PubMed

    Paulsson, Johnny; Stig, Josefine Corin; Olsson, Ola

    2017-08-24

    In treatment of unstable trochanteric fractures, dynamic hip screw and Medoff sliding plate devices are designed to allow secondary fracture impaction, whereas intramedullary nails aim to maintain fracture alignment. Different treatment protocols are used by two similar Swedish regional emergency care hospitals. Dynamic hip screw is used for fractures considered stable within the respective treatment protocol. One treatment protocol (Medoff sliding plate/dynamic hip screw) uses the biaxial Medoff sliding plate for unstable pertrochanteric fractures and the uniaxial Medoff sliding plate for subtrochanteric fractures; the second (intramedullary nail/dynamic hip screw) uses the intramedullary nail for subtrochanteric fractures and for pertrochanteric fractures with intertrochanteric comminution or subtrochanteric extension. All orthopedic surgeries are registered in a regional database. All consecutive trochanteric fracture operations during 2011-2012 (n = 856) and subsequent technical reoperations (n = 40) were derived from the database. Reoperations were analysed and classified as adjustments (percutaneous removal of the locking screw of the Medoff sliding plate or the intramedullary nail, followed by fracture healing) or as minor, intermediate (reosteosynthesis), or major (hip joint replacement, Girdlestone procedure, or persistent nonunion) technical complications. The relative risk of intermediate or major technical complications was 4.2 (1.2-14) times higher in unstable pertrochanteric fractures and 4.6 (1.1-19) times higher in subtrochanteric fractures with the intramedullary nail/dynamic hip screw protocol than with the Medoff sliding plate/dynamic hip screw protocol. Overall rates of intermediate and major technical complications in unstable pertrochanteric and subtrochanteric fractures were 0.68% with the biaxial Medoff sliding plate, 1.4% with the uniaxial Medoff sliding plate, 3.4% with the dynamic hip screw, and 7.2% with the intramedullary nail.
The treatment protocol based on use of biaxial Medoff sliding plate for unstable pertrochanteric and uniaxial Medoff sliding plate for subtrochanteric fractures reduced the risk of severe technical complications compared to using the treatment protocol based on dynamic hip screw and intramedullary nail.

  5. Academic consortium for the evaluation of computer-aided diagnosis (CADx) in mammography

    NASA Astrophysics Data System (ADS)

    Mun, Seong K.; Freedman, Matthew T.; Wu, Chris Y.; Lo, Shih-Chung B.; Floyd, Carey E., Jr.; Lo, Joseph Y.; Chan, Heang-Ping; Helvie, Mark A.; Petrick, Nicholas; Sahiner, Berkman; Wei, Datong; Chakraborty, Dev P.; Clarke, Laurence P.; Kallergi, Maria; Clark, Bob; Kim, Yongmin

    1995-04-01

    Computer-aided diagnosis (CADx) is a promising technology for the detection of breast cancer in screening mammography. A number of different approaches to CADx have been developed that have achieved significant levels of performance. Research teams now recognize the need for a careful and detailed evaluation study of these approaches to accelerate the development of CADx, to make CADx more clinically relevant, and to optimize the CADx algorithms based on unbiased evaluations. The results of such a comparative study may provide each of the participating teams with new insights into the optimization of their individual CADx algorithms. This consortium of experienced CADx researchers is working as a group to compare results of the algorithms and to optimize the performance of CADx algorithms by learning from each other. Each institution will contribute an equal number of cases collected under a standard protocol for case selection, truth determination, and data acquisition, to establish a common and unbiased database for the evaluation study. An evaluation procedure for the comparison studies is being developed to analyze the results of individual algorithms for each of the test cases in the common database. Optimization of individual CADx algorithms can then be made based on the comparison studies. The consortium effort is expected to accelerate the eventual clinical implementation of CADx algorithms at participating institutions.

  6. Standardized phenology monitoring methods to track plant and animal activity for science and resource management applications

    NASA Astrophysics Data System (ADS)

    Denny, Ellen G.; Gerst, Katharine L.; Miller-Rushing, Abraham J.; Tierney, Geraldine L.; Crimmins, Theresa M.; Enquist, Carolyn A. F.; Guertin, Patricia; Rosemartin, Alyssa H.; Schwartz, Mark D.; Thomas, Kathryn A.; Weltzin, Jake F.

    2014-05-01

    Phenology offers critical insights into the responses of species to climate change; shifts in species' phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological "status", or the ability to track presence-absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resource managers, and others conducting ecological monitoring or research to consider using these standardized protocols for tracking the seasonal activity of plants and animals.
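
    The status-based monitoring the protocols describe (presence-absence of a phenophase per visit, plus an intensity category) can be sketched as a minimal record structure. This is an illustrative sketch only; the field names and the `first_onset` helper are hypothetical, not the USA National Phenology Network's actual schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical record modeled on the "status" element described above:
# each visit records presence/absence of a phenophase, optionally with
# an intensity category (e.g., percent of canopy in leaf).
@dataclass
class PhenophaseObservation:
    site_id: str
    species: str
    phenophase: str          # e.g., "breaking leaf buds"
    obs_date: date
    status: Optional[bool]   # True = present, False = absent, None = not assessed
    intensity: Optional[str] = None  # e.g., "25-49%" intensity category

def first_onset(observations, phenophase):
    """Earliest date the phenophase was recorded present (None if never)."""
    dates = [o.obs_date for o in observations
             if o.phenophase == phenophase and o.status is True]
    return min(dates) if dates else None

obs = [
    PhenophaseObservation("S1", "Acer rubrum", "breaking leaf buds",
                          date(2014, 4, 2), False),
    PhenophaseObservation("S1", "Acer rubrum", "breaking leaf buds",
                          date(2014, 4, 9), True, "25-49%"),
]
print(first_onset(obs, "breaking leaf buds"))  # 2014-04-09
```

    Because absences are recorded explicitly, onset dates derived this way are bounded by the preceding "absent" visit, which is what makes such records comparable across sites and integrable with historical data sets.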

  7. Standardized phenology monitoring methods to track plant and animal activity for science and resource management applications

    USGS Publications Warehouse

    Denny, Ellen G.; Gerst, Katharine L.; Miller-Rushing, Abraham J.; Tierney, Geraldine L.; Crimmins, Theresa M.; Enquist, Carolyn A.F.; Guertin, Patricia; Rosemartin, Alyssa H.; Schwartz, Mark D.; Thomas, Kathryn A.; Weltzin, Jake F.

    2014-01-01

    Phenology offers critical insights into the responses of species to climate change; shifts in species’ phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological “status”, or the ability to track presence–absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resource managers, and others conducting ecological monitoring or research to consider using these standardized protocols for tracking the seasonal activity of plants and animals.

  8. Standardized phenology monitoring methods to track plant and animal activity for science and resource management applications.

    PubMed

    Denny, Ellen G; Gerst, Katharine L; Miller-Rushing, Abraham J; Tierney, Geraldine L; Crimmins, Theresa M; Enquist, Carolyn A F; Guertin, Patricia; Rosemartin, Alyssa H; Schwartz, Mark D; Thomas, Kathryn A; Weltzin, Jake F

    2014-05-01

    Phenology offers critical insights into the responses of species to climate change; shifts in species' phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological "status", or the ability to track presence-absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resource managers, and others conducting ecological monitoring or research to consider using these standardized protocols for tracking the seasonal activity of plants and animals.

  9. Chinese patent medicine Fei-Liu-Ping ointment as an adjunctive treatment for non-small cell lung cancer: protocol for a systematic review.

    PubMed

    Zheng, Honggang; He, Shulin; Liu, Rui; Xu, Xinyao; Xu, Tao; Chen, Shuntai; Guo, Qiujun; Gao, Yebo; Hua, Baojin

    2017-01-16

    Fei-Liu-Ping ointment has been widely applied as an adjunctive drug in the treatment of non-small cell lung cancer (NSCLC). However, there has been no systematic review of research findings regarding the efficacy of this treatment. Here, we provide a protocol for assessing the effectiveness and safety of Fei-Liu-Ping ointment in the treatment of NSCLC. The electronic databases to be searched will include MEDLINE (PubMed), the Cochrane Central Register of Controlled Trials (CENTRAL) in the Cochrane Library, Excerpta Medica Database (EMBASE), China National Knowledge Infrastructure (CNKI), China Scientific Journal Database (VIP), Wanfang Database and Chinese Biomedical Literature Database (CBM). Papers in English or Chinese published from inception to 2016 will be included without any restrictions. We will conduct a meta-analysis of randomised controlled trials if possible. The therapeutic effects according to the WHO standard for the treatment of solid tumours, and quality of life as evaluated by Karnofsky score and weight, will be the primary outcomes. We will assess risk of bias and perform data synthesis using Review Manager 5.3 software. The results of this review will offer implications for the use of Fei-Liu-Ping ointment as an adjunctive treatment for NSCLC. This knowledge will inform recommendations by surgeons and researchers who are interested in the treatment of NSCLC. The results of this systematic review will be disseminated through presentation at a conference and publication of the data in a peer-reviewed journal. PROSPERO CRD42016036911. Published by the BMJ Publishing Group Limited.

  10. Protocol for Usability Testing and Validation of the ISO Draft International Standard 19223 for Lung Ventilators

    PubMed Central

    2017-01-01

    Background Clinicians, such as respiratory therapists and physicians, are often required to set up pieces of medical equipment that use inconsistent terminology. Current lung ventilator terminology used by different manufacturers contributes to the risk of usage errors, and in turn the risk of ventilator-associated lung injuries and other conditions. Human factors and communication issues are often associated with ventilator-related sentinel events, and inconsistent ventilator terminology compounds these issues. This paper describes our proposed protocol, which will be implemented at the University of Waterloo, Canada when this project is externally funded. Objective We propose to determine whether a standardized vocabulary improves the ease of use, safety, and utility of medical devices, compared to legacy medical devices from multiple manufacturers that use differing terms. Methods We hypothesize that usage errors by clinicians will be lower when standardization is consistently applied by all manufacturers. The proposed study will experimentally examine the impact of standardized nomenclature on performance declines in the use of an unfamiliar ventilator product in clinically relevant scenarios. Participants will be respiratory therapy practitioners and trainees, and we propose studying approximately 60 participants. Results The work reported here is in the proposal phase. Once the protocol is implemented, we will report the results in a follow-up paper. Conclusions The proposed study will help us better understand the effects of standardization on medical device usability. The study will also help identify any terms in the International Organization for Standardization (ISO) Draft International Standard (DIS) 19223 that may be associated with recurrent errors. Amendments to the standard will be proposed if recurrent errors are identified.
First, this report contributes a protocol that can be used to assess the effect of standardization in any given domain that involves equipment, multiple manufacturers, inconsistent vocabulary, symbology, audio tones, or patterns in interface navigation. Second, the protocol can be used to experimentally evaluate the ISO DIS 19223 for its effectiveness, as researchers around the world may wish to conduct such tests and compare results. PMID:28887292

  11. Minimizing variance in pediatric gastrostomy: does standardized perioperative feeding plan decrease cost and improve outcomes?

    PubMed

    Sunstrom, Rachel; Hamilton, Nicholas; Fialkowski, Elizabeth; Lofberg, Katrine; McKee, Julie; Sims, Thomas; Krishnaswami, Sanjay; Azarow, Kenneth

    2016-05-01

    A protocol for laparoscopic gastrostomy placement was implemented that specified perioperative antibiotics, feeding regimens, and discharge criteria. Our hypothesis was that hospital cost could be decreased while patient outcomes were improved or maintained. Data were collected on consecutive patients beginning 6 months after implementation of our protocol. We recorded surgeon compliance, patient outcomes (as defined by 30-day NSQIP complication rates), and cost of initial hospitalization, which were then compared to a 6-month historical control period. The control group comprised 26 patients and the protocol group 39. Length of stay was shorter in the protocol group (P ≤ .05 by nonparametric analysis). The complication rate was similar in both groups (23% control vs 15% protocol, P = .43). Initial hospital costs were not different. Surgeon compliance with the protocol was 82%. A standard protocol is achievable for gastrostomy tube management. After implementation of our protocol, we were able to show a significant decrease in length of stay while maintaining quality. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Implementation of Quantum Private Queries Using Nuclear Magnetic Resonance

    NASA Astrophysics Data System (ADS)

    Wang, Chuan; Hao, Liang; Zhao, Lian-Jie

    2011-08-01

    We present a modified protocol for the realization of a quantum private query process on a classical database. Using a one-qubit query and a CNOT operation, the query process can be realized in a two-mode database. In the query process, privacy is preserved on both sides: the user learns nothing about the database beyond her queried information, and the database provider cannot obtain any information about the query. We implement the quantum private query protocol in a nuclear magnetic resonance system. The density matrix of the memory registers is constructed.
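
    The CNOT step described above, in which a query qubit picks up a database bit, can be illustrated with a toy state-vector calculation. This is not the authors' NMR implementation or their full protocol; it is a minimal sketch, with hypothetical helper names, of how a CNOT writes a classical database bit onto the user's query qubit.

```python
import numpy as np

# Qubit order: [query, database]; basis state |q d> maps to index 2*q + d.
def ket(q, d):
    v = np.zeros(4)
    v[2 * q + d] = 1.0
    return v

# CNOT with the database qubit as control and the query qubit as target:
# |q d> -> |q XOR d, d>, built column by column as a permutation matrix.
CNOT_db_ctrl = np.zeros((4, 4))
for q in (0, 1):
    for d in (0, 1):
        CNOT_db_ctrl[2 * (q ^ d) + d, 2 * q + d] = 1.0

def query(db_bit):
    """Query qubit starts in |0>; after the CNOT it carries the database bit."""
    state = CNOT_db_ctrl @ ket(0, db_bit)
    # Measure the query qubit: probability of reading outcome 1.
    p1 = state[2] ** 2 + state[3] ** 2
    return int(round(p1))

print(query(0), query(1))  # 0 1
```

    In the actual protocol the query qubit would be prepared in a superposition so that only the queried item is learned; the sketch above shows only the classical readout path of the CNOT.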

  13. Compatible topologies and parameters for NMR structure determination of carbohydrates by simulated annealing.

    PubMed

    Feng, Yingang

    2017-01-01

    The use of NMR methods to determine the three-dimensional structures of carbohydrates and glycoproteins is still challenging, in part because of the lack of standard protocols. In order to increase the convenience of structure determination, the topology and parameter files for carbohydrates in the program Crystallography & NMR System (CNS) were investigated and new files were developed to be compatible with the standard simulated annealing protocols for proteins and nucleic acids. Recalculating the published structures of protein-carbohydrate complexes and glycosylated proteins demonstrates that the results are comparable to the published structures which employed more complex procedures for structure calculation. Integrating the new carbohydrate parameters into the standard structure calculation protocol will facilitate three-dimensional structural study of carbohydrates and glycosylated proteins by NMR spectroscopy.

  14. Compatible topologies and parameters for NMR structure determination of carbohydrates by simulated annealing

    PubMed Central

    2017-01-01

    The use of NMR methods to determine the three-dimensional structures of carbohydrates and glycoproteins is still challenging, in part because of the lack of standard protocols. In order to increase the convenience of structure determination, the topology and parameter files for carbohydrates in the program Crystallography & NMR System (CNS) were investigated and new files were developed to be compatible with the standard simulated annealing protocols for proteins and nucleic acids. Recalculating the published structures of protein-carbohydrate complexes and glycosylated proteins demonstrates that the results are comparable to the published structures which employed more complex procedures for structure calculation. Integrating the new carbohydrate parameters into the standard structure calculation protocol will facilitate three-dimensional structural study of carbohydrates and glycosylated proteins by NMR spectroscopy. PMID:29232406

  15. Seebeck Coefficient Metrology: Do Contemporary Protocols Measure Up?

    NASA Astrophysics Data System (ADS)

    Martin, Joshua; Wong-Ng, Winnie; Green, Martin L.

    2015-06-01

    Comparative measurements of the Seebeck coefficient are challenging due to the diversity of instrumentation and measurement protocols. With the implementation of standardized measurement protocols and the use of Standard Reference Materials (SRMs®), for example, the recently certified National Institute of Standards and Technology (NIST) SRM® 3451 "Low Temperature Seebeck Coefficient Standard (10-390 K)", researchers can reliably analyze and compare data, both intra- and inter-laboratory, thereby accelerating the development of more efficient thermoelectric materials and devices. We present a comparative overview of commonly adopted Seebeck coefficient measurement practices. First, we examine the influence of asynchronous temporal and spatial measurement of electric potential and temperature. Temporal asynchronicity introduces error in the absolute Seebeck coefficient of the order of ≈10%, whereas spatial asynchronicity introduces error of the order of a few percent. Second, we examine the influence of poor thermal contact between the measurement probes and the sample. This is especially critical at high temperature, wherein the prevalent mode of measuring surface temperature is facilitated by pressure contact. Each topic will include the comparison of data measured using different measurement techniques and using different probe arrangements. We demonstrate that the probe arrangement is the primary limit to high accuracy, wherein the Seebeck coefficients measured by the 2-probe arrangement and those measured by the 4-probe arrangement diverge with the increase in temperature, approaching ≈14% at 900 K. Using these analyses, we provide recommended measurement protocols to guide members of the thermoelectric materials community in performing more accurate measurements and in evaluating more comprehensive uncertainty limits.
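
    The temporal-asynchronicity error described above has a simple mechanism: the Seebeck coefficient is estimated as S = ΔV/ΔT, so if the voltage and the temperature difference are sampled at different instants while the gradient drifts, the ratio is biased. The sketch below uses entirely hypothetical numbers (an assumed true coefficient and an assumed linear drift), not data from the paper, to show how a sampling lag biases the estimate.

```python
S_true = -150.0  # uV/K, assumed true Seebeck coefficient (hypothetical)

def delta_T(t):
    """Temperature difference across the sample (K), drifting in time."""
    return 5.0 + 0.5 * t  # hypothetical linear drift

def voltage(t):
    """Ideal thermoelectric voltage (uV) at time t."""
    return S_true * delta_T(t)

def apparent_seebeck(t_voltage, t_temperature):
    """Seebeck estimate when V and dT are sampled at different instants."""
    return voltage(t_voltage) / delta_T(t_temperature)

synchronous = apparent_seebeck(1.0, 1.0)   # no lag: recovers S_true
asynchronous = apparent_seebeck(1.0, 2.0)  # 1 s lag between V and dT samples
error_pct = 100 * (asynchronous - S_true) / S_true
print(synchronous, asynchronous, round(error_pct, 1))
```

    With these assumed numbers a 1 s lag already biases the estimate by several percent, consistent in spirit with the order-of-10% errors the overview attributes to temporal asynchronicity.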

  16. A Community Data Model for Hydrologic Observations

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Horsburgh, J. S.; Zaslavsky, I.; Maidment, D. R.; Valentine, D.; Jennings, B.

    2006-12-01

    The CUAHSI Hydrologic Information System project is developing information technology infrastructure to support hydrologic science. Hydrologic information science involves the description of hydrologic environments in a consistent way, using data models for information integration. This includes a hydrologic observations data model for the storage and retrieval of hydrologic observations in a relational database, designed to facilitate data retrieval for integrated analysis of information collected by multiple investigators. It is intended to provide a standard format to facilitate the effective sharing of information between investigators and to facilitate analysis of information within a single study area or hydrologic observatory, or across hydrologic observatories and regions. The observations data model is designed to store hydrologic observations and sufficient ancillary information (metadata) about the observations to allow them to be unambiguously interpreted and used, and to provide traceable heritage from raw measurements to usable information. The design is based on the premise that a relational database at the single-observation level is most effective for providing querying capability and cross-dimension data retrieval and analysis. This premise is being tested through the implementation of a prototype hydrologic observations database, and the development of web services for the retrieval of data from and ingestion of data into the database. These web services, hosted by the San Diego Supercomputer Center, make data in the database accessible both through a Hydrologic Data Access System portal and directly from applications software such as Excel, Matlab and ArcGIS that have Simple Object Access Protocol (SOAP) capability.
This paper will (1) describe the data model; (2) demonstrate the capability for representing diverse data in the same database; and (3) demonstrate the use of the database from applications software to perform hydrologic analysis across different observation types.
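
    The design premise, a relational store at the single-observation level with enough metadata to interpret each value, can be sketched with a small in-memory database. The table and column names below are illustrative only, not the actual CUAHSI observations data model schema, and the sample values are hypothetical.

```python
import sqlite3

# Hypothetical single-observation relational layout: sites and variables
# are normalized out, and every row in `observations` is one measured value
# with the metadata needed to interpret it.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sites (site_id INTEGER PRIMARY KEY, site_name TEXT,
                    lat REAL, lon REAL);
CREATE TABLE variables (variable_id INTEGER PRIMARY KEY,
                        name TEXT, units TEXT);
CREATE TABLE observations (
    obs_id INTEGER PRIMARY KEY,
    site_id INTEGER REFERENCES sites(site_id),
    variable_id INTEGER REFERENCES variables(variable_id),
    obs_time TEXT,       -- ISO 8601 timestamp
    value REAL,
    quality_code TEXT    -- metadata needed to interpret the raw value
);
""")
con.execute("INSERT INTO sites VALUES (1, 'Logan River', 41.74, -111.83)")
con.execute("INSERT INTO variables VALUES (1, 'discharge', 'm^3/s')")
con.executemany("INSERT INTO observations VALUES (?,?,?,?,?,?)", [
    (1, 1, 1, "2006-10-01T00:00", 1.2, "ok"),
    (2, 1, 1, "2006-10-01T01:00", 1.4, "ok"),
])

# Cross-dimension retrieval: mean value per site and variable, with units.
row = con.execute("""
  SELECT s.site_name, v.name, v.units, AVG(o.value)
  FROM observations o
  JOIN sites s ON s.site_id = o.site_id
  JOIN variables v ON v.variable_id = o.variable_id
  GROUP BY s.site_id, v.variable_id
""").fetchone()
print(row)
```

    Storing one row per observation is what makes queries like the aggregate above uniform across investigators and observation types; series-level or file-level storage would not support this kind of cross-dimension retrieval.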

  17. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly, with limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy-to-perform changes, and a centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the "middleware" in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.

  18. A DICOM based radiotherapy plan database for research collaboration and reporting

    NASA Astrophysics Data System (ADS)

    Westberg, J.; Krogh, S.; Brink, C.; Vogelius, I. R.

    2014-03-01

    Purpose: To create a central radiotherapy (RT) plan database for dose analysis and reporting, capable of calculating and presenting statistics on user defined patient groups. The goal is to facilitate multi-center research studies with easy and secure access to RT plans and statistics on protocol compliance. Methods: RT institutions are able to send data to the central database using DICOM communications on a secure computer network. The central system is composed of a number of DICOM servers, an SQL database and in-house developed software services to process the incoming data. A web site within the secure network allows the user to manage their submitted data. Results: The RT plan database has been developed in Microsoft .NET and users are able to send DICOM data between RT centers in Denmark. Dose-volume histogram (DVH) calculations performed by the system are comparable to those of conventional RT software. A permission system was implemented to ensure access control and easy, yet secure, data sharing across centers. The reports contain DVH statistics for structures in user defined patient groups. The system currently contains over 2200 patients in 14 collaborations. Conclusions: A central RT plan repository for use in multi-center trials and quality assurance was created. The system provides an attractive alternative to dummy runs by enabling continuous monitoring of protocol conformity and plan metrics in a trial.
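    The DVH statistics the system reports can be illustrated with a minimal sketch. The function below computes a cumulative dose-volume histogram from a voxel dose grid and a binary structure mask; the function name, array shapes, and NumPy implementation are illustrative assumptions, not the system's actual .NET code.

```python
import numpy as np

def cumulative_dvh(dose, mask, n_levels=50):
    """Cumulative DVH: fraction of structure volume receiving
    at least each dose level."""
    d = dose[mask]                          # doses of voxels inside the structure
    levels = np.linspace(0.0, d.max(), n_levels)
    frac = np.array([(d >= lv).mean() for lv in levels])
    return levels, frac

# toy example: uniform 2 Gy dose inside a small cubic structure
dose = np.full((4, 4, 4), 2.0)
mask = np.zeros((4, 4, 4), dtype=bool)
mask[1:3, 1:3, 1:3] = True
levels, vf = cumulative_dvh(dose, mask)
```

    Protocol-compliance metrics such as the dose covering 95% of a structure, or the volume receiving at least a threshold dose, are read directly off a curve of this kind.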

  19. Standards for Environmental Measurement Using GIS: Toward a Protocol for Protocols.

    PubMed

    Forsyth, Ann; Schmitz, Kathryn H; Oakes, Michael; Zimmerman, Jason; Koepp, Joel

    2006-02-01

    Interdisciplinary research regarding how the built environment influences physical activity has recently increased. Many research projects conducted jointly by public health and environmental design professionals are using geographic information systems (GIS) to objectively measure the built environment. Numerous methodological issues remain, however, and environmental measurements have not been well documented with accepted, common definitions of valid, reliable variables. This paper proposes how to create and document standardized definitions for measures of environmental variables using GIS with the ultimate goal of developing reliable, valid measures. Inherent problems with software and data that hamper environmental measurement can be offset by protocols combining clear conceptual bases with detailed measurement instructions. Examples demonstrate how protocols can more clearly translate concepts into specific measurement. This paper provides a model for developing protocols to allow high quality comparative research on relationships between the environment and physical activity and other outcomes of public health interest.

  20. The immune response to anesthesia: part 2 sedatives, opioids, and injectable anesthetic agents.

    PubMed

    Anderson, Stacy L; Duke-Novakovski, Tanya; Singh, Baljit

    2014-11-01

    To review the immune response to injectable anesthetics and sedatives and to compare the immunomodulatory properties between inhalation and injectable anesthetic protocols. Review. Multiple literature searches were performed using PubMed and Google Scholar from March 2012 through November 2013. Relevant anesthetic and immune terms were used to search databases without year published or species constraints. The online database for Veterinary Anaesthesia and Analgesia and the Journal of Veterinary Emergency and Critical Care were searched by issue starting in 2000 for relevant articles. Sedatives, injectable anesthetics, opioids, and local anesthetics have immunomodulatory effects that may have positive or negative consequences on disease processes such as endotoxemia, generalized sepsis, tumor growth and metastasis, and ischemia-reperfusion injury. Therefore, anesthetists should consider the immunomodulatory effects of anesthetic drugs when designing anesthetic protocols for their patients. © 2014 Association of Veterinary Anaesthetists and the American College of Veterinary Anesthesia and Analgesia.

  1. Comparative Study Of Image Enhancement Algorithms For Digital And Film Mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delgado-Gonzalez, A.; Sanmiguel, R. E.

    2008-08-11

    Here we discuss the application of edge-enhancement algorithms to images obtained with a mammography system equipped with a selenium detector, and to images obtained from digitized film mammography. Comparative analysis of such images includes the study of technical aspects of image acquisition, storage, compression, and display. A protocol for a local database has been created as a result of this study.

  2. Exchange, interpretation, and database-search of ion mobility spectra supported by data format JCAMP-DX

    NASA Technical Reports Server (NTRS)

    Baumback, J. I.; Davies, A. N.; Vonirmer, A.; Lampen, P. H.

    1995-01-01

    To assist peak assignment in ion mobility spectrometry it is important to have quality reference data. The reference collection should be stored in a database system capable of being searched using spectral or substance information. We propose to build such a database customized for ion mobility spectra. To start off with, it is important to quickly reach a critical mass of data in the collection, so we wish to obtain as many spectra, combined with their IMS parameters, as possible. Spectra suppliers will be rewarded for their participation with access to the database. To make the data exchange between users and system administration possible, it is important to define a file format tailored to the requirements of ion mobility spectra. The format should be computer readable and flexible enough for extensive comments to be included. In this document we propose a data exchange format and invite comments on it. For international data exchange it is important to have a standard exchange format, and we propose to base its definition on the JCAMP-DX protocol, which was developed for the exchange of infrared spectra. This standard, made by the Joint Committee on Atomic and Molecular Physical Data, is of a flexible design. The aim of this paper is to adapt JCAMP-DX to the special requirements of ion mobility spectra.

  3. A database for estimating organ dose for coronary angiography and brain perfusion CT scans for arbitrary spectra and angular tube current modulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rupcich, Franco; Badal, Andreu; Kyprianou, Iacovos

    Purpose: The purpose of this study was to develop a database for estimating organ dose in a voxelized patient model for coronary angiography and brain perfusion CT acquisitions with any spectra and angular tube current modulation setting. The database enables organ dose estimation for existing and novel acquisition techniques without requiring Monte Carlo simulations. Methods: The study simulated transport of monoenergetic photons between 5 and 150 keV for 1000 projections over 360° through anthropomorphic voxelized female chest and head (0° and 30° tilt) phantoms and standard head and body CTDI dosimetry cylinders. The simulations resulted in tables of normalized dose deposition for several radiosensitive organs quantifying the organ dose per emitted photon for each incident photon energy and projection angle for coronary angiography and brain perfusion acquisitions. The values in a table can be multiplied by an incident spectrum and number of photons at each projection angle and then summed across all energies and angles to estimate total organ dose. Scanner-specific organ dose may be approximated by normalizing the database-estimated organ dose by the database-estimated CTDIvol and multiplying by a physical CTDIvol measurement. Two examples are provided demonstrating how to use the tables to estimate relative organ dose. In the first, the change in breast and lung dose during coronary angiography CT scans is calculated for reduced kVp, angular tube current modulation, and partial angle scanning protocols relative to a reference protocol. In the second example, the change in dose to the eye lens is calculated for a brain perfusion CT acquisition in which the gantry is tilted 30° relative to a nontilted scan. Results: Our database provides tables of normalized dose deposition for several radiosensitive organs irradiated during coronary angiography and brain perfusion CT scans.
Validation results indicate total organ doses calculated using our database are within 1% of those calculated using Monte Carlo simulations with the same geometry and scan parameters for all organs except red bone marrow (within 6%), and within 23% of published estimates for different voxelized phantoms. Results from the example of using the database to estimate organ dose for coronary angiography CT acquisitions show 2.1%, 1.1%, and -32% change in breast dose and 2.1%, -0.74%, and 4.7% change in lung dose for reduced kVp, tube current modulated, and partial angle protocols, respectively, relative to the reference protocol. Results show -19.2% difference in dose to eye lens for a tilted scan relative to a nontilted scan. The reported relative changes in organ doses are presented without quantification of image quality and are for the sole purpose of demonstrating the use of the proposed database. Conclusions: The proposed database and calculation method enable the estimation of organ dose for coronary angiography and brain perfusion CT scans utilizing any spectral shape and angular tube current modulation scheme by taking advantage of the precalculated Monte Carlo simulation results. The database can be used in conjunction with image quality studies to develop optimized acquisition techniques and may be particularly beneficial for optimizing dual kVp acquisitions for which numerous kV, mA, and filtration combinations may be investigated.
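    The table-lookup estimation described in this record reduces to a weighted double sum over photon energies and projection angles. A minimal sketch of that arithmetic follows; the array shapes are taken from the abstract's description, but the table values, spectrum, and photon counts are made-up numbers, not the study's data.

```python
import numpy as np

def estimate_organ_dose(dose_table, spectrum, photons_per_angle):
    """Total organ dose from a precomputed monoenergetic dose table.

    dose_table:        (n_energies, n_angles) organ dose per emitted photon
    spectrum:          (n_energies,) relative fluence of the incident spectrum
    photons_per_angle: (n_angles,) photons emitted at each projection angle
                       (angular tube current modulation = non-uniform values)
    """
    weighted = dose_table * spectrum[:, None] * photons_per_angle[None, :]
    return weighted.sum()   # sum across all energies and angles

# hypothetical numbers: 3 energy bins, 4 projection angles
table = np.full((3, 4), 2.0e-9)       # dose per photon (arbitrary units)
spec = np.array([0.2, 0.5, 0.3])      # normalized incident spectrum
n_photons = np.array([1e6] * 4)       # uniform tube current (no modulation)
dose = estimate_organ_dose(table, spec, n_photons)
```

    Because the table is precomputed once per phantom, evaluating a new spectrum or modulation scheme is just this multiply-and-sum, with no further Monte Carlo runs.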

  4. Standardizing Quality Assessment of Fused Remotely Sensed Images

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because visual assessment is always subjective and quantitative assessment can be based on different criteria; depending on the criteria and indices, the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process for objectively comparing fused image quality. First, established image fusion quality assessment protocols, i.e., Quality with No Reference (QNR) and Khan's protocol, were compared across various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.

  5. An Information System for European culture collections: the way forward.

    PubMed

    Casaregola, Serge; Vasilenko, Alexander; Romano, Paolo; Robert, Vincent; Ozerskaya, Svetlana; Kopf, Anna; Glöckner, Frank O; Smith, David

    2016-01-01

    Culture collections contain indispensable information about the microorganisms preserved in their repositories, such as taxonomical descriptions, origins, physiological and biochemical characteristics, bibliographic references, etc. However, information currently accessible in databases rarely adheres to common standard protocols. The resultant heterogeneity between culture collections, in terms of both content and format, notably hampers microorganism-based research and development (R&D). The optimized exploitation of these resources thus requires standardized, and simplified, access to the associated information. To this end, and in the interest of supporting R&D in the fields of agriculture, health and biotechnology, a pan-European distributed research infrastructure, MIRRI, including over 40 public culture collections and research institutes from 19 European countries, was established. A prime objective of MIRRI is to unite and provide universal access to the fragmented, and untapped, resources, information and expertise available in European public collections of microorganisms; a key component of which is to develop a dynamic Information System. For the first time, both culture collection curators as well as their users have been consulted and their feedback, concerning the needs and requirements for collection databases and data accessibility, utilised. Users primarily noted that databases were not interoperable, thus rendering a global search of multiple databases impossible. Unreliable or out-of-date and, in particular, non-homogenous, taxonomic information was also considered to be a major obstacle to searching microbial data efficiently. Moreover, complex searches are rarely possible in online databases thus limiting the extent of search queries. 
Curators also consider that overall harmonization (including Standard Operating Procedures, data structure, and software tools) is necessary to facilitate their work and to make high-quality data easily accessible to their users. Clearly, the needs of culture collection curators coincide with those of users on the crucial point of database interoperability. In this regard, and in order to design an appropriate Information System, important aspects on which the culture collection community should focus include: the interoperability of data sets with the ontologies to be used; setting best practice in data management; and the definition of an appropriate data standard.

  6. Validity of linear measurements of the jaws using ultralow-dose MDCT and the iterative techniques of ASIR and MBIR.

    PubMed

    Al-Ekrish, Asma'a A; Al-Shawaf, Reema; Schullian, Peter; Al-Sadhan, Ra'ed; Hörmann, Romed; Widmann, Gerlig

    2016-10-01

    To assess the comparability of linear measurements of dental implant sites recorded from multidetector computed tomography (MDCT) images obtained using standard-dose filtered backprojection (FBP) technique with those from various ultralow doses combined with FBP, adaptive statistical iterative reconstruction (ASIR), and model-based iterative reconstruction (MBIR) techniques. The results of the study may contribute to MDCT dose optimization for dental implant site imaging. MDCT scans of two cadavers were acquired using a standard reference protocol and four ultralow-dose test protocols (TP). The volume CT dose index of the different dose protocols ranged from a maximum of 30.48-36.71 mGy to a minimum of 0.44-0.53 mGy. All scans were reconstructed using FBP, ASIR-50, ASIR-100, and MBIR, and either a bone or standard reconstruction kernel. Linear measurements were recorded from standardized images of the jaws by two examiners. Intra- and inter-examiner reliability of the measurements were analyzed using Cronbach's alpha and inter-item correlation. Agreement between the measurements obtained with the reference-dose/FBP protocol and each of the test protocols was determined with Bland-Altman plots and linear regression. Statistical significance was set at a P-value of 0.05. No systematic variation was found between the linear measurements obtained with the reference protocol and the other imaging protocols. The only exceptions were TP3/ASIR-50 (bone kernel) and TP4/ASIR-100 (bone and standard kernels). The mean measurement differences between these three protocols and the reference protocol were within ±0.1 mm, with the 95 % confidence interval limits being within the range of ±1.15 mm. A nearly 97.5 % reduction in dose did not significantly affect the height and width measurements of edentulous jaws regardless of the reconstruction algorithm used.

  7. Effective Dose of CT- and Fluoroscopy-Guided Perineural/Epidural Injections of the Lumbar Spine: A Comparative Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmid, Gebhard; Schmitz, Alexander; Borchardt, Dieter

    The objective of this study was to compare the effective radiation dose of perineural and epidural injections of the lumbar spine under computed tomography (CT) or fluoroscopic guidance with respect to dose-reduced protocols. We assessed the radiation dose with an Alderson Rando phantom at the lumbar segment L4/5 using 29 thermoluminescence dosimeters. Based on our clinical experience, 4-10 CT scans and 1 min of fluoroscopy are appropriate. Effective doses were calculated for CT for a routine lumbar spine protocol and for maximum dose reduction, as well as for fluoroscopy in a continuous and a pulsed mode (3-15 pulses/s). Effective doses under CT guidance were 1.51 mSv for 4 scans and 3.53 mSv for 10 scans using a standard protocol, and 0.22 mSv and 0.43 mSv for the low-dose protocol. In continuous mode, the effective doses ranged from 0.43 to 1.25 mSv for 1-3 min of fluoroscopy. Using 1 min of pulsed fluoroscopy, the effective dose was less than 0.1 mSv for 3 pulses/s. A consistently applied low-dose CT protocol reduces the effective dose by more than 85% compared to a standard lumbar spine protocol; a similar dose might be expected when applying about 1 min of continuous fluoroscopy for guidance. A pulsed mode further reduces the effective dose of fluoroscopy by 80-90%.
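    Effective dose of the kind reported here is a tissue-weighted sum of organ equivalent doses, E = Σ_T w_T · H_T. The sketch below uses a small subset of the ICRP 103 tissue weighting factors with made-up organ doses; it is not the study's phantom data, only an illustration of the weighting arithmetic.

```python
# Effective dose as an ICRP-style tissue-weighted sum, E = sum_T w_T * H_T.
tissue_weights = {        # a few ICRP Publication 103 weighting factors
    "gonads": 0.08,
    "colon": 0.12,
    "stomach": 0.12,
    "bladder": 0.04,
}
organ_dose_mSv = {        # hypothetical equivalent doses from dosimeters
    "gonads": 0.5,
    "colon": 1.0,
    "stomach": 0.8,
    "bladder": 1.2,
}
effective_dose = sum(w * organ_dose_mSv[t] for t, w in tissue_weights.items())
```

    In the actual study, the thermoluminescence dosimeter readings at each organ position play the role of the equivalent doses in this sum.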

  8. Cost analysis of negative-pressure wound therapy with instillation for wound bed preparation preceding split-thickness skin grafts for massive (>100 cm(2)) chronic venous leg ulcers.

    PubMed

    Yang, C Kevin; Alcantara, Sean; Goss, Selena; Lantis, John C

    2015-04-01

    Massive (≥100 cm(2)) venous leg ulcers (VLUs) demonstrate very low closure rates with standard compression therapy and are costly to manage. Negative-pressure wound therapy (NPWT), followed by a split-thickness skin graft (STSG), can be a cost-effective alternative to this standard care. We performed a cost analysis of these two treatments. A retrospective review was performed of 10 ulcers treated with surgical debridement, 7 days of inpatient NPWT with topical antiseptic instillation (NPWTi), and STSG, with 4 additional days of inpatient NPWT bolster over the graft. Independent medical cost estimators were used to compare the cost of this treatment protocol with standard outpatient compression therapy. The average length of time ulcers were present before patients entered the study was 38 months (range, 3-120 months). Eight of 10 patients had complete VLU closure by 6 months after NPWTi with STSG. The 6-month costs of the proposed treatment protocol and standard twice-weekly compression therapy were estimated to be $27,000 and $28,000, respectively. NPWTi with STSG treatment is more effective for closure of massive VLUs at 6 months than that reported for standard compression therapy. Further, the cost of the proposed treatment protocol is comparable with standard compression therapy. Copyright © 2015 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  9. The Behavior of TCP and Its Extensions in Space

    NASA Technical Reports Server (NTRS)

    Wang, Ruhai; Horan, Stephen

    2001-01-01

    The performance of the Transmission Control Protocol (TCP) in space has been examined through simulation and experimental tests for several years at the National Aeronautics and Space Administration (NASA), the Department of Defense (DoD), and universities. At New Mexico State University (NMSU), we have been concentrating on studying the performance of two protocol suites: the file transfer protocol (ftp) running over the Transmission Control Protocol/Internet Protocol (TCP/IP) stack, and the file protocol (fp) running over the Space Communications Protocol Standards (SCPS)-Transport Protocol (TP) developed under the Consultative Committee for Space Data Systems (CCSDS) standards process. SCPS-TP can be considered TCP's extension for space communications. This dissertation experimentally studies the behavior of TCP and SCPS-TP by running the protocol suites over both the Space-to-Ground Link Simulator (SGLS) test-bed and a realistic satellite link. The study concentrates on comparing protocol behavior by plotting the averaged file transfer times for different experimental configurations and analyzing them using Statistical Analysis System (SAS) based procedures. The effects of different link delays and various bit error rates (BERs) on each protocol's performance are also studied, and linear regression models are built for experiments over the SGLS test-bed to reflect the relationships between file transfer time and various transmission conditions.

  10. Development of a data entry auditing protocol and quality assurance for a tissue bank database.

    PubMed

    Khushi, Matloob; Carpenter, Jane E; Balleine, Rosemary L; Clarke, Christine L

    2012-03-01

    Human transcription error is an acknowledged risk when extracting information from paper records for entry into a database. For a tissue bank, it is critical that accurate data are provided to researchers with approved access to tissue bank material. The challenges of tissue bank data collection include manual extraction of data from complex medical reports that are accessed from a number of sources and that differ in style and layout. As a quality assurance measure, the Breast Cancer Tissue Bank (http://www.abctb.org.au) has implemented an auditing protocol and, in order to efficiently execute the process, has developed an open source database plug-in tool (eAuditor) to assist in auditing of data held in our tissue bank database. Using eAuditor, we have identified that human entry errors range from 0.01% when entering donors' clinical follow-up details to 0.53% when entering pathological details, highlighting the importance of an audit protocol tool such as eAuditor in a tissue bank database. eAuditor was developed and tested on the Caisis open source clinical-research database; however, it can be integrated into other databases where similar functionality is required.
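    At its core, an audit of this kind is a field-by-field comparison of database rows against independently re-extracted source values, yielding an error rate like the 0.01%-0.53% range reported. The sketch below is purely illustrative; eAuditor itself is a database plug-in, and the function, record structure, and field names here are assumptions.

```python
def audit_error_rate(db_records, source_records, fields):
    """Compare database entries against re-extracted source values,
    field by field; return (mismatches, comparisons, error rate)."""
    mismatches = 0
    total = 0
    for db_row, src_row in zip(db_records, source_records):
        for f in fields:
            total += 1
            if db_row.get(f) != src_row.get(f):
                mismatches += 1
    rate = mismatches / total if total else 0.0
    return mismatches, total, rate

# hypothetical rows: one transcription error in four audited fields
db  = [{"grade": "2", "er_status": "pos"}, {"grade": "3", "er_status": "neg"}]
src = [{"grade": "2", "er_status": "pos"}, {"grade": "2", "er_status": "neg"}]
m, n, rate = audit_error_rate(db, src, ["grade", "er_status"])
```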

  11. Integrating Borrowed Records into a Database: Impact on Thesaurus Development and Retrieval.

    ERIC Educational Resources Information Center

    Kirtland, Monika; And Others

    1980-01-01

    Discusses three approaches to thesaurus and indexing/retrieval language maintenance for combined databases: reindexing, merging, and initial standardization. Two thesauri for a combined database are evaluated in terms of their compatibility, and indexing practices are compared. Tables and figures help illustrate aspects of the comparison. (SW)

  12. The Virtual Insect Brain protocol: creating and comparing standardized neuroanatomy

    PubMed Central

    Jenett, Arnim; Schindelin, Johannes E; Heisenberg, Martin

    2006-01-01

    Background In the fly Drosophila melanogaster, new genetic, physiological, molecular and behavioral techniques for the functional analysis of the brain are rapidly accumulating. These diverse investigations of the function of the insect brain use gene expression patterns, which can be visualized and provide the means for manipulating groups of neurons, as a common ground. To take advantage of these patterns one needs to know their typical anatomy. Results This paper describes the Virtual Insect Brain (VIB) protocol, a script suite for the quantitative assessment, comparison, and presentation of neuroanatomical data. It is based on the 3D-reconstruction and visualization software Amira, version 3.x (Mercury Inc.) [1]. Besides its backbone, a standardization procedure that aligns individual 3D images (series of virtual sections obtained by confocal microscopy) to a common coordinate system and computes average intensities for each voxel (volume pixel), the VIB protocol provides an elaborate data management system for data administration. The VIB protocol facilitates direct comparison of gene expression patterns and describes their interindividual variability. It provides volumetry of brain regions and helps to characterize the phenotypes of brain structure mutants. Using the VIB protocol does not require any programming skills, since all operations are carried out through an intuitively usable graphical user interface. Although the VIB protocol has been developed for the standardization of Drosophila neuroanatomy, the program structure can be used for the standardization of other 3D structures as well. Conclusion Standardizing brains and gene expression patterns is a new approach to biological shape and its variability. The VIB protocol provides a first set of tools supporting this endeavor in Drosophila. The script suite is freely available at [2]. PMID:17196102

  13. Low-Contrast and Low-Radiation Dose Protocol in Cardiac Computed Tomography: Usefulness of Low Tube Voltage and Knowledge-Based Iterative Model Reconstruction Algorithm.

    PubMed

    Iyama, Yuji; Nakaura, Takeshi; Yokoyama, Koichi; Kidoh, Masafumi; Harada, Kazunori; Oda, Seitaro; Tokuyasu, Shinichi; Yamashita, Yasuyuki

    This study aimed to evaluate the feasibility of a low-contrast-dose, low-radiation-dose protocol of 80 peak kilovoltage (kVp) prospective electrocardiography-gated cardiac computed tomography (CT) using knowledge-based iterative model reconstruction (IMR). Thirty patients underwent 80-kVp prospective electrocardiography-gated cardiac CT with a low contrast agent dose (222 mg iodine per kilogram of body weight). We also enrolled 30 consecutive patients who were scanned with 120-kVp cardiac CT with filtered back projection using the standard contrast agent dose (370 mg iodine per kilogram of body weight) as a historical control group. We evaluated the radiation dose for the 2 groups. The 80-kVp images were reconstructed with filtered back projection (protocol A), hybrid iterative reconstruction (HIR, protocol B), and IMR (protocol C). We compared CT numbers, image noise, and contrast-to-noise ratio among the 120-kVp protocol, protocol A, protocol B, and protocol C. In addition, we compared the noise reduction rate between HIR and IMR. Two independent readers compared image contrast, image noise, image sharpness, unfamiliar image texture, and overall image quality among the 4 protocols. The estimated effective dose of the 80-kVp protocol was 74% lower than that of the 120-kVp protocol (1.4 vs 5.4 mSv). The contrast-to-noise ratio of protocol C was significantly higher than that of protocol A. The noise reduction rate of IMR was significantly higher than that of HIR (P < 0.01). There was no significant difference in almost all qualitative image quality scores between the 120-kVp protocol and protocol C, except for image contrast. An 80-kVp protocol with IMR yields comparable image quality with a 74% lower radiation dose and a 40% lower contrast agent dose relative to a 120-kVp protocol, while reducing image noise further compared with the 80-kVp protocol with HIR.

  14. An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.

    ERIC Educational Resources Information Center

    Chen, I-Min A.; Markowitz, Victor M.

    1995-01-01

    Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…

  15. Impacts on health outcomes and on resource utilisation of home-based parenteral chemotherapy administration: a systematic review protocol.

    PubMed

    Mittaine-Marzac, Benedicte; De Stampa, Matthieu; Bagaragaza, Emmanuel; Ankri, Joël; Aegerter, Philippe

    2018-05-09

    Despite demonstrated feasibility, and policies in a few countries to enable more patients to receive chemotherapy at home, parenteral chemotherapy administration at home currently remains marginal. Of note, findings of different studies on health outcomes and resource utilisation vary, leading to conflicting results. This protocol outlines a systematic review that seeks to synthesise and critically appraise the current state of evidence on the comparison between the home setting and the hospital setting for parenteral chemotherapy administration within the same high standards of clinical care. This protocol has been prepared following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocols approach. Electronic searches will be conducted on selected bibliographic databases, from the earliest available date through 15 November 2017, for publications in French and English. Additional potential papers cited in the selected studies, as well as grey literature, will also be included in the review. The review will include all types of studies exploring patients receiving anticancer drugs by injection at home compared with patients receiving the drugs in a hospital setting, and will assess at least one of the following criteria: patients' health outcomes; patients' or caregivers' satisfaction; resource utilisation with cost savings; and incentives and/or barriers of each admission setting according to patients' and relatives' points of view. Two reviewers will independently screen studies and extract relevant data from the included studies. The methodological quality of studies will be assessed using the 'Quality Assessment Tool for Quantitative Studies' developed by the Effective Public Health Practice Project, in addition to the Consolidated Health Economic Evaluation Reporting Standards statement for economic studies. As the review is focused on the analysis of secondary data, it does not require ethics approval.
The results of the study will be disseminated through articles in peer-reviewed journals and trade publications, as well as presentations at relevant conferences. CRD42017068164. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. Reducing radiation dose to the female breast during conventional and dedicated breast computed tomography

    NASA Astrophysics Data System (ADS)

    Rupcich, Franco John

    The purpose of this study was to quantify the effectiveness of techniques intended to reduce dose to the breast during CT coronary angiography (CTCA) scans with respect to task-based image quality, and to evaluate the effectiveness of optimal energy weighting in improving contrast-to-noise ratio (CNR), and thus the potential for reducing breast dose, during energy-resolved dedicated breast CT. A database quantifying organ dose for several radiosensitive organs irradiated during CTCA, including the breast, was generated using Monte Carlo simulations. This database facilitates estimation of organ-specific dose deposited during CTCA protocols using arbitrary x-ray spectra or tube-current modulation schemes without the need to run Monte Carlo simulations. The database was used to estimate breast dose for simulated CT images acquired for a reference protocol and five protocols intended to reduce breast dose. For each protocol, the performance of two tasks (detection of signals with unknown locations) was compared over a range of breast dose levels using a task-based, signal-detectability metric: the estimator of the area under the exponential free-response relative operating characteristic curve, AFE. For large-diameter/medium-contrast signals, when maintaining equivalent AFE, the 80 kV partial, 80 kV, 120 kV partial, and 120 kV tube-current modulated protocols reduced breast dose by 85%, 81%, 18%, and 6%, respectively, while the shielded protocol increased breast dose by 68%. Results for the small-diameter/high-contrast signal followed similar trends, but with smaller magnitude of the percent changes in dose. The 80 kV protocols demonstrated the greatest reduction to breast dose, however, the subsequent increase in noise may be clinically unacceptable. Tube output for these protocols can be adjusted to achieve more desirable noise levels with lesser dose reduction. 
The CNR improvement of optimally weighted projection-based and image-based images relative to photon-counting images was investigated for six different energy bin combinations using a bench-top energy-resolving CT system with a cadmium zinc telluride (CZT) detector. The non-ideal spectral response reduced the CNR of the projection-based weighted images, while image-based weighting improved CNR for five of the six investigated bin combinations despite this non-ideal response, indicating the potential for image-based weighting to reduce breast dose during dedicated breast CT.
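The energy weighting idea above can be sketched numerically. This is a minimal sketch under the standard independent-bin assumption; the two-bin ROI statistics below are hypothetical values chosen for illustration, not data from the study. Photon-counting weights every bin equally, while optimal weighting weights each bin by contrast/noise², which for independent bins can only match or improve CNR.

```python
import math

def cnr(weights, signal, background, noise):
    # Contrast of the weighted bin sum; independent bin noise adds in quadrature.
    contrast = sum(w * (s - b) for w, s, b in zip(weights, signal, background))
    sigma = math.sqrt(sum((w * n) ** 2 for w, n in zip(weights, noise)))
    return abs(contrast) / sigma

# Hypothetical per-bin ROI statistics for a two-bin acquisition.
signal = [105.0, 118.0]      # mean pixel value in the signal ROI, per bin
background = [100.0, 110.0]  # mean pixel value in the background ROI, per bin
noise = [4.0, 6.0]           # background standard deviation, per bin

# Photon-counting weighting: every bin counts equally.
cnr_pc = cnr([1.0, 1.0], signal, background, noise)

# Optimal weighting for independent bins: w_i proportional to contrast_i / noise_i^2.
w_opt = [(s - b) / n ** 2 for s, b, n in zip(signal, background, noise)]
cnr_opt = cnr(w_opt, signal, background, noise)

print(round(cnr_pc, 2), round(cnr_opt, 2))
```

With these made-up numbers the optimally weighted CNR edges out photon counting; a non-ideal spectral response, as reported for the CZT detector, would blur the bins and erode that gain.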

  17. High SNR Acquisitions Improve the Repeatability of Liver Fat Quantification Using Confounder-corrected Chemical Shift-encoded MR Imaging

    PubMed Central

    Motosugi, Utaroh; Hernando, Diego; Wiens, Curtis; Bannas, Peter; Reeder, Scott B.

    2017-01-01

    Purpose: To determine whether high signal-to-noise ratio (SNR) acquisitions improve the repeatability of liver proton density fat fraction (PDFF) measurements using confounder-corrected chemical shift-encoded magnetic resonance (MR) imaging (CSE-MRI). Materials and Methods: Eleven fat-water phantoms were scanned with 8 different protocols with varying SNR. After repositioning the phantoms, the same scans were repeated to evaluate the test-retest repeatability. Next, an in vivo study was performed with 20 volunteers and 28 patients scheduled for liver magnetic resonance imaging (MRI). Two CSE-MRI protocols with standard- and high-SNR were repeated to assess test-retest repeatability. MR spectroscopy (MRS)-based PDFF was acquired as a standard of reference. The standard deviation (SD) of the difference (Δ) in PDFF measured in the two repeated scans was used as the measure of repeatability. The correlation between PDFF of CSE-MRI and MRS was calculated to assess accuracy. The SD of Δ and correlation coefficients of the two protocols (standard- and high-SNR) were compared using an F-test and a t-test, respectively. Two reconstruction algorithms (complex-based and magnitude-based) were used for both the phantom and in vivo experiments. Results: The phantom study demonstrated that higher SNR improved the repeatability for both complex- and magnitude-based reconstruction. Similarly, the in vivo study demonstrated that the repeatability of the high-SNR protocol (SD of Δ = 0.53 for complex- and 0.85 for magnitude-based fit) was significantly better than that of the standard-SNR protocol (0.77 for complex, P < 0.001; and 0.94 for magnitude-based fit, P = 0.003). No significant difference was observed in the accuracy between standard- and high-SNR protocols. Conclusion: Higher SNR improves the repeatability of fat quantification using confounder-corrected CSE-MRI. PMID:28190853
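The repeatability metric used here, the SD of the paired differences Δ between two repeated scans, compared between protocols via an F-test on the variances, can be sketched as follows. All PDFF values below are made up for illustration; only the computation pattern reflects the abstract.

```python
import math

def sd_of_differences(scan1, scan2):
    # Sample SD of the paired differences between two repeated measurements;
    # a smaller SD of the differences means better test-retest repeatability.
    d = [a - b for a, b in zip(scan1, scan2)]
    mean_d = sum(d) / len(d)
    return math.sqrt(sum((x - mean_d) ** 2 for x in d) / (len(d) - 1))

# Hypothetical PDFF (%) values from two repeated scans of the same subjects.
standard_snr_1 = [5.1, 12.3, 8.0, 22.4, 3.2]
standard_snr_2 = [6.0, 11.1, 9.1, 21.2, 4.5]
high_snr_1 = [5.3, 12.0, 8.2, 22.1, 3.4]
high_snr_2 = [5.6, 11.6, 8.7, 21.8, 3.9]

sd_std = sd_of_differences(standard_snr_1, standard_snr_2)
sd_high = sd_of_differences(high_snr_1, high_snr_2)

# The abstract compares the two SDs with an F-test; the F statistic
# is simply the ratio of the two variances.
f_stat = (sd_std / sd_high) ** 2
print(round(sd_std, 3), round(sd_high, 3), round(f_stat, 2))
```

In the toy data the high-SNR protocol has the smaller SD of Δ, i.e. the better repeatability, mirroring the study's in vivo finding.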

  18. Systematic Evaluation of the Patient-Reported Outcome (PRO) Content of Clinical Trial Protocols

    PubMed Central

    Kyte, Derek; Duffy, Helen; Fletcher, Benjamin; Gheorghe, Adrian; Mercieca-Bebber, Rebecca; King, Madeleine; Draper, Heather; Ives, Jonathan; Brundage, Michael; Blazeby, Jane; Calvert, Melanie

    2014-01-01

    Background Qualitative evidence suggests patient-reported outcome (PRO) information is frequently absent from clinical trial protocols, potentially leading to inconsistent PRO data collection and risking bias. Direct evidence regarding PRO trial protocol content is lacking. The aim of this study was to systematically evaluate the PRO-specific content of UK National Institute for Health Research (NIHR) Health Technology Assessment (HTA) programme trial protocols. Methods and Findings We conducted an electronic search of the NIHR HTA programme database (inception to August 2013) for protocols describing a randomised controlled trial including a primary/secondary PRO. Two investigators independently reviewed the content of each protocol, using a specially constructed PRO-specific protocol checklist, alongside the ‘Standard Protocol Items: Recommendations for Interventional Trials’ (SPIRIT) checklist. Disagreements were resolved through discussion with a third investigator. 75 trial protocols were included in the analysis. Protocols included a mean of 32/51 (63%) SPIRIT recommendations (range 16–41, SD 5.62) and 11/33 (33%) PRO-specific items (range 4–18, SD 3.56). Over half (61%) of the PRO items were incomplete. Protocols containing a primary PRO included slightly more PRO checklist items (mean 14/33 (43%)). PRO protocol content was not associated with general protocol completeness; thus, protocols judged as relatively ‘complete’ using SPIRIT were still likely to have omitted a large proportion of PRO checklist items. Conclusions The PRO components of HTA clinical trial protocols require improvement. Information on the PRO rationale/hypothesis, data collection methods, training and management was often absent. This low compliance is unsurprising; evidence shows existing PRO guidance for protocol developers remains difficult to access and lacks consistency. 
Study findings suggest there are a number of PRO protocol checklist items that are not fully addressed by the current SPIRIT statement. We therefore advocate the development of consensus-based supplementary guidelines, aimed at improving the completeness and quality of PRO content in clinical trial protocols. PMID:25333349

  19. Phantom dosimetry and image quality of i-CAT FLX cone-beam computed tomography

    PubMed Central

    Ludlow, John B.; Walker, Cameron

    2013-01-01

    Introduction Increasing use of cone-beam computed tomography in orthodontics has been coupled with heightened concern about the long-term risks of x-ray exposure in orthodontic populations. An industry response to this has been to offer low-exposure alternative scanning options in newer cone-beam computed tomography models. Methods Effective doses resulting from various combinations of field size and field location, comparing child and adult anthropomorphic phantoms using the recently introduced i-CAT FLX cone-beam computed tomography unit, were measured with optically stimulated luminescence dosimetry using previously validated protocols. Scan protocols included High Resolution (360° rotation, 600 image frames, 120 kVp, 5 mA, 7.4 sec), Standard (360°, 300 frames, 120 kVp, 5 mA, 3.7 sec), QuickScan (180°, 160 frames, 120 kVp, 5 mA, 2 sec) and QuickScan+ (180°, 160 frames, 90 kVp, 3 mA, 2 sec). Contrast-to-noise ratio (CNR) was calculated as a quantitative measure of image quality for the various exposure options using the QUART DVT phantom. Results Child phantom doses were on average 36% greater than adult phantom doses. QuickScan+ protocols resulted in significantly lower doses than Standard protocols for the child (p=0.0167) and adult (p=0.0055) phantoms. Doses for 13×16 cm cephalometric fields of view ranged from 11–85 μSv in the adult phantom and 18–120 μSv in the child phantom for the QuickScan+ and Standard protocols, respectively. CNR was reduced by approximately two thirds when comparing QuickScan+ to Standard exposure parameters. Conclusions QuickScan+ effective doses are comparable to those of conventional panoramic examinations. Significant dose reductions are accompanied by significant reductions in image quality. However, this trade-off may be acceptable for certain diagnostic tasks, such as interim assessment of treatment results. PMID:24286904

  20. From Chaos to Content: An Integrated Approach to Government Web Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demuth, Nora H.; Knudson, Christa K.

    2005-01-03

    The web development team of the Environmental Technology Directorate (ETD) at the U.S. Department of Energy’s Pacific Northwest National Laboratory (PNNL) redesigned the ETD website as a database-driven system, powered by the newly designed ETD Common Information System (ETD-CIS). The ETD website was redesigned in response to an analysis that showed the previous ETD websites were inefficient, costly, and lacking in a consistent focus. Redesigned and newly created websites based on a new ETD template provide a consistent image, meet or exceed accessibility standards, and are linked through a common database. The protocols used in developing the ETD website support integration of further organizational sites and facilitate internal use by staff and training on ETD website development and maintenance. Other PNNL organizations have approached the ETD web development team with an interest in applying the methods established by the ETD system. The ETD system protocol could potentially be used by other DOE laboratories to improve their website efficiency and content focus. “The tools by which we share science information must be as extraordinary as the information itself.” – DOE Science Director Raymond Orbach

  1. SPANG: a SPARQL client supporting generation and reuse of queries for distributed RDF databases.

    PubMed

    Chiba, Hirokazu; Uchiyama, Ikuo

    2017-02-08

    Toward improved interoperability of distributed biological databases, an increasing number of datasets have been published in the standardized Resource Description Framework (RDF). Although the powerful SPARQL Protocol and RDF Query Language (SPARQL) provides a basis for exploiting RDF databases, writing SPARQL code is burdensome for users, including bioinformaticians. Thus, an easy-to-use interface is necessary. We developed SPANG, a SPARQL client that has unique features for querying RDF datasets. SPANG dynamically generates typical SPARQL queries according to specified arguments. It can also call SPARQL template libraries constructed in a local system or published on the Web. Further, it enables combinatorial execution of multiple queries, each with a distinct target database. These features facilitate easy and effective access to RDF datasets and integrative analysis of distributed data. SPANG helps users to exploit RDF datasets by generation and reuse of SPARQL queries through a simple interface. This client will enhance integrative exploitation of biological RDF datasets distributed across the Web. This software package is freely available at http://purl.org/net/spang.
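The kind of templated query generation SPANG performs, building a typical SPARQL SELECT from a few specified arguments, can be illustrated with a toy generator. The function name, its arguments, and the example class IRI below are illustrative assumptions, not SPANG's actual interface.

```python
def sparql_select(subject="?s", predicate="?p", obj="?o", limit=10):
    # Collect the variables (tokens starting with "?") for the SELECT clause,
    # then assemble a minimal SELECT ... WHERE ... LIMIT query string.
    variables = [t for t in (subject, predicate, obj) if t.startswith("?")]
    return (
        f"SELECT {' '.join(variables)}\n"
        f"WHERE {{\n  {subject} {predicate} {obj} .\n}}\n"
        f"LIMIT {limit}"
    )

# Find up to 5 resources of the hypothetical class <http://example.org/Protein>.
q = sparql_select("?s", "a", "<http://example.org/Protein>", limit=5)
print(q)
```

A generated query like this would then be posted to a SPARQL endpoint over HTTP; SPANG adds template libraries and combinatorial execution against multiple endpoints on top of this basic idea.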

  2. A systematic review of postoperative hand therapy management of basal joint arthritis.

    PubMed

    Wolfe, Terri; Chu, Jennifer Y; Woods, Tammy; Lubahn, John D

    2014-04-01

    There are a variety of postoperative immobilization and therapy options for patients with basal joint arthritis. Although prior systematic reviews have compared surgical procedures used to treat basal joint arthritis, none to our knowledge compares therapy protocols for this condition, which are considered an important part of the treatment. (1) We sought to determine whether differences in the length and type of postoperative immobilization affect clinical results after basal joint arthritis surgery. (2) We also compared specific therapy protocols that were prescribed. (3) Finally, we evaluated published protocols to determine when patients were released to full activity to see whether these appeared to affect clinical results. A systematic review of English-language studies in the PubMed and Cochrane databases was performed. Studies were then reviewed to determine what postoperative immobilization and therapy protocols the authors used and when patients were released to full activities. A total of 19 studies were identified using the search criteria. All but one of the studies included a postoperative period of immobilization in either a cast or splint. Immobilization time varied depending on whether Kirschner wires were used for the surgery and whether an implant was placed. Postoperative therapy protocols also varied but followed three general patterns. Some therapy protocols involved teaching patients a home exercise program only, whereas some authors described routine referral to a therapist. The third group consisted of studies in which patients were only referred for therapy if the physicians determined it was necessary during follow-up. Many studies did not give a specific time for full return to activity and instead described a gradual transition to full activity after immobilization was discontinued. Because of the variability and small numbers, no conclusive recommendations could be made on any of the three study questions. 
Comparative, multicenter studies comparing different immobilization and therapy protocols after the surgical treatment of basal joint arthritis would be helpful for both surgeons and therapists looking to refine their treatment protocols.

  3. Impact of voxel size variation on CBCT-based diagnostic outcome in dentistry: a systematic review.

    PubMed

    Spin-Neto, Rubens; Gotfredsen, Erik; Wenzel, Ann

    2013-08-01

    The objective of this study was to make a systematic review on the impact of voxel size in cone beam computed tomography (CBCT)-based image acquisition, retrieving evidence regarding the diagnostic outcome of those images. The MEDLINE bibliographic database was searched from 1950 to June 2012 for reports comparing diverse CBCT voxel sizes. The search strategy was limited to English-language publications using the following combined terms in the search strategy: (voxel or FOV or field of view or resolution) and (CBCT or cone beam CT). The results from the review identified 20 publications that qualitatively or quantitatively assessed the influence of voxel size on CBCT-based diagnostic outcome, and in which the methodology/results comprised at least one of the expected parameters (image acquisition, reconstruction protocols, type of diagnostic task, and presence of a gold standard). The diagnostic tasks assessed in the studies were diverse, including the detection of root fractures, the detection of caries lesions, and the accuracy of 3D surface reconstruction and of bony measurements, among others. From the studies assessed, it is clear that no general protocol can yet be defined for CBCT examination of specific diagnostic tasks in dentistry. Developing a rationale in this direction is an important step toward defining the utility of CBCT imaging.

  4. Factors influencing early rehabilitation after THA: a systematic review.

    PubMed

    Sharma, Vivek; Morgan, Patrick M; Cheng, Edward Y

    2009-06-01

    A wide variation exists in rehabilitation after total hip arthroplasty (THA) in part due to a paucity of evidence-based literature. We asked whether a minimally invasive surgical approach, a multimodal approach to pain control with revised anesthesia protocols, hip restrictions, or preoperative physiotherapy achieved a faster rehabilitation and improved immediate short-term outcome. We conducted a systematic review of 16 level I and II studies after a strategy-based search of English literature on OVID Medline, PubMed, CINAHL, Cochrane, and EMBASE databases. We defined the endpoint of assessment as independent ambulation and ability to perform activities of daily living. The literature supports the use of multimodal pain control to improve patient compliance in accelerated rehabilitation. Multimodal pain control with revised anesthesia protocols and accelerated rehabilitation speeds recovery after minimally invasive THA compared to the standard approach THA, but a smaller incision length or minimally invasive approach does not demonstrably improve the short-term outcome. Available studies justify no hip restrictions following an anterolateral approach, but none have examined the question for a posterior approach. Preoperative physiotherapy may facilitate faster postoperative functional recovery, but multicenter and well-designed prospective randomized studies with outcome measures are necessary to confirm its efficacy. Level II, therapeutic study. See Guidelines for Authors for a complete description of levels of evidence.

  5. Development of new method and protocol for cryopreservation related to embryo and oocytes freezing in terms of fertilization rate: A comparative study including review of literature.

    PubMed

    Barik, Mayadhar; Bajpai, Minu; Patnaik, Santosh; Mishra, Pravash; Behera, Priyamadhaba; Dwivedi, Sada Nanda

    2016-01-01

    Cryopreservation is best suited to thin samples or small clumps of cells that can be cooled quickly without loss. Our main objective was to establish an innovative method and protocol for cryopreservation as a gold standard for clinical use in laboratory practice and treatment. Knowledge regarding the usefulness of cryopreservation in clinical practice is essential to carry clinical practice and research forward. We compared different methods of cryopreservation (in two dozen cell samples) and, at the same time, compared embryo and oocyte freezing in terms of fertilization rate according to the international standard protocol. The combination of cryoprotectants and regimes of rapid cooling and rinsing during warming often allows successful cryopreservation of biological materials, particularly cell suspensions or thin tissue samples. Examples include semen, blood, tissue samples such as tumors, histological cross-sections, human eggs, and human embryos. Many studies have reported that children born from frozen embryos, or "frosties," show consistently positive results, with no increase in birth defects or developmental abnormalities; these findings are similar to our own (50-85%). We found that cryopreservation technology provides useful cell survivability and tissue and organ preservation, although results vary with laboratory conditions, and it is certainly beneficial for patient treatment and research. Further studies are needed for standardization and the development of new protocols.

  6. Refining animal models in fracture research: seeking consensus in optimising both animal welfare and scientific validity for appropriate biomedical use.

    PubMed

    Auer, Jorg A; Goodship, Allen; Arnoczky, Steven; Pearce, Simon; Price, Jill; Claes, Lutz; von Rechenberg, Brigitte; Hofmann-Amtenbrinck, Margarethe; Schneider, Erich; Müller-Terpitz, R; Thiele, F; Rippe, Klaus-Peter; Grainger, David W

    2007-08-01

    In an attempt to establish some consensus on the proper use and design of experimental animal models in musculoskeletal research, AOVET (the veterinary specialty group of the AO Foundation), in concert with the AO Research Institute (ARI) and the European Academy for the Study of Scientific and Technological Advance, convened a group of musculoskeletal researchers, veterinarians, legal experts, and ethicists to discuss, in a frank and open forum, the use of animals in musculoskeletal research. The group narrowed the field to fracture research. The consensus opinion resulting from this workshop can be summarized as follows: Anaesthesia and pain management protocols for research animals should follow standard protocols applied in clinical work for the species involved. This will improve morbidity and mortality outcomes. A database should be established to facilitate selection of anaesthesia and pain management protocols for specific experimental surgical procedures and adopted as an International Standard (IS) according to the animal species selected. A list of 10 golden rules and requirements for the conduct of animal experiments in musculoskeletal research was drawn up, comprising: 1) Intelligent study designs to receive appropriate answers; 2) Minimal complication rates (5 to max. 10%); 3) Defined end-points for both welfare and scientific outputs, analogous to quality assessment (QA) audit of protocols in GLP studies; 4) Sufficient detail for the materials and methods applied; 5) Control of potentially confounding variables (genetic background, seasonal, hormonal, size, histological, and biomechanical differences); 6) Post-operative management with emphasis on analgesia and follow-up examinations; 7) Study protocols that satisfy the criteria established for a "justified animal study"; 8) Surgical expertise to conduct surgery on animals; 9) Pilot studies as a critical part of model validation and powering of the definitive study design; 10) Criteria for funding agencies to include requirements related to animal experiments as part of the overall scientific proposal review protocols. Such agencies are also encouraged to seriously consider and adopt the recommendations described here when awarding funds for specific projects. Specific new requirements and mandates, related both to improving the welfare and the scientific rigour of animal-based research models, are urgently needed as part of the international harmonization of standards.

  7. Evaluation of a new very low dose imaging protocol: feasibility and impact on X-ray dose levels in electrophysiology procedures

    PubMed Central

    Bourier, Felix; Reents, Tilko; Ammar-Busch, Sonia; Buiatti, Alessandra; Kottmaier, Marc; Semmler, Verena; Telishevska, Marta; Brkic, Amir; Grebmer, Christian; Lennerz, Carsten; Kolb, Christof; Hessling, Gabriele; Deisenhofer, Isabel

    2016-01-01

    Aims This study presents and evaluates the impact of a new lowest-dose fluoroscopy protocol (Siemens AG), especially designed for electrophysiology (EP) procedures, on X-ray dose levels. Methods and results From October 2014 to March 2015, 140 patients underwent an EP study on an Artis zee angiography system. The standard low-dose protocol was operated at 23 nGy (fluoroscopy) and at 120 nGy (cine-loop), the new lowest-dose protocol was operated at 8 nGy (fluoroscopy) and at 36 nGy (cine-loop). Procedural data, X-ray times, and doses were analysed in 100 complex left atrial and in 40 standard EP procedures. The resulting dose–area products were 877.9 ± 624.7 µGym² (n = 50 complex procedures, standard low dose), 199 ± 159.6 µGym² (n = 50 complex procedures, lowest dose), 387.7 ± 36.0 µGym² (n = 20 standard procedures, standard low dose), and 90.7 ± 62.3 µGym² (n = 20 standard procedures, lowest dose), P < 0.01. In the low-dose and lowest-dose groups, procedure times were 132.6 ± 35.7 vs. 126.7 ± 34.7 min (P = 0.40, complex procedures) and 72.3 ± 20.9 vs. 85.2 ± 44.1 min (P = 0.24, standard procedures), radiofrequency (RF) times were 53.8 ± 26.1 vs. 50.4 ± 29.4 min (P = 0.54, complex procedures) and 10.1 ± 9.9 vs. 12.2 ± 14.7 min (P = 0.60, standard procedures). One complication each occurred in the standard low-dose and lowest-dose groups (P = 1.0). Conclusion The new lowest-dose imaging protocol reduces X-ray dose levels by 77% compared with the currently available standard low-dose protocol. From an operator standpoint, the lowest-dose settings produce a different, reduced image quality. The new image quality did not significantly affect procedure or RF times and did not result in higher complication rates. Regarding radiological protection, operating at lowest-dose settings should become standard in EP procedures. PMID:26589627
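The headline 77% reduction follows directly from the mean dose-area products quoted in the abstract; a quick arithmetic check using those means:

```python
def percent_reduction(standard, lowest):
    # Relative dose saving of the lowest-dose protocol versus the standard one.
    return 100.0 * (standard - lowest) / standard

# Mean dose-area products (in µGy·m²) reported in the abstract.
complex_standard, complex_lowest = 877.9, 199.0
simple_standard, simple_lowest = 387.7, 90.7

complex_saving = percent_reduction(complex_standard, complex_lowest)
simple_saving = percent_reduction(simple_standard, simple_lowest)
print(round(complex_saving, 1), round(simple_saving, 1))
```

Both procedure classes come out at roughly 77%, consistent with the conclusion's single quoted figure.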

  8. Systematic Review of Liposomal Bupivacaine (Exparel) for Postoperative Analgesia.

    PubMed

    Vyas, Krishna S; Rajendran, Sibi; Morrison, Shane D; Shakir, Afaaf; Mardini, Samir; Lemaine, Valerie; Nahabedian, Maurice Y; Baker, Stephen B; Rinker, Brian D; Vasconez, Henry C

    2016-10-01

    Management of postoperative pain often requires multimodal approaches. Suboptimal dosages of current therapies can leave patients experiencing periods of insufficient analgesia, often requiring rescue therapy. With absence of a validated and standardized approach to pain management, further refinement of treatment protocols and targeted therapeutics is needed. Liposomal bupivacaine (Exparel) is a longer acting form of traditional bupivacaine that delivers the drug by means of a multivesicular liposomal system. The effectiveness of liposomal bupivacaine has not been systematically analyzed relative to conventional treatments in plastic surgery. A comprehensive literature search of the MEDLINE, PubMed, and Google Scholar databases was conducted for studies published through October of 2015 with search terms related to liposomal bupivacaine and filtered for relevance to postoperative pain control in plastic surgery. Data on techniques, outcomes, complications, and patient satisfaction were collected. A total of eight articles were selected and reviewed from 160 identified. Articles covered a variety of techniques using liposomal bupivacaine for postoperative pain management. Four hundred five patients underwent procedures (including breast reconstruction, augmentation mammaplasty, abdominal wall reconstruction, mastectomy, and abdominoplasty) where pain was managed with liposomal bupivacaine and compared with those receiving traditional pain management. Liposomal bupivacaine use showed adequate safety and tolerability and, compared to traditional protocols, was equivalent or more effective in postoperative pain management. Liposomal bupivacaine is a safe method for postoperative pain control in the setting of plastic surgery and may represent an alternative to more invasive pain management systems such as patient-controlled analgesia, epidurals, peripheral nerve catheters, or intravenous narcotics.

  9. CSE database: extended annotations and new recommendations for ECG software testing.

    PubMed

    Smíšek, Radovan; Maršánová, Lucie; Němcová, Andrea; Vítek, Martin; Kozumplík, Jiří; Nováková, Marie

    2017-08-01

    Nowadays, cardiovascular diseases represent the most common cause of death in western countries. Among various examination techniques, electrocardiography (ECG) is still a highly valuable tool used for the diagnosis of many cardiovascular disorders. In order to diagnose a person based on ECG, cardiologists can use automatic diagnostic algorithms. Research in this area is still necessary. In order to compare various algorithms correctly, it is necessary to test them on standard annotated databases, such as the Common Standards for Quantitative Electrocardiography (CSE) database. According to Scopus, the CSE database is the second most cited standard database. There were two main objectives in this work. First, new diagnoses were added to the CSE database, which extended its original annotations. Second, new recommendations for diagnostic software quality estimation were established. The ECG recordings were diagnosed by five new cardiologists independently, and in total, 59 different diagnoses were found. Such a large number of diagnoses is unique, even among standard databases. Based on the cardiologists' diagnoses, a four-round consensus (4R consensus) was established. Such a 4R consensus represents the correct final diagnosis, which should ideally be the output of any tested classification software. The accuracy of the cardiologists' diagnoses compared with the 4R consensus was the basis for the establishment of accuracy recommendations. The accuracy was determined in terms of sensitivity = 79.20-86.81%, positive predictive value = 79.10-87.11%, and Jaccard coefficient = 72.21-81.14%. Within these ranges, the accuracy of the software is comparable with the accuracy of cardiologists. This quantification of correct-classification accuracy is unique. Diagnostic software developers can objectively evaluate the success of their algorithm and promote its further development. 
The annotations and recommendations proposed in this work will allow for faster development and testing of classification software. As a result, this might facilitate cardiologists' work and lead to faster diagnoses and earlier treatment.
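The three accuracy measures reported above can all be computed from a reference (consensus) diagnosis set and a predicted set. A minimal sketch; the ECG diagnosis labels below are hypothetical, chosen only to exercise the formulas.

```python
def diagnosis_agreement(reference, predicted):
    # Sensitivity, positive predictive value, and Jaccard coefficient
    # between a consensus diagnosis set and a software/cardiologist set.
    ref, pred = set(reference), set(predicted)
    tp = len(ref & pred)                 # diagnoses both sets agree on
    sensitivity = tp / len(ref)          # fraction of true diagnoses found
    ppv = tp / len(pred)                 # fraction of reported diagnoses correct
    jaccard = tp / len(ref | pred)       # overlap relative to the union
    return sensitivity, ppv, jaccard

# Hypothetical diagnosis labels for a single ECG record.
consensus = {"sinus rhythm", "LBBB", "old MI"}
software = {"sinus rhythm", "LBBB", "LVH"}

se, ppv, jac = diagnosis_agreement(consensus, software)
print(se, ppv, jac)
```

Averaging these per-record values over a test set gives figures directly comparable to the 79-87% cardiologist ranges the authors propose as recommendations.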

  10. The Efficacy of Sheltered Instruction Observation Protocol (SIOP) in Mathematics Instruction on English Language Learner Students

    ERIC Educational Resources Information Center

    Vidot, Jose L.

    2011-01-01

    Studies by the National Association for Educational Progress found that English Language Learner (ELL) students perform poorly compared to other students on standardized mathematics exams. The research problem addressed how Sheltered Instruction Observation Protocol (SIOP) affected the instructional practices of high school mathematics teachers.…

  11. Targeted Full Energy and Protein Delivery in Critically Ill Patients: A Pilot Randomized Controlled Trial (FEED Trial).

    PubMed

    Fetterplace, Kate; Deane, Adam M; Tierney, Audrey; Beach, Lisa J; Knight, Laura D; Presneill, Jeffrey; Rechnitzer, Thomas; Forsyth, Adrienne; Gill, Benjamin M T; Mourtzakis, Marina; MacIsaac, Christopher

    2018-04-27

    International guidelines recommend greater protein delivery to critically ill patients than they currently receive. This pilot randomized clinical trial aimed to determine whether a volume-targeted enteral protocol with supplemental protein delivered greater amounts of protein and energy to critically ill patients compared with standard care. Sixty participants received either the intervention (volume-based protocol, with protein supplementation) or standard nutrition care (hourly-rate-based protocol, without protein supplementation) in the intensive care unit (ICU). Coprimary outcomes were average daily protein and energy delivery. Secondary outcomes included change in quadriceps muscle layer thickness (QMLT, ultrasound) and malnutrition (subjective global assessment) at ICU discharge. Mean (SD) protein and energy delivery per day from nutrition therapy for the intervention were 1.2 (0.30) g/kg and 21 (5.2) kcal/kg compared with 0.75 (0.11) g/kg and 18 (2.7) kcal/kg for standard care. The mean difference between groups in protein and energy delivery per day was 0.45 g/kg (95% CI, 0.33-0.56; P < .001) and 2.8 kcal/kg (95% CI, 0.67-4.9, P = .01). Muscle loss (QMLT) at discharge was attenuated by 0.22 cm (95% CI, 0.06-0.38, P = .01) in patients receiving the intervention compared with standard care. The number of malnourished patients was lower in the intervention group [2 (7%) vs 8 (28%); P = .04]. Mortality and duration of admission were similar between groups. A high-protein volume-based protocol with protein supplementation delivered greater amounts of protein and energy. This intervention was associated with attenuation of QMLT loss and reduced prevalence of malnutrition at ICU discharge. © 2018 American Society for Parenteral and Enteral Nutrition.

  12. Survey of protocols for the manual segmentation of the hippocampus: preparatory steps towards a joint EADC-ADNI harmonized protocol.

    PubMed

    Boccardi, Marina; Ganzola, Rossana; Bocchetta, Martina; Pievani, Michela; Redolfi, Alberto; Bartzokis, George; Camicioli, Richard; Csernansky, John G; de Leon, Mony J; deToledo-Morrell, Leyla; Killiany, Ronald J; Lehéricy, Stéphane; Pantel, Johannes; Pruessner, Jens C; Soininen, H; Watson, Craig; Duchesne, Simon; Jack, Clifford R; Frisoni, Giovanni B

    2011-01-01

    Manual segmentation from magnetic resonance imaging (MRI) is the gold standard for evaluating hippocampal atrophy in Alzheimer's disease (AD). Nonetheless, different segmentation protocols provide up to 2.5-fold volume differences. Here we surveyed the most frequently used segmentation protocols in the AD literature as a preliminary step for international harmonization. The anatomical landmarks (anteriormost and posteriormost slices, superior, inferior, medial, and lateral borders) were identified from 12 published protocols for hippocampal manual segmentation ([Abbreviation] first author, publication year: [B] Bartzokis, 1998; [C] Convit, 1997; [dTM] deToledo-Morrell, 2004; [H] Haller, 1997; [J] Jack, 1994; [K] Killiany, 1993; [L] Lehericy, 1994; [M] Malykhin, 2007; [Pa] Pantel, 2000; [Pr] Pruessner, 2000; [S] Soininen, 1994; [W] Watson, 1992). The hippocampi of one healthy control and one AD patient taken from the 1.5T MR ADNI database were segmented by a single rater according to each protocol. The accuracy of the protocols' interpretation and translation into practice was checked with lead authors of protocols through individual interactive web conferences. Semantically harmonized landmarks and differences were then extracted, regarding: (a) the posteriormost slice, protocol [B] being the most restrictive, and [H, M, Pa, Pr, S] the most inclusive; (b) inclusion [C, dTM, J, L, M, Pr, W] or exclusion [B, H, K, Pa, S] of alveus/fimbria; (c) separation from the parahippocampal gyrus, [C] being the most restrictive, [B, dTM, H, J, Pa, S] the most inclusive. There were no substantial differences in the definition of the anteriormost slice. This survey will allow us to operationalize differences among protocols into tracing units, measure their impact on the repeatability and diagnostic accuracy of manual hippocampal segmentation, and finally develop a harmonized protocol.

  13. SCPS-TP, TCP, and Rate-Based Protocol Evaluation. Revised

    NASA Technical Reports Server (NTRS)

    Tran, Diepchi T.; Lawas-Grodek, Frances J.; Dimond, Robert P.; Ivancic, William D.

    2005-01-01

    Tests were performed at Glenn Research Center to compare the performance of the Space Communications Protocol Standard Transport Protocol (SCPS-TP, otherwise known as "TCP Tranquility") with other variants of TCP and to determine the implementation maturity level of these protocols, particularly at higher speeds. The testing was performed at reasonably high data rates of up to 100 Mbps with delays characteristic of near-planetary environments. The tests were run for a fixed packet size but across a range of error conditions. This report documents the testing performed to date.

  14. Review and standardization of cell phone exposure calculations using the SAM phantom and anatomically correct head models.

    PubMed

    Beard, Brian B; Kainz, Wolfgang

    2004-10-13

    We reviewed articles using computational RF dosimetry to compare the Specific Anthropomorphic Mannequin (SAM) to anatomically correct models of the human head. Published conclusions based on such comparisons have varied widely. We looked for reasons that might cause apparently similar comparisons to produce dissimilar results. We also looked at the information needed to adequately compare the results of computational RF dosimetry studies. We concluded studies were not comparable because of differences in definitions, models, and methodology. Therefore we propose a protocol, developed by an IEEE standards group, as an initial step in alleviating this problem. The protocol calls for a benchmark validation study comparing the SAM phantom to two anatomically correct models of the human head. It also establishes common definitions and reporting requirements that will increase the comparability of all computational RF dosimetry studies of the human head.

  15. Review and standardization of cell phone exposure calculations using the SAM phantom and anatomically correct head models

    PubMed Central

    Beard, Brian B; Kainz, Wolfgang

    2004-01-01

    We reviewed articles using computational RF dosimetry to compare the Specific Anthropomorphic Mannequin (SAM) to anatomically correct models of the human head. Published conclusions based on such comparisons have varied widely. We looked for reasons that might cause apparently similar comparisons to produce dissimilar results. We also looked at the information needed to adequately compare the results of computational RF dosimetry studies. We concluded studies were not comparable because of differences in definitions, models, and methodology. Therefore we propose a protocol, developed by an IEEE standards group, as an initial step in alleviating this problem. The protocol calls for a benchmark validation study comparing the SAM phantom to two anatomically correct models of the human head. It also establishes common definitions and reporting requirements that will increase the comparability of all computational RF dosimetry studies of the human head. PMID:15482601

  16. Interfacing the PACS and the HIS: results of a 5-year implementation.

    PubMed

    Kinsey, T V; Horton, M C; Lewis, T E

    2000-01-01

    An interface was created between the Department of Defense's hospital information system (HIS) and its two picture archiving and communication system (PACS)-based radiology information systems (RISs). The HIS is called the Composite Healthcare Computer System (CHCS), and the RISs are called the Medical Diagnostic Imaging System (MDIS) and the Digital Imaging Network (DIN)-PACS. Extensive mapping between dissimilar data protocols was required to translate data from the HIS into both RISs. The CHCS uses a Health Level 7 (HL7) protocol, whereas the MDIS uses the American College of Radiology-National Electrical Manufacturers Association 2.0 protocol and the DIN-PACS uses the Digital Imaging and Communications in Medicine (DICOM) 3.0 protocol. An interface engine was required to change some data formats, as well as to address some nonstandard HL7 data being output from the CHCS. In addition, there are differences in terminology between fields and segments in all three protocols. This interface is in use at 20 military facilities throughout the world. The interface minimizes duplicate manual entry of the same data into more than one automated system. Data mapping during installation saved time, improved productivity, and increased user acceptance during PACS implementation. It also resulted in more standardized database entries in both the HIS (CHCS) and the RIS (PACS).
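    The kind of field-level translation such an interface engine performs can be sketched minimally. The PID field indices and DICOM tag numbers below follow the public HL7 v2 and DICOM standards, but the mapping table and sample data are illustrative, not the actual CHCS/DIN-PACS configuration:

```python
# Hypothetical sketch of interface-engine field mapping: an HL7 v2 PID segment
# translated into DICOM patient-module attributes. Indices and tags follow the
# public standards; the sample values are invented.

def parse_hl7_pid(segment: str) -> dict:
    """Split an HL7 PID segment on the '|' field separator."""
    fields = segment.split("|")
    return {
        "patient_id": fields[3],    # PID-3: patient identifier list
        "patient_name": fields[5],  # PID-5: name (components joined by '^')
        "birth_date": fields[7],    # PID-7: date of birth, YYYYMMDD
        "sex": fields[8],           # PID-8: administrative sex
    }

def to_dicom_attrs(pid: dict) -> dict:
    """Map the parsed fields to the corresponding DICOM patient-module tags."""
    return {
        "(0010,0020) PatientID": pid["patient_id"],
        "(0010,0010) PatientName": pid["patient_name"],  # DICOM also uses '^'
        "(0010,0030) PatientBirthDate": pid["birth_date"],
        "(0010,0040) PatientSex": pid["sex"],
    }

segment = "PID|1||12345^^^FACILITY||DOE^JOHN||19600101|M"
attrs = to_dicom_attrs(parse_hl7_pid(segment))
```

    A production engine additionally has to normalize nonstandard values and reconcile terminology differences between fields and segments, as the abstract notes.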

  17. Recommendations for a service framework to access astronomical archives

    NASA Technical Reports Server (NTRS)

    Travisano, J. J.; Pollizzi, J.

    1992-01-01

    There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. 
A framework of standard interfaces for accessing services on any archive system which would benefit archive user and supplier alike is proposed.

  18. Lessons Learned From Developing Reactor Pressure Vessel Steel Embrittlement Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jy-An John

    Material behavior caused by neutron irradiation in fission and/or fusion environments cannot be well understood without practical examination. An easily accessible material information system with a large material database is necessary for the design of nuclear materials and for analyses or simulations of irradiation phenomena. The Embrittlement Data Base (EDB) developed at ORNL is such a comprehensive collection of data: it contains power reactor pressure vessel surveillance data, material test reactor data, foreign reactor data (through bilateral agreements authorized by the NRC), and fracture toughness data. This report describes the lessons learned from building the EDB program and the associated database management activity, regarding material database design methodology, architecture, and the embedded QA protocol. It also covers the development of the IAEA International Database on Reactor Pressure Vessel Materials (IDRPVM), compares the EDB and IAEA IDRPVM databases, and states the recommended database QA protocol and database infrastructure.

  19. On the designing of a tamper resistant prescription RFID access control system.

    PubMed

    Safkhani, Masoumeh; Bagheri, Nasour; Naderi, Majid

    2012-12-01

    Recently, Chen et al. proposed a novel tamper-resistant prescription RFID access control system, published in the Journal of Medical Systems. In this paper we consider the security of the proposed protocol and identify some existing weaknesses. The main attack is a reader impersonation attack, which allows an active adversary to impersonate a legitimate doctor, e.g. the patient's doctor, to access the patient's tag and change the patient's prescription. The presented attack is quite efficient: to impersonate a doctor, the adversary need only eavesdrop on one session between the doctor and the patient's tag, after which she can impersonate the doctor with a success probability of 1. In addition, we present efficient reader-tag to back-end database impersonation, de-synchronization, and traceability attacks against the protocol. Finally, we propose an improved version of the protocol which is more efficient than the original while providing the desired security against the presented attacks.

  20. Large scale study of multiple-molecule queries

    PubMed Central

    2009-01-01

    Background In ligand-based screening, as well as in other chemoinformatics applications, one seeks to effectively search large repositories of molecules in order to retrieve molecules that are similar, typically, to a single lead molecule. However, in some cases, multiple molecules from the same family are available to seed the query and search for other members of the same family. Multiple-molecule query methods have been less studied than single-molecule query methods. Furthermore, previous studies have relied on proprietary data and sometimes have not used proper cross-validation methods to assess the results. In contrast, here we develop and compare multiple-molecule query methods using several large publicly available data sets and background sets. We also create a framework based on a strict cross-validation protocol to allow unbiased benchmarking for direct comparison in future studies across several performance metrics. Results Fourteen different multiple-molecule query methods were defined and benchmarked using: (1) 41 publicly available data sets of related molecules with similar biological activity; and (2) publicly available background data sets consisting of up to 175,000 molecules randomly extracted from the ChemDB database and other sources. Eight of the fourteen methods were parameter free, and six of them fit one or two free parameters to the data using a careful cross-validation protocol. All the methods were assessed and compared for their ability to retrieve members of the same family against the background data set using several performance metrics, including the Area Under the Accumulation Curve (AUAC), Area Under the Curve (AUC), F1-measure, and BEDROC metrics. Consistent with the previous literature, the best parameter-free methods are the MAX-SIM and MIN-RANK methods, which score a molecule to a family by the maximum similarity, or minimum ranking, obtained across the family.
One new parameterized method introduced in this study and two previously defined methods, the Exponential Tanimoto Discriminant (ETD), the Tanimoto Power Discriminant (TPD), and the Binary Kernel Discriminant (BKD), outperform most other methods but are more complex, requiring one or two parameters to be fit to the data. Conclusion Fourteen methods for multiple-molecule querying of chemical databases, including novel methods, (ETD) and (TPD), are validated using publicly available data sets, standard cross-validation protocols, and established metrics. The best results are obtained with ETD, TPD, BKD, MAX-SIM, and MIN-RANK. These results can be replicated and compared with the results of future studies using data freely downloadable from http://cdb.ics.uci.edu/. PMID:20298525
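    The parameter-free MAX-SIM method described above admits a very small sketch: a candidate is scored against the query family by its maximum Tanimoto similarity to any member. Fingerprints are modeled here as sets of "on" bit positions with toy data; a real screen would use chemical fingerprints against a large background set.

```python
def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity between two binary fingerprints,
    each represented as the set of 'on' bit positions."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def max_sim(candidate, family):
    """MAX-SIM: score a candidate by its best match across the family."""
    return max(tanimoto(candidate, member) for member in family)

family = [{1, 2, 3, 4}, {2, 3, 5, 8}]   # fingerprints of known family members
candidate = {2, 3, 4, 9}                # molecule from the screened repository
score = max_sim(candidate, family)      # best match: 3/5 with the first member
```

    MIN-RANK works analogously, ranking the whole repository against each family member and keeping each candidate's minimum rank.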

  1. Fault-tolerant symmetrically-private information retrieval

    NASA Astrophysics Data System (ADS)

    Wang, Tian-Yin; Cai, Xiao-Qiu; Zhang, Rui-Ling

    2016-08-01

    We propose two symmetrically-private information retrieval protocols based on quantum key distribution, which provide a good degree of database and user privacy while being flexible, loss-resistant, and easily generalized to a large database, as in previous works. Furthermore, one protocol is robust to collective-dephasing noise, and the other is robust to collective-rotation noise.

  2. High-quality mtDNA control region sequences from 680 individuals sampled across the Netherlands to establish a national forensic mtDNA reference database.

    PubMed

    Chaitanya, Lakshmi; van Oven, Mannis; Brauer, Silke; Zimmermann, Bettina; Huber, Gabriela; Xavier, Catarina; Parson, Walther; de Knijff, Peter; Kayser, Manfred

    2016-03-01

    The use of mitochondrial DNA (mtDNA) for maternal lineage identification often marks the last resort when investigating forensic and missing-person cases involving highly degraded biological materials. As with all comparative DNA testing, a match between evidence and reference sample requires a statistical interpretation, for which high-quality mtDNA population frequency data are crucial. Here, we determined, under high quality standards, the complete mtDNA control-region sequences of 680 individuals from across the Netherlands sampled at 54 sites, covering the entire country with 10 geographic sub-regions. The complete mtDNA control region (nucleotide positions 16,024-16,569 and 1-576) was amplified with two PCR primers and sequenced with ten different sequencing primers using the EMPOP protocol. Haplotype diversity of the entire sample set was very high at 99.63% and, accordingly, the random-match probability was 0.37%. No population substructure within the Netherlands was detected with our dataset. Phylogenetic analyses were performed to determine mtDNA haplogroups. Inclusion of these high-quality data in the EMPOP database (accession number: EMP00666) will improve its overall data content and geographic coverage in the interest of all EMPOP users worldwide. Moreover, this dataset will serve as (the start of) a national reference database for mtDNA applications in forensic and missing person casework in the Netherlands. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
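    The two summary statistics quoted above, haplotype diversity and random-match probability, can be computed directly from haplotype counts. A minimal sketch with invented haplotype labels, using Nei's unbiased diversity estimator and taking the random-match probability as the sum of squared haplotype frequencies:

```python
from collections import Counter

def haplotype_stats(haplotypes):
    """Return (haplotype diversity, random match probability).

    Diversity uses Nei's unbiased estimator H = n/(n-1) * (1 - sum p_i^2);
    the random match probability is sum p_i^2, the chance that two randomly
    drawn profiles share the same haplotype."""
    n = len(haplotypes)
    counts = Counter(haplotypes)
    sum_p2 = sum((c / n) ** 2 for c in counts.values())
    diversity = n / (n - 1) * (1 - sum_p2)
    return diversity, sum_p2

# Toy sample of six profiles, one haplotype observed twice
div, rmp = haplotype_stats(["A", "B", "C", "D", "E", "E"])
```

    With the 680 Dutch profiles the same computation yields the reported 99.63% diversity and 0.37% random-match probability.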

  3. Metallic artifacts from internal scaphoid fracture fixation screws: comparison between C-arm flat-panel, cone-beam, and multidetector computed tomography.

    PubMed

    Finkenstaedt, Tim; Morsbach, Fabian; Calcagni, Maurizio; Vich, Magdalena; Pfirrmann, Christian W A; Alkadhi, Hatem; Runge, Val M; Andreisek, Gustav; Guggenberger, Roman

    2014-08-01

    The aim of this study was to compare image quality and extent of artifacts from scaphoid fracture fixation screws using different computed tomography (CT) modalities and radiation dose protocols. Imaging of 6 cadaveric wrists with artificial scaphoid fractures and different fixation screws was performed in 2 screw positions (45° and 90° orientation in relation to the x/y-axis) using multidetector CT (MDCT) and 2 flat-panel CT modalities, C-arm flat-panel CT (FPCT) and cone-beam CT (CBCT), the latter 2 with low and standard radiation dose protocols. Mean cartilage attenuation and metal artifact-induced absolute Hounsfield unit changes (= artifact extent) were measured. Two independent radiologists evaluated different image quality criteria using a 5-point Likert-scale. Interreader agreements (Cohen κ) were calculated. Mean absolute Hounsfield unit changes and quality ratings were compared using Friedman and Wilcoxon signed-rank tests. Artifact extent was significantly smaller for MDCT and standard-dose FPCT compared with CBCT low- and standard-dose acquisitions (all P < 0.05). No significant differences in artifact extent among different screw types and scanning positions were noted (P > 0.05). Both MDCT and FPCT standard-dose protocols showed equal ratings for screw bone interface, fracture line, and trabecular bone evaluation (P = 0.06, 0.2, and 0.2, respectively) and performed significantly better than FPCT low- and CBCT low- and standard-dose acquisitions (all P < 0.05). Good interreader agreement was found for image quality comparisons (Cohen κ = 0.76-0.78). Both MDCT and FPCT standard-dose acquisition showed comparatively less metal-induced artifacts and better overall image quality compared with FPCT low-dose and both CBCT acquisitions. Flat-panel CT may provide sufficient image quality to serve as a versatile CT alternative for postoperative imaging of internally fixated wrist fractures.
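    The interreader agreement reported above (Cohen κ) corrects raw agreement for chance. A minimal sketch with invented 5-point Likert ratings from two readers:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance)."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    p_exp = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# Invented Likert-scale ratings (1-5) of eight images by two readers
reader1 = [5, 4, 4, 3, 5, 2, 4, 4]
reader2 = [5, 4, 3, 3, 5, 2, 4, 5]
kappa = cohens_kappa(reader1, reader2)
```

    Values in the 0.76-0.78 range reported in the study indicate substantial agreement on this scale.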

  4. An iPTH based protocol for the prevention and treatment of symptomatic hypocalcemia after thyroidectomy

    PubMed Central

    Carter, Yvette; Chen, Herbert; Sippel, Rebecca S.

    2013-01-01

    Background Symptomatic hypocalcemia after thyroidectomy is a barrier to same-day surgery and a cause of ER visits. A standard protocol of calcium and vitamin D supplementation, dependent on intact parathyroid hormone (iPTH) levels, can address this issue. How effective is it? When does it fail? Methods We performed a retrospective review of the prospective Thyroid Database from January 2006 to December 2010. 620 patients underwent completion (CT) or total thyroidectomy (TT), and followed our post-operative protocol of calcium carbonate administration for iPTH levels ≥10 pg/ml and calcium carbonate plus 0.25 μg calcitriol BID for iPTH <10 pg/ml. Calcium and iPTH values, pathology, and medication were compared to evaluate protocol efficacy. A p value <0.05 was considered statistically significant. Results Using the protocol, sixty-one (10.2%) patients were chemically hypocalcemic but never developed symptoms, and twenty-four (3.9%) patients developed breakthrough symptomatic hypocalcemia. The symptomatic (SX) and asymptomatic (ASX) groups were similar with regard to gender, cancer diagnosis, and pre-operative calcium and iPTH. The symptomatic group was significantly younger (39.6 ± 2.8 vs. 49 ± 0.6 years, p=0.01), with lower post-operative iPTH levels. 33% (n=8) of SX patients had an iPTH ≤5 pg/ml vs. only 6% (n=37) of ASX patients. While the majority of patients with an iPTH <5 pg/ml were asymptomatic, 62.5% (n=5) of SX patients with iPTH levels ≤5 pg/ml required an increase in calcitriol dose to achieve both biochemical correction and symptom relief. Conclusion Prophylactic calcium and vitamin D supplementation based on post-operative iPTH levels can minimize symptomatic hypocalcemia after thyroidectomy. An iPTH ≤5 pg/ml may warrant higher initial doses of calcitriol in order to prevent symptoms. PMID:24144426
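    The protocol's branching is simple enough to state as code. This sketch encodes only the doses the abstract gives (calcium carbonate for all patients, 0.25 μg calcitriol BID when iPTH < 10 pg/ml); the "consider higher calcitriol" flag for iPTH ≤ 5 pg/ml reflects the authors' conclusion rather than a stated dose, and the sketch is an illustration, not clinical guidance:

```python
def postop_plan(ipth_pg_ml: float) -> dict:
    """Post-thyroidectomy supplementation plan keyed on the iPTH level."""
    plan = {
        "calcium_carbonate": True,       # all patients receive calcium
        "calcitriol_ug_bid": 0.0,
        "consider_higher_calcitriol": False,
    }
    if ipth_pg_ml < 10:                  # protocol threshold from the abstract
        plan["calcitriol_ug_bid"] = 0.25
    if ipth_pg_ml <= 5:                  # group prone to breakthrough symptoms
        plan["consider_higher_calcitriol"] = True
    return plan
```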

  5. Organ donation in the ICU: A document analysis of institutional policies, protocols, and order sets.

    PubMed

    Oczkowski, Simon J W; Centofanti, John E; Durepos, Pamela; Arseneau, Erika; Kelecevic, Julija; Cook, Deborah J; Meade, Maureen O

    2018-04-01

    To better understand how local policies influence organ donation rates, we conducted a document analysis of our ICU organ donation policies, protocols, and order sets. We used a systematic search of our institution's policy library to identify documents related to organ donation. We used Mindnode software to create a publication timeline, basic statistics to describe document characteristics, and qualitative content analysis to extract document themes. Documents were retrieved from Hamilton Health Sciences, an academic hospital system with a high volume of organ donation, from database inception to October 2015. We retrieved 12 active organ donation documents, including six protocols, two policies, two order sets, and two unclassified documents, the majority (75%) published after the introduction of donation after circulatory death in 2006. Four major themes emerged: organ donation process, quality of care, patient and family-centred care, and the role of the institution. These themes indicate areas where documented institutional standards may be beneficial. Further research is necessary to determine the relationship of local policies, protocols, and order sets to actual organ donation practices, and to identify barriers and facilitators to improving donation rates. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Using SQL Databases for Sequence Similarity Searching and Analysis.

    PubMed

    Pearson, William R; Mackey, Aaron J

    2017-09-13

    Relational databases can integrate diverse types of information and manage large sets of similarity search results, greatly simplifying genome-scale analyses. By focusing on taxonomic subsets of sequences, relational databases can reduce the size and redundancy of sequence libraries and improve the statistical significance of homologs. In addition, by loading similarity search results into a relational database, it becomes possible to explore and summarize the relationships between all of the proteins in an organism and those in other biological kingdoms. This unit describes how to use relational databases to improve the efficiency of sequence similarity searching and demonstrates various large-scale genomic analyses of homology-related data. It also describes the installation and use of a simple protein sequence database, seqdb_demo, which is used as a basis for the other protocols. The unit also introduces search_demo, a database that stores sequence similarity search results. The search_demo database is then used to explore the evolutionary relationships between E. coli proteins and proteins in other organisms in a large-scale comparative genomic analysis. © 2017 by John Wiley & Sons, Inc.
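    The idea behind loading search results into a relational database can be sketched with SQLite. The table layout, identifiers, and E-values below are invented for illustration and do not reproduce the actual seqdb_demo/search_demo schema:

```python
import sqlite3

# Toy table of similarity-search hits; a real workflow would bulk-load
# parsed BLAST/FASTA output instead of these invented rows.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE hits (query TEXT, subject TEXT, organism TEXT, evalue REAL)"
)
conn.executemany(
    "INSERT INTO hits VALUES (?, ?, ?, ?)",
    [
        ("recA", "RAD51",   "S. cerevisiae", 1e-45),
        ("recA", "RAD51B",  "H. sapiens",    1e-30),
        ("lacZ", "GLB1",    "H. sapiens",    1e-12),
        ("lacZ", "weakhit", "H. sapiens",    5.0),   # insignificant, filtered below
    ],
)
# Summarize: significant homologs per query and organism
rows = conn.execute(
    """SELECT query, organism, COUNT(*)
       FROM hits
       WHERE evalue < 1e-6
       GROUP BY query, organism
       ORDER BY query, organism"""
).fetchall()
```

    With real search output loaded, the same GROUP BY pattern scales to whole-proteome comparisons like the E. coli analysis described above.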

  7. Using relational databases for improved sequence similarity searching and large-scale genomic analyses.

    PubMed

    Mackey, Aaron J; Pearson, William R

    2004-10-01

    Relational databases are designed to integrate diverse types of information and manage large sets of search results, greatly simplifying genome-scale analyses. Relational databases are essential for management and analysis of large-scale sequence analyses, and can also be used to improve the statistical significance of similarity searches by focusing on subsets of sequence libraries most likely to contain homologs. This unit describes using relational databases to improve the efficiency of sequence similarity searching and to demonstrate various large-scale genomic analyses of homology-related data. This unit describes the installation and use of a simple protein sequence database, seqdb_demo, which is used as a basis for the other protocols. These include basic use of the database to generate a novel sequence library subset, how to extend and use seqdb_demo for the storage of sequence similarity search results and making use of various kinds of stored search results to address aspects of comparative genomic analysis.

  8. 15 years of monitoring occupational exposure to respirable dust and quartz within the European industrial minerals sector.

    PubMed

    Zilaout, Hicham; Vlaanderen, Jelle; Houba, Remko; Kromhout, Hans

    2017-07-01

    In 2000, a prospective Dust Monitoring Program (DMP) was started in which measurements of workers' exposure to respirable dust and quartz are collected in member companies of the European Industrial Minerals Association (IMA-Europe). After 15 years, the resulting IMA-DMP database allows a detailed overview of exposure levels of respirable dust and quartz over time within this industrial sector. Our aim is to describe the IMA-DMP and the current state of the corresponding database, which is still growing as the program continues. The future use of the database is also highlighted, including its utility for the industrial minerals producing sector. Exposure data are obtained following a common protocol, including a standardized sampling strategy, standardized sampling and analytical methods, and a data management system. Following strict quality control procedures, exposure data are then added to a central database. The data comprise personal exposure measurements, including auxiliary information on work and other conditions during sampling. Currently, the IMA-DMP database consists of almost 28,000 personal measurements performed from 2000 until 2015, representing 29 half-yearly sampling campaigns. The exposure data have been collected from 160 different worksites owned by 35 industrial mineral companies and come from 23 European countries and approximately 5000 workers. The IMA-DMP database provides the European minerals sector with reliable data regarding workers' personal exposure to respirable dust and quartz. The database can be used as a powerful tool to address outstanding scientific issues on long-term exposure trends and exposure variability and, importantly, as a surveillance tool to evaluate exposure control measures. The database will be valuable for future epidemiological studies on respiratory health effects and will allow for estimation of quantitative exposure-response relationships.
Copyright © 2017 The Authors. Published by Elsevier GmbH. All rights reserved.

  9. Evaluation of Cross-Protocol Stability of a Fully Automated Brain Multi-Atlas Parcellation Tool.

    PubMed

    Liang, Zifei; He, Xiaohai; Ceritoglu, Can; Tang, Xiaoying; Li, Yue; Kutten, Kwame S; Oishi, Kenichi; Miller, Michael I; Mori, Susumu; Faria, Andreia V

    2015-01-01

    Brain parcellation tools based on multiple-atlas algorithms have recently emerged as a promising method with which to accurately define brain structures. When dealing with data from various sources, it is crucial that these tools are robust for many different imaging protocols. In this study, we tested the robustness of a multiple-atlas, likelihood fusion algorithm using Alzheimer's Disease Neuroimaging Initiative (ADNI) data with six different protocols, comprising three manufacturers and two magnetic field strengths. The entire brain was parceled into five different levels of granularity. In each level, which defines a set of brain structures, ranging from eight to 286 regions, we evaluated the variability of brain volumes related to the protocol, age, and diagnosis (healthy or Alzheimer's disease). Our results indicated that, with proper pre-processing steps, the impact of different protocols is minor compared to biological effects, such as age and pathology. A precise knowledge of the sources of data variation enables sufficient statistical power and ensures the reliability of an anatomical analysis when using this automated brain parcellation tool on datasets from various imaging protocols, such as clinical databases.

  10. Comparative recovery of uninjured and heat-injured Listeria monocytogenes cells from bovine milk.

    PubMed Central

    Crawford, R G; Beliveau, C M; Peeler, J T; Donnelly, C W; Bunning, V K

    1989-01-01

    The standard selective enrichment protocols of the Food and Drug Administration (FDA) and U.S. Department of Agriculture (USDA) were compared with an experimental nonselective broth enrichment (NSB) protocol and variations of the standard cold-enrichment (CE) protocol for the recovery of heat-injured Listeria monocytogenes. Bacterial cells (10(7)/ml) were suspended in sterile milk and heated at 71.7 degrees C in a slug-flow heat exchanger for holding times ranging from 1 to 30 s. Surviving cells were determined (50% endpoint) by the given protocols, and the following D values were obtained: NSB, D = 2.0 +/- 0.5 s; FDA, D = 1.4 +/- 0.3 s; USDA, D = 0.6 +/- 0.2 s; CE, D less than or equal to 1.2 s. The respective direct-plating media used in these enrichments were also analyzed for recovery, and the following D values were calculated from the enumeration of surviving cells; NSB, D = 2.7 +/- 0.8 s; FDA, D = 1.3 +/- 0.4 s; USDA, D = 0.7 +/- 0.2 s. The low levels of heat-injured L. monocytogenes cells which were detected at inactivation endpoints on the optimal nonselective media (25 degrees C for 7 days) failed to recover and multiply during experimental CEs (4 degrees C for 28 days). Initial inactivation experiments in which raw whole milk was used as the heating menstruum gave much lower recoveries with all protocols. The detectable limits for uninjured cells that were suspended in raw milk were similar (0.35 to 3.2 cells per ml) for the standard CE, FDA, and USDA protocols.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:2504109
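    The D values reported above are decimal reduction times: the heating time that produces a one-log10 drop in survivors. A minimal sketch of how such a value is estimated from survivor counts, using invented counts that decline exactly one log10 every 2 s:

```python
import math

def d_value(times_s, survivors):
    """Decimal reduction time D: least-squares slope of log10(survivors)
    against heating time, with D = -1/slope (seconds per 1-log10 kill)."""
    ys = [math.log10(n) for n in survivors]
    n = len(times_s)
    t_mean = sum(times_s) / n
    y_mean = sum(ys) / n
    slope = sum((t - t_mean) * (y - y_mean) for t, y in zip(times_s, ys)) \
        / sum((t - t_mean) ** 2 for t in times_s)
    return -1.0 / slope

times = [0, 2, 4, 6]              # seconds of heating (toy data)
counts = [1e7, 1e6, 1e5, 1e4]     # surviving CFU/ml at each time point
d = d_value(times, counts)        # 2.0 s for this idealized survivor curve
```

    The study's endpoint-based D values (e.g. NSB, D = 2.0 ± 0.5 s) answer the same question from 50% detection endpoints rather than full survivor curves.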

  11. The process of orthognathic care in an NHS region

    PubMed Central

    Parbatani, R; Williams, AC; Ireland, AJ; Sandy, JR

    2010-01-01

    INTRODUCTION The aim of this study was to evaluate, within an NHS region, the process of care and the standard of records for orthognathic patients. PATIENTS AND METHODS A retrospective analysis of the medical records of 372 patients who underwent orthognathic surgery between 1 January 1995 and 31 April 2000 in the South West Region of the UK. RESULTS Most patients underwent joint orthodontic and maxillofacial planning and had third molars extracted under general anaesthesia prior to orthognathic surgery. There was a significant difference in the median operation times and length of stay for bimaxillary surgery (4 h/4 days) compared with single jaw surgery (2 h/3 days; P < 0.001). Just over 15% of patients required removal of internal fixation plates after surgery, with nearly 90% of these requiring a further episode of general anaesthesia. The level of record keeping and patient review was variable with no regional standardisation. CONCLUSIONS This study is evidence of a generally acceptable standard in the process of care, which was found to follow international and national practices. However, at the time of the study there was no regional protocol for patient records or patient review, highlighting the need for the establishment of a regional database. PMID:19887023

  12. Automatic cardiac LV segmentation in MRI using modified graph cuts with smoothness and interslice constraints.

    PubMed

    Albà, Xènia; Figueras I Ventura, Rosa M; Lekadir, Karim; Tobon-Gomez, Catalina; Hoogendoorn, Corné; Frangi, Alejandro F

    2014-12-01

    Magnetic resonance imaging (MRI), specifically late-enhanced MRI, is the standard clinical imaging protocol to assess cardiac viability. Segmentation of myocardial walls is a prerequisite for this assessment. Automatic and robust multisequence segmentation is required to support processing massive quantities of data. A generic rule-based framework to automatically segment the left ventricle myocardium is presented here. We use intensity information, and include shape and interslice smoothness constraints, providing robustness to subject- and study-specific changes. Our automatic initialization considers the geometrical and appearance properties of the left ventricle, as well as interslice information. The segmentation algorithm uses a decoupled, modified graph cut approach with control points, providing a good balance between flexibility and robustness. The method was evaluated on late-enhanced MRI images from a 20-patient in-house database, and on cine-MRI images from a 15-patient open access database, both using as reference manually delineated contours. Segmentation agreement, measured using the Dice coefficient, was 0.81±0.05 and 0.92±0.04 for late-enhanced MRI and cine-MRI, respectively. The method was also compared favorably to a three-dimensional Active Shape Model approach. The experimental validation with two magnetic resonance sequences demonstrates increased accuracy and versatility. © 2013 Wiley Periodicals, Inc.

  13. Variations in measured performance of CAD schemes due to database composition and scoring protocol

    NASA Astrophysics Data System (ADS)

    Nishikawa, Robert M.; Yarusso, Laura M.

    1998-06-01

    There is now a large effort towards developing computer-aided diagnosis (CAD) techniques. To determine which approaches are the most efficacious, it is important to be able to compare their performance. There are currently a number of barriers preventing meaningful (statistical) comparisons, two of which are discussed in this paper: database composition and scoring protocol. We have examined how the choice of cases used to test a CAD scheme can affect its performance. We found that the sensitivity of our computer scheme varied from 100% to 77%, at a false-positive rate of 1.0 per image, with only a complete (100%) change in the composition of the database. To evaluate the performance of a CAD scheme, the output of the computer must be graded, and a number of different criteria are being used by different investigators. We have found that for the same set of detection results, the measured sensitivity can be between 40% and 90% depending on the scoring methodology. Clearly, consensus must be reached on these two issues in order for the field to make rapid progress. As it stands now, it is not possible to make meaningful comparisons of different techniques.

  14. Survey of Protocols for the Manual Segmentation of the Hippocampus: Preparatory Steps Towards a Joint EADC-ADNI Harmonized Protocol

    PubMed Central

    Boccardi, Marina; Ganzola, Rossana; Bocchetta, Martina; Pievani, Michela; Redolfi, Alberto; Bartzokis, George; Camicioli, Richard; Csernansky, John G.; de Leon, Mony J.; deToledo-Morrell, Leyla; Killiany, Ronald J.; Lehéricy, Stéphane; Pantel, Johannes; Pruessner, Jens C.; Soininen, H.; Watson, Craig; Duchesne, Simon; Jack, Clifford R.; Frisoni, Giovanni B.

    2013-01-01

    Manual segmentation from magnetic resonance imaging (MR) is the gold standard for evaluating hippocampal atrophy in Alzheimer’s disease (AD). Nonetheless, different segmentation protocols provide up to 2.5-fold volume differences. Here we surveyed the most frequently used segmentation protocols in the AD literature as a preliminary step for international harmonization. The anatomical landmarks (anteriormost and posteriormost slices, superior, inferior, medial, and lateral borders) were identified from 12 published protocols for hippocampal manual segmentation ([Abbreviation] first author, publication year: [B] Bartzokis, 1998; [C] Convit, 1997; [dTM] deToledo-Morrell, 2004; [H] Haller, 1997; [J] Jack, 1994; [K] Killiany, 1993; [L] Lehericy, 1994; [M] Malykhin, 2007; [Pa] Pantel, 2000; [Pr] Pruessner, 2000; [S] Soininen, 1994; [W] Watson, 1992). The hippocampi of one healthy control and one AD patient taken from the 1.5T MR ADNI database were segmented by a single rater according to each protocol. The accuracy of the protocols’ interpretation and translation into practice was checked with lead authors of protocols through individual interactive web conferences. Semantically harmonized landmarks and differences were then extracted, regarding: (a) the posteriormost slice, protocol [B] being the most restrictive, and [H, M, Pa, Pr, S] the most inclusive; (b) inclusion [C, dTM, J, L, M, Pr, W] or exclusion [B, H, K, Pa, S] of alveus/fimbria; (c) separation from the parahippocampal gyrus, [C] being the most restrictive, [B, dTM, H, J, Pa, S] the most inclusive. There were no substantial differences in the definition of the anteriormost slice. This survey will allow us to operationalize differences among protocols into tracing units, measure their impact on the repeatability and diagnostic accuracy of manual hippocampal segmentation, and finally develop a harmonized protocol. PMID:21971451

  15. Molecular Identification and Databases in Fusarium

    USDA-ARS?s Scientific Manuscript database

    DNA sequence-based methods for identifying pathogenic and mycotoxigenic Fusarium isolates have become the gold standard worldwide. Moreover, fusarial DNA sequence data are increasing rapidly in several web-accessible databases for comparative purposes. Unfortunately, the use of Basic Alignment Sea...

  16. Electronic protocol of respiratory physical therapy in patients with idiopathic adolescent scoliosis.

    PubMed

    Cano, Danila Vieira Baldini; Malafaia, Osvaldo; Alves, Vera Lúcia dos Santos; Avanzi, Osmar; Pinto, José Simão de Paula

    2011-01-01

    To create a clinical database of respiratory function in patients with adolescent idiopathic scoliosis; computerize and store this clinical data through the use of software; incorporate this electronic protocol to the SINPE© (Integrated Electronic Protocols System) and analyze a pilot project with interpretation of results. From the literature review a computerized data bank of clinical data of postural deviations was set up (master protocol). Upon completion of the master protocol a specific protocol of respiratory function in patients with adolescent idiopathic scoliosis was designed and a pilot project was conducted to collect and analyze data from ten patients. It was possible to create the master protocol of postural deviations and the specific protocol of respiratory function in patients with adolescent idiopathic scoliosis. The data collected in the pilot project were processed by the SINPE ANALYZER©, generating charts and statistics. The establishment of the clinical database of adolescent idiopathic scoliosis was possible. Computerization and storage of clinical data using the software were viable. The electronic protocol of adolescent idiopathic scoliosis could be incorporated into the SINPE© and its use in the pilot project was successful.

  17. Database integration of protocol-specific neurological imaging datasets

    PubMed Central

    Pacurar, Emil E.; Sethi, Sean K.; Habib, Charbel; Laze, Marius O.; Martis-Laze, Rachel; Haacke, E. Mark

    2016-01-01

    For many years now, Magnetic Resonance Innovations (MR Innovations), a magnetic resonance imaging (MRI) software development, technology, and research company, has been aggregating a multitude of MRI data from different scanning sites through its collaborations and research contracts. The majority of the data has adhered to neuroimaging protocols developed by our group which has helped ensure its quality and consistency. The protocols involved include the study of: traumatic brain injury, extracranial venous imaging for multiple sclerosis and Parkinson's disease, and stroke. The database has proven invaluable in helping to establish disease biomarkers, validate findings across multiple data sets, develop and refine signal processing algorithms, and establish both public and private research collaborations. Many Master's and PhD dissertations have been possible thanks to the availability of this database. As an example of a project that cuts across diseases, we have used the data and specialized software to develop new guidelines for detecting cerebral microbleeds. Ultimately, the database has been vital in our ability to provide tools and information for researchers and radiologists in diagnosing their patients, and we encourage collaborations and welcome sharing of similar data in this database. PMID:25959660

  18. Privacy-Preserving Classifier Learning

    NASA Astrophysics Data System (ADS)

    Brickell, Justin; Shmatikov, Vitaly

    We present an efficient protocol for the privacy-preserving, distributed learning of decision-tree classifiers. Our protocol allows a user to construct a classifier on a database held by a remote server without learning any additional information about the records held in the database. The server does not learn anything about the constructed classifier, not even the user’s choice of feature and class attributes.

  19. OSI and TCP/IP

    NASA Technical Reports Server (NTRS)

    Randolph, Lynwood P.

    1994-01-01

    The Open Systems Interconnection Transmission Control Protocol/Internet Protocol (OSI TCP/IP) and the Government Open Systems Interconnection Profile (GOSIP) are compared and described in terms of Federal internetworking. The organization and functions of the Federal Internetworking Requirements Panel (FIRP) are discussed and the panel's conclusions and recommendations with respect to the standards and implementation of the National Information Infrastructure (NII) are presented.

  20. Improved Infrastructure for CDMS and JPL Molecular Spectroscopy Catalogues

    NASA Astrophysics Data System (ADS)

    Endres, Christian; Schlemmer, Stephan; Drouin, Brian; Pearson, John; Müller, Holger S. P.; Schilke, P.; Stutzki, Jürgen

    2014-06-01

    Over the past years a new infrastructure for atomic and molecular databases has been developed within the framework of the Virtual Atomic and Molecular Data Centre (VAMDC). Standards for the representation of atomic and molecular data as well as a set of protocols have been established, which now allow data to be retrieved from various databases through one portal and easily combined. Apart from spectroscopic databases such as the Cologne Database for Molecular Spectroscopy (CDMS), the Jet Propulsion Laboratory microwave, millimeter and submillimeter spectral line catalogue (JPL) and the HITRAN database, various databases on molecular collisions (BASECOL, KIDA) and reactions (UMIST) are connected. Together with other groups within the VAMDC consortium we are working on common user tools to simplify access for new customers and to tailor data requests for users with specified needs. This comprises in particular tools to support the analysis of complex observational data obtained with the ALMA telescope. In this presentation, requests to CDMS and JPL will be used to explain the basic concepts and the tools which are provided by VAMDC. In addition, a new portal to CDMS will be presented which has a number of new features, in particular meaningful quantum numbers, references linked to data points, access to state energies and improved documentation. Fit files are accessible for download and queries to other databases are possible.

  1. Artificial Neural Networks for differential diagnosis of breast lesions in MR-Mammography: a systematic approach addressing the influence of network architecture on diagnostic performance using a large clinical database.

    PubMed

    Dietzel, Matthias; Baltzer, Pascal A T; Dietzel, Andreas; Zoubi, Ramy; Gröschel, Tobias; Burmeister, Hartmut P; Bogdan, Martin; Kaiser, Werner A

    2012-07-01

    Differential diagnosis of lesions in MR-Mammography (MRM) remains a complex task. The aim of this MRM study was to design and to test robustness of Artificial Neural Network architectures to predict malignancy using a large clinical database. For this IRB-approved investigation standardized protocols and study design were applied (T1w-FLASH; 0.1 mmol/kgBW Gd-DTPA; T2w-TSE; histological verification after MRM). All lesions were evaluated by two experienced (>500 MRM) radiologists in consensus. In every lesion, 18 previously published descriptors were assessed and documented in the database. An Artificial Neural Network (ANN) was developed to process this database (The-MathWorks/Inc., feed-forward-architecture/resilient back-propagation-algorithm). All 18 descriptors were set as input variables, whereas the histological result (malignant vs. benign) was defined as the classification variable. Initially, the ANN was optimized in terms of "Training Epochs" (TE), "Hidden Layers" (HL), "Learning Rate" (LR) and "Neurons" (N). Robustness of the ANN was addressed by repeated evaluation cycles (n: 9) with receiver operating characteristics (ROC) analysis of the results applying 4-fold Cross Validation. The best network architecture was identified by comparing the corresponding Area under the ROC curve (AUC). Histopathology revealed 436 benign and 648 malignant lesions. Increasing the level of complexity did not improve diagnostic accuracy of the network (P: n.s.). The optimized ANN architecture (TE: 20, HL: 1, N: 5, LR: 1.2) was accurate (mean-AUC 0.888; P: <0.001) and robust (CI: 0.885-0.892; range: 0.880-0.898). The optimized neural network showed robust performance and high diagnostic accuracy for prediction of malignancy on unknown data. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
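The evaluation design described above (a small feed-forward network with one hidden layer and few neurons, scored by cross-validated ROC AUC) can be sketched as follows. The study used a MATLAB network; this scikit-learn version with synthetic stand-in data for the 18 descriptors is an illustration of the same design, not the authors' implementation:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the 1084-lesion, 18-descriptor database
X, y = make_classification(n_samples=1084, n_features=18, n_informative=10,
                           weights=[0.4, 0.6], random_state=0)

# One hidden layer with 5 neurons, mirroring the optimized HL/N setting
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000, random_state=0),
)

# 4-fold cross-validated ROC AUC, as in the study's evaluation cycles
cv = StratifiedKFold(n_splits=4, shuffle=True, random_state=0)
aucs = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
print(f"mean AUC over 4 folds: {aucs.mean():.3f}")
```

Repeating this over a grid of epochs, layers, neurons, and learning rates and comparing mean AUCs reproduces the architecture-search logic of the abstract.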

  2. Standard care versus protocol based therapy for new onset Pseudomonas aeruginosa in cystic fibrosis.

    PubMed

    Mayer-Hamblett, Nicole; Rosenfeld, Margaret; Treggiari, Miriam M; Konstan, Michael W; Retsch-Bogart, George; Morgan, Wayne; Wagener, Jeff; Gibson, Ronald L; Khan, Umer; Emerson, Julia; Thompson, Valeria; Elkin, Eric P; Ramsey, Bonnie W

    2013-10-01

    The Early Pseudomonal Infection Control (EPIC) randomized trial rigorously evaluated the efficacy of different antibiotic regimens for eradication of newly identified Pseudomonas (Pa) in children with cystic fibrosis (CF). Protocol based therapy in the trial was provided based on culture positivity independent of symptoms. It is unclear whether outcomes observed in the clinical trial were different than those that would have been observed with historical standard of care driven more heavily by respiratory symptoms than culture positivity alone. We hypothesized that the incidence of Pa recurrence and hospitalizations would be significantly reduced among trial participants as compared to historical controls whose standard of care preceded the widespread adoption of tobramycin inhalation solution (TIS) as initial eradication therapy at the time of new isolation of Pa. Eligibility criteria from the trial were used to derive historical controls from the Epidemiologic Study of CF (ESCF) who received standard of care treatment from 1995 to 1998, before widespread availability of TIS. Pa recurrence and hospitalization outcomes were assessed over a 15-month time period. As compared to 100% of the 304 trial participants, only 296/608 (49%) historical controls received antibiotics within an average of 20 weeks after new onset Pa. Pa recurrence occurred among 104/298 (35%) of the trial participants as compared to 295/549 (54%) of historical controls (19% difference, 95% CI: 12%, 26%, P < 0.001). No significant differences in the incidence of hospitalization were observed between cohorts. Protocol-based antimicrobial therapy for newly acquired Pa resulted in a lower rate of Pa recurrence but comparable hospitalization rates as compared to a historical control cohort less aggressively treated with antibiotics for new onset Pa. © 2013 Wiley Periodicals, Inc.
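The reported 19% difference in Pa recurrence and its 95% CI can be reproduced from the counts in the abstract with a standard Wald interval for a difference of two proportions; a minimal sketch (assuming an unadjusted comparison):

```python
from math import sqrt

def risk_difference_ci(e1, n1, e2, n2, z=1.96):
    """Wald 95% CI for the difference of two proportions (p1 - p2)."""
    p1, p2 = e1 / n1, e2 / n2
    diff = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Pa recurrence: historical controls 295/549 vs trial participants 104/298
diff, lo, hi = risk_difference_ci(295, 549, 104, 298)
print(f"difference {diff:.0%}, 95% CI ({lo:.0%}, {hi:.0%})")
# -> difference 19%, 95% CI (12%, 26%), matching the abstract
```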

  3. Standard Care versus Protocol Based Therapy for New Onset Pseudomonas aeruginosa in Cystic Fibrosis

    PubMed Central

    Mayer-Hamblett, Nicole; Rosenfeld, Margaret; Treggiari, Miriam M.; Konstan, Michael W.; Retsch-Bogart, George; Morgan, Wayne; Wagener, Jeff; Gibson, Ronald L.; Khan, Umer; Emerson, Julia; Thompson, Valeria; Elkin, Eric P.; Ramsey, Bonnie W.

    2014-01-01

    Rationale The Early Pseudomonal Infection Control (EPIC) randomized trial rigorously evaluated the efficacy of different antibiotic regimens for eradication of newly identified Pseudomonas (Pa) in children with cystic fibrosis (CF). Protocol based therapy in the trial was provided based on culture positivity independent of symptoms. It is unclear whether outcomes observed in the clinical trial were different than those that would have been observed with historical standard of care driven more heavily by respiratory symptoms than culture positivity alone. We hypothesized that the incidence of Pa recurrence and hospitalizations would be significantly reduced among trial participants as compared to historical controls whose standard of care preceded the widespread adoption of tobramycin inhalation solution (TIS) as initial eradication therapy at the time of new isolation of Pa. Methods Eligibility criteria from the trial were used to derive historical controls from the Epidemiologic Study of CF (ESCF) who received standard of care treatment from 1995 to 1998, before widespread availability of TIS. Pa recurrence and hospitalization outcomes were assessed over a 15-month time period. Results As compared to 100% of the 304 trial participants, only 296/608 (49%) historical controls received antibiotics within an average of 20 weeks after new onset Pa. Pa recurrence occurred among 104/298 (35%) of the trial participants as compared to 295/549 (54%) of historical controls (19% difference, 95% CI: 12%, 26%, p<0.001). No significant differences in the incidence of hospitalization were observed between cohorts. Conclusions Protocol-based antimicrobial therapy for newly acquired Pa resulted in a lower rate of Pa recurrence but comparable hospitalization rates as compared to a historical control cohort less aggressively treated with antibiotics for new onset Pa. PMID:23818295

  4. Kilovoltage cone-beam CT: Comparative dose and image quality evaluations in partial and full-angle scan protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Sangroh; Yoo, Sua; Yin Fangfang

    2010-07-15

    Purpose: To assess imaging dose of partial and full-angle kilovoltage CBCT scan protocols and to evaluate image quality for each protocol. Methods: The authors obtained the CT dose index (CTDI) of the kilovoltage CBCT protocols in an on-board imager by ion chamber (IC) measurements and Monte Carlo (MC) simulations. A total of six new CBCT scan protocols were evaluated: Standard-dose head (100 kVp, 151 mA s, partial-angle), low-dose head (100 kVp, 75 mA s, partial-angle), high-quality head (100 kVp, 754 mA s, partial-angle), pelvis (125 kVp, 706 mA s, full-angle), pelvis spotlight (125 kVp, 752 mA s, partial-angle), and low-dose thorax (110 kVp, 271 mA s, full-angle). Using the point dose method, various CTDI values were calculated by (1) the conventional weighted CTDI (CTDI_w) calculation and (2) Bakalyar's method (CTDI_wb). The MC simulations were performed to obtain the CTDI_w and CTDI_wb, as well as from (3) central slice averaging (CTDI_2D) and (4) volume averaging (CTDI_3D) techniques. The CTDI values of the new protocols were compared to those of the old protocols (full-angle CBCT protocols). Image quality of the new protocols was evaluated following the CBCT image quality assurance (QA) protocol [S. Yoo et al., "A quality assurance program for the on-board imager®," Med. Phys. 33(11), 4431-4447 (2006)] testing Hounsfield unit (HU) linearity, spatial linearity/resolution, contrast resolution, and HU uniformity. Results: The CTDI_w were found as 6.0, 3.2, 29.0, 25.4, 23.8, and 7.7 mGy for the new protocols, respectively. The CTDI_w and CTDI_wb differed within 3% between IC measurements and MC simulations. Method (2) results were within ±12% of method (1). In MC simulations, the CTDI_w and CTDI_wb were comparable to the CTDI_2D and CTDI_3D, with the differences ranging from -4.3% to 20.6%. The CTDI_3D were smallest among all the CTDI values.
CTDI_w of the new protocols was found to be approximately 14 times lower for the standard head scan and 1.8 times lower for the standard body scan than for the old protocols, respectively. In the image quality QA tests, all the protocols except the low-dose head and low-dose thorax protocols were within tolerance in the HU verification test. The HU value for these two protocols was always higher than the nominal value. All the protocols passed the spatial linearity/resolution and HU uniformity tests. In the contrast resolution test, only the high-quality head and pelvis scan protocols were within tolerance. In addition, a crescent effect was found in the partial-angle scan protocols. Conclusions: The authors found that CTDI_w of the new CBCT protocols has been significantly reduced compared to the old protocols, with acceptable image quality. The CTDI_w values in the point dose method were close to the volume averaging method within 9%-21% for all the CBCT scan protocols. Bakalyar's method produced more accurate dose estimation, within 14%. The HU inaccuracy from the low-dose head and low-dose thorax protocols can render incorrect dose results in the treatment planning system. When high soft-tissue contrast data are desired, the high-quality head or pelvis scan protocol is recommended depending on the imaging area. The point dose method can be applied to estimate CBCT dose with reasonable accuracy in the clinical environment.
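The conventional weighted CTDI used as method (1) above is standardly defined as one third of the central phantom measurement plus two thirds of the mean peripheral measurement; a minimal sketch under that standard definition (the ion-chamber readings below are illustrative, not values from the paper):

```python
def ctdi_w(ctdi_center, ctdi_periphery_vals):
    """Conventional weighted CTDI (mGy): 1/3 of the centre reading plus
    2/3 of the mean reading over the peripheral holes of the CTDI phantom."""
    periph = sum(ctdi_periphery_vals) / len(ctdi_periphery_vals)
    return ctdi_center / 3.0 + 2.0 * periph / 3.0

# Illustrative ion-chamber readings (mGy) at the centre and the four
# peripheral positions of a head phantom -- not data from the study.
print(round(ctdi_w(5.1, [6.3, 6.0, 6.6, 6.3]), 2))  # -> 5.9
```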

  5. Acupuncture for common cold: A systematic review and meta-analysis protocol.

    PubMed

    Cheng, Ying; Gao, Bifeng; Jin, Yuhao; Xu, Na; Guo, Taipin

    2018-03-01

    The common cold (CC) is the most common infectious syndrome in human beings, but there is currently no special treatment. For this reason, acupuncture is used to relieve the symptoms of the CC. Acupuncture is a traditional Chinese medicine (TCM) therapy that has been used for over 2000 years to treat various diseases. However, few studies have provided evidence for the efficacy and safety of acupuncture for the CC. This study aims to evaluate the effectiveness and safety of acupuncture on CC periods and its symptoms. The following electronic databases will be searched for studies conducted through January 1, 2019: Web of Science, Cochrane Library, EBASE, World Health Organization International Clinical Trials Registry Platform, Springer, Wan-fang database, Chinese Biomedical Literature Database (CBM), Chinese Scientific Journal Database (VIP), China National Knowledge Infrastructure (CNKI), and other sources. All randomized controlled trials on acupuncture for common cold will be included. Risk of bias will be assessed using the Cochrane risk of bias assessment tool, while RevMan V.5.3.5 software will be implemented for the assessment of bias risk, data synthesis, subgroup analysis, and meta-analyses if conditions are met. Continuous outcomes will be presented as mean difference (MD) or standardized mean difference (SMD), while dichotomous data will be expressed as relative risk. A high-quality synthesis of current evidence of acupuncture for CC will be provided from several aspects using subjective reports and objective measures of performance. The reduction rate of common cold symptoms after initial treatment, resolved cold symptoms, and reduced cold duration will be collected. This protocol will present the evidence of whether acupuncture therapy is an effective intervention for CC.
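The standardized mean difference mentioned above is typically computed as the difference in group means divided by the pooled standard deviation (Cohen's d); a minimal sketch with hypothetical trial summaries (the numbers are not from any included study):

```python
from math import sqrt

def standardized_mean_difference(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d style SMD: difference in means over the pooled SD."""
    pooled_sd = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical symptom-duration summaries (days): acupuncture vs control
print(round(standardized_mean_difference(5.2, 1.8, 40, 6.1, 2.0, 42), 3))
# a negative SMD favours the first (acupuncture) group here
```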

  6. Introducing the GRACEnet/REAP Data Contribution, Discovery, and Retrieval System.

    PubMed

    Del Grosso, S J; White, J W; Wilson, G; Vandenberg, B; Karlen, D L; Follett, R F; Johnson, J M F; Franzluebbers, A J; Archer, D W; Gollany, H T; Liebig, M A; Ascough, J; Reyes-Fox, M; Pellack, L; Starr, J; Barbour, N; Polumsky, R W; Gutwein, M; James, D

    2013-07-01

    Difficulties in accessing high-quality data on trace gas fluxes and performance of bioenergy/bioproduct feedstocks limit the ability of researchers and others to address environmental impacts of agriculture and the potential to produce feedstocks. To address those needs, the GRACEnet (Greenhouse gas Reduction through Agricultural Carbon Enhancement network) and REAP (Renewable Energy Assessment Project) research programs were initiated by the USDA Agricultural Research Service (ARS). A major product of these programs is the creation of a database with greenhouse gas fluxes, soil carbon stocks, biomass yield, nutrient, and energy characteristics, and input data for modeling cropped and grazed systems. The data include site descriptors (e.g., weather, soil class, spatial attributes), experimental design (e.g., factors manipulated, measurements performed, plot layouts), management information (e.g., planting and harvesting schedules, fertilizer types and amounts, biomass harvested, grazing intensity), and measurements (e.g., soil C and N stocks, plant biomass amount and chemical composition). To promote standardization of data and ensure that experiments were fully described, sampling protocols and a spreadsheet-based data-entry template were developed. Data were first uploaded to a temporary database for checking and then were uploaded to the central database. A Web-accessible application allows registered users to query and download data including measurement protocols. Separate portals have been provided for each project (GRACEnet and REAP) at nrrc.ars.usda.gov/slgracenet/#/Home and nrrc.ars.usda.gov/slreap/#/Home. The database architecture and data entry template have proven flexible and robust for describing a wide range of field experiments and thus appear suitable for other natural resource research projects. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  7. Database documentation of marine mammal stranding and mortality: current status review and future prospects.

    PubMed

    Chan, Derek K P; Tsui, Henry C L; Kot, Brian C W

    2017-11-21

    Databases are systematic tools to archive and manage information related to marine mammal stranding and mortality events. Stranding response networks, governmental authorities and non-governmental organizations have established regional or national stranding networks and have developed unique standard stranding response and necropsy protocols to document and track stranded marine mammal demographics, signalment and health data. The objectives of this study were to (1) describe and review the current status of marine mammal stranding and mortality databases worldwide, including the year established, types of database and their goals; and (2) summarize the geographic range included in the database, the number of cases recorded, accessibility, filter and display methods. Peer-reviewed literature was searched, focussing on published databases of live and dead marine mammal strandings and mortality and information released from stranding response organizations (i.e. online updates, journal articles and annual stranding reports). Databases that were not published in the primary literature or recognized by government agencies were excluded. Based on these criteria, 10 marine mammal stranding and mortality databases were identified, and strandings and necropsy data found in these databases were evaluated. We discuss the results, limitations and future prospects of database development. Future prospects include the development and application of virtopsy, a new necropsy investigation tool. A centralized web-accessed database of all available postmortem multimedia from stranded marine mammals may eventually support marine conservation and policy decisions, which will allow the use of marine animals as sentinels of ecosystem health, working towards a 'One Ocean-One Health' ideal.

  8. Use of cardiocerebral resuscitation or AHA/ERC 2005 Guidelines is associated with improved survival from out-of-hospital cardiac arrest: a systematic review and meta-analysis.

    PubMed

    Salmen, Marcus; Ewy, Gordon A; Sasson, Comilla

    2012-01-01

    To determine whether the use of cardiocerebral resuscitation (CCR) or AHA/ERC 2005 Resuscitation Guidelines improved patient outcomes from out-of-hospital cardiac arrest (OHCA) compared to older guidelines. Systematic review and meta-analysis. MEDLINE, EMBASE, Web of Science and the Cochrane Library databases. We also hand-searched study references and consulted experts. Design: randomised controlled trials and observational studies. OHCA patients, age >17 years. 'Control' protocol versus 'Study' protocol. 'Control' protocol defined as AHA/ERC 2000 Guidelines for cardiopulmonary resuscitation (CPR). 'Study' protocol defined as AHA/ERC 2005 Guidelines for CPR, or a CCR protocol. Survival to hospital discharge. High-quality or medium-quality studies, as measured by the Newcastle Ottawa Scale using predefined categories. Twelve observational studies met inclusion criteria. All three studies using CCR demonstrated significantly improved survival compared to use of AHA 2000 Guidelines, as did five of the nine studies using AHA/ERC 2005 Guidelines. Pooled data demonstrate that use of a CCR protocol has an unadjusted OR of 2.26 (95% CI 1.64 to 3.12) for survival to hospital discharge among all cardiac arrest patients. Among witnessed ventricular fibrillation/ventricular tachycardia (VF/VT) patients, CCR increased survival by an OR of 2.98 (95% CI 1.92 to 4.62). Studies using AHA/ERC 2005 Guidelines showed an overall trend towards increased survival, but significant heterogeneity existed among these studies. We demonstrate an association with improved survival from OHCA when CCR protocols or AHA/ERC 2005 Guidelines are compared to use of older guidelines. In the subgroup of patients with witnessed VF/VT, there was a threefold increase in OHCA survival when CCR was used. CCR appears to be a promising resuscitation protocol for Emergency Medical Services providers in increasing survival from OHCA. 
Future research will need to be conducted to directly compare AHA/ERC 2010 Guidelines with the CCR approach.
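An unadjusted odds ratio with a Woolf (log-scale) 95% CI, of the kind pooled above, is computed from a 2×2 table of survived/died counts per protocol; a minimal sketch (the counts below are hypothetical, not the review's pooled data):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with Woolf 95% CI from a 2x2 table:
    a/b = survived/died under the study protocol,
    c/d = survived/died under the control protocol."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_ci(60, 240, 30, 270)
print(f"OR {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```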

  9. Heart rate variability indexes as a marker of chronic adaptation in athletes: a systematic review.

    PubMed

    da Silva, Vanessa Pereira; de Oliveira, Natacha Alves; Silveira, Heitor; Mello, Roger Gomes Tavares; Deslandes, Andrea Camaz

    2015-03-01

    Regular exercise promotes functional and structural changes in the central and peripheral mechanisms of the cardiovascular system. Heart rate variability (HRV) measurement provides a sensitive indicator of the autonomic balance. However, because of the diversity of methods and variables used, the results are difficult to compare in the sports sciences. Since the protocol (supine, sitting, or standing position) and measure (time or frequency domain) are not well defined, the aim of this study is to investigate the HRV measures that best indicate the chronic adaptations of physical exercise in athletes. PubMed (MEDLINE), Web of Science, SciELO (Scientific Electronic Library), and Scopus databases were consulted. Original complete articles in English with short-term signals evaluating young and adult athletes, between 17 and 40 years old, with a control group, published up to 2013 were included. Nineteen of 1369 studies were selected, for a total sample pool of 333 male and female athletes who practice different sports. The main protocols observed were the supine or standing positions in free or controlled breathing conditions. The main statistical results found in this study were the higher mean RR interval, standard deviation of RR intervals, and high-frequency power in the athletes group. In addition, analyses of Cohen's effect size showed that factors such as sport modality, protocol used, and unit of measure selected could influence these expected results. Our findings indicate that time-domain measures are more consistent than frequency-domain measures for describing the chronic cardiovascular autonomic adaptations in athletes. © 2014 Wiley Periodicals, Inc.
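The time-domain HRV indices this review favors (mean RR, SDNN, RMSSD) can be computed directly from an RR-interval series; a minimal sketch with a short illustrative series (a real short-term recording would span several minutes):

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Basic time-domain HRV indices from an RR-interval series in ms:
    mean RR, SDNN (sample SD of intervals), and RMSSD (root mean
    square of successive differences)."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)
    return {
        "mean_rr": rr.mean(),
        "sdnn": rr.std(ddof=1),
        "rmssd": np.sqrt(np.mean(diffs ** 2)),
    }

# Illustrative RR intervals (ms), not data from any reviewed study
rr = [812, 830, 845, 825, 818, 840, 852, 835]
idx = hrv_time_domain(rr)
print({k: round(float(v), 1) for k, v in idx.items()})
```

Higher mean RR, SDNN, and high-frequency power in athletes, as reported above, reflect greater vagal modulation at rest.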

  10. Development of Uniform Protocol for Alopecia Areata Clinical Trials.

    PubMed

    Solomon, James A

    2015-11-01

    Developing a successful treatment for alopecia areata (AA) clearly has not been at the forefront of the agenda for new drug/device development among the pharmaceutical and medical device industry. The National Alopecia Areata Foundation (NAAF), a patient advocacy group, initiated a plan to facilitate and drive clinical research toward finding safe and efficacious treatments for AA. As such, Alopecia Areata Uniform Protocols for clinical trials to test new treatments for AA were developed. The uniform protocol is designed to provide a plug-and-play template as well as a framework wherein data from studies utilizing the uniform protocol can be compared through consistent inclusion/exclusion criteria, safety assessments, and outcome assessment measures. A core uniform protocol was produced for use by pharmaceutical companies in testing proof of concept for investigational products to treat AA. The core protocol includes a standardized title, informed consent, inclusion/exclusion criteria, disease outcome assessments, and safety assessments. The statistical methodology to assess successful outcomes will also be standardized. The protocol as well as the informed consent form has been approved in concept by Liberty IRB and is ready to present to pharmaceutical companies.

  11. Content-based information retrieval in forensic image databases.

    PubMed

    Geradts, Zeno; Bijhold, Jurrien

    2002-03-01

    This paper gives an overview of the available image databases and of ways of searching these databases by image content. Research developments in image-database searching are evaluated and compared with existing forensic databases. Forensic image databases of fingerprints, faces, shoeprints, handwriting, cartridge cases, drug tablets, and tool marks are described. Developments in these fields appear valuable for forensic databases, especially the MPEG-7 framework, which standardizes searching in image databases. In the future, combining these databases (including DNA databases) can result in stronger forensic evidence.

  12. MoccaDB - an integrative database for functional, comparative and diversity studies in the Rubiaceae family

    PubMed Central

    Plechakova, Olga; Tranchant-Dubreuil, Christine; Benedet, Fabrice; Couderc, Marie; Tinaut, Alexandra; Viader, Véronique; De Block, Petra; Hamon, Perla; Campa, Claudine; de Kochko, Alexandre; Hamon, Serge; Poncet, Valérie

    2009-01-01

    Background In the past few years, functional genomics information has been rapidly accumulating on Rubiaceae species and especially on those belonging to the Coffea genus (coffee trees). An increasing amount of expressed sequence tag (EST) data and of EST- or genomic-derived microsatellite markers has been generated, together with Conserved Ortholog Set (COS) markers. This considerably facilitates comparative genomics or map-based genetic studies through the common use of orthologous loci across different species. Similar genomic information is available for, e.g., tomato and potato, members of the Solanaceae family. Since both Rubiaceae and Solanaceae belong to the Euasterids I (lamiids), integration of information on genetic markers would be possible and would lead to more efficient analyses and discovery of key loci involved in important traits such as fruit development, quality, and maturation, or adaptation. Our goal was to develop a comprehensive web data source for integrated information on validated orthologous markers in Rubiaceae. Description MoccaDB is an online MySQL-PHP driven relational database that houses annotated and/or mapped microsatellite markers in Rubiaceae. In its current release, the database stores 638 markers that have been defined on 259 ESTs and 379 genomic sequences. Marker information was retrieved from 11 published works, and completed with original data on 132 microsatellite markers validated in our laboratory. DNA sequences were derived from three Coffea species/hybrids. Microsatellite markers were checked for similarity, in vitro tested for cross-amplification and diversity/polymorphism status in up to 38 Rubiaceae species belonging to the Cinchonoideae and Rubioideae subfamilies. Functional annotation was provided and some markers associated with described metabolic pathways were also integrated. Users can search the database for marker, sequence, map or diversity information through multi-option query forms. 
The retrieved data can be browsed and downloaded, along with protocols used, using a standard web browser. MoccaDB also integrates bioinformatics tools (CMap viewer and local BLAST) and hyperlinks to related external data sources (NCBI GenBank and PubMed, SOL Genomic Network database). Conclusion We believe that MoccaDB will be extremely useful for all researchers working in the areas of comparative and functional genomics and molecular evolution, in general, and population analysis and association mapping of Rubiaceae and Solanaceae species, in particular. PMID:19788737

  13. Micro X-ray Fluorescence Study of Late Pre-Hispanic Ceramics from the Western Slopes of the South Central Andes Region in the Arica y Parinacota Region, Chile: A New Methodological Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flewett, S.; Saintenoy, T.; Sepulveda, M.

    Archeological ceramic paste material typically consists of a mix of a clay matrix and various millimeter and sub-millimeter sized mineral inclusions. Micro X-ray Fluorescence (μXRF) is a standard compositional classification tool, and in this work we propose and demonstrate an improved fluorescence map processing protocol in which the mineral inclusions are automatically separated from the clay matrix to allow independent statistical analysis of the two parts. Application of this protocol allowed us to enhance the discrimination between different ceramic shards compared with the standard procedure of working with only the spatially averaged elemental concentrations. Using the new protocol, we performed an initial compositional classification of a set of 83 ceramic shards from the western slopes of the south central Andean region in the Arica y Parinacota region of present-day far northern Chile. Comparing the classifications obtained using the new versus the old (average concentrations only) protocols, we found that some samples were erroneously classified with the old protocol. From an archaeological perspective, a very broad and heterogeneous sample set was used because this was the first such study to be performed on ceramics from this region. This allowed a general overview to be obtained; however, further work on more specific sample sets will be necessary to extract concrete archaeological conclusions.
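    The core of the map-processing step is separating bright mineral-inclusion pixels from the clay matrix before computing per-phase statistics. A minimal sketch, assuming a plain intensity threshold stands in for the paper's automatic separation (real fluorescence maps would need per-element logic and noise handling):

```python
def split_inclusions(element_map, threshold):
    """Split a fluorescence intensity map into matrix and inclusion pixels
    and return the mean intensity of each phase."""
    flat = [v for row in element_map for v in row]
    matrix = [v for v in flat if v <= threshold]
    inclusions = [v for v in flat if v > threshold]
    matrix_mean = sum(matrix) / len(matrix)
    inclusion_mean = sum(inclusions) / len(inclusions) if inclusions else float("nan")
    return matrix_mean, inclusion_mean

# Toy 4x4 map: clay matrix (~1.0) with two bright mineral grains.
fe_map = [[1.0, 1.1, 0.9, 1.0],
          [1.2, 5.0, 1.0, 0.8],
          [0.9, 1.0, 6.0, 1.1],
          [1.0, 0.9, 1.0, 1.2]]
matrix_mean, inclusion_mean = split_inclusions(fe_map, threshold=3.0)
```

    Averaging over the whole map would blend the two phases; separating them first is what improves the discrimination between sherd groups.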

  14. Standardization of a Videofluoroscopic Swallow Study Protocol to Investigate Dysphagia in Dogs.

    PubMed

    Harris, R A; Grobman, M E; Allen, M J; Schachtel, J; Rawson, N E; Bennett, B; Ledyayev, J; Hopewell, B; Coates, J R; Reinero, C R; Lever, T E

    2017-03-01

    Videofluoroscopic swallow study (VFSS) is the gold standard for diagnosis of dysphagia in veterinary medicine but lacks standardized protocols that emulate physiologic feeding practices. Age impacts swallow function in humans but has not been evaluated by VFSS in dogs. To develop a protocol with custom kennels designed to allow free-feeding of 3 optimized formulations of contrast media and diets that address limitations of current VFSS protocols. We hypothesized that dogs evaluated by a free-feeding VFSS protocol would show differences in objective swallow metrics based on age. Healthy juvenile, adult, and geriatric dogs (n = 24). Prospective, experimental study. Custom kennels were developed to maintain natural feeding behaviors during VFSS. Three food consistencies (thin liquid, pureed food, and dry kibble) were formulated with either iohexol or barium to maximize palatability and voluntary prehension. Dogs were evaluated by 16 swallow metrics and compared across age groups. Development of a standardized VFSS protocol resulted in successful collection of swallow data in healthy dogs. No significant differences in swallow metrics were observed among age groups. Substantial variability was observed in healthy dogs when evaluated under these physiologic conditions. Features typically attributed to pathologic states, such as gastric reflux, were seen in healthy dogs. Development of a VFSS protocol that reflects natural feeding practices may allow emulation of physiology resulting in clinical signs of dysphagia. Age did not result in significant changes in swallow metrics, but additional studies are needed, particularly in light of substantial normal variation. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.

  15. Comparing Effective Doses During Image-Guided Core Needle Biopsies with Computed Tomography Versus C-Arm Cone Beam CT Using Adult and Pediatric Phantoms.

    PubMed

    Ben-Shlomo, A; Cohen, D; Bruckheimer, E; Bachar, G N; Konstantinovsky, R; Birk, E; Atar, E

    2016-05-01

    To compare the effective doses of image-guided needle biopsies between cone-beam C-arm CT (CBCT) and CT, based on dose measurements and simulations using adult and pediatric phantoms. Effective doses were calculated and compared based on measurements and Monte Carlo simulations of CT- and CBCT-guided biopsy procedures of the lungs, liver, and kidney using pediatric and adult phantoms. The effective doses for pediatric and adult phantoms, using our standard protocols for upper, middle, and lower lung, liver, and kidney biopsies, were significantly lower under CBCT guidance than under CT. The average effective dose for a 5-year-old for these five biopsies was 0.36 ± 0.05 mSv with the standard CBCT exposure protocols and 2.13 ± 0.26 mSv with CT. The adult average effective dose for the five biopsies was 1.63 ± 0.22 mSv with the standard CBCT protocols and 8.22 ± 1.02 mSv with CT. The CT effective dose was higher than that of the CBCT protocols for the child and adult phantoms by 803% and 590% for upper lung, 639% and 525% for mid-lung, and 461% and 251% for lower lung, respectively. Similarly, the effective dose was higher by 691% and 762% for liver and 513% and 608% for kidney biopsies. Based on measurements and simulations with pediatric and adult phantoms, radiation effective doses during image-guided needle biopsies of the lung, liver, and kidney are significantly lower with CBCT than with CT.

  16. Development of new method and protocol for cryopreservation related to embryo and oocytes freezing in terms of fertilization rate: A comparative study including review of literature

    PubMed Central

    Barik, Mayadhar; Bajpai, Minu; Patnaik, Santosh; Mishra, Pravash; Behera, Priyamadhaba; Dwivedi, Sada Nanda

    2016-01-01

    Background: Cryopreservation is essentially limited to thin samples or small clumps of cells that can be cooled quickly without loss. Our main objective was to establish an innovative method and protocol for cryopreservation as a gold standard for clinical laboratory practice and treatment. Knowledge of the usefulness of cryopreservation in clinical practice is essential to carry clinical practice and research forward. Materials and Methods: We compared different methods of cryopreservation (in two dozen cells), and at the same time compared embryo and oocyte freezing in terms of fertilization rate according to the international standard protocol. Results: The combination of cryoprotectants and regimes of rapid cooling and rinsing during warming often allows successful cryopreservation of biological materials, particularly cell suspensions or thin tissue samples. Examples include semen, blood, tissue samples such as tumors and histological cross-sections, human eggs, and human embryos. Many studies have reported that children born from frozen embryos, or "frosties," show consistently positive outcomes with no increase in birth defects or developmental abnormalities; these results are similar to ours (50–85%). Conclusions: Cryopreservation technology provided useful cell survivability and tissue and organ preservation. Although results vary with laboratory conditions, the technology is certainly beneficial for patient treatment and research. Further studies are needed for standardization and development of new protocols. PMID:27512686

  17. An in silico method to identify computer-based protocols worthy of clinical study: An insulin infusion protocol use case

    PubMed Central

    Wong, Anthony F; Pielmeier, Ulrike; Haug, Peter J; Andreassen, Steen

    2016-01-01

    Objective Develop an efficient non-clinical method for identifying promising computer-based protocols for clinical study. An in silico comparison can provide information that informs the decision to proceed to a clinical trial. The authors compared two existing computer-based insulin infusion protocols: eProtocol-insulin from Utah, USA, and Glucosafe from Denmark. Materials and Methods The authors used eProtocol-insulin to manage intensive care unit (ICU) hyperglycemia with intravenous (IV) insulin from 2004 to 2010. Recommendations accepted by the bedside clinicians directly link the subsequent blood glucose values to eProtocol-insulin recommendations and provide a unique clinical database. The authors retrospectively compared in silico 18 984 eProtocol-insulin continuous IV insulin infusion rate recommendations from 408 ICU patients with those of Glucosafe, the candidate computer-based protocol. The subsequent blood glucose measurement value (low, on target, high) was used to identify if the insulin recommendation was too high, on target, or too low. Results Glucosafe consistently provided more favorable continuous IV insulin infusion rate recommendations than eProtocol-insulin for on target (64% of comparisons), low (80% of comparisons), or high (70% of comparisons) blood glucose. Aggregated eProtocol-insulin and Glucosafe continuous IV insulin infusion rates were clinically similar though statistically significantly different (Wilcoxon signed rank test P = .01). In contrast, when stratified by low, on target, or high subsequent blood glucose measurement, insulin infusion rates from eProtocol-insulin and Glucosafe were statistically significantly different (Wilcoxon signed rank test, P < .001), and clinically different. Discussion This in silico comparison appears to be an efficient nonclinical method for identifying promising computer-based protocols. 
Conclusion Preclinical in silico comparison analytical framework allows rapid and inexpensive identification of computer-based protocol care strategies that justify expensive and burdensome clinical trials. PMID:26228765
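    The grading logic at the heart of the in silico comparison — labeling each infusion-rate recommendation by the blood glucose value that followed it — can be sketched as below. The target band is a placeholder, not the studies' actual range:

```python
TARGET_LOW, TARGET_HIGH = 80, 110   # hypothetical glucose target band, mg/dL

def grade_recommendation(subsequent_bg):
    """Grade an insulin-rate recommendation by the next blood glucose value:
    a low value suggests the recommended rate was too high,
    a high value that it was too low."""
    if subsequent_bg < TARGET_LOW:
        return "too high"
    if subsequent_bg > TARGET_HIGH:
        return "too low"
    return "on target"
```

    Applying this grading to every accepted eProtocol-insulin recommendation, then asking which protocol's rate would have been more favorable in each case, is what lets the comparison run on retrospective data with no patient risk.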

  18. Validation of asthma recording in electronic health records: protocol for a systematic review.

    PubMed

    Nissen, Francis; Quint, Jennifer K; Wilkinson, Samantha; Mullerova, Hana; Smeeth, Liam; Douglas, Ian J

    2017-05-29

    Asthma is a common, heterogeneous disease with significant morbidity and mortality worldwide. It can be difficult to define in epidemiological studies using electronic health records, as the diagnosis is based on non-specific respiratory symptoms and spirometry, neither of which is routinely registered. Electronic health records can nonetheless be valuable for studying the epidemiology, management, healthcare use and control of asthma. For health databases to be useful sources of information, asthma diagnoses should ideally be validated. The primary objectives are to provide an overview of the methods used to validate asthma diagnoses in electronic health records and to summarise the results of the validation studies. EMBASE and MEDLINE will be systematically searched using appropriate search terms. The searches will cover all studies in these databases up to October 2016 with no start date and will yield studies that have validated algorithms or codes for the diagnosis of asthma in electronic health records. At least one test validation measure (sensitivity, specificity, positive predictive value, negative predictive value or other) is necessary for inclusion. In addition, we require the validated algorithms to be compared with an external gold standard, such as a manual review, a questionnaire or an independent second database. We will summarise key data including author, year of publication, country, time period, date, data source, population, case characteristics, clinical events, algorithms, gold standard and validation statistics in a uniform table. This study is a synthesis of previously published studies and, therefore, no ethical approval is required. The results will be submitted to a peer-reviewed journal for publication. Results from this systematic review can be used for outcome research on asthma and to identify case definitions for asthma. CRD42016041798. 
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
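    The validation measures the review will extract all derive from a 2×2 table of algorithm result versus gold standard; a minimal sketch with illustrative counts:

```python
def validation_measures(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from true/false positive and
    negative counts of an algorithm judged against a gold standard."""
    return {
        "sensitivity": tp / (tp + fn),   # true cases the algorithm catches
        "specificity": tn / (tn + fp),   # non-cases it correctly excludes
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

m = validation_measures(tp=90, fp=10, fn=30, tn=870)  # illustrative counts only
```

    Tabulating these four measures per study, alongside the algorithm and gold standard used, is what makes the planned summary table comparable across databases.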

  19. Effectiveness of a Rapid Lumbar Spine MRI Protocol Using 3D T2-Weighted SPACE Imaging Versus a Standard Protocol for Evaluation of Degenerative Changes of the Lumbar Spine.

    PubMed

    Sayah, Anousheh; Jay, Ann K; Toaff, Jacob S; Makariou, Erini V; Berkowitz, Frank

    2016-09-01

    Reducing lumbar spine MRI scanning time while retaining diagnostic accuracy can benefit patients and reduce health care costs. This study compares the effectiveness of a rapid lumbar MRI protocol using 3D T2-weighted sampling perfection with application-optimized contrast with different flip-angle evolutions (SPACE) sequences with a standard MRI protocol for evaluation of lumbar spondylosis. Two hundred fifty consecutive unenhanced lumbar MRI examinations performed at 1.5 T were retrospectively reviewed. Full, rapid, and complete versions of each examination were interpreted for spondylotic changes at each lumbar level, including herniations and neural compromise. The full examination consisted of sagittal T1-weighted, T2-weighted turbo spin-echo (TSE), and STIR sequences; and axial T1- and T2-weighted TSE sequences (time, 18 minutes 40 seconds). The rapid examination consisted of sagittal T1- and T2-weighted SPACE sequences, with axial SPACE reformations (time, 8 minutes 46 seconds). The complete examination consisted of the full examination plus the T2-weighted SPACE sequence. Sensitivities and specificities of the full and rapid examinations were calculated using the complete study as the reference standard. The rapid and full studies had sensitivities of 76.0% and 69.3%, with specificities of 97.2% and 97.9%, respectively, for all degenerative processes. Rapid and full sensitivities were 68.7% and 66.3% for disk herniation, 85.2% and 81.5% for canal compromise, 82.9% and 69.1% for lateral recess compromise, and 76.9% and 69.7% for foraminal compromise, respectively. Isotropic SPACE T2-weighted imaging provides high-quality imaging of lumbar spondylosis, with multiplanar reformatting capability. Our SPACE-based rapid protocol had sensitivities and specificities for herniations and neural compromise comparable to those of the protocol without SPACE. 
This protocol fits within a 15-minute slot, potentially reducing costs and discomfort for a large subgroup of patients.

  20. Use of dual-energy X-ray absorptiometry (DXA) for diagnosis and fracture risk assessment; WHO-criteria, T- and Z-score, and reference databases.

    PubMed

    Dimai, Hans P

    2017-11-01

    Dual-energy X-ray absorptiometry (DXA) is a two-dimensional imaging technology developed to assess bone mineral density (BMD) of the entire human skeleton and also specifically of the skeletal sites known to be most vulnerable to fracture. In order to simplify interpretation of BMD measurement results and allow comparability among different DXA devices, the T-score concept was introduced. In this concept, an individual's BMD is compared with the mean value of a young healthy reference population, with the difference expressed in standard deviations (SDs). Since the early 1990s, the diagnostic categories "normal," "osteopenia," and "osteoporosis," as recommended by a WHO Working Group, have been based on this concept. Thus, DXA is still the globally accepted gold-standard method for the noninvasive diagnosis of osteoporosis. Another score obtained from DXA measurement, termed the Z-score, describes the number of SDs by which the BMD of an individual differs from the mean value expected for age and sex. Although not intended for diagnosis of osteoporosis in adults, it nevertheless provides information about an individual's fracture risk compared with peers. DXA measurement can either be used as a stand-alone means of assessing an individual's fracture risk, or be incorporated into one of the available fracture risk assessment tools such as FRAX® or Garvan, thus improving the predictive power of such tools. The question of which reference databases DXA device manufacturers should use for T-score reference standards has recently been addressed by an expert group, which recommended use of the National Health and Nutrition Examination Survey III (NHANES III) database for the hip reference standard but manufacturers' own databases for the lumbar spine. Furthermore, in men it is recommended to use female reference databases for calculation of the T-score and male reference databases for calculation of the Z-score. Copyright © 2017 Elsevier Inc. All rights reserved.
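    Both scores are simple standardizations; a sketch with invented reference values (the BMD, means, and SDs are assumptions, while the formulas and the standard WHO T-score cut-offs are as commonly published):

```python
def t_score(bmd, young_mean, young_sd):
    """SDs by which BMD differs from the young healthy reference mean."""
    return (bmd - young_mean) / young_sd

def z_score(bmd, age_sex_mean, age_sex_sd):
    """SDs by which BMD differs from the mean expected for age and sex."""
    return (bmd - age_sex_mean) / age_sex_sd

def who_category(t):
    """Diagnostic category from the T-score (standard WHO cut-offs)."""
    if t <= -2.5:
        return "osteoporosis"
    if t < -1.0:
        return "osteopenia"
    return "normal"

# Illustrative femoral-neck BMD in g/cm^2; reference values are invented.
t = t_score(bmd=0.70, young_mean=0.85, young_sd=0.12)   # -1.25
```

    The choice of reference database changes `young_mean` and `young_sd`, and therefore the T-score and category — which is exactly why the expert-group recommendation on reference databases matters.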

  1. Impact of a Newly Implemented Burn Protocol on Surgically Managed Partial Thickness Burns at a Specialized Burns Center in Singapore.

    PubMed

    Tay, Khwee-Soon Vincent; Chong, Si-Jack; Tan, Bien-Keem

    2016-03-01

    This study evaluated the impact of a newly implemented protocol for superficial to mid-dermal partial thickness burns, involving early surgery and rapid coverage with biosynthetic dressing, in a specialized national burns center in Singapore. Consecutive patients with 5% or greater total body surface area (TBSA) superficial to mid-dermal partial thickness burn injury admitted to the Burns Centre at the Singapore General Hospital between August and December 2014 for surgery within 48 hours of injury were prospectively recruited into the study to form the protocol group. Comparable historical cases from the year 2013, retrieved from the burns center audit database, formed the historical control group. Demographics (age, sex), type and depth of burns, %TBSA burnt, number of operative sessions, and length of stay were recorded for each patient in both cohorts. Thirty-nine burns patients managed under the new protocol were compared with historical controls (n = 39) comparable in age and extensiveness of burns. A significantly shorter length of stay per %TBSA (P < 0.05) was observed in the new protocol group (0.74 day/%TBSA) versus historical controls (1.55 day/%TBSA). Fewer operative sessions were needed under the new protocol for burns of 10% or greater TBSA (P < 0.05). The authors report their promising experience with a newly implemented protocol for surgically managed burns patients involving early surgery and appropriate use of biosynthetic dressing on superficial to mid-dermal partial thickness burns. Clinically, shorter lengths of stay, fewer operative sessions, and a decreased need for skin grafting were observed.

  2. Digital Dental X-ray Database for Caries Screening

    NASA Astrophysics Data System (ADS)

    Rad, Abdolvahab Ehsani; Rahim, Mohd Shafry Mohd; Rehman, Amjad; Saba, Tanzila

    2016-06-01

    A standard database is an essential requirement for comparing the performance of image analysis techniques, and the main obstacle in dental image analysis has been the lack of an available image database, which this paper provides. Periapical dental X-ray images suitable for analysis and approved by dental experts were collected. This type of dental radiograph is common and inexpensive, and is normally used for dental disease diagnosis and abnormality detection. The database contains 120 periapical X-ray images covering upper and lower jaws. This digital dental database provides a common source for researchers to apply, compare, and improve image analysis techniques.

  3. A semi-nested real-time PCR method to detect low chimerism percentage in small quantity of hematopoietic stem cell transplant DNA samples.

    PubMed

    Aloisio, Michelangelo; Bortot, Barbara; Gandin, Ilaria; Severini, Giovanni Maria; Athanasakis, Emmanouil

    2017-02-01

    Chimerism status evaluation of post-allogeneic hematopoietic stem cell transplantation samples is essential to predict post-transplant relapse. The most commonly used technique capable of detecting small increments of chimerism is quantitative real-time PCR. Although this method is already used in several laboratories, previously described protocols often lack sensitivity, and the amount of DNA required for each chimerism analysis is too high. In the present study, we compared a novel semi-nested allele-specific real-time PCR (sNAS-qPCR) protocol with our in-house standard allele-specific real-time PCR (gAS-qPCR) protocol. We selected two genetic markers and analyzed the technical parameters (slope, y-intercept, R2, and standard deviation) that determine the performance of the two protocols. The sNAS-qPCR protocol showed better sensitivity and precision. Moreover, the sNAS-qPCR protocol requires only 10 ng of input DNA, at least 10-fold less than the gAS-qPCR protocols described in the literature. Finally, the proposed sNAS-qPCR protocol could prove very useful for performing chimerism analysis with small amounts of DNA, as in the case of blood cell subsets.
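    The standard-curve parameters compared in the study (slope, y-intercept, R2) map directly onto assay performance; for instance, amplification efficiency follows from the slope of Cq versus log10 of template input. A sketch of that widely used relationship (not a calculation from the paper's own curves):

```python
def amplification_efficiency(slope):
    """PCR amplification efficiency from the standard-curve slope
    (Cq vs log10 template input); a slope near -3.32 corresponds to
    ~100% efficiency, i.e. doubling each cycle."""
    return 10 ** (-1.0 / slope) - 1.0

eff = amplification_efficiency(-3.32)   # close to 1.0 (100%)
```

    Comparing slopes (and thus efficiencies) alongside R2 and standard deviation is how the two qPCR protocols' sensitivity and precision can be judged quantitatively.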

  4. SU-F-J-16: Planar KV Imaging Dose Reduction Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gershkevitsh, E; Zolotuhhin, D

    Purpose: IGRT has become an indispensable tool in modern radiotherapy, with kV imaging used in many departments because of its superior image quality and lower dose compared with MV imaging. Many departments use manufacturer-supplied imaging protocols, which are not always optimised between image quality and radiation dose (ALARA). Methods: The whole-body phantom PBU-50 (Kyoto Kagaku Ltd., Japan), designed for radiology imaging, was imaged on a Varian iX accelerator (Varian Medical Systems, USA) with the OBI 1.5 system. The manufacturer's default protocols were adapted by modifying kV and mAs values when imaging different anatomical regions of the phantom (head, thorax, abdomen, pelvis, extremities). Images acquired with different settings were independently reviewed by two observers, and their suitability for IGRT set-up correction protocols was evaluated. The suitable images with the lowest mAs were then selected. The entrance surface dose (ESD) for the manufacturer's default protocols and the modified protocols was measured with an RTI Black Piranha (RTI Group, Sweden) and compared. Image quality was also measured with a kVQC phantom (Standard Imaging, USA) for the different protocols. The modified protocols have been applied in clinical work. Results: In most cases the optimised protocols reduced the ESD on average by a factor of 3 (range 0.9–8.5). Further reduction in ESD was observed by applying the bow-tie filter designed for CBCT. The largest dose reduction (12.2 times) was observed for the thorax lateral protocol. The dose was slightly increased (by 10%) for the large-pelvis AP protocol. Conclusion: The manufacturer's default IGRT protocols can be optimised to reduce the ESD to the patient without losing the image quality necessary for patient set-up correction. For patient set-up with planar kV imaging, bony anatomy is mostly used, and optimisation should focus on this aspect. The current approach with an anthropomorphic phantom is therefore more advantageous for optimisation than standard kV quality-control phantoms and SNR metrics.

  5. Effect of various Danshen injections on patients with coronary heart disease after percutaneous coronary intervention: A protocol for a systematic review and network meta-analysis.

    PubMed

    Zhu, Zehao; Wang, Yuanping; Liao, Weilin; Li, Huimin; Wang, Dawei

    2018-06-01

    Patients with coronary heart disease (CHD) who undergo percutaneous coronary intervention (PCI) have a certain risk of vascular complications, including coronary restenosis and thrombosis. Many recent randomized controlled trials have reported that Danshen injection (DSI) combined with conventional Western medicine can significantly reduce the occurrence of major cardiovascular adverse events in patients with CHD after PCI. However, there are many types of DSIs, and no study has yet compared each type. Therefore, we propose a study protocol for the systematic evaluation of the efficacy of various DSIs in the treatment of CHD after PCI. We will search the following electronic databases for randomized controlled trials evaluating the effect of DSI in patients with CHD after PCI: PubMed, Embase, Web of Science, Cochrane Library, Scopus, Ovid Evidence-Based Medicine Reviews, China National Knowledge Infrastructure, and Chinese Biomedicine Literature Database. Each database will be searched from inception to April 2018. The entire process will include study selection, data extraction, risk of bias assessment, pairwise meta-analyses, and network meta-analyses. This proposed study will compare the efficacy of different DSIs in the treatment of patients with CHD after PCI. The outcomes will include major cardiovascular adverse events and left ventricular ejection fraction. This proposed systematic review will evaluate the different advantages of various types of DSIs in the treatment of patients with CHD after PCI. PROSPERO (registration number: CRD42018092705).

  6. Conservative Management for Stable High Ankle Injuries in Professional Football Players.

    PubMed

    Knapik, Derrick M; Trem, Anthony; Sheehan, Joseph; Salata, Michael J; Voos, James E

    High ankle "syndesmosis" injuries are common in American football players relative to the general population. At the professional level, syndesmotic sprains represent a challenging and unique injury lacking a standardized rehabilitation protocol during conservative management. PubMed, Biosis Preview, SPORTDiscus, PEDro, and EMBASE databases were searched using the terms syndesmotic injuries, American football, conservative management, and rehabilitation. Clinical review. Level 3. When compared with lateral ankle sprains, syndesmosis injuries result in significantly prolonged recovery times and games lost. For stable syndesmotic injuries, conservative management features a brief period of immobilization and protected weightbearing followed by progressive strengthening exercises and running, and athletes can expect to return to competition in 2 to 6 weeks. Further research investigating the efficacy of dry needling and blood flow restriction therapy is necessary to evaluate the benefit of these techniques in the rehabilitation process. Successful conservative management of stable syndesmotic injuries in professional American football athletes requires a thorough understanding of the anatomy, injury mechanisms, diagnosis, and rehabilitation strategies utilized in elite athletes.

  7. AFLOW-SYM: platform for the complete, automatic and self-consistent symmetry analysis of crystals.

    PubMed

    Hicks, David; Oses, Corey; Gossett, Eric; Gomez, Geena; Taylor, Richard H; Toher, Cormac; Mehl, Michael J; Levy, Ohad; Curtarolo, Stefano

    2018-05-01

    Determination of the symmetry profile of structures is a persistent challenge in materials science. Results often vary amongst standard packages, hindering autonomous materials development by requiring continuous user attention and educated guesses. This article presents a robust procedure for evaluating the complete suite of symmetry properties, featuring various representations for the point, factor and space groups, site symmetries and Wyckoff positions. The protocol determines a system-specific mapping tolerance that yields symmetry operations entirely commensurate with fundamental crystallographic principles. The self-consistent tolerance characterizes the effective spatial resolution of the reported atomic positions. The approach is compared with the most used programs and is successfully validated against the space-group information provided for over 54 000 entries in the Inorganic Crystal Structure Database (ICSD). Subsequently, a complete symmetry analysis is applied to all 1.7+ million entries of the AFLOW data repository. The AFLOW-SYM package has been implemented in, and made available for, public use through the automated ab initio framework AFLOW.
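    The central operation in tolerance-based symmetry checking is testing whether a candidate operation maps every atom onto some atom within the mapping tolerance. A minimal sketch of that distance test only — not AFLOW-SYM's actual algorithm, which also determines the tolerance self-consistently and handles periodic boundary conditions:

```python
def maps_within_tolerance(positions, op, tol):
    """Return True if operation `op` maps every atom in `positions`
    onto some atom within distance `tol` (Cartesian, non-periodic)."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    for p in positions:
        image = op(p)
        if min(dist(image, q) for q in positions) > tol:
            return False
    return True

# A square of atoms in the xy-plane is invariant under a 90-degree rotation.
square = [(1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0)]
rot90 = lambda p: (-p[1], p[0], p[2])
ok = maps_within_tolerance(square, rot90, tol=1e-6)
```

    A tolerance that is too tight rejects genuine symmetries of imprecisely reported structures, while one that is too loose admits spurious ones; choosing it from the effective spatial resolution of the reported positions is the paper's key idea.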

  8. Desensitization to Mycophenolate Mofetil: a novel 12-step protocol.

    PubMed

    Smith, M; Gonzalez-Estrada, A; Fernandez, J; Subramanian, A

    2016-07-01

    The use of mycophenolate mofetil (MMF) has become standard practice in many solid organ transplant recipients due to its efficacy and favorable risk profile compared to other immunosuppressants. There has been a single case report of successful MMF desensitization. However, that protocol did not follow current drug practice parameters. We report a successful desensitization to MMF in a double heart-kidney transplant recipient.

  9. Photodynamic therapy in endodontics: a literature review.

    PubMed

    Trindade, Alessandra Cesar; De Figueiredo, José Antônio Poli; Steier, Liviu; Weber, João Batista Blessmann

    2015-03-01

    Recently, several in vitro and in vivo studies demonstrated promising results about the use of photodynamic therapy during root canal system disinfection. However, there is no consensus on a standard protocol for its incorporation during root canal treatment. The purpose of this study was to summarize the results of research on photodynamic therapy in endodontics published in peer-reviewed journals. A review of pertinent literature was conducted using the PubMed database, and data obtained were categorized into sections in terms of relevant topics. Studies conducted in recent years highlighted the antimicrobial potential of photodynamic therapy in endodontics. However, most of these studies were not able to confirm a significant improvement in root canal disinfection for photodynamic therapy as a substitute for current disinfection methods. Its indication as an excellent adjunct to conventional endodontic therapy is well documented, however. Data suggest the need for protocol adjustments or new photosensitizer formulations to enhance photodynamic therapy predictability in endodontics.

  10. A communal catalogue reveals Earth's multiscale microbial diversity.

    PubMed

    Thompson, Luke R; Sanders, Jon G; McDonald, Daniel; Amir, Amnon; Ladau, Joshua; Locey, Kenneth J; Prill, Robert J; Tripathi, Anupriya; Gibbons, Sean M; Ackermann, Gail; Navas-Molina, Jose A; Janssen, Stefan; Kopylova, Evguenia; Vázquez-Baeza, Yoshiki; González, Antonio; Morton, James T; Mirarab, Siavash; Zech Xu, Zhenjiang; Jiang, Lingjing; Haroon, Mohamed F; Kanbar, Jad; Zhu, Qiyun; Jin Song, Se; Kosciolek, Tomasz; Bokulich, Nicholas A; Lefler, Joshua; Brislawn, Colin J; Humphrey, Gregory; Owens, Sarah M; Hampton-Marcell, Jarrad; Berg-Lyons, Donna; McKenzie, Valerie; Fierer, Noah; Fuhrman, Jed A; Clauset, Aaron; Stevens, Rick L; Shade, Ashley; Pollard, Katherine S; Goodwin, Kelly D; Jansson, Janet K; Gilbert, Jack A; Knight, Rob

    2017-11-23

    Our growing awareness of the microbial world's importance and diversity contrasts starkly with our limited understanding of its fundamental structure. Despite recent advances in DNA sequencing, a lack of standardized protocols and common analytical frameworks impedes comparisons among studies, hindering the development of global inferences about microbial life on Earth. Here we present a meta-analysis of microbial community samples collected by hundreds of researchers for the Earth Microbiome Project. Coordinated protocols and new analytical methods, particularly the use of exact sequences instead of clustered operational taxonomic units, enable bacterial and archaeal ribosomal RNA gene sequences to be followed across multiple studies and allow us to explore patterns of diversity at an unprecedented scale. The result is both a reference database giving global context to DNA sequence data and a framework for incorporating data from future studies, fostering increasingly complete characterization of Earth's microbial diversity.

  11. Rapid and reliable high-throughput methods of DNA extraction for use in barcoding and molecular systematics of mushrooms.

    PubMed

    Dentinger, Bryn T M; Margaritescu, Simona; Moncalvo, Jean-Marc

    2010-07-01

    We present two methods for DNA extraction from fresh and dried mushrooms that are adaptable to high-throughput sequencing initiatives, such as DNA barcoding. Our results show that these protocols yield ∼85% sequencing success from recently collected materials. Tests with both recent (<2 year) and older (>100 years) specimens reveal that older collections have low success rates and may be an inefficient resource for populating a barcode database. However, our method of extracting DNA from herbarium samples using a small amount of tissue is reliable and could be used for important historical specimens. The application of these protocols greatly reduces the time, and therefore cost, of generating DNA sequences from mushrooms and other fungi vs. traditional extraction methods. The efficiency of these methods illustrates that standardization and streamlining of sample processing should be shifted from the laboratory to the field. © 2009 Blackwell Publishing Ltd.

  12. Evaluation of a new very low dose imaging protocol: feasibility and impact on X-ray dose levels in electrophysiology procedures.

    PubMed

    Bourier, Felix; Reents, Tilko; Ammar-Busch, Sonia; Buiatti, Alessandra; Kottmaier, Marc; Semmler, Verena; Telishevska, Marta; Brkic, Amir; Grebmer, Christian; Lennerz, Carsten; Kolb, Christof; Hessling, Gabriele; Deisenhofer, Isabel

    2016-09-01

    This study presents and evaluates the impact of a new lowest-dose fluoroscopy protocol (Siemens AG), especially designed for electrophysiology (EP) procedures, on X-ray dose levels. From October 2014 to March 2015, 140 patients underwent an EP study on an Artis zee angiography system. The standard low-dose protocol was operated at 23 nGy (fluoroscopy) and at 120 nGy (cine-loop); the new lowest-dose protocol was operated at 8 nGy (fluoroscopy) and at 36 nGy (cine-loop). Procedural data, X-ray times, and doses were analysed in 100 complex left atrial and in 40 standard EP procedures. The resulting dose-area products were 877.9 ± 624.7 µGym² (n = 50 complex procedures, standard low dose), 199 ± 159.6 µGym² (n = 50 complex procedures, lowest dose), 387.7 ± 36.0 µGym² (n = 20 standard procedures, standard low dose), and 90.7 ± 62.3 µGym² (n = 20 standard procedures, lowest dose), P < 0.01. In the low-dose and lowest-dose groups, procedure times were 132.6 ± 35.7 vs. 126.7 ± 34.7 min (P = 0.40, complex procedures) and 72.3 ± 20.9 vs. 85.2 ± 44.1 min (P = 0.24, standard procedures), radiofrequency (RF) times were 53.8 ± 26.1 vs. 50.4 ± 29.4 min (P = 0.54, complex procedures) and 10.1 ± 9.9 vs. 12.2 ± 14.7 min (P = 0.60, standard procedures). One complication each occurred in the standard low-dose and lowest-dose groups (P = 1.0). The new lowest-dose imaging protocol reduces X-ray dose levels by 77% compared with the currently available standard low-dose protocol. From an operator standpoint, the lowest-dose settings produce a noticeably reduced image quality. The new image quality did not significantly affect procedure or RF times and did not result in higher complication rates. Regarding radiological protection, operating at lowest-dose settings should become standard in EP procedures. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.
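    The reported 77% reduction follows directly from the dose-area products above; a quick arithmetic check (the function name is ours):

```python
# Percent dose reduction implied by the reported dose-area products (µGy·m²).
def pct_reduction(standard, lowest):
    return (1 - lowest / standard) * 100

complex_red  = pct_reduction(877.9, 199.0)   # complex left atrial procedures
standard_red = pct_reduction(387.7, 90.7)    # standard EP procedures
print(round(complex_red), round(standard_red))  # both round to 77
```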

  13. Challenges to the Standardization of Burn Data Collection: A Call for Common Data Elements for Burn Care.

    PubMed

    Schneider, Jeffrey C; Chen, Liang; Simko, Laura C; Warren, Katherine N; Nguyen, Brian Phu; Thorpe, Catherine R; Jeng, James C; Hickerson, William L; Kazis, Lewis E; Ryan, Colleen M

    2018-02-20

    The use of common data elements (CDEs) is growing in medical research; CDEs have demonstrated benefit in maximizing the impact of existing research infrastructure and funding. However, the field of burn care does not have a standard set of CDEs. The objective of this study is to examine the extent of common data collected in current burn databases. This study examines the data dictionaries of six U.S. burn databases to ascertain the extent of common data. This was assessed from a quantitative and qualitative perspective. Thirty-two demographic and clinical data elements were examined. The number of databases that collect each data element was calculated. The data values for each data element were compared across the six databases for common terminology. Finally, the data prompts of the data elements were examined for common language and structure. Five (16%) of the 32 data elements are collected by all six burn databases; additionally, five data elements (16%) are present in only one database. Furthermore, there are considerable variations in data values and prompts used among the burn databases. Only one of the 32 data elements (age) contains the same data values across all databases. The burn databases examined show minimal evidence of common data. There is a need to develop CDEs and standardized coding to enhance interoperability of burn databases.
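    The overlap tally described above can be sketched with set operations over data dictionaries (the databases and element names below are invented for illustration, not the study's actual dictionaries):

```python
# Hypothetical data dictionaries: count elements collected by all databases
# and elements present in only one (the study's two overlap measures).
dictionaries = {
    "db1": {"age", "sex", "tbsa", "inhalation_injury", "los"},
    "db2": {"age", "sex", "tbsa", "ethnicity"},
    "db3": {"age", "sex", "tbsa", "inhalation_injury"},
}

all_elements = set().union(*dictionaries.values())
in_all = set.intersection(*dictionaries.values())          # collected everywhere
in_one = {e for e in all_elements
          if sum(e in d for d in dictionaries.values()) == 1}  # singletons
print(sorted(in_all), sorted(in_one))
```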

  14. The challenges of measuring quality-of-care indicators in rural emergency departments: a cross-sectional descriptive study.

    PubMed

    Layani, Géraldine; Fleet, Richard; Dallaire, Renée; Tounkara, Fatoumata K; Poitras, Julien; Archambault, Patrick; Chauny, Jean-Marc; Ouimet, Mathieu; Gauthier, Josée; Dupuis, Gilles; Tanguay, Alain; Lévesque, Jean-Frédéric; Simard-Racine, Geneviève; Haggerty, Jeannie; Légaré, France

    2016-01-01

    Evidence-based indicators of quality of care have been developed to improve care and performance in Canadian emergency departments. The feasibility of measuring these indicators has been assessed mainly in urban and academic emergency departments. We sought to assess the feasibility of measuring quality-of-care indicators in rural emergency departments in Quebec. We previously identified rural emergency departments in Quebec that offered medical coverage with hospital beds 24 hours a day, 7 days a week and were located in rural areas or small towns as defined by Statistics Canada. A standardized protocol was sent to each emergency department to collect data on 27 validated quality-of-care indicators in 8 categories: duration of stay, patient safety, pain management, pediatrics, cardiology, respiratory care, stroke and sepsis/infection. Data were collected by local professional medical archivists between June and December 2013. Fifteen (58%) of the 26 emergency departments invited to participate completed data collection. The ability to measure the 27 quality-of-care indicators with the use of databases varied across departments. Centres 2, 5, 6 and 13 used databases for at least 21 of the indicators (78%-92%), whereas centres 3, 8, 9, 11, 12 and 15 used databases for 5 (18%) or fewer of the indicators. On average, the centres were able to measure only 41% of the indicators using heterogeneous databases and manual extraction. The 15 centres collected data from 15 different databases or combinations of databases. The average data collection time for each quality-of-care indicator varied from 5 to 88.5 minutes. The median data collection time was 15 minutes or less for most indicators. Quality-of-care indicators were not easily captured with the use of existing databases in rural emergency departments in Quebec. 
Further work is warranted to improve standardized measurement of these indicators in rural emergency departments in the province and to generalize the information gathered in this study to other health care environments.

  15. Mechanical Properties and Simulated Aging of Silicone Maxillofacial Elastomers: Advancements in the Past 45 Years.

    PubMed

    Hatamleh, Muhanad M; Polyzois, Gregory L; Nuseir, Amjad; Hatamleh, Khaldoun; Alnazzawi, Ahmad

    2016-07-01

    To identify and discuss the findings of publications on mechanical behavior of maxillofacial prosthetic materials published since 1969. Original experimental articles reporting on mechanical properties of maxillofacial prosthetic materials were included. A two-stage search of the literature, electronic and hand search, identified relevant published studies up to May 2015. An extensive electronic search was conducted of databases including PubMed, Embase, Scopus, and Google Scholar. Included primary studies (n = 63) reported on tensile strength, tear strength, and hardness of maxillofacial prosthetic materials at baseline and after aging. The search revealed 63 papers, with more than 28 papers being published in the past 10 years, which shows an increased number of publications when compared to only 6 papers published in the 1970s. The increase is linear with significant correlation (r = 0.85). Such an increase reflects great awareness and continued developments and warrants more research in the field of maxillofacial prosthetic material properties; however, it is difficult to directly compare results, as studies varied in the maxillofacial prosthetic materials tested (with silicone elastomers being the most heavily investigated), the standards followed in preparing test specimens, the experimental testing protocols, and the parameters used to set simulated aging conditions. It is imperative to overcome the existing variability by establishing unified national or international standards/specifications for maxillofacial prosthetic materials. Standardization organizations or bodies, the scientific community, and academia need to be coordinated to achieve this goal. In the meantime and despite all of these theoretically significant alternatives, clinical practice still faces problems with serviceability of maxillofacial prostheses. © 2016 by the American College of Prosthodontists.

  16. Evaluation of Dogs with Border Collie Collapse, Including Response to Two Standardized Strenuous Exercise Protocols.

    PubMed

    Taylor, Susan; Shmon, Cindy; Su, Lillian; Epp, Tasha; Minor, Katie; Mickelson, James; Patterson, Edward; Shelton, G Diane

    2016-01-01

    Clinical and metabolic variables were evaluated in 13 dogs with border collie collapse (BCC) before, during, and following completion of standardized strenuous exercise protocols. Six dogs participated in a ball-retrieving protocol, and seven dogs participated in a sheep-herding protocol. Findings were compared with 16 normal border collies participating in the same exercise protocols (11 retrieving, five herding). Twelve dogs with BCC developed abnormal mentation and/or an abnormal gait during evaluation. All dogs had post-exercise elevations in rectal temperature, pulse rate, arterial blood pH, PaO2, and lactate, and decreased PaCO2 and bicarbonate, as expected with strenuous exercise, but there were no significant differences between BCC dogs and normal dogs. Electrocardiography demonstrated sinus tachycardia in all dogs following exercise. Needle electromyography was normal, and evaluation of muscle biopsy cryosections using a standard panel of histochemical stains and reactions did not reveal a reason for collapse in 10 dogs with BCC in which these tests were performed. Genetic testing excluded the dynamin-1 related exercise-induced collapse mutation and the V547A malignant hyperthermia mutation as the cause of BCC. Common reasons for exercise intolerance were eliminated. Although a genetic basis is suspected, the cause of collapse in BCC was not determined.

  17. Cryotherapy for acute ankle sprains: a randomised controlled study of two different icing protocols.

    PubMed

    Bleakley, C M; McDonough, S M; MacAuley, D C; Bjordal, J

    2006-08-01

    The use of cryotherapy in the management of acute soft tissue injury is largely based on anecdotal evidence. Preliminary evidence suggests that intermittent cryotherapy applications are most effective at reducing tissue temperature to optimal therapeutic levels. However, its efficacy in treating injured human subjects is not yet known. The objective was to compare the efficacy of an intermittent cryotherapy treatment protocol with a standard cryotherapy treatment protocol in the management of acute ankle sprains. Participants were sportsmen (n = 44) and members of the general public (n = 45) with mild/moderate acute ankle sprains. Subjects were randomly allocated, under strictly controlled double blind conditions, to one of two treatment groups: standard ice application (n = 46) or intermittent ice application (n = 43). The mode of cryotherapy was standardised across groups and consisted of melting iced water (0°C) in a standardised pack. Function, pain, and swelling were recorded at baseline and one, two, three, four, and six weeks after injury. Subjects treated with the intermittent protocol had significantly (p<0.05) less ankle pain on activity than those using a standard 20 minute protocol; however, one week after ankle injury, there were no significant differences between groups in terms of function, swelling, or pain at rest. Intermittent applications may enhance the therapeutic effect of ice in pain relief after acute soft tissue injury.

  18. The Effect of Changing Scan Mode on Trabecular Bone Score Using Lunar Prodigy.

    PubMed

    Chen, Weiwen; Slattery, Anthony; Center, Jacqueline; Pocock, Nicholas

    2016-10-01

    Trabecular bone score (TBS) is a measure of gray scale homogeneity that correlates with trabecular microarchitecture and is an independent predictor of fracture risk. TBS is being increasingly used in the assessment of patients at risk of osteoporosis and has recently been incorporated into FRAX®. GE Lunar machines acquire spine scans using 1 of 3 acquisition modes depending on abdominal tissue thickness (thin, standard, and thick). From a database review, 30 patients (mean body mass index: 30.8, range 26.2-34.1) were identified who had undergone lumbar spine DXA scans (GE Lunar Prodigy, software 14.10; Lunar Radiation Corporation, Madison, WI) in both standard mode and thick mode, on the same day with no repositioning. Lumbar spine bone mineral density (L1-L4) and TBS were derived from the 30 paired spine scans. There was no significant difference in lumbar spine bone mineral density between the 2 scanning modes. TBS values from the spine scans acquired in thick mode were, however, significantly higher than those derived from spine acquisitions in standard mode (mean TBS difference: 0.24 [20%], standard deviation ±0.10). In conclusion, these preliminary data suggest that TBS values acquired on the GE Lunar Prodigy are dependent on the scanning mode used. Further evaluation is required to confirm the cause and develop appropriate protocols. Copyright © 2016 International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.

  19. The importance of standardization for biodiversity comparisons: A case study using autonomous reef monitoring structures (ARMS) and metabarcoding to measure cryptic diversity on Mo’orea coral reefs, French Polynesia

    PubMed Central

    Geller, Jonathan B.; Timmers, Molly; Leray, Matthieu; Mahardini, Angka; Sembiring, Andrianus; Collins, Allen G.; Meyer, Christopher P.

    2017-01-01

    The advancement of metabarcoding techniques, declining costs of high-throughput sequencing and development of systematic sampling devices, such as autonomous reef monitoring structures (ARMS), have provided the means to gather a vast amount of diversity data from cryptic marine communities. However, such increased capability could also lead to analytical challenges if the methods used to examine these communities across local and global scales are not standardized. Here we compare and assess the underlying biases of four ARMS field processing methods, preservation media, and current bioinformatic pipelines in evaluating diversity from cytochrome c oxidase I metabarcoding data. Illustrating the ability of ARMS-based metabarcoding to capture a wide spectrum of biodiversity, 3,372 OTUs and twenty-eight phyla, including 17 of 33 marine metazoan phyla, were detected from 3 ARMS (2.607 m² area) collected on coral reefs in Mo’orea, French Polynesia. Significant differences were found between processing and preservation methods, demonstrating the need to standardize methods for biodiversity comparisons. We recommend the use of a standardized protocol (NOAA method) combined with DMSO preservation of tissues for sessile macroorganisms because it gave a more accurate representation of the underlying communities, is cost effective and removes chemical restrictions associated with sample transportation. We found that sequences identified at ≥ 97% similarity increased more than 7-fold (5.1% to 38.6%) using a geographically local barcode inventory, highlighting the importance of local species inventories. Phylogenetic approaches that assign higher taxonomic ranks accrued phylum identification errors (9.7%) due to sparse taxonomic coverage of the understudied cryptic coral reef community in public databases. However, a ≥ 85% sequence identity cut-off provided more accurate results (0.7% errors) and enabled phylum level identifications of 86.3% of the sequence reads.
With over 1600 ARMS deployed, standardizing methods and improving databases are imperative to provide unprecedented global baseline assessments of understudied cryptic marine species in a rapidly changing world. PMID:28430780
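    The effect of a sequence-identity cut-off on how many reads get identified can be sketched as follows. This is not the study's pipeline; the best-hit identity values and function name are invented for illustration:

```python
# A read is "identified" at a cut-off if its best database match meets or
# exceeds that percent identity. Values below are invented.
best_hit_identity = [99.1, 98.4, 96.0, 91.2, 88.7, 85.0, 80.3, 97.5]

def identified(identities, cutoff):
    """Number of reads whose best hit meets the identity cut-off."""
    return sum(i >= cutoff for i in identities)

species_level = identified(best_hit_identity, 97.0)  # strict cut-off
phylum_level  = identified(best_hit_identity, 85.0)  # relaxed cut-off
print(species_level, phylum_level)
```

    A relaxed cut-off identifies more reads, at the price of coarser (here, phylum-level) taxonomic resolution, which is the trade-off the study quantifies.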

  20. Quantitative and qualitative comparison of MR imaging of the temporomandibular joint at 1.5 and 3.0 T using an optimized high-resolution protocol

    PubMed Central

    Manoliu, Andrei; Spinner, Georg; Wyss, Michael; Erni, Stefan; Ettlin, Dominik A; Nanz, Daniel; Ulbrich, Erika J; Gallo, Luigi M; Andreisek, Gustav

    2016-01-01

    Objectives: To quantitatively and qualitatively compare MRI of the temporomandibular joint (TMJ) using an optimized high-resolution protocol at 3.0 T and a clinical standard protocol at 1.5 T. Methods: A phantom and 12 asymptomatic volunteers were MR imaged using a 2-channel surface coil (standard TMJ coil) at 1.5 and 3.0 T (Philips Achieva and Philips Ingenia, respectively; Philips Healthcare, Best, Netherlands). Imaging protocol consisted of coronal and oblique sagittal proton density-weighted turbo spin echo sequences. For quantitative evaluation, a spherical phantom was imaged. Signal-to-noise ratio (SNR) maps were calculated on a voxelwise basis. For qualitative evaluation, all volunteers underwent MRI of the TMJ with the jaw in closed position. Two readers independently assessed visibility and delineation of anatomical structures of the TMJ and overall image quality on a 5-point Likert scale. Quantitative and qualitative measurements were compared between field strengths. Results: The quantitative analysis showed similar SNR for the high-resolution protocol at 3.0 T compared with the clinical protocol at 1.5 T. The qualitative analysis showed significantly better visibility and delineation of clinically relevant anatomical structures of the TMJ, including the TMJ disc and pterygoid muscle as well as better overall image quality at 3.0 T than at 1.5 T. Conclusions: The presented results indicate that expected gains in SNR at 3.0 T can be used to increase the spatial resolution when imaging the TMJ, which translates into increased visibility and delineation of anatomical structures of the TMJ. Therefore, imaging at 3.0 T should be preferred over 1.5 T for imaging the TMJ. PMID:26371077

  1. Quantitative and qualitative comparison of MR imaging of the temporomandibular joint at 1.5 and 3.0 T using an optimized high-resolution protocol.

    PubMed

    Manoliu, Andrei; Spinner, Georg; Wyss, Michael; Erni, Stefan; Ettlin, Dominik A; Nanz, Daniel; Ulbrich, Erika J; Gallo, Luigi M; Andreisek, Gustav

    2016-01-01

    To quantitatively and qualitatively compare MRI of the temporomandibular joint (TMJ) using an optimized high-resolution protocol at 3.0 T and a clinical standard protocol at 1.5 T. A phantom and 12 asymptomatic volunteers were MR imaged using a 2-channel surface coil (standard TMJ coil) at 1.5 and 3.0 T (Philips Achieva and Philips Ingenia, respectively; Philips Healthcare, Best, Netherlands). Imaging protocol consisted of coronal and oblique sagittal proton density-weighted turbo spin echo sequences. For quantitative evaluation, a spherical phantom was imaged. Signal-to-noise ratio (SNR) maps were calculated on a voxelwise basis. For qualitative evaluation, all volunteers underwent MRI of the TMJ with the jaw in closed position. Two readers independently assessed visibility and delineation of anatomical structures of the TMJ and overall image quality on a 5-point Likert scale. Quantitative and qualitative measurements were compared between field strengths. The quantitative analysis showed similar SNR for the high-resolution protocol at 3.0 T compared with the clinical protocol at 1.5 T. The qualitative analysis showed significantly better visibility and delineation of clinically relevant anatomical structures of the TMJ, including the TMJ disc and pterygoid muscle as well as better overall image quality at 3.0 T than at 1.5 T. The presented results indicate that expected gains in SNR at 3.0 T can be used to increase the spatial resolution when imaging the TMJ, which translates into increased visibility and delineation of anatomical structures of the TMJ. Therefore, imaging at 3.0 T should be preferred over 1.5 T for imaging the TMJ.
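    A minimal sketch of an SNR estimate from two repeated acquisitions, using the difference method. The method choice is an assumption (the paper does not state its noise estimator), the signal values are invented, and a real SNR map would apply this per voxel or per region:

```python
# SNR from two repeated acquisitions (difference method): the signal is the
# mean of the two scans; the noise SD is the SD of their difference / sqrt(2),
# since subtracting cancels the static anatomy and sums the noise variances.
import statistics

scan1 = [100.0, 102.0, 98.0, 101.0]   # invented ROI intensities, acquisition 1
scan2 = [101.0, 100.0, 99.0, 102.0]   # same ROI, acquisition 2

mean_signal = statistics.mean(a + b for a, b in zip(scan1, scan2)) / 2
diff = [a - b for a, b in zip(scan1, scan2)]
noise_sd = statistics.stdev(diff) / 2 ** 0.5
snr = mean_signal / noise_sd
print(round(snr, 1))
```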

  2. The FREGAT biobank: a clinico-biological database dedicated to esophageal and gastric cancers.

    PubMed

    Mariette, Christophe; Renaud, Florence; Piessen, Guillaume; Gele, Patrick; Copin, Marie-Christine; Leteurtre, Emmanuelle; Delaeter, Christine; Dib, Malek; Clisant, Stéphanie; Harter, Valentin; Bonnetain, Franck; Duhamel, Alain; Christophe, Véronique; Adenis, Antoine

    2018-02-06

    While the incidence of esophageal and gastric cancers is increasing, the prognosis of these cancers remains bleak. Endoscopy and surgery are the standard treatments for localized tumors, but multimodal treatments combining chemotherapy, targeted therapies, immunotherapy, radiotherapy, and surgery are needed for the vast majority of patients who present with locally advanced or metastatic disease at diagnosis. Although survival has improved, most patients exhibit a poor or incomplete response to treatment, experience early recurrence, and have an impaired quality of life. Compared with several other cancers, the therapeutic approach is not personalized, and research is much less developed. It is, therefore, urgent to hasten the development of research protocols and, consequently, to develop a large, ambitious and innovative tool through which future scientific questions may be answered. This research must be patient-related so that rapid feedback to the bedside is achieved and should aim to identify clinical, biological and tumor-related factors that are associated with treatment resistance. Finally, this research should also seek to explain epidemiological and social facets of disease behavior. The prospective FREGAT database, established by the French National Cancer Institute, includes adult patients with carcinomas of the esophagus and stomach, whatever the tumor stage or therapeutic strategy. The database includes epidemiological, clinical, and tumor characteristics data as well as follow-up, human and social sciences quality of life data, along with a tumor and serum bank. This innovative method of research will allow for the banking of millions of data points for the development of excellent basic, translational and clinical research programs for esophageal and gastric cancer. This will ultimately improve general knowledge of these diseases, therapeutic strategies and patient survival. The database was initially developed in France on a nationwide basis, but it is now open to worldwide contributions, whether inputting patient data or requesting data for scientific projects. The FREGAT database has a dedicated website (www.fregat-database.org) and has been registered on Clinicaltrials.gov under number NCT02526095 since August 8, 2015.

  3. Assessment of phantom dosimetry and image quality of i-CAT FLX cone-beam computed tomography.

    PubMed

    Ludlow, John B; Walker, Cameron

    2013-12-01

    The increasing use of cone-beam computed tomography in orthodontics has been coupled with heightened concern about the long-term risks of x-ray exposure in orthodontic populations. An industry response to this has been to offer low-exposure alternative scanning options in newer cone-beam computed tomography models. Effective doses resulting from various combinations of field of view size and field location comparing child and adult anthropomorphic phantoms with the recently introduced i-CAT FLX cone-beam computed tomography unit (Imaging Sciences, Hatfield, Pa) were measured with optical stimulated dosimetry using previously validated protocols. Scan protocols included high resolution (360° rotation, 600 image frames, 120 kV[p], 5 mA, 7.4 seconds), standard (360°, 300 frames, 120 kV[p], 5 mA, 3.7 seconds), QuickScan (180°, 160 frames, 120 kV[p], 5 mA, 2 seconds), and QuickScan+ (180°, 160 frames, 90 kV[p], 3 mA, 2 seconds). Contrast-to-noise ratio was calculated as a quantitative measure of image quality for the various exposure options using the QUART DVT phantom. Child phantom doses were on average 36% greater than adult phantom doses. QuickScan+ protocols resulted in significantly lower doses than standard protocols for the child (P = 0.0167) and adult (P = 0.0055) phantoms. The 13 × 16-cm cephalometric fields of view ranged from 11 to 85 μSv in the adult phantom and 18 to 120 μSv in the child phantom for the QuickScan+ and standard protocols, respectively. The contrast-to-noise ratio was reduced by approximately two thirds when comparing QuickScan+ with standard exposure parameters. QuickScan+ effective doses are comparable with conventional panoramic examinations. Significant dose reductions are accompanied by significant reductions in image quality. However, this trade-off might be acceptable for certain diagnostic tasks such as interim assessment of treatment results. Copyright © 2013 American Association of Orthodontists. Published by Mosby, Inc. 
All rights reserved.
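    Contrast-to-noise ratio, the image-quality metric used in the record above, is straightforward to compute; a minimal sketch with invented region and noise samples (not the QUART DVT phantom's actual values):

```python
# CNR = |mean(region A) - mean(region B)| / noise SD.
import statistics

region_a = [120.0, 118.0, 122.0, 121.0]   # e.g. a high-density insert
region_b = [80.0, 82.0, 79.0, 81.0]       # e.g. background material
noise = [1.0, -1.5, 0.5, 2.0, -0.5, 1.5]  # background fluctuation samples

cnr = (abs(statistics.mean(region_a) - statistics.mean(region_b))
       / statistics.stdev(noise))
print(round(cnr, 1))
```

    Halving the dose roughly doubles the noise SD, so a two-thirds CNR loss at QuickScan+ settings is consistent with the much lower exposures reported.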

  4. Data Acceptance Criteria for Standardized Human-Associated Fecal Source Identification Quantitative Real-Time PCR Methods.

    PubMed

    Shanks, Orin C; Kelty, Catherine A; Oshiro, Robin; Haugland, Richard A; Madi, Tania; Brooks, Lauren; Field, Katharine G; Sivaganesan, Mano

    2016-05-01

    There is growing interest in the application of human-associated fecal source identification quantitative real-time PCR (qPCR) technologies for water quality management. The transition from a research tool to a standardized protocol requires a high degree of confidence in data quality across laboratories. Data quality is typically determined through a series of specifications that ensure good experimental practice and the absence of bias in the results due to DNA isolation and amplification interferences. However, there is currently a lack of consensus on how best to evaluate and interpret human fecal source identification qPCR experiments. This is, in part, due to the lack of standardized protocols and information on interlaboratory variability under conditions for data acceptance. The aim of this study is to provide users and reviewers with a complete series of conditions for data acceptance derived from a multiple laboratory data set using standardized procedures. To establish these benchmarks, data from HF183/BacR287 and HumM2 human-associated qPCR methods were generated across 14 laboratories. Each laboratory followed a standardized protocol utilizing the same lot of reference DNA materials, DNA isolation kits, amplification reagents, and test samples to generate comparable data. After removal of outliers, a nested analysis of variance (ANOVA) was used to establish proficiency metrics that include lab-to-lab, replicate testing within a lab, and random error for amplification inhibition and sample processing controls. Other data acceptance measurements included extraneous DNA contamination assessments (no-template and extraction blank controls) and calibration model performance (correlation coefficient, amplification efficiency, and lower limit of quantification). To demonstrate the implementation of the proposed standardized protocols and data acceptance criteria, comparable data from two additional laboratories were reviewed. 
The data acceptance criteria proposed in this study should help scientists, managers, reviewers, and the public evaluate the technical quality of future findings against an established benchmark. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  5. Accuracy of the clinical diagnosis of vaginitis compared with a DNA probe laboratory standard.

    PubMed

    Lowe, Nancy K; Neal, Jeremy L; Ryan-Wenger, Nancy A

    2009-01-01

    To estimate the accuracy of the clinical diagnosis of the three most common causes of acute vulvovaginal symptoms (bacterial vaginosis, candidiasis vaginitis, and trichomoniasis vaginalis) using a traditional, standardized clinical diagnostic protocol compared with a DNA probe laboratory standard. This prospective clinical comparative study had a sample of 535 active-duty United States military women presenting with vulvovaginal symptoms. Clinical diagnoses were made by research staff using a standardized protocol of history, physical examination including pelvic examination, determination of vaginal pH, vaginal fluid amines test, and wet-prep microscopy. Vaginal fluid samples were obtained for DNA analysis. The research clinicians were blinded to the DNA results. The participants described a presenting symptom of abnormal discharge (50%), itching/irritation (33%), malodor (10%), burning (4%), or others such as vulvar pain and vaginal discomfort. According to the laboratory standard, there were 225 cases (42%) of bacterial vaginosis, 76 cases (14%) of candidiasis vaginitis, 8 cases (1.5%) of trichomoniasis vaginalis, 87 cases of mixed infections (16%), and 139 negative cases (26%). For each single infection, the clinical diagnosis had a sensitivity and specificity of 80.8% and 70.0% for bacterial vaginosis, 83.8% and 84.8% for candidiasis vaginitis, and 84.6% and 99.6% for trichomoniasis vaginalis when compared with the DNA probe standard. Compared with a DNA probe standard, clinical diagnosis is 81-85% sensitive and 70-99% specific for bacterial vaginosis, Candida vaginitis, and trichomoniasis. Even under research conditions that provided clinicians with sufficient time and materials to conduct a thorough and standardized clinical evaluation, the diagnosis and, therefore, subsequent treatment of these common vaginal problems remains difficult. Level of evidence: II.
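    The sensitivity and specificity figures quoted above are simple functions of the 2×2 confusion counts against the DNA probe reference. A minimal sketch, using made-up counts for illustration rather than the study's data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP),
    with the laboratory (DNA probe) result as the reference standard."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts, not taken from the study:
sens, spec = sensitivity_specificity(tp=80, fn=20, tn=70, fp=30)
```

    Any of the per-infection figures in the abstract can be reproduced this way once the underlying 2×2 table is known.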

  6. Short communication: Evaluation of MALDI-TOF mass spectrometry and a custom reference spectra expanded database for the identification of bovine-associated coagulase-negative staphylococci.

    PubMed

    Cameron, M; Perry, J; Middleton, J R; Chaffer, M; Lewis, J; Keefe, G P

    2018-01-01

    This study evaluated MALDI-TOF mass spectrometry and a custom reference spectra expanded database for the identification of bovine-associated coagulase-negative staphylococci (CNS). A total of 861 CNS isolates were used in the study, covering 21 different CNS species. The majority of the isolates were previously identified by rpoB gene sequencing (n = 804) and the remainder were identified by sequencing of hsp60 (n = 56) and tuf (n = 1). The genotypic identification was considered the gold standard identification. Using a direct transfer protocol and the existing commercial database, MALDI-TOF mass spectrometry showed a typeability of 96.5% (831/861) and an accuracy of 99.2% (824/831). Using a custom reference spectra expanded database, which included an additional 13 in-house created reference spectra, isolates were identified by MALDI-TOF mass spectrometry with 99.2% (854/861) typeability and 99.4% (849/854) accuracy. Overall, MALDI-TOF mass spectrometry using the direct transfer method was shown to be a highly reliable tool for the identification of bovine-associated CNS. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  7. Software architecture of the Magdalena Ridge Observatory Interferometer

    NASA Astrophysics Data System (ADS)

    Farris, Allen; Klinglesmith, Dan; Seamons, John; Torres, Nicolas; Buscher, David; Young, John

    2010-07-01

    Merging software from 36 independent work packages into a coherent, unified software system with a lifespan of twenty years is the challenge faced by the Magdalena Ridge Observatory Interferometer (MROI). We solve this problem by using standardized interface software automatically generated from simple high-level descriptions of these systems, relying only on Linux, GNU, and POSIX without complex software such as CORBA. This approach, based on gigabit Ethernet with a TCP/IP protocol, provides the flexibility to integrate and manage diverse, independent systems using a centralized supervisory system that provides a database manager, data collectors, fault handling, and an operator interface.

  8. Use of cardiocerebral resuscitation or AHA/ERC 2005 Guidelines is associated with improved survival from out-of-hospital cardiac arrest: a systematic review and meta-analysis

    PubMed Central

    Salmen, Marcus; Ewy, Gordon A; Sasson, Comilla

    2012-01-01

    Objective To determine whether the use of cardiocerebral resuscitation (CCR) or AHA/ERC 2005 Resuscitation Guidelines improved patient outcomes from out-of-hospital cardiac arrest (OHCA) compared to older guidelines. Design Systematic review and meta-analysis. Data sources MEDLINE, EMBASE, Web of Science and the Cochrane Library databases. We also hand-searched study references and consulted experts. Study selection Design: randomised controlled trials and observational studies. Population OHCA patients, age >17 years. Comparators ‘Control’ protocol versus ‘Study’ protocol. ‘Control’ protocol defined as AHA/ERC 2000 Guidelines for cardiopulmonary resuscitation (CPR). ‘Study’ protocol defined as AHA/ERC 2005 Guidelines for CPR, or a CCR protocol. Outcome Survival to hospital discharge. Quality High-quality or medium-quality studies, as measured by the Newcastle Ottawa Scale using predefined categories. Results Twelve observational studies met inclusion criteria. All three studies using CCR demonstrated significantly improved survival compared to use of AHA 2000 Guidelines, as did five of the nine studies using AHA/ERC 2005 Guidelines. Pooled data demonstrate that use of a CCR protocol has an unadjusted OR of 2.26 (95% CI 1.64 to 3.12) for survival to hospital discharge among all cardiac arrest patients. Among witnessed ventricular fibrillation/ventricular tachycardia (VF/VT) patients, CCR increased survival by an OR of 2.98 (95% CI 1.92 to 4.62). Studies using AHA/ERC 2005 Guidelines showed an overall trend towards increased survival, but significant heterogeneity existed among these studies. Conclusions We demonstrate an association with improved survival from OHCA when CCR protocols or AHA/ERC 2005 Guidelines are compared to use of older guidelines. In the subgroup of patients with witnessed VF/VT, there was a threefold increase in OHCA survival when CCR was used.
CCR appears to be a promising resuscitation protocol for Emergency Medical Services providers in increasing survival from OHCA. Future research will need to be conducted to directly compare AHA/ERC 2010 Guidelines with the CCR approach. PMID:23036985
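    The unadjusted odds ratios with 95% confidence intervals reported above are standard 2×2-table quantities; a minimal sketch of the usual log-odds (Woolf) interval, with hypothetical counts rather than the pooled study data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with its Woolf 95% CI.
    a/b = survivors/deaths under the study protocol,
    c/d = survivors/deaths under the control protocol (hypothetical labels)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    log_or = math.log(or_)
    return or_, math.exp(log_or - z * se), math.exp(log_or + z * se)

# Hypothetical 2x2 table, not the meta-analysis data:
or_, lo, hi = odds_ratio_ci(a=20, b=80, c=10, d=90)
```

    Pooling across studies (as in the abstract) additionally requires weighting each study's log-OR, e.g. by inverse variance, which this sketch omits.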

  9. A Dynamic Approach to Make CDS/ISIS Databases Interoperable over the Internet Using the OAI Protocol

    ERIC Educational Resources Information Center

    Jayakanth, F.; Maly, K.; Zubair, M.; Aswath, L.

    2006-01-01

    Purpose: A dynamic approach to making legacy databases, like CDS/ISIS, interoperable with OAI-compliant digital libraries (DLs). Design/methodology/approach: There are many bibliographic databases that are being maintained using legacy database systems. CDS/ISIS is one such legacy database system. It was designed and developed specifically for…

  10. Uniform standards for genome databases in forest and fruit trees

    USDA-ARS?s Scientific Manuscript database

    TreeGenes and tfGDR serve the international forestry and fruit tree genomics research communities, respectively. These databases hold similar sequence data and provide resources for the submission and recovery of this information in order to enable comparative genomics research. Large-scale genotype...

  11. Favorable Geochemistry from Springs and Wells in Colorado

    DOE Data Explorer

    Richard E. Zehner

    2012-02-01

    This layer contains favorable geochemistry for high-temperature geothermal systems, as interpreted by Richard "Rick" Zehner. The data were compiled from USGS sources. The original data set combines 15,622 samples collected in the State of Colorado from several sources, including 1) the original Geotherm geochemical database, 2) USGS NWIS (National Water Information System), 3) Colorado Geological Survey geothermal sample data, and 4) original samples collected by R. Zehner at various sites during the 2011 field season. These samples are also available in a separate shapefile, FlintWaterSamples.shp. Data from all samples were reportedly collected using standard water sampling protocols (filtering through a 0.45-micron filter, etc.). Sample information was standardized to ppm (milligrams per liter) in spreadsheet columns. Commonly used cation and silica geothermometer temperature estimates are included.

  12. Comparing rapid methods for detecting Listeria in seafood and environmental samples using the most probable number (MPN) technique.

    PubMed

    Cruz, Cristina D; Win, Jessicah K; Chantarachoti, Jiraporn; Mutukumira, Anthony N; Fletcher, Graham C

    2012-02-15

    The standard Bacteriological Analytical Manual (BAM) protocol for detecting Listeria in food and on environmental surfaces takes about 96 h. Some studies indicate that rapid methods, which produce results within 48 h, may be as sensitive and accurate as the culture protocol. As they only give presence/absence results, it can be difficult to compare the accuracy of results generated. We used the Most Probable Number (MPN) technique to evaluate the performance and detection limits of six rapid kits for detecting Listeria in seafood and on an environmental surface compared with the standard protocol. Three seafood products and an environmental surface were inoculated with similar known cell concentrations of Listeria and analyzed according to the manufacturers' instructions. The MPN was estimated using the MPN-BAM spreadsheet. For the seafood products no differences were observed among the rapid kits and efficiency was similar to the BAM method. On the environmental surface the BAM protocol had a higher recovery rate (sensitivity) than any of the rapid kits tested. Clearview™, Reveal®, TECRA® and VIDAS® LDUO detected the cells but only at high concentrations (>10² CFU/10 cm²). Two kits (VIP™ and Petrifilm™) failed to detect 10⁴ CFU/10 cm². The MPN method was a useful tool for comparing the results generated by these presence/absence test kits. There remains a need to develop a rapid and sensitive method for detecting Listeria in environmental samples that performs as well as the BAM protocol, since none of the rapid tests used in this study achieved a satisfactory result. Copyright © 2011 Elsevier B.V. All rights reserved.
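    The MPN estimate used above comes from a dilution-series likelihood: each tube is positive if it received at least one organism, so the concentration maximizing the likelihood solves a simple score equation. A minimal sketch of the generic textbook maximum-likelihood MPN estimator (not the MPN-BAM spreadsheet itself), solved by bisection:

```python
import math

def mpn_mle(volumes, tubes, positives, lo=1e-6, hi=1e6, iters=200):
    """Maximum-likelihood MPN (organisms per unit volume) for a dilution
    series: volumes[i] = inoculum per tube at dilution i, tubes[i] = number
    of tubes, positives[i] = positive tubes. Generic estimator, offered as
    an illustration of the technique, not the BAM tool."""
    # Score equation: sum_i p_i*v_i*e^(-L*v_i)/(1-e^(-L*v_i)) = sum_i (n_i-p_i)*v_i
    rhs = sum((n - p) * v for v, n, p in zip(volumes, tubes, positives))
    def score(lam):
        lhs = sum(p * v * math.exp(-lam * v) / (1.0 - math.exp(-lam * v))
                  for v, p in zip(volumes, positives) if p > 0)
        return lhs - rhs
    for _ in range(iters):
        mid = math.sqrt(lo * hi)   # bisect in log space (score is decreasing)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Classic 3-tube series, 0.1/0.01/0.001 g per tube, pattern 3-1-0:
mpn = mpn_mle([0.1, 0.01, 0.001], [3, 3, 3], [3, 1, 0])
```

    For the 3-1-0 pattern this lands near the familiar tabulated value of about 43 organisms per gram.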

  13. A statistical view of FMRFamide neuropeptide diversity.

    PubMed

    Espinoza, E; Carrigan, M; Thomas, S G; Shaw, G; Edison, A S

    2000-01-01

    FMRFamide-like peptide (FLP) amino acid sequences have been collected and statistically analyzed. FLP amino acid composition as a function of position in the peptide is graphically presented for several major phyla. Results of total amino acid composition and frequencies of pairs of FLP amino acids have been computed and compared with corresponding values from the entire GenBank protein sequence database. The data for pairwise distributions of amino acids should help in future structure-function studies of FLPs. To aid in future peptide discovery, a computer program and search protocol were developed to identify FLPs from the GenBank protein database without the use of keywords.
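    The positional composition analysis described above amounts to counting residues aligned from the conserved C-terminus. A minimal sketch on made-up FLP-like sequences (not the paper's dataset):

```python
from collections import Counter, defaultdict

# Hypothetical FLP-like peptides for illustration only
peptides = ["FMRF", "FLRF", "KNEFIRF", "APGSLKTRF"]

# Composition by position, aligned at the C-terminus (position 0 = C-terminal residue)
by_pos = defaultdict(Counter)
for pep in peptides:
    for i, aa in enumerate(reversed(pep)):
        by_pos[i][aa] += 1

# Total amino acid composition across all peptides
total = Counter()
for pep in peptides:
    total.update(pep)
```

    Pairwise (adjacent-residue) frequencies follow the same pattern, counting `(pep[i], pep[i+1])` tuples instead of single residues.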

  14. Privacy preserving protocol for detecting genetic relatives using rare variants.

    PubMed

    Hormozdiari, Farhad; Joo, Jong Wha J; Wadia, Akshay; Guan, Feng; Ostrosky, Rafail; Sahai, Amit; Eskin, Eleazar

    2014-06-15

    High-throughput sequencing technologies have impacted many areas of genetic research. One such area is the identification of relatives from genetic data. The standard approach for the identification of genetic relatives collects the genomic data of all individuals and stores it in a database. Then, each pair of individuals is compared to detect the set of genetic relatives, and the matched individuals are informed. The main drawback of this approach is the requirement to share one's genetic data with a trusted third party in order to perform the relatedness test. In this work, we propose a secure protocol to detect genetic relatives from sequencing data without exposing any information about their genomes. We assume that individuals have access to their genome sequences but do not want to share their genomes with anyone else. Unlike previous approaches, our approach uses both common and rare variants, which provides the ability to detect much more distant relationships securely. We used simulated data generated from the 1000 Genomes data and illustrate that we can easily detect up to fifth-degree cousins, which was not possible using existing methods. We also show that our method can detect individuals with cryptic relationships in the 1000 Genomes data. The software is freely available for download at http://genetics.cs.ucla.edu/crypto/. © The Author 2014. Published by Oxford University Press.
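    The intuition behind using rare variants is that sharing a low-frequency allele is strong evidence of common ancestry. A toy sketch of that non-private core signal (hypothetical helper names and variant keys; the paper's actual contribution is computing such a comparison under cryptographic protection, which this omits):

```python
def rare_sharing_score(variants_a, variants_b, pop_freq, max_freq=0.01):
    """Count rare variants (population frequency < max_freq) carried by
    both individuals -- a toy relatedness signal, not the paper's secure
    protocol, which hides the genotypes from all parties."""
    def rare(variants):
        return {v for v in variants if pop_freq.get(v, 1.0) < max_freq}
    return len(rare(variants_a) & rare(variants_b))

# Hypothetical variant keys and population frequencies:
pop_freq = {"chr1:123:A": 0.002, "chr2:456:T": 0.40, "chr3:789:G": 0.005}
score = rare_sharing_score({"chr1:123:A", "chr2:456:T"},
                           {"chr1:123:A", "chr3:789:G"}, pop_freq)
```

    A higher score suggests closer relatedness; common variants like the 40%-frequency allele above are excluded because unrelated individuals share them routinely.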

  15. [Review of digital ground object spectral library].

    PubMed

    Zhou, Xiao-Hu; Zhou, Ding-Wu

    2009-06-01

    Higher spectral resolution is the main direction in the development of remote sensing technology, and building digital ground-object reflectance spectral libraries is a fundamental research task for remote sensing applications. Remote sensing applications rely increasingly on ground-object spectral characteristics, and quantitative analysis has advanced to a new stage. This article summarizes and systematically introduces the current status and development trends of digital ground-object reflectance spectral libraries in China and abroad in recent years. Spectral libraries that have been established include those for desertification, plants, geology, soils, minerals, clouds, snow, the atmosphere, rocks, water, meteorites, lunar rocks, man-made materials, mixtures, volatile compounds, and liquids. Several problems have arisen in establishing these libraries, such as the lack of a uniform national spectral database standard, the lack of uniform standards for ground-object features, and limited comparability between different databases. In addition, no data-sharing mechanism has been implemented. This article also puts forward suggestions on these problems.

  16. Dual-layer DECT for multiphasic hepatic CT with 50 percent iodine load: a matched-pair comparison with a 120 kVp protocol.

    PubMed

    Nagayama, Yasunori; Nakaura, Takeshi; Oda, Seitaro; Utsunomiya, Daisuke; Funama, Yoshinori; Iyama, Yuji; Taguchi, Narumi; Namimoto, Tomohiro; Yuki, Hideaki; Kidoh, Masafumi; Hirata, Kenichiro; Nakagawa, Masataka; Yamashita, Yasuyuki

    2018-04-01

    To evaluate the image quality and lesion conspicuity of virtual monochromatic imaging (VMI) with dual-layer DECT (DL-DECT) for reduced-iodine-load multiphasic hepatic CT. Forty-five adults with renal dysfunction who had undergone hepatic DL-DECT with 300 mgI/kg were included. VMI (40-70 keV, DL-DECT-VMI) was generated at each enhancement phase. As controls, 45 matched patients undergoing a standard 120-kVp protocol (120 kVp, 600 mgI/kg, and iterative reconstruction) were included. We compared the size-specific dose estimate (SSDE), image noise, CT attenuation, and contrast-to-noise ratio (CNR) between protocols. Two radiologists scored image quality and lesion conspicuity. SSDE was significantly lower in the DL-DECT group (p < 0.01). Image noise of DL-DECT-VMI was almost constant at each keV (differences of ≤15%) and equivalent to or lower than that of the 120-kVp protocol. As the energy decreased, CT attenuation and CNR gradually increased; the values of 55-60 keV images were almost equivalent to those of the standard 120-kVp protocol. The highest scores for overall quality and lesion conspicuity were assigned at 40 keV, followed by 45 to 55 keV, all of which were similar to or better than those of the 120-kVp protocol. For multiphasic hepatic CT with 50% iodine load, DL-DECT-VMI at 40-55 keV provides equivalent or better image quality and lesion conspicuity without increasing radiation dose compared with the standard 120-kVp protocol. • 40-55 keV yields optimal image quality for half-iodine-load multiphasic hepatic CT with DL-DECT. • The DL-DECT protocol decreases radiation exposure compared with 120-kVp scans with iterative reconstruction. • 40-keV images maximise conspicuity of hepatocellular carcinoma, especially at the hepatic arterial phase.

  17. Performance Comparison of Wireless Sensor Network Standard Protocols in an Aerospace Environment: ISA100.11a and ZigBee Pro

    NASA Technical Reports Server (NTRS)

    Wagner, Raymond S.; Barton, Richard J.

    2011-01-01

    Standards-based wireless sensor network (WSN) protocols are promising candidates for spacecraft avionics systems, offering unprecedented instrumentation flexibility and expandability. Ensuring reliable data transport is key, however, when migrating from wired to wireless data gathering systems. In this paper, we conduct a rigorous laboratory analysis of the relative performances of the ZigBee Pro and ISA100.11a protocols in a representative crewed aerospace environment. Since both operate in the 2.4 GHz radio frequency (RF) band shared by systems such as Wi-Fi, they are subject at times to potentially debilitating RF interference. We compare goodput (application-level throughput) achievable by both under varying levels of 802.11g Wi-Fi traffic. We conclude that while the simpler, more inexpensive ZigBee Pro protocol performs well under moderate levels of interference, the more complex and costly ISA100.11a protocol is needed to ensure reliable data delivery under heavier interference. This paper represents the first published, rigorous analysis of WSN protocols in an aerospace environment that we are aware of and the first published head-to-head comparison of ZigBee Pro and ISA100.11a.

  18. Model-based Iterative Reconstruction: Effect on Patient Radiation Dose and Image Quality in Pediatric Body CT

    PubMed Central

    Dillman, Jonathan R.; Goodsitt, Mitchell M.; Christodoulou, Emmanuel G.; Keshavarzi, Nahid; Strouse, Peter J.

    2014-01-01

    Purpose To retrospectively compare image quality and radiation dose between a reduced-dose computed tomographic (CT) protocol that uses model-based iterative reconstruction (MBIR) and a standard-dose CT protocol that uses 30% adaptive statistical iterative reconstruction (ASIR) with filtered back projection. Materials and Methods Institutional review board approval was obtained. Clinical CT images of the chest, abdomen, and pelvis obtained with a reduced-dose protocol were identified. Images were reconstructed with two algorithms: MBIR and 100% ASIR. All subjects had undergone standard-dose CT within the prior year, and the images were reconstructed with 30% ASIR. Reduced- and standard-dose images were evaluated objectively and subjectively. Reduced-dose images were evaluated for lesion detectability. Spatial resolution was assessed in a phantom. Radiation dose was estimated by using volumetric CT dose index (CTDIvol) and calculated size-specific dose estimates (SSDE). A combination of descriptive statistics, analysis of variance, and t tests was used for statistical analysis. Results In the 25 patients who underwent the reduced-dose protocol, mean decrease in CTDIvol was 46% (range, 19%–65%) and mean decrease in SSDE was 44% (range, 19%–64%). Reduced-dose MBIR images had less noise (P < .004). Spatial resolution was superior for reduced-dose MBIR images. Reduced-dose MBIR images were equivalent to standard-dose images for lungs and soft tissues (P > .05) but were inferior for bones (P = .004). Reduced-dose 100% ASIR images were inferior for soft tissues (P < .002), lungs (P < .001), and bones (P < .001). By using the same reduced-dose acquisition, lesion detectability was better (38% [32 of 84 rated lesions]) or the same (62% [52 of 84 rated lesions]) with MBIR as compared with 100% ASIR. Conclusion CT performed with a reduced-dose protocol and MBIR is feasible in the pediatric population, and it maintains diagnostic quality.
© RSNA, 2013 Online supplemental material is available for this article. PMID:24091359

  19. Reusable single-port access device shortens operative time and reduces operative costs.

    PubMed

    Shussman, Noam; Kedar, Asaf; Elazary, Ram; Abu Gazala, Mahmoud; Rivkind, Avraham I; Mintz, Yoav

    2014-06-01

    In recent years, single-port laparoscopy (SPL) has become an attractive approach for performing surgical procedures. The pitfalls of this approach are technical and financial. Financial concerns are due to the increased cost of dedicated devices and prolonged operating room time. Our aim was to calculate the cost of SPL using a reusable port and instruments in order to evaluate the cost difference between this approach and SPL using the available disposable ports, as well as standard laparoscopy. We performed 22 laparoscopic procedures via the SPL approach using a reusable single-port access system and reusable laparoscopic instruments. These included 17 cholecystectomies and five other procedures. Operative time, postoperative length of stay (LOS) and complications were prospectively recorded and were compared with similar data from our SPL database. Student's t test was used for statistical analysis. SPL was successfully performed in all cases. Mean operative time for cholecystectomy was 72 min (range 40-116). Postoperative LOS was not changed from our standard protocols and was 1.1 days for cholecystectomy. The postoperative course was within normal limits for all patients, and no perioperative morbidity was recorded. Both operative time and length of hospital stay were shorter for the 17 patients who underwent cholecystectomy using a reusable port than for the matched previous 17 SPL cholecystectomies we performed (p < 0.001). Prices of disposable SPL instruments and multiport access devices as well as extraction bags from different manufacturers were used to calculate the cost difference. Operating with a reusable port yielded an average cost saving of US$388 compared with using disposable ports, and US$240 compared with standard laparoscopy. Single-port laparoscopic surgery is a technically challenging and expensive surgical approach.
Financial concerns among others have been advocated against this approach; however, we demonstrate herein that using a reusable port and instruments reduces operative time and overall operative costs, even beyond the cost of standard laparoscopy.

  20. Refining animal models in fracture research: seeking consensus in optimising both animal welfare and scientific validity for appropriate biomedical use

    PubMed Central

    Auer, Jorg A; Goodship, Allen; Arnoczky, Steven; Pearce, Simon; Price, Jill; Claes, Lutz; von Rechenberg, Brigitte; Hofmann-Amtenbrinck, Margarethe; Schneider, Erich; Müller-Terpitz, R; Thiele, F; Rippe, Klaus-Peter; Grainger, David W

    2007-01-01

    Background In an attempt to establish some consensus on the proper use and design of experimental animal models in musculoskeletal research, AOVET (the veterinary specialty group of the AO Foundation) in concert with the AO Research Institute (ARI), and the European Academy for the Study of Scientific and Technological Advance, convened a group of musculoskeletal researchers, veterinarians, legal experts, and ethicists to discuss, in a frank and open forum, the use of animals in musculoskeletal research. Methods The group narrowed the field to fracture research. The consensus opinion resulting from this workshop can be summarized as follows: Results & Conclusion Anaesthesia and pain management protocols for research animals should follow standard protocols applied in clinical work for the species involved. This will improve morbidity and mortality outcomes. A database should be established to facilitate selection of anaesthesia and pain management protocols for specific experimental surgical procedures and adopted as an International Standard (IS) according to the animal species selected. A list of 10 golden rules and requirements for the conduct of animal experiments in musculoskeletal research was drawn up, comprising: 1) Intelligent study designs to obtain appropriate answers; 2) Minimal complication rates (5 to max. 10%); 3) Defined end-points for both welfare and scientific outputs analogous to quality assessment (QA) audit of protocols in GLP studies; 4) Sufficient details for materials and methods applied; 5) Potentially confounding variables (genetic background, seasonal, hormonal, size, histological, and biomechanical differences); 6) Post-operative management with emphasis on analgesia and follow-up examinations; 7) Study protocols to satisfy criteria established for a "justified animal study"; 8) Surgical expertise to conduct surgery on animals; 9) Pilot studies as a critical part of model validation and powering of the definitive study design; 10) Criteria for funding agencies to include requirements related to animal experiments as part of the overall scientific proposal review protocols. Such agencies are also encouraged to seriously consider and adopt the recommendations described here when awarding funds for specific projects. Specific new requirements and mandates related both to improving the welfare and scientific rigour of animal-based research models are urgently needed as part of international harmonization of standards. PMID:17678534

  1. Protocol adherence and the ability to achieve target haemoglobin levels in haemodialysis patients.

    PubMed

    Chan, Kevin; Moran, John; Hlatky, Mark; Lafayette, Richard

    2009-06-01

    Anemia management remains complicated in patients with end-stage renal disease on hemodialysis. We wished to evaluate the effect of protocol adherence to EPO and intravenous iron dosing on achieving the desired range of hemoglobin levels. A cohort of hemodialysis patients was studied to evaluate the rate of adherence to EPO and iron dosing protocols over a 5-month period. A database was completed to evaluate all known comorbidities, demographic factors, and facility issues that might affect hemoglobin levels. A logistic regression model was employed to evaluate the effect of adherence to the anemia protocols on the probability of achieving a hemoglobin level below, within or above the targeted range of 11-12.5 g/dl. Among 2114 patients, we found that adherence to both the EPO and iron dosing protocols resulted in the greatest probability of achieving the target hemoglobin range (56 +/- 5% in anemia-protocol-adherent patients versus 42 +/- 7% in non-adherent patients). This was predominantly due to a lowered risk of having above-target hemoglobin levels rather than below. The use of the anemia protocols was associated with lower rates of hospitalization (9 +/- 0.7 visits/100 months in the adherent group vs 15 +/- 2 in the non-adherent group) and lower utilization of both EPO and intravenous iron. Furthermore, patients in the adherent groups had less variability of their hemoglobin levels month by month, at least as judged by standard deviation. Adherence to anemia protocols, as practiced in the dialysis units included in this cohort, may improve hemodialysis patients' ability to achieve target hemoglobin levels and, by avoiding above-target hemoglobin values, lower drug utilization and reduce variability of hemoglobin levels.
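    The "within target" outcome above is simply the fraction of hemoglobin values falling inside 11-12.5 g/dl, and month-to-month variability was judged by the standard deviation. A minimal sketch with hypothetical monthly values (not study data; the paper's analysis additionally used logistic regression, which this omits):

```python
from statistics import pstdev

TARGET_LO, TARGET_HI = 11.0, 12.5   # g/dl target range from the protocol

def in_range_rate(hgb):
    """Fraction of hemoglobin values inside the target range."""
    return sum(TARGET_LO <= h <= TARGET_HI for h in hgb) / len(hgb)

# Hypothetical monthly values for one patient:
monthly = [10.8, 11.4, 12.0, 12.9, 11.9]
rate = in_range_rate(monthly)   # fraction of months within 11-12.5 g/dl
variability = pstdev(monthly)   # month-to-month variability as an SD
```

    Comparing these two summaries between adherent and non-adherent groups reproduces the shape of the comparison reported in the abstract.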

  2. GENERIC VERIFICATION PROTOCOL FOR DETERMINATION OF EMISSIONS REDUCTIONS OBTAINED BY USE OF ALTERNATIVE OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSIONS AND LUBRICANTS FOR HIGHWAY AND NONROAD USE DIESEL ENGINES AND LIGHT DUTY GASOLINE ENGINES AND VEHICLES

    EPA Science Inventory

    This report sets standards by which the emissions reduction provided by fuel and lubricant technologies can be tested in a comparable way. It is a generic protocol under the Environmental Technology Verification program.

  3. Optimized human platelet lysate as novel basis for a serum-, xeno-, and additive-free corneal endothelial cell and tissue culture.

    PubMed

    Thieme, Daniel; Reuland, Lynn; Lindl, Toni; Kruse, Friedrich; Fuchsluger, Thomas

    2018-02-01

    The expansion of donor-derived corneal endothelial cells (ECs) is a promising approach for regenerative therapies in corneal diseases. To achieve the best Good Manufacturing Practice standard, the entire cultivation process should be devoid of nonhuman components. However, so far, there is no suitable xeno-free protocol for clinical applications. We therefore introduce a processed variant of a platelet lysate for use in corneal cell and tissue culture based on a Good Manufacturing Practice-grade thrombocyte concentrate. This processed human platelet lysate (phPL), free of any animal components and of anticoagulants such as heparin, with a physiological ionic composition, was used to cultivate corneal ECs in vitro and ex vivo in comparison to standard cultivation with fetal calf serum (FCS). Human donor corneas were cut into quarters, and 2 quarters of each cornea were incubated with the respective medium supplement. Three fields of view per quarter were taken into account for the analysis. Evaluation of phPL as a medium supplement in culture of immortalized ECs showed superior viability compared with the FCS control, with reduced cell proliferation. Furthermore, the viability during the expansion of primary cells was significantly (3-fold ±0.5) increased with phPL compared with FCS standard medium. Quartering donor corneas was traumatic for the endothelium and therefore resulted in increased EC loss. Interestingly, however, cultivation of the quartered pieces for 2 weeks in 0.1-mg/ml phPL in Biochrome I showed a 21 (±10)% EC loss compared with 67 (±12)% EC loss when cultivated in 2% FCS in Biochrome I. The cell culture protocol with phPL as FCS replacement appears superior to the standard FCS protocols with respect to EC survival. It offers a xeno-free and physiological environment for corneal endothelial cells. This alternative cultivation protocol could facilitate the use of ECs for human corneal cell therapy. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Toward the Standardization of Mitochondrial Proteomics: The Italian Mitochondrial Human Proteome Project Initiative.

    PubMed

    Alberio, Tiziana; Pieroni, Luisa; Ronci, Maurizio; Banfi, Cristina; Bongarzone, Italia; Bottoni, Patrizia; Brioschi, Maura; Caterino, Marianna; Chinello, Clizia; Cormio, Antonella; Cozzolino, Flora; Cunsolo, Vincenzo; Fontana, Simona; Garavaglia, Barbara; Giusti, Laura; Greco, Viviana; Lucacchini, Antonio; Maffioli, Elisa; Magni, Fulvio; Monteleone, Francesca; Monti, Maria; Monti, Valentina; Musicco, Clara; Petrosillo, Giuseppe; Porcelli, Vito; Saletti, Rosaria; Scatena, Roberto; Soggiu, Alessio; Tedeschi, Gabriella; Zilocchi, Mara; Roncada, Paola; Urbani, Andrea; Fasano, Mauro

    2017-12-01

    The Mitochondrial Human Proteome Project aims at understanding the function of the mitochondrial proteome and its crosstalk with the proteome of other organelles. Being able to choose a suitable and validated enrichment protocol of functional mitochondria, based on the specific needs of the downstream proteomics analysis, would greatly help researchers in the field. Mitochondrial fractions from ten model cell lines were prepared using three enrichment protocols and analyzed on seven different LC-MS/MS platforms. All data were processed using neXtProt as the reference database. The data are available for the Human Proteome Project purposes through the ProteomeXchange Consortium with the identifier PXD007053. The processed data sets were analyzed using a suite of R routines to perform a statistical analysis and to retrieve subcellular and submitochondrial localizations. Although the overall number of identified total and mitochondrial proteins was not significantly dependent on the enrichment protocol, specific line-to-line differences were observed. Moreover, the protein lists were mapped to a network representing the functional mitochondrial proteome, encompassing mitochondrial proteins and their first interactors. More than 80% of the identified proteins mapped to nodes of this network, although each enrichment protocol/cell line pair differed in its ability to coisolate mitochondria-associated structures.

  5. OVERSEER: An Expert System Monitor for the Psychiatric Hospital

    PubMed Central

    Bronzino, Joseph D.; Morelli, Ralph A.; Goethe, John W.

    1988-01-01

    In order to improve patient care, comply with regulatory guidelines, and decrease potential liability, psychiatric hospitals and clinics have been searching for computer systems to monitor the management and treatment of patients. This paper describes OVERSEER: a knowledge-based system that monitors the treatment of psychiatric patients in real time. Based on procedures and protocols developed in the psychiatric setting, OVERSEER monitors the clinical database and issues alerts when standard clinical practices are not followed or when laboratory results or other clinical indicators are abnormal. Written in PROLOG, OVERSEER is designed to interface directly with the hospital's database and thereby utilizes all available pharmacy and laboratory data. Moreover, unlike the interactive expert systems developed for the psychiatric clinic, OVERSEER does not require extensive data entry by the clinician. Consequently, the chief benefit of OVERSEER's monitoring approach is the unobtrusive manner in which it evaluates treatment and patient responses and provides information regarding patient management.

  6. South African Research Ethics Committee Review of Standards of Prevention in HIV Vaccine Trial Protocols.

    PubMed

    Essack, Zaynab; Wassenaar, Douglas R

    2018-04-01

    HIV prevention trials provide a prevention package to participants to help prevent HIV acquisition. As new prevention methods are proven effective, this raises ethical and scientific design complexities regarding the prevention package or standard of prevention. Given its high HIV incidence and prevalence, South Africa has become a hub for HIV prevention research. For this reason, it is critical to study the implementation of relevant ethical-legal frameworks for such research in South Africa. This qualitative study used in-depth interviews to explore the practices and perspectives of eight members of South African research ethics committees (RECs) who have reviewed protocols for HIV vaccine trials. Their practices and perspectives are compared with ethics guideline requirements for standards of prevention.

  7. Standardization of Nanoparticle Characterization: Methods for Testing Properties, Stability, and Functionality of Edible Nanoparticles.

    PubMed

    McClements, Jake; McClements, David Julian

    2016-06-10

    There has been a rapid increase in the fabrication of various kinds of edible nanoparticles for oral delivery of bioactive agents, such as those constructed from proteins, carbohydrates, lipids, and/or minerals. It is currently difficult to compare the relative advantages and disadvantages of different kinds of nanoparticle-based delivery systems because researchers use different analytical instruments and protocols to characterize them. In this paper, we briefly review the various analytical methods available for characterizing the properties of edible nanoparticles, such as composition, morphology, size, charge, physical state, and stability. This information is then used to propose a number of standardized protocols for characterizing nanoparticle properties, for evaluating their stability to environmental stresses, and for predicting their biological fate. Implementation of these protocols would facilitate comparison of the performance of nanoparticles under standardized conditions, which would facilitate the rational selection of nanoparticle-based delivery systems for different applications in the food, health care, and pharmaceutical industries.

  8. Comparing Effective Doses During Image-Guided Core Needle Biopsies with Computed Tomography Versus C-Arm Cone Beam CT Using Adult and Pediatric Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ben-Shlomo, A.; Cohen, D.; Bruckheimer, E.

    Purpose: To compare the effective doses of needle biopsies, based on dose measurements and simulations using adult and pediatric phantoms, between cone beam c-arm CT (CBCT) and CT. Method: Effective doses were calculated and compared based on measurements and Monte Carlo simulations of CT- and CBCT-guided biopsy procedures of the lungs, liver, and kidney using pediatric and adult phantoms. Results: The effective doses for pediatric and adult phantoms, using our standard protocols for upper, middle, and lower lung, liver, and kidney biopsies, were significantly lower under CBCT guidance than CT. The average effective dose for a 5-year-old for these five biopsies was 0.36 ± 0.05 mSv with the standard CBCT exposure protocols and 2.13 ± 0.26 mSv with CT. The adult average effective dose for the five biopsies was 1.63 ± 0.22 mSv with the standard CBCT protocols and 8.22 ± 1.02 mSv using CT. The CT effective dose was higher than CBCT protocols for child and adult phantoms by 803 and 590% for upper lung, 639 and 525% for mid-lung, and 461 and 251% for lower lung, respectively. Similarly, the effective dose was higher by 691 and 762% for liver and 513 and 608% for kidney biopsies. Conclusions: Based on measurements and simulations with pediatric and adult phantoms, radiation effective doses during image-guided needle biopsies of the lung, liver, and kidney are significantly lower with CBCT than with CT.

  9. Standardization of Keyword Search Mode

    ERIC Educational Resources Information Center

    Su, Di

    2010-01-01

    In spite of its popularity, keyword search mode has not been standardized. Though information professionals are quick to adapt to various presentations of keyword search mode, novice end-users may find keyword search confusing. This article compares keyword search mode in some major reference databases and calls for standardization.

  10. A clinical pathway for the postoperative management of hypocalcemia after pediatric thyroidectomy reduces blood draws.

    PubMed

    Patel, Neha A; Bly, Randall A; Adams, Seth; Carlin, Kristen; Parikh, Sanjay R; Dahl, John P; Manning, Scott

    2018-02-01

    Postoperative calcium management is challenging following pediatric thyroidectomy given potential limitations in self-reporting symptoms and compliance with phlebotomy. A protocol was created at our tertiary children's institution utilizing intraoperative parathyroid hormone (PTH) levels to guide electrolyte management during hospitalization. The objective of this study was to determine the effect of a new thyroidectomy postoperative management protocol on two primary outcomes: (1) the number of postoperative calcium blood draws and (2) the length of hospital stay. This was an institutional review board-approved retrospective study (2010-2016). Consecutive pediatric total thyroidectomy and completion thyroidectomy ± neck dissection cases from 1/1/2010 through 8/5/2016 at a single tertiary children's institution were retrospectively reviewed before and after initiation of a new management protocol. All cases after 2/1/2014 comprised the experimental group (post-protocol implementation). The pre-protocol control group consisted of cases prior to 2/1/2014. Multivariable linear and Poisson regression models were used to compare the control and experimental groups for the outcome measures of number of calcium lab draws and hospital length of stay. 53 patients were included (n = 23, control group; n = 30, experimental group). The median age was 15 years. 41 patients (77.4%) were female. Postoperative calcium draws decreased from a mean of 5.2 to 3.6 per day post-protocol implementation (rate ratio = 0.70, p < .001), adjusting for covariates. The mean number of total inpatient calcium draws before protocol initiation was 13.3 (±13.20) compared to 7.2 (±4.25) in the post-protocol implementation group. Length of stay was 2.1 days in the control group and 1.8 days post-protocol implementation (p = .29). Patients who underwent concurrent neck dissection had a longer mean length of stay of 2.32 days compared to 1.66 days in those patients who did not undergo a neck dissection (p = .02).
Hypocalcemia was also associated with a longer mean length of stay of 2.41 days compared to 1.60 days in patients who did not develop hypocalcemia (p < .01). The number of calcium blood draws was significantly reduced after introduction of a standardized protocol based on intraoperative PTH levels. The hospital length of stay did not change. Adoption of a standardized postoperative protocol based on intraoperative PTH levels may reduce the number of blood draws in children undergoing thyroidectomy. Copyright © 2017 Elsevier B.V. All rights reserved.
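    A quick sanity check on the reported rate ratio, using only the daily means quoted above (the published 0.70 is covariate-adjusted from the Poisson regression, so this unadjusted sketch is only illustrative):

```python
# Unadjusted rate ratio implied by the reported means. The paper's 0.70 is
# covariate-adjusted, so this back-of-envelope value is expected to be close
# but not identical.
mean_draws_control = 5.2    # calcium draws per day, pre-protocol
mean_draws_protocol = 3.6   # calcium draws per day, post-protocol
rate_ratio = mean_draws_protocol / mean_draws_control
print(round(rate_ratio, 2))  # 0.69, consistent with the adjusted 0.70
```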

  11. [Protocol for the treatment of severe acute pancreatitis with necrosis].

    PubMed

    Barreda, Luis; Targarona, Javier; Rodriguez, César

    2005-01-01

    The Severe Acute Pancreatitis Unit of the Edgardo Rebagliati Martins National Hospital was officially created in the year 2000. To date, we have cared for more than 195 patients with pancreatic necrosis, all of whom have been treated under a management protocol developed by us. This has helped us to standardize treatment and to compare results with working groups around the world. The protocol draws on our own experience and on that of colleagues abroad with wide knowledge of this kind of pathology, with whom we maintain close ties.

  12. Design and Development of a Technology Platform for DNA-Encoded Library Production and Affinity Selection.

    PubMed

    Castañón, Jesús; Román, José Pablo; Jessop, Theodore C; de Blas, Jesús; Haro, Rubén

    2018-06-01

    DNA-encoded libraries (DELs) have emerged as an efficient and cost-effective drug discovery tool for the exploration and screening of very large chemical space using small-molecule collections of unprecedented size. Herein, we report an integrated automation and informatics system designed to enhance the quality, efficiency, and throughput of the production and affinity selection of these libraries. The platform is governed by software developed according to a database-centric architecture to ensure data consistency, integrity, and availability. Through its versatile protocol management functionalities, this application captures the wide diversity of experimental processes involved with DEL technology, keeps track of working protocols in the database, and uses them to command robotic liquid handlers for the synthesis of libraries. This approach provides full traceability of building blocks and DNA tags in each split-and-pool cycle. Affinity selection experiments and high-throughput sequencing reads are also captured in the database, and the results are automatically deconvoluted and visualized in customizable representations. Researchers can compare results of different experiments and use machine learning methods to discover patterns in the data. As of this writing, the platform has been validated through the generation and affinity selection of various libraries, and it has become the cornerstone of the DEL production effort at Lilly.

  13. Retrospective analysis supports algorithm as efficient diagnostic approach to treatable intellectual developmental disabilities.

    PubMed

    Sayson, Bryan; Popurs, Marioara Angela Moisa; Lafek, Mirafe; Berkow, Ruth; Stockler-Ipsiroglu, Sylvia; van Karnebeek, Clara D M

    2015-05-01

    Intellectual developmental disorders (IDD), characterized by a significant impairment in cognitive function and behavior, affect 2.5% of the population and are associated with considerable morbidity and healthcare costs. Inborn errors of metabolism (IEM) currently constitute the largest group of genetic defects presenting with IDD which are amenable to causal therapy. Recently, we created an evidence-based 2-tiered diagnostic protocol (TIDE protocol); the first tier is a 'screening step' applied in all patients, comprising routinely performed, widely available metabolic tests in blood and urine, while second-tier tests are more specific and based on the patient's phenotype. The protocol is supported by an app (www.treatable-ID.org). Our aim was to retrospectively examine the cost- and time-effectiveness of the TIDE protocol in patients identified with a treatable IEM at the British Columbia Children's Hospital. We searched the database for all IDD patients diagnosed with a treatable IEM during the period 2000-2009 in our academic institution. Data regarding the patients' clinical phenotype, IEM, diagnostic tests, and intervals were collected. Total costs and time intervals associated with all testing and physician consultations actually performed were calculated and compared to the model of the TIDE protocol. Thirty-one patients (16 males) were diagnosed with treatable IDD during the period 2000-2009. For those identifiable via the 1st tier (n=20), the average cost savings would have been $311.17 CAD, and for those diagnosed via a second-tier test (n=11), $340.14 CAD. Significant diagnostic delay (mean 9 months; range 1-29 months) could have been avoided in 9 patients with first-tier diagnoses, had the TIDE protocol been used. For those with second-tier treatable IDD, diagnoses could have been achieved more rapidly with the use of the Treatable IDD app, which allows specific searches based on signs and symptoms.
The TIDE protocol for treatable forms of IDD appears effective in reducing diagnostic delay and unnecessary costs. Larger prospective studies, currently underway, are needed to prove that standard screening for treatable conditions in patients with IDD is time- and cost-effective and, most importantly, will preserve brain function through timely diagnosis enabling initiation of causal therapy. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. A Protocol for Using Gene Set Enrichment Analysis to Identify the Appropriate Animal Model for Translational Research.

    PubMed

    Weidner, Christopher; Steinfath, Matthias; Wistorf, Elisa; Oelgeschläger, Michael; Schneider, Marlon R; Schönfelder, Gilbert

    2017-08-16

    Recent studies that compared transcriptomic datasets of human diseases with datasets from mouse models using traditional gene-to-gene comparison techniques resulted in contradictory conclusions regarding the relevance of animal models for translational research. A major reason for the discrepancies between different gene expression analyses is the arbitrary filtering of differentially expressed genes. Furthermore, the comparison of single genes between different species and platforms is often limited by technical variance, leading to misinterpretation of the concordance or discordance between data from human and animal models. Thus, standardized approaches for systematic data analysis are needed. To overcome subjective gene filtering and ineffective gene-to-gene comparisons, we recently demonstrated that gene set enrichment analysis (GSEA) has the potential to avoid these problems. Therefore, we developed a standardized protocol for the use of GSEA to distinguish between appropriate and inappropriate animal models for translational research. This protocol is not suitable for predicting how to design new model systems a priori, as it requires existing experimental omics data. However, the protocol describes how to interpret existing data in a standardized manner in order to select the most suitable animal model, thus avoiding unnecessary animal experiments and misleading translational studies.
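    The core computation behind GSEA is a weighted running-sum (Kolmogorov-Smirnov-like) statistic over a ranked gene list. The sketch below is a minimal, illustrative implementation of that enrichment score, not the authors' standardized protocol; gene names and values are made up.

```python
def enrichment_score(ranked_genes, correlations, gene_set, p=1.0):
    """Weighted running-sum enrichment score over a pre-ranked gene list.

    ranked_genes : genes ordered by correlation with the phenotype (descending)
    correlations : matching correlation values, same order
    gene_set     : the set of genes being tested for enrichment

    Walking down the list, hits increment the sum by |r|^p normalized over all
    hits; misses decrement it by 1/(N - Nh). The score is the maximum deviation
    from zero of this running sum.
    """
    hits = [g in gene_set for g in ranked_genes]
    n, nh = len(ranked_genes), sum(hits)
    nr = sum(abs(c) ** p for c, h in zip(correlations, hits) if h)
    miss_step = 1.0 / (n - nh)
    running, es = 0.0, 0.0
    for c, h in zip(correlations, hits):
        running += (abs(c) ** p / nr) if h else -miss_step
        if abs(running) > abs(es):
            es = running
    return es
```

    The score approaches +1 when the gene set concentrates at the top of the ranking and -1 when it concentrates at the bottom, which is what makes it robust to the arbitrary cutoff filtering criticized above.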

  15. Validation of a reaction volume reduction protocol for analysis of Y chromosome haplotypes targeting DNA databases.

    PubMed

    Souza, C A; Oliveira, T C; Crovella, S; Santos, S M; Rabêlo, K C N; Soriano, E P; Carvalho, M V D; Junior, A F Caldas; Porto, G G; Campello, R I C; Antunes, A A; Queiroz, R A; Souza, S M

    2017-04-28

    The use of Y chromosome haplotypes, important for the detection of sexual crimes in forensics, has gained prominence with the use of databases that incorporate these genetic profiles in their systems. Here, we optimized and validated an amplification protocol for Y chromosome profile retrieval in reference samples using less material than commercial kits specify. FTA® cards (Flinders Technology Associates) were used to support the oral cells of male individuals, which were amplified directly using the SwabSolution reagent (Promega). First, we optimized and validated the process to define the volume and cycling conditions. Three reference samples and nineteen 1.2 mm-diameter perforated discs were used per sample. Amplification of one or two discs (samples) with the PowerPlex® Y23 kit (Promega) was performed using 25, 26, and 27 thermal cycles. Twenty percent, 32%, and 100% reagent volumes, one disc, and 26 cycles were used for the control per sample. Thereafter, all samples (N = 270) were amplified using 27 cycles, one disc, and 32% reagents (optimized conditions). Data were analyzed by studying the balance values between fluorophore colors. In the samples analyzed with 20% volume, an imbalance was observed in peak heights, both within and between each dye. In samples amplified with 32% reagents, the values obtained from the intra-color and inter-color standard balance calculations used to verify the quality of the analyzed peaks were similar to those of samples amplified with 100% of the recommended volume. The quality of the profiles obtained with 32% reagents was suitable for insertion into databases.
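    The intra- and inter-color balance values mentioned above reduce to ratios of peak heights within and across dye channels. The abstract does not give the exact formulas the authors used, so treat the following as an illustrative sketch of the common min/max convention:

```python
def intra_color_balance(peak_heights):
    """Intra-dye balance: lowest allele peak divided by highest allele peak
    within a single dye channel. Values near 1.0 indicate a well-balanced
    profile; the commonly cited ~0.6 warning threshold is a lab convention,
    not something stated in the abstract."""
    return min(peak_heights) / max(peak_heights)

def inter_color_balance(channel_means):
    """Inter-dye balance: lowest mean peak height across dye channels divided
    by the highest mean peak height."""
    return min(channel_means) / max(channel_means)
```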

  16. The USA National Phenology Network: A national observatory for assessment of biotic response to environmental variation

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; USA National Phenology Network National Coordinating Office

    2011-12-01

    The USA National Phenology Network (USA-NPN; www.usanpn.org), established in 2007, is a national science and monitoring initiative focused on phenology as a tool to understand how plants, animals and landscapes respond to climatic variability and change. Core functions of the National Coordinating Office (NCO) of USA-NPN are to provide a national information management system including databases, develop and implement internationally standardized phenology monitoring protocols, create partnerships with a variety of organizations including field stations for implementation, facilitate research and the development of decision support tools, and promote education and outreach activities related to phenology and climate change. This presentation will describe programs, tools and materials developed by USA-NPN to facilitate science, management and education related to phenology of plants, animals and landscapes within protected areas at local, regional and national scales. Particular emphasis will be placed on the on-line integrated animal and plant monitoring program, Nature's Notebook, which provides standardized protocols for phenological status monitoring and data management for over 500 animal and plant species. The monitoring system facilitates the collection of sampling intensity, absence data, and considerable metadata (from the site level to the observation level). We recently added functionality for recording estimates of animal abundance and plant canopy development. Real-time raw data for plants (from 2009 to present) and animals (from 2010 to present), including FGDC-compliant metadata and documented methodology, are now available for download from the website. A new data exploration tool premiered in spring 2010 allows sophisticated graphical visualization of integrated phenological and meteorological data.
The network seeks to develop partnerships with other organizations interested in (1) implementing vetted, standardized protocols for phenological or ecological monitoring, and (2) using phenology data and information for a variety of modeling applications.

  17. Loss-tolerant measurement-device-independent quantum private queries.

    PubMed

    Zhao, Liang-Yuan; Yin, Zhen-Qiang; Chen, Wei; Qian, Yong-Jun; Zhang, Chun-Mei; Guo, Guang-Can; Han, Zheng-Fu

    2017-01-04

    Quantum private queries (QPQ) is an important cryptographic protocol aiming to protect both the user's and the database's privacy when the database is queried privately. Recently, a variety of practical QPQ protocols based on quantum key distribution (QKD) have been proposed. However, for QKD-based QPQ the user's imperfect detectors can be subjected to detector-side-channel attacks launched by a dishonest owner of the database. Here, we present a simple example that shows how the detector-blinding attack can completely compromise the security of QKD-based QPQ. To remove all the known and unknown detector side channels, we propose a solution of measurement-device-independent QPQ (MDI-QPQ) with single-photon sources. The security of the proposed protocol has been analyzed under some typical attacks. Moreover, we prove that its security is completely loss independent. The results show that practical QPQ will retain the same degree of privacy as before even with seriously uncharacterized detectors.

  18. [Construction of chemical information database based on optical structure recognition technique].

    PubMed

    Lv, C Y; Li, M N; Zhang, L R; Liu, Z M

    2018-04-18

    The aim was to create a protocol that could be used to construct a chemical information database from scientific literature quickly and automatically. Scientific literature, patents, and technical reports from different chemical disciplines were collected and stored in PDF format as fundamental datasets. Chemical structures were transformed from published documents and images to machine-readable data by using name conversion technology and the optical structure recognition tool CLiDE. In the process of molecular structure information extraction, Markush structures were enumerated into well-defined monomer molecules by means of the QueryTools in the molecule editor ChemDraw. The document management software EndNote X8 was applied to acquire bibliographical references involving title, author, journal, and year of publication. The text mining toolkit ChemDataExtractor was adopted to retrieve information that could be used to populate a structured chemical database from figures, tables, and textual paragraphs. After this step, detailed manual revision and annotation were conducted in order to ensure the accuracy and completeness of the data. In addition to the literature data, the computing simulation platform Pipeline Pilot 7.5 was utilized to calculate physical and chemical properties and predict molecular attributes. Furthermore, the open database ChEMBL was linked to fetch known bioactivities, such as indications and targets. After information extraction and data expansion, five separate metadata files were generated, including the molecular structure data file, molecular information, bibliographical references, predicted attributes, and known bioactivities. With the canonical simplified molecular-input line-entry specification (SMILES) as the primary key, the metadata files were associated through common key nodes, including molecule number and PDF number, to construct an integrated chemical information database. A reasonable construction protocol for a chemical information database was created successfully.
A total of 174 research articles and 25 reviews published in Marine Drugs from January 2015 to June 2016 were collected as the essential data source, and an elementary marine natural product database named PKU-MNPD was built in accordance with this protocol, containing 3,262 molecules and 19,821 records. This data aggregation protocol greatly helps chemical information database construction based on original documents to be accurate, comprehensive, and efficient. The structured chemical information database can facilitate access to medical intelligence and accelerate the transformation of scientific research achievements.

  19. Can a single isotropic 3D fast spin echo sequence replace three-plane standard proton density fat-saturated knee MRI at 1.5 T?

    PubMed Central

    Robinson, P; Hodgson, R; Grainger, A J

    2015-01-01

    Objective: To assess whether a single isotropic three-dimensional (3D) fast spin echo (FSE) proton density fat-saturated (PD FS) sequence reconstructed in three planes could replace the three PD (FS) sequences in our standard protocol at 1.5 T (Siemens Avanto, Erlangen, Germany). Methods: A 3D FSE PD water excitation sequence was included in the protocol for 95 consecutive patients referred for routine knee MRI. This was used to produce offline reconstructions in axial, sagittal and coronal planes. Two radiologists independently assessed each case twice, once using the standard MRI protocol and once replacing the standard PD (FS) sequences with reconstructions from the 3D data set. Following scoring, the observer reviewed the 3D data set and performed multiplanar reformats to see if this altered confidence. The menisci, ligaments and cartilage were assessed, and statistical analysis was performed using the standard sequence as the reference standard. Results: The reporting accuracy was as follows: medial meniscus (MM) = 90.9%, lateral meniscus (LM) = 93.7%, anterior cruciate ligament (ACL) = 98.9% and cartilage surfaces = 85.8%. Agreement among the readers for the standard protocol was: MM kappa = 0.91, LM = 0.89, ACL = 0.98 and cartilage = 0.84; and for the 3D protocol: MM = 0.86, LM = 0.77, ACL = 0.94 and cartilage = 0.64. Conclusion: A 3D PD FSE sequence reconstructed in three planes gives reduced accuracy and decreased concordance among readers compared with conventional sequences when evaluating the menisci and cartilage with a 1.5-T MRI scanner. Advances in knowledge: Using the existing 1.5-T MR systems, a 3D FSE sequence should not replace two-dimensional sequences. PMID:26067920
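    The inter-reader agreement figures above are Cohen's kappa values. As a reminder of what is being reported, here is a minimal stdlib sketch of the statistic, kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and p_e is chance agreement from the raters' marginal rates:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same cases.

    p_o: fraction of cases where the raters agree.
    p_e: agreement expected by chance, from each rater's label frequencies.
    kappa is 1.0 for perfect agreement and 0.0 for chance-level agreement.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    labels = set(rater_a) | set(rater_b)
    p_e = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)  # undefined when p_e == 1
```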

  20. A carcinogenic potency database of the standardized results of animal bioassays

    PubMed Central

    Gold, Lois Swirsky; Sawyer, Charles B.; Magaw, Renae; Backman, Georganne M.; De Veciana, Margarita; Levinson, Robert; Hooper, N. Kim; Havender, William R.; Bernstein, Leslie; Peto, Richard; Pike, Malcolm C.; Ames, Bruce N.

    1984-01-01

    The preceding paper described our numerical index of carcinogenic potency, the TD50 and the statistical procedures adopted for estimating it from experimental data. This paper presents the Carcinogenic Potency Database, which includes results of about 3000 long-term, chronic experiments of 770 test compounds. Part II is a discussion of the sources of our data, the rationale for the inclusion of particular experiments and particular target sites, and the conventions adopted in summarizing the literature. Part III is a guide to the plot of results presented in Part IV. A number of appendices are provided to facilitate use of the database. The plot includes information about chronic cancer tests in mammals, such as dose and other aspects of experimental protocol, histopathology and tumor incidence, TD50 and its statistical significance, dose response, author's opinion and literature reference. The plot readily permits comparisons of carcinogenic potency and many other aspects of cancer tests; it also provides quantitative information about negative tests. The range of carcinogenic potency is over 10 million-fold. PMID:6525996
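    For intuition about the TD50 index discussed above: under a simple one-hit dose-response model, P(d) = 1 - (1 - p0) * exp(-beta * d), the TD50 is the chronic dose rate that halves the probability of remaining tumor-free, giving TD50 = ln(2)/beta. The sketch below only illustrates that relationship; it is not the database's actual estimation procedure, which fits lifetime-adjusted experimental data:

```python
import math

def td50_one_hit(dose, tumor_frac, background_frac=0.0):
    """Illustrative TD50 under a one-hit model P(d) = 1 - (1 - p0)*exp(-beta*d).

    Solves for beta from a single observed dose group (tumor_frac at dose,
    background_frac in controls), then returns TD50 = ln(2) / beta: the dose
    at which the probability of remaining tumor-free is halved.
    """
    surviving = (1.0 - tumor_frac) / (1.0 - background_frac)
    beta = -math.log(surviving) / dose
    return math.log(2.0) / beta
```

    For example, a group with 50% tumor incidence at dose 10 (and no background incidence) implies a TD50 of exactly 10 under this model.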

  1. The use of DRG for identifying clinical trials centers with high recruitment potential: a feasibility study.

    PubMed

    Aegerter, Philippe; Bendersky, Noelle; Tran, Thi-Chien; Ropers, Jacques; Taright, Namik; Chatellier, Gilles

    2014-01-01

    Recruitment of large samples of patients is crucial for the evidence level and efficacy of clinical trials (CT). Clinical Trial Recruitment Support Systems (CTRSS) used to estimate patient recruitment are generally specific to particular Hospital Information Systems, and few have been evaluated on a large number of trials. Our aim was to assess, on a large number of CTs, the usefulness of commonly available data such as Diagnosis Related Group (DRG) databases for estimating potential recruitment. We used the DRG database of a large French multicenter medical institution (1.2 million inpatient stays and 400 new trials each year). Eligibility criteria of protocols were broken down into atomic entities (diagnosis, procedures, treatments...) and then translated into codes and operators recorded in a standardized form. A program parsed the forms and generated requests on the DRG database. A large majority of selection criteria could be coded, and final estimates of the number of eligible patients were close to observed ones (median difference = 25). Such a system could be part of the feasibility evaluation and center selection process before the start of a clinical trial.
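    The pipeline described above, atomic eligibility criteria translated into coded predicates and evaluated against a DRG-style discharge table, can be sketched as follows. The field names, codes, and operators are illustrative assumptions, not the actual French DRG schema or the authors' form format:

```python
# Hypothetical sketch: eligibility criteria as (field, operator, argument)
# triples, ANDed together and evaluated against discharge records to count
# potentially eligible stays.
OPS = {
    "in": lambda value, arg: value in arg,
    ">=": lambda value, arg: value >= arg,
    "<=": lambda value, arg: value <= arg,
}

def eligible_count(stays, criteria):
    """Count stays matching every (field, op, arg) predicate in criteria."""
    def matches(stay):
        return all(OPS[op](stay[field], arg) for field, op, arg in criteria)
    return sum(1 for stay in stays if matches(stay))

# Toy discharge table (ICD-10-like codes are illustrative only).
stays = [
    {"diagnosis": "I21", "age": 67},  # myocardial infarction, 67 y
    {"diagnosis": "I21", "age": 45},
    {"diagnosis": "J45", "age": 70},  # asthma
]
criteria = [("diagnosis", "in", {"I21"}), ("age", ">=", 60)]
print(eligible_count(stays, criteria))  # 1
```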

  2. Legal Medicine Information System using CDISC ODM.

    PubMed

    Kiuchi, Takahiro; Yoshida, Ken-ichi; Kotani, Hirokazu; Tamaki, Keiji; Nagai, Hisashi; Harada, Kazuki; Ishikawa, Hirono

    2013-11-01

    We have developed a new database system for forensic autopsies, called the Legal Medicine Information System, using the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM). This system comprises two subsystems, namely the Institutional Database System (IDS) located in each institute and containing personal information, and the Central Anonymous Database System (CADS) located in the University Hospital Medical Information Network Center containing only anonymous information. CDISC ODM is used as the data transfer protocol between the two subsystems. Using the IDS, forensic pathologists and other staff can register and search for institutional autopsy information, print death certificates, and extract data for statistical analysis. They can also submit anonymous autopsy information to the CADS semi-automatically. This reduces the burden of double data entry, the time-lag of central data collection, and anxiety regarding legal and ethical issues. Using the CADS, various studies on the causes of death can be conducted quickly and easily, and the results can be used to prevent similar accidents, diseases, and abuse. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
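    The IDS-to-CADS flow described above hinges on stripping direct identifiers before central submission. The real system exchanges CDISC ODM XML; the dict-based sketch below, with assumed field names, only illustrates the de-identification step:

```python
# Hypothetical field names; the actual Legal Medicine Information System
# schema and its ODM mapping are not specified in the abstract.
IDENTIFYING_FIELDS = {"name", "address", "date_of_birth", "case_number"}

def anonymize(record):
    """Return a copy of an autopsy record with direct identifiers removed,
    keeping only analysis-relevant fields for submission to the central
    anonymous database."""
    return {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}
```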

  3. Database for LDV Signal Processor Performance Analysis

    NASA Technical Reports Server (NTRS)

    Baker, Glenn D.; Murphy, R. Jay; Meyers, James F.

    1989-01-01

    A comparative and quantitative analysis of various laser velocimeter signal processors is difficult because standards for characterizing signal bursts have not been established. This leaves the researcher to select a signal processor based only on manufacturers' claims without the benefit of direct comparison. The present paper proposes the use of a database of digitized signal bursts obtained from a laser velocimeter under various configurations as a method for directly comparing signal processors.

  4. SU-E-I-22: A Comprehensive Investigation of Noise Variations Between the GE Discovery CT750 HD and GE LightSpeed VCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bache, S; Loyer, E; Stauduhar, P

    2015-06-15

    Purpose: To quantify and compare the noise properties between two GE CT models-the Discovery CT750 HD (aka HD750) and LightSpeed VCT, with the overall goal of assessing the impact in clinical diagnostic practice. Methods: Daily QC data from a fleet of 9 CT scanners currently in clinical use were investigated – 5 HD750 and 4 VCT (over 600 total acquisitions for each scanner). A standard GE QC phantom was scanned daily using two sets of scan parameters with each scanner over 1 year. Water CT number and standard deviation were recorded from the image of the water section of the QC phantom. The standard GE QC scan parameters (Pitch = 0.516, 120kVp, 0.4s, 335mA, Small Body SFOV, 5mm thickness) and an in-house developed protocol (Axial, 120kVp, 1.0s, 240mA, Head SFOV, 5mm thickness) were used, with Standard reconstruction algorithm. Noise was measured as the standard deviation in the center of the water phantom image. Inter-model noise distributions and tube output in mR/mAs were compared to assess any relative differences in noise properties. Results: With the in-house protocols, average noise for the five HD750 scanners was ∼9% higher than the VCT scanners (5.8 vs 5.3). For the GE QC protocol, average noise with the HD750 scanners was ∼11% higher than with the VCT scanners (4.8 vs 4.3). This discrepancy in noise between the two models was found despite the tube output in mR/mAs being comparable, with the HD750 scanners having only ∼4% lower output (8.0 vs 8.3 mR/mAs). Conclusion: Using identical scan protocols, average noise in images from the HD750 group was higher than that from the VCT group. This confirms an institutional radiologist's feedback regarding grainier patient images from HD750 scanners. Further investigation is warranted to assess the noise texture and distribution, as well as clinical impact.
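The noise figure used in this QC comparison is simply the standard deviation of CT numbers inside a central region of interest (ROI) of the water-phantom image. A stdlib-only sketch on a synthetic image follows; the ROI size and simulated noise level are illustrative values, not the GE QC protocol's actual parameters.

```python
# Compute noise as the standard deviation of pixel values in a central
# ROI of a water-phantom image (synthetic data; values illustrative).
import random
import statistics

def center_roi_noise(image, roi=32):
    """Std dev of pixel values in a roi x roi box at the image center."""
    rows, cols = len(image), len(image[0])
    cy, cx, h = rows // 2, cols // 2, roi // 2
    pixels = [image[y][x]
              for y in range(cy - h, cy + h)
              for x in range(cx - h, cx + h)]
    return statistics.pstdev(pixels)

# Synthetic "water" image with ~5.8 HU Gaussian noise (the HD750-like level
# reported above), mean CT number 0 HU for water.
rng = random.Random(0)
water = [[rng.gauss(0.0, 5.8) for _ in range(128)] for _ in range(128)]
noise = center_roi_noise(water)  # should land near the simulated 5.8 HU
```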

  5. ANZSoilML: An Australian - New Zealand standard for exchange of soil data

    NASA Astrophysics Data System (ADS)

    Simons, Bruce; Wilson, Peter; Ritchie, Alistair; Cox, Simon

    2013-04-01

    The Australian-New Zealand soil information exchange standard (ANZSoilML) is a GML-based standard designed to allow the discovery, query and delivery of soil and landscape data via standard Open Geospatial Consortium (OGC) Web Feature Services. ANZSoilML modifies the Australian soil exchange standard (OzSoilML), which is based on the Australian Soil Information Transfer and Evaluation System (SITES) database design and exchange protocols, to meet the New Zealand National Soils Database requirements. The most significant change was the removal of the lists of CodeList terms in OzSoilML, which were based on the field methods specified in the 'Australian Soil and Land Survey Field Handbook'. These were replaced with empty CodeLists acting as placeholders for external vocabularies, allowing the use of New Zealand vocabularies without violating the data model. Testing of the use of these separately governed Australian and New Zealand vocabularies has commenced. ANZSoilML attempts to accommodate the proposed International Organization for Standardization ISO/DIS 28258 standard for soil quality. For the most part, ANZSoilML is consistent with the ISO model, although major differences arise as a result of:
    • The need to specify the properties appropriate for each feature type;
    • The inclusion of soil-related 'Landscape' features;
    • Allowing the mapping of soil surfaces, bodies, layers and horizons, independent of the soil profile;
    • Allowing specification of the relationships between the various soil features;
    • Specifying soil horizons as specialisations of soil layers;
    • Removing duplication of features provided by the ISO Observations & Measurements standard.
    The International Union of Soil Sciences (IUSS) Working Group on Soil Information Standards (WG-SIS) aims to develop, promote and maintain a standard to facilitate the exchange of soils data and information. Developing an international exchange standard that is compatible with existing and emerging national and regional standards is a considerable challenge. ANZSoilML is proposed as a profile of the more generalised SoilML model being progressed through the IUSS Working Group.
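The empty-CodeList design described above amounts to validating a document's codes against whichever externally governed vocabulary applies to the data's jurisdiction, rather than against terms baked into the schema. A minimal Python sketch of that idea follows; the vocabulary names and terms are invented for illustration and are not taken from the Australian or New Zealand handbooks.

```python
# Placeholder-CodeList pattern: the schema fixes only the codelist slots;
# the allowed terms come from an externally governed, per-jurisdiction
# vocabulary (entries below are invented examples).
VOCABULARIES = {
    "au": {"texture": {"clay", "loam", "sand"}},
    "nz": {"texture": {"clayey", "loamy", "sandy"}},
}

def is_valid_code(jurisdiction: str, codelist: str, code: str) -> bool:
    """True if `code` appears in the named codelist for that jurisdiction."""
    return code in VOCABULARIES.get(jurisdiction, {}).get(codelist, set())
```

Swapping jurisdictions changes which terms validate, without touching the data model itself, which is exactly what lets one schema serve both countries.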

  6. Critical examination of evidence for the nutritional status of children in Papua New Guinea - a systematic review.

    PubMed

    McGlynn, Peter J; Renzaho, Andre M N; Pham, Minh D; Toole, Mike; Fisher, Jane; Luchters, Stanley

    2018-01-01

    Undernutrition remains a significant cause of childhood illness, poor growth and development, and death in Papua New Guinea (PNG). Studies on child nutritional outcomes in PNG vary by design, measurement protocols and quality. We conducted a systematic review to assess the evidence for the prevalence of child undernutrition across different study populations, geographical locations and time periods. Six electronic databases and additional grey literature were searched for articles describing the nutritional status (wasting, stunting and underweight) of PNG children under five years of age, published between 1990 and April 2015. Prevalence data using different scales of measurement and reference populations were standardized according to WHO protocols. The search yielded 566 articles, of which 31 studies met the inclusion criteria. The prevalence of child undernutrition varied from 1% to 76% for wasting (median 11%), 5% to 92% for stunting (median 51%), and 14% to 59% for underweight (median 32%). Wide variations exist according to the index used for measurement, the population characteristics and the geographical region in which they live. Prevalence estimates increase significantly when data using different scales of measurement and population references are standardized to the WHO protocols. Child undernutrition in PNG is regionally variable due to a complex interplay of poverty, disease, food security, cultural, environmental and sociopolitical issues requiring a complex mix of solutions by governments, health systems and local communities. Area-specific surveys using multiple measures are necessary to inform local solutions for this important problem.

  7. A CAD system and quality assurance protocol for bone age assessment utilizing digital hand atlas

    NASA Astrophysics Data System (ADS)

    Gertych, Arkadiusz; Zhang, Aifeng; Ferrara, Benjamin; Liu, Brent J.

    2007-03-01

    Determination of bone age assessment (BAA) in pediatric radiology is a task based on detailed analysis of the patient's left-hand X-ray. The current standard utilized in clinical practice relies on a subjective comparison of the hand with patterns in a book atlas. The computerized approach to BAA (CBAA) utilizes automatic analysis of the regions of interest in the hand image. This procedure is followed by extraction of quantitative features sensitive to skeletal development, which are then converted to a bone age value using knowledge from the digital hand atlas (DHA). This also allows the system to provide BAA results that resemble the current clinical approach. All developed methodologies have been combined into one CAD module with a graphical user interface (GUI). CBAA can also improve statistical and analytical accuracy based on a clinical workflow analysis. For this purpose a quality assurance protocol (QAP) has been developed. Implementation of the QAP helped to make the CAD more robust and to find images that cannot meet the conditions required by DHA standards. Moreover, the entire CAD-DHA system may gain further benefits if the clinical acquisition protocol is modified. The goal of this study is to present the performance improvement of the overall CAD-DHA system with the QAP and the comparison of the CAD results with the chronological age of 1390 normal subjects from the DHA. The CAD workstation can process images from a local image database or from a PACS server.

  8. Spanish food composition tables and databases: need for a gold standard for healthcare professionals (review).

    PubMed

    Lupiañez-Barbero, Ascension; González Blanco, Cintia; de Leiva Hidalgo, Alberto

    2018-05-23

    Food composition tables and databases (FCTs or FCDBs) provide the necessary information to estimate intake of nutrients and other food components. In Spain, the lack of a reference database has resulted in the use of different FCTs/FCDBs in nutritional surveys and research studies, as well as in the development of dietetic software for diet analysis. As a result, biased, non-comparable results are obtained, and healthcare professionals are rarely aware of these limitations. AECOSAN and the BEDCA association developed an FCDB following European standards, the Spanish Food Composition Database Network (RedBEDCA). The current database has a limited number of foods and food components and barely contains processed foods, which limits its use in epidemiological studies and in the daily practice of healthcare professionals. Copyright © 2018 SEEN y SED. Publicado por Elsevier España, S.L.U. All rights reserved.

  9. Physical rehabilitation interventions for adult patients with critical illness across the continuum of recovery: an overview of systematic reviews protocol.

    PubMed

    Connolly, Bronwen; O'Neill, Brenda; Salisbury, Lisa; McDowell, Kathryn; Blackwood, Bronagh

    2015-09-29

    Patients admitted to the intensive care unit with critical illness often experience significant physical impairments, which typically persist for many years following resolution of the original illness. Physical rehabilitation interventions that enhance restoration of physical function have been evaluated across the continuum of recovery following critical illness including within the intensive care unit, following discharge to the ward and beyond hospital discharge. Multiple systematic reviews have been published appraising the expanding evidence investigating these physical rehabilitation interventions, although there appears to be variability in review methodology and quality. We aim to conduct an overview of existing systematic reviews of physical rehabilitation interventions for adult intensive care patients across the continuum of recovery. This protocol has been developed according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocol (PRISMA-P) guidelines. We will search the Cochrane Systematic Review Database, Database of Abstracts of Reviews of Effectiveness, Cochrane Central Register of Controlled Trials, MEDLINE, Excerpta Medica Database and Cumulative Index to Nursing and Allied Health Literature databases. We will include systematic reviews of randomised controlled trials of adult patients, admitted to the intensive care unit and who have received physical rehabilitation interventions at any time point during their recovery. Data extraction will include systematic review aims and rationale, study types, populations, interventions, comparators, outcomes and quality appraisal method. Primary outcomes of interest will focus on findings reflecting recovery of physical function. Quality of reporting and methodological quality will be appraised using the PRISMA checklist and the Assessment of Multiple Systematic Reviews tool. 
We anticipate the findings from this novel overview of systematic reviews will contribute to the synthesis and interpretation of existing evidence regarding physical rehabilitation interventions and physical recovery in post-critical illness patients across the continuum of recovery. PROSPERO CRD42015001068.

  10. An Outcomes Study on the Effects of the Singapore General Hospital Burns Protocol.

    PubMed

    Liang, Weihao; Kok, Yee Onn; Tan, Bien Keem; Chong, Si Jack

    2018-01-01

    The Singapore General Hospital Burns Protocol was implemented in May 2014 to standardize treatment for all burns patients, incorporate new techniques and materials, and streamline the processes and workflow of burns management. This study aims to analyze the effects of the Burns Protocol 2 years after its implementation. Using a REDCap electronic database, all burns patients admitted from May 2013 to April 2016 were included in the study. The historical preimplementation control group comprised patients admitted from May 2013 to April 2014 (n = 96). The postimplementation prospective study cohort consisted of patients admitted from May 2014 to April 2016 (n = 243). Details of the patients collected included age, sex, comorbidities, total body surface area (TBSA) burns, time until surgery, number of surgeries, number of positive tissue and blood cultures, and length of hospital stay. There was no statistically significant difference in the demographics of both groups. The study group had a statistically significant shorter time to surgery compared with the control group (20.8 vs 38.1, P < 0.0001). The study group also averaged fewer surgeries performed (1.96 vs 2.29, P = 0.285), which, after accounting for the extent of burns, was statistically significant (number of surgeries/TBSA, 0.324 vs 0.506; P = 0.0499). The study group also had significantly shorter length of stay (12.5 vs 16.8, P = 0.0273), a shorter length of stay/TBSA burns (0.874 vs 1.342, P = 0.0101), and fewer positive tissue cultures (0.6 vs 1.3, P = 0.0003). The study group also trended toward fewer positive blood culture results (0.09 vs 0.35, P = 0.0593), although the difference was just shy of statistical significance. The new Singapore General Hospital Burns Protocol has revolutionized Singapore burns care by introducing streamlined, multidisciplinary burns management, resulting in improved patient outcomes, lowered health care costs, and improved system resource use.

  11. Early weight-bearing after periacetabular osteotomy leads to a high incidence of postoperative pelvic fractures.

    PubMed

    Ito, Hiroshi; Tanino, Hiromasa; Sato, Tatsuya; Nishida, Yasuhiro; Matsuno, Takeo

    2014-07-11

    It has not been shown whether accelerated rehabilitation following periacetabular osteotomy (PAO) is effective for early recovery. The purpose of this retrospective study was to compare complication rates in patients with standard and accelerated rehabilitation protocols who underwent PAO. Between January 2002 and August 2011, patients with a lateral center-edge (CE) angle of < 20°, showing good joint congruency with the hip in abduction, pre- or early stage of osteoarthritis, and age younger than 60 years were included in this study. We evaluated 156 hips in 138 patients, with a mean age at the time of surgery of 30 years. Full weight-bearing with two crutches started 2 months postoperatively in 73 patients (80 hips) with the standard rehabilitation protocol. In 65 patients (76 hips) with the accelerated rehabilitation protocol, postoperative strengthening of the hip, thigh and core musculature was begun on the day of surgery as tolerated. The exercise program included active hip range of motion, and gentle isometric hamstring and quadriceps muscle sets; these exercises were performed for 30 minutes in the morning and 30 minutes in the afternoon with a physical therapist every weekday for 6 weeks. Full weight-bearing with two axillary crutches started on the day of surgery as tolerated. Complications were evaluated for 2 years. The clinical results at the time of follow-up were similar in the two groups. The average periods between the osteotomy and full-weight-bearing walking without support were 4.2 months and 6.9 months in patients with the accelerated and standard rehabilitation protocols (P < 0.001), indicating that the accelerated rehabilitation protocol could achieve earlier recovery of patients. However, postoperative fractures of the ischial ramus and posterior column of the pelvis were more frequently found in patients with the accelerated rehabilitation protocol (8/76) than in those with the standard rehabilitation protocol (1/80) (P = 0.013). 
The accelerated rehabilitation protocol seems to have advantages for early muscle recovery in patients undergoing PAO; however, postoperative pelvic fracture rates were unacceptably high in patients with this protocol.
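The fracture-rate comparison above (8 of 76 hips vs 1 of 80 hips, P = 0.013) is 2x2 count data of the kind usually tested with Fisher's exact test when counts are small; the abstract does not name the test used, so the following standard-library implementation is a sketch of a conventional choice, not the authors' code.

```python
# Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]],
# using only math.comb (hypergeometric point probabilities).
from math import comb

def fisher_exact_two_sided(a: int, b: int, c: int, d: int) -> float:
    row1, col1, n = a + b, a + c, a + b + c + d
    def p_table(k: int) -> float:  # probability of k "events" in row 1
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # sum the probabilities of all tables at least as extreme as observed
    return sum(p for p in (p_table(k) for k in range(lo, hi + 1))
               if p <= p_obs + 1e-12)

# fractures / no fractures: accelerated (8, 68) vs standard (1, 79)
p = fisher_exact_two_sided(8, 68, 1, 79)
```

With these counts the two-sided p-value falls well below the 0.05 threshold, consistent with the significant difference the study reports.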

  12. Early weight-bearing after periacetabular osteotomy leads to a high incidence of postoperative pelvic fractures

    PubMed Central

    2014-01-01

    Background It has not been shown whether accelerated rehabilitation following periacetabular osteotomy (PAO) is effective for early recovery. The purpose of this retrospective study was to compare complication rates in patients with standard and accelerated rehabilitation protocols who underwent PAO. Methods Between January 2002 and August 2011, patients with a lateral center-edge (CE) angle of < 20°, showing good joint congruency with the hip in abduction, pre- or early stage of osteoarthritis, and age younger than 60 years were included in this study. We evaluated 156 hips in 138 patients, with a mean age at the time of surgery of 30 years. Full weight-bearing with two crutches started 2 months postoperatively in 73 patients (80 hips) with the standard rehabilitation protocol. In 65 patients (76 hips) with the accelerated rehabilitation protocol, postoperative strengthening of the hip, thigh and core musculature was begun on the day of surgery as tolerated. The exercise program included active hip range of motion, and gentle isometric hamstring and quadriceps muscle sets; these exercises were performed for 30 minutes in the morning and 30 minutes in the afternoon with a physical therapist every weekday for 6 weeks. Full weight-bearing with two axillary crutches started on the day of surgery as tolerated. Complications were evaluated for 2 years. Results The clinical results at the time of follow-up were similar in the two groups. The average periods between the osteotomy and full-weight-bearing walking without support were 4.2 months and 6.9 months in patients with the accelerated and standard rehabilitation protocols (P < 0.001), indicating that the accelerated rehabilitation protocol could achieve earlier recovery of patients. 
However, postoperative fractures of the ischial ramus and posterior column of the pelvis were more frequently found in patients with the accelerated rehabilitation protocol (8/76) than in those with the standard rehabilitation protocol (1/80) (P = 0.013). Conclusion The accelerated rehabilitation protocol seems to have advantages for early muscle recovery in patients undergoing PAO; however, postoperative pelvic fracture rates were unacceptably high in patients with this protocol. PMID:25015753

  13. Meta-analysis comparing chewing gum versus standard postoperative care after colorectal resection.

    PubMed

    Song, Guo-Min; Deng, Yong-Hong; Jin, Ying-Hui; Zhou, Jian-Guo; Tian, Xu

    2016-10-25

    Previous incomplete studies investigating the potential of chewing gum (CG) in patients undergoing colorectal resection did not reach definitive conclusions. This updated meta-analysis was therefore conducted to evaluate the effect and safety of CG versus standard postoperative care protocols (SPCPs) after colorectal surgery. A total of 26 RCTs enrolling 2214 patients were included in this study. CG was well tolerated by all patients. Compared with SPCPs, CG was associated with shorter time to first flatus (weighted mean difference (WMD) -12.14 (95 per cent c.i. -15.71 to -8.56) hours; P < 0.001), first bowel movement (WMD -17.32 (-23.41 to -11.22) hours; P < 0.001), bowel sounds (WMD -6.02 (-7.42 to -4.63) hours; P < 0.001), and length of hospital stay (WMD -0.95 (-1.55 to -0.35) days; P < 0.001), as well as a lower risk of postoperative ileus (risk ratio (RR) 0.61 (0.44 to 0.83); P = 0.002), a net benefit, and improved quality of life. There were no significant differences between the two groups in overall complications, nausea, vomiting, bloating, wound infection, bleeding, dehiscence, readmission, reoperation, or mortality. The potentially eligible randomized controlled trials (RCTs) that compared CG with SPCPs for colorectal resection were searched in PubMed, Embase, the Cochrane Library, China National Knowledge Infrastructure (CNKI), and Chinese Wanfang databases through May 2016. Trial sequential analysis was adopted to examine whether a firm conclusion for a specific outcome could be drawn. CG is beneficial for enhancing the return of gastrointestinal function after colorectal resection, and may be associated with a lower risk of postoperative ileus.

  14. Bilateral key comparison SIM.T-K6.5 on humidity standards in the dew/frost-point temperature range from -30 °C to +20 °C

    NASA Astrophysics Data System (ADS)

    Meyer, C.; Solano, A.

    2016-01-01

    A Regional Metrology Organization (RMO) Key Comparison of dew/frost point temperatures was carried out by the National Institute of Standards and Technology (NIST, USA) and the Laboratorio Costarricense de Metrología (LACOMET, Costa Rica) between February 2015 and August 2015. The results of this comparison are reported here, along with descriptions of the humidity laboratory standards for NIST and LACOMET and the uncertainty budget for these standards. This report also describes the protocol for the comparison and presents the data acquired. The results are analyzed, determining the degree of equivalence between the dew/frost-point standards of NIST and LACOMET. Main text. To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCT, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
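In bilateral key comparisons like this one, the degree of equivalence at each measurement point is conventionally the difference between the two laboratories' values together with its expanded uncertainty at a coverage factor of k = 2. A minimal sketch follows; the dew-point values and uncertainties are invented for illustration, not the actual NIST/LACOMET data.

```python
# Degree of equivalence for a bilateral comparison: difference d between
# lab values and its expanded uncertainty U (k = 2); the labs agree when
# |d| <= U. Numbers below are illustrative only.
from math import sqrt

def degree_of_equivalence(x_a: float, u_a: float, x_b: float, u_b: float):
    """Return (difference, expanded uncertainty U, consistent?) at k=2."""
    d = x_a - x_b
    big_u = 2.0 * sqrt(u_a**2 + u_b**2)  # combine standard uncertainties
    return d, big_u, abs(d) <= big_u

# e.g. a -10 degC dew-point comparison point (values in degC, invented)
d, U, ok = degree_of_equivalence(-10.003, 0.020, -10.010, 0.025)
```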

  15. Bilateral key comparison SIM.T-K6.2 on humidity standards in the dew/frost-point temperature range from -20 °C to 20 °C

    NASA Astrophysics Data System (ADS)

    Huang, P. H.; Meyer, C. W.; Martines-López, E.; Dávila Pacheco, J. A.; Méndez-Lango, E.

    2014-01-01

    A Regional Metrology Organization (RMO) Key Comparison of dew/frost point temperatures was carried out by the National Institute of Standards and Technology (NIST, USA) and the Centro Nacional de Metrologia (CENAM, Mexico) between July 2008 and December 2008. The results of this comparison are reported here, along with descriptions of the humidity laboratory standards for NIST and CENAM and the uncertainty budget for these standards. This report also describes the protocol for the comparison and presents the data acquired. The results are analyzed, determining the degree of equivalence between the dew/frost-point standards of NIST and CENAM. Main text. To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCT, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  16. Final report: Bilateral key comparison SIM.T-K6.3 on humidity standards in the dew/frost-point temperature range from -30°C to 20°C

    NASA Astrophysics Data System (ADS)

    Huang, Peter; Meyer, Christopher; Brionizio, Julio D.

    2015-01-01

    A Regional Metrology Organization (RMO) Key Comparison of dew/frost point temperatures was carried out by the National Institute of Standards and Technology (NIST, USA) and the Instituto Nacional de Metrologia, Qualidade e Tecnologia (INMETRO, Brazil) between October 2009 and March 2010. The results of this comparison are reported here, along with descriptions of the humidity laboratory standards for NIST and INMETRO and the uncertainty budget for these standards. This report also describes the protocol for the comparison and presents the data acquired. The results are analyzed, determining the degree of equivalence between the dew/frost-point standards of NIST and INMETRO. Main text. To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCT, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  17. Probiotics in Helicobacter pylori eradication therapy: A systematic review and meta-analysis

    PubMed Central

    Zhang, Min-Min; Qian, Wei; Qin, Ying-Yi; He, Jia; Zhou, Yu-Hao

    2015-01-01

    AIM: To summarize the evidence from randomized controlled trials (RCTs) regarding the effect of probiotics by using a meta-analytic approach. METHODS: In July 2013, we searched PubMed, EMBASE, Ovid, the Cochrane Library, and three Chinese databases (Chinese Biomedical Literature Database, Chinese Medical Current Content, and Chinese Scientific Journals Database) to identify relevant RCTs. We included RCTs comparing the effect of a combination of probiotics and standard therapy (probiotics group) with standard therapy alone (control group). Risk ratios (RRs) were used to measure the effect of probiotics plus standard therapy on Helicobacter pylori (H. pylori) eradication rates, adverse events, and patient compliance using a random-effect model. RESULTS: We included data on 6997 participants from 45 RCTs; the overall eradication rates of the probiotics group and the control group were 82.31% and 72.08%, respectively. We noted that the use of probiotics plus standard therapy was associated with an increased eradication rate by per-protocol analysis (RR = 1.11; 95%CI: 1.08-1.15; P < 0.001) or intention-to-treat analysis (RR = 1.13; 95%CI: 1.10-1.16; P < 0.001). Furthermore, the incidence of adverse events was 21.44% in the probiotics group and 36.27% in the control group, and it was found that probiotics plus standard therapy significantly reduced the risk of adverse events (RR = 0.59; 95%CI: 0.48-0.71; P < 0.001), demonstrating a favorable effect of probiotics in reducing adverse events associated with H. pylori eradication therapy; the confidence interval corresponds to a relative reduction in adverse-event risk of roughly 29% to 52%. Finally, probiotics plus standard therapy had little or no effect on patient compliance (RR = 0.98; 95%CI: 0.68-1.39; P = 0.889). CONCLUSION: The use of probiotics plus standard therapy was associated with an increase in the H.
pylori eradication rate, and a reduction in adverse events resulting from treatment in the general population. However, this therapy did not improve patient compliance. PMID:25892886
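The pooled risk ratios above come from a random-effects model. A compact DerSimonian-Laird sketch on the log-RR scale is given below; the study-level log-RRs and variances are invented purely for illustration and are not the trial data from this review.

```python
# DerSimonian-Laird random-effects pooling on the log-RR scale:
# fixed-effect estimate -> heterogeneity Q -> between-study variance
# tau^2 -> re-weighted pooled estimate with 95% CI.
from math import exp, log, sqrt

def pool_random_effects(log_rrs, variances):
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se = sqrt(1.0 / sum(w_star))
    # back-transform to the RR scale with a 95% confidence interval
    return exp(pooled), exp(pooled - 1.96 * se), exp(pooled + 1.96 * se)

# three invented studies, each favoring the probiotics arm slightly
rr, lo, hi = pool_random_effects(
    [log(1.10), log(1.15), log(1.08)], [0.002, 0.003, 0.004])
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, tau^2 is truncated at zero and the estimate reduces to the fixed-effect (inverse-variance) result.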

  18. Assessment protocols of maximum oxygen consumption in young people with Down syndrome--a review.

    PubMed

    Seron, Bruna Barboza; Greguol, Márcia

    2014-03-01

    Maximum oxygen consumption is considered the gold standard measure of cardiorespiratory fitness. Young people with Down syndrome (DS) present low values of this indicator compared to their peers without disabilities and to young people with an intellectual disability but without DS. The use of reliable and valid assessment methods provides more reliable results for the diagnosis of cardiorespiratory fitness and the response of this variable to exercise. The aim of the present study was to review the literature on the assessment protocols used to measure maximum oxygen consumption in children and adolescents with Down syndrome, with emphasis on the protocols used, the validation process and their feasibility. The search was carried out in eight electronic databases--Scopus, Medline-PubMed, Web of Science, SportDiscus, CINAHL, Academic Search Premier, SciELO, and Lilacs. The inclusion criteria were: (a) articles which assessed VO2peak and/or VO2max (independent of the validation method), (b) samples composed of children and/or adolescents with Down syndrome, (c) participants of up to 20 years old, and (d) studies performed after 1990. Fifteen studies were selected and, of these, 11 measured the VO2peak using tests performed in a laboratory, 2 used field tests and the remaining 2 used both laboratory and field tests. The majority of the selected studies used maximal tests and conducted familiarization sessions. All the studies took into account the clinical conditions that could hamper testing or endanger the individuals. However, a large number of studies used tests which had not been specifically validated for the evaluated population. Finally, the search emphasized the small number of studies which use field tests to evaluate oxygen consumption. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Development of an electronic database for Acute Pain Service outcomes

    PubMed Central

    Love, Brandy L; Jensen, Louise A; Schopflocher, Donald; Tsui, Ban CH

    2012-01-01

    BACKGROUND: Quality assurance is increasingly important in the current health care climate. An electronic database can be used for tracking patient information and as a research tool to provide quality assurance for patient care. OBJECTIVE: An electronic database was developed for the Acute Pain Service, University of Alberta Hospital (Edmonton, Alberta) to record patient characteristics, identify at-risk populations, compare treatment efficacies and guide practice decisions. METHOD: Steps in the database development involved identifying the goals for use, relevant variables to include, and a plan for data collection, entry and analysis. Protocols were also created for data cleaning and quality control. The database was evaluated with a pilot test using existing data to assess data collection burden, accuracy and functionality of the database. RESULTS: A literature review resulted in an evidence-based list of demographic, clinical and pain management outcome variables to include. Time to assess patients and collect the data was 20 min to 30 min per patient. Limitations were primarily software related, although initial data collection completion was only 65% and accuracy of data entry was 96%. CONCLUSIONS: The electronic database was found to be relevant and functional for the identified goals of data storage and research. PMID:22518364
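An outcomes database of this kind can be sketched with the standard library's sqlite3 module. The table and column names below are illustrative stand-ins for the evidence-based variable list the abstract describes, not the actual University of Alberta schema.

```python
# Minimal outcomes-database sketch: define a table for pain-service
# records, load two example rows, and run a quality-control query that
# flags incomplete pain documentation (schema and data illustrative).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE aps_outcomes (
        patient_id   INTEGER PRIMARY KEY,
        age          INTEGER,
        analgesia    TEXT,      -- e.g. 'epidural', 'PCA'
        pain_score   REAL,      -- 0-10 numeric rating scale
        complication TEXT       -- NULL when none recorded
    )""")
conn.executemany(
    "INSERT INTO aps_outcomes VALUES (?, ?, ?, ?, ?)",
    [(1, 54, "epidural", 3.0, None), (2, 71, "PCA", 6.5, "nausea")],
)
# data-cleaning check: count records missing a pain score entirely
missing = conn.execute(
    "SELECT COUNT(*) FROM aps_outcomes WHERE pain_score IS NULL"
).fetchone()[0]
```

Queries like the last one are how a database supports the data-cleaning and completeness checks (the 65% collection-completion figure) mentioned in the abstract.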

  20. A round-robin gamma stereotactic radiosurgery dosimetry interinstitution comparison of calibration protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drzymala, R. E., E-mail: drzymala@wustl.edu; Alvarez, P. E.; Bednarz, G.

    2015-11-15

    Purpose: Absorbed dose calibration for gamma stereotactic radiosurgery is challenging due to the unique geometric conditions, dosimetry characteristics, and nonstandard field size of these devices. Members of the American Association of Physicists in Medicine (AAPM) Task Group 178 on Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance have participated in a round-robin exchange of calibrated measurement instrumentation and phantoms exploring two approved and two proposed calibration protocols or formalisms on ten gamma radiosurgery units. The objectives of this study were to benchmark and compare new formalisms to existing calibration methods, while maintaining traceability to U.S. primary dosimetry calibration laboratory standards. Methods: Nine institutions made measurements using ten gamma stereotactic radiosurgery units in three different 160 mm diameter spherical phantoms [acrylonitrile butadiene styrene (ABS) plastic, Solid Water, and liquid water] and in air using a positioning jig. Two calibrated miniature ionization chambers and one calibrated electrometer were circulated for all measurements. Reference dose-rates at the phantom center were determined using the well-established AAPM TG-21 or TG-51 dose calibration protocols and using two proposed dose calibration protocols/formalisms: an in-air protocol and a formalism proposed by the International Atomic Energy Agency (IAEA) working group for small and nonstandard radiation fields. Each institution’s results were normalized to the dose-rate determined at that institution using the TG-21 protocol in the ABS phantom. Results: Percentages of dose-rates within 1.5% of the reference dose-rate (TG-21 + ABS phantom) for the eight chamber-protocol-phantom combinations were the following: 88% for TG-21, 70% for TG-51, 93% for the new IAEA nonstandard-field formalism, and 65% for the new in-air protocol.
Averages and standard deviations for dose-rates over all measurements relative to the TG-21 + ABS dose-rate were 0.999 ± 0.009 (TG-21), 0.991 ± 0.013 (TG-51), 1.000 ± 0.009 (IAEA), and 1.009 ± 0.012 (in-air). There were no statistically significant differences (i.e., p > 0.05) between the two ionization chambers for the TG-21 protocol applied to all dosimetry phantoms. The mean results using the TG-51 protocol were notably lower than those for the other dosimetry protocols, with a standard deviation 2–3 times larger. The in-air protocol was not statistically different from TG-21 for the A16 chamber in the liquid water or ABS phantoms (p = 0.300 and p = 0.135) but was statistically different from TG-21 for the PTW chamber in all phantoms (p = 0.006 for Solid Water, 0.014 for liquid water, and 0.020 for ABS). Results of IAEA formalism were statistically different from TG-21 results only for the combination of the A16 chamber with the liquid water phantom (p = 0.017). In the latter case, dose-rates measured with the two protocols differed by only 0.4%. For other phantom-ionization-chamber combinations, the new IAEA formalism was not statistically different from TG-21. Conclusions: Although further investigation is needed to validate the new protocols for other ionization chambers, these results can serve as a reference to quantitatively compare different calibration protocols and ionization chambers if a particular method is chosen by a professional society to serve as a standardized calibration protocol.
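The normalization step described in Methods (each institution's dose-rates divided by its own TG-21 result in the ABS phantom, then averaged across measurements) amounts to a simple ratio-and-summary computation. A minimal sketch with illustrative numbers, not the study's data:

```python
import statistics

def normalize_to_reference(dose_rates, reference):
    """Normalize measured dose-rates to a reference dose-rate (dimensionless ratios)."""
    return [d / reference for d in dose_rates]

def summarize(ratios):
    """Mean and sample standard deviation of the normalized dose-rates."""
    return statistics.mean(ratios), statistics.stdev(ratios)

# Illustrative values only: the reference is this institution's own
# TG-21 result in the ABS phantom, in Gy/min.
reference = 2.500
tg51_measurements = [2.48, 2.47, 2.46]   # hypothetical repeated TG-51 readings
ratios = normalize_to_reference(tg51_measurements, reference)
mean_ratio, sd_ratio = summarize(ratios)
```

Reporting the mean ± SD of such ratios across all institutions is what yields summary figures like 0.991 ± 0.013 in the abstract.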

  1. A Constrained and Versioned Data Model for TEAM Data

    NASA Astrophysics Data System (ADS)

    Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.

    2009-04-01

    The objective of the Tropical Ecology Assessment and Monitoring Network (www.teamnetwork.org) is "To generate real time data for monitoring long-term trends in tropical biodiversity through a global network of TEAM sites (i.e. field stations in tropical forests), providing an early warning system on the status of biodiversity to effectively guide conservation action". To achieve this, the TEAM Network operates by collecting data via standardized protocols at TEAM Sites. The standardized TEAM protocols include the Climate, Vegetation and Terrestrial Vertebrate Protocols. Some sites also implement additional protocols. There are currently 7 TEAM Sites with plans to grow the network to 15 by June 30, 2009 and 50 TEAM Sites by the end of 2010. At each TEAM Site, data is gathered as defined by the protocols and according to a predefined sampling schedule. The TEAM data is organized and stored in a database based on the TEAM spatio-temporal data model. This data model is at the core of the TEAM Information System - it consumes and executes spatio-temporal queries, and analytical functions that are performed on TEAM data, and defines the object data types, relationships and operations that maintain database integrity. The TEAM data model contains object types including types for observation objects (e.g. bird, butterfly and trees), sampling unit, person, role, protocol, site and the relationship of these object types. Each observation data record is a set of attribute values of an observation object and is always associated with a sampling unit, an observation timestamp or time interval, a versioned protocol and data collectors. The operations on the TEAM data model can be classified as read operations, insert operations and update operations. 
The following are some typical operations: The operation get(site, protocol, [sampling unit block, sampling unit,] start time, end time) returns all data records using the specified protocol and collected at the specified site, block, sampling unit and time range. The operation insertSamplingUnit(sampling unit, site, protocol) saves a new sampling unit into the data model and links it with the site and protocol. The operation updateSamplingUnit(sampling_unit_id, attribute, value) changes the attribute (e.g. latitude or longitude) of the sampling unit to the specified value. The operation insertData(observation record, site, protocol, sampling unit, timestamps, data collectors) saves a new observation record into the database and associates it with the specified objects. The operation updateData(protocol, data_id, attribute, value) modifies the attribute of an existing observation record to the specified value. All insert or update operations require: 1) authorization, to ensure the user has the necessary privileges to perform the operation; 2) timestamp validation, to ensure the observation timestamps are in the designated time range specified in the sampling schedule; and 3) data validation, to check that the data records use correct taxonomy terms and data values. No authorization is performed for get operations, but under some specific conditions a username may be required for the purpose of authentication. Along with the validations above, the TEAM data model also supports human-based validation of observed data through the Data Review subsystem to ensure data quality. The data review is implemented by adding two attributes, review_tag and review_comment, to each observation data record. The attribute review_tag is used by a reviewer to specify the quality of the data, and the attribute review_comment is for reviewers to give more information when a problem is identified. 
The review_tag attribute can be populated either by the system conducting QA/QC tests or by pre-specified scientific experts. The following is the review operation, which is actually a special case of the operation updateData: The operation updateReview(protocol, data_id, judgment, comment) sets the attributes review_tag and review_comment to the specified values. By systematically tracking every step, the TEAM data model can roll back to any previous state. This is achieved by introducing a historical data container for each editable object type. When the operation updateData is applied to an object to modify its attribute, the object is tagged with the current timestamp and the name of the user who conducts the operation; the tagged object is then moved into the historical data container; and finally a new object is created with the new value for the specified attribute. The diagram illustrates the architecture of the TEAM data management system. A data collector can use the Data Ingestion subsystem to load new data records into the TEAM data model. The system establishes a first level of review (i.e., checks that minimum data standards are met via QA/QC tests). Further review is done by experts, who can verify and provide comments on data records through the Data Review subsystem. The data editor can then address data records based on the reviewers' comments. Users can use the Data Query and Download application to find data by site, protocol and time range. The Data Query and Download system packages the selected data together with the data license and important metadata into a single package and delivers it to the user.
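The versioning scheme described above (tag the old object with a timestamp and user, move it into a historical container, then create the new object) can be sketched as follows. Class and field names here are hypothetical illustrations, not the TEAM system's actual implementation:

```python
from datetime import datetime, timezone

class VersionedStore:
    """Minimal sketch of the versioning scheme: before an attribute is
    changed, the current object is tagged and moved into a historical
    container, so any previous state can be recovered."""

    def __init__(self):
        self.current = {}   # object_id -> latest record (a dict of attributes)
        self.history = {}   # object_id -> list of superseded, tagged records

    def insert_data(self, object_id, record):
        self.current[object_id] = dict(record)

    def update_data(self, object_id, attribute, value, user):
        old = self.current[object_id]
        # Tag the outgoing version with who changed it and when.
        tagged = dict(old, _edited_by=user,
                      _edited_at=datetime.now(timezone.utc).isoformat())
        self.history.setdefault(object_id, []).append(tagged)
        # The new current object carries the updated attribute value.
        self.current[object_id] = dict(old, **{attribute: value})

    def roll_back(self, object_id):
        """Restore the most recently superseded state."""
        tagged = self.history[object_id].pop()
        self.current[object_id] = {k: v for k, v in tagged.items()
                                   if not k.startswith("_")}
```

The updateReview operation above would then just be update_data applied to the review_tag and review_comment attributes.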

  2. Computer Science Research in Europe.

    DTIC Science & Technology

    1984-08-29

    …most attention: multi-databases and their structure, and (3) the dependencies between databases and multi-databases. Having completed a multi-database system for distributed data management, INRIA is now working on a real… communications requirements of distributed database systems, protocols for checking the… Distributed Systems, Newcastle University, UK: at the University of Newcastle… INRIA: a project called SIRIUS was established in 1977 at the…

  3. Image matching algorithms for breech face marks and firing pins in a database of spent cartridge cases of firearms.

    PubMed

    Geradts, Z J; Bijhold, J; Hermsen, R; Murtagh, F

    2001-06-01

    On the market, several systems exist for collecting spent ammunition data for forensic investigation. These databases store images of cartridge cases and the marks on them. Image matching is used to create hit lists that show which marks on a cartridge case are most similar to those on another cartridge case. The research in this paper is focused on the different methods of feature selection and pattern recognition that can be used to optimize the results of image matching. The images are acquired with side light for the breech face marks and with ring light for the firing pin impression. For these images, a standard way of digitizing the images is used. For the side light and ring light images, this means that the user has to position the cartridge case in the same position according to a protocol. The positioning is important for the side light, since the image that is obtained of a striation mark depends heavily on the angle of incidence of the light. In practice, it appears that the user positions the cartridge case with +/-10 degrees accuracy. We tested our algorithms using 49 cartridge cases from 19 different firearms, where the examiner determined that they were shot with the same firearm. For testing, these images were mixed with a database consisting of approximately 4900 images of different calibers that were available from the Drugfire database. In cases where the registration and the light conditions among the matching pairs were good, a simple computation of the standard deviation of the subtracted gray levels delivered the best-matched images. For images that were rotated and shifted, we have implemented a "brute force" method of registration. The images are translated and rotated until the minimum of the standard deviation of the difference is found. This method did not place all relevant matches in the top position. This is caused by the effect that shadows and highlights are compared in intensity. 
Since the angle of incidence of the light gives a different intensity profile, this method is not optimal. For this reason, preprocessing of the images was required. It appeared that the third scale of the "à trous" wavelet transform gives the best results in combination with brute force. Matching the contents of the images is less sensitive to the variation of the lighting. The problem with the brute force method, however, is that comparing 49 cartridge cases among themselves takes over 1 month of computing time on a 333 MHz Pentium II computer. For this reason, a faster approach was implemented: correlation in log-polar coordinates. This gave results similar to the brute force calculation, but it was computed in 24 h for a complete database of 4900 images. A fast pre-selection method based on signatures was carried out, based on the Kanade-Lucas-Tomasi (KLT) equation. The positions of the points computed with this method are compared. In this way, 11 of the 49 images were in the top position in combination with the third scale of the à trous wavelet transform. It depends, however, on the light conditions and the prominence of the marks whether correct matches are found in the top-ranked position. All images were retrieved in the top 5% of the database. This method takes only a few minutes for the complete database, and can be optimized for comparison in seconds if the locations of the points are stored in files. For further improvement, it is useful to have a refinement in which the user selects the areas on the cartridge case that are relevant for their marks. This is necessary if the cartridge case is damaged and other marks that are not from the firearm appear on it.
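The brute-force registration idea above (move the query image until the standard deviation of the difference image is minimal) can be sketched as follows. This toy version searches translations only; rotation would add a third loop, which is exactly the cost the log-polar correlation variant avoids:

```python
import numpy as np

def registration_score(a, b):
    """Standard deviation of the grey-level difference image:
    the matching score used in the brute-force approach (lower = better)."""
    return float(np.std(a.astype(float) - b.astype(float)))

def brute_force_translate(reference, query, max_shift=5):
    """Slide `query` over +/- max_shift pixels in y and x and return the
    (dy, dx) shift minimising the score, plus the score itself.
    Uses wrap-around shifts (np.roll) for simplicity."""
    best_shift, best_score = None, np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(query, dy, axis=0), dx, axis=1)
            s = registration_score(reference, shifted)
            if s < best_score:
                best_shift, best_score = (dy, dx), s
    return best_shift, best_score
```

Searching an 11 x 11 grid of shifts already means 121 full-image comparisons per pair, which illustrates why the exhaustive translate-and-rotate search reported above was so slow on 1990s hardware.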

  4. IDD Info: a software to manage surveillance data of Iodine Deficiency Disorders.

    PubMed

    Liu, Peng; Teng, Bai-Jun; Zhang, Shu-Bin; Su, Xiao-Hui; Yu, Jun; Liu, Shou-Jun

    2011-08-01

    IDD Info, a new software package for managing survey data on Iodine Deficiency Disorders (IDD), is presented in this paper. IDD Info creates IDD project databases, processes and analyzes national or regional surveillance data, and produces a final report. It provides a series of functions for choosing a database from existing ones, revising it, selecting indicators from a pool to establish a database, and adding indicators to the pool. It also provides simple tools to scan a database and compare two databases, to set IDD standard parameters, to analyze data by single or multiple indicators, and finally to produce a typeset report with customized content. IDD Info was developed using Chinese national IDD surveillance data from 2005. Its validity was evaluated by comparison with the survey report produced by the China CDC. IDD Info is a professional analysis tool that speeds up IDD data analysis by about 14.28% relative to standard reference routines; it consequently enhances analysis performance and user compliance. IDD Info is a practical and accurate means of managing the multifarious IDD surveillance data and can be widely used by non-statisticians in national and regional IDD surveillance. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  5. The Clinical Urine Culture: Enhanced Techniques Improve Detection of Clinically Relevant Microorganisms

    PubMed Central

    Price, Travis K.; Dune, Tanaka; Hilt, Evann E.; Thomas-White, Krystal J.; Kliethermes, Stephanie; Brincat, Cynthia; Brubaker, Linda; Wolfe, Alan J.

    2016-01-01

    Enhanced quantitative urine culture (EQUC) detects live microorganisms in the vast majority of urine specimens reported as “no growth” by the standard urine culture protocol. Here, we evaluated an expanded set of EQUC conditions (expanded-spectrum EQUC) to identify an optimal version that provides a more complete description of uropathogens in women experiencing urinary tract infection (UTI)-like symptoms. One hundred fifty adult urogynecology patient-participants were characterized using a self-completed validated UTI symptom assessment (UTISA) questionnaire and asked “Do you feel you have a UTI?” Women responding negatively were recruited into the no-UTI cohort, while women responding affirmatively were recruited into the UTI cohort; the latter cohort was reassessed with the UTISA questionnaire 3 to 7 days later. Baseline catheterized urine samples were plated using both standard urine culture and expanded-spectrum EQUC protocols: standard urine culture inoculated at 1 μl onto 2 agars incubated aerobically; expanded-spectrum EQUC inoculated at three different volumes of urine onto 7 combinations of agars and environments. Compared to expanded-spectrum EQUC, standard urine culture missed 67% of uropathogens overall and 50% in participants with severe urinary symptoms. Thirty-six percent of participants with missed uropathogens reported no symptom resolution after treatment by standard urine culture results. Optimal detection of uropathogens could be achieved using the following: 100 μl of urine plated onto blood (blood agar plate [BAP]), colistin-nalidixic acid (CNA), and MacConkey agars in 5% CO2 for 48 h. This streamlined EQUC protocol achieved 84% uropathogen detection relative to 33% detection by standard urine culture. 
The streamlined EQUC protocol improves detection of uropathogens that are likely relevant for symptomatic women, giving clinicians the opportunity to receive additional information not currently reported using standard urine culture techniques. PMID:26962083

  6. Keynote Address: ACR-NEMA standards and their implications for teleradiology

    NASA Astrophysics Data System (ADS)

    Horii, Steven C.

    1990-06-01

    The ACR-NEMA Standard was developed initially as an interface standard for the interconnection of two pieces of imaging equipment. Essentially, the Standard defines a point-to-point hardware connection with the necessary protocol and data structure so that two differing devices which meet the specification will be able to communicate with each other. The Standard does not define a particular PACS architecture, nor does it specify a database structure. In part, these are the reasons why implementers have had difficulty in using the Standard in a full PACS. Recent activity of the Working Groups formed by the Committee overseeing work on the ACR-NEMA Standard has changed some of the "flavor" of the Standard. It was realized that connection of PACS with hospital and radiology information systems (HIS and RIS) is necessary if a PACS is ever to be successful. The idea of interconnecting heterogeneous computer systems has pushed Standards development beyond the scope of the original work. Teleradiology, which inherently involves wide-area networking, may be a direct beneficiary of the new directions taken by the Standards Working Groups. This paper will give a brief history of the ACR-NEMA effort, describe the "parent" Standard and its "offspring", and describe the activity of the current Working Groups, with particular emphasis on the potential impacts on teleradiology.

  7. The Biomarker Knowledge System Informatics Pilot Project Supplement To The Biomarker Development Laboratory at Moffitt (Bedlam) — EDRN Public Portal

    Cancer.gov

    The goal of the Biomarker Knowledge System Informatics Pilot Project is to develop network interfaces among databases that contain information about existing clinical populations and biospecimens, together with data relating to those specimens that are important in biomarker assay validation. This protocol is one of two that will comprise the Moffitt participation in the Biomarker Knowledge System Informatics Pilot Project. THIS PROTOCOL (58) is the Sput-Epi Database.

  8. CT and MR Protocol Standardization Across a Large Health System: Providing a Consistent Radiologist, Patient, and Referring Provider Experience.

    PubMed

    Sachs, Peter B; Hunt, Kelly; Mansoubi, Fabien; Borgstede, James

    2017-02-01

    Building and maintaining a comprehensive yet simple set of standardized protocols for cross-sectional imaging can be a daunting task. A single department may have difficulty preventing "protocol creep," which almost inevitably occurs when an organized "playbook" of protocols does not exist and individual radiologists and technologists alter protocols at will and on a case-by-case basis. When multiple departments or groups function in a large health system, the lack of uniformity of protocols can increase exponentially. In 2012, the University of Colorado Hospital formed a large health system (UCHealth) and became a 5-hospital provider network. CT and MR imaging studies are conducted at multiple locations by different radiology groups. To facilitate consistency in the ordering, acquisition, and appearance of a given study, regardless of location, we minimized the number of protocols across all scanners and sites of practice with a clinical indication-driven protocol selection and standardization process. Here we review the steps utilized to perform this process improvement task and ensure its stability over time. Actions included creation of a standardized protocol template, which allowed for changes in the electronic storage and management of protocols; design of a change request form; and formation of a governance structure. We utilized rapid improvement events (1 day for CT, 2 days for MR) and reduced 248 CT protocols to 97 standardized protocols and 168 MR protocols to 66. Additional steps are underway to further standardize the output and reporting of imaging interpretation. This will result in an improved, consistent radiologist, patient, and provider experience across the system.

  9. Cryptography in the Bounded-Quantum-Storage Model

    NASA Astrophysics Data System (ADS)

    Schaffner, Christian

    2007-09-01

    This thesis initiates the study of cryptographic protocols in the bounded-quantum-storage model. On the practical side, simple protocols for Rabin Oblivious Transfer, 1-2 Oblivious Transfer and Bit Commitment are presented. No quantum memory is required for honest players, whereas the protocols can only be broken by an adversary controlling a large amount of quantum memory. The protocols are efficient, non-interactive and can be implemented with today's technology. On the theoretical side, new entropic uncertainty relations involving min-entropy are established and used to prove the security of protocols according to new strong security definitions. For instance, in the realistic setting of Quantum Key Distribution (QKD) against quantum-memory-bounded eavesdroppers, the uncertainty relation allows one to prove the security of QKD protocols while tolerating considerably higher error rates compared to the standard model with unbounded adversaries.

  10. The macro-economic determinants of health and health inequalities-umbrella review protocol.

    PubMed

    Naik, Yannish; Baker, Peter; Walker, Ian; Tillmann, Taavi; Bash, Kristin; Quantz, Darryl; Hillier-Brown, Frances; Bambra, Clare

    2017-11-03

    The economic determinants of health have been widely recognised as crucial factors affecting health; however, to date, no comprehensive review has been undertaken to summarise these factors and the ways in which they can influence health. We conceptualise the economy as a complex system made up of underlying approaches, regulation from institutions, markets, finance, labour, the public-private balance as well as production and distributional effects, which collectively impact on health through the effect of moderators. This protocol details the methods for an umbrella review to explore the macro-economic factors, strategies, policies and interventions that affect health outcomes and health inequalities. We will identify relevant systematic reviews using search terms derived from the Journal of Economic Literature classification. Reviews will be included if they meet the Database of Abstracts and Reviews of Effects criteria for systematic reviews. Reviews of studies with and without controls will be included; both association and intervention studies will be included. Primary outcomes will include but are not limited to morbidity, mortality, prevalence and incidence of conditions and life expectancy. Secondary outcomes will include health inequalities by gender, ethnicity or socio-economic status. Six databases will be searched using tailored versions of our piloted search strategy to locate relevant reviews. Data will be extracted using a standardized pro forma, and the findings will be synthesized into a conceptual framework to address our review aim. Our umbrella review protocol provides a robust method to systematically appraise the evidence in this field, using new conceptual models derived specifically to address the study question. This will yield important information for policymakers, practitioners and researchers at the local, national and international level. 
It will also help set the future research agenda in this field and guide the development of interventions. This umbrella review protocol has been registered with PROSPERO (CRD42017068357).

  11. EMPReSS: European mouse phenotyping resource for standardized screens.

    PubMed

    Green, Eain C J; Gkoutos, Georgios V; Lad, Heena V; Blake, Andrew; Weekes, Joseph; Hancock, John M

    2005-06-15

    Standardized phenotyping protocols are essential for the characterization of phenotypes, so that results are comparable between different laboratories and phenotypic data can be related to ontological descriptions in an automated manner. We describe a web-based resource for the visualization, searching and downloading of standard operating procedures and other documents: the European Mouse Phenotyping Resource for Standardized Screens (EMPReSS). Direct access: http://www.empress.har.mrc.ac.uk. Contact: e.green@har.mrc.ac.uk.

  12. Influence of basis images and skull position on evaluation of cortical bone thickness in cone beam computed tomography.

    PubMed

    Nascimento, Monikelly do Carmo Chagas; Boscolo, Solange Maria de Almeida; Haiter-Neto, Francisco; Santos, Emanuela Carla Dos; Lambrichts, Ivo; Pauwels, Ruben; Jacobs, Reinhilde

    2017-06-01

    The aim of this study was to assess the influence of the number of basis images and the orientation of the skull on the evaluation of cortical alveolar bone in cone beam computed tomography (CBCT). Eleven skulls with a total of 59 anterior teeth were selected. CBCT images were acquired by using 4 protocols, by varying the rotation of the tube-detector arm and the orientation of the skull (protocol 1: 360°/0°; protocol 2: 180°/0°; protocol 3: 180°/90°; protocol 4: 180°/180°). Observers evaluated cortical bone as absent, thin, or thick. Direct observation of the skulls was used as the gold standard. Intra- and interobserver agreement, as well as agreement of scoring between the 3 bone thickness classifications, were calculated by using the κ statistic. The Wilcoxon signed-rank test was used to compare the 4 protocols. For lingual cortical bone, protocol 1 showed no statistical difference from the gold standard. Higher reliability was found in protocol 3 for absent (κ = 0.80) and thin (κ = 0.47) cortices, whereas for thick cortical bone, protocol 2 was more consistent (κ = 0.60). In buccal cortical bone, protocol 1 obtained the highest agreement for absent cortices (κ = 0.61), whereas protocol 4 was better for thin cortical plates (κ = 0.38) and protocol 2 for thick cortical plates (κ = 0.40). No consistent effect of the number of basis images or head orientation for visual detection of alveolar bone was detected, except for lingual cortical bone, for which full rotation scanning showed improved visualization. Copyright © 2017 Elsevier Inc. All rights reserved.
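The agreement figures in this record are κ values. A minimal sketch of Cohen's kappa for two raters scoring the same items (e.g. cortical bone rated absent/thin/thick); this is the generic two-rater formula, not necessarily the exact variant used in the paper:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters.
    kappa = (observed agreement - expected agreement) / (1 - expected)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Expected agreement if the two raters scored independently.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)
```

Values near 0.80 (as for absent lingual cortices under protocol 3) indicate substantial agreement; values near 0.40 indicate only moderate agreement.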

  13. Network protocols for real-time applications

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1987-01-01

    The Fiber Distributed Data Interface (FDDI) and the SAE AE-9B High Speed Ring Bus (HSRB) are emerging standards for high-performance token ring local area networks. FDDI was designed to be a general-purpose high-performance network. HSRB was designed specifically for military real-time applications. A workshop was conducted at NASA Ames Research Center in January, 1987 to compare and contrast these protocols with respect to their ability to support real-time applications. This report summarizes workshop presentations and includes an independent comparison of the two protocols. A conclusion reached at the workshop was that current protocols for the upper layers of the Open Systems Interconnection (OSI) network model are inadequate for real-time applications.

  14. A national database for essential drugs in South Africa.

    PubMed

    Zweygarth, M; Summers, R S

    2000-06-01

    In the process of drafting standard treatment guidelines for adults and children at hospital level, the Secretariat of the National Essential Drugs List Committee made use of a database designed with technical support from the School of Pharmacy, MEDUNSA. The database links the current 697 drugs on the Essential Drugs List with Standard Treatment Guidelines for over 400 conditions. It served to streamline the inclusion of different drugs and dosage forms in the various guidelines, and provided concise, updated information to other departments involved in drug procurement. From information on drug prices and morbidity, it can also be used to calculate drug consumption and cost estimates and compare them with actual figures.
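The consumption and cost estimation mentioned at the end amounts to joining morbidity figures with the standard treatment guidelines and drug prices. A sketch of that calculation; all field names here are illustrative, not the actual database schema:

```python
def estimated_cost(morbidity, guidelines, prices):
    """Estimate drug expenditure from morbidity data:
    cases per condition x drug quantity per standard treatment x unit price.

    morbidity:  condition -> expected number of cases
    guidelines: condition -> {drug: units per standard treatment}
    prices:     drug -> price per unit
    """
    total = 0.0
    for condition, cases in morbidity.items():
        for drug, quantity in guidelines.get(condition, {}).items():
            total += cases * quantity * prices[drug]
    return total
```

Comparing such estimates against actual procurement figures is what lets the Committee spot over- or under-consumption, as the abstract describes.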

  15. Collaborative SDOCT Segmentation and Analysis Software.

    PubMed

    Yun, Yeyi; Carass, Aaron; Lang, Andrew; Prince, Jerry L; Antony, Bhavna J

    2017-02-01

    Spectral domain optical coherence tomography (SDOCT) is routinely used in the management and diagnosis of a variety of ocular diseases. This imaging modality also finds widespread use in research, where quantitative measurements obtained from the images are used to track disease progression. In recent years, the number of available scanners and imaging protocols has grown, and there is a distinct absence of a unified tool that is capable of visualizing, segmenting, and analyzing the data. This is especially noteworthy in longitudinal studies, where data from older scanners and/or protocols may need to be analyzed. Here, we present a graphical user interface (GUI) that allows users to visualize and analyze SDOCT images obtained from two commonly used scanners. The retinal surfaces in the scans can be segmented using a previously described method, and the retinal layer thicknesses can be compared to a normative database. If necessary, the segmented surfaces can also be corrected and the changes applied. The interface also allows users to import and export retinal layer thickness data to an SQL database, thereby allowing for the collation of data from a number of collaborating sites.
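The SQL import/export step could look roughly like this with SQLite. The schema and column names are hypothetical, not the tool's actual database layout:

```python
import sqlite3

def export_thicknesses(db_path, rows):
    """Append layer-thickness measurements to a shared SQL database.
    rows: iterable of (subject_id, scan_date, layer, thickness_um)."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS layer_thickness (
                       subject_id   TEXT,
                       scan_date    TEXT,
                       layer        TEXT,
                       thickness_um REAL)""")
    con.executemany("INSERT INTO layer_thickness VALUES (?, ?, ?, ?)", rows)
    con.commit()
    return con

def mean_thickness(con, layer):
    """Cohort mean thickness for one retinal layer, e.g. for comparison
    against a normative database."""
    (value,) = con.execute(
        "SELECT AVG(thickness_um) FROM layer_thickness WHERE layer = ?",
        (layer,)).fetchone()
    return value
```

A shared table of this shape is enough for collaborating sites to pool measurements regardless of which scanner produced them.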

  16. Low-dose ionizing radiation increases the mortality risk of solid cancers in nuclear industry workers: A meta-analysis.

    PubMed

    Qu, Shu-Gen; Gao, Jin; Tang, Bo; Yu, Bo; Shen, Yue-Ping; Tu, Yu

    2018-05-01

    Low-dose ionizing radiation (LDIR) may increase the mortality of solid cancers in nuclear industry workers, but only a few individual cohort studies exist, and the available reports have low statistical power. The aim of the present study was to assess solid cancer mortality risk from LDIR in the nuclear industry using standardized mortality ratios (SMRs) and 95% confidence intervals. A systematic literature search of the PubMed and Embase databases identified 27 studies relevant to this meta-analysis. There was statistical significance for total, solid and lung cancers, with meta-SMR values of 0.88, 0.80, and 0.89, respectively. There was evidence of stochastic effects of IR, but more definitive conclusions require additional analyses using standardized protocols to determine whether LDIR increases the risk of solid cancer-related mortality.
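A common way to pool SMRs across studies, consistent with (though not necessarily identical to) the meta-analysis described, is inverse-variance weighting on the log scale, using the approximation var(ln SMR) ≈ 1/observed deaths:

```python
import math

def pooled_smr(observed_expected):
    """Fixed-effect inverse-variance pooling of standardized mortality
    ratios. A sketch of the general approach, not the paper's exact model.

    observed_expected: iterable of (observed deaths, expected deaths)."""
    weights, weighted_sum = 0.0, 0.0
    for obs, exp in observed_expected:
        log_smr = math.log(obs / exp)
        w = obs                       # w = 1 / var(ln SMR) ~= observed deaths
        weights += w
        weighted_sum += w * log_smr
    pooled = math.exp(weighted_sum / weights)
    se = math.sqrt(1.0 / weights)
    ci = (pooled * math.exp(-1.96 * se), pooled * math.exp(1.96 * se))
    return pooled, ci
```

A random-effects model would additionally add a between-study variance term to each weight; with 27 heterogeneous cohorts that is often the more defensible choice.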

  17. The EPIC nutrient database project (ENDB): a first attempt to standardize nutrient databases across the 10 European countries participating in the EPIC study.

    PubMed

    Slimani, N; Deharveng, G; Unwin, I; Southgate, D A T; Vignat, J; Skeie, G; Salvini, S; Parpinel, M; Møller, A; Ireland, J; Becker, W; Farran, A; Westenbrink, S; Vasilopoulou, E; Unwin, J; Borgejordet, A; Rohrmann, S; Church, S; Gnagnarella, P; Casagrande, C; van Bakel, M; Niravong, M; Boutron-Ruault, M C; Stripp, C; Tjønneland, A; Trichopoulou, A; Georga, K; Nilsson, S; Mattisson, I; Ray, J; Boeing, H; Ocké, M; Peeters, P H M; Jakszyn, P; Amiano, P; Engeset, D; Lund, E; de Magistris, M Santucci; Sacerdote, C; Welch, A; Bingham, S; Subar, A F; Riboli, E

    2007-09-01

    This paper describes the ad hoc methodological concepts and procedures developed to improve the comparability of Nutrient databases (NDBs) across the 10 European countries participating in the European Prospective Investigation into Cancer and Nutrition (EPIC). This was required because there is currently no European reference NDB available. A large network involving national compilers, nutritionists and experts on food chemistry and computer science was set up for the 'EPIC Nutrient DataBase' (ENDB) project. A total of 550-1500 foods derived from about 37,000 standardized EPIC 24-h dietary recalls (24-HDRS) were matched as closely as possible to foods available in the 10 national NDBs. The resulting national data sets (NDS) were then successively documented, standardized and evaluated according to common guidelines and using a DataBase Management System specifically designed for this project. The nutrient values of foods unavailable or not readily available in NDSs were approximated by recipe calculation, weighted averaging or adjustment for weight changes and vitamin/mineral losses, using common algorithms. The final ENDB contains about 550-1500 foods depending on the country and 26 common components. Each component value was documented and standardized for unit, mode of expression, definition and chemical method of analysis, as far as possible. Furthermore, the overall completeness of NDSs was improved (>or=99%), particularly for beta-carotene and vitamin E. The ENDB constitutes a first real attempt to improve the comparability of NDBs across European countries. This methodological work will provide a useful tool for nutritional research as well as end-user recommendations to improve NDBs in the future.
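The recipe-calculation approximation mentioned above (sum ingredient contributions, then adjust for cooking weight changes and vitamin/mineral losses) can be sketched as follows. This is an illustrative simplification under stated assumptions, not ENDB's actual algorithm:

```python
def recipe_nutrient(ingredients, yield_factor=1.0, retention=1.0):
    """Approximate a nutrient value (per 100 g) for a food missing from a
    national database by calculating it from its recipe.

    ingredients:  list of (grams of ingredient, nutrient per 100 g)
    yield_factor: cooked weight / raw weight (weight change on cooking)
    retention:    fraction of the nutrient surviving preparation
                  (vitamin/mineral losses)."""
    total_g = sum(g for g, _ in ingredients)
    nutrient = sum(g * per100 / 100.0 for g, per100 in ingredients)
    cooked_g = total_g * yield_factor
    return 100.0 * nutrient * retention / cooked_g
```

For nutrients stable to cooking (retention = 1) and recipes with no weight change, this reduces to a plain weighted average of the ingredient values, the other approximation method the abstract names.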

  18. A Benchmark and Comparative Study of Video-Based Face Recognition on COX Face Database.

    PubMed

    Huang, Zhiwu; Shan, Shiguang; Wang, Ruiping; Zhang, Haihong; Lao, Shihong; Kuerban, Alifu; Chen, Xilin

    2015-12-01

    Face recognition with still face images has been widely studied, while research on video-based face recognition is relatively inadequate, especially in terms of benchmark datasets and comparisons. Real-world video-based face recognition applications require techniques for three distinct scenarios: 1) Video-to-Still (V2S); 2) Still-to-Video (S2V); and 3) Video-to-Video (V2V), respectively taking video or still image as query or target. To the best of our knowledge, few datasets and evaluation protocols have been benchmarked for all three scenarios. In order to facilitate the study of this specific topic, this paper contributes a benchmarking and comparative study based on a newly collected still/video face database, named COX Face DB. Specifically, we make three contributions. First, we collect and release a large-scale still/video face database to simulate video surveillance with three different video-based face recognition scenarios (i.e., V2S, S2V, and V2V). Second, for benchmarking the three scenarios designed on our database, we review and experimentally compare a number of existing set-based methods. Third, we further propose a novel Point-to-Set Correlation Learning (PSCL) method, and experimentally show that it can be used as a promising baseline method for V2S/S2V face recognition on COX Face DB. Extensive experimental results clearly demonstrate that video-based face recognition needs more effort, and our COX Face DB is a good benchmark database for evaluation.

  19. Built to last? The sustainability of health system improvements, interventions and change strategies: a study protocol for a systematic review

    PubMed Central

    Braithwaite, Jeffrey; Testa, Luke; Lamprell, Gina; Herkes, Jessica; Ludlow, Kristiana; McPherson, Elise; Campbell, Margie; Holt, Joanna

    2017-01-01

    Introduction The sustainability of healthcare interventions and change programmes is of increasing importance to researchers and healthcare stakeholders interested in creating sustainable health systems to cope with mounting stressors. The aim of this protocol is to extend earlier work and describe a systematic review to identify, synthesise and draw meaning from studies published within the last 5 years that measure the sustainability of interventions, improvement efforts and change strategies in the health system. Methods and analysis The protocol outlines a method by which to execute a rigorous systematic review. The design includes applying primary and secondary data collection techniques, consisting of a comprehensive database search complemented by contact with experts, and searching secondary databases and reference lists, using snowballing techniques. The review and analysis process will occur via an abstract review followed by a full-text screening process. The inclusion criteria include English-language, peer-reviewed, primary, empirical research articles published after 2011 in scholarly journals, for which the full text is available. No restrictions on location will be applied. The review that results from this protocol will synthesise and compare characteristics of the included studies. Ultimately, it is intended that this will help make it easier to identify and design sustainable interventions, improvement efforts and change strategies. Ethics and dissemination As no primary data were collected, ethical approval was not required. Results will be disseminated in conference presentations, peer-reviewed publications and among policymaker bodies interested in creating sustainable health systems. PMID:29133332

  20. Low Dose MDCT with Tube Current Modulation: Role in Detection of Urolithiasis and Patient Effective Dose Reduction

    PubMed Central

    Kakkar, Chandan; Sripathi, Smiti; Parakh, Anushri; Shrivastav, Rajendra

    2016-01-01

    Introduction Urolithiasis is one of the major recurring problems in young individuals, and CT is the most commonly used diagnostic modality. Because these patients are young and stone formation is a recurring process, one of the simplest ways to reduce the radiation dose would be low-dose CT with tube current modulation. Aim The aim of this study was to compare the sensitivity and specificity of a low-dose (70 mAs) protocol with a standard-dose (250 mAs) protocol in detecting urolithiasis and to define the tube current and mean effective patient dose of these protocols. Materials and Methods A prospective study was conducted in 200 patients presenting with acute flank pain over a period of 2 years. CT was performed in 100 cases with the standard-dose protocol and in another 100 with the low-dose protocol using tube current modulation. Sensitivity and specificity for calculus detection and the percentage reduction of dose and tube current with the low-dose protocol were calculated. Results Urolithiasis was detected in 138 patients; 67 were examined by the high-dose and 71 by the low-dose protocol. Sensitivity and specificity of the low-dose protocol were 97.1% and 96.4%, with similar results in high-BMI patients. Tube current modulation reduced the effective tube current by 12.17%. The mean effective patient dose was 10.33 mSv for the standard-dose protocol versus 2.92 mSv for the low-dose protocol, a 51.13-53.8% reduction with the low-dose protocol. Conclusion The study has reinforced that low-dose CT with tube current modulation is appropriate for the diagnosis of urolithiasis, with a significant reduction in tube current and patient effective dose. PMID:27437322

  1. Low Dose MDCT with Tube Current Modulation: Role in Detection of Urolithiasis and Patient Effective Dose Reduction.

    PubMed

    Koteshwar, Prakashini; Kakkar, Chandan; Sripathi, Smiti; Parakh, Anushri; Shrivastav, Rajendra

    2016-05-01

    Urolithiasis is one of the major recurring problems in young individuals, and CT is the most commonly used diagnostic modality. Because these patients are young and stone formation is a recurring process, one of the simplest ways to reduce the radiation dose would be low-dose CT with tube current modulation. The aim of this study was to compare the sensitivity and specificity of a low-dose (70 mAs) protocol with a standard-dose (250 mAs) protocol in detecting urolithiasis and to define the tube current and mean effective patient dose of these protocols. A prospective study was conducted in 200 patients presenting with acute flank pain over a period of 2 years. CT was performed in 100 cases with the standard-dose protocol and in another 100 with the low-dose protocol using tube current modulation. Sensitivity and specificity for calculus detection and the percentage reduction of dose and tube current with the low-dose protocol were calculated. Urolithiasis was detected in 138 patients; 67 were examined by the high-dose and 71 by the low-dose protocol. Sensitivity and specificity of the low-dose protocol were 97.1% and 96.4%, with similar results in high-BMI patients. Tube current modulation reduced the effective tube current by 12.17%. The mean effective patient dose was 10.33 mSv for the standard-dose protocol versus 2.92 mSv for the low-dose protocol, a 51.13-53.8% reduction with the low-dose protocol. The study has reinforced that low-dose CT with tube current modulation is appropriate for the diagnosis of urolithiasis, with a significant reduction in tube current and patient effective dose.

  2. Strategies for Optimal MAC Parameters Tuning in IEEE 802.15.6 Wearable Wireless Sensor Networks.

    PubMed

    Alam, Muhammad Mahtab; Ben Hamida, Elyes

    2015-09-01

    Wireless body area networks (WBANs) have contributed immensely to revolutionizing the classical health-care system. Recently, a number of WBAN applications have emerged that push the limits of existing solutions. In particular, the IEEE 802.15.6 standard has provided great flexibility, provisions and capabilities to deal with emerging applications. In this paper, we investigate application-specific throughput by fine-tuning the physical (PHY) and medium access control (MAC) parameters of the IEEE 802.15.6 standard. Based on PHY characterizations in narrowband, at the MAC layer, the carrier sense multiple access with collision avoidance (CSMA/CA) and scheduled access protocols are extensively analyzed. It is concluded that the IEEE 802.15.6 standard can satisfy the throughput requirements of most WBAN applications, achieving a maximum of 680 kbps. However, for emerging applications that require high-quality audio or video transmission, the standard is not able to meet their constraints. Moreover, delay, energy efficiency and successful packet reception are considered as key performance metrics for comparing the MAC protocols. The CSMA/CA protocol provides the best results for meeting the delay constraints of medical and non-medical WBAN applications, whereas the scheduled access approach performs very well in both energy efficiency and packet reception ratio.
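
    The kind of throughput estimate the paper performs can be illustrated with a toy payload-efficiency calculation; the overhead and inter-frame spacing figures below are placeholders, not IEEE 802.15.6 constants.

```python
# Hedged sketch: effective MAC-layer throughput as useful payload bits
# divided by total time on air. Overhead figures are assumptions chosen
# for illustration, not values from the 802.15.6 standard.

def effective_throughput(phy_rate_kbps, payload_bytes, overhead_bytes, ifs_us=0.0):
    frame_bytes = payload_bytes + overhead_bytes
    tx_time_us = frame_bytes * 8 / phy_rate_kbps * 1000  # time to transmit frame
    total_us = tx_time_us + ifs_us                       # plus inter-frame spacing
    return payload_bytes * 8 / total_us * 1000           # kbps of useful data

# e.g. a 971.4 kbps narrowband PHY carrying 255-byte payloads, with an
# assumed 58 bytes of header/ACK overhead and 150 us of spacing per frame
rate = effective_throughput(971.4, 255, 58, ifs_us=150)
```

    The effective rate is always below the PHY rate; tuning payload size against fixed overhead is exactly the kind of parameter sweep the paper describes.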

  3. Challenges to Global Implementation of Infrared Thermography Technology: Current Perspective

    PubMed Central

    Shterenshis, Michael

    2017-01-01

    Medical infrared thermography (IT) produces an image of the infrared waves emitted by the human body as part of the thermoregulation process that can vary in intensity based on the health of the person. This review analyzes recent developments in the use of infrared thermography as a screening and diagnostic tool in clinical and nonclinical settings, and identifies possible future routes for improvement of the method. Currently, infrared thermography is not considered to be a fully reliable diagnostic method. If a standard infrared protocol is established and a normative database is available, infrared thermography may become a reliable method for detecting inflammatory processes. PMID:29138741

  4. Challenges to Global Implementation of Infrared Thermography Technology: Current Perspective.

    PubMed

    Shterenshis, Michael

    2017-01-01

    Medical infrared thermography (IT) produces an image of the infrared waves emitted by the human body as part of the thermoregulation process that can vary in intensity based on the health of the person. This review analyzes recent developments in the use of infrared thermography as a screening and diagnostic tool in clinical and nonclinical settings, and identifies possible future routes for improvement of the method. Currently, infrared thermography is not considered to be a fully reliable diagnostic method. If a standard infrared protocol is established and a normative database is available, infrared thermography may become a reliable method for detecting inflammatory processes.

  5. UPM: unified policy-based network management

    NASA Astrophysics Data System (ADS)

    Law, Eddie; Saxena, Achint

    2001-07-01

    Besides providing network management for the Internet, it has become essential to offer different levels of Quality of Service (QoS) to users. Policy-based management provides control over network routers to achieve this goal. The Internet Engineering Task Force (IETF) has proposed a two-tier architecture whose implementation is based on the Common Open Policy Service (COPS) protocol and the Lightweight Directory Access Protocol (LDAP). However, there are several limitations to this design, such as scalability and cross-vendor hardware compatibility. To address these issues, we present a functionally enhanced multi-tier policy management architecture in this paper. Several extensions are introduced, thereby adding flexibility and scalability. In particular, an intermediate entity between the policy server and the policy rule database, called the Policy Enforcement Agent (PEA), is introduced. By keeping internal data in a common format, using a standard protocol, and interpreting and translating request and decision messages from multi-vendor hardware, this agent allows a dynamic Unified Information Model throughout the architecture. We have tailored this information system to save policy rules in the directory server and to allow execution of policy rules with dynamic addition of new equipment at run-time.

  6. Protocol standards and implementation within the digital engineering laboratory computer network (DELNET) using the universal network interface device (UNID). Part 2

    NASA Astrophysics Data System (ADS)

    Phister, P. W., Jr.

    1983-12-01

    Development of the Air Force Institute of Technology's Digital Engineering Laboratory Network (DELNET) was continued with an initial draft of a protocol standard for all seven layers specified by the International Organization for Standardization's (ISO) Reference Model for Open Systems Interconnection. This effort centered on restructuring the Network Layer to perform datagram routing and conform to the developed protocol standards, and on actual software module development of the upper four protocol layers residing within the DELNET Monitor (Zilog MCZ 1/25 Computer System). Within the guidelines of the ISO Reference Model, the Transport Layer was developed utilizing the Internet Header Format (IHF) combined with the Transmission Control Protocol (TCP) to create a 128-byte datagram. Also, a limited Application Layer was created to pass the Gettysburg Address through the DELNET. This study formulated a first draft of the DELNET Protocol Standard and designed, implemented, and tested the Network, Transport, and Application Layers to conform to these protocol standards.

  7. Extracorporeal shock wave therapy as an adjunct wound treatment: a systematic review of the literature.

    PubMed

    Dymarek, Robert; Halski, Tomasz; Ptaszkowski, Kuba; Slupska, Lucyna; Rosinczuk, Joanna; Taradaj, Jakub

    2014-07-01

    Standard care procedures for complex wounds are sometimes supported and reinforced by physical treatment modalities such as extracorporeal shock wave therapy (ESWT). To evaluate the available evidence of ESWT effectiveness in humans, a systematic review of the literature was conducted using the MEDLINE, PubMed, Scopus, EBSCOhost, and PEDro databases. Of the 393 articles found, 13 met the publication date (2000-2013), study type (clinical study), language (English only), and abstract availability criteria. The 13 studies (n = 919 patients with wounds of varying etiologies) included seven randomized controlled trials that were evaluated using Cochrane Collaboration Group standards. Only studies with randomization, a well-prepared inclusion/exclusion criteria protocol, written in English, and available in full version were analyzed. An additional six publications reporting results of other clinical studies, including a total of 523 patients, were identified and summarized. ESWT was most commonly applied once or twice a week using low- or medium-energy, focused or defocused generator heads (energy range 0.03 to 0.25 mJ/mm²; usually 0.1 mJ/mm²) and electrohydraulic or electromagnetic sources. Few safety concerns were reported, and in the controlled clinical studies statistically significant differences in rates of wound closure were reported compared to a variety of standard topical treatment modalities, sham ESWT treatment, and hyperbaric oxygen therapy. Based on this analysis, ESWT can be characterized as noninvasive, mostly painless, and safe. Controlled, randomized, multicenter, blind clinical trials are still required to evaluate the efficacy and cost-effectiveness of ESWT compared to sham control, other adjunctive treatments, and commonly used moisture-retentive dressings. In the future, ESWT may play an important role in wound care once evidence-based practice guidelines are developed.

  8. Non-vitamin K antagonist oral anticoagulants have better efficacy and equivalent safety compared to warfarin in elderly patients with atrial fibrillation: A systematic review and meta-analysis.

    PubMed

    Kim, In-Soo; Kim, Hyun-Jung; Kim, Tae-Hoon; Uhm, Jae-Sun; Joung, Boyoung; Lee, Moon-Hyoung; Pak, Hui-Nam

    2018-08-01

    To evaluate the efficacy and safety of non-vitamin K antagonist oral anticoagulants (NOACs) in elderly patients (aged ≥75 years) with atrial fibrillation (AF), depending on dose and/or renal function. After systematically searching the databases (Medline, EMBASE, CENTRAL, SCOPUS, and Web of Science), 5 phase III randomized controlled trials that compared any NOAC with warfarin and reported data for subgroups of elderly/non-elderly AF patients were included. The primary efficacy and safety outcomes were stroke/systemic thromboembolism and major bleeding. (1) NOACs showed better efficacy than warfarin in elderly patients [RR 0.83 (0.69-1.00), p = 0.04, I² = 55%], but equivalent efficacy in non-elderly patients. (2) NOACs reduced major bleeding compared to warfarin in non-elderly patients (p < 0.001) and had safety comparable to warfarin in elderly patients. (3) Even in elderly patients with moderately impaired renal function, NOACs had a safety profile comparable to that of warfarin for major bleeding if dose reduction was applied appropriately [pooled RR 0.82 (0.35-1.88), p = 0.63, I² = 63%]. (4) All-cause mortality was lower with NOACs in non-elderly patients [RR 0.89 (0.83-0.95), p = 0.001, I² = 0%] and in the standard-dose NOAC group of elderly patients [RR 0.93 (0.86-1.00), p = 0.04, I² = 0%] compared to warfarin. For elderly patients (aged ≥75 years), NOACs showed better efficacy and equivalent safety compared to warfarin, even in those with moderately impaired renal function. All-cause mortality was lower with standard-dose NOACs compared to warfarin in the elderly patient group. The protocol of this meta-analysis was registered on PROSPERO under CRD42016047922 (https://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42016047922). Copyright © 2018 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.

  9. Cooperative Energy Harvesting-Adaptive MAC Protocol for WBANs

    PubMed Central

    Esteves, Volker; Antonopoulos, Angelos; Kartsakli, Elli; Puig-Vidal, Manel; Miribel-Català, Pere; Verikoukis, Christos

    2015-01-01

    In this paper, we introduce a cooperative medium access control (MAC) protocol, named cooperative energy harvesting (CEH)-MAC, that adapts its operation to the energy harvesting (EH) conditions in wireless body area networks (WBANs). In particular, the proposed protocol exploits the EH information in order to set an idle time that allows the relay nodes to charge their batteries and complete the cooperation phase successfully. Extensive simulations have shown that CEH-MAC significantly improves the network performance in terms of throughput, delay and energy efficiency compared to the cooperative operation of the baseline IEEE 802.15.6 standard. PMID:26029950
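
    The idle-time mechanism the abstract describes can be sketched as a simple energy-budget calculation: wait long enough for a relay to harvest the energy the cooperation phase will cost. All energy and power figures below are hypothetical, not values from the CEH-MAC paper.

```python
# Hedged sketch of the CEH-MAC idea: choose an idle time sufficient for
# relay nodes to charge before the cooperation phase. Numbers are invented.

def idle_time_s(energy_needed_j, harvest_power_w, stored_j=0.0):
    """Seconds of harvesting required to cover the energy deficit."""
    deficit = max(0.0, energy_needed_j - stored_j)
    return deficit / harvest_power_w

# a relay needs 3 mJ to forward a frame, harvests 0.5 mW, has 1 mJ stored
t = idle_time_s(3e-3, 0.5e-3, stored_j=1e-3)
```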

  10. Optimizing the NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maa, Ming-Hokng

    1996-01-01

    The NASA Technical Report Server (NTRS), a World Wide Web service for distributing NASA technical publications, was modified for performance enhancement, greater protocol support, and human interface optimization. Results include: parallel database queries, which significantly decrease user access times by an average factor of 2.3; access for clients behind firewalls and/or proxies that truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases and compatibility with the Z39.50 protocol; and a streamlined user interface.
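
    The parallel-query optimization can be sketched with a thread pool: one query is issued per backend concurrently, so the total wait approaches the slowest database rather than the sum. `query_db` and the database names below are stand-ins, not NTRS internals.

```python
# Hedged sketch: fanning out one query per database so total latency is
# roughly max(latencies) instead of sum(latencies). The backends are mocked.

import time
from concurrent.futures import ThreadPoolExecutor

def query_db(name):
    time.sleep(0.05)  # simulate network/database latency
    return (name, f"results from {name}")

databases = ["db-a", "db-b", "db-c", "db-d"]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=len(databases)) as pool:
    results = dict(pool.map(query_db, databases))
elapsed = time.perf_counter() - start
# serial execution would take ~0.20 s; parallel takes roughly one latency
```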

  11. Head-To-Head Comparison Between High- and Standard-b-Value DWI for Detecting Prostate Cancer: A Systematic Review and Meta-Analysis.

    PubMed

    Woo, Sungmin; Suh, Chong Hyun; Kim, Sang Youn; Cho, Jeong Yeon; Kim, Seung Hyup

    2018-01-01

    The purpose of this study was to perform a head-to-head comparison between high-b-value (> 1000 s/mm²) and standard-b-value (800-1000 s/mm²) DWI regarding diagnostic performance in the detection of prostate cancer. The MEDLINE and EMBASE databases were searched up to April 1, 2017. The analysis included diagnostic accuracy studies in which high- and standard-b-value DWI were used for prostate cancer detection with histopathologic examination as the reference standard. Methodologic quality was assessed with the revised Quality Assessment of Diagnostic Accuracy Studies tool. Sensitivity and specificity of all studies were calculated and were pooled and plotted in a hierarchic summary ROC plot. Meta-regression and multiple-subgroup analyses were performed to compare the diagnostic performances of high- and standard-b-value DWI. Eleven studies (789 patients) were included. High-b-value DWI had greater pooled sensitivity (0.80 [95% CI, 0.70-0.87]; p = 0.03) and specificity (0.92 [95% CI, 0.87-0.95]; p = 0.01) than standard-b-value DWI (sensitivity, 0.78 [95% CI, 0.66-0.86]; specificity, 0.87 [95% CI, 0.77-0.93]; p < 0.01). Multiple-subgroup analyses showed that specificity was consistently higher for high- than for standard-b-value DWI (p ≤ 0.05). Sensitivity was significantly higher for high- than for standard-b-value DWI only in the following subgroups: peripheral zone only, transition zone only, multiparametric protocol (DWI and T2-weighted imaging), visual assessment of DW images, and per-lesion analysis (p ≤ 0.04). In a head-to-head comparison, high-b-value DWI had significantly better sensitivity and specificity for detection of prostate cancer than did standard-b-value DWI. Multiple-subgroup analyses showed that specificity was consistently superior for high-b-value DWI.

  12. Effectiveness of Intraoral Chlorhexidine Protocols in the Prevention of Ventilator-Associated Pneumonia: Meta-Analysis and Systematic Review.

    PubMed

    Villar, Cristina C; Pannuti, Claudio M; Nery, Danielle M; Morillo, Carlos M R; Carmona, Maria José C; Romito, Giuseppe A

    2016-09-01

    Ventilator-associated pneumonia (VAP) is common in critical patients and related with increased morbidity and mortality. We conducted a systematic review and meta-analysis, with intention-to-treat analysis, of randomized controlled clinical trials that assessed the effectiveness of different intraoral chlorhexidine protocols for the prevention of VAP. Search strategies were developed for the MEDLINE, EMBASE, and LILACS databases. MeSH terms were combined with Boolean operators and used to search the databases. Eligible studies were randomized controlled trials of mechanically ventilated subjects receiving oral care with chlorhexidine or standard oral care protocols consisting of or associated with the use of a placebo or no chemicals. Pooled estimates of the relative risk and corresponding 95% CIs were calculated with random effects models, and heterogeneity was assessed with the Cochran Q statistic and I². The 13 included studies provided data on 1,640 subjects that were randomly allocated to chlorhexidine (n = 834) or control (n = 806) treatments. A preliminary analysis revealed that oral application of chlorhexidine fails to promote a significant reduction in VAP incidence (relative risk 0.80, 95% CI 0.59-1.07, I² = 45%). However, subgroup analyses showed that chlorhexidine prevents VAP development when used at 2% concentration (relative risk 0.53, 95% CI 0.31-0.91, I² = 0%) or 4 times/d (relative risk 0.56, 95% CI 0.38-0.81, I² = 0%). We found that oral care with chlorhexidine is effective in reducing VAP incidence in the adult population if administered at 2% concentration or 4 times/d. Copyright © 2016 by Daedalus Enterprises.
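
    The effect measure pooled in this meta-analysis, relative risk with a 95% CI, can be computed for a single trial's 2x2 table using the standard log-RR normal approximation; the event counts below are invented for illustration, not data from the included trials.

```python
# Hedged sketch: relative risk and 95% CI from one trial's 2x2 counts.
# SE of log(RR) uses the usual approximation sqrt(1/a - 1/n1 + 1/c - 1/n2).

import math

def relative_risk(events_tx, n_tx, events_ctl, n_ctl):
    rr = (events_tx / n_tx) / (events_ctl / n_ctl)
    se = math.sqrt(1/events_tx - 1/n_tx + 1/events_ctl - 1/n_ctl)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# hypothetical trial: 20/100 VAP cases with chlorhexidine vs 35/100 control
rr, lo, hi = relative_risk(20, 100, 35, 100)
```

    Random-effects pooling then combines the per-trial log-RRs weighted by their variances plus a between-trial variance component.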

  13. Long-term Recovery in Stroke Accompanied by Aphasia: A Reconsideration.

    PubMed

    Holland, Audrey; Fromm, Davida; Forbes, Margaret; MacWhinney, Brian

    2017-01-01

    This work focuses on the twenty-six individuals who provided data to AphasiaBank on at least two occasions, with initial testing between 6 months and 5.8 years post-onset of aphasia. The data are archival in nature and were collected from the extensive database of aphasic discourse in AphasiaBank. The aim is to furnish data on the nature of long-term changes in both the impairment of aphasia as measured by the Western Aphasia Battery-Revised (WAB-R) and its expression in spoken discourse. AphasiaBank's demographic database was searched to discover all individuals who were tested twice at an interval of at least a year with either: 1) the AphasiaBank protocol; or 2) the AphasiaBank protocol at first testing, and the Famous People Protocol (FPP) at second testing. The Famous People Protocol is a measure developed to assess the communication strategies of individuals whose spoken language limitations preclude full participation in the AphasiaBank protocol. The 26 people with aphasia (PWA) who were identified had completed formal speech therapy before being seen for AphasiaBank. However, all were participants in aphasia centers where at least three hours of planned activities were available, in most cases, twice weekly. WAB-R Aphasia Quotient scores (AQ) were examined, and in those cases where AQ scores improved, changes were assessed on a number of measures from the AphasiaBank discourse protocol. Sixteen individuals demonstrated improved WAB-R AQ scores, defined as positive AQ change scores greater than the WAB-R AQ standard error of the mean (WAB-SEM); seven maintained their original WAB quotients, defined as AQ change scores that were not greater than the WAB-SEM; and the final three showed negative WAB-R change scores, defined as a negative WAB-R AQ change score greater than the WAB-SEM. Concurrent changes on several AphasiaBank tasks were also found, suggesting that the WAB-R improvements were noted in more natural discourse as well. These data are surprising, since conventional wisdom suggests that spontaneous improvement in language is unlikely to occur beyond one year. Long-term improvement or maintenance of early test scores, such as that shown here, has seldom been demonstrated in the absence of formal treatment. Speculations about why these PWA improved, maintained or declined in their scores are considered.
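
    The grouping rule used in this study (an AQ change counts as improvement or decline only if it exceeds the WAB-SEM) can be sketched directly; the SEM value below is a placeholder, not the published WAB-R figure.

```python
# Hedged sketch of the change-classification rule described above.
# WAB_SEM is a placeholder; substitute the published WAB-R AQ value.

WAB_SEM = 2.5  # hypothetical standard error for illustration

def classify_change(aq_time1, aq_time2, sem=WAB_SEM):
    delta = aq_time2 - aq_time1
    if delta > sem:
        return "improved"
    if delta < -sem:
        return "declined"
    return "maintained"

# three hypothetical participants: gain of 8.3, gain of 1.0, loss of 4.0
labels = [classify_change(a, b) for a, b in [(62.1, 70.4), (80.0, 81.0), (55.0, 51.0)]]
```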

  14. Production, concentration and titration of pseudotyped HIV-1-based lentiviral vectors.

    PubMed

    Kutner, Robert H; Zhang, Xian-Yang; Reiser, Jakob

    2009-01-01

    Over the past decade, lentiviral vectors have emerged as powerful tools for transgene delivery. The use of lentiviral vectors has become commonplace and applications in the fields of neuroscience, hematology, developmental biology, stem cell biology and transgenesis are rapidly emerging. Also, lentiviral vectors are at present being explored in the context of human clinical trials. Here we describe improved protocols to generate highly concentrated lentiviral vector pseudotypes involving different envelope glycoproteins. In this protocol, vector stocks are prepared by transient transfection using standard cell culture media or serum-free media. Such stocks are then concentrated by ultracentrifugation and/or ion exchange chromatography, or by precipitation using polyethylene glycol 6000, resulting in vector titers of up to 10^10 transducing units per milliliter and above. We also provide reliable real-time PCR protocols to titrate lentiviral vectors based on proviral DNA copies present in genomic DNA extracted from transduced cells or on vector RNA. These production/concentration methods result in high-titer vector preparations that show reduced toxicity compared with lentiviral vectors produced using standard protocols involving ultracentrifugation-based methods. The vector production and titration protocol described here can be completed within 8 d.
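
    A qPCR-based titer of the kind described (proviral DNA copies in transduced cells) reduces to a simple calculation: transducing units equal cells times average proviral copies per cell, scaled by dilution and volume. The numbers below are hypothetical, and the formula is the usual simplification rather than the paper's exact protocol.

```python
# Hedged sketch: transducing units (TU) per mL from qPCR copy number.
# All inputs are invented for illustration.

def titer_tu_per_ml(cells_at_transduction, copies_per_cell,
                    vector_volume_ml, dilution_factor=1.0):
    tu = cells_at_transduction * copies_per_cell * dilution_factor
    return tu / vector_volume_ml

# 1e5 cells transduced with 10 uL of a 1:100 dilution, measuring an
# average of 0.3 proviral copies per cell by qPCR
titer = titer_tu_per_ml(1e5, 0.3, 0.01, dilution_factor=100)
```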

  15. Filmless PACS in a multiple facility environment

    NASA Astrophysics Data System (ADS)

    Wilson, Dennis L.; Glicksman, Robert A.; Prior, Fred W.; Siu, Kai-Yeung; Goldburgh, Mitchell M.

    1996-05-01

    A Picture Archiving and Communication System centered on a shared image file server can support a filmless hospital. Systems based on this architecture have proven themselves in over four years of clinical operation. Changes in healthcare delivery are causing radiology groups to support multiple facilities for remote clinic support and consolidation of services. There will be a corresponding need for communicating over a standardized wide area network (WAN). Interactive workflow, a natural extension to the single facility case, requires a means to work effectively and seamlessly across moderate to low speed communication networks. Several schemes for supporting a consortium of medical treatment facilities over a WAN are explored. Both centralized and distributed database approaches are evaluated against several WAN scenarios. Likewise, several architectures for distributing image file servers or buffers over a WAN are explored, along with the caching and distribution strategies that support them. An open system implementation is critical to the success of a wide area system. The role of the Digital Imaging and Communications in Medicine (DICOM) standard in supporting multi-facility and multi-vendor open systems is also addressed. An open system can be achieved by using a DICOM server to provide a view of the system-wide distributed database. The DICOM server interface to a local version of the global database lets a local workstation treat the multiple, distributed data servers as though they were one local server for purposes of examination queries. The query will recover information about the examination that will permit retrieval over the network from the server on which the examination resides. For efficiency reasons, the ability to build cross-facility radiologist worklists and clinician-oriented patient folders is essential. The technologies of the World-Wide-Web can be used to generate worklists and patient folders across facilities. A reliable broadcast protocol may be a convenient way to notify many different users and many image servers about new activities in the network of image servers. In addition to ensuring reliability of message delivery and global serialization of each broadcast message in the network, the broadcast protocol should not introduce significant communication overhead.

  16. Improvements in the Protein Identifier Cross-Reference service.

    PubMed

    Wein, Samuel P; Côté, Richard G; Dumousseau, Marine; Reisinger, Florian; Hermjakob, Henning; Vizcaíno, Juan A

    2012-07-01

    The Protein Identifier Cross-Reference (PICR) service is a tool that allows users to map protein identifiers, protein sequences and gene identifiers across over 100 different source databases. PICR takes input through an interactive website as well as Representational State Transfer (REST) and Simple Object Access Protocol (SOAP) services. It returns the results as HTML pages, XLS and CSV files. It has been in production since 2007 and has been recently enhanced to add new functionality and increase the number of databases it covers. Protein subsequences can be searched using the Basic Local Alignment Search Tool (BLAST) against the UniProt Knowledgebase (UniProtKB) to provide an entry point to the standard PICR mapping algorithm. In addition, gene identifiers from UniProtKB and Ensembl can now be submitted as input or mapped to as output from PICR. We have also implemented a 'best-guess' mapping algorithm for UniProt. In this article, we describe the usefulness of PICR, how these changes have been implemented, and the corresponding additions to the web services. Finally, we explain that the number of source databases covered by PICR has increased from the initial 73 to the current 102. New resources include several new species-specific Ensembl databases as well as the Ensembl Genomes ones. PICR can be accessed at http://www.ebi.ac.uk/Tools/picr/.
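
    A REST call to an identifier-mapping service of this kind amounts to building a URL with the source accession and the target databases. The endpoint path and parameter names below follow the pattern commonly documented for PICR, but treat them as assumptions and check the service documentation before relying on them.

```python
# Hedged sketch: constructing a REST request URL for an accession-mapping
# service shaped like PICR's. Endpoint and parameter names are assumptions.

from urllib.parse import urlencode

def mapping_request_url(accession, target_dbs,
                        base="http://www.ebi.ac.uk/Tools/picr/rest/getUPIForAccession"):
    # one repeated 'database' parameter per requested target database
    params = [("accession", accession)] + [("database", db) for db in target_dbs]
    return base + "?" + urlencode(params)

url = mapping_request_url("P29375", ["SWISSPROT", "ENSEMBL_HUMAN"])
# the XML response would then be fetched with urllib.request and parsed
```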

  17. A protocol for the creation of useful geometric shape metrics illustrated with a newly derived geometric measure of leaf circularity.

    PubMed

    Krieger, Jonathan D

    2014-08-01

    I present a protocol for creating geometric leaf shape metrics to facilitate widespread application of geometric morphometric methods to leaf shape measurement. • To quantify circularity, I created a novel shape metric in the form of the vector between a circle and a line, termed geometric circularity. Using leaves from 17 fern taxa, I performed a coordinate-point eigenshape analysis to empirically identify patterns of shape covariation. I then compared the geometric circularity metric to the empirically derived shape space and the standard metric, circularity shape factor. • The geometric circularity metric was consistent with empirical patterns of shape covariation and appeared more biologically meaningful than the standard approach, the circularity shape factor. The protocol described here has the potential to make geometric morphometrics more accessible to plant biologists by generalizing the approach to developing synthetic shape metrics based on classic, qualitative shape descriptors.

  18. Normal values and standardization of parameters in nuclear cardiology: Japanese Society of Nuclear Medicine working group database.

    PubMed

    Nakajima, Kenichi; Matsumoto, Naoya; Kasai, Tokuo; Matsuo, Shinro; Kiso, Keisuke; Okuda, Koichi

    2016-04-01

    As a 2-year project of the Japanese Society of Nuclear Medicine working group activity, normal myocardial imaging databases were accumulated and summarized. Stress-rest gated and non-gated image sets were accumulated for myocardial perfusion imaging and could be used for perfusion defect scoring and normal left ventricular (LV) function analysis. For single-photon emission computed tomography (SPECT) with a multi-focal collimator design, databases of supine and prone positions and computed tomography (CT)-based attenuation correction were created. The CT-based correction provided similar perfusion patterns between genders. In phase analysis of gated myocardial perfusion SPECT, a new approach for analyzing dyssynchrony, normal ranges of parameters for phase bandwidth, standard deviation and entropy were determined in four software programs. Although the results were not interchangeable, dependency on gender, ejection fraction and volumes was a common characteristic of these parameters. Standardization of (123)I-MIBG sympathetic imaging was performed for the heart-to-mediastinum ratio (HMR) using a calibration phantom method. HMRs from any collimator type could be converted to values comparable to those obtained with medium-energy collimators. Appropriate quantification based on common normal databases and standard technology could play a pivotal role in clinical practice and research.
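    The HMR computation and the phantom-based cross-calibration can be sketched as a linear conversion. The functional form below follows the commonly reported calibration approach (mapping the measured HMR onto a medium-energy-equivalent scale), but the conversion coefficients and ROI counts are hypothetical stand-ins, not values from the working group database.

```python
def hmr(heart_counts, mediastinum_counts):
    """Heart-to-mediastinum ratio from mean ROI counts."""
    return heart_counts / mediastinum_counts

def convert_hmr(hmr_measured, k_measured, k_standard):
    """Linear cross-calibration of HMR between collimator types.

    The coefficients k_measured / k_standard would come from a phantom
    calibration; the values used below are hypothetical."""
    return (k_standard / k_measured) * (hmr_measured - 1.0) + 1.0

# Made-up ROI counts and calibration coefficients:
h = hmr(180.0, 100.0)                          # 1.8 on a low-energy collimator
h_me = convert_hmr(h, k_measured=0.55, k_standard=0.88)
print(round(h, 2), round(h_me, 2))             # 1.8 2.28
```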

  19. A Systematic Review on Immediate Loading of Implants Used to Support Overdentures Opposed by Conventional Prostheses: Factors That Might Influence Clinical Outcomes.

    PubMed

    Zygogiannis, Kostas; Wismeijer, Daniel; Aartman, Irene Ha; Osman, Reham B

    2016-01-01

    Different treatment protocols in terms of number, diameter, and suprastructure design have been proposed for immediately loaded implants that are used to support mandibular overdentures opposed by maxillary conventional dentures. The aim of this study was to investigate the influence of these protocols on survival rates as well as clinical and prosthodontic outcomes. Several electronic databases were searched for all relevant articles published from 1966 to June 2014. Only randomized controlled trials and prospective studies with a minimum follow-up of 12 months were selected. The primary outcomes of interest were the success and survival rates of the implants. Prosthodontic complications were also evaluated. Fourteen studies fulfilled the inclusion criteria. Of the studies identified, nine were randomized controlled trials and five were prospective studies. The mean follow-up period was 3 years or less for the vast majority of the studies. The reported survival and success rates were comparable to those of conventional loading for most of the included studies. No specific immediate loading protocol seemed to perform better in terms of clinical and prosthodontic outcomes. Immediate loading protocols for mandibular overdentures seem to be a viable alternative to conventional loading. It was not possible to recommend a specific treatment protocol related to the number and diameter of the implants and the attachment system used. Long-term, well-designed studies comparing different immediate loading modalities could help to establish a protocol that delivers the most clinically predictable, efficient, and cost-effective outcome for edentulous patients in need of implant overdentures.

  20. Improving Collaboration by Standardization Efforts in Systems Biology

    PubMed Central

    Dräger, Andreas; Palsson, Bernhard Ø.

    2014-01-01

    Collaborative genome-scale reconstruction endeavors of metabolic networks would not be possible without a common, standardized formal representation of these systems. The ability to precisely define biological building blocks together with their dynamic behavior has even been considered a prerequisite for upcoming synthetic biology approaches. Driven by the requirements of such ambitious research goals, standardization itself has become an active field of research on nearly all levels of granularity in biology. In addition to the originally envisaged exchange of computational models and tool interoperability, new standards have been suggested for an unambiguous graphical display of biological phenomena; to annotate, archive, and rank models; and to describe the execution and outcomes of simulation experiments. The spectrum now even covers the interaction of entire neurons in the brain, three-dimensional motions, and the description of pharmacometric studies. Thereby, the mathematical description of systems and approaches for their (repeated) simulation are clearly separated from each other and also from their graphical representation. Minimum information definitions constitute guidelines and common operation protocols in order to ensure reproducibility of findings and a unified knowledge representation. Central database infrastructures have been established that provide the scientific community with persistent links from model annotations to online resources. A rich variety of open-source software tools exists for all of these data formats, often supporting a multitude of programming languages. Regular meetings and workshops of developers and users lead to continuous improvement and ongoing development of these standardization efforts. This article gives a brief overview of the current state of the growing number of operation protocols, mark-up languages, graphical descriptions, and fundamental software support with relevance to systems biology.
PMID:25538939

  1. [Discussion on developing a data management plan and its key factors in clinical study based on electronic data capture system].

    PubMed

    Li, Qing-na; Huang, Xiu-ling; Gao, Rui; Lu, Fang

    2012-08-01

    Data management has significant impact on the quality control of clinical studies. Every clinical study should have a data management plan to provide overall work instructions and ensure that all of these tasks are completed according to the Good Clinical Data Management Practice (GCDMP). Meanwhile, the data management plan (DMP) is an auditable document requested by regulatory inspectors and must be written in a manner that is realistic and of high quality. The significance of DMP, the minimum standards and the best practices provided by GCDMP, the main contents of DMP based on electronic data capture (EDC) and some key factors of DMP influencing the quality of clinical study were elaborated in this paper. Specifically, DMP generally consists of 15 parts, namely, the approval page, the protocol summary, role and training, timelines, database design, creation, maintenance and security, data entry, data validation, quality control and quality assurance, the management of external data, serious adverse event data reconciliation, coding, database lock, data management reports, the communication plan and the abbreviated terms. Among them, the following three parts are regarded as the key factors: designing a standardized database of the clinical study, entering data in time and cleansing data efficiently. In the last part of this article, the authors also analyzed the problems in clinical research of traditional Chinese medicine using the EDC system and put forward some suggestions for improvement.

  2. Recommendations for standardizing validation procedures assessing physical activity of older persons by monitoring body postures and movements.

    PubMed

    Lindemann, Ulrich; Zijlstra, Wiebren; Aminian, Kamiar; Chastin, Sebastien F M; de Bruin, Eling D; Helbostad, Jorunn L; Bussmann, Johannes B J

    2014-01-10

    Physical activity is an important determinant of health and well-being in older persons and contributes to their social participation and quality of life. Hence, assessment tools are needed to study this physical activity in free-living conditions. Wearable motion-sensing technology is used to assess physical activity. However, there is a lack of harmonisation of validation protocols and applied statistics, which makes it hard to compare available and future studies. Therefore, the aim of this paper is to formulate recommendations for assessing the validity of sensor-based activity monitoring in older persons, with a focus on the measurement of body postures and movements. Validation studies of body-worn devices providing parameters on body postures and movements were identified and summarized, and an extensive interactive process between the authors resulted in recommendations about: information on the assessed persons, the technical system, and the analysis of relevant parameters of physical activity, based on a standardized and semi-structured protocol. The recommended protocols can be regarded as a first attempt to standardize validity studies in the area of monitoring physical activity.
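    In such validation studies, sensor-classified postures are typically compared against a reference (e.g., video annotation) using a chance-corrected agreement statistic. A minimal sketch using Cohen's kappa on made-up second-by-second posture labels:

```python
from collections import Counter

def cohens_kappa(reference, predicted):
    """Chance-corrected agreement between two categorical label sequences."""
    assert len(reference) == len(predicted)
    n = len(reference)
    observed = sum(r == p for r, p in zip(reference, predicted)) / n
    ref_freq = Counter(reference)
    pred_freq = Counter(predicted)
    expected = sum(ref_freq[c] * pred_freq.get(c, 0) for c in ref_freq) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical second-by-second posture labels (video reference vs. sensor):
video  = ["sit", "sit", "stand", "walk", "walk", "sit", "stand", "walk"]
sensor = ["sit", "sit", "stand", "walk", "sit",  "sit", "stand", "walk"]
kappa = cohens_kappa(video, sensor)
print(round(kappa, 3))  # 0.81
```

    Reporting kappa (rather than raw percent agreement) is one way to make validity results comparable across studies, which is the harmonisation problem the recommendations address.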

  3. Towards communication-efficient quantum oblivious key distribution

    NASA Astrophysics Data System (ADS)

    Panduranga Rao, M. V.; Jakobi, M.

    2013-01-01

    Symmetrically private information retrieval, a fundamental problem in the field of secure multiparty computation, is defined as follows: A database D of N bits held by Bob is queried by a user Alice who is interested in the bit Db in such a way that (1) Alice learns Db and only Db and (2) Bob does not learn anything about Alice's choice b. While solutions to this problem in the classical domain rely largely on unproven computational complexity theoretic assumptions, it is also known that perfect solutions that guarantee both database and user privacy are impossible in the quantum domain. Jakobi et al. [Phys. Rev. A 83, 022301 (2011)] proposed a protocol for oblivious transfer using well-known quantum key distribution (QKD) techniques to establish an oblivious key to solve this problem. Their solution provided a good degree of database and user privacy (using physical principles like the impossibility of perfectly distinguishing nonorthogonal quantum states and the impossibility of superluminal communication) while being loss-resistant and implementable with commercial QKD devices (due to the use of the Scarani-Acin-Ribordy-Gisin 2004 protocol). However, their quantum oblivious key distribution (QOKD) protocol requires a communication complexity of O(N log N). Since modern databases can be extremely large, it is important to reduce this communication as much as possible. In this paper, we first suggest a modification of their protocol wherein the number of qubits that need to be exchanged is reduced to O(N). A subsequent generalization reduces the quantum communication complexity even further, such that only a few hundred qubits need to be transferred even for very large databases.
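    The classical post-processing of the oblivious-key idea can be illustrated without the quantum layer: Bob holds the full key, Alice knows exactly one key bit at a known position, and a publicly announced shift aligns her known bit with the database index she wants. A toy sketch (the quantum establishment of the oblivious key, and the probabilistic nature of which bits Alice learns, are omitted):

```python
import secrets

N = 16                                    # database size (tiny, for illustration)
database = [secrets.randbelow(2) for _ in range(N)]

# Oblivious key: Bob knows all bits; Alice knows one bit at a known position.
key = [secrets.randbelow(2) for _ in range(N)]
alice_pos = 5
alice_bit = key[alice_pos]

b = 11                                    # index Alice wants to read

# Alice announces a shift so her known key bit lines up with index b.
# The shift reveals nothing about b, since alice_pos is unknown to Bob.
shift = (alice_pos - b) % N

# Bob encrypts the whole database with the shifted key (one-time pad).
ciphertext = [database[i] ^ key[(i + shift) % N] for i in range(N)]

# Alice can decrypt exactly the one bit she holds key material for.
recovered = ciphertext[b] ^ alice_bit
print(recovered == database[b])  # True
```

    Database privacy comes from Alice holding key material for only one position; user privacy comes from Bob's ignorance of which position that is.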

  4. Digital pathology in nephrology clinical trials, research, and pathology practice.

    PubMed

    Barisoni, Laura; Hodgin, Jeffrey B

    2017-11-01

    In this review, we will discuss (i) how the recent advancements in digital technology and computational engineering are currently applied to nephropathology in the setting of clinical research, trials, and practice; (ii) the benefits of the new digital environment; (iii) how recognizing its challenges provides opportunities for transformation; and (iv) nephropathology in the upcoming era of kidney precision and predictive medicine. Recent studies highlighted how new standardized protocols facilitate the harmonization of digital pathology database infrastructure and morphologic, morphometric, and computer-aided quantitative analyses. Digital pathology enables robust protocols for clinical trials and research, with the potential to identify previously underused or unrecognized clinically useful parameters. The integration of digital pathology with molecular signatures is leading the way to establishing clinically relevant morpho-omic taxonomies of renal diseases. The introduction of digital pathology in clinical research and trials, and the progressive implementation of the modern software ecosystem, opens opportunities for the development of new predictive diagnostic paradigms and computer-aided algorithms, transforming the practice of renal disease into a modern computational science.

  5. A framework for the definition of standardized protocols for measuring upper-extremity kinematics.

    PubMed

    Kontaxis, A; Cutti, A G; Johnson, G R; Veeger, H E J

    2009-03-01

    Increasing interest in upper extremity biomechanics has led to closer investigations of both segment movements and detailed joint motion. Unfortunately, conceptual and practical differences in the motion analysis protocols used to date reduce compatibility for post-hoc data comparison and cross-validation analyses, and so weaken the body of knowledge. This difficulty highlights a need for standardised protocols, each addressing a set of questions of comparable content. The aim of this work is therefore to open a discussion and propose a flexible framework to support: (1) the definition of standardised protocols, (2) a standardised description of these protocols, and (3) the formulation of general recommendations. The framework is composed of two nested flowcharts. The first defines what a motion analysis protocol is by pointing out its role in a motion analysis study. The second flowchart describes the steps to build a protocol, which requires decisions on the joints or segments to be investigated and the description of their mechanical equivalent model, the definition of the anatomical or functional coordinate frames, the choice of marker or sensor configuration and the validity of their use, the definition of the activities to be measured, and the refinements that can be applied to the final measurements. Finally, general recommendations are proposed for each of the steps based on the current literature, and open issues are highlighted for future investigation and standardisation. Standardisation of motion analysis protocols is urgent. The proposed framework can guide this process through the rationalisation of the approach.
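    One of the framework's steps, defining anatomical coordinate frames from marker positions, can be sketched as below; the marker roles and coordinates are hypothetical, and the axis conventions shown are just one of the choices a protocol must standardise.

```python
import numpy as np

def anatomical_frame(origin, axis_point, plane_point):
    """Right-handed orthonormal frame from three marker positions:
    the first axis points from origin to axis_point, the third axis
    is normal to the plane spanned by the three markers."""
    x = axis_point - origin
    x = x / np.linalg.norm(x)
    z = np.cross(x, plane_point - origin)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    return np.column_stack([x, y, z])   # 3x3 rotation matrix

# Toy marker coordinates (metres) standing in for bony landmarks:
R = anatomical_frame(np.array([0.0, 0.0, 0.0]),
                     np.array([0.3, 0.0, 0.0]),
                     np.array([0.0, 0.2, 0.0]))
print(np.allclose(R.T @ R, np.eye(3)))  # True: the frame is orthonormal
```

    Two protocols that pick different landmark triplets or axis orders produce different joint-angle decompositions from identical motion data, which is exactly the incompatibility the framework aims to eliminate.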

  6. Installation of the National Transport Code Collaboration Data Server at the ITPA International Multi-tokamak Confinement Profile Database

    NASA Astrophysics Data System (ADS)

    Roach, Colin; Carlsson, Johan; Cary, John R.; Alexander, David A.

    2002-11-01

    The National Transport Code Collaboration (NTCC) has developed an array of software, including a data client/server. The data server, which is written in C++, serves local data (in the ITER Profile Database format) as well as remote data (by accessing one or several MDS+ servers). The client, a web-invocable Java applet, provides a uniform, intuitive, user-friendly, graphical interface to the data server. The uniformity of the interface relieves users from the trouble of mastering the differences between data formats and lets them focus on the essentials: plotting and viewing the data. The user runs the client by visiting a web page using any Java-capable Web browser. The client is automatically downloaded and run by the browser. A reference to the data server is then retrieved via the standard Web protocol (HTTP). The communication between the client and the server is then handled by the mature, industry-standard CORBA middleware. CORBA has bindings for all common languages, and many high-quality implementations are available (both Open Source and commercial). The NTCC data server has been installed at the ITPA International Multi-tokamak Confinement Profile Database, which is hosted by the UKAEA at Culham Science Centre. The installation of the data server is protected by an Internet firewall. To make it accessible to clients outside the firewall, some modifications of the server were required. The working version of the ITPA confinement profile database is not open to the public. Authentication of legitimate users is performed using built-in Java security features to demand a password to download the client. We present an overview of the NTCC data client/server and some details of how the CORBA firewall-traversal issues were resolved and how user authentication is implemented.

  7. Suggestions for better data presentation in papers: an experience from a comprehensive study on national and sub-national trends of overweight and obesity.

    PubMed

    Djalalinia, Shirin; Kelishadi, Roya; Qorbani, Mostafa; Peykari, Niloofar; Kasaeian, Amir; Saeedi Moghaddam, Sahar; Gohari, Kimiya; Larijani, Bagher; Farzadfar, Farshad

    2014-12-01

    The importance of data quality, whether at the collection, analysis, or presentation stage, is a tangible and undeniable scientific fact and a central objective of research implementation. This paper aims at explaining the main problems of Iranian scientific papers in providing better data in the field of national and sub-national prevalence, incidence estimates and trends of obesity and overweight. To assess and evaluate papers, we systematically followed an approved standard protocol. Retrieval of studies was performed through Thomson Reuters Web of Science, PubMed, and Scopus, as well as Iranian databases including Irandoc, Scientific Information Database (SID), and IranMedex. Using GBD (Global Burden of Diseases) validated quality assessment forms to assess the quality and availability of data in papers, we considered the following four main domains: a) quality of the studies, b) quality of the reported results, c) responsiveness of corresponding authors, and d) diversity in study settings. We retrieved 3,253 records; of these, 1,875 were from international and 1,378 from national databases. After refining steps, 129 (3.97%) papers remained that were related to our study domain. More than 51% of the relevant papers were excluded because of poor study quality. The total reported population and number of data points were 22,972 and 29 for boys, and 38,985 and 47 for girls, respectively. For all measures, missing values and diversity in study settings limited our ability to compare and analyze the results. Moreover, we had some serious problems in contacting the corresponding authors for necessary complementary information (responsiveness: 17.9%). As the present paper focuses on the main problems of Iranian scientific papers and proposes suggestions, the results will have implications for better policy making.

  8. Method for a dummy CD mirror server based on NAS

    NASA Astrophysics Data System (ADS)

    Tang, Muna; Pei, Jing

    2002-09-01

    With the development of computer networks, information sharing is becoming a necessity in daily life. The rapid development of CD-ROM and CD-ROM drive techniques makes it possible to publish large databases online. After comparing many designs of dummy CD mirror databases, which embody a main product category of CD-ROM databases now and in the near future, we proposed and realized a new PC-based scheme. Our system has the following merits: it supports all kinds of CD formats; it supports many network protocols; the mirror network server is independent of the main server; and it offers a low price and very large capacity without the need for any special hardware. Preliminary experiments have verified the validity of the proposed scheme. Encouraged by the promising application prospects, we are now preparing to put it on the market. This paper discusses the design and implementation of the CD-ROM server in detail.

  9. Addressing the need for biomarker liquid chromatography/mass spectrometry assays: a protocol for effective method development for the bioanalysis of endogenous compounds in cerebrospinal fluid.

    PubMed

    Benitex, Yulia; McNaney, Colleen A; Luchetti, David; Schaeffer, Eric; Olah, Timothy V; Morgan, Daniel G; Drexler, Dieter M

    2013-08-30

    Research on disorders of the central nervous system (CNS) has shown that an imbalance in the levels of specific endogenous neurotransmitters may underlie certain CNS diseases. These alterations in neurotransmitter levels may provide insight into pathophysiology, but can also serve as disease and pharmacodynamic biomarkers. To measure these potential biomarkers in vivo, the relevant sample matrix is cerebrospinal fluid (CSF), which is in equilibrium with the brain's interstitial fluid and circulates through the ventricular system of the brain and spinal cord. Accurate analysis of these potential biomarkers can be challenging due to low CSF sample volume, low analyte levels, and potential interferences from other endogenous compounds. A protocol has been established for effective method development of bioanalytical assays for endogenous compounds in CSF. Database searches and standard-addition experiments are employed to qualify sample preparation and the specificity of detection, thus evaluating accuracy and precision. This protocol was applied to the study of the histaminergic neurotransmitter system and the analysis of histamine and its metabolite 1-methylhistamine in rat CSF. The protocol resulted in a specific and sensitive novel method utilizing pre-column derivatization ultra-high-performance liquid chromatography/tandem mass spectrometry (UHPLC/MS/MS), which is also capable of separating an endogenous interfering compound, identified as taurine, from the analytes of interest. Copyright © 2013 John Wiley & Sons, Ltd.
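    The standard-addition step can be sketched numerically: spike known analyte amounts into aliquots of the same sample, fit detector response against added concentration, and estimate the endogenous level from the magnitude of the x-intercept. The values below are illustrative, not from the study:

```python
import numpy as np

# Standard-addition quantification of an endogenous analyte.
added = np.array([0.0, 5.0, 10.0, 20.0])     # spiked concentration (nM, made up)
signal = np.array([4.2, 6.3, 8.4, 12.6])     # detector response (arbitrary units)

# Linear fit: signal = slope * added + intercept
slope, intercept = np.polyfit(added, signal, 1)

# The unspiked aliquot's signal comes entirely from the endogenous analyte,
# so its concentration is the x-intercept magnitude.
endogenous = intercept / slope
print(round(endogenous, 2))  # 10.0 nM
```

    Standard addition is useful precisely because no analyte-free CSF blank exists; the sample's own matrix effects are built into the calibration.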

  10. The challenges of measuring quality-of-care indicators in rural emergency departments: a cross-sectional descriptive study

    PubMed Central

    Layani, Géraldine; Fleet, Richard; Dallaire, Renée; Tounkara, Fatoumata K.; Poitras, Julien; Archambault, Patrick; Chauny, Jean-Marc; Ouimet, Mathieu; Gauthier, Josée; Dupuis, Gilles; Tanguay, Alain; Lévesque, Jean-Frédéric; Simard-Racine, Geneviève; Haggerty, Jeannie; Légaré, France

    2016-01-01

    Background: Evidence-based indicators of quality of care have been developed to improve care and performance in Canadian emergency departments. The feasibility of measuring these indicators has been assessed mainly in urban and academic emergency departments. We sought to assess the feasibility of measuring quality-of-care indicators in rural emergency departments in Quebec. Methods: We previously identified rural emergency departments in Quebec that offered medical coverage with hospital beds 24 hours a day, 7 days a week and were located in rural areas or small towns as defined by Statistics Canada. A standardized protocol was sent to each emergency department to collect data on 27 validated quality-of-care indicators in 8 categories: duration of stay, patient safety, pain management, pediatrics, cardiology, respiratory care, stroke and sepsis/infection. Data were collected by local professional medical archivists between June and December 2013. Results: Fifteen (58%) of the 26 emergency departments invited to participate completed data collection. The ability to measure the 27 quality-of-care indicators with the use of databases varied across departments. Centres 2, 5, 6 and 13 used databases for at least 21 of the indicators (78%-92%), whereas centres 3, 8, 9, 11, 12 and 15 used databases for 5 (18%) or fewer of the indicators. On average, the centres were able to measure only 41% of the indicators using heterogeneous databases and manual extraction. The 15 centres collected data from 15 different databases or combinations of databases. The average data collection time for each quality-of-care indicator varied from 5 to 88.5 minutes. The median data collection time was 15 minutes or less for most indicators. Interpretation: Quality-of-care indicators were not easily captured with the use of existing databases in rural emergency departments in Quebec. 
Further work is warranted to improve standardized measurement of these indicators in rural emergency departments in the province and to generalize the information gathered in this study to other health care environments. PMID:27730103

  11. Radionuclide bone scan SPECT-CT: lowering the dose of CT significantly reduces radiation dose without impacting CT image quality

    PubMed Central

    Gupta, Sandeep Kumar; Trethewey, Scott; Brooker, Bree; Rutherford, Natalie; Diffey, Jenny; Viswanathan, Suresh; Attia, John

    2017-01-01

    The CT component of SPECT-CT is required for attenuation correction and anatomical localization of the uptake on SPECT, but there is no guideline on the optimal CT acquisition parameters. In our department, the standard CT acquisition protocol was changed in 2013 to give a lower radiation dose to the patient. In this study, we retrospectively compared the effects on patient dose as well as the CT image quality of the current versus older CT protocols. Ninety-nine consecutive patients [n=51 standard-dose 'old' protocol (SDP); n=48 lower-dose 'new' protocol (LDP)] with lumbar spine SPECT-CT for bone scan were examined. The main differences between the two protocols were that the SDP used a 130 kVp tube voltage and a reference current-time product of 70 mAs, whereas the LDP used 110 kVp and 40 mAs, respectively. Various quantitative parameters from the CT images were obtained, and the images were also rated blindly by two experienced nuclear medicine physicians for bony definition and noise. The mean calculated dose-length product of the LDP group (121.5±39.6 mGy.cm) was significantly lower than that of the SDP group (266.9±96.9 mGy.cm; P<0.0001). This translated into a significant reduction in the mean effective dose, from 4.0 mSv to 1.8 mSv. The physicians reported better CT image quality for the bony structures in the LDP group, although for soft tissue structures the SDP group had better image quality. The optimized new CT acquisition protocol significantly reduced the radiation dose to the patient and in fact improved CT image quality for the assessment of bony structures. PMID:28533938
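    The effective-dose figures quoted above are consistent with the usual dose-length-product conversion, E = DLP × k, with a region-specific coefficient k. The value 0.015 mSv/(mGy.cm) used below is a commonly tabulated adult abdominal/lumbar coefficient, assumed here for illustration rather than taken from the paper:

```python
def effective_dose_msv(dlp_mgy_cm, k_factor=0.015):
    """Effective dose estimate from CT dose-length product (DLP).

    k_factor is a region-specific conversion coefficient in
    mSv per mGy.cm; 0.015 is a commonly tabulated adult
    abdomen/lumbar value, used here as an assumption."""
    return dlp_mgy_cm * k_factor

# Mean DLPs reported for the two protocols in the study:
old_dose = effective_dose_msv(266.9)
new_dose = effective_dose_msv(121.5)
print(round(old_dose, 1), round(new_dose, 1))  # 4.0 1.8
```

    With this coefficient the reported DLPs reproduce the quoted 4.0 mSv and 1.8 mSv mean effective doses.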

  12. Osteogenic differentiation of equine adipose tissue derived mesenchymal stem cells using CaCl2.

    PubMed

    Elashry, Mohamed I; Baulig, Nadine; Heimann, Manuela; Bernhardt, Caroline; Wenisch, Sabine; Arnhold, Stefan

    2018-04-01

    Adipose tissue derived mesenchymal stem cells (ASCs) may be used to cure bone defects after osteogenic differentiation. In this study we tried to optimize osteogenic differentiation of equine ASCs using various concentrations of CaCl2 in comparison to the standard osteogenic protocol. ASCs were isolated from subcutaneous adipose tissue from mixed-breed horses. The osteogenic induction protocols were (1) the standard osteogenic medium (OM), composed of dexamethasone, ascorbic acid and β-glycerol phosphate; and (2) a CaCl2-based protocol using 3, 5 and 7.5 mM CaCl2. Differentiation and proliferation were evaluated at 7, 10, 14 and 21 days post-induction using alizarin red staining (ARS) to detect matrix calcification. Semi-quantification of cell protein content, ARS and alkaline phosphatase activity (ALP) was performed using an ELISA reader. Quantification of the transcription level of the common osteogenic markers alkaline phosphatase (ALP) and osteopontin (OP) was performed using RT-qPCR. In the presence of CaCl2, a concentration-dependent effect on osteogenic differentiation capacity was evident from the ARS evaluation and OP gene expression. We provide evidence that 5 and 7 mM CaCl2 enhance osteogenic differentiation compared to the OM protocol. Although there was a clear commitment of ASCs to the osteogenic fate in the presence of 5 and 7 mM CaCl2, cell proliferation was increased compared to OM. We report that an optimized CaCl2 protocol reliably influences ASC osteogenesis while conserving proliferation capacity. Thus, these protocols provide a platform for using ASCs as a cell source in bone tissue engineering. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Efficiency Improvement in a Busy Radiology Practice: Determination of Musculoskeletal Magnetic Resonance Imaging Protocol Using Deep-Learning Convolutional Neural Networks.

    PubMed

    Lee, Young Han

    2018-04-04

    The purposes of this study are to evaluate the feasibility of protocol determination with a convolutional neural network (CNN) classifier based on short-text classification and to evaluate the agreement between protocols determined by the CNN and those determined by musculoskeletal radiologists. Following institutional review board approval, the database of a hospital information system (HIS) was queried for lists of MRI examinations, referring department, patient age, and patient gender. These were exported to a local workstation for analysis: 5,258 and 1,018 consecutive musculoskeletal MRI examinations were used for the training and test datasets, respectively. The classification targets were routine or tumor protocols, and the inputs were word combinations of the referring department, region, contrast media (or not), gender, and age. The CNN embedded-vector classifier was used with Word2Vec Google News vectors. The test set was tested with each classification model and the results were output as routine or tumor protocols. The CNN determinations were evaluated using receiver operating characteristic (ROC) curves. Accuracy was evaluated against radiologist-confirmed protocols as the reference. The optimal cut-off value for protocol determination between routine and tumor protocols was 0.5067, with a sensitivity of 92.10%, a specificity of 95.76%, and an area under the curve (AUC) of 0.977. The overall accuracy was 94.2% for the ConvNet model. All MRI protocols were correct in the pelvic bone, upper arm, wrist, and lower leg MRIs. Deep-learning-based convolutional neural networks were clinically utilized to determine musculoskeletal MRI protocols. CNN-based text learning and its applications could be extended to other radiologic tasks besides image interpretation, improving the work performance of the radiologist.
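    The short-text CNN idea (embed the words, convolve over sliding token windows, max-pool over time, then score) can be sketched with random stand-in weights; this illustrates the architecture only, not the study's trained model or its vocabulary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary with random 8-dimensional "word vectors" standing in for
# pretrained embeddings such as Word2Vec.
vocab = {w: i for i, w in enumerate(
    ["orthopedics", "knee", "contrast", "m", "45", "tumor", "routine"])}
embeddings = rng.normal(size=(len(vocab), 8))

def text_cnn_score(tokens, kernel, bias):
    """Convolution over token windows, max-pooling, sigmoid -> P(tumor)."""
    x = embeddings[[vocab[t] for t in tokens]]          # (seq_len, 8)
    width = kernel.shape[0]
    feats = [np.sum(x[i:i + width] * kernel) + bias     # 1D convolution
             for i in range(len(tokens) - width + 1)]
    pooled = max(feats)                                 # max-over-time pooling
    return 1.0 / (1.0 + np.exp(-pooled))                # sigmoid score

kernel = rng.normal(size=(2, 8))                        # window of 2 tokens
score = text_cnn_score(["orthopedics", "knee", "contrast", "m", "45"],
                       kernel, bias=0.0)
print(0.0 < score < 1.0)  # True: a probability for the binary protocol choice
```

    In the real classifier, many such kernels of several widths are learned from the training set, and the pooled features feed a final routine-vs-tumor decision thresholded at the reported cut-off.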

  14. TU-G-BRD-02: Automated Systematic Quality Assurance Program for Radiation Oncology Information System Upgrades

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, B; Yi, B; Eley, J

    Purpose: To (1) describe an independent, automated, systematic software-based protocol for verifying clinical data accuracy/integrity to mitigate data corruption/loss risks following radiation oncology information system (ROIS) upgrades; and (2) report on the application of this approach in an academic/community practice environment. Methods: We propose a robust approach to perform quality assurance on the ROIS after an upgrade, targeting four data sources: (1) the ROIS relational database; (2) the ROIS DICOM interface; (3) the ROIS treatment machine data configuration; and (4) ROIS-generated clinical reports. We investigated the database schema for differences between pre-/post-upgrade states. Paired DICOM data streams for the same object (such as an RT Plan/Treatment Record) were compared between pre-/post-upgrade states for data corruption. We examined machine configuration and related commissioning data files for changes and corruption. ROIS-generated treatment appointment and treatment parameter reports were compared to ensure patient encounter and treatment plan accuracy. This protocol was supplemented by an end-to-end clinical workflow test to verify essential ROIS functionality and the integrity of components interfaced during the patient care chain of activities. We describe the implementation of this protocol during a Varian ARIA system upgrade at our clinic. Results: We verified 1,638 data tables with 2.4 billion data records. For 222 under-treatment patients, 605 DICOM RT plans and 13,480 DICOM treatment records retrieved from the ROIS DICOM interface were compared, with no differences in fractions, doses delivered, or treatment parameters. We identified 82 new data tables and 78 amended/deleted tables consistent with the upgrade. Reports for 5,073 patient encounters over a 2-week horizon were compared and were identical to those before the upgrade. Content in 12,237 XML machine files was compared, with no differences identified. 
Conclusion: An independent QA/validation approach for ROIS upgrades was developed and implemented at our clinic. The success of this approach ensures robust QA of ROIS upgrades without manual paper/electronic checks and the associated intensive labor.
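The pre-/post-upgrade DICOM comparison described above amounts to a field-by-field diff of paired records. A minimal illustrative sketch (not the authors' software; the field names are hypothetical placeholders):

```python
# Illustrative sketch: compare paired treatment-record snapshots exported
# before and after a ROIS upgrade. Field names are hypothetical examples.

def diff_records(pre: dict, post: dict) -> dict:
    """Return {field: (pre_value, post_value)} for every field that differs."""
    keys = set(pre) | set(post)
    return {k: (pre.get(k), post.get(k)) for k in keys if pre.get(k) != post.get(k)}

pre_upgrade = {"fractions_delivered": 12, "dose_cGy": 2400, "gantry_angle": 180.0}
post_upgrade = {"fractions_delivered": 12, "dose_cGy": 2400, "gantry_angle": 180.0}

# An empty diff means no corruption was detected for this record pair.
assert diff_records(pre_upgrade, post_upgrade) == {}
```

In a real deployment each snapshot would be parsed from the DICOM stream itself; the diff logic stays the same.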

  15. Resident Choice and the Survey Process: The Need for Standardized Observation and Transparency

    ERIC Educational Resources Information Center

    Schnelle, John F.; Bertrand, Rosanna; Hurd, Donna; White, Alan; Squires, David; Feuerberg, Marvin; Hickey, Kelly; Simmons, Sandra F.

    2009-01-01

    Purpose: To describe a standardized observation protocol to determine if nursing home (NH) staff offer choice to residents during 3 morning activities of daily living (ADL) and compare the observational data with deficiency statements cited by state survey staff. Design and Methods: Morning ADL care was observed in 20 NHs in 5 states by research…

  16. The challenges of standardizing colonial waterbird survey protocols - what is working? What is not?

    Treesearch

    Melanie Steinkamp; Peter Frederick; Katharine Parsons; Harry Carter; Mike Parker

    2005-01-01

    Our ability to manage and conserve colonial waterbird species throughout Mexico, Meso-America, Canada, the Caribbean nations, and the United States is presently hampered by a lack of reliable information on the status and trends of their populations, information that can only be obtained by collecting comparable data using standardized data collection techniques that...

  17. DDN (Defense Data Network) Protocol Handbook. Volume 1. DoD Military Standard Protocols

    DTIC Science & Technology

    1985-12-01

    official Military Standard communication protocols in use on the DDN are included, as are several ARPANET (Advanced Research Projects Agency Network)... research protocols which are currently in use, and some protocols currently undergoing review. Tutorial information and auxiliary documents are also... compatible with DoD needs, by researchers wishing to improve the protocols, and by implementors of local area networks (LANs) wishing their

  18. Cryotherapy for acute ankle sprains: a randomised controlled study of two different icing protocols

    PubMed Central

    Bleakley, C M; McDonough, S M; MacAuley, D C

    2006-01-01

    Background The use of cryotherapy in the management of acute soft tissue injury is largely based on anecdotal evidence. Preliminary evidence suggests that intermittent cryotherapy applications are most effective at reducing tissue temperature to optimal therapeutic levels. However, its efficacy in treating injured human subjects is not yet known. Objective To compare the efficacy of an intermittent cryotherapy treatment protocol with a standard cryotherapy treatment protocol in the management of acute ankle sprains. Subjects Sportsmen (n = 44) and members of the general public (n = 45) with mild/moderate acute ankle sprains. Methods Subjects were randomly allocated, under strictly controlled double blind conditions, to one of two treatment groups: standard ice application (n = 46) or intermittent ice application (n = 43). The mode of cryotherapy was standardised across groups and consisted of melting iced water (0°C) in a standardised pack. Function, pain, and swelling were recorded at baseline and one, two, three, four, and six weeks after injury. Results Subjects treated with the intermittent protocol had significantly (p<0.05) less ankle pain on activity than those using a standard 20 minute protocol; however, one week after ankle injury, there were no significant differences between groups in terms of function, swelling, or pain at rest. Conclusion Intermittent applications may enhance the therapeutic effect of ice in pain relief after acute soft tissue injury. PMID:16611722

  19. Radiology metrics for safe use and regulatory compliance with CT imaging

    NASA Astrophysics Data System (ADS)

    Paden, Robert; Pavlicek, William

    2018-03-01

    The MACRA Act creates a Merit-Based Payment System, with monitoring of patient exposure from CT providing one possible quality metric for meeting merit requirements. Quality metrics are also required by The Joint Commission, the ACR, and CMS, as facilities are tasked to review CT irradiation events outside expected ranges, review protocols for appropriateness, and validate parameters for low-dose lung cancer screening. To efficiently collect and analyze irradiation events and associated DICOM tags, all clinical CT devices were connected via DICOM to a parser that extracted dose-related information for storage in a database. Dose data from every exam are compared to the appropriate external standard for that exam type. AAPM-recommended CTDIvol values for head and torso, adult and pediatric, coronary, and perfusion exams are used for this study. CT doses outside the expected range were automatically formatted into a report for analysis and review documentation. CT technologist textual content, the reason for proceeding with an irradiation above the recommended threshold, is captured for inclusion in follow-up reviews by physics staff. The use of a knowledge-based approach to labeling individual protocol and device settings is a practical solution that makes analysis and review efficient. Manual methods would require approximately 150 person-hours for our facility, exclusive of travel time and independent of device availability; the informatics tool yields an 89% time savings while also covering the low-dose CT comparison review and the low-dose lung cancer screening requirements set forth by CMS.
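The core of the review workflow described above is a comparison of each exam's CTDIvol against a reference value for its exam type. A hedged sketch of that check; the threshold values and exam-type keys below are illustrative placeholders, not the AAPM's actual recommended figures:

```python
# Hypothetical notification thresholds per exam type (mGy); placeholder values.
REFERENCE_CTDIVOL = {
    "adult_head": 80.0,
    "adult_torso": 25.0,
    "pediatric_head": 40.0,
}

def flag_events(events):
    """Return the events whose CTDIvol exceeds the reference for their exam type."""
    return [e for e in events
            if e["ctdivol_mGy"] > REFERENCE_CTDIVOL.get(e["exam_type"], float("inf"))]

events = [
    {"exam_type": "adult_head", "ctdivol_mGy": 95.0, "reason": "large patient"},
    {"exam_type": "adult_torso", "ctdivol_mGy": 18.0, "reason": ""},
]
# Only the head exam exceeds its reference and is queued for physics review.
assert [e["exam_type"] for e in flag_events(events)] == ["adult_head"]
```

In practice the event records would be populated from parsed DICOM dose data, and the free-text "reason" field would feed the physics review report.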

  20. Latest developments for the IAGOS database: Interoperability and metadata

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume

    2014-05-01

    In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by open access policy based on the submission of research requests which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and the real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. 
We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the Aircraft Research DLR database, and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for intercomparison. The optimal data transfer protocol is being investigated to ensure this interoperability. To facilitate satellite and model validation, tools will be made available for co-location and comparison with IAGOS data. We will enhance the JOIN application to properly display aircraft data as vertical profiles and along individual flight tracks, and to allow graphical comparison with model results accessible through interoperable web services, such as the daily products from the GMES/Copernicus atmospheric service.
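A standardized export pipeline of the kind described above typically validates metadata completeness before delivering files to users. A minimal sketch of such a check; the required-field list is an invented subset for illustration, not the actual IGAS/ISO 19115 schema:

```python
# Illustrative subset of required metadata fields; not the real IGAS schema.
REQUIRED_FIELDS = {"title", "instrument", "units", "processing_level", "contact"}

def missing_metadata(record: dict) -> set:
    """Return the required metadata fields that are absent or empty in a record."""
    return {f for f in REQUIRED_FIELDS if not record.get(f)}

flight = {"title": "IAGOS ozone profile", "instrument": "O3 analyser",
          "units": "ppb", "processing_level": "L2", "contact": "pi@example.org"}
# A complete record passes; incomplete records list what must be added.
assert missing_metadata(flight) == set()
```

Checks like this are what make downstream NetCDF-CF or NASA Ames exports reliably interoperable across portals.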

  1. Constructing Benchmark Databases and Protocols for Medical Image Analysis: Diabetic Retinopathy

    PubMed Central

    Kauppi, Tomi; Kämäräinen, Joni-Kristian; Kalesnykiene, Valentina; Sorri, Iiris; Uusitalo, Hannu; Kälviäinen, Heikki

    2013-01-01

    We address performance evaluation practices for developing medical image analysis methods, in particular how to establish and share databases of medical images with verified ground truth and solid evaluation protocols. Such databases support the development of better algorithms, rigorous method comparisons, and, consequently, technology transfer from research laboratories to clinical practice. For this purpose, we propose a framework consisting of reusable methods and tools for the laborious task of constructing a benchmark database. We provide a software tool for medical image annotation that helps collect class labels, spatial extents, and expert confidence ratings for lesions, and a method to appropriately combine the manual segmentations from multiple experts. The tool and all functionality necessary for method evaluation are provided as public software packages. As a case study, we used the framework and tools to establish the DiaRetDB1 V2.1 database for benchmarking diabetic retinopathy detection algorithms. The database contains a set of retinal images, ground truth based on information from multiple experts, and a baseline algorithm for the detection of retinopathy lesions. PMID:23956787
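One plausible way to combine manual segmentations from multiple experts, as the framework above describes, is a per-pixel majority vote over binary lesion masks. This is a sketch of that general technique only; the paper's actual fusion method may weight experts by their stated confidence:

```python
def majority_vote(masks):
    """Fuse equal-length binary masks: a pixel is lesion if most experts marked it."""
    n = len(masks)
    return [1 if sum(col) * 2 > n else 0 for col in zip(*masks)]

# Three experts annotate the same 4-pixel strip (1 = lesion, 0 = background).
expert_masks = [
    [0, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 1, 1, 0],
]
assert majority_vote(expert_masks) == [1, 1, 1, 0]
```

The fused mask then serves as the verified ground truth against which detection algorithms are benchmarked.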

  2. Size-Sorting Combined with Improved Nanocapillary-LC-MS for Identification of Intact Proteins up to 80 kDa

    PubMed Central

    Vellaichamy, Adaikkalam; Tran, John C.; Catherman, Adam D.; Lee, Ji Eun; Kellie, John F.; Sweet, Steve M.M.; Zamdborg, Leonid; Thomas, Paul M.; Ahlf, Dorothy R.; Durbin, Kenneth R.; Valaskovic, Gary A.; Kelleher, Neil L.

    2010-01-01

    Despite the availability of ultra-high resolution mass spectrometers, methods for separation and detection of intact proteins for proteome-scale analyses are still in a developmental phase. Here we report robust protocols for on-line LC-MS to drive high-throughput top-down proteomics in a fashion similar to bottom-up. Comparative work on protein standards showed that a polymeric stationary phase led to superior sensitivity over a silica-based medium in reversed-phase nanocapillary-LC, with detection of proteins >50 kDa routinely accomplished in the linear ion trap of a hybrid Fourier-Transform mass spectrometer. Protein identification was enabled by nozzle-skimmer dissociation (NSD) and detection of fragment ions with <5 ppm mass accuracy for highly-specific database searching using custom software. This overall approach led to identification of proteins up to 80 kDa, with 10-60 proteins identified in single LC-MS runs of samples from yeast and human cell lines pre-fractionated by their molecular weight using a gel-based sieving system. PMID:20073486
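The highly specific database searching mentioned above rests on a relative mass tolerance: an observed fragment mass matches a theoretical one only if the error is under 5 ppm. A small sketch of that criterion (the masses below are arbitrary examples, not data from the study):

```python
def within_ppm(observed: float, theoretical: float, tol_ppm: float = 5.0) -> bool:
    """True if the relative mass error |obs - theo| / theo is within tol_ppm."""
    return abs(observed - theoretical) / theoretical * 1e6 <= tol_ppm

assert within_ppm(1000.004, 1000.000)      # 4 ppm error: accepted as a match
assert not within_ppm(1000.010, 1000.000)  # 10 ppm error: rejected
```

Tight ppm tolerances are what keep false-positive identifications low when matching NSD fragment ions against a large protein database.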

  3. Dose calculation accuracy of different image value to density tables for cone-beam CT planning in head & neck and pelvic localizations.

    PubMed

    Barateau, Anaïs; Garlopeau, Christopher; Cugny, Audrey; De Figueiredo, Bénédicte Henriques; Dupin, Charles; Caron, Jérôme; Antoine, Mikaël

    2015-03-01

    We aimed to identify the most accurate combination of phantom and protocol for the image value to density table (IVDT) used in volumetric-modulated arc therapy (VMAT) dose calculation based on kV cone-beam CT (CBCT) imaging, for head and neck (H&N) and pelvic localizations. Three phantoms (Catphan® 600, CIRS® 062M (inner phantom for head and outer phantom for body), and the TomoTherapy® "Cheese" phantom) were used to create IVDT curves for the CBCT systems with two different CBCT protocols (Standard-dose Head and Standard Pelvis). Hounsfield unit (HU) stability over time and repeatability for a single On-Board Imager (OBI), and the compatibility of two distinct devices, were assessed with the Catphan® 600. Images of the anthropomorphic phantom CIRS ATOM® from both CT and CBCT modalities were used for VMAT dose calculation from the different IVDT curves. Dosimetric indices from CT and CBCT imaging were compared. IVDT curves from CBCT images differed substantially depending on the phantom used (up to 1000 HU for high densities) and the protocol applied (up to 200 HU for high densities). HU stability over time was verified over seven weeks. A maximum difference of 3% in the dose calculation indices studied was found between CT and CBCT VMAT dose calculation across the two localizations when appropriate IVDT curves were used. One IVDT curve per localization can be established, with bi-monthly verification of the IVDT-CBCT. The IVDT-CBCT from the CIRS head phantom with the Standard-dose Head protocol was the most accurate combination for dose calculation on H&N CBCT images. For pelvic localizations, the IVDT-CBCT established with the Cheese phantom and the Standard Pelvis protocol provided the best accuracy. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
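At dose-calculation time an IVDT is applied as a piecewise-linear lookup from Hounsfield units to density. A minimal sketch of that mechanism; the curve points below are hypothetical, not the calibration values measured in the study:

```python
# Hypothetical IVDT sample points: (HU, relative density). Not study data.
IVDT = [(-1000, 0.00), (0, 1.00), (1000, 1.60), (3000, 2.50)]

def hu_to_density(hu: float) -> float:
    """Linearly interpolate density from an IVDT table; clamp outside its range."""
    if hu <= IVDT[0][0]:
        return IVDT[0][1]
    for (h0, d0), (h1, d1) in zip(IVDT, IVDT[1:]):
        if hu <= h1:
            return d0 + (d1 - d0) * (hu - h0) / (h1 - h0)
    return IVDT[-1][1]

assert hu_to_density(0) == 1.00          # water-equivalent voxel
assert abs(hu_to_density(500) - 1.30) < 1e-9  # halfway between 0 and 1000 HU
```

The phantom-to-phantom HU differences reported above (up to 1000 HU at high densities) shift these curve points, which is why the phantom/protocol choice directly affects dose accuracy.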

  4. Visualization of the internal globus pallidus: sequence and orientation for deep brain stimulation using a standard installation protocol at 3.0 Tesla.

    PubMed

    Nölte, Ingo S; Gerigk, Lars; Al-Zghloul, Mansour; Groden, Christoph; Kerl, Hans U

    2012-03-01

    Deep-brain stimulation (DBS) of the internal globus pallidus (GPi) has shown remarkable therapeutic benefits for treatment-resistant neurological disorders, including dystonia and Parkinson's disease (PD). The success of DBS is critically dependent on reliable visualization of the GPi. The aim of the study was to evaluate promising 3.0 Tesla magnetic resonance imaging (MRI) methods for pre-stereotactic visualization of the GPi using a standard installation protocol. MRI at 3.0 T of nine healthy individuals and of one patient with PD was acquired (FLAIR, T1-MPRAGE, T2-SPACE, T2*-FLASH2D, and susceptibility-weighted imaging (SWI)). Image quality and visualization of the GPi in each sequence were assessed independently by two neuroradiologists using a 6-point scale. Axial, coronal, and sagittal planes of the T2*-FLASH2D images were compared. Inter-rater reliability, contrast-to-noise ratios (CNR), and signal-to-noise ratios (SNR) for the GPi were determined. For illustration, axial T2*-FLASH2D images were fused with a section schema of the Schaltenbrand-Wahren stereotactic atlas. The GPi was best and most reliably visualized in axial, and to a lesser degree coronal, T2*-FLASH2D images. No major artifacts in the GPi were observed in any of the sequences. SWI offered a significantly higher CNR for the GPi than standard T2-weighted imaging using the standard parameters. Fusion of the axial T2*-FLASH2D images with the atlas projected the GPi clearly within the boundaries of the section schema. Using a standard installation protocol at 3.0 T, T2*-FLASH2D imaging (particularly the axial view) provides optimal and reliable delineation of the GPi.

  5. A tonic heat test stimulus yields a larger and more reliable conditioned pain modulation effect compared to a phasic heat test stimulus

    PubMed Central

    Lie, Marie Udnesseter; Matre, Dagfinn; Hansson, Per; Stubhaug, Audun; Zwart, John-Anker; Nilsen, Kristian Bernhard

    2017-01-01

    Introduction: The interest in conditioned pain modulation (CPM) as a clinical tool for measuring endogenously induced analgesia is increasing. There is, however, large variation in CPM methodology, hindering comparison of results across studies. Research comparing different CPM protocols is needed in order to obtain a standardized test paradigm. Objectives: The aim of the study was to assess whether a protocol with phasic heat stimuli as the test stimulus is preferable to a protocol with a tonic heat stimulus as the test stimulus. Methods: In this experimental crossover study, we compared 2 CPM protocols with different test stimuli: one with a tonic test stimulus (a constant heat stimulus of 120-second duration) and one with phasic test stimuli (3 heat stimulations of 5 seconds' duration separated by 10 seconds). The conditioning stimulus was a 7°C water bath in parallel with the test stimulus. Twenty-four healthy volunteers were assessed on 2 occasions a minimum of 1 week apart. Differences in the magnitude and test-retest reliability of the CPM effect in the 2 protocols were investigated with repeated-measures analysis of variance and by relative and absolute reliability indices. Results: The protocol with the tonic test stimulus induced a significantly larger CPM effect than the protocol with phasic test stimuli (P < 0.001). Fair and good relative reliability were found with the phasic and tonic test stimuli, respectively. Absolute reliability indices showed large intraindividual variability from session to session in both protocols. Conclusion: The present study shows that a CPM protocol with a tonic test stimulus is preferable to one with phasic test stimuli. However, we emphasize that one should be cautious in using the CPM effect as a biomarker or in clinical decision making at the individual level, owing to large intraindividual variability. PMID:29392240

  6. Lack of evidence of a beneficial effect of azathioprine in dogs treated with prednisolone for idiopathic immune-mediated hemolytic anemia: a retrospective cohort study.

    PubMed

    Piek, Christine J; van Spil, Willem Evert; Junius, Greet; Dekker, Aldo

    2011-04-13

    Azathioprine is used as an immunosuppressant in canine immune-mediated hemolytic anemia (IMHA), but this potentially toxic and carcinogenic drug has not been proven to be beneficial. The aim of this study was to determine the difference in outcome and survival of dogs with idiopathic IMHA treated with a protocol that included azathioprine and prednisolone versus a protocol of prednisolone alone. The study included 222 dogs with a hematocrit lower than 0.30 L/L and either a positive Coombs' test or spherocytosis, and no evidence of diseases that could trigger IMHA. The clinical and laboratory data at the time of diagnosis, the response to therapy, and survival were compared in dogs treated according to the prednisolone and azathioprine protocol (AP protocol; n = 149) and dogs treated according to the prednisolone protocol (P protocol; n = 73). At study entry, the two groups were comparable, except that thrombocyte counts were significantly lower and clinical signs had been present significantly longer in the AP protocol group. No significant difference in survival was found between the two groups: the 1-year survival was 64% (95% CI 54-77%) in the P protocol group and 69% (95% CI 59-80%) in the AP protocol group. Azathioprine would appear not to be beneficial as standard treatment for all cases of IMHA; however, a blinded, randomized clinical trial is needed to establish whether outcome differs between the two treatment protocols.

  7. A Standard-Driven Data Dictionary for Data Harmonization of Heterogeneous Datasets in Urban Geological Information Systems

    NASA Astrophysics Data System (ADS)

    Liu, G.; Wu, C.; Li, X.; Song, P.

    2013-12-01

    The 3D urban geological information system has been a major part of the national urban geological survey project of the China Geological Survey in recent years. Large amounts of multi-source and multi-subject data are to be stored in urban geological databases. Various models and vocabularies have been drafted and applied by industrial companies for urban geological data. Issues such as duplicate and ambiguous term definitions and differing coding structures increase the difficulty of information sharing and data integration. To solve this problem, we proposed a national-standard-driven information classification and coding method to effectively store and integrate urban geological data, and we applied data dictionary technology to achieve structured and standardized data storage. The overall purpose of this work is to set up a common data platform providing an information sharing service. Research progress is as follows: (1) A unified classification and coding method for multi-source data based on national standards. The underlying national standards include GB 9649-88 for geology and GB/T 13923-2006 for geography. Current industrial models are compared with the national standards to build a mapping table. The attributes of the various urban geological data entity models are reduced to several categories according to their application phases and domains. A logical data model is then set up as a standard format for designing data file structures for a relational database. (2) A multi-level data dictionary for enforcing data standardization. 
Several levels of data dictionary are designed: the model data dictionary manages system database files and eases maintenance of the whole database system; the attribute dictionary organizes the fields used in database tables; the term and code dictionary provides a standard for the urban information system by adopting appropriate classification and coding methods; and the comprehensive data dictionary manages system operation and security. (3) An extension of the system data management function based on the data dictionary. The data item constraint input function makes use of the standard term and code dictionary to obtain standardized input. The attribute dictionary organizes all the fields of an urban geological information database to ensure consistent term use across fields. The model dictionary is used to automatically generate a database operation interface with standard semantic content via the term and code dictionary. The above method and technology have been applied to the construction of the Fuzhou Urban Geological Information System in Southeast China with satisfactory results.
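The term-and-code dictionary constraint described above can be sketched as a lookup that accepts a field value only if it maps, possibly via a synonym table, to a national-standard code. The terms and codes below are invented placeholders, not entries from GB 9649-88:

```python
# Hypothetical standard vocabulary: canonical term -> standard code.
TERM_CODES = {
    "silty clay": "SC-01",
    "fine sand": "FS-02",
    "gravel": "GR-03",
}
# Hypothetical mapping table: industrial vocabulary -> canonical standard term.
SYNONYMS = {"clayey silt": "silty clay"}

def standardize(term: str):
    """Map an input term to its standard (term, code) pair, resolving synonyms."""
    canonical = SYNONYMS.get(term.lower().strip(), term.lower().strip())
    code = TERM_CODES.get(canonical)
    if code is None:
        raise ValueError(f"unmapped term: {term!r}")
    return canonical, code

# Industrial wording is normalized to the national-standard term and code.
assert standardize("Clayey Silt") == ("silty clay", "SC-01")
```

Rejecting unmapped terms at input time is what keeps every stored record joinable across the multi-source databases.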

  8. Simplified dispatch-assisted CPR instructions outperform standard protocol.

    PubMed

    Dias, J A; Brown, T B; Saini, D; Shah, R C; Cofield, S S; Waterbor, J W; Funkhouser, E; Terndrup, T E

    2007-01-01

    Dispatch-assisted chest compressions only CPR (CC-CPR) has gained widespread acceptance, and recent research suggests that increasing the proportion of compression time during CPR may increase survival from out-of-hospital cardiac arrest. We created a simplified CC-CPR protocol to reduce the time to start chest compressions and to increase the proportion of time spent delivering chest compressions. This simplified protocol was compared to a published protocol, Medical Priority Dispatch System (MPDS) Version 11.2, recommended by the National Academies of Emergency Dispatch. Subjects were randomized to the MPDS v11.2 protocol or the simplified protocol. Data were recorded from a Laerdal Resusci Anne SkillReporter manikin. A simulated emergency medical dispatcher, contacted by cell phone, delivered standardized instructions for both protocols. Outcomes included chest compression rate, depth, hand position, full release, overall proportion of compressions without error, time to start of CPR, and total hands-off chest time. Proportions were analyzed by Wilcoxon's rank sum tests, and time variables with Welch ANOVA and Wilcoxon's rank sum test. All tests used a two-sided alpha level of 0.05. One hundred and seventeen subjects were randomized prospectively, 58 to the standard protocol and 59 to the simplified protocol. The average age of subjects in both groups was 25 years. For both groups, the compression rate was equivalent (104 simplified versus 94 MPDS, p = 0.13), as was the proportion with total release (1.0 simplified versus 1.0 MPDS, p = 0.09). The proportion to the correct depth was greater in the simplified protocol (0.31 versus 0.03, p < 0.01), as was the proportion of compressions done without error (0.05 versus 0.0, p = 0.16). Time to start of chest compressions and total hands-off chest time were better in the simplified protocol (start time 60.9 s versus 78.6 s, p < 0.0001; hands-off chest time 69 s versus 95 s, p < 0.0001). 
The proportion with correct hand position, however, was worse in the simplified protocol (0.35 versus 0.84, p < 0.01). The simplified protocol was as good as, or better than the MPDS v11.2 protocol in every aspect studied except hand position, and the simplified protocol resulted in significant time savings. The protocol may need modification to ensure correct hand position. Time savings and improved quality of CPR achieved by the new set of instructions could be important in strengthening critical links in the cardiac chain of survival.

  9. A protocol for combined Photinus and Renilla luciferase quantification compatible with protein assays.

    PubMed

    Hampf, Mathias; Gossen, Manfred

    2006-09-01

    We established a quantitative reporter gene protocol, the P/Rluc assay system, allowing the sequential measurement of Photinus and Renilla luciferase activities from the same extract. Unlike comparable commercial reporter assay systems and their noncommercial counterparts, the P/Rluc assay system was formulated with full compatibility with standard protein assay methods in mind. This feature greatly expands the range of applications for assay systems quantifying the expression of multiple luciferase reporters.

  10. Techniques and Protocols for Dispersing Nanoparticle Powders in Aqueous Media-Is there a Rationale for Harmonization?

    PubMed

    Hartmann, Nanna B; Jensen, Keld Alstrup; Baun, Anders; Rasmussen, Kirsten; Rauscher, Hubert; Tantra, Ratna; Cupi, Denisa; Gilliland, Douglas; Pianella, Francesca; Riego Sintes, Juan M

    2015-01-01

    Selecting appropriate ways of bringing engineered nanoparticles (ENP) into aqueous dispersion is a main obstacle for testing, and thus for understanding and evaluating, their potential adverse effects to the environment and human health. Using different methods to prepare (stock) dispersions of the same ENP may be a source of variation in the toxicity measured. Harmonization and standardization of dispersion methods applied in mammalian and ecotoxicity testing are needed to ensure a comparable data quality and to minimize test artifacts produced by modifications of ENP during the dispersion preparation process. Such harmonization and standardization will also enhance comparability among tests, labs, and studies on different types of ENP. The scope of this review was to critically discuss the essential parameters in dispersion protocols for ENP. The parameters are identified from individual scientific studies and from consensus reached in larger scale research projects and international organizations. A step-wise approach is proposed to develop tailored dispersion protocols for ecotoxicological and mammalian toxicological testing of ENP. The recommendations of this analysis may serve as a guide to researchers, companies, and regulators when selecting, developing, and evaluating the appropriateness of dispersion methods applied in mammalian and ecotoxicity testing. However, additional experimentation is needed to further document the protocol parameters and investigate to what extent different stock dispersion methods affect ecotoxicological and mammalian toxicological responses of ENP.

  11. [Treatment results with ALL-BFM-95 protocol in children with acute lymphoblastic leukemia in Hungary].

    PubMed

    Müller, Judit; Kovács, Gábor; Jakab, Zsuzsanna; Rényi, Imre; Galántai, Ilona; Békési, Andrea; Kiss, Csongor; Nagy, Kálmán; Kajtár, Pál; Bartyik, Katalin; Masát, Péter; Magyarosy, Edina

    2005-01-09

    In Hungary, children (from 1 to 18 years of age) with de novo acute lymphoblastic leukemia were treated from January 1996 to October 2002 according to protocol ALL-BFM-95. The aim of this study was to evaluate the experience with this protocol and the treatment results according to risk group, and to compare the Hungarian data with international results. Patients were stratified into 3 risk groups based on initial white blood cell count, age, immunology, cytogenetics, and response to treatment: standard, medium, and high risk. Three hundred and sixty-eight children entered the study (male-female ratio 1.27:1, median age 6 years and 4 months). 110 (29.9%) children were in the standard, 210 (57.1%) in the medium, and 48 (13%) in the high risk group. The duration of chemotherapy was 2 years, except for boys in the standard risk group, whose maintenance therapy was 1 year longer. The overall complete remission rate was 93.2%. 20 (5.4%) children died in induction and 5 (1.4%) were non-responders. The 5-year overall survival for all patients was 78.5%; it was 93.2% in the standard risk group, 78.4% in the medium risk group, and 44.5% in the high risk group, with a minimum follow-up of 1.19 years and a median follow-up of 4.85 years. Of the 368 patients, 272 (73.9%) are still in their first complete clinical remission, and another 18 children are alive after relapse. Relapse was diagnosed in 14.7% of the patients; the most common site was the bone marrow. A second malignancy occurred in one patient. The 5-year event-free survival for all patients was 72.6%; it was 87.6% in the standard risk group, 72.1% in the medium risk group, and 39.9% in the high risk group. The treatment outcome of children with acute lymphoblastic leukemia has improved remarkably over the last decades. 78% of children suffering from acute lymphoblastic leukemia could be cured with the ALL-BFM-95 protocol. 
The Hungarian results are comparable to those achieved with the ALL-BFM-95 protocol by other leukemia study groups worldwide.

  12. Biplane interventional pediatric system with cone‐beam CT: dose and image quality characterization for the default protocols

    PubMed Central

    Vañó, Eliseo; Alejo, Luis; Ubeda, Carlos; Gutiérrez‐Larraya, Federico; Garayoa, Julia

    2016-01-01

    The aim of this study was to assess image quality and radiation dose of a biplane angiographic system with cone‐beam CT (CBCT) capability tuned for pediatric cardiac procedures. The results of this study can be used to explore dose reduction techniques. For pulsed fluoroscopy and cine modes, polymethyl methacrylate phantoms of various thicknesses and a Leeds TOR 18‐FG test object were employed. Various fields of view (FOV) were selected. For CBCT, the study employed head and body dose phantoms, Catphan 504, and an anthropomorphic cardiology phantom. The study also compared two 3D rotational angiography protocols. The entrance surface air kerma per frame increases by a factor of 3–12 when comparing cine and fluoroscopy frames. The biggest difference in the signal‐to‐noise ratio between fluoroscopy and cine modes occurs at FOV 32 cm because fluoroscopy is acquired at a 1440×1440 pixel matrix size and in unbinned mode, whereas cine is acquired at 720×720 pixels and in binned mode. The high‐contrast spatial resolution of cine is better than that of fluoroscopy, except for FOV 32 cm, because fluoroscopy mode with 32 cm FOV is unbinned. Acquiring CBCT series with a 16 cm head phantom using the standard dose protocol results in a threefold dose increase compared with the low‐dose protocol. Although the amount of noise present in the images acquired with the low‐dose protocol is much higher than that obtained with the standard mode, the images present better spatial resolution. A 1 mm diameter rod with 250 Hounsfield units can be distinguished in reconstructed images with an 8 mm slice width. Pediatric‐specific protocols provide lower doses while maintaining sufficient image quality. The system offers a novel 3D imaging mode. The acquisition of CBCT images results in increased doses administered to the patients, but also provides further diagnostic information contained in the volumetric images. 
The assessed CBCT protocols provide images that are noisy, but with very good spatial resolution. PACS number(s): 87.59.‐e, 87.59.‐C, 87.59.‐cf, 87.59.Dj, 87.57.uq PMID:27455474

  13. Biplane interventional pediatric system with cone-beam CT: dose and image quality characterization for the default protocols.

    PubMed

    Corredoira, Eva; Vañó, Eliseo; Alejo, Luis; Ubeda, Carlos; Gutiérrez-Larraya, Federico; Garayoa, Julia

    2016-07-08

    The aim of this study was to assess image quality and radiation dose of a biplane angiographic system with cone-beam CT (CBCT) capability tuned for pediatric cardiac procedures. The results of this study can be used to explore dose reduction techniques. For pulsed fluoroscopy and cine modes, polymethyl methacrylate phantoms of various thicknesses and a Leeds TOR 18-FG test object were employed. Various fields of view (FOV) were selected. For CBCT, the study employed head and body dose phantoms, Catphan 504, and an anthropomorphic cardiology phantom. The study also compared two 3D rotational angiography protocols. The entrance surface air kerma per frame increases by a factor of 3-12 when comparing cine and fluoroscopy frames. The biggest difference in the signal-to-noise ratio between fluoroscopy and cine modes occurs at FOV 32 cm because fluoroscopy is acquired at a 1440 × 1440 pixel matrix size and in unbinned mode, whereas cine is acquired at 720 × 720 pixels and in binned mode. The high-contrast spatial resolution of cine is better than that of fluoroscopy, except for FOV 32 cm, because fluoroscopy mode with 32 cm FOV is unbinned. Acquiring CBCT series with a 16 cm head phantom using the standard dose protocol results in a threefold dose increase compared with the low-dose protocol. Although the amount of noise present in the images acquired with the low-dose protocol is much higher than that obtained with the standard mode, the images present better spatial resolution. A 1 mm diameter rod with 250 Hounsfield units can be distinguished in reconstructed images with an 8 mm slice width. Pediatric-specific protocols provide lower doses while maintaining sufficient image quality. The system offers a novel 3D imaging mode. The acquisition of CBCT images results in increased doses administered to the patients, but also provides further diagnostic information contained in the volumetric images. 
The assessed CBCT protocols provide images that are noisy, but with very good spatial resolution. © 2016 The Authors.

  14. Accuracy of computer-aided design models of the jaws produced using ultra-low MDCT doses and ASIR and MBIR.

    PubMed

    Al-Ekrish, Asma'a A; Alfadda, Sara A; Ameen, Wadea; Hörmann, Romed; Puelacher, Wolfgang; Widmann, Gerlig

    2018-06-16

    To compare the surface of computer-aided design (CAD) models of the maxilla produced using ultra-low MDCT doses combined with filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASIR), and model-based iterative reconstruction (MBIR) with that produced from a standard-dose/FBP protocol. A cadaveric completely edentulous maxilla was imaged using a standard dose protocol (CTDIvol: 29.4 mGy) and FBP, in addition to 5 low dose test protocols (LD1-5) (CTDIvol: 4.19, 2.64, 0.99, 0.53, and 0.29 mGy) reconstructed with FBP, ASIR 50, ASIR 100, and MBIR. A CAD model from each test protocol was superimposed onto the reference model using the 'Best Fit Alignment' function. Differences between the test and reference models were analyzed as maximum, mean, and root-mean-square deviations, and color-coded models were obtained that demonstrated the location, magnitude, and direction of the deviations. Based upon the magnitude, size, and distribution of areas of deviations, CAD models from the following protocols were comparable to the reference model: FBP/LD1; ASIR 50/LD1 and LD2; ASIR 100/LD1, LD2, and LD3; MBIR/LD1. The following protocols demonstrated deviations mostly between 1-2 mm or under 1 mm but over large areas, and so their effect on surgical guide accuracy is questionable: FBP/LD2; MBIR/LD2, LD3, LD4, and LD5. The following protocols demonstrated large deviations over large areas and therefore were not comparable to the reference model: FBP/LD3, LD4, and LD5; ASIR 50/LD3, LD4, and LD5; ASIR 100/LD4, and LD5. When MDCT is used for CAD models of the jaws, dose reductions of 86% may be possible with FBP, 91% with ASIR 50, and 97% with ASIR 100. Analysis of the stability and accuracy of CAD/CAM surgical guides as directly related to the jaws is needed to confirm the results.

  15. The impact of perioperative fluid therapy on short-term outcomes and 5-year survival among patients undergoing colorectal cancer surgery - A prospective cohort study within an ERAS protocol.

    PubMed

    Asklid, D; Segelman, J; Gedda, C; Hjern, F; Pekkari, K; Gustafsson, U O

    2017-08-01

    Restricted perioperative fluid therapy is one of several interventions in the enhanced recovery after surgery (ERAS) protocol, designed to reduce morbidity and hospital stay after surgery. The impact of this single intervention on short- and long-term outcomes after colorectal surgery is unknown. This cohort study includes all consecutive patients operated with abdominal resection of colorectal cancer 2002-2007 at Ersta Hospital, Stockholm, Sweden. All patients were treated within an ERAS protocol and registered in the ERAS database. Compliance with interventions in the ERAS protocol was analysed. The impact of a restrictive perioperative fluid therapy (≤3000 ml on the day of surgery) protocol on short-term outcomes as well as 5-year survival was assessed with multivariable analysis adjusted for confounding factors. Nine hundred and eleven patients were included. Patients receiving ≤3000 ml of intravenous fluids on the day of surgery had a lower risk of complications (OR 0.44, 95% CI 0.28-0.71) and symptoms delaying discharge (OR 0.47, 95% CI 0.32-0.70), and a shorter length of stay, compared with patients receiving >3000 ml. In Cox regression analysis, the risk of cancer-specific death was reduced by 55% (HR 0.45, 95% CI 0.25-0.81) for patients receiving ≤3000 ml compared with patients receiving >3000 ml. A restrictive compared with a non-restrictive perioperative fluid therapy on the day of surgery may be associated with lower short-term complication rates, faster recovery, shorter length of stay and improved 5-year survival. Copyright © 2017 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.

  16. Polyethylene glycol 3350 based colon cleaning protocol: 2 d vs 4 d head to head comparison.

    PubMed

    Elitsur, Rotem; Butcher, Lisa; Lund, Vicki; Elitsur, Yoram

    2013-04-16

    To compare 2 d and 4 d colon cleansing protocols. Children who were scheduled for a colonoscopy procedure (2010-2012) for various medical reasons were recruited from the pediatric gastroenterology clinic at Marshall University School of Medicine, Huntington, WV. Exclusion criteria were allergy to the medications used in the protocols [polyethylene glycol (PEG) 3350, bisacodyl] and metabolic or renal disease. Two PEG 3350 protocols for 4 d (A) and 2 d (B) were prescribed as previously described. A questionnaire describing the volume of PEG consumed, clinical data, and side effects was recorded. Colon preparation was graded by two observers according to a previously described method. The main outcome was the rate of adequate colon preparation. A total of 78 patients were considered for the final calculation (group A: 40, group B: 38). Age and stool consistency on the last day were comparable in both groups, but the number of stools/day was significantly higher in group B (P = 0.001). Adequate colon preparation was reached in 57.5% (A) and 73.6% (B), respectively (P = 0.206). Side effects were minimal and comparable in both groups. There was no difference in children's age, stool characteristics, or side effects between the children with adequate or inadequate colon preparation. Correlation and agreement between observers were excellent (Pearson correlation = 0.972, kappa = 1.0). No difference between protocols was observed, but the 2 d protocol was preferable because of its shorter duration. Direct comparison between different colon cleansing protocols is crucial in order to establish the "gold standard" protocol for children.
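The inter-observer agreement quoted in this record (Pearson correlation and kappa) can be illustrated with a minimal, self-contained sketch of Cohen's kappa for two raters; the preparation grades below are invented for illustration and are not the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    # Chance agreement from each rater's marginal label frequencies
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical colon-preparation grades from two observers
a = ["adequate", "adequate", "inadequate", "adequate", "inadequate", "adequate"]
b = ["adequate", "adequate", "inadequate", "adequate", "inadequate", "adequate"]
print(cohens_kappa(a, b))  # perfect agreement -> 1.0
```

A kappa of 1.0, as reported in the abstract, means the two observers never disagreed on any case.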

  17. Efficacy of an accelerated recovery protocol for Oxford unicompartmental knee arthroplasty--a randomised controlled trial.

    PubMed

    Reilly, K A; Beard, D J; Barker, K L; Dodd, C A F; Price, A J; Murray, D W

    2005-10-01

    Unicompartmental knee arthroplasty (UKA) is appropriate for one in four patients with osteoarthritic knees. This study was performed to compare the safety, effectiveness and economic viability of a new accelerated protocol with current standard care in a state healthcare system. A single-blind RCT design was used. Eligible patients were screened for NSAID tolerance, social circumstances and geographical location before allocation to an accelerated recovery group (A) or standard care group (S). The primary outcome was the Oxford Knee Assessment at 6 months post-operation, compared using independent Mann-Whitney U-tests. A simple difference in costs incurred was calculated. The study power was sufficient to avoid type II errors. Forty-one patients were included. The average stay for Group A was 1.5 days. Group S averaged 4.3 days. No significant difference in outcomes was found between groups. The new protocol achieved cost savings of 27% and significantly reduced hospital bed occupancy. In addition, patient satisfaction was greater with accelerated discharge than with the routine discharge time. The strict inclusion criteria meant that 75% of eligible patients were excluded. However, a large percentage of these exclusions were due to the distances patients lived from the hospital.

  18. Low-dose ionizing radiation increases the mortality risk of solid cancers in nuclear industry workers: A meta-analysis

    PubMed Central

    Qu, Shu-Gen; Gao, Jin; Tang, Bo; Yu, Bo; Shen, Yue-Ping; Tu, Yu

    2018-01-01

    Low-dose ionizing radiation (LDIR) may increase the mortality of solid cancers in nuclear industry workers, but only a few individual cohort studies exist, and the available reports have low statistical power. The aim of the present study was to assess solid cancer mortality risk from LDIR in the nuclear industry using standardized mortality ratios (SMRs) and 95% confidence intervals. A systematic literature search through the PubMed and Embase databases identified 27 studies relevant to this meta-analysis. There was statistical significance for total, solid and lung cancers, with meta-SMR values of 0.88, 0.80, and 0.89, respectively. There was evidence of stochastic effects of ionizing radiation, but more definitive conclusions require additional analyses using standardized protocols to determine whether LDIR increases the risk of solid cancer-related mortality. PMID:29725540
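An SMR is the ratio of observed to expected deaths in a cohort, and its confidence interval is commonly approximated on the log scale. A minimal sketch of that standard calculation follows; the counts are invented for illustration and are not taken from this meta-analysis.

```python
import math

def smr_with_ci(observed, expected, z=1.96):
    """Standardized mortality ratio with an approximate 95% CI,
    using the log-normal approximation var(ln SMR) ~ 1/observed."""
    smr = observed / expected
    se_log = math.sqrt(1.0 / observed)
    lo = smr * math.exp(-z * se_log)
    hi = smr * math.exp(z * se_log)
    return smr, lo, hi

# Hypothetical cohort: 80 observed solid-cancer deaths vs 100 expected
smr, lo, hi = smr_with_ci(80, 100)
print(f"SMR={smr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An SMR below 1 with a confidence interval excluding 1 would indicate fewer deaths than expected, which is why pooled meta-SMRs like those above need careful interpretation (e.g. healthy-worker effects).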

  19. Effectiveness and economic evaluation of chiropractic care for the treatment of low back pain: a systematic review protocol.

    PubMed

    Blanchette, Marc-André; Bussières, André; Stochkendahl, Mette Jensen; Boruff, Jill; Harrison, Pamela

    2015-03-18

    Chiropractic care is a common treatment for low back pain (LBP). Previous studies have failed to clarify the relative cost-effectiveness of chiropractic care in comparison with other commonly used approaches because previous attempts to synthesize the economic literature have only included partial economic evaluations. The objective of this project is to estimate the clinical effectiveness and cost-effectiveness of chiropractic care compared to other commonly used care approaches among adult patients with non-specific LBP. Two systematic reviews will be conducted to identify 1) randomized controlled trials and 2) full economic evaluations of chiropractic care for low back pain compared to standard care provided by other healthcare providers. We will conduct searches in specialized electronic databases for randomized controlled trials and full economic evaluations published between 1990 and 2014 using a combination of keywords and MeSH terms. This will be supplemented by a search of the gray literature. Citations, abstracts, and relevant papers will be screened for eligibility by two reviewers independently. Studies will be critically appraised using 1) the Cochrane risk of bias tool and 2) the Drummond (BMJ) checklist. Results will be summarized using Slavin's qualitative best-evidence synthesis approach. Data relating to the primary outcomes of the effectiveness study will be evaluated for inclusion in meta-analyses. The costs will be standardized to the same currency (USD) and adjusted to the same year for inflation. The incremental cost-effectiveness, incremental net benefit, and relevant confidence intervals will be recalculated in order to facilitate comparison between studies. Our review will evaluate both the clinical effectiveness and the cost-effectiveness associated with chiropractic care for LBP. 
A more precise estimate of the cost-effectiveness of chiropractic care for LBP relative to other forms of conservative care is needed for decision-makers and third-party payers to offer best care options for LBP. Our results will facilitate evidence-based management of patients with LBP and identify key areas for future research. The protocol is registered on PROSPERO ( CRD42014008746 ).

  20. Comparing the mental health of rural-to-urban migrant children and their counterparts in china: Protocol for a systematic review and meta-analysis.

    PubMed

    Zhang, Jun-Hua; Yan, Li-Xia; Yuan, Yang

    2018-04-01

    In recent years, the issue of migrant children whose peasant parents work in cities has attracted widespread attention in China because of the sheer number of such children and the benefits bundled with China's household registration system. The focus has gradually extended from early education opportunities to all aspects of physical and mental development, especially the social adaptation and mental health of migrant children. The negative impact of environmental change on migrant children's mental health is very worrying for parents and society. Some studies have found that migrant children's mental health is significantly poorer than that of their peers, but other studies hold the opposite view. Thus, the mental health status of migrant children remains controversial, which may relate to differences in the specific mental health problems examined, regions, comparison groups, and researchers. The objective of this protocol is to investigate whether mental health and its subdimensions differ between rural-to-urban migrant children and their counterparts living in China, and to examine study characteristics that might explain differences among studies. We will search PubMed, Embase, OVID, ERIC, Web of Science, and Chinese databases including CNKI, Chongqing VIP, and Wan Fang Data from inception to April 2018. Cross-sectional studies with a comparison of migrant children and their counterparts will be included. The primary outcome will be the mean and standard deviation of mental health and its sub-dimensions. The standardized mean difference will be used as the main effect measure. Subgroup analyses will be carried out by study location and school type. Sensitivity analyses will be conducted to assess the robustness of the findings. Analyses will be performed with RevMan and Stata software. This systematic review and meta-analysis will compare the mental health status of rural-to-urban migrant children and their counterparts living in China. 
The results of this systematic review and meta-analysis will help provide a more reliable understanding of the mental health of rural-to-urban migrant children and the reasons for the controversy on this issue.
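The standardized mean difference named as this protocol's main effect measure is conventionally Cohen's d: the difference in group means divided by the pooled standard deviation. A minimal sketch with invented scores (not data from any included study):

```python
import math

def standardized_mean_difference(m1, s1, n1, m2, s2, n2):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2)
                          / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical mental-health scale scores: migrant vs. non-migrant children
d = standardized_mean_difference(48.0, 10.0, 200, 52.0, 10.0, 220)
print(round(d, 3))  # -> -0.4 (migrant group scores 0.4 SD lower)
```

Because different studies use different mental-health scales, dividing by the pooled SD puts every study's mean difference on a common, unitless scale before pooling.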

  1. Yoga vs. physical therapy vs. education for chronic low back pain in predominantly minority populations: study protocol for a randomized controlled trial.

    PubMed

    Saper, Robert B; Sherman, Karen J; Delitto, Anthony; Herman, Patricia M; Stevans, Joel; Paris, Ruth; Keosaian, Julia E; Cerrada, Christian J; Lemaster, Chelsey M; Faulkner, Carol; Breuer, Maya; Weinberg, Janice

    2014-02-26

    Chronic low back pain causes substantial morbidity and cost to society while disproportionately impacting low-income and minority adults. Several randomized controlled trials show yoga is an effective treatment. However, the comparative effectiveness of yoga and physical therapy, a common mainstream treatment for chronic low back pain, is unknown. This is a randomized controlled trial for 320 predominantly low-income minority adults with chronic low back pain, comparing yoga, physical therapy, and education. Inclusion criteria are adults 18-64 years old with non-specific low back pain lasting ≥ 12 weeks and a self-reported average pain intensity of ≥ 4 on a 0-10 scale. Recruitment takes place at Boston Medical Center, an urban academic safety-net hospital and seven federally qualified community health centers located in diverse neighborhoods. The 52-week study has an initial 12-week Treatment Phase where participants are randomized in a 2:2:1 ratio into i) a standardized weekly hatha yoga class supplemented by home practice; ii) a standardized evidence-based exercise therapy protocol adapted from the Treatment Based Classification method, individually delivered by a physical therapist and supplemented by home practice; and iii) education delivered through a self-care book. Co-primary outcome measures are 12-week pain intensity measured on an 11-point numerical rating scale and back-specific function measured using the modified Roland Morris Disability Questionnaire. In the subsequent 40-week Maintenance Phase, yoga participants are re-randomized in a 1:1 ratio to either structured maintenance yoga classes or home practice only. Physical therapy participants are similarly re-randomized to either five booster sessions or home practice only. Education participants continue to follow recommendations of educational materials. 
We will also assess cost effectiveness from the perspectives of the individual, insurers, and society using claims databases, electronic medical records, self-report cost data, and study records. Qualitative data from interviews will add subjective detail to complement quantitative data. This trial is registered in ClinicalTrials.gov, with the ID number: NCT01343927.
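The protocol above specifies a 2:2:1 allocation ratio but not the randomization algorithm. One common way to achieve a fixed ratio is permuted-block randomization, sketched here under that assumption with invented details (block size, seed):

```python
import random

def block_randomize(n_participants, ratio=(2, 2, 1),
                    arms=("yoga", "physical therapy", "education"), seed=0):
    """Permuted-block randomization: each block of size sum(ratio) contains
    the arms in the target proportion, in a random order."""
    rng = random.Random(seed)
    block = [arm for arm, k in zip(arms, ratio) for _ in range(k)]
    allocation = []
    while len(allocation) < n_participants:
        rng.shuffle(block)          # randomize order within each block
        allocation.extend(block)
    return allocation[:n_participants]

alloc = block_randomize(320)
print(alloc.count("yoga"), alloc.count("physical therapy"),
      alloc.count("education"))    # -> 128 128 64
```

With 320 participants and a block size of 5, the 2:2:1 ratio is hit exactly; blocking also keeps the arms balanced throughout the enrollment period.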

  2. Yoga vs. physical therapy vs. education for chronic low back pain in predominantly minority populations: study protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Chronic low back pain causes substantial morbidity and cost to society while disproportionately impacting low-income and minority adults. Several randomized controlled trials show yoga is an effective treatment. However, the comparative effectiveness of yoga and physical therapy, a common mainstream treatment for chronic low back pain, is unknown. Methods/Design This is a randomized controlled trial for 320 predominantly low-income minority adults with chronic low back pain, comparing yoga, physical therapy, and education. Inclusion criteria are adults 18–64 years old with non-specific low back pain lasting ≥12 weeks and a self-reported average pain intensity of ≥4 on a 0–10 scale. Recruitment takes place at Boston Medical Center, an urban academic safety-net hospital and seven federally qualified community health centers located in diverse neighborhoods. The 52-week study has an initial 12-week Treatment Phase where participants are randomized in a 2:2:1 ratio into i) a standardized weekly hatha yoga class supplemented by home practice; ii) a standardized evidence-based exercise therapy protocol adapted from the Treatment Based Classification method, individually delivered by a physical therapist and supplemented by home practice; and iii) education delivered through a self-care book. Co-primary outcome measures are 12-week pain intensity measured on an 11-point numerical rating scale and back-specific function measured using the modified Roland Morris Disability Questionnaire. In the subsequent 40-week Maintenance Phase, yoga participants are re-randomized in a 1:1 ratio to either structured maintenance yoga classes or home practice only. Physical therapy participants are similarly re-randomized to either five booster sessions or home practice only. Education participants continue to follow recommendations of educational materials. 
We will also assess cost effectiveness from the perspectives of the individual, insurers, and society using claims databases, electronic medical records, self-report cost data, and study records. Qualitative data from interviews will add subjective detail to complement quantitative data. Trial registration This trial is registered in ClinicalTrials.gov, with the ID number: NCT01343927. PMID:24568299

  3. LBP and SIFT based facial expression recognition

    NASA Astrophysics Data System (ADS)

    Sumer, Omer; Gunes, Ece O.

    2015-02-01

    This study compares the performance of local binary patterns (LBP) and the scale-invariant feature transform (SIFT) with support vector machines (SVM) in automatic classification of discrete facial expressions. Facial expression recognition is a multiclass classification problem, and seven classes (happiness, anger, sadness, disgust, surprise, fear, and contempt) are classified. Using SIFT feature vectors and a linear SVM, 93.1% mean accuracy is achieved on the CK+ database. On the other hand, the performance of the LBP-based classifier with a linear SVM is reported on SFEW using the strictly person independent (SPI) protocol. Seven-class mean accuracy on SFEW is 59.76%. Experiments on both databases showed that LBP features can be fairly descriptive if good localization of facial points and a suitable partitioning strategy are used.
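As a rough illustration of the LBP features discussed in this record, here is a minimal 3×3 LBP operator: each neighbour is thresholded at the centre pixel and the resulting bits are read as an 8-bit code. Histograms of such codes over image regions form the feature vectors fed to the SVM. The bit ordering and sample patch are arbitrary choices for illustration, not the paper's exact formulation.

```python
def lbp_code(patch):
    """Basic 3x3 local binary pattern: threshold the 8 neighbours at the
    centre value and read them as an 8-bit code (clockwise from top-left)."""
    c = patch[1][1]
    neighbours = [patch[0][0], patch[0][1], patch[0][2],
                  patch[1][2], patch[2][2], patch[2][1],
                  patch[2][0], patch[1][0]]
    return sum(1 << i for i, v in enumerate(neighbours) if v >= c)

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
print(lbp_code(patch))  # -> 241
```

Because each code depends only on relative intensities, the descriptor is robust to monotonic illumination changes, which is part of why LBP works on uncontrolled imagery like SFEW.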

  4. Evaluation of the performance of MP4-based procedures for a wide range of thermochemical and kinetic properties

    NASA Astrophysics Data System (ADS)

    Yu, Li-Juan; Wan, Wenchao; Karton, Amir

    2016-11-01

    We evaluate the performance of standard and modified MPn procedures for a wide set of thermochemical and kinetic properties, including atomization energies, structural isomerization energies, conformational energies, and reaction barrier heights. The reference data are obtained at the CCSD(T)/CBS level by means of the Wn thermochemical protocols. We find that none of the MPn-based procedures show acceptable performance for the challenging W4-11 and BH76 databases. For the other thermochemical/kinetic databases, the MP2.5 and MP3.5 procedures provide the most attractive accuracy-to-computational cost ratios. The MP2.5 procedure results in a weighted-total-root-mean-square deviation (WTRMSD) of 3.4 kJ/mol, whilst the computationally more expensive MP3.5 procedure results in a WTRMSD of 1.9 kJ/mol (the same WTRMSD obtained for the CCSD(T) method in conjunction with a triple-zeta basis set). We also assess the performance of the computationally economical CCSD(T)/CBS(MP2) method, which provides the best overall performance for all the considered databases, including W4-11 and BH76.
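The fractional-order MPn.5 procedures mentioned above are commonly defined in the literature as averages of adjacent MPn energies, i.e. the next-order correction is scaled by one half. Stated here as the common definition rather than something verified against this paper:

```latex
E_{\mathrm{MP2.5}} = E_{\mathrm{MP2}} + \tfrac{1}{2}\left(E_{\mathrm{MP3}} - E_{\mathrm{MP2}}\right),
\qquad
E_{\mathrm{MP3.5}} = E_{\mathrm{MP3}} + \tfrac{1}{2}\left(E_{\mathrm{MP4}} - E_{\mathrm{MP3}}\right)
```

This halving damps the oscillatory convergence of the MPn series, which is consistent with the attractive accuracy-to-cost ratios the abstract reports for MP2.5 and MP3.5.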

  5. Rationally optimized cryopreservation of multiple mouse embryonic stem cell lines: I--Comparative fundamental cryobiology of multiple mouse embryonic stem cell lines and the implications for embryonic stem cell cryopreservation protocols.

    PubMed

    Kashuba, Corinna M; Benson, James D; Critser, John K

    2014-04-01

    The post-thaw recovery of mouse embryonic stem cells (mESCs) is often assumed to be adequate with current methods. However, as this publication shows, recovery of viable cells actually varies significantly by genetic background. There is therefore a need to improve the efficiency and reduce the variability of current mESC cryopreservation methods. To address this need, we employed the principles of fundamental cryobiology to improve the cryopreservation protocol of four mESC lines from different genetic backgrounds (BALB/c, CBA, FVB, and 129R1 mESCs) through a comparative study characterizing the membrane permeability characteristics and membrane integrity osmotic tolerance limits of each cell line. In the companion paper, these values were used to predict optimal cryoprotectants, cooling rates, warming rates, and plunge temperatures, and then these predicted optimal protocols were validated against standard freezing protocols. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. SimDoseCT: dose reporting software based on Monte Carlo simulation for a 320 detector-row cone-beam CT scanner and ICRP computational adult phantoms

    NASA Astrophysics Data System (ADS)

    Cros, Maria; Joemai, Raoul M. S.; Geleijns, Jacob; Molina, Diego; Salvadó, Marçal

    2017-08-01

    This study aims to develop and test software for assessing and reporting doses for standard patients undergoing computed tomography (CT) examinations in a 320 detector-row cone-beam scanner. The software, called SimDoseCT, is based on the Monte Carlo (MC) simulation code, which was developed to calculate organ doses and effective doses in ICRP anthropomorphic adult reference computational phantoms for acquisitions with the Aquilion ONE CT scanner (Toshiba). MC simulation was validated by comparing CTDI measurements within standard CT dose phantoms with results from simulation under the same conditions. SimDoseCT consists of a graphical user interface connected to a MySQL database, which contains the look-up tables that were generated with MC simulations for volumetric acquisitions at different scan positions along the phantom using any tube voltage, bow tie filter, focal spot and nine different beam widths. Two different methods were developed to estimate organ doses and effective doses from acquisitions using other available beam widths in the scanner. A correction factor was used to estimate doses in helical acquisitions. Hence, the user can select any available protocol in the Aquilion ONE scanner for a standard adult male or female and obtain the dose results through the software interface. Agreement within 9% between CTDI measurements and simulations allowed the validation of the MC program. Additionally, the algorithm for dose reporting in SimDoseCT was validated by comparing dose results from this tool with those obtained from MC simulations for three volumetric acquisitions (head, thorax and abdomen). The comparison was repeated using eight different collimations and also for another collimation in a helical abdomen examination. The results showed differences of 0.1 mSv or less for absolute dose in most organs and also in the effective dose calculation. 
The software provides a suitable tool for dose assessment in standard adult patients undergoing CT examinations in a 320 detector-row cone-beam scanner.
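The look-up-table approach described in this record can be sketched in a few lines: tabulated dose coefficients keyed by acquisition parameters, interpolation for untabulated beam widths, and a scalar correction for helical scans. Everything below (table values, the 1.05 factor, the per-100-mAs scaling) is invented for illustration; it is not SimDoseCT's actual data or API.

```python
# Hypothetical organ-dose coefficients (mGy per 100 mAs), keyed by
# tube voltage (kV) and beam width (mm). All numbers are invented.
DOSE_LUT = {
    120: {40: 1.10, 80: 2.05, 160: 3.90},
}
HELICAL_CORRECTION = 1.05  # assumed scalar factor for helical acquisitions

def organ_dose(kv, beam_width_mm, mas, helical=False):
    """Look up the coefficient (linearly interpolating between tabulated
    beam widths), scale by mAs, and apply the helical correction."""
    table = DOSE_LUT[kv]
    if beam_width_mm in table:
        coeff = table[beam_width_mm]
    else:
        widths = sorted(table)
        lo = max(w for w in widths if w < beam_width_mm)
        hi = min(w for w in widths if w > beam_width_mm)
        t = (beam_width_mm - lo) / (hi - lo)
        coeff = table[lo] + t * (table[hi] - table[lo])
    dose = coeff * mas / 100.0
    return dose * HELICAL_CORRECTION if helical else dose

print(organ_dose(120, 60, 200))  # interpolated between the 40 and 80 mm rows
```

Precomputing the expensive MC results into such tables is what lets a GUI return per-organ doses interactively instead of re-running the simulation for every protocol.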

  7. SimDoseCT: dose reporting software based on Monte Carlo simulation for a 320 detector-row cone-beam CT scanner and ICRP computational adult phantoms.

    PubMed

    Cros, Maria; Joemai, Raoul M S; Geleijns, Jacob; Molina, Diego; Salvadó, Marçal

    2017-07-17

    This study aims to develop and test software for assessing and reporting doses for standard patients undergoing computed tomography (CT) examinations in a 320 detector-row cone-beam scanner. The software, called SimDoseCT, is based on the Monte Carlo (MC) simulation code, which was developed to calculate organ doses and effective doses in ICRP anthropomorphic adult reference computational phantoms for acquisitions with the Aquilion ONE CT scanner (Toshiba). MC simulation was validated by comparing CTDI measurements within standard CT dose phantoms with results from simulation under the same conditions. SimDoseCT consists of a graphical user interface connected to a MySQL database, which contains the look-up tables that were generated with MC simulations for volumetric acquisitions at different scan positions along the phantom using any tube voltage, bow tie filter, focal spot and nine different beam widths. Two different methods were developed to estimate organ doses and effective doses from acquisitions using other available beam widths in the scanner. A correction factor was used to estimate doses in helical acquisitions. Hence, the user can select any available protocol in the Aquilion ONE scanner for a standard adult male or female and obtain the dose results through the software interface. Agreement within 9% between CTDI measurements and simulations allowed the validation of the MC program. Additionally, the algorithm for dose reporting in SimDoseCT was validated by comparing dose results from this tool with those obtained from MC simulations for three volumetric acquisitions (head, thorax and abdomen). The comparison was repeated using eight different collimations and also for another collimation in a helical abdomen examination. The results showed differences of 0.1 mSv or less for absolute dose in most organs and also in the effective dose calculation. 
The software provides a suitable tool for dose assessment in standard adult patients undergoing CT examinations in a 320 detector-row cone-beam scanner.

  8. Meta-analysis comparing chewing gum versus standard postoperative care after colorectal resection

    PubMed Central

    Zhou, Jian-Guo; Tian, Xu

    2016-01-01

    Background Previous incomplete studies investigating the potential of chewing gum (CG) in patients undergoing colorectal resection did not reach definitive conclusions. This updated meta-analysis was therefore conducted to evaluate the effect and safety of CG versus standard postoperative care protocols (SPCPs) after colorectal surgery. Results A total of 26 RCTs enrolling 2214 patients were included in this study. CG was well tolerated by all patients. Compared with SPCPs, CG was associated with shorter time to first flatus (weighted mean difference (WMD) −12.14 (95 per cent c.i. −15.71 to −8.56) hours; P < 0.001), bowel movement (WMD −17.32 (−23.41 to −11.22) hours; P < 0.001), bowel sounds (WMD −6.02 (−7.42 to −4.63) hours; P < 0.001), and length of hospital stay (WMD −0.95 (−1.55 to −0.35) days; P < 0.001), a lower risk of postoperative ileus (risk ratio (RR) 0.61 (0.44 to 0.83); P = 0.002), a net clinical benefit, and improved quality of life. There were no significant differences between the two groups in overall complications, nausea, vomiting, bloating, wound infection, bleeding, dehiscence, readmission, reoperation, or mortality. Materials and Methods Potentially eligible randomized controlled trials (RCTs) that compared CG with SPCPs for colorectal resection were searched in PubMed, Embase, the Cochrane Library, China National Knowledge Infrastructure (CNKI), and Chinese Wanfang databases through May 2016. Trial sequential analysis was adopted to examine whether a firm conclusion for a specific outcome could be drawn. Conclusions CG is beneficial for enhancing the return of gastrointestinal function after colorectal resection, and may be associated with a lower risk of postoperative ileus. PMID:27588405
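The weighted mean differences reported in this record come from inverse-variance pooling: each study's mean difference is weighted by the inverse of its squared standard error. A minimal fixed-effect sketch follows; the per-study numbers are invented for illustration, not taken from the 26 included RCTs.

```python
import math

def pooled_wmd(studies, z=1.96):
    """Fixed-effect inverse-variance pooling of per-study mean differences.
    Each study is a (mean_difference, standard_error) pair."""
    weights = [1.0 / se**2 for _, se in studies]
    wmd = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # SE of the pooled estimate
    return wmd, wmd - z * se, wmd + z * se

# Hypothetical per-study differences in time to first flatus (hours)
studies = [(-10.0, 2.0), (-14.0, 3.0), (-12.0, 2.5)]
wmd, lo, hi = pooled_wmd(studies)
print(f"WMD {wmd:.2f} h (95% CI {lo:.2f} to {hi:.2f})")
```

Larger, more precise studies (smaller standard errors) dominate the pooled estimate, which is why the pooled CI is narrower than any single study's.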

  9. Combination of Oral Antibiotics and Mechanical Bowel Preparation Reduces Surgical Site Infection in Colorectal Surgery.

    PubMed

    Ohman, Kerri A; Wan, Leping; Guthrie, Tracey; Johnston, Bonnie; Leinicke, Jennifer A; Glasgow, Sean C; Hunt, Steven R; Mutch, Matthew G; Wise, Paul E; Silviera, Matthew L

    2017-10-01

    Surgical site infections (SSI) are a common complication after colorectal surgery. An infection prevention bundle (IPB) was implemented to improve outcomes. A standardized IPB that included the administration of oral antibiotics with a mechanical bowel preparation, preoperative shower with chlorhexidine, hair removal and skin preparation in holding, antibiotic wound irrigation, and a "clean-closure" protocol was implemented in January 2013. Data from the American College of Surgeons NSQIP were analyzed at a single academic institution to compare pre-IPB and post-IPB SSI rates. In January 2014, a prospective database was implemented to determine compliance with individual IPB elements and their effect on outcomes. For the 24 months pre-IPB, the overall SSI rate was 19.7%. During the 30 months after IPB implementation, the SSI rate decreased to 8.2% (p < 0.0001). A subset of 307 patients was identified in both NSQIP and our prospective compliance databases. Elements of IPB associated with decreased SSI rates included preoperative shower with chlorhexidine (4.6% vs 16.2%; p = 0.005), oral antibiotics (3.4% vs 15.4%; p < 0.001), and mechanical bowel preparation (4.4% vs 14.3%; p = 0.008). Patients who received a full bowel preparation of both oral antibiotics and a mechanical bowel preparation had a 2.7% SSI rate compared with 15.8% for all others (p < 0.001). On multivariate analysis, full bowel preparation was independently associated with significantly fewer SSI (adjusted odds ratio 0.2; 95% CI 0.1 to 0.9; p = 0.006). Implementation of an IPB was successful in decreasing SSI rates in colorectal surgery patients. The combination of oral antibiotics with a mechanical bowel preparation was the strongest predictor of decreased SSI. Copyright © 2017 American College of Surgeons. All rights reserved.

  10. Inter Annual Variability of the Acoustic Propagation in the Yellow Sea Identified from a Synoptic Monthly Gridded Database as Compared with GDEM

    DTIC Science & Technology

    2016-09-01

    This research investigates the inter-annual acoustic variability in the Yellow Sea identified from a synoptic monthly gridded database, as compared with GDEM. Excerpted fragments note that the world climate is warming due to anthropogenic causes (Anderegg et al. 2010; Solomon et al. 2009), that the study uses a 0.5′ grid resolution, and that four openly available sediment databases are considered (Enhanced, Standard, ...).

  11. MedBlock: Efficient and Secure Medical Data Sharing Via Blockchain.

    PubMed

    Fan, Kai; Wang, Shangyang; Ren, Yanhui; Li, Hui; Yang, Yintang

    2018-06-21

    With the development of electronic information technology, electronic medical records (EMRs) have become a common way to store patients' data in hospitals. Even for the same patient, records are scattered across different hospitals' databases, so it is difficult to construct a summarized EMR for one patient from multiple hospital databases due to security and privacy concerns. Meanwhile, current EMR systems lack a standard data management and sharing policy, making it difficult for pharmaceutical scientists to develop precise medicines based on data obtained under different policies. To solve these problems, we propose a blockchain-based information management system, MedBlock, to handle patients' information. In this scheme, the distributed ledger of MedBlock allows efficient EMR access and retrieval. The improved consensus mechanism achieves consensus on EMRs without large energy consumption or network congestion. In addition, MedBlock exhibits high information security by combining customized access control protocols with symmetric cryptography. MedBlock can play an important role in sharing sensitive medical information.
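    MedBlock's specific design is not detailed in the abstract, but the core idea of a tamper-evident distributed ledger can be illustrated with a minimal hash-chained block sketch. All names and fields here are illustrative; real EMR data would additionally be encrypted and access-controlled.

```python
import hashlib
import json

def make_block(records, prev_hash):
    """Toy append-only ledger block: each block commits to its records
    and to the previous block's hash, so later tampering invalidates
    every subsequent link. Illustration of hash chaining only."""
    body = {"records": records, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def chain_valid(chain):
    """Recompute each block's hash and check the links between blocks."""
    for prev, cur in zip(chain, chain[1:]):
        body = {"records": prev["records"], "prev_hash": prev["prev_hash"]}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if recomputed != prev["hash"] or cur["prev_hash"] != prev["hash"]:
            return False
    return True

genesis = make_block([{"patient": "p001", "entry": "admission note"}], "0" * 64)
block2 = make_block([{"patient": "p001", "entry": "lab result"}], genesis["hash"])
```

    Altering any record in an earlier block changes its recomputed hash, so verification fails without needing a trusted central database.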

  12. Evaluation of an adult insulin infusion protocol at an academic medical center.

    PubMed

    Petrov, Katerina I; Burns, Tammy L; Drincic, Andjela

    2012-05-01

    Acknowledging evidence of possible detrimental effects of tightly controlled blood glucose levels, the American Association of Clinical Endocrinologists and the American Diabetes Association published a consensus statement recommending less strict control for most diabetic patients. As a result of these recommendations, our academic center at Creighton University Medical Center revised its adult insulin infusion protocol to target blood glucose levels ranging from 120 to 180 mg/dL for regular (standard) glycemic control and 80 to 120 mg/dL for tight control; previous targets had ranged from 80 to 180 mg/dL and 70 to 110 mg/dL, respectively. The primary objective was to evaluate the time that blood glucose values were within the target range for patients receiving the new protocol, compared with patients receiving the previous protocol. Our study was designed to evaluate the effectiveness and safety of the revised protocol. Using a retrospective chart review, we collected data for 4 months from patients on the old insulin protocol (May to August 2009) and for 4 months from patients on the new protocol (September to December 2009). Secondary endpoints included the number of hypoglycemic episodes (blood glucose below 70 mg/dL) and severe hypoglycemic episodes (blood glucose 40 mg/dL or lower) experienced by patients receiving the new insulin protocol compared with those receiving the former protocol. Patient characteristics were similar at baseline. Blood glucose values stayed within the target range for a significantly shorter time with the new protocol than with the former protocol (44.6% vs. 56.8%, respectively; P < 0.001), probably because of the narrower target range in the revised protocol. No statistically significant differences in hypoglycemia were observed after the protocol was changed. Hypoglycemia occurred in 31% of the former-protocol patients compared with 18% of the revised-protocol patients. 
Severe hypoglycemia was experienced by 2.1% of patients on the old protocol and by 3.1% of patients on the new protocol. Rates of severe hypoglycemia were low (2.6%) with the original protocol. Patients' blood glucose levels were within the target range for a shorter time with the new protocol. Fewer episodes of hypoglycemia were recorded with the new protocol, but rates of severe hypoglycemia were similar with both protocols.
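    One way to compute the "percentage of time in target range" metric the study reports is to weight each interval between consecutive glucose readings by its duration. The convention below (an interval counts as in-range when its starting value is) is an assumption made for illustration; the study's exact method is not stated.

```python
def percent_in_range(readings, low, high):
    """Fraction of monitored time glucose stays within [low, high].
    readings: list of (time_hours, glucose_mg_dL) pairs in time order.
    Each interval is attributed to its starting reading."""
    total = in_range = 0.0
    for (t0, g0), (t1, _g1) in zip(readings, readings[1:]):
        dt = t1 - t0
        total += dt
        if low <= g0 <= high:
            in_range += dt
    return 100.0 * in_range / total

# Hourly readings (hour, mg/dL) for one hypothetical patient.
series = [(0, 210), (1, 175), (2, 160), (3, 150), (4, 190), (5, 165)]
pct = percent_in_range(series, 120, 180)  # new protocol's standard target
```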

  13. Automatic control of pressure support for ventilator weaning in surgical intensive care patients.

    PubMed

    Schädler, Dirk; Engel, Christoph; Elke, Gunnar; Pulletz, Sven; Haake, Nils; Frerichs, Inéz; Zick, Günther; Scholz, Jens; Weiler, Norbert

    2012-03-15

    Despite its ability to reduce overall ventilation time, protocol-guided weaning from mechanical ventilation is not routinely used in daily clinical practice. Clinical implementation of weaning protocols could be facilitated by integration of knowledge-based, closed-loop controlled protocols into respirators. To determine whether automated weaning decreases overall ventilation time compared with weaning based on a standardized written protocol in an unselected surgical patient population. In this prospective controlled trial patients ventilated for longer than 9 hours were randomly allocated to receive either weaning with automatic control of pressure support ventilation (automated-weaning group) or weaning based on a standardized written protocol (control group) using the same ventilation mode. The primary end point of the study was overall ventilation time. Overall ventilation time (median [25th and 75th percentile]) did not significantly differ between the automated-weaning (31 [19-101] h; n = 150) and control groups (39 [20-118] h; n = 150; P = 0.178). Patients who underwent cardiac surgery (n = 132) exhibited significantly shorter overall ventilation times in the automated-weaning (24 [18-57] h) than in the control group (35 [20-93] h; P = 0.035). The automated-weaning group exhibited shorter ventilation times until the first spontaneous breathing trial (1 [0-15] vs. 9 [1-51] h; P = 0.001) and a trend toward fewer tracheostomies (17 vs. 28; P = 0.075). Overall ventilation times did not significantly differ between weaning using automatic control of pressure support ventilation and weaning based on a standardized written protocol. Patients after cardiac surgery may benefit from automated weaning. Implementation of additional control variables besides the level of pressure support may further improve automated-weaning systems. Clinical trial registered with www.clinicaltrials.gov (NCT 00445289).
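    The trial's automated arm used a commercial knowledge-based closed-loop controller whose rules are not given in the abstract. As a purely illustrative sketch of the closed-loop idea, a rule-based step might adjust the pressure-support level from simple comfort indicators; every threshold below is an assumption, not the algorithm used in the trial.

```python
def adjust_pressure_support(ps, resp_rate, tidal_vol_ml,
                            ps_min=5, ps_max=20, step=2):
    """One control step of a toy rule-based weaning loop: raise support
    when breathing looks labored (fast, shallow), lower it when the
    patient looks comfortable, and clamp to safe bounds."""
    if resp_rate > 30 or tidal_vol_ml < 300:
        ps = min(ps + step, ps_max)          # escalate support
    elif resp_rate < 15 and tidal_vol_ml > 400:
        ps = max(ps - step, ps_min)          # wean toward minimal support
    return ps
```

    A real system layers many more variables (end-tidal CO2, trial timing, alarms) on top of such rules, which is what the abstract's closing sentence suggests as a direction for improvement.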

  14. Efficacy of right unilateral ultrabrief pulse width ECT: a preliminary report.

    PubMed

    Magid, Michelle; Truong, Liz; Trevino, Kenneth; Husain, Mustafa

    2013-12-01

    Ultrabrief right unilateral electroconvulsive therapy (UB-RU ECT) is a newer form of ECT that uses a shorter pulse width than standard ECT (0.3 vs 1.0 milliseconds). As a result, UB ECT may provide a means of further decreasing ECT-related cognitive adverse effects. In 2011, the University of Texas Southwestern Department of ECT in Austin adopted a UB ECT protocol. The purpose of this study was to perform a preliminary evaluation of the effectiveness and efficiency of UB-RU ECT. This study also examined whether sex, age, or diagnosis affected response rates. This retrospective chart review identified 62 patients treated with the UB ECT protocol. An analysis of ECT response rates and demographic characteristics was conducted based on the data from clinical evaluations and the Patient Health Questionnaire 9. Sixty-eight percent of patients in the study responded to ECT; 55% responded to UB pulse width RU ECT, with another 13% responding when switched to standard pulse width bilateral ECT. The mean number of treatments in an index ECT series was 12.5. There was no statistically significant difference in response rates between bipolar and unipolar depressed patients. Men required progression to bilateral treatment more often than women. This UB ECT protocol demonstrated a similar response rate when compared with standard ECT protocols; however, an increased number of treatments was required. Ultrabrief protocols are a viable option for both bipolar and unipolar depression. In men, UB ECT protocols may be less advantageous due to the need to overcome a potentially higher seizure threshold; however, additional research is needed to confirm this finding.

  15. Evaluation of a navigation system for dental implantation as a tool to train novice dental practitioners.

    PubMed

    Casap, Nardy; Nadel, Sahar; Tarazi, Eyal; Weiss, Ervin I

    2011-10-01

    This study evaluated the benefits of a virtual reality navigation system for teaching the surgical stage of dental implantation to final-year dental students. The study aimed to assess the students' performance in dental implantation assignments by comparing freehand protocols with virtual reality navigation. Forty final-year dentistry students without previous experience in dental implantation surgery were given an implantation assignment comprising 3 tasks. Marking, drilling, and widening of implant holes were executed by a freehand protocol on the 2 mandibular sides by 1 group and by virtual reality navigation on 1 side and contralaterally with the freehand protocol by the other group. Subjective and objective assessments of the students' performance were graded. Marking with the navigation system was more accurate than with the standard protocol. The 2 groups performed similarly in the 2-mm drilling on the 2 mandibular sides. Widening of the 2 mesial holes to 3 mm was significantly better with the second execution in the standard protocol group, but not in the navigation group. The navigation group's second-site freehand drilling of the molar was significantly worse than the first. The execution of all assignments was significantly faster in the freehand group than in the navigation group (60.75 vs 77.25 minutes, P = .02). Self-assessment only partly matched the objective measurements and was more realistic in the standard protocol group. Despite the improved performance with the navigation system, the added value of training in dental implantation surgery with virtual reality navigation was minimal. Copyright © 2011 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  16. Cost-effectiveness of risk stratified followup after urethral reconstruction: a decision analysis.

    PubMed

    Belsante, Michael J; Zhao, Lee C; Hudak, Steven J; Lotan, Yair; Morey, Allen F

    2013-10-01

    We propose a novel risk stratified followup protocol for use after urethroplasty and explore potential cost savings. Decision analysis was performed comparing a symptom based, risk stratified protocol for patients undergoing excision and primary anastomosis urethroplasty vs a standard regimen of close followup for urethroplasty. Model assumptions included that excision and primary anastomosis has a 94% success rate, 11% of patients with successful urethroplasty had persistent lower urinary tract symptoms requiring cystoscopic evaluation, patients in whom treatment failed undergo urethrotomy and patients with recurrence on symptom based surveillance have a delayed diagnosis requiring suprapubic tube drainage. The Nationwide Inpatient Sample from 2010 was queried to identify the number of urethroplasties performed per year in the United States. Costs were obtained based on Medicare reimbursement rates. The 5-year cost of a symptom based, risk stratified followup protocol is $430 per patient vs $2,827 per patient using standard close followup practice. An estimated 7,761 urethroplasties were performed in the United States in 2010. Assuming that 60% were excision and primary anastomosis, and with more than 5 years of followup, the risk stratified protocol was projected to yield an estimated savings of $11,165,130. Sensitivity analysis showed that the symptom based, risk stratified followup protocol was far more cost-effective than standard close followup in all settings. Less than 1% of patients would be expected to have an asymptomatic recurrence using the risk stratified followup protocol. A risk stratified, symptom based approach to urethroplasty followup would produce a significant reduction in health care costs while decreasing unnecessary followup visits, invasive testing and radiation exposure. Copyright © 2013 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
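    The headline savings figure follows directly from the abstract's own numbers; a short calculation reproduces it to within rounding (the published $11,165,130 presumably reflects unrounded inputs in the original decision model).

```python
# Per-patient 5-year follow-up costs from the decision analysis (USD).
cost_risk_stratified = 430
cost_standard = 2827

# National volume assumptions stated in the abstract.
urethroplasties_2010 = 7761
epa_fraction = 0.60   # share assumed to be excision and primary anastomosis

eligible = urethroplasties_2010 * epa_fraction
savings = eligible * (cost_standard - cost_risk_stratified)
```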

  17. Built to last? The sustainability of health system improvements, interventions and change strategies: a study protocol for a systematic review.

    PubMed

    Braithwaite, Jeffrey; Testa, Luke; Lamprell, Gina; Herkes, Jessica; Ludlow, Kristiana; McPherson, Elise; Campbell, Margie; Holt, Joanna

    2017-11-12

    The sustainability of healthcare interventions and change programmes is of increasing importance to researchers and healthcare stakeholders interested in creating sustainable health systems to cope with mounting stressors. The aim of this protocol is to extend earlier work and describe a systematic review to identify, synthesise and draw meaning from studies published within the last 5 years that measure the sustainability of interventions, improvement efforts and change strategies in the health system. The protocol outlines a method by which to execute a rigorous systematic review. The design includes applying primary and secondary data collection techniques, consisting of a comprehensive database search complemented by contact with experts, and searching secondary databases and reference lists, using snowballing techniques. The review and analysis process will occur via an abstract review followed by a full-text screening process. The inclusion criteria include English-language, peer-reviewed, primary, empirical research articles published after 2011 in scholarly journals, for which the full text is available. No restrictions on location will be applied. The review that results from this protocol will synthesise and compare characteristics of the included studies. Ultimately, it is intended that this will help make it easier to identify and design sustainable interventions, improvement efforts and change strategies. As no primary data were collected, ethical approval was not required. Results will be disseminated in conference presentations, peer-reviewed publications and among policymaker bodies interested in creating sustainable health systems. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  18. Practical quantum private query of blocks based on unbalanced-state Bennett-Brassard-1984 quantum-key-distribution protocol

    NASA Astrophysics Data System (ADS)

    Wei, Chun-Yan; Gao, Fei; Wen, Qiao-Yan; Wang, Tian-Yin

    2014-12-01

    Until now, the only kind of practical quantum private query (QPQ), quantum-key-distribution (QKD)-based QPQ, focuses on the retrieval of a single bit. In fact, a meaningful message is generally composed of multiple adjacent bits (i.e., a multi-bit block). To obtain an l-bit message from the database, the user Alice has to query l times, once for each bit ai. In this situation, the server Bob can compromise Alice's privacy once he learns the address she queried in any of the l queries, since each ai contributes to the message Alice retrieves. Apparently, the longer the retrieved message is, the worse the user privacy becomes. To solve this problem, via an unbalanced-state technique and based on a variant of the multi-level BB84 protocol, we present a protocol for QPQ of blocks, which allows the user to retrieve a multi-bit block from the database in one query. Our protocol is somewhat like a high-dimension version of the first QKD-based QPQ protocol proposed by Jacobi et al., but some nontrivial modifications are necessary.
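    The classical post-processing stage of such a block QPQ can be sketched as follows: after the quantum phase, Bob holds a full "oblivious" key while Alice knows only one l-bit block of it; Bob announces the database XOR-masked blockwise, and Alice can unmask exactly her block. This toy covers only that classical stage with illustrative parameters; the quantum key distribution itself is not modeled.

```python
import secrets

BLOCK_BITS = 8  # l-bit blocks; the value is an illustrative choice

def mask_database(db_blocks, key_blocks):
    """Bob announces every database block XOR-masked with the key."""
    return [d ^ k for d, k in zip(db_blocks, key_blocks)]

def retrieve(masked, key_blocks, known_index):
    """Alice, who learned only key_blocks[known_index] in the quantum
    phase, unmasks exactly that one block; every other block remains
    hidden from her behind an unknown key block."""
    return masked[known_index] ^ key_blocks[known_index]

db = [0b10110010, 0b01010101, 0b11110000, 0b00001111]
key = [secrets.randbits(BLOCK_BITS) for _ in db]
masked = mask_database(db, key)
# Suppose the quantum phase left Alice knowing key block 2:
recovered = retrieve(masked, key, 2)
```

    The privacy properties rest on the quantum phase (Bob cannot tell which key block Alice learned), which the sketch above simply assumes.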

  19. A proposed group management scheme for XTP multicast

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

    The purpose of a group management scheme is to enable its associated transfer layer protocol to be responsive to user determined reliability requirements for multicasting. Group management (GM) must assist the client process in coordinating multicast group membership, allow the user to express the subset of the multicast group that a particular multicast distribution must reach in order to be successful (reliable), and provide the transfer layer protocol with the group membership information necessary to guarantee delivery to this subset. GM provides services and mechanisms that respond to the need of the client process or process level management protocols to coordinate, modify, and determine attributes of the multicast group, especially membership. XTP GM provides a link between process groups and their multicast groups by maintaining a group membership database that identifies members in a name space understood by the underlying transfer layer protocol. Other attributes of the multicast group useful to both the client process and the data transfer protocol may be stored in the database. Examples include the relative dispersion, most recent update, and default delivery parameters of a group.
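    A group membership database of the kind described (transfer-layer addresses plus a user-specified reliability subset) might be sketched as follows; all class and field names are illustrative, not taken from the XTP specification.

```python
class MulticastGroupDB:
    """Toy group-management database: maps a process group to member
    addresses in the transfer layer's name space, and records which
    members a distribution must reach in order to count as reliable."""

    def __init__(self):
        self.groups = {}

    def join(self, group, member, address):
        g = self.groups.setdefault(group, {"members": {}, "required": set()})
        g["members"][member] = address

    def require(self, group, member):
        """Mark a member as part of the reliability subset."""
        self.groups[group]["required"].add(member)

    def delivery_ok(self, group, reached):
        """A multicast succeeds when every required member was reached."""
        return self.groups[group]["required"] <= set(reached)

db = MulticastGroupDB()
db.join("sensors", "node-a", "xtp://10.0.0.1")
db.join("sensors", "node-b", "xtp://10.0.0.2")
db.require("sensors", "node-a")
```

    Other group attributes mentioned in the abstract (dispersion, most recent update, default delivery parameters) would be additional fields in the per-group record.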

  20. Practical quantum private query of blocks based on unbalanced-state Bennett-Brassard-1984 quantum-key-distribution protocol

    PubMed Central

    Wei, Chun-Yan; Gao, Fei; Wen, Qiao-Yan; Wang, Tian-Yin

    2014-01-01

    Until now, the only kind of practical quantum private query (QPQ), quantum-key-distribution (QKD)-based QPQ, focuses on the retrieval of a single bit. In fact, a meaningful message is generally composed of multiple adjacent bits (i.e., a multi-bit block). To obtain an l-bit message from the database, the user Alice has to query l times, once for each bit ai. In this situation, the server Bob can compromise Alice's privacy once he learns the address she queried in any of the l queries, since each ai contributes to the message Alice retrieves. Apparently, the longer the retrieved message is, the worse the user privacy becomes. To solve this problem, via an unbalanced-state technique and based on a variant of the multi-level BB84 protocol, we present a protocol for QPQ of blocks, which allows the user to retrieve a multi-bit block from the database in one query. Our protocol is somewhat like a high-dimension version of the first QKD-based QPQ protocol proposed by Jacobi et al., but some nontrivial modifications are necessary. PMID:25518810

  1. Regional citrate anticoagulation in hemodialysis: an observational study of safety, efficacy, and effect on calcium balance during routine care.

    PubMed

    Singer, Richard F; Williams, Oliver; Mercado, Chari; Chen, Bonny; Talaulikar, Girish; Walters, Giles; Roberts, Darren M

    2016-01-01

    Regional citrate hemodialysis anticoagulation is used when heparin is contraindicated, but most protocols require large infusions of calcium and frequent intradialytic plasma ionized calcium measurements. The objective of this study was to determine the safety, efficacy, and effect on calcium balance of regional citrate anticoagulation using sparse plasma ionized calcium sampling. The design of this study was observational. The setting of this study was the hospital hemodialysis center. The subjects of this study were the hospital hemodialysis patients. Dialysate calcium concentration by atomic absorption spectroscopy and total dialysate weight were used as measurements. Regional citrate anticoagulation was introduced using zero calcium dialysate, pre-dialyzer citrate infusion, and post-dialyzer calcium infusion. Infusions were adjusted based on pre- and post-dialyzer calcium measurements obtained at least twice during a 4-h dialysis. The protocol was simplified after the first 357 sessions to dispense with post-dialyzer calcium measurements. Heparin-anticoagulated sessions were performed using acetate-acidified 1.25 mmol/L calcium or citrate-acidified 1.5 mmol/L calcium dialysate. Calcium balance assessment was by complete dialysate recovery. Safety and efficacy were assessed prospectively using a point-of-care database to record ionized calcium and clinical events. Groups were compared using t test, ANOVA, Wilcoxon rank sum, or Kruskal-Wallis as appropriate. Seventy-five patients received regional citrate-anticoagulated dialysis over 1051 dialysis sessions. Of these, 357 dialysis sessions were performed using the original citrate anticoagulation protocol and 694 using the simplified protocol. Dialysis was effective and safe. Only 3 dialyzers clotted; 1 patient suffered symptomatic hypercalcemia and none suffered symptomatic hypocalcemia. 
Calcium balance was assessed in 15 regional citrate-anticoagulated dialysis sessions and 30 heparin-anticoagulated sessions. The median calcium loss was 0.8 mmol per hour dialyzed in both groups (p = 0.43), and end-of-treatment ionized calcium was the same in both groups (1.07 ± 0.04 mmol/L). Our findings for calcium balance, efficacy, and safety are valid only for the protocol studied, which excluded patients with severe liver dysfunction. Regional citrate dialysis can be performed safely and effectively using a sparse plasma calcium sampling protocol. The calcium balance induced by this protocol is not different from that seen in standard heparin-anticoagulated dialysis, but in the absence of prospective studies, it is unknown whether this is optimal for patient care.

  2. Landscape features, standards, and semantics in U.S. national topographic mapping databases

    USGS Publications Warehouse

    Varanka, Dalia

    2009-01-01

    The objective of this paper is to examine the contrast between local, field-surveyed topographical representation and feature representation in digital, centralized databases and to clarify their ontological implications. The semantics of these two approaches are contrasted by examining the categorization of features by subject domains inherent to national topographic mapping. When comparing five USGS topographic mapping domain and feature lists, results indicate that multiple semantic meanings and ontology rules were applied to the initial digital database, but were lost as databases became more centralized at national scales, and common semantics were replaced by technological terms.

  3. Bilateral key comparison SIM.T-K6.1 on humidity standards in the dew/frost-point temperature range from -25 °C to +20 °C

    NASA Astrophysics Data System (ADS)

    Meyer, C. W.; Hill, K. D.

    2015-01-01

    A Regional Metrology Organization (RMO) Key Comparison of dew/frost-point temperatures was carried out by the National Institute of Standards and Technology (NIST, USA) and the National Research Council (NRC, Canada) between December 2014 and April 2015. The results of this comparison are reported here, along with descriptions of the humidity laboratory standards for NIST and NRC and the uncertainty budget for these standards. This report also describes the protocol for the comparison and presents the data acquired. The results are analyzed to determine the degree of equivalence between the dew/frost-point standards of NIST and NRC. This paper is the final report of the comparison, including analysis of the uncertainty of measurement results; the main text is that which appears in Appendix B of the BIPM key comparison database, kcdb.bipm.org. The final report has been peer-reviewed and approved for publication by the CCT WG-KC, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  4. Spontaneous Swallow Frequency Compared with Clinical Screening in the Identification of Dysphagia in Acute Stroke

    PubMed Central

    Crary, Michael A.; Carnaby, Giselle D.; Sia, Isaac

    2017-01-01

    Background The aim of this study was to compare spontaneous swallow frequency analysis (SFA) with clinical screening protocols for identification of dysphagia in acute stroke. Methods In all, 62 patients with acute stroke were evaluated for spontaneous swallow frequency rates using a validated acoustic analysis technique. Independent of SFA, these same patients received a routine nurse-administered clinical dysphagia screening as part of standard stroke care. Both screening tools were compared against a validated clinical assessment of dysphagia for acute stroke. In addition, psychometric properties of SFA were compared against published, validated clinical screening protocols. Results Spontaneous SFA differentiates patients with versus without dysphagia after acute stroke. Using a previously identified cut point based on swallows per minute, spontaneous SFA demonstrated superior ability to identify dysphagia cases compared with a nurse-administered clinical screening tool. In addition, spontaneous SFA demonstrated equal or superior psychometric properties to 4 validated, published clinical dysphagia screening tools. Conclusions Spontaneous SFA has high potential to identify dysphagia in acute stroke with psychometric properties equal or superior to clinical screening protocols. PMID:25088166

  5. Spontaneous swallow frequency compared with clinical screening in the identification of dysphagia in acute stroke.

    PubMed

    Crary, Michael A; Carnaby, Giselle D; Sia, Isaac

    2014-09-01

    The aim of this study was to compare spontaneous swallow frequency analysis (SFA) with clinical screening protocols for identification of dysphagia in acute stroke. In all, 62 patients with acute stroke were evaluated for spontaneous swallow frequency rates using a validated acoustic analysis technique. Independent of SFA, these same patients received a routine nurse-administered clinical dysphagia screening as part of standard stroke care. Both screening tools were compared against a validated clinical assessment of dysphagia for acute stroke. In addition, psychometric properties of SFA were compared against published, validated clinical screening protocols. Spontaneous SFA differentiates patients with versus without dysphagia after acute stroke. Using a previously identified cut point based on swallows per minute, spontaneous SFA demonstrated superior ability to identify dysphagia cases compared with a nurse-administered clinical screening tool. In addition, spontaneous SFA demonstrated equal or superior psychometric properties to 4 validated, published clinical dysphagia screening tools. Spontaneous SFA has high potential to identify dysphagia in acute stroke with psychometric properties equal or superior to clinical screening protocols. Copyright © 2014 National Stroke Association. Published by Elsevier Inc. All rights reserved.
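    The screening logic and its psychometric evaluation can be sketched generically: classify a patient as at risk when swallow frequency falls below a cut point, then score sensitivity and specificity against the reference clinical assessment. The 0.40 swallows/min threshold and the sample data below are illustrative assumptions, not the study's published values.

```python
def screen(swallows_per_min, cutpoint=0.40):
    """Flag a patient as at risk of dysphagia when spontaneous swallow
    frequency falls below the cut point (illustrative threshold)."""
    return swallows_per_min < cutpoint

def sens_spec(results):
    """results: list of (swallows_per_min, has_dysphagia) pairs, where
    has_dysphagia comes from the reference clinical assessment."""
    tp = sum(screen(s) and d for s, d in results)
    fn = sum((not screen(s)) and d for s, d in results)
    tn = sum((not screen(s)) and not d for s, d in results)
    fp = sum(screen(s) and not d for s, d in results)
    return tp / (tp + fn), tn / (tn + fp)

sample = [(0.2, True), (0.3, True), (0.5, False), (0.9, False), (0.35, False)]
sensitivity, specificity = sens_spec(sample)
```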

  6. Quantification of the overall measurement uncertainty associated with the passive moss biomonitoring technique: Sample collection and processing.

    PubMed

    Aboal, J R; Boquete, M T; Carballeira, A; Casanova, A; Debén, S; Fernández, J A

    2017-05-01

    In this study we examined 6080 data points gathered by our research group during more than 20 years of research on the moss biomonitoring technique, in order to quantify the variability generated by different aspects of the protocol and to calculate the overall measurement uncertainty associated with the technique. The median variance of the concentrations of different pollutants measured in moss tissues attributed to the different methodological aspects was high, reaching values of 2851 (ng·g⁻¹)² for Cd (sample treatment), 35.1 (μg·g⁻¹)² for Cu (sample treatment), and 861.7 (ng·g⁻¹)² for Hg (material selection). These variances correspond to standard deviations that constitute 67%, 126%, and 59% of the regional background levels of these elements in the study region. The overall measurement uncertainty associated with the worst experimental protocol (5 subsamples, refrigerated, washed, 5 × 5 m sampling area, sampling once a year) was between 2 and 6 times higher than that associated with the optimal protocol (30 subsamples, dried, unwashed, 20 × 20 m sampling area, sampling once a week), and between 1.5 and 7 times higher than that associated with the standardized protocol (30 subsamples and sampling once a year). The overall measurement uncertainty associated with the standardized protocol could generate variations of between 14 and 47% in the regional background levels of Cd, Cu, Hg, Pb, and Zn in the study area, and much higher levels of variation at polluted sampling sites. We demonstrated that although the overall measurement uncertainty of the technique is still high, it can be reduced by using already well-defined aspects of the protocol. 
Further standardization of the protocol together with application of the information on the overall measurement uncertainty would improve the reliability and comparability of the results of different biomonitoring studies, thus extending use of the technique beyond the context of scientific research. Copyright © 2017 Elsevier Ltd. All rights reserved.
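    The link between the reported variances and their expression as a percentage of background levels is a one-line calculation. The Cd background value used below (~80 ng/g) is a hypothetical figure chosen so the ratio lands near the reported 67%; the abstract does not state the background concentrations themselves.

```python
import math

def std_as_pct_of_background(variance, background):
    """Standard deviation implied by a method-related variance,
    expressed as a percentage of a background level (same units)."""
    return 100.0 * math.sqrt(variance) / background

# Cd example: variance 2851 (ng/g)^2 from the abstract,
# background ~80 ng/g assumed for illustration.
cd_pct = std_as_pct_of_background(2851, 80)
```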

  7. The Interlibrary Loan Protocol: An OSI Solution to ILL Messaging.

    ERIC Educational Resources Information Center

    Turner, Fay

    1990-01-01

    Discusses the interlibrary loan (ILL) protocol, a standard based on the principles of the Open Systems Interconnection (OSI) Reference Model. Benefits derived from protocol use are described, the status of the protocol as an international standard is reviewed, and steps taken by the National Library of Canada to facilitate migration to an ILL…

  8. An integrated tool for the diagnosis of voice disorders.

    PubMed

    Godino-Llorente, Juan I; Sáenz-Lechón, Nicolás; Osma-Ruiz, Víctor; Aguilera-Navarro, Santiago; Gómez-Vilda, Pedro

    2006-04-01

    A PC-based integrated aid tool has been developed for the analysis and screening of pathological voices. With it the user can simultaneously record speech, electroglottographic (EGG), and videoendoscopic signals, and synchronously edit them to select the most significant segments. These multimedia data are stored in a relational database, together with the patient's personal information, anamnesis, diagnosis, visits, explorations, and any other comment the specialist may wish to include. The speech and EGG waveforms are analysed by means of temporal representations, and quantitative parameters such as spectrograms, frequency and amplitude perturbation measures, harmonic energy, and noise are calculated using digital signal processing techniques, giving an idea of the degree of hoarseness and the quality of the voice register. Within this framework, the system uses a standard protocol to evaluate and build complete databases of voice disorders. The target users of this system are speech and language therapists and ear, nose, and throat (ENT) clinicians. The application can be easily configured to cover the needs of both groups of professionals. The software has a user-friendly Windows-style interface. The PC should be equipped with standard sound and video capture cards. Signals are captured using common transducers: a microphone, an electroglottograph, and a fiberscope or telelaryngoscope. The clinical usefulness of the system is addressed in a comprehensive evaluation section.
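    Frequency perturbation (jitter), one of the parameters such a tool reports, can be sketched as the mean absolute difference between consecutive glottal cycle periods relative to the mean period. The cycle periods below are illustrative values, not data from the system.

```python
def jitter_percent(periods_ms):
    """Local jitter: mean absolute difference between consecutive
    glottal cycle periods, as a percentage of the mean period.
    Higher values suggest a hoarser, less stable voice."""
    diffs = [abs(b - a) for a, b in zip(periods_ms, periods_ms[1:])]
    mean_period = sum(periods_ms) / len(periods_ms)
    return 100.0 * (sum(diffs) / len(diffs)) / mean_period

# Cycle periods (ms) extracted from a sustained vowel, illustrative only.
periods = [8.0, 8.1, 7.9, 8.2, 8.0]
jit = jitter_percent(periods)
```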

  9. Development of an exposure measurement database on five lung carcinogens (ExpoSYN) for quantitative retrospective occupational exposure assessment.

    PubMed

    Peters, Susan; Vermeulen, Roel; Olsson, Ann; Van Gelder, Rainer; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Williams, Nick; Woldbæk, Torill; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Dahmann, Dirk; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans

    2012-01-01

    SYNERGY is a large pooled analysis of case-control studies on the joint effects of occupational carcinogens and smoking in the development of lung cancer. A quantitative job-exposure matrix (JEM) will be developed to assign exposures to five major lung carcinogens [asbestos, chromium, nickel, polycyclic aromatic hydrocarbons (PAH), and respirable crystalline silica (RCS)]. We assembled an exposure database, called ExpoSYN, to enable such a quantitative exposure assessment. Existing exposure databases were identified, and European and Canadian research institutes were approached to identify pertinent exposure measurement data. Results of individual air measurements were entered in anonymized form according to a standardized protocol. The ExpoSYN database currently includes 356 551 measurements from 19 countries. In total, 140 666 personal and 215 885 stationary data points were available. Measurements were distributed over the five agents as follows: RCS (42%), asbestos (20%), chromium (16%), nickel (15%), and PAH (7%). The measurement data cover the time period from 1951 to the present; however, only a small portion of measurements (1.4%) were performed prior to 1975. The major contributing countries for personal measurements were Germany (32%), the UK (22%), France (14%), and Norway and Canada (both 11%). ExpoSYN is a unique occupational exposure database with measurements from 18 European countries and Canada covering a time period of >50 years. This database will be used to develop a country-, job-, and time period-specific quantitative JEM, which will enable data-driven quantitative exposure assessment in a multinational pooled analysis of community-based lung cancer case-control studies.
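    The planned country-, job-, and time period-specific JEM can be sketched as an aggregation over ExpoSYN-style measurement records; the record fields, grouping keys, and the choice of a geometric mean (commonly used because exposure data tend to be log-normally distributed) are illustrative assumptions, not the SYNERGY method.

```python
import math
from collections import defaultdict

def build_jem(measurements):
    """Group exposure measurements by (country, job, period) and
    summarize each JEM cell with the geometric mean of its values."""
    cells = defaultdict(list)
    for m in measurements:
        cells[(m["country"], m["job"], m["period"])].append(m["value"])
    return {key: math.exp(sum(math.log(v) for v in vals) / len(vals))
            for key, vals in cells.items()}

jem = build_jem([
    {"country": "DE", "job": "miner", "period": "1980s", "value": 1.0},
    {"country": "DE", "job": "miner", "period": "1980s", "value": 4.0},
])
```

    Each cell of the resulting matrix is then a single exposure estimate that can be joined to a subject's country, job code, and employment period.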

  10. SSPI - Space Service Provider Infrastructure: Image Information Mining and Management Prototype for a Distributed Environment

    NASA Astrophysics Data System (ADS)

    Candela, L.; Ruggieri, G.; Giancaspro, A.

    2004-09-01

    Within the "Multi-Mission Ground Segment" project of the Italian Space Agency, several innovative technologies such as CORBA[1], Z39.50[2], XML[3], Java[4], JavaServer Pages[4] and C++ have been tested. The SSPI system (Space Service Provider Infrastructure) is the prototype of a distributed environment aimed at facilitating access to Earth Observation (EO) data. SSPI allows users to ingest, archive, consolidate, visualize and evaluate these data. Hence, SSPI is not just a database or a data repository, but an application that, by means of a set of protocols, standards and specifications, provides unified access to multi-mission EO data.

  11. [Medical imaging in tumor precision medicine: opportunities and challenges].

    PubMed

    Xu, Jingjing; Tan, Yanbin; Zhang, Minming

    2017-05-25

    Tumor precision medicine is an emerging approach to tumor diagnosis, treatment and prevention that takes into account individual variability in environment, lifestyle and genetic information. Tumor precision medicine builds on the medical imaging innovations of the past decades, including new hardware, new imaging agents, standardized protocols, image analysis and multimodal imaging fusion technology. The development of automated and reproducible analysis algorithms has also made it possible to extract large amounts of information from image-based features. With the continuous development and mining of tumor clinical and imaging databases, radiogenomics, radiomics and artificial intelligence have flourished. These new technological advances therefore bring both opportunities and challenges to the application of imaging in tumor precision medicine.

  12. An overview of platelet products (PRP, PRGF, PRF, etc.) in the Iranian studies.

    PubMed

    Raeissadat, Seyed Ahmad; Babaee, Marzieh; Rayegani, Seyed Mansour; Hashemi, Zahra; Hamidieh, Amir Ali; Mojgani, Parviz; Fouladi Vanda, Hossein

    2017-11-01

    The aim of the study was to carry out a review of published Iranian studies on various platelet products. Electronic databases were searched for relevant articles. Two review authors independently extracted data via a tested extraction sheet, and disagreements were resolved by a meeting with a third review author. Bone disorders (25%), wounds and fistulas (16%), dental and gingival disorders (14%) and osteoarthritis (11%) were the most frequently represented fields. The most important points raised in the Iranian studies were the necessity of following standard protocols in the preparation of platelet products, stating the precise content of platelets and growth factors, and long-term follow-up of study subjects.

  13. EuroPhenome and EMPReSS: online mouse phenotyping resource

    PubMed Central

    Mallon, Ann-Marie; Hancock, John M.

    2008-01-01

    EuroPhenome (http://www.europhenome.org) and EMPReSS (http://empress.har.mrc.ac.uk/) form an integrated resource to provide access to data and procedures for mouse phenotyping. EMPReSS describes 96 Standard Operating Procedures for mouse phenotyping. EuroPhenome contains data resulting from carrying out EMPReSS protocols on four inbred laboratory mouse strains. As well as web interfaces, both resources support web services to enable integration with other mouse phenotyping and functional genetics resources, and are committed to initiatives to improve integration of mouse phenotype databases. EuroPhenome will be the repository for a recently initiated effort to carry out large-scale phenotyping on a large number of knockout mouse lines (EUMODIC). PMID:17905814

  14. EuroPhenome and EMPReSS: online mouse phenotyping resource.

    PubMed

    Mallon, Ann-Marie; Blake, Andrew; Hancock, John M

    2008-01-01

    EuroPhenome (http://www.europhenome.org) and EMPReSS (http://empress.har.mrc.ac.uk/) form an integrated resource to provide access to data and procedures for mouse phenotyping. EMPReSS describes 96 Standard Operating Procedures for mouse phenotyping. EuroPhenome contains data resulting from carrying out EMPReSS protocols on four inbred laboratory mouse strains. As well as web interfaces, both resources support web services to enable integration with other mouse phenotyping and functional genetics resources, and are committed to initiatives to improve integration of mouse phenotype databases. EuroPhenome will be the repository for a recently initiated effort to carry out large-scale phenotyping on a large number of knockout mouse lines (EUMODIC).

  15. An Extensible Information Grid for Risk Management

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David G.

    2003-01-01

    This paper describes recent work on developing an extensible information grid for risk management at NASA - a RISK INFORMATION GRID. This grid is being developed by integrating information grid technology with risk management processes for a variety of risk related applications. To date, RISK GRID applications are being developed for three main NASA processes: risk management - a closed-loop iterative process for explicit risk management, program/project management - a proactive process that includes risk management, and mishap management - a feedback loop for learning from historical risks that escaped other processes. This is enabled through an architecture involving an extensible database, structuring information with XML, schemaless mapping of XML, and secure server-mediated communication using standard protocols.
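    The "schemaless mapping of XML" mentioned above can be sketched as decomposing an arbitrary document into path-value rows that all fit one generic storage table, so no per-document schema is needed; the function name and path syntax are illustrative assumptions, not the RISK GRID implementation.

```python
import xml.etree.ElementTree as ET

def flatten(xml_text):
    """Decompose an arbitrary XML document into (path, value) rows,
    so documents with unknown schemas can share one generic table."""
    rows = []
    def walk(node, path):
        path = f"{path}/{node.tag}"
        for name, value in node.attrib.items():
            rows.append((f"{path}@{name}", value))  # attributes as rows
        text = (node.text or "").strip()
        if text:
            rows.append((path, text))  # element text as a row
        for child in node:
            walk(child, path)
    walk(ET.fromstring(xml_text), "")
    return rows

rows = flatten("<risk id='7'><cause>leak</cause></risk>")
```

    Because every document reduces to the same row shape, new risk-record formats can be ingested without schema migrations.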

  16. Combined Protocol for Acute Malnutrition Study (ComPAS) in rural South Sudan and urban Kenya: study protocol for a randomized controlled trial.

    PubMed

    Bailey, Jeanette; Lelijveld, Natasha; Marron, Bethany; Onyoo, Pamela; Ho, Lara S; Manary, Mark; Briend, André; Opondo, Charles; Kerac, Marko

    2018-04-24

    Acute malnutrition is a continuum condition, but severe and moderate forms are treated separately, with different protocols and therapeutic products, managed by separate United Nations agencies. The Combined Protocol for Acute Malnutrition Study (ComPAS) aims to simplify and unify the treatment of uncomplicated severe and moderate acute malnutrition (SAM and MAM) for children 6-59 months into one protocol in order to improve the global coverage, quality, continuity of care and cost-effectiveness of acute malnutrition treatment in resource-constrained settings. This study is a multi-site, cluster randomized non-inferiority trial with 12 clusters in Kenya and 12 clusters in South Sudan. Participants are 3600 children aged 6-59 months with uncomplicated acute malnutrition. This study will evaluate the impact of a simplified and combined protocol for the treatment of SAM and MAM compared to the standard protocol, which is the national treatment protocol in each country. We will assess recovery rate as a primary outcome and coverage, defaulting, death, length of stay, average weekly weight gain and average weekly mid-upper arm circumference (MUAC) gain as secondary outcomes. Recovery rate is defined across both treatment arms as MUAC ≥125 mm and no oedema for two consecutive visits. Per-protocol and intention-to-treat analyses will be conducted. If the combined protocol is shown to be non-inferior to the standard protocol, updating guidelines to use the combined protocol would eliminate the need for separate products, resources and procedures for MAM treatment. This would likely be more cost-effective, increase availability of services, enable earlier case finding and treatment before deterioration of MAM into SAM, promote better continuity of care and improve community perceptions of the programme. ISRCTN: ISRCTN30393230. Registered on 16 March 2017.
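    The recovery criterion defined in the protocol (MUAC ≥125 mm and no oedema for two consecutive visits) can be expressed as a small check; the function name and data layout are assumptions for illustration.

```python
def has_recovered(visits):
    """visits: chronological list of (muac_mm, oedema) tuples.
    Recovery = MUAC >= 125 mm and no oedema at two consecutive visits."""
    ok = [muac >= 125 and not oedema for muac, oedema in visits]
    return any(a and b for a, b in zip(ok, ok[1:]))
```

    A single good visit is not enough; the criterion requires the child to meet both thresholds at two visits in a row.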

  17. Simulation-based Randomized Comparative Assessment of Out-of-Hospital Cardiac Arrest Resuscitation Bundle Completion by Emergency Medical Service Teams Using Standard Life Support or an Experimental Automation-assisted Approach.

    PubMed

    Choi, Bryan; Asselin, Nicholas; Pettit, Catherine C; Dannecker, Max; Machan, Jason T; Merck, Derek L; Merck, Lisa H; Suner, Selim; Williams, Kenneth A; Jay, Gregory D; Kobayashi, Leo

    2016-12-01

    Effective resuscitation of out-of-hospital cardiac arrest (OHCA) patients is challenging. Alternative resuscitative approaches using electromechanical adjuncts may improve provider performance. Investigators applied simulation to study the effect of an experimental automation-assisted, goal-directed OHCA management protocol on EMS providers' resuscitation performance relative to standard protocols and equipment. Two-provider teams, each comprising an emergency medical technician (EMT)-B and an EMT-I/C/P, were randomized to a control or experimental group. Each team engaged in three simulations: a baseline simulation (standard roles), a repeat simulation (standard roles), and an abbreviated repeat simulation (reversed roles, i.e., the basic life support provider performing ALS tasks). Control teams used standard OHCA protocols and equipment (with a high-performance cardiopulmonary resuscitation training intervention); for the second and third simulations, experimental teams performed chest compression, defibrillation, airway, pulmonary ventilation, vascular access, medication, and transport tasks with the goal-directed protocol and resuscitation-automating devices. Video recorders and simulator logs collected resuscitation data. Ten control and 10 experimental teams comprised 20 EMT-Bs, 1 EMT-I, 8 EMT-Cs, and 11 EMT-Ps; the study groups were not fully matched. Both groups performed chest compressions and ventilations suboptimally at baseline. In their second simulations, control teams performed similarly except for reduced on-scene time, while experimental teams improved their chest compressions (P=0.03), pulmonary ventilations (P<0.01), and medication administration (P=0.02); changes in their performance of chest compression, defibrillation, airway, and transport tasks did not attain significance against control teams' changes. Experimental teams maintained their performance improvements during the reversed-role simulations. Simulation-based investigation into OHCA resuscitation revealed considerable variability and improvable deficiencies in small EMS teams. Goal-directed, automation-assisted OHCA management augmented the performance of select resuscitation bundle elements without comprehensive improvement.

  18. Development of a web database portfolio system with PACS connectivity for undergraduate health education and continuing professional development.

    PubMed

    Ng, Curtise K C; White, Peter; McKay, Janice C

    2009-04-01

    Increasingly, the use of web database portfolio systems is noted in medical and health education and for continuing professional development (CPD). However, the functions of existing systems are not always aligned with the corresponding pedagogy, and hence reflection is often lost. This paper presents the development of a tailored web database portfolio system with Picture Archiving and Communication System (PACS) connectivity, based on portfolio pedagogy. Following a pre-determined portfolio framework, a system model is proposed with the components of web, database and mail servers, server-side scripts, and a Query/Retrieve (Q/R) broker for conversion between Hypertext Transfer Protocol (HTTP) requests and the Q/R service class of the Digital Imaging and Communications in Medicine (DICOM) standard. The system was piloted with seventy-seven volunteers. A tailored web database portfolio system (http://radep.hti.polyu.edu.hk) was developed. Technological arrangements for reinforcing portfolio pedagogy include popup windows (reminders) with guidelines and probing questions to 'collect', 'select' and 'reflect' on evidence of development/experience, a limit on the number of files (evidence) that can be uploaded, an 'Evidence Insertion' function to link individual uploaded artifacts with reflective writing, the capability to accommodate a diversity of contents, and convenient interfaces for reviewing portfolios and for communication. Evidence to date suggests the system supports users in building their portfolios with sound hypertext reflection under a facilitator's guidance, and supports reviewers in monitoring students' progress and providing feedback and comments online in a programme-wide setting.
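    The broker's conversion from HTTP requests to DICOM Q/R queries can be sketched as a translation of query-string parameters into C-FIND attribute keywords; the parameter names and the reduced keyword set below are assumptions, not the system's actual mapping.

```python
from urllib.parse import urlparse, parse_qs

# Assumed mapping from HTTP query parameters to DICOM C-FIND
# attribute keywords (a real broker would cover many more tags).
HTTP_TO_DICOM = {
    "patient": "PatientName",
    "study": "StudyInstanceUID",
    "modality": "Modality",
    "date": "StudyDate",
}

def http_to_cfind(url):
    """Translate an HTTP query URL into a C-FIND identifier dict."""
    params = parse_qs(urlparse(url).query)
    return {HTTP_TO_DICOM[k]: v[0] for k, v in params.items()
            if k in HTTP_TO_DICOM}

query = http_to_cfind("http://broker/qr?patient=DOE^JOHN&modality=CR")
```

    The resulting identifier dict would then be handed to a DICOM toolkit to issue the actual C-FIND against the PACS.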

  19. SMITH: a LIMS for handling next-generation sequencing workflows

    PubMed Central

    2014-01-01

    Background Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, maintain a high quality standard, and reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). Methods SMITH is a web application with a MySQL server at the backend. It was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows an unconstrained textual description to be associated with each sample and with all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. Results SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available through an API provided by the workflow management system. The parameters and input data are passed to the workflow engine, which performs de-multiplexing, quality control, alignments, etc. Conclusions SMITH standardizes, automates, and speeds up sequencing workflows. Annotation of data with key-value pairs facilitates meta-analysis. PMID:25471934
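    The attribute-value table described above can be sketched in SQLite; the table and column names are assumptions for illustration, not SMITH's actual MySQL schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sample (id INTEGER PRIMARY KEY, name TEXT)")
# One row per free-form annotation: any attribute can be attached
# to any sample without changing the schema.
conn.execute("""CREATE TABLE sample_attribute (
    sample_id INTEGER REFERENCES sample(id),
    attribute TEXT, value TEXT)""")

conn.execute("INSERT INTO sample VALUES (1, 'chip_seq_42')")
conn.executemany(
    "INSERT INTO sample_attribute VALUES (?, ?, ?)",
    [(1, "antibody", "H3K4me3"), (1, "organism", "mouse")])

# Metadata search: find samples annotated with a given key-value pair.
rows = conn.execute("""
    SELECT s.name FROM sample s
    JOIN sample_attribute a ON a.sample_id = s.id
    WHERE a.attribute = 'antibody' AND a.value = 'H3K4me3'""").fetchall()
```

    New annotation keys never require a schema migration, which is what lets such metadata absorb frequently changing protocols.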

  20. SMITH: a LIMS for handling next-generation sequencing workflows.

    PubMed

    Venco, Francesco; Vaskin, Yuriy; Ceol, Arnaud; Muller, Heiko

    2014-01-01

    Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, maintain a high quality standard, and reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). SMITH is a web application with a MySQL server at the backend. It was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows an unconstrained textual description to be associated with each sample and with all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available through an API provided by the workflow management system. The parameters and input data are passed to the workflow engine, which performs de-multiplexing, quality control, alignments, etc. SMITH standardizes, automates, and speeds up sequencing workflows. Annotation of data with key-value pairs facilitates meta-analysis.
