Sample records for comprehensive analysis tool

  1. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
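
    MetaboTools itself ships as Matlab code built on the COBRA Toolbox. As a feel for the core operation the protocol describes (constraining a genome-scale model with measured extracellular exchange rates, then optimizing), here is a minimal Python sketch using the cobrapy library; the bundled "textbook" model, the exchange-reaction IDs, and the rates are illustrative and are not taken from the paper.

```python
# Minimal sketch of the MetaboTools idea via cobrapy (the toolbox itself
# is Matlab/COBRA code). Rates and reaction IDs are illustrative only.
from cobra.io import load_model

model = load_model("textbook")  # bundled E. coli core model

# Measured extracellular exchange rates (mmol/gDW/h); negative = uptake.
measured = {"EX_glc__D_e": (-10.0, -8.0),   # glucose uptake window
            "EX_lac__D_e": (0.5, 2.0)}      # lactate secretion window

for rxn_id, (lb, ub) in measured.items():
    rxn = model.reactions.get_by_id(rxn_id)
    rxn.lower_bound, rxn.upper_bound = lb, ub

solution = model.optimize()  # flux balance analysis on the constrained model
print(f"predicted growth rate: {solution.objective_value:.3f} 1/h")
```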

  2. Financing Alternatives Comparison Tool

    EPA Pesticide Factsheets

    FACT is a financial analysis tool that helps identify the most cost-effective method to fund a wastewater or drinking water management project. It produces a comprehensive analysis that compares various financing options.

  3. User Guide for the Financing Alternatives Comparison Tool

    EPA Pesticide Factsheets

    FACT is a financial analysis tool that helps identify the most cost-effective method to fund a wastewater or drinking water management project. It creates a comprehensive analysis that compares various financing options.

  4. Sustainability Tools Inventory Initial Gap Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  5. Implementation of a novel communication tool and its effect on patient comprehension of care and satisfaction.

    PubMed

    Simmons, Stefanie Anne; Sharp, Brian; Fowler, Jennifer; Singal, Bonita

    2013-05-01

    Emergency department (ED) communication has been demonstrated as requiring improvement and ED patients have repeatedly demonstrated poor comprehension of the care they receive. Through patient focus groups, the authors developed a novel tool designed to improve communication and patient comprehension. This is a prospective, randomised controlled clinical trial to test the efficacy of a novel, patient-centred communication tool. Patients in a small community hospital ED were randomised to receive the instrument, which was utilised by the entire ED care team and served as a checklist or guide to the patients' ED stay. At the end of the ED stay, patients completed a survey of their comprehension of the care and a communication assessment tool-team survey (a validated instrument to assess satisfaction with communication). Three blinded chart reviewers scored patients' comprehension of their ED care as concordant, partially concordant or discordant with charted care. The authors tested whether there was a difference in satisfaction using a two-sample t test and a difference in comprehension using ordinal logistic regression analysis. 146 patients were enrolled in the study with 72 randomised to receive the communication instrument. There was no significant difference between groups in comprehension (OR=0.65, 95% CI 0.34 to 1.23, p=0.18) or communication assessment tool-team scores (difference=0.2, 95% CI: -3.4 to 3.8, p=0.91). Using their novel communication tool, the authors were not able to show a statistically significant improvement in either comprehension or satisfaction, though a tendency towards improved comprehension was seen.
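
    The two analyses named in the abstract are standard and easy to reproduce. A minimal sketch on synthetic data (the group sizes mirror the study, but every value is invented, so the outputs are meaningless) using scipy and statsmodels:

```python
# Sketch of the abstract's two analyses: a two-sample t test for
# satisfaction and ordinal logistic regression for the three-level
# comprehension outcome. All data below are synthetic.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "tool": np.repeat([1, 0], [72, 74]),      # 72 of 146 got the tool
    "satisfaction": rng.normal(70, 15, 146),
    # 0 = discordant, 1 = partially concordant, 2 = concordant
    "comprehension": rng.integers(0, 3, 146),
})

t, p = stats.ttest_ind(df.loc[df.tool == 1, "satisfaction"],
                       df.loc[df.tool == 0, "satisfaction"])
print(f"satisfaction: t = {t:.2f}, p = {p:.3f}")

fit = OrderedModel(df["comprehension"], df[["tool"]],
                   distr="logit").fit(method="bfgs", disp=False)
print(np.exp(fit.params["tool"]))  # odds ratio for the tool group
```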

  6. Comprehensive Safety Analysis 2010 Safety Measurement System (SMS) Methodology, Version 2.1 Revised December 2010

    DOT National Transportation Integrated Search

    2010-12-01

    This report documents the Safety Measurement System (SMS) methodology developed to support the Comprehensive Safety Analysis 2010 (CSA 2010) Initiative for the Federal Motor Carrier Safety Administration (FMCSA). The SMS is one of the major tools for...

  7. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  8. HiC-bench: comprehensive and reproducible Hi-C data analysis designed for parameter exploration and benchmarking.

    PubMed

    Lazaris, Charalampos; Kelly, Stephen; Ntziachristos, Panagiotis; Aifantis, Iannis; Tsirigos, Aristotelis

    2017-01-05

    Chromatin conformation capture techniques have evolved rapidly over the last few years and have provided new insights into genome organization at an unprecedented resolution. Analysis of Hi-C data is complex and computationally intensive, involving multiple tasks and requiring robust quality assessment. This has led to the development of several tools and methods for processing Hi-C data. However, most of the existing tools do not cover all aspects of the analysis and offer only a few quality assessment options. Additionally, the availability of a multitude of tools makes scientists wonder how these tools and associated parameters can be optimally used, and how potential discrepancies can be interpreted and resolved. Most importantly, investigators need to be assured that slight changes in parameters and/or methods do not affect the conclusions of their studies. To address these issues (compare, explore and reproduce), we introduce HiC-bench, a configurable computational platform for comprehensive and reproducible analysis of Hi-C sequencing data. HiC-bench performs all common Hi-C analysis tasks, such as alignment, filtering, contact matrix generation and normalization, identification of topological domains, and scoring and annotation of specific interactions, using both published tools and our own. We have also embedded various tasks that perform quality assessment and visualization. HiC-bench is implemented as a data flow platform with an emphasis on analysis reproducibility. Additionally, the user can readily perform parameter exploration and comparison of different tools in a combinatorial manner that takes into account all desired parameter settings in each pipeline task. This unique feature facilitates the design and execution of complex benchmark studies that may involve combinations of multiple tool/parameter choices in each step of the analysis. To demonstrate the usefulness of our platform, we performed a comprehensive benchmark of existing and new TAD callers, exploring different matrix correction methods, parameter settings and sequencing depths. Users can extend our pipeline by adding more tools as they become available. HiC-bench is an easy-to-use and extensible platform for comprehensive analysis of Hi-C datasets. We expect that it will facilitate current analyses and help scientists formulate and test new hypotheses in the field of three-dimensional genome organization.
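
    The combinatorial parameter exploration HiC-bench emphasizes reduces to enumerating a Cartesian product of per-step choices. A toy Python sketch (the step names and options are hypothetical, not HiC-bench's actual configuration):

```python
# Toy illustration of combinatorial parameter exploration: every
# combination of tool/parameter choices per pipeline step becomes its own
# analysis branch. Step names and options here are hypothetical.
from itertools import product

steps = {
    "matrix_correction": ["naive", "iterative", "kr"],
    "resolution_kb": [40, 100],
    "tad_caller": ["caller_a", "caller_b"],
}

branches = [dict(zip(steps, combo)) for combo in product(*steps.values())]
print(f"{len(branches)} analysis branches")  # 3 * 2 * 2 = 12
for branch in branches[:3]:
    print(branch)
```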

  9. Requirements for Next Generation Comprehensive Analysis of Rotorcraft

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne; Datta, Anubhav

    2008-01-01

    The unique demands of rotorcraft aeromechanics analysis have led to the development of software tools that are described as comprehensive analyses. The next generation of rotorcraft comprehensive analyses will be driven and enabled by the tremendous capabilities of high performance computing, particularly modular and scalable software executed on multiple cores. Development of a comprehensive analysis based on high performance computing both demands and permits a new analysis architecture. This paper describes a vision of the requirements for this next generation of comprehensive analyses of rotorcraft. The requirements are described and substantiated for what must be included, and justification is provided for what should be excluded. With this guide, a path to the next generation code can be found.

  10. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and in regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. This review aims to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  11. COMAN: a web server for comprehensive metatranscriptomics analysis.

    PubMed

    Ni, Yueqiong; Li, Jun; Panagiotou, Gianni

    2016-08-11

    Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads and removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/. COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to help researchers with less expertise in bioinformatics answer microbiota-related biological questions and to increase the accessibility and interpretation of microbiota RNA-Seq data.

  12. Informed consent comprehension in African research settings.

    PubMed

    Afolabi, Muhammed O; Okebe, Joseph U; McGrath, Nuala; Larson, Heidi J; Bojang, Kalifa; Chandramohan, Daniel

    2014-06-01

    Previous reviews on participants' comprehension of informed consent information have focused on developed countries. Experience has shown that ethical standards developed on Western values may not be appropriate for African settings where research concepts are unfamiliar. We undertook this review to describe how informed consent comprehension is defined and measured in African research settings. We conducted a comprehensive search involving five electronic databases: Medline, Embase, Global Health, EthxWeb and Bioethics Literature Database (BELIT). We also examined African Index Medicus and Google Scholar for relevant publications on informed consent comprehension in clinical studies conducted in sub-Saharan Africa. 29 studies satisfied the inclusion criteria; meta-analysis was possible in 21 studies. We further conducted a direct comparison of participants' comprehension on domains of informed consent in all eligible studies. Comprehension of key concepts of informed consent varies considerably from country to country and depends on the nature and complexity of the study. Meta-analysis showed that 47% of a total of 1633 participants across four studies demonstrated comprehension about randomisation (95% CI 13.9-80.9%). Similarly, 48% of 3946 participants in six studies had understanding about placebo (95% CI 19.0-77.5%), while only 30% of 753 participants in five studies understood the concept of therapeutic misconception (95% CI 4.6-66.7%). Measurement tools for informed consent comprehension were developed with little or no validation. Assessment of comprehension was carried out at variable times after disclosure of study information. No uniform definition of informed consent comprehension exists to form the basis for development of an appropriate tool to measure comprehension in African participants. Comprehension of key concepts of informed consent is poor among study participants across Africa. There is a vital need to develop a uniform definition for informed consent comprehension in low literacy research settings in Africa. This will be an essential step towards developing appropriate tools that can adequately measure informed consent comprehension. This may consequently suggest adequate measures to improve the informed consent procedure.
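
    Pooled estimates such as "47% (95% CI 13.9-80.9%)" are typically produced by a random-effects meta-analysis of proportions. The review does not state its exact model, so the sketch below shows one common choice, DerSimonian-Laird pooling of logit-transformed proportions, on invented study counts:

```python
# Sketch of a random-effects pooled proportion (DerSimonian-Laird on
# logit-transformed proportions). Study counts below are invented.
import numpy as np

events = np.array([120, 300, 80, 260])    # participants who comprehended
totals = np.array([200, 700, 333, 400])   # per-study sample sizes

p = events / totals
logit = np.log(p / (1 - p))
var = 1 / events + 1 / (totals - events)  # variance of each logit
w = 1 / var

# Between-study heterogeneity (DerSimonian-Laird tau^2)
fixed = np.sum(w * logit) / np.sum(w)
q = np.sum(w * (logit - fixed) ** 2)
tau2 = max(0.0, (q - (len(p) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_star = 1 / (var + tau2)
mu = np.sum(w_star * logit) / np.sum(w_star)
se = np.sqrt(1 / np.sum(w_star))
inv = lambda x: 1 / (1 + np.exp(-x))  # back-transform logit to proportion
print(f"pooled = {inv(mu):.1%} "
      f"(95% CI {inv(mu - 1.96 * se):.1%} to {inv(mu + 1.96 * se):.1%})")
```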

  13. CRCDA—Comprehensive resources for cancer NGS data analysis

    PubMed Central

    Thangam, Manonanthini; Gopal, Ramesh Kumar

    2015-01-01

    Next generation sequencing (NGS) innovations have set a compelling landmark in life science and changed the direction of research in clinical oncology through their capacity to aid the diagnosis and treatment of cancer. The aim of our portal, comprehensive resources for cancer NGS data analysis (CRCDA), is to provide a collection of different NGS tools and pipelines under diverse classes, together with cancer pathways and databases and, furthermore, literature information from PubMed. The literature data were constrained to the 18 most common cancer types, such as breast cancer and colon cancer, that occur in the worldwide population. For convenience, NGS-cancer tools have been categorized into cancer genomics, cancer transcriptomics, cancer epigenomics, quality control and visualization. Pipelines for variant detection, quality control and data analysis were listed to provide an out-of-the-box solution for NGS data analysis, which may help researchers to overcome challenges in selecting and configuring individual tools for analysing exome, whole genome and transcriptome data. An extensive search page was developed that can be queried by using (i) type of data [literature, gene data and sequence read archive (SRA) data] and (ii) type of cancer (selected based on global incidence and accessibility of data). For each category of analysis, a variety of tools is available, and the biggest challenge is searching for and using the right tool for the right application. The objective of the work is to collect the tools available in each category at various places and to arrange the tools and other data in a simple and user-friendly manner for biologists and oncologists to find information more easily. To the best of our knowledge, we have collected and presented a comprehensive package of most of the resources available for NGS data analysis in cancer. Given these factors, we believe that this website will be a useful resource to the NGS research community working on cancer. Database URL: http://bioinfo.au-kbc.org.in/ngs/ngshome.html. PMID:26450948

  14. Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project

    NASA Astrophysics Data System (ADS)

    van Eck, T.; Giardini, D.

    2010-12-01

    The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive European integrated RI for earthquake seismological data that is scalable and sustainable. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented advanced analysis tools and software packages. A single seismic data portal provides a single access point and overview for European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include, among others: - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from more than 53 observatories. This data is continuously monitored, quality controlled and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core comprehensive European acceleration database. Standardized parameter analysis and actual software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000 - 1963, M ≥ 5.8), including analysis tools. - Data from three one-year OBS deployments at three sites (Atlantic, Ionian and Ligurian Sea) in the general SEED format, thus creating the core integrated database for ocean, sea and land based seismological observatories. Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models, including a standardized model description with several visualisation tools, currently adapted on a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community-accepted forecasting testing and model validation approach, and the core hazard portal developed along the same technologies as the NERIES data portal. - Implemented homogeneous shakemap estimation tools at several large European observatories and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization with relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange and information management in seismology as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010 - 2014), is funded and will integrate the seismological and earthquake engineering infrastructures. NERIES further provided the proof of concept for the ESFRI2008 initiative: the European Plate Observing System (EPOS). Its preparatory phase (2010 - 2014) is also funded by the EC.
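
    Programmatic access of the kind this infrastructure enables is commonly scripted today with ObsPy's FDSN client. ObsPy is not mentioned in the record, and the station and time window below are illustrative; running the sketch requires network access to the ORFEUS data centre.

```python
# Minimal sketch of pulling waveforms from the ORFEUS/EIDA archives the
# abstract describes, via ObsPy's FDSN client. Station choice is arbitrary.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("ORFEUS")
t0 = UTCDateTime("2010-01-01T00:00:00")
stream = client.get_waveforms(network="NL", station="HGN", location="*",
                              channel="BHZ", starttime=t0, endtime=t0 + 600)
print(stream)  # ten minutes of broadband vertical-component data
```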

  15. Does Use of Text-to-Speech and Related Read-Aloud Tools Improve Reading Comprehension for Students with Reading Disabilities? A Meta-Analysis

    ERIC Educational Resources Information Center

    Wood, Sarah G.; Moxley, Jerad H.; Tighe, Elizabeth L.; Wagner, Richard K.

    2018-01-01

    Text-to-speech and related read-aloud tools are being widely implemented in an attempt to assist students' reading comprehension skills. Read-aloud software, including text-to-speech, is used to translate written text into spoken text, enabling one to listen to written text while reading along. It is not clear how effective text-to-speech is at…

  16. VIPER: Visualization Pipeline for RNA-seq, a Snakemake workflow for efficient and complete RNA-seq analysis.

    PubMed

    Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W

    2018-04-12

    RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake, we have developed a user-friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand the capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Regions) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively with a built-in capacity for customization and expansion.
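
    VIPER is built on Snakemake, whose Snakefiles extend Python syntax with rule blocks. The two-rule fragment below shows the style of workflow involved; these rules are illustrative, not VIPER's actual rules, and they assume STAR, featureCounts, and a genes.gtf annotation are available on the system.

```python
# Illustrative Snakefile fragment (Snakemake rules in Python-superset
# syntax): align each sample, then build one count matrix.
SAMPLES = ["sample1", "sample2"]

rule all:
    input:
        "counts/matrix.txt"

rule align:
    input:
        fq="fastq/{sample}.fastq.gz"
    output:
        bam="aligned/{sample}.bam"
    shell:
        "STAR --readFilesIn {input.fq} --readFilesCommand zcat "
        "--outSAMtype BAM SortedByCoordinate "
        "--outFileNamePrefix aligned/{wildcards.sample}. && "
        "mv aligned/{wildcards.sample}.Aligned.sortedByCoord.out.bam {output.bam}"

rule count:
    input:
        expand("aligned/{sample}.bam", sample=SAMPLES)
    output:
        "counts/matrix.txt"
    shell:
        "featureCounts -a genes.gtf -o {output} {input}"
```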

  17. ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.

    PubMed

    Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley

    2016-10-03

    Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analyses of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq related quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin . ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports. We have successfully compiled 23,677 profiles into a comprehensive quality atlas with fine classification for users.

  18. PinAPL-Py: A comprehensive web-application for the analysis of CRISPR/Cas9 screens.

    PubMed

    Spahn, Philipp N; Bath, Tyler; Weiss, Ryan J; Kim, Jihoon; Esko, Jeffrey D; Lewis, Nathan E; Harismendy, Olivier

    2017-11-20

    Large-scale genetic screens using CRISPR/Cas9 technology have emerged as a major tool for functional genomics. With its increased popularity, experimental biologists frequently acquire large sequencing datasets for which they often do not have an easy analysis option. While a few bioinformatic tools have been developed for this purpose, their utility is still hindered either due to limited functionality or the requirement of bioinformatic expertise. To make sequencing data analysis of CRISPR/Cas9 screens more accessible to a wide range of scientists, we developed a Platform-independent Analysis of Pooled Screens using Python (PinAPL-Py), which is operated as an intuitive web-service. PinAPL-Py implements state-of-the-art tools and statistical models, assembled in a comprehensive workflow covering sequence quality control, automated sgRNA sequence extraction, alignment, sgRNA enrichment/depletion analysis and gene ranking. The workflow is set up to use a variety of popular sgRNA libraries as well as custom libraries that can be easily uploaded. Various analysis options are offered, suitable to analyze a large variety of CRISPR/Cas9 screening experiments. Analysis output includes ranked lists of sgRNAs and genes, and publication-ready plots. PinAPL-Py helps to advance genome-wide screening efforts by combining comprehensive functionality with user-friendly implementation. PinAPL-Py is freely accessible at http://pinapl-py.ucsd.edu with instructions and test datasets.
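
    At its core, enrichment/depletion analysis ranks normalized guide counts between conditions. A generic pandas sketch of that step follows; PinAPL-Py's actual statistical models are more elaborate, and the counts and column names here are invented.

```python
# Generic sketch of the core CRISPR-screen enrichment step: normalize
# guide counts, compute log2 fold changes, aggregate to genes.
import numpy as np
import pandas as pd

counts = pd.DataFrame({
    "gene": ["A", "A", "B", "B"],
    "control": [250, 300, 40, 55],
    "treatment": [900, 750, 35, 60],
}, index=["sgA_1", "sgA_2", "sgB_1", "sgB_2"])

# Normalize each sample to counts-per-million.
for col in ["control", "treatment"]:
    counts[col + "_cpm"] = counts[col] / counts[col].sum() * 1e6

# Log2 fold change with a pseudocount to stabilize low counts.
counts["log2fc"] = np.log2((counts["treatment_cpm"] + 1) /
                           (counts["control_cpm"] + 1))

# Rank genes by the median log2 fold change of their guides.
gene_scores = counts.groupby("gene")["log2fc"].median().sort_values(ascending=False)
print(gene_scores)
```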

  19. Comparing 2 National Organization-Level Workplace Health Promotion and Improvement Tools, 2013–2015

    PubMed Central

    Lang, Jason E.; Davis, Whitney D.; Jones-Jack, Nkenge H.; Mukhtar, Qaiser; Lu, Hua; Acharya, Sushama D.; Molloy, Meg E.

    2016-01-01

    Creating healthy workplaces is becoming more common. Half of employers that have more than 50 employees offer some type of workplace health promotion program. Few employers implement comprehensive evidence-based interventions that reach all employees and achieve desired health and cost outcomes. A few organization-level assessment and benchmarking tools have emerged to help employers evaluate the comprehensiveness and rigor of their health promotion offerings. Even fewer tools exist that combine assessment with technical assistance and guidance to implement evidence-based practices. Our descriptive analysis compares 2 such tools, the Centers for Disease Control and Prevention’s Worksite Health ScoreCard and Prevention Partners’ WorkHealthy America, and presents data from both to describe workplace health promotion practices across the United States. These tools are reaching employers of all types (N = 1,797), and many employers are using a comprehensive approach (85% of those using WorkHealthy America and 45% of those using the ScoreCard), increasing program effectiveness and impact. PMID:27685429

  20. MVP-CA Methodology for the Expert System Advocate's Advisor (ESAA)

    DOT National Transportation Integrated Search

    1997-11-01

    The Multi-Viewpoint Clustering Analysis (MVP-CA) tool is a semi-automated tool to provide a valuable aid for comprehension, verification, validation, maintenance, integration, and evolution of complex knowledge-based software systems. In this report,...

  21. Analysis of the comprehensibility of chemical hazard communication tools at the industrial workplace.

    PubMed

    Ta, Goh Choo; Mokhtar, Mazlin Bin; Mohd Mokhtar, Hj Anuar Bin; Ismail, Azmir Bin; Abu Yazid, Mohd Fadhil Bin Hj

    2010-01-01

    Chemical classification and labelling systems may be roughly similar from one country to another, but there are significant differences too. In order to harmonize various chemical classification systems and ultimately provide consistent chemical hazard communication tools worldwide, the Globally Harmonized System of Classification and Labelling of Chemicals (GHS) was endorsed by the United Nations Economic and Social Council (ECOSOC). Several countries, including Japan, Taiwan, Korea and Malaysia, are now in the process of implementing GHS. It is essential to ascertain the comprehensibility of the chemical hazard communication tools that are described in the GHS documents, namely chemical labels and Safety Data Sheets (SDS). Comprehensibility Testing (CT) was carried out with a mixed group of industrial workers in Malaysia (n=150), and factors that influence comprehensibility were analysed using one-way ANOVA. The ability of the respondents to retrieve information from the SDS was also tested in this study. The findings show that almost all the GHS pictograms meet the ISO comprehension criteria. It is concluded that training and education are the core elements that enhance comprehension of GHS pictograms and that they are also essential in developing persons competent in the use of SDS.

  22. Developing Cost Accounting and Decision Support Software for Comprehensive Community-Based Support Systems: An Analysis of Needs, Interest, and Readiness in the Field.

    ERIC Educational Resources Information Center

    Harrington, Robert; Jenkins, Peter; Marzke, Carolyn; Cohen, Carol

    Prominent among the new models of social service delivery are organizations providing comprehensive, community-based supports and services (CCBSS) to children and their families. A needs analysis explored CCBSS sites' interest in and readiness to use a software tool designed to help them make more effective internal resource allocation decisions…

  23. methylPipe and compEpiTools: a suite of R packages for the integrative analysis of epigenomics data.

    PubMed

    Kishore, Kamal; de Pretis, Stefano; Lister, Ryan; Morelli, Marco J; Bianchi, Valerio; Amati, Bruno; Ecker, Joseph R; Pelizzola, Mattia

    2015-09-29

    Numerous methods are available to profile several epigenetic marks, providing data with different genome coverage and resolution. Large epigenomic datasets are then generated, and often combined with other high-throughput data, including RNA-seq, ChIP-seq for transcription factors (TFs) binding and DNase-seq experiments. Despite the numerous computational tools covering specific steps in the analysis of large-scale epigenomics data, comprehensive software solutions for their integrative analysis are still missing. Multiple tools must be identified and combined to jointly analyze histone marks, TFs binding and other -omics data together with DNA methylation data, complicating the analysis of these data and their integration with publicly available datasets. To overcome the burden of integrating various data types with multiple tools, we developed two companion R/Bioconductor packages. The former, methylPipe, is tailored to the analysis of high- or low-resolution DNA methylomes in several species, accommodating (hydroxy-)methyl-cytosines in both CpG and non-CpG sequence context. The analysis of multiple whole-genome bisulfite sequencing experiments is supported, while maintaining the ability of integrating targeted genomic data. The latter, compEpiTools, seamlessly incorporates the results obtained with methylPipe and supports their integration with other epigenomics data. It provides a number of methods to score these data in regions of interest, leading to the identification of enhancers, lncRNAs, and RNAPII stalling/elongation dynamics. Moreover, it allows a fast and comprehensive annotation of the resulting genomic regions, and the association of the corresponding genes with non-redundant GeneOntology terms. Finally, the package includes a flexible method based on heatmaps for the integration of various data types, combining annotation tracks with continuous or categorical data tracks. methylPipe and compEpiTools provide a comprehensive Bioconductor-compliant solution for the integrative analysis of heterogeneous epigenomics data. These packages are instrumental in providing biologists with minimal R skills a complete toolkit facilitating the analysis of their own data, or in accelerating the analyses performed by more experienced bioinformaticians.

  24. Development of advanced structural analysis methodologies for predicting widespread fatigue damage in aircraft structures

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.

    1995-01-01

    NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.

  25. MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics

    PubMed Central

    Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R.; Chen, Albert J.; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G.; Liu, Xiaowen; Ge, Ying

    2016-01-01

    Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644

  26. Impact of comprehensive two-dimensional gas chromatography with mass spectrometry on food analysis.

    PubMed

    Tranchida, Peter Q; Purcaro, Giorgia; Maimone, Mariarosa; Mondello, Luigi

    2016-01-01

    Comprehensive two-dimensional gas chromatography with mass spectrometry has been on the separation-science scene for about 15 years. This three-dimensional method has made a great positive impact on various fields of research, and among these that related to food analysis is certainly at the forefront. The present critical review is based on the use of comprehensive two-dimensional gas chromatography with mass spectrometry in the untargeted (general qualitative profiling and fingerprinting) and targeted analysis of food volatiles; attention is focused not only on its potential in such applications, but also on how recent advances in comprehensive two-dimensional gas chromatography with mass spectrometry will potentially be important for food analysis. Additionally, emphasis is devoted to the many instances in which straightforward gas chromatography with mass spectrometry is a sufficiently-powerful analytical tool. Finally, possible future scenarios in the comprehensive two-dimensional gas chromatography with mass spectrometry food analysis field are discussed.

  27. DAnTE: a statistical tool for quantitative analysis of -omics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep

    2008-05-03

    DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
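
    DAnTE itself is a GUI application, but the ANOVA step it automates is easy to illustrate. Below is a sketch of a per-protein one-way ANOVA across two experimental groups on synthetic abundance data; the group structure and values are invented.

```python
# Sketch of the per-protein ANOVA idea DAnTE bundles, on synthetic data:
# 100 proteins, two groups, four replicates each.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
groups = {"control": rng.normal(20.0, 1, (100, 4)),
          "treated": rng.normal(20.5, 1, (100, 4))}

p_values = np.array([
    stats.f_oneway(groups["control"][i], groups["treated"][i]).pvalue
    for i in range(100)
])
print(f"{(p_values < 0.05).sum()} of 100 proteins pass p < 0.05 (uncorrected)")
```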

  28. A comprehensive comparison of tools for differential ChIP-seq analysis

    PubMed Central

    Steinhauser, Sebastian; Kurzawa, Nils; Eils, Roland

    2016-01-01

    ChIP-seq has become a widely adopted genomic assay in recent years to determine binding sites for transcription factors or enrichments for specific histone modifications. Beside detection of enriched or bound regions, an important question is to determine differences between conditions. While this is a common analysis for gene expression, for which a large number of computational approaches have been validated, the same question for ChIP-seq is particularly challenging owing to the complexity of ChIP-seq data in terms of noisiness and variability. Many different tools have been developed and published in recent years. However, a comprehensive comparison and review of these tools is still missing. Here, we have reviewed 14 tools, which have been developed to determine differential enrichment between two conditions. They differ in their algorithmic setups, and also in the range of applicability. Hence, we have benchmarked these tools on real data sets for transcription factors and histone modifications, as well as on simulated data sets to quantitatively evaluate their performance. Overall, there is a great variety in the type of signal detected by these tools with a surprisingly low level of agreement. Depending on the type of analysis performed, the choice of method will crucially impact the outcome. PMID:26764273
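
    One simple way to quantify the "level of agreement" such a benchmark reports is a base-pair Jaccard index between two tools' differential region calls. The sketch below uses toy intervals and is an illustrative metric, not necessarily the one used in the paper.

```python
# Base-pair Jaccard index between two tools' region calls (toy data).
def to_bases(regions):
    covered = set()
    for start, end in regions:
        covered.update(range(start, end))
    return covered

tool_a = [(100, 200), (500, 650)]   # differential regions from tool A
tool_b = [(150, 220), (900, 950)]   # differential regions from tool B

a, b = to_bases(tool_a), to_bases(tool_b)
print(f"Jaccard agreement: {len(a & b) / len(a | b):.2f}")
```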

  29. Relevance of graph literacy in the development of patient-centered communication tools.

    PubMed

    Nayak, Jasmir G; Hartzler, Andrea L; Macleod, Liam C; Izard, Jason P; Dalkin, Bruce M; Gore, John L

    2016-03-01

    Our objective was to determine the literacy skill sets of patients in the context of graphical interpretation of interactive dashboards. We assessed literacy characteristics of prostate cancer patients and assessed comprehension of quality of life dashboards. Health literacy, numeracy and graph literacy were assessed with validated tools. We divided patients into low vs. high numeracy and graph literacy. We report descriptive statistics on literacy, dashboard comprehension, and relationships between groups. We used correlation and multiple linear regressions to examine factors associated with dashboard comprehension. Despite high health literacy in educated patients (78% college educated), there was variation in numeracy and graph literacy. Numeracy and graph literacy scores were correlated (r=0.37). In those with low literacy, graph literacy scores most strongly correlated with dashboard comprehension (r=0.59-0.90). On multivariate analysis, graph literacy was independently associated with dashboard comprehension, adjusting for age, education, and numeracy level. Even among highly educated patients, variation in the ability to comprehend graphs exists. Clinicians must be aware of these differential proficiencies when counseling patients. Tools for patient-centered communication that employ visual displays need to account for literacy capabilities to ensure that patients can effectively engage these resources.

  30. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    PubMed

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental designs, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, increase statistical power and resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them are specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool for conducting comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
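
    Of the steps listed, the Hardy-Weinberg check is the most self-contained: a chi-square test comparing observed genotype counts with those expected from the allele frequencies. A sketch with invented counts (MetaGenyo's implementation details may differ):

```python
# Hardy-Weinberg equilibrium check: chi-square test of observed genotype
# counts against expectations from allele frequencies. Counts are invented.
from scipy.stats import chisquare

aa, ab, bb = 520, 390, 90            # observed AA / AB / BB counts
n = aa + ab + bb
p = (2 * aa + ab) / (2 * n)          # frequency of allele A

expected = [n * p**2, n * 2 * p * (1 - p), n * (1 - p)**2]
stat, pval = chisquare([aa, ab, bb], expected, ddof=1)  # 1 df for HWE
print(f"chi2 = {stat:.2f}, p = {pval:.3f}")
```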

  31. A modelling tool for policy analysis to support the design of efficient and effective policy responses for complex public health problems.

    PubMed

    Atkinson, Jo-An; Page, Andrew; Wells, Robert; Milat, Andrew; Wilson, Andrew

    2015-03-03

    In the design of public health policy, a broader understanding of risk factors for disease across the life course, and an increasing awareness of the social determinants of health, has led to the development of more comprehensive, cross-sectoral strategies to tackle complex problems. However, comprehensive strategies may not represent the most efficient or effective approach to reducing disease burden at the population level. Rather, they may act to spread finite resources less intensively over a greater number of programs and initiatives, diluting the potential impact of the investment. While analytic tools are available that use research evidence to help identify and prioritise disease risk factors for public health action, they are inadequate to support more targeted and effective policy responses for complex public health problems. This paper discusses the limitations of analytic tools that are commonly used to support evidence-informed policy decisions for complex problems. It proposes an alternative policy analysis tool which can integrate diverse evidence sources and provide a platform for virtual testing of policy alternatives in order to design solutions that are efficient, effective, and equitable. The case of suicide prevention in Australia is presented to demonstrate the limitations of current tools in adequately informing prevention policy and to illustrate the utility of the new policy analysis tool. In contrast to popular belief, a systems approach takes a step beyond comprehensive thinking and seeks to identify where best to target public health action and resources for optimal impact. It is concerned primarily with what can be reasonably left out of strategies for prevention and can be used to explore where disinvestment may occur without adversely affecting population health (or equity). Simulation modelling used for policy analysis offers promise in being able to better operationalise research evidence to support decision making for complex problems, improve targeting of public health policy, and offers a foundation for strengthening relationships between policy makers, stakeholders, and researchers.

  32. Competencies in Organizational E-Learning: Concepts and Tools

    ERIC Educational Resources Information Center

    Sicilia, Miguel-Angel, Ed.

    2007-01-01

    "Competencies in Organizational E-Learning: Concepts and Tools" provides a comprehensive view of the way competencies can be used to drive organizational e-learning, including the main conceptual elements, competency gap analysis, advanced related computing topics, the application of semantic Web technologies, and the integration of competencies…

  33. Microarray Я US: a user-friendly graphical interface to Bioconductor tools that enables accurate microarray data analysis and expedites comprehensive functional analysis of microarray results.

    PubMed

    Dai, Yilin; Guo, Ling; Li, Meng; Chen, Yi-Bu

    2012-06-08

    Microarray data analysis presents a significant challenge to researchers who are unable to use the powerful Bioconductor and its numerous tools due to their lack of knowledge of R language. Among the few existing software programs that offer a graphic user interface to Bioconductor packages, none have implemented a comprehensive strategy to address the accuracy and reliability issue of microarray data analysis due to the well known probe design problems associated with many widely used microarray chips. There is also a lack of tools that would expedite the functional analysis of microarray results. We present Microarray Я US, an R-based graphical user interface that implements over a dozen popular Bioconductor packages to offer researchers a streamlined workflow for routine differential microarray expression data analysis without the need to learn R language. In order to enable a more accurate analysis and interpretation of microarray data, we incorporated the latest custom probe re-definition and re-annotation for Affymetrix and Illumina chips. A versatile microarray results output utility tool was also implemented for easy and fast generation of input files for over 20 of the most widely used functional analysis software programs. Coupled with a well-designed user interface, Microarray Я US leverages cutting edge Bioconductor packages for researchers with no knowledge in R language. It also enables a more reliable and accurate microarray data analysis and expedites downstream functional analysis of microarray results.

  34. Making a Game out of It: Using Web-Based Competitive Quizzes for Quantitative Analysis Content Review

    ERIC Educational Resources Information Center

    Grinias, James P.

    2017-01-01

    Online student-response systems provide instructors with an easy-to-use tool to instantly evaluate student comprehension. For comprehensive content review, turning this evaluation into a competitive game where students can compete against each other was found to be helpful and enjoyable for participating students. One specific online resource,…

  35. Lexical development of noun and predicate comprehension and production in isiZulu

    PubMed Central

    Ahmed, Saaliha

    2016-01-01

    This study seeks to investigate the development of noun and predicate comprehension and production in isiZulu-speaking children between the ages of 25 and 36 months. It compares lexical comprehension and production in isiZulu using an Italian-developed and validated vocabulary assessment tool: the Picture Naming Game (PiNG) developed by Bello, Giannantoni, Pettenati, Stefanini and Caselli (2012). The PiNG tool includes four subtests, one each for noun comprehension (NC), noun production (NP), predicate comprehension (PC), and predicate production (PP). Children are shown these lexical items and then asked to show comprehension and produce certain lexical items. After adaptation into the South African context, the adapted version of PiNG was used to directly assess the lexical development of isiZulu, with three main objectives: (1) to test the efficiency of the adaptation of a vocabulary tool to measure isiZulu comprehension and production development, (2) to test previous findings from many cross-linguistic comparisons that both comprehension and production performance increase with age for a lesser-studied language, and (3) to present our findings on the comprehension and production of the linguistic categories of nouns and predicates. An analysis of the results reported in this study shows an age effect throughout the entire sample. Across all the age groups, the noun and predicate comprehension subtests were better performed than the noun and predicate production subtests. With regard to lexical items, the responses of children showed the influence of various factors, including the late acquisition of items, possible problems with the stimuli presented to them, and the input received by the children from their home environment. PMID:27542416

  36. Lexical development of noun and predicate comprehension and production in isiZulu.

    PubMed

    Nicolas, Ramona Kunene; Ahmed, Saaliha

    2016-07-28

    This study seeks to investigate the development of noun and predicate comprehension and production in isiZulu-speaking children between the ages of 25 and 36 months. It compares lexical comprehension and production in isiZulu using an Italian-developed and validated vocabulary assessment tool: the Picture Naming Game (PiNG) developed by Bello, Giannantoni, Pettenati, Stefanini and Caselli (2012). The PiNG tool includes four subtests, one each for noun comprehension (NC), noun production (NP), predicate comprehension (PC), and predicate production (PP). Children are shown these lexical items and then asked to show comprehension and produce certain lexical items. After adaptation into the South African context, the adapted version of PiNG was used to directly assess the lexical development of isiZulu, with three main objectives: (1) to test the efficiency of the adaptation of a vocabulary tool to measure isiZulu comprehension and production development, (2) to test previous findings from many cross-linguistic comparisons that both comprehension and production performance increase with age for a lesser-studied language, and (3) to present our findings on the comprehension and production of the linguistic categories of nouns and predicates. An analysis of the results reported in this study shows an age effect throughout the entire sample. Across all the age groups, the noun and predicate comprehension subtests were better performed than the noun and predicate production subtests. With regard to lexical items, the responses of children showed the influence of various factors, including the late acquisition of items, possible problems with the stimuli presented to them, and the input received by the children from their home environment.

  37. Novel features and enhancements in BioBin, a tool for the biologically inspired binning and association analysis of rare variants

    PubMed Central

    Byrska-Bishop, Marta; Wallace, John; Frase, Alexander T; Ritchie, Marylyn D

    2018-01-01

    Motivation: BioBin is an automated bioinformatics tool for the multi-level biological binning of sequence variants. Herein, we present a significant update to BioBin which expands the software to facilitate a comprehensive rare variant analysis and incorporates novel features and analysis enhancements. Results: In BioBin 2.3, we extend our software tool by implementing statistical association testing, updating the binning algorithm, and incorporating novel analysis features, providing a robust, highly customizable, and unified rare variant analysis tool. Availability and implementation: The BioBin software package is open source and freely available at http://www.ritchielab.com/software/biobin-download. Contact: mdritchie@geisinger.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28968757
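
    The binning idea itself is straightforward to sketch: rare variants are collapsed into per-gene burden scores that can then be association-tested. The toy pandas version below is a conceptual illustration only; BioBin's algorithm and its biological knowledge sources are far richer, and the MAF threshold here is arbitrary.

```python
# Conceptual sketch of rare-variant binning: filter variants by minor
# allele frequency, then sum alt-allele counts per gene per sample.
import pandas as pd

# rows = variants; s1/s2 = alt-allele counts for two samples (invented)
variants = pd.DataFrame({
    "gene": ["G1", "G1", "G2", "G2"],
    "maf":  [0.002, 0.010, 0.080, 0.004],
    "s1":   [0, 1, 0, 0],
    "s2":   [1, 0, 2, 1],
})

rare = variants[variants["maf"] < 0.05]           # rare-variant filter
bins = rare.groupby("gene")[["s1", "s2"]].sum()   # per-gene burden scores
print(bins)  # these burdens would then feed an association test
```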

  38. FMAP: Functional Mapping and Analysis Pipeline for metagenomics and metatranscriptomics studies.

    PubMed

    Kim, Jiwoong; Kim, Min Soo; Koh, Andrew Y; Xie, Yang; Zhan, Xiaowei

    2016-10-10

    Given the lack of a complete and comprehensive library of microbial reference genomes, determining the functional profile of diverse microbial communities is challenging. The available functional analysis pipelines lack several key features: (i) an integrated alignment tool, (ii) operon-level analysis, and (iii) the ability to process large datasets. Here we introduce our open-sourced, stand-alone functional analysis pipeline for analyzing whole metagenomic and metatranscriptomic sequencing data, FMAP (Functional Mapping and Analysis Pipeline). FMAP performs alignment, gene family abundance calculations, and statistical analysis (three levels of analyses are provided: differentially-abundant genes, operons and pathways). The resulting output can be easily visualized with heatmaps and functional pathway diagrams. FMAP functional predictions are consistent with currently available functional analysis pipelines. FMAP is a comprehensive tool for providing functional analysis of metagenomic/metatranscriptomic sequencing data. With the added features of integrated alignment, operon-level analysis, and the ability to process large datasets, FMAP will be a valuable addition to the currently available functional analysis toolbox. We believe that this software will be of great value to the wider biology and bioinformatics communities.
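
    After alignment, the differential-abundance step amounts to testing a gene-family abundance table between groups. A generic sketch follows; the abundances, group labels, and choice of a Mann-Whitney test are invented for illustration, and FMAP's own statistics may differ.

```python
# Generic sketch of differential gene-family abundance testing: rows are
# KEGG-ortholog-style gene families, columns are samples. Data invented.
import pandas as pd
from scipy.stats import mannwhitneyu

abundance = pd.DataFrame(
    {"case1": [120, 5], "case2": [140, 8], "ctrl1": [60, 30], "ctrl2": [75, 25]},
    index=["K00001", "K00002"],
)
cases, ctrls = ["case1", "case2"], ["ctrl1", "ctrl2"]

for ko, row in abundance.iterrows():
    stat, p = mannwhitneyu(row[cases], row[ctrls])
    print(f"{ko}: U = {stat:.1f}, p = {p:.3f}")
```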

  19. TOOLS FOR COMPARATIVE ANALYSIS OF ALTERNATIVES: COMPETING OR COMPLEMENTARY PERSPECTIVES?

    EPA Science Inventory

    A third generation of environmental policymaking and risk management will increasingly impose environmental measures, which may give rise to analyzing countervailing risks. Therefore, a comprehensive analysis of these risks associated with the decision alternatives at hand will e...

  20. Dynamic visualization techniques for high consequence software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.

  1. Views of general practitioners on the use of STOPP&START in primary care: a qualitative study.

    PubMed

    Dalleur, O; Feron, J-M; Spinewine, A

    2014-08-01

    STOPP (Screening Tool of Older Person's Prescriptions) and START (Screening Tool to Alert Doctors to Right Treatment) criteria aim at detecting potentially inappropriate prescribing in older people. The objective was to explore general practitioners' (GPs) perceptions regarding the use of the STOPP&START tool in their practice. We conducted three focus groups which were conveniently sampled. Vignettes with clinical cases were provided for discussion as well as a full version of the STOPP&START tool. Knowledge, strengths and weaknesses of the tool and its implementation were discussed. Two researchers independently performed content analysis, classifying quotes and creating new categories for emerging themes. Discussions highlighted incentives (e.g. systematic procedure for medication review) and barriers (e.g. time-consuming application) influencing the use of STOPP&START in primary care. Usefulness, comprehensiveness, and relevance of the tool were also questioned. Another important category emerging from the content analysis was the projected use of the tool. The GPs imagined key elements for the implementation in daily practice: computerized clinical decision support system, education, and multidisciplinary collaborations, especially at care transitions and in nursing homes. Despite variables views on the usefulness, comprehensiveness, and relevance of STOPP&START, GPs suggest the implementation of this tool in primary care within computerized clinical decision support systems, through education, and used as part of multidisciplinary collaborations.

  2. Bioconductor | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Bioconductor provides tools for the analysis and comprehension of high-throughput genomic data. R/Bioconductor will be enhanced to meet the increasing complexity of multiassay cancer genomics experiments.

  3. Comprehensive Analysis of Semantic Web Reasoners and Tools: A Survey

    ERIC Educational Resources Information Center

    Khamparia, Aditya; Pandey, Babita

    2017-01-01

    Ontologies are emerging as the best representation techniques for knowledge-based context domains. The continuing need for interoperation, collaboration and effective information retrieval has led to the creation of the semantic web, with the help of tools and reasoners which manage personalized information. The future of semantic web lies in an ontology…

  4. PHOXTRACK-a tool for interpreting comprehensive datasets of post-translational modifications of proteins.

    PubMed

    Weidner, Christopher; Fischer, Cornelius; Sauer, Sascha

    2014-12-01

    We introduce PHOXTRACK (PHOsphosite-X-TRacing Analysis of Causal Kinases), a user-friendly freely available software tool for analyzing large datasets of post-translational modifications of proteins, such as phosphorylation, which are commonly gained by mass spectrometry detection. In contrast to other currently applied data analysis approaches, PHOXTRACK uses full sets of quantitative proteomics data and applies non-parametric statistics to calculate whether defined kinase-specific sets of phosphosite sequences indicate statistically significant concordant differences between various biological conditions. PHOXTRACK is an efficient tool for extracting post-translational information of comprehensive proteomics datasets to decipher key regulatory proteins and to infer biologically relevant molecular pathways. PHOXTRACK will be maintained over the next years and is freely available as an online tool for non-commercial use at http://phoxtrack.molgen.mpg.de. Users will also find a tutorial at this Web site and can additionally give feedback at https://groups.google.com/d/forum/phoxtrack-discuss. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. 16th IHIW: Global analysis of registry HLA haplotypes from 20 Million individuals: Report from the IHIW Registry Diversity Group

    PubMed Central

    Maiers, M; Gragert, L; Madbouly, A; Steiner, D; Marsh, S G E; Gourraud, P-A; Oudshoorn, M; Zanden, H; Schmidt, A H; Pingel, J; Hofmann, J; Müller, C; Eberhard, H-P

    2013-01-01

    The goal of this project is to validate bioinformatics methods and tools for HLA haplotype frequency analysis, specifically addressing the unique issues of haematopoietic stem cell registry data sets. In addition to generating new methods and tools for the analysis of registry data sets, the intent is to produce a comprehensive analysis of HLA data from 20 million donors in the Bone Marrow Donors Worldwide (BMDW) database. This report summarizes activity on this project as of the 16th IHIW meeting in Liverpool. PMID:23280139

  6. Student Monitoring through Performance Matters and the Florida Comprehensive Assessment Exam: A Regression Analysis of Student Reading Achievement

    ERIC Educational Resources Information Center

    Cassidy-Floyd, Juliet

    2017-01-01

    Florida, from 1971 to 2014 has used the Florida Comprehensive Assessment Test (FCAT) as a yearly accountability tool throughout the education system in the state (Bureau of K-12 Assessment, 2005). Schools use their own assessments to determine if students are making progress throughout the year. In one school district within Florida, Performance…

  7. MASS SPECTROMETRY-BASED METABOLOMICS

    PubMed Central

    Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.

    2007-01-01

    This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry specifically has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475

  8. Open source tools and toolkits for bioinformatics: significance, and where are we?

    PubMed

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  9. Sustainability Tools Inventory - Initial Gaps Analysis | Science ...

    EPA Pesticide Factsheets

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated against four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  10. A new spatial multi-criteria decision support tool for site selection for implementation of managed aquifer recharge.

    PubMed

    Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin

    2012-05-30

    This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA software tool combines existing multi-criteria evaluation methods with modern decision analysis techniques. More specifically, non-compensatory screening, criteria standardization and weighting, and the Analytical Hierarchy Process (AHP) have been combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). The SMCDA tool can accommodate a wide range of decision makers' preferences. The tool's user-friendly interface guides the decision maker through the sequential steps for site selection, namely constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers predetermined default criteria and standard methods to balance ease of use against efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, through which data may be processed and displayed. The tool is non-site-specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. To demonstrate the robustness of the new tool, a case study was planned and executed in the Algarve Region, Portugal. The efficiency of the SMCDA tool in the decision-making process for selecting suitable sites for MAR was also demonstrated. Specific aspects of the tool, such as built-in default criteria, explicit decision steps, and flexibility in choosing different options, were key features which benefited the study. The new SMCDA tool can be augmented by groundwater flow and transport modeling to achieve a more comprehensive approach to selecting the best locations for MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection. The new spatial multi-criteria analysis tool has already been implemented within the GIS-based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
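    A minimal sketch of the Weighted Linear Combination step at the heart of such site-selection tools follows; the criteria grids, AHP-style weights and constraint mask are hypothetical stand-ins for real GIS layers.

        # Toy WLC suitability overlay: weighted sum of standardized criteria,
        # masked by non-compensatory screening constraints.
        import numpy as np

        slope      = np.array([[0.2, 0.9], [0.6, 0.4]])   # standardized to 0..1
        soil_perm  = np.array([[0.8, 0.7], [0.3, 0.9]])   # standardized to 0..1
        constraint = np.array([[1, 1], [0, 1]])           # 0 = excluded by screening
        weights = {"slope": 0.4, "soil_perm": 0.6}        # e.g. derived via AHP

        suitability = (weights["slope"] * slope +
                       weights["soil_perm"] * soil_perm) * constraint
        print(suitability)  # higher values = more suitable MAR sites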

  11. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it simultaneously generates three philosophically different families of global sensitivity metrics: (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - the VARS approach), (2) variance-based total-order effects (the Sobol approach), and (3) derivative-based elementary effects (the Morris approach). VARS-TOOL also includes two novel features. The first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features, in conjunction with bootstrapping, enable the user to monitor the stability, robustness, and convergence of GSA as the sample size increases for any given case study. VARS-TOOL has been shown to achieve robust and stable results with sample sizes (numbers of model runs) 1-2 orders of magnitude smaller than those of alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development, and new capabilities and features are forthcoming.
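    The variogram-of-response idea can be illustrated in a few lines: gamma(h) = 0.5 * E[(y(x+h) - y(x))^2] along one parameter direction. The one-parameter model and lags below are invented for the sketch; real VARS operates on multi-dimensional response surfaces with dedicated sampling.

        # Toy 1-D response variogram as a sensitivity summary across scales.
        import numpy as np

        def model(x):                      # hypothetical one-parameter model
            return np.sin(3 * x) + 0.5 * x

        x = np.linspace(0.0, 1.0, 201)     # dense sweep of the parameter
        y = model(x)
        for lag in (1, 5, 20):             # lags in grid steps (h = lag * dx)
            gamma = 0.5 * np.mean((y[lag:] - y[:-lag]) ** 2)
            print(f"h={lag * (x[1] - x[0]):.3f}  gamma={gamma:.4f}")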

  12. Comprehensive Analysis Modeling of Small-Scale UAS Rotors

    NASA Technical Reports Server (NTRS)

    Russell, Carl R.; Sekula, Martin K.

    2017-01-01

    Multicopter unmanned aircraft systems (UAS), or drones, have continued their explosive growth in recent years. With this growth comes demand for increased performance as the limits of existing technologies are reached. In order to better design multicopter UAS aircraft, better performance prediction tools are needed. This paper presents the results of a study aimed at using the rotorcraft comprehensive analysis code CAMRAD II to model a multicopter UAS rotor in hover. Parametric studies were performed to determine the level of fidelity needed in the analysis code inputs to achieve results that match test data. Overall, the results show that CAMRAD II is well suited to model small-scale UAS rotors in hover. This paper presents the results of the parametric studies as well as recommendations for the application of comprehensive analysis codes to multicopter UAS rotors.

  13. Introducing W.A.T.E.R.S.: a workflow for the alignment, taxonomy, and ecology of ribosomal sequences.

    PubMed

    Hartman, Amber L; Riddle, Sean; McPhillips, Timothy; Ludäscher, Bertram; Eisen, Jonathan A

    2010-06-12

    For more than two decades microbiologists have used a highly conserved microbial gene as a phylogenetic marker for bacteria and archaea. The small-subunit ribosomal RNA gene, also known as 16S rRNA, is encoded by ribosomal DNA (16S rDNA) and has provided a powerful comparative tool to microbial ecologists. Over time, the microbial ecology field has matured from small-scale studies in a select number of environments to massive collections of sequence data that are paired with dozens of corresponding collection variables. As the complexity of data and tool sets has grown, the need for flexible automation and maintenance of the core processes of 16S rDNA sequence analysis has increased correspondingly. We present WATERS, an integrated approach for 16S rDNA analysis that bundles a suite of publicly available 16S rDNA analysis software tools into a single software package. The "toolkit" includes sequence alignment, chimera removal, OTU determination, taxonomy assignment, and phylogenetic tree construction, as well as a host of ecological analysis and visualization tools. WATERS employs a flexible, collection-oriented "workflow" approach using the open-source Kepler system as a platform. By packaging available software tools into a single automated workflow, WATERS simplifies 16S rDNA analyses, especially for those without specialized bioinformatics or programming expertise. In addition, WATERS, like some of the newer comprehensive rRNA analysis tools, allows researchers to minimize the time dedicated to carrying out tedious informatics steps and to focus their attention instead on the biological interpretation of the results. One advantage of WATERS over other comprehensive tools is that the use of the Kepler workflow system facilitates result interpretation and reproducibility via a data provenance sub-system. Furthermore, new "actors" can be added to the workflow as desired, and we see WATERS as an initial seed for a sizeable and growing repository of interoperable, easy-to-combine tools for asking increasingly complex microbial ecology questions.

  14. PRAPI: post-transcriptional regulation analysis pipeline for Iso-Seq.

    PubMed

    Gao, Yubang; Wang, Huiyuan; Zhang, Hangxiao; Wang, Yongsheng; Chen, Jinfeng; Gu, Lianfeng

    2018-05-01

    The single-molecule real-time (SMRT) isoform sequencing (Iso-Seq) based on the Pacific Biosciences (PacBio) platform has received increasing attention for its ability to explore full-length isoforms. Thus, comprehensive tools for Iso-Seq bioinformatics analysis are extremely useful. Here, we present a one-stop solution for Iso-Seq analysis, called PRAPI, to analyze alternative transcription initiation (ATI), alternative splicing (AS), alternative cleavage and polyadenylation (APA), natural antisense transcripts (NAT), and circular RNAs (circRNAs) comprehensively. PRAPI is capable of combining Iso-Seq full-length isoforms with short-read data, such as RNA-Seq or polyadenylation site sequencing (PAS-seq), for differential expression analysis of NAT, AS, APA and circRNAs. Furthermore, PRAPI can annotate new genes and correct mis-annotated genes when gene annotation is available. Finally, PRAPI generates high-quality vector graphics to visualize and highlight the Iso-Seq results. The Dockerfile of PRAPI is available at http://www.bioinfor.org/tool/PRAPI. Contact: lfgu@fafu.edu.cn.

  15. ATLAS (Automatic Tool for Local Assembly Structures) - A Comprehensive Infrastructure for Assembly, Annotation, and Genomic Binning of Metagenomic and Metatranscriptomic Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Richard A.; Brown, Joseph M.; Colby, Sean M.

    ATLAS (Automatic Tool for Local Assembly Structures) is a comprehensive multiomics data analysis pipeline that is massively parallel and scalable. ATLAS contains a modular analysis pipeline for assembly, annotation, quantification and genome binning of metagenomics and metatranscriptomics data, and a framework for reference metaproteomic database construction. ATLAS transforms raw sequence data into functional and taxonomic data at the microbial population level and provides genome-centric resolution through genome binning. ATLAS provides robust taxonomy based on majority voting of protein-coding open reading frames rolled up at the contig level using modified lowest common ancestor (LCA) analysis. ATLAS is user-friendly, easy to install through Bioconda, maintained as open source on GitHub, and implemented in Snakemake for modular, customizable workflows.
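    A simplified sketch of the contig-level rollup by majority voting over ORF assignments follows; the contig and taxon labels and the plain-majority rule are illustrative, since ATLAS combines voting with a modified LCA analysis.

        # Toy contig taxonomy by majority vote over per-ORF assignments.
        from collections import Counter

        orf_taxa = {"contig_1": ["Escherichia", "Escherichia", "Shigella"],
                    "contig_2": ["Bacillus", "Bacillus", "Bacillus"]}

        for contig, taxa in orf_taxa.items():
            label, votes = Counter(taxa).most_common(1)[0]
            if votes / len(taxa) > 0.5:            # simple majority wins
                print(f"{contig}: {label} ({votes}/{len(taxa)} ORFs)")
            else:
                print(f"{contig}: unresolved at this rank")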

  16. The Development and Validation of a Rapid Assessment Tool of Primary Care in China

    PubMed Central

    Mei, Jie; Liang, Yuan; Shi, LeiYu; Zhao, JingGe; Wang, YuTan; Kuang, Li

    2016-01-01

    Introduction. With Chinese health care reform increasingly emphasizing the importance of primary care, the need for a tool to evaluate primary care performance and service delivery is clear. This study presents a methodology for a rapid assessment of primary care organizations and service delivery in China. Methods. The study translated and adapted the Primary Care Assessment Tool-Adult Edition (PCAT-AE) into a Chinese version to measure core dimensions of primary care, namely, first contact, continuity, comprehensiveness, and coordination. A cross-sectional survey was conducted to assess the validity and reliability of the Chinese Rapid Primary Care Assessment Tool (CR-PCAT). Eight community health centers in Guangdong province have been selected to participate in the survey. Results. A total of 1465 effective samples were included for data analysis. Eight items were eliminated following principal component analysis and reliability testing. The principal component analysis extracted five multiple-item scales (first contact utilization, first contact accessibility, ongoing care, comprehensiveness, and coordination). The tests of scaling assumptions were basically met. Conclusion. The standard psychometric evaluation indicates that the scales have achieved relatively good reliability and validity. The CR-PCAT provides a rapid and reliable measure of four core dimensions of primary care, which could be applied in various scenarios. PMID:26885509

  17. A new computer code for discrete fracture network modelling

    NASA Astrophysics Data System (ADS)

    Xu, Chaoshui; Dowd, Peter

    2010-03-01

    The authors describe a comprehensive software package for two- and three-dimensional stochastic rock fracture simulation using marked point processes. Fracture locations can be modelled by a Poisson, a non-homogeneous, a cluster or a Cox point process; fracture geometries and properties are modelled by their respective probability distributions. Virtual sampling tools such as plane, window and scanline sampling are included in the software together with a comprehensive set of statistical tools including histogram analysis, probability plots, rose diagrams and hemispherical projections. The paper describes in detail the theoretical basis of the implementation and provides a case study in rock fracture modelling to demonstrate the application of the software.
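    As an illustration of the marked-point-process approach, here is a minimal sketch of a homogeneous Poisson fracture simulation with orientation and trace length as marks; the intensity, window size and mark distributions are invented, and the package itself also supports non-homogeneous, cluster and Cox processes.

        # Toy 2-D fracture network: Poisson fracture centres with marks.
        import numpy as np

        rng = np.random.default_rng(42)
        intensity, area = 5.0, 10.0 * 10.0            # fractures per unit area; 10 x 10 window
        n = rng.poisson(intensity * area)             # Poisson number of fractures
        centres = rng.uniform(0.0, 10.0, size=(n, 2)) # uniform locations in the window
        strikes = rng.uniform(0.0, 180.0, size=n)     # orientation marks (degrees)
        lengths = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # trace-length marks
        print(f"simulated {n} fractures; mean trace length {lengths.mean():.2f}")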

  18. Timeline analysis tools for law enforcement

    NASA Astrophysics Data System (ADS)

    Mucks, John

    1997-02-01

    The timeline analysis system (TAS) was developed by Rome Laboratory to assist intelligence analysts with the comprehension of large amounts of information. Under the TAS program, data visualization, manipulation and reasoning tools were developed in close coordination with end users. The initial TAS prototype was developed for foreign command and control analysts at Space Command in Colorado Springs and was fielded there in 1989. The TAS prototype replaced manual paper timeline maintenance and analysis techniques and has become an integral part of Space Command's information infrastructure. TAS was designed to be domain independent and has been tailored and proliferated to a number of other users. The TAS program continues to evolve because of strong user support. User-funded enhancements and Rome Lab-funded technology upgrades have significantly enhanced TAS over the years and will continue to do so for the foreseeable future. TAS was recently provided to the New York State Police (NYSP) for evaluation using actual case data. Timeline analysis, it turns out, is a popular methodology in law enforcement. The evaluation has led to a more comprehensive application and evaluation project sponsored by the National Institute of Justice (NIJ). This paper describes the capabilities of TAS, the results of the initial NYSP evaluation, and the plan for a more comprehensive NYSP evaluation.

  19. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Treesearch

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...

  20. Helping Undergraduates Think Like a CEO: The APPLE Analysis as a Teaching Tool for Strategic Management

    ERIC Educational Resources Information Center

    Domke-Damonte, Darla J.; Keels, J. Kay; Black, Janice A.

    2013-01-01

    This paper presents a class assignment, entitled the APPLE Analysis, for developing pre-analysis comprehension about company conditions, resources and challenges as a part of the undergraduate strategic management capstone course. Because undergraduate students lack the causal maps of seasoned executives, this assignment helps students to develop…

  1. Comprehensive Modeling and Analysis of Rotorcraft Variable Speed Propulsion System With Coupled Engine/Transmission/Rotor Dynamics

    NASA Technical Reports Server (NTRS)

    DeSmidt, Hans A.; Smith, Edward C.; Bill, Robert C.; Wang, Kon-Well

    2013-01-01

    This project develops comprehensive modeling and simulation tools for analysis of variable rotor speed helicopter propulsion system dynamics. The Comprehensive Variable-Speed Rotorcraft Propulsion Modeling (CVSRPM) tool developed in this research is used to investigate coupled rotor/engine/fuel control/gearbox/shaft/clutch/flight control system dynamic interactions for several variable rotor speed mission scenarios. In this investigation, a prototypical two-speed Dual-Clutch Transmission (DCT) is proposed and designed to achieve 50 percent rotor speed variation. The comprehensive modeling tool developed in this study is utilized to analyze the two-speed shift response of both a conventional single rotor helicopter and a tiltrotor drive system. In the tiltrotor system, both a Parallel Shift Control (PSC) strategy and a Sequential Shift Control (SSC) strategy for constant and variable forward speed mission profiles are analyzed. Under the PSC strategy, selecting clutch shift-rate results in a design tradeoff between transient engine surge margins and clutch frictional power dissipation. In the case of SSC, clutch power dissipation is drastically reduced in exchange for the necessity to disengage one engine at a time which requires a multi-DCT drive system topology. In addition to comprehensive simulations, several sections are dedicated to detailed analysis of driveline subsystem components under variable speed operation. In particular an aeroelastic simulation of a stiff in-plane rotor using nonlinear quasi-steady blade element theory was conducted to investigate variable speed rotor dynamics. It was found that 2/rev and 4/rev flap and lag vibrations were significant during resonance crossings with 4/rev lagwise loads being directly transferred into drive-system torque disturbances. To capture the clutch engagement dynamics, a nonlinear stick-slip clutch torque model is developed. Also, a transient gas-turbine engine model based on first principles mean-line compressor and turbine approximations is developed. Finally an analysis of high frequency gear dynamics including the effect of tooth mesh stiffness variation under variable speed operation is conducted including experimental validation. Through exploring the interactions between the various subsystems, this investigation provides important insights into the continuing development of variable-speed rotorcraft propulsion systems.
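    To illustrate the kind of switching logic a stick-slip clutch torque model involves, here is a minimal sketch: kinetic friction torque while slipping, torque-limited transmission while sticking. The parameter values and the simplification of equal static and kinetic capacities are assumptions for the example, not the study's actual model.

        # Toy stick-slip clutch torque law.
        import numpy as np

        def clutch_torque(d_omega, t_applied, mu_k=0.3, f_n=5000.0, r_eff=0.15,
                          eps=1e-3):
            t_cap = mu_k * f_n * r_eff            # friction torque capacity (N*m)
            if abs(d_omega) > eps:                # slipping: oppose relative speed
                return -np.sign(d_omega) * t_cap
            # sticking: transmit the applied torque up to the capacity
            return float(np.clip(t_applied, -t_cap, t_cap))

        print(clutch_torque(d_omega=8.0, t_applied=120.0))   # slipping branch
        print(clutch_torque(d_omega=0.0, t_applied=120.0))   # sticking branch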

  2. WholePathwayScope: a comprehensive pathway-based analysis tool for high-throughput data

    PubMed Central

    Yi, Ming; Horton, Jay D; Cohen, Jonathan C; Hobbs, Helen H; Stephens, Robert M

    2006-01-01

    Background: Analysis of High Throughput (HTP) data such as microarray and proteomics data has provided a powerful methodology to study patterns of gene regulation at genome scale. A major unresolved problem in the post-genomic era is to assemble the large amounts of data generated into a meaningful biological context. We have developed a comprehensive software tool, WholePathwayScope (WPS), for deriving biological insights from analysis of HTP data. Results: WPS extracts gene lists with shared biological themes through color cue templates. WPS statistically evaluates global functional category enrichment of gene lists and pathway-level pattern enrichment of data. WPS incorporates well-known biological pathways from KEGG (Kyoto Encyclopedia of Genes and Genomes) and Biocarta, GO (Gene Ontology) terms, as well as user-defined pathways or relevant gene clusters or groups, and explores gene-term relationships within the derived gene-term association networks (GTANs). WPS simultaneously compares multiple datasets within biological contexts, either as pathways or as association networks. WPS also integrates the Genetic Association Database and Partial MedGene Database for disease-association information. We have used this program to analyze and compare microarray and proteomics datasets derived from a variety of biological systems. Application examples demonstrated the capacity of WPS to significantly facilitate the analysis of HTP data for integrative discovery. Conclusion: This tool represents a pathway-based platform for discovery integration to maximize analysis power. The tool is freely available at . PMID:16423281
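    As a generic illustration of the category-enrichment statistic such tools compute, here is a hypergeometric test of a gene list's overlap with a pathway; all counts are invented, and this is a textbook formulation rather than WPS's exact procedure.

        # Toy pathway-enrichment test: P(overlap >= k) under sampling
        # without replacement from the genome background.
        from scipy.stats import hypergeom

        genome_size  = 20000   # background genes
        pathway_size = 150     # genes annotated to the pathway
        list_size    = 300     # differentially expressed genes
        overlap      = 12      # list genes falling in the pathway

        p = hypergeom.sf(overlap - 1, genome_size, pathway_size, list_size)
        print(f"enrichment p-value = {p:.3g}")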

  3. Connected vehicle impacts on transportation planning analysis of the need for new and enhanced analysis tools, techniques and data : Highway Capacity Manual briefing.

    DOT National Transportation Integrated Search

    2016-03-02

    The principal objective of this project, Connected Vehicle Impacts on Transportation Planning, is to comprehensively assess how connected vehicles should be considered across the range of transportation planning processes and products developed...

  4. Connected vehicle impacts on transportation planning technical memorandum #3 : analysis of the need for new and enhanced analysis tools, techniques, and data.

    DOT National Transportation Integrated Search

    2015-06-01

    The principal objective of this project, Connected Vehicle Impacts on Transportation Planning, is to comprehensively assess how connected vehicles should be considered across the range of transportation planning processes and products developed...

  5. Task Analysis Inventories. Series II.

    ERIC Educational Resources Information Center

    Wesson, Carl E.

    This second in a series of task analysis inventories contains checklists of work performed in twenty-two occupations. Each inventory is a comprehensive list of work activities, responsibilities, educational courses, machines, tools, equipment, and work aids used and the products produced or services rendered in a designated occupational area. The…

  6. A RESTful API for accessing microbial community data for MG-RAST.

    PubMed

    Wilke, Andreas; Bischof, Jared; Harrison, Travis; Brettin, Tom; D'Souza, Mark; Gerlach, Wolfgang; Matthews, Hunter; Paczian, Tobias; Wilkening, Jared; Glass, Elizabeth M; Desai, Narayan; Meyer, Folker

    2015-01-01

    Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programming interface) we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us) we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase's microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. The API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages, both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service.
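    A minimal sketch of consuming such a RESTful JSON API from Python follows; the base URL, resource path and metagenome identifier are illustrative assumptions rather than guaranteed endpoints, so consult the current MG-RAST API documentation for the real routes.

        # Toy REST client: fetch one resource and read its JSON fields.
        import requests

        BASE = "https://api.mg-rast.org"                 # assumed API root
        resp = requests.get(f"{BASE}/metagenome/mgm4440026.3", timeout=30)
        resp.raise_for_status()
        record = resp.json()                             # data objects arrive as JSON
        print(record.get("name"), record.get("status"))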

  7. Connected vehicle impacts on transportation planning : analysis of the need for new and enhanced analysis tools, techniques and data, briefing for traffic simulation models.

    DOT National Transportation Integrated Search

    2016-03-11

    The principal objective of this project, Connected Vehicle Impacts on Transportation Planning, is to comprehensively assess how connected vehicles should be considered across the range of transportation planning processes and products developed...

  9. Foreign Language Analysis and Recognition (FLARe) Initial Progress

    DTIC Science & Technology

    2012-11-29

    [Extraction residue from the DTIC report cover page; no abstract is recoverable.] Report AFRL-RH-WP-TR-2012-0165, "Foreign Language Analysis and Recognition (FLARe) Initial Progress," Brian M. Ore; dates covered: 1 October 2010 – 30 September 2012. Recoverable glossary fragments: CoMMA (Count Mediated Morphological Analysis), CRUD (Create, Read, Update & Delete), CPAN (Comprehensive Perl Archive...).

  10. The Geomorphic Road Analysis and Inventory Package (GRAIP) Volume 2: Office Procedures

    Treesearch

    Richard M. Cissel; Thomas A. Black; Kimberly A. T. Schreuders; Ajay Prasad; Charles H. Luce; David G. Tarboton; Nathan A. Nelson

    2012-01-01

    An important first step in managing forest roads for improved water quality and aquatic habitat is the performance of an inventory. The Geomorphic Roads Analysis and Inventory Package (GRAIP) was developed as a tool for making a comprehensive inventory and analysis of the effects of forest roads on watersheds. This manual describes the data analysis and process of a...

  11. Gap analysis: synergies and opportunities for effective nursing leadership.

    PubMed

    Davis-Ajami, Mary Lynn; Costa, Linda; Kulik, Susan

    2014-01-01

    Gap analysis encompasses a comprehensive process to identify, understand, address, and bridge gaps in service delivery and nursing practice. Conducting a gap analysis provides structure to information gathering and to the process of finding sustainable solutions to important deficiencies. Nursing leaders need to recognize, measure, monitor, and execute feasible, actionable solutions to help organizations make adjustments that address gaps between what is desired and the actual real-world conditions contributing to the quality chasm in health care. Gap analysis represents a functional and comprehensive tool to address organizational deficiencies. Using gap analysis proactively helps organizations map out and sustain corrective efforts to close the quality chasm. Gaining facility in gap analysis should strengthen the nursing profession's contribution to narrowing the quality chasm.

  12. Methylation Integration (Mint) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    A comprehensive software pipeline and set of Galaxy tools/workflows for integrative analysis of genome-wide DNA methylation and hydroxymethylation data. Input data can come from bisulfite sequencing and/or pull-down methods.

  13. Phases of ERA - Analysis

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  14. Cleft Audit Protocol for Speech (CAPS-A): A Comprehensive Training Package for Speech Analysis

    ERIC Educational Resources Information Center

    Sell, D.; John, A.; Harding-Bell, A.; Sweeney, T.; Hegarty, F.; Freeman, J.

    2009-01-01

    Background: The previous literature has largely focused on speech analysis systems and ignored process issues, such as the nature of adequate speech samples, data acquisition, recording and playback. Although there has been recognition of the need for training on tools used in speech analysis associated with cleft palate, little attention has been…

  15. Research on the spatial analysis method of seismic hazard for island

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: its results provide parameters for seismic design at the micro scale, and at the macro scale they are prerequisite inputs for an island conservation plan's earthquake and comprehensive disaster prevention planning, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed on ArcGIS's Model Builder platform.
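    The fuzzy comprehensive evaluation step reduces to composing a weight vector with a membership matrix. The sketch below uses three invented indices and three hazard grades in place of the paper's eleven indices; the weights and memberships are illustrative.

        # Toy fuzzy comprehensive evaluation: b = w . R
        import numpy as np

        weights = np.array([0.5, 0.3, 0.2])          # index weights (sum to 1)
        # membership of each index in grades (low, medium, high hazard)
        R = np.array([[0.1, 0.3, 0.6],               # e.g. distance to faults
                      [0.4, 0.4, 0.2],               # e.g. historical seismicity
                      [0.2, 0.5, 0.3]])              # e.g. gravity anomaly gradient
        b = weights @ R                              # weighted-average composition
        print("grade memberships:", b, "-> assessed grade index:", b.argmax())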

  16. Sensitive and comprehensive analysis of O-glycosylation in biotherapeutics: a case study of novel erythropoiesis stimulating protein.

    PubMed

    Kim, Unyong; Oh, Myung Jin; Seo, Youngsuk; Jeon, Yinae; Eom, Joon-Ho; An, Hyun Joo

    2017-09-01

    Glycosylation of recombinant human erythropoietins (rhEPOs) is significantly associated with the drug's quality and potency. Thus, comprehensive characterization of glycosylation is vital to assess biotherapeutic quality and to establish the equivalency of biosimilar rhEPOs. However, current glycan analysis mainly focuses on N-glycans, due to the absence of analytical tools to liberate O-glycans with high sensitivity. We developed a selective and sensitive method to profile native O-glycans on rhEPOs. O-glycosylation on rhEPO, including O-acetylation on a sialic acid, was comprehensively characterized. Details such as O-glycan structure and O-acetyl-modification site were obtained from tandem MS. This method may be applied to QC and batch analysis not only of rhEPOs but also of other biotherapeutics bearing multiple O-glycosylations.

  17. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    ERIC Educational Resources Information Center

    Karolcík, Štefan; Cipková, Elena; Hrušecký, Roman; Veselský, Milan

    2015-01-01

    Despite the fact that digital technologies are more and more used in the learning and education process, there is still lack of professional evaluation tools capable of assessing the quality of used digital teaching aids in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and…

  18. Software tool for mining liquid chromatography/multi-stage mass spectrometry data for comprehensive glycerophospholipid profiling.

    PubMed

    Hein, Eva-Maria; Bödeker, Bertram; Nolte, Jürgen; Hayen, Heiko

    2010-07-30

    Electrospray ionization mass spectrometry (ESI-MS) has emerged as an indispensable tool in the field of lipidomics. Despite the growing interest in lipid analysis, there are only a few software tools available for data evaluation, compared, for example, to proteomics applications. This makes comprehensive lipid analysis a complex challenge. Thus, a computational tool for harnessing the raw data from liquid chromatography/mass spectrometry (LC/MS) experiments was developed in this study and is available from the authors on request. The Profiler-Merger-Viewer tool is a software package for automatic processing of raw data from data-dependent experiments, measured by high-performance liquid chromatography hyphenated to electrospray ionization hybrid linear ion trap Fourier transform mass spectrometry (FTICR-MS and Orbitrap) in single and multi-stage mode. The software contains three parts: processing of the raw data by Profiler for lipid identification, summarizing of replicate measurements by Merger, and visualization of all relevant data (chromatograms as well as mass spectra) for validation of the results by Viewer. The tool is easily accessible, since it is implemented in Java and uses Microsoft Excel (XLS) as its output format. The motivation was to develop a tool which supports and significantly accelerates manual data evaluation (identification and relative quantification) but does not perform a complete data analysis within a black-box system. The software's mode of operation, usage and options are demonstrated on the basis of a lipid extract of baker's yeast (S. cerevisiae). In this study, we focused on three important classes of lipids: glycerophospholipids, lyso-glycerophospholipids and free fatty acids. Copyright 2010 John Wiley & Sons, Ltd.

  19. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  20. The UEA sRNA Workbench (version 4.4): a comprehensive suite of tools for analyzing miRNAs and sRNAs.

    PubMed

    Stocks, Matthew B; Mohorianu, Irina; Beckers, Matthew; Paicu, Claudia; Moxon, Simon; Thody, Joshua; Dalmay, Tamas; Moulton, Vincent

    2018-05-02

    RNA interference, a highly conserved regulatory mechanism, is mediated via small RNAs. Recent technical advances have enabled the analysis of larger, more complex datasets and the investigation of microRNAs and the less well known small interfering RNAs. However, the size and intricacy of current data require a comprehensive set of tools able to discriminate patterns from low-level, noise-like variation. The numerous and varied suggestions from the community represent an invaluable source of ideas for future tools, so the ability of the community to contribute to this software is essential. We present a new version of the UEA sRNA Workbench, reconfigured to allow easy insertion of new tools and workflows. In its released form, it comprises a suite of tools in a user-friendly environment, with enhanced capabilities for comprehensive processing of sRNA-seq data: tools for accurate prediction of sRNA loci (CoLIde) and miRNA loci (miRCat2), as well as workflows to guide users through common first steps in sRNA-seq analyses, such as quality checking of the input data, normalization of abundances and detection of differential expression. The UEA sRNA Workbench is available at: http://srna-workbench.cmp.uea.ac.uk The source code is available at: https://github.com/sRNAworkbenchuea/UEA_sRNA_Workbench. Contact: v.moulton@uea.ac.uk.

  1. 2010 Comprehensive Economic Development Strategy "Vision Hampton Roads"

    DOT National Transportation Integrated Search

    2010-02-19

    The strategy is an economic development planning tool intended to aid : local governments in decision-making. The document provides an analysis : of regional and local economic conditions within the Hampton Roads : region, defined as including the te...

  2. Using miscue analysis to assess comprehension in deaf college readers.

    PubMed

    Albertini, John; Mayer, Connie

    2011-01-01

    For over 30 years, teachers have used miscue analysis as a tool to assess and evaluate the reading abilities of hearing students in elementary and middle schools and to design effective literacy programs. More recently, teachers of deaf and hard-of-hearing students have also reported its usefulness for diagnosing word- and phrase-level reading difficulties and for planning instruction. To our knowledge, miscue analysis has not been used with older, college-age deaf students who might also be having difficulty decoding and understanding text at the word level. The goal of this study was to determine whether such an analysis would be helpful in identifying the source of college students' reading comprehension difficulties. After analyzing the miscues of 10 college-age readers and the results of other comprehension-related tasks, we concluded that comprehension of basic grade school-level passages depended on the ability to recognize and comprehend key words and phrases in these texts. We also concluded that these diagnostic procedures provided useful information about the reading abilities and strategies of each reader that had implications for designing more effective interventions.

  3. Identifying and tracking attacks on networks: C3I displays and related technologies

    NASA Astrophysics Data System (ADS)

    Manes, Gavin W.; Dawkins, J.; Shenoi, Sujeet; Hale, John C.

    2003-09-01

    Converged network security is extremely challenging for several reasons: expanded system and technology perimeters, unexpected feature interaction, and complex interfaces all conspire to provide hackers with greater opportunities for compromising large networks. Preventive security services and architectures are essential, but in and of themselves do not eliminate all threat of compromise. Attack management systems mitigate this residual risk by facilitating incident detection, analysis and response. There is a wealth of attack detection and response tools for IP networks, but a dearth of such tools for wireless and public telephone networks. Moreover, methodologies and formalisms have yet to be identified that can yield a common model for vulnerabilities and attacks in converged networks. A comprehensive attack management system must coordinate detection tools for converged networks, derive fully-integrated attack and network models, perform vulnerability and multi-stage attack analysis, support large-scale attack visualization, and orchestrate strategic responses to cyber attacks that cross network boundaries. We present an architecture that embodies these principles for attack management. The attack management system described engages a suite of detection tools for various networking domains, feeding real-time attack data to a comprehensive modeling, analysis and visualization subsystem. The resulting early warning system not only provides network administrators with a heads-up cockpit display of their entire network, it also supports guided response and predictive capabilities for multi-stage attacks in converged networks.

  4. Advanced Stoichiometric Analysis of Metabolic Networks of Mammalian Systems

    PubMed Central

    Orman, Mehmet A.; Berthiaume, Francois; Androulakis, Ioannis P.; Ierapetritou, Marianthi G.

    2013-01-01

    Metabolic engineering tools have been widely applied to living organisms to gain a comprehensive understanding about cellular networks and to improve cellular properties. Metabolic flux analysis (MFA), flux balance analysis (FBA), and metabolic pathway analysis (MPA) are among the most popular tools in stoichiometric network analysis. Although application of these tools into well-known microbial systems is extensive in the literature, various barriers prevent them from being utilized in mammalian cells. Limited experimental data, complex regulatory mechanisms, and the requirement of more complex nutrient media are some major obstacles in mammalian cell systems. However, mammalian cells have been used to produce therapeutic proteins, to characterize disease states or related abnormal metabolic conditions, and to analyze the toxicological effects of some medicinally important drugs. Therefore, there is a growing need for extending metabolic engineering principles to mammalian cells in order to understand their underlying metabolic functions. In this review article, advanced metabolic engineering tools developed for stoichiometric analysis including MFA, FBA, and MPA are described. Applications of these tools in mammalian cells are discussed in detail, and the challenges and opportunities are highlighted. PMID:22196224
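    FBA in particular reduces to a linear program: maximize an objective flux subject to the steady-state mass balance S v = 0 and flux bounds. A minimal sketch on an invented three-reaction network follows; the stoichiometry and bounds are toy values, not a curated mammalian model.

        # Toy flux balance analysis via linear programming.
        import numpy as np
        from scipy.optimize import linprog

        # columns: v1 uptake -> A, v2 A -> B, v3 B -> biomass (objective)
        S = np.array([[ 1, -1,  0],     # metabolite A balance
                      [ 0,  1, -1]])    # metabolite B balance
        bounds = [(0, 10), (0, None), (0, None)]     # uptake capped at 10
        c = np.array([0, 0, -1])        # linprog minimizes, so negate biomass flux

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print("optimal biomass flux:", -res.fun, "fluxes:", res.x)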

  5. miRNet - dissecting miRNA-target interactions and functional associations through network-based visual analysis

    PubMed Central

    Fan, Yannan; Siklenka, Keith; Arora, Simran K.; Ribeiro, Paula; Kimmins, Sarah; Xia, Jianguo

    2016-01-01

    MicroRNAs (miRNAs) can regulate nearly all biological processes and their dysregulation is implicated in various complex diseases and pathological conditions. Recent years have seen a growing number of functional studies of miRNAs using high-throughput experimental technologies, which have produced a large amount of high-quality data regarding miRNA target genes and their interactions with small molecules, long non-coding RNAs, epigenetic modifiers, disease associations, etc. These rich sets of information have enabled the creation of comprehensive networks linking miRNAs with various biologically important entities to shed light on their collective functions and regulatory mechanisms. Here, we introduce miRNet, an easy-to-use web-based tool that offers statistical, visual and network-based approaches to help researchers understand miRNAs functions and regulatory mechanisms. The key features of miRNet include: (i) a comprehensive knowledge base integrating high-quality miRNA-target interaction data from 11 databases; (ii) support for differential expression analysis of data from microarray, RNA-seq and quantitative PCR; (iii) implementation of a flexible interface for data filtering, refinement and customization during network creation; (iv) a powerful fully featured network visualization system coupled with enrichment analysis. miRNet offers a comprehensive tool suite to enable statistical analysis and functional interpretation of various data generated from current miRNA studies. miRNet is freely available at http://www.mirnet.ca. PMID:27105848

  6. V-FOR-WaTer - a new virtual research environment for environmental research

    NASA Astrophysics Data System (ADS)

    Strobl, Marcus; Azmi, Elnaz; Hassler, Sibylle; Mälicke, Mirko; Meyer, Jörg; Zehe, Erwin

    2017-04-01

    The preparation of heterogeneous datasets for scientific analysis is still a demanding task. Data preprocessing for hydrological models typically involves gathering datasets from different sources, extensive work within geoinformation systems, data transformation, the generation of computational grids and the definition of initial and boundary conditions. V-FOR-WaTer, a standardized and scalable data hub with compatible analysis tools, will ease comprehensive studies and significantly reduce data preparation time. The idea behind V-FOR-WaTer is to bring together various datasets (e.g. point measurements, 2D/3D data, time series data) from different sources (e.g. gathered in research projects, or as part of regular monitoring of state offices) and to provide common as well as innovative scaling tools in space and time to generate a coherent data grid. Each dataset holds detailed standardized metadata to ensure usability of the data, offer a comprehensive search function and provide reference information for appropriate citation of the dataset creators. V-FOR-WaTer includes a basis of data and tools, but its purpose is to grow by users who extend the virtual research environment with their own tools and research data. Researchers who upload new data or tools can receive a digital object identifier, or protect their data and tools from others until publication. Access to data and tools provided from V-FOR-WaTer happens via an easy-to-use web portal. Due to its modular architecture the portal is ready to be extended with new tools and features and also offers interfaces to Matlab, Python and R.

  7. The business process management software for successful quality management and organization: A case study from the University of Split School of Medicine.

    PubMed

    Sapunar, Damir; Grković, Ivica; Lukšić, Davor; Marušić, Matko

    2016-05-01

    Our aim was to describe a comprehensive model of internal quality management (QM) at a medical school founded on a business process analysis (BPA) software tool. The BPA software tool was used as the core element for describing all working processes in our medical school, and the system subsequently served as the comprehensive model of internal QM. The quality management system at the University of Split School of Medicine included the documentation and analysis of all business processes within the School. The analysis revealed 80 weak points related to one or several business processes. A precise analysis of medical school business processes allows identification of unfinished, unclear and inadequate points in these processes, and subsequently the respective improvements, an increase of the QM level, and ultimately a rationalization of the institution's work. Our approach offers a potential reference model for the development of a common QM framework allowing continuous quality control, i.e. adjustment and adaptation to the contemporary educational needs of medical students. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  8. Text Analysis: Critical Component of Planning for Text-Based Discussion Focused on Comprehension of Informational Texts

    ERIC Educational Resources Information Center

    Kucan, Linda; Palincsar, Annemarie Sullivan

    2018-01-01

    This investigation focuses on a tool used in a reading methods course to introduce reading specialist candidates to text analysis as a critical component of planning for text-based discussions. Unlike planning that focuses mainly on important text content or information, a text analysis approach focuses both on content and how that content is…

  9. Comprehensive, powerful, efficient, intuitive: a new software framework for clinical imaging applications

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Hanson, Dennis P.; Robb, Richard A.

    2006-03-01

    One of the greatest challenges for a software engineer is to create a complex application that is comprehensive enough to be useful to a diverse set of users, yet focused enough for individual tasks to be carried out efficiently with minimal training. This "powerful yet simple" paradox is particularly prevalent in advanced medical imaging applications. Recent research in the Biomedical Imaging Resource (BIR) at Mayo Clinic has been directed toward development of an imaging application framework that provides powerful image visualization/analysis tools in an intuitive, easy-to-use interface. It is based on two concepts very familiar to physicians - Cases and Workflows. Each case is associated with a unique patient and a specific set of routine clinical tasks, or a workflow. Each workflow is comprised of an ordered set of general-purpose modules which can be re-used for each unique workflow. Clinicians help describe and design the workflows, and then are provided with an intuitive interface to both patient data and analysis tools. Since most of the individual steps are common to many different workflows, the use of general-purpose modules reduces development time and results in applications that are consistent, stable, and robust. While the development of individual modules may reflect years of research by imaging scientists, new customized workflows based on the new modules can be developed extremely fast. If a powerful, comprehensive application is difficult to learn and complicated to use, it will be unacceptable to most clinicians. Clinical image analysis tools must be intuitive and effective or they simply will not be used.

  10. The economic cost of traffic congestion in Florida.

    DOT National Transportation Integrated Search

    2010-08-01

    Traffic congestion in the U.S. is bad and getting worse, and it is expensive. Appropriate solutions to this problem require appropriate information. A comprehensive and accurate analysis of congestion costs is a critical tool for planning and impleme...

  11. Analysis of past NBI ratings to determine future bridge preservation needs.

    DOT National Transportation Integrated Search

    2004-01-01

    A Bridge Management System (BMS) needs an analytical tool that can predict bridge element deterioration and answer questions related to bridge preservation. PONTIS, a comprehensive BMS software package, was developed to serve this purpose. However, the intensi...

  12. Anaconda: AN automated pipeline for somatic COpy Number variation Detection and Annotation from tumor exome sequencing data.

    PubMed

    Gao, Jianing; Wan, Changlin; Zhang, Huan; Li, Ao; Zang, Qiguang; Ban, Rongjun; Ali, Asim; Yu, Zhenghua; Shi, Qinghua; Jiang, Xiaohua; Zhang, Yuanwei

    2017-10-03

    Copy number variations (CNVs) are the main genetic structural variations in the cancer genome. Detecting CNVs in genetic exome regions is efficient and cost-effective in identifying cancer-associated genes. Many tools have been developed accordingly, yet they lack reliability because of high false-negative rates, which are intrinsically caused by exonic bias in the genome. To provide an alternative option, here we report Anaconda, a comprehensive pipeline that allows flexible integration of multiple CNV-calling methods and systematic annotation of CNVs in analyzing WES data. With a single command, Anaconda can generate CNV detection results from up to four CNV-detection tools. Combined with comprehensive annotation analysis of genes involved in shared CNV regions, Anaconda delivers a more reliable and useful report in support of CNV-associated cancer research. The Anaconda package and manual can be freely accessed at http://mcg.ustc.edu.cn/bsc/ANACONDA/ .
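
    Anaconda's central idea -- calling CNVs with several tools and reporting regions the callers agree on -- can be sketched as a simple interval-consensus step. The snippet below is a toy analogue under the assumption that each caller emits (chrom, start, end) regions; it is not Anaconda's actual merging logic.

      from collections import defaultdict

      # Toy consensus over per-caller CNV calls given as (chrom, start, end).
      calls = {
          "callerA": [("chr1", 100, 500), ("chr2", 40, 90)],
          "callerB": [("chr1", 150, 450)],
          "callerC": [("chr1", 300, 600), ("chr2", 10, 50)],
      }

      def consensus(calls, min_support=2):
          """Return segments covered by at least min_support callers."""
          events = defaultdict(list)            # chrom -> breakpoint events
          for regions in calls.values():
              for chrom, start, end in regions:
                  events[chrom] += [(start, 1), (end, -1)]
          segments = []
          for chrom, evs in events.items():
              depth, prev = 0, None
              for pos, delta in sorted(evs):
                  if depth >= min_support and pos > prev:
                      segments.append((chrom, prev, pos))
                  depth += delta
                  prev = pos
          return segments

      # chr1 is supported by >=2 callers on 150-500 (emitted as contiguous
      # pieces); chr2 on 40-50.
      print(consensus(calls))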

  13. Information security system quality assessment through the intelligent tools

    NASA Astrophysics Data System (ADS)

    Trapeznikov, E. V.

    2018-04-01

    The development of technology has made comprehensive analysis of the information security of automated systems necessary. Analysis of the subject area indicates the relevance of this study. The research objective is to develop a methodology for assessing the quality of information security systems based on intelligent tools. The basis of the methodology is a model that assesses the information security of an information system through a neural network. The paper presents the security assessment model and its algorithm. The results of the practical implementation of the methodology are presented in the form of a software flow diagram. The conclusions note the practical significance of the model being developed.

  14. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase chain reaction (qPCR) is a standard technique in most laboratories and is used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows tools, 5 web-based tools, 9 R-based tools and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
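
    Among the quantification strategies such tools implement, relative quantification via the standard 2^(-ΔΔCt) method is the most common; the sketch below works through it with invented Ct values (the formula is standard, the numbers are not from the survey).

      # Relative quantification by the standard 2^(-ddCt) method; the Ct
      # values are invented for illustration.
      ct_target_treated, ct_ref_treated = 22.1, 18.0
      ct_target_control, ct_ref_control = 24.6, 18.1

      d_ct_treated = ct_target_treated - ct_ref_treated   # normalize to reference gene
      d_ct_control = ct_target_control - ct_ref_control
      dd_ct = d_ct_treated - d_ct_control                 # compare conditions

      fold_change = 2 ** (-dd_ct)                         # ~5.3-fold up-regulation here
      print(f"fold change = {fold_change:.2f}")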

  15. Cost-effective use of minicomputers to solve structural problems

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Foster, E. P.

    1978-01-01

    Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, while their exposure to and the opportunity for structural calculations has been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper presents results on the potential for using minicomputers for structural analysis by (1) selecting a comprehensive, finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputers with that of a large mainframe computer for the solution to a wide range of finite element structural analysis problems.

  16. Ball Bearing Analysis with the ORBIS Tool

    NASA Technical Reports Server (NTRS)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that ORBIS predictions of bearing internal load distribution, stiffness, deflection and stresses correlate closely with reference results.

  17. Polar bear encephalitis: establishment of a comprehensive next-generation pathogen analysis pipeline for captive and free-living wildlife.

    PubMed

    Szentiks, C A; Tsangaras, K; Abendroth, B; Scheuch, M; Stenglein, M D; Wohlsein, P; Heeger, F; Höveler, R; Chen, W; Sun, W; Damiani, A; Nikolin, V; Gruber, A D; Grobbel, M; Kalthoff, D; Höper, D; Czirják, G Á; Derisi, J; Mazzoni, C J; Schüle, A; Aue, A; East, M L; Hofer, H; Beer, M; Osterrieder, N; Greenwood, A D

    2014-05-01

    This report describes three possibly related incidents of encephalitis, two of them lethal, in captive polar bears (Ursus maritimus). Standard diagnostic methods failed to identify pathogens in any of these cases. A comprehensive, three-stage diagnostic 'pipeline' employing both standard serological methods and new DNA microarray- and next-generation sequencing-based diagnostics was developed, in part as a consequence of this initial failure. This pipeline approach illustrates the strengths, weaknesses and limitations of these tools in determining pathogen-caused deaths in non-model organisms such as wildlife species, and why the use of a limited number of diagnostic tools may fail to uncover important wildlife pathogens. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Open | SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    DOE PAGES

    Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...

    2008-01-01

    Over the last decades, a large number of performance tools have been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open | SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open | SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open | SpeedShop are accessible through multiple fully equivalent interfaces, including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.

  19. 2010 Comprehensive Economic Development Strategy "Vision Hampton Roads" : Executive Summary

    DOT National Transportation Integrated Search

    2010-02-19

    The strategy is an economic development planning tool intended to aid local governments in decision-making. The document provides an analysis of regional and local economic conditions within the Hampton Roads region, defined as including the ten (10)...

  20. NEW GIS WATERSHED ANALYSIS TOOLS FOR SOIL CHARACTERIZATION AND EROSION AND SEDIMENTATION MODELING

    EPA Science Inventory

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed which utilizes a suite of automated scripts and a pair of processing-intensive executable programs operating on a personal computer platform.

  1. Computational nanoelectronics: towards design, analysis, synthesis, and fundamental limits

    NASA Technical Reports Server (NTRS)

    Klimeck, G.

    2003-01-01

    This seminar will review the development of a comprehensive nanoelectronic modeling tool (NEMO 1-D and NEMO 3-D) and its application to high-speed electronics (resonant tunneling diodes) and IR detectors and lasers (quantum dots and 1-D heterostructures).

  2. Methods, Tools and Current Perspectives in Proteogenomics *

    PubMed Central

    Ruggles, Kelly V.; Krug, Karsten; Wang, Xiaojing; Clauser, Karl R.; Wang, Jing; Payne, Samuel H.; Fenyö, David; Zhang, Bing; Mani, D. R.

    2017-01-01

    With combined technological advancements in high-throughput next-generation sequencing and deep mass spectrometry-based proteomics, proteogenomics, i.e. the integrative analysis of proteomic and genomic data, has emerged as a new research field. Early efforts in the field were focused on improving protein identification using sample-specific genomic and transcriptomic sequencing data. More recently, integrative analysis of quantitative measurements from genomic and proteomic studies has identified novel insights into gene expression regulation, cell signaling, and disease. Many methods and tools have been developed or adapted to enable an array of integrative proteogenomic approaches, and in this article we systematically classify published methods and tools into four major categories: (1) Sequence-centric proteogenomics; (2) Analysis of proteogenomic relationships; (3) Integrative modeling of proteogenomic data; and (4) Data sharing and visualization. We provide a comprehensive review of methods and available tools in each category and highlight their typical applications. PMID:28456751

  3. PATRIC: the Comprehensive Bacterial Bioinformatics Resource with a Focus on Human Pathogenic Species

    PubMed Central

    Gillespie, Joseph J.; Wattam, Alice R.; Cammer, Stephen A.; Gabbard, Joseph L.; Shukla, Maulik P.; Dalay, Oral; Driscoll, Timothy; Hix, Deborah; Mane, Shrinivasrao P.; Mao, Chunhong; Nordberg, Eric K.; Scott, Mark; Schulman, Julie R.; Snyder, Eric E.; Sullivan, Daniel E.; Wang, Chunxia; Warren, Andrew; Williams, Kelly P.; Xue, Tian; Seung Yoo, Hyun; Zhang, Chengdong; Zhang, Yan; Will, Rebecca; Kenyon, Ronald W.; Sobral, Bruno W.

    2011-01-01

    Funded by the National Institute of Allergy and Infectious Diseases, the Pathosystems Resource Integration Center (PATRIC) is a genomics-centric relational database and bioinformatics resource designed to assist scientists in infectious-disease research. Specifically, PATRIC provides scientists with (i) a comprehensive bacterial genomics database, (ii) a plethora of associated data relevant to genomic analysis, and (iii) an extensive suite of computational tools and platforms for bioinformatics analysis. While the primary aim of PATRIC is to advance the knowledge underlying the biology of human pathogens, all publicly available genome-scale data for bacteria are compiled and continually updated, thereby enabling comparative analyses to reveal the basis for differences between infectious, free-living, and commensal species. Herein we summarize the major features available at PATRIC, dividing the resources into two major categories: (i) organisms, genomes, and comparative genomics and (ii) recurrent integration of community-derived associated data. Additionally, we present two experimental designs typical of bacterial genomics research and report on the execution of both projects using only PATRIC data and tools. These applications encompass a broad range of the data and analysis tools available, illustrating practical uses of PATRIC for the biologist. Finally, a summary of PATRIC's outreach activities, collaborative endeavors, and future research directions is provided. PMID:21896772

  4. The Geomorphic Road Analysis and Inventory Package (GRAIP) Volume 1: Data Collection Method

    Treesearch

    Thomas A. Black; Richard M. Cissel; Charles H. Luce

    2012-01-01

    An important first step in managing forest roads for improved water quality and aquatic habitat is the performance of an inventory. The Geomorphic Roads Analysis and Inventory Package (GRAIP) was developed as a tool for making a comprehensive inventory and analysis of the effects of forest roads on watersheds. This manual describes the data collection and process of a...

  5. Using Miscue Analysis to Assess Comprehension in Deaf College Readers

    ERIC Educational Resources Information Center

    Albertini, John; Mayer, Connie

    2011-01-01

    For over 30 years, teachers have used miscue analysis as a tool to assess and evaluate the reading abilities of hearing students in elementary and middle schools and to design effective literacy programs. More recently, teachers of deaf and hard-of-hearing students have also reported its usefulness for diagnosing word- and phrase-level reading…

  6. Accounting for Student Success: An Empirical Analysis of the Origins and Spread of State Student Unit-Record Systems

    ERIC Educational Resources Information Center

    Hearn, James C.; McLendon, Michael K.; Mokher, Christine G.

    2008-01-01

    This event history analysis explores factors driving the emergence over recent decades of comprehensive state-level student unit-record [SUR] systems, a potentially powerful tool for increasing student success. Findings suggest that the adoption of these systems is rooted in demand and ideological factors. Larger states, states with high…

  7. Construct Validation of the Louisiana School Analysis Model (SAM) Instructional Staff Questionnaire

    ERIC Educational Resources Information Center

    Bray-Clark, Nikki; Bates, Reid

    2005-01-01

    The purpose of this study was to validate the Louisiana SAM Instructional Staff Questionnaire (SISQ), a key component of the Louisiana School Analysis Model. The model was designed as a comprehensive evaluation tool for schools. Principal axis factoring with oblique rotation was used to uncover the underlying structure of the SISQ. (Contains 1 table.)

  8. Analysis of the Image of Scientists Portrayed in the Lebanese National Science Textbooks

    ERIC Educational Resources Information Center

    Yacoubian, Hagop A.; Al-Khatib, Layan; Mardirossian, Taline

    2017-01-01

    This article presents an analysis of how scientists are portrayed in the Lebanese national science textbooks. The purpose of this study was twofold. First, to develop a comprehensive analytical framework that can serve as a tool to analyze the image of scientists portrayed in educational resources. Second, to analyze the image of scientists…

  9. On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment

    NASA Astrophysics Data System (ADS)

    Guterres, Rui M.

    The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. Analysis of the existing tools, their strengths and limitations is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two and three dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post processing and data conditioning tools and precision analysis were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible and the accuracy of the computed results is seen to significantly improve. When post processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is qualitatively and quantitatively studied. Its impact on the accuracy of the computations, as well as the incompatibility of wall drag with the theoretical model followed, is discussed. The newly developed quantitative analysis provides significantly increased accuracy. The aerodynamic drag coefficient is computed within one percent of the balance-measured value for the best cases.
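
    In its simplest form, the wake-integral drag referred to here is the momentum-deficit integral D = rho * ∫∫ u (Uinf - u) dS over a survey plane behind the body; the numpy sketch below evaluates it on a synthetic velocity grid. This is only the leading term -- the dissertation's full formulation adds further contributions -- and the grid and deficit profile are invented.

      import numpy as np

      # Momentum-deficit wake-integral drag: D = rho * integral of u*(Uinf - u) dS.
      rho, u_inf = 1.225, 30.0                 # air density [kg/m^3], freestream [m/s]
      y = np.linspace(-0.5, 0.5, 101)          # survey-plane coordinates [m]
      z = np.linspace(-0.5, 0.5, 101)
      Y, Z = np.meshgrid(y, z)

      # Synthetic Gaussian velocity deficit standing in for measured wake data.
      u = u_inf * (1.0 - 0.3 * np.exp(-(Y**2 + Z**2) / 0.02))

      integrand = u * (u_inf - u)
      drag = rho * np.trapz(np.trapz(integrand, y, axis=1), z)
      print(f"wake-integral drag ~ {drag:.1f} N")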

  10. Development of a prototype commonality analysis tool for use in space programs

    NASA Technical Reports Server (NTRS)

    Yeager, Dorian P.

    1988-01-01

    A software tool to aid in performing commonality analyses, called the Commonality Analysis Problem Solver (CAPS), was designed, and a prototype version (CAPS 1.0) was implemented and tested. CAPS 1.0 runs in an MS-DOS or IBM PC-DOS environment. CAPS is designed around a simple input language that provides a natural syntax for describing feasibility constraints. It provides users with the ability to load a database representing a set of design items, describe the feasibility constraints on items in that database, and perform a comprehensive cost analysis to find the most economical substitution pattern.
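
    The kind of search CAPS performs can be illustrated as a toy substitution problem: choose one item per requirement, respect pairwise feasibility constraints, and minimize total cost. The names, costs, and constraint below are invented, and CAPS's input language and solver are of course richer than this brute-force sketch.

      from itertools import product

      # One candidate list per requirement: (item name, unit cost).
      options = {
          "fastener": [("bolt_A", 3.0), ("bolt_B", 2.0)],
          "bracket":  [("bracket_X", 10.0), ("bracket_Y", 7.0)],
      }
      # Feasibility constraint: bolt_B cannot be combined with bracket_Y.
      infeasible_pairs = {("bolt_B", "bracket_Y")}

      def feasible(choice):
          names = {item for item, _ in choice}
          return not any(set(pair) <= names for pair in infeasible_pairs)

      # Brute-force search for the cheapest feasible substitution pattern.
      best = min(
          (c for c in product(*options.values()) if feasible(c)),
          key=lambda c: sum(cost for _, cost in c),
      )
      print(best, sum(cost for _, cost in best))
      # -> (('bolt_A', 3.0), ('bracket_Y', 7.0)) 10.0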

  11. BeadArray Expression Analysis Using Bioconductor

    PubMed Central

    Ritchie, Matthew E.; Dunning, Mark J.; Smith, Mike L.; Shi, Wei; Lynch, Andy G.

    2011-01-01

    Illumina whole-genome expression BeadArrays are a popular choice in gene profiling studies. Aside from the vendor-provided software tools for analyzing BeadArray expression data (GenomeStudio/BeadStudio), there exists a comprehensive set of open-source analysis tools in the Bioconductor project, many of which have been tailored to exploit the unique properties of this platform. In this article, we explore a number of these software packages and demonstrate how to perform a complete analysis of BeadArray data in various formats. The key steps of importing data, performing quality assessments, preprocessing, and annotation in the common setting of assessing differential expression in designed experiments will be covered. PMID:22144879

  12. The Health Impact Assessment (HIA) Resource and Tool ...

    EPA Pesticide Factsheets

    Health Impact Assessment (HIA) is a relatively new and rapidly emerging field in the U.S. An inventory of available HIA resources and tools was conducted, with a primary focus on resources developed in the U.S. The resources and tools available to HIA practitioners in the conduct of their work were identified through multiple methods and compiled into a comprehensive list. The compilation includes tools and resources related to the HIA process itself and those that can be used to collect and analyze data, establish a baseline profile, assess potential health impacts, and establish benchmarks and indicators for monitoring and evaluation. These resources include literature and evidence bases, data and statistics, guidelines, benchmarks, decision and economic analysis tools, scientific models, methods, frameworks, indices, mapping, and various data collection tools. Understanding the data, tools, models, methods, and other resources available to perform HIAs will help to advance the HIA community of practice in the U.S., improve the quality and rigor of assessments upon which stakeholder and policy decisions are based, and potentially improve the overall effectiveness of HIA to promote healthy and sustainable communities. The Health Impact Assessment (HIA) Resource and Tool Compilation is a comprehensive list of resources and tools that can be utilized by HIA practitioners with all levels of HIA experience to guide them throughout the HIA process. The HIA Resource

  13. The integration of FMEA with other problem solving tools: A review of enhancement opportunities

    NASA Astrophysics Data System (ADS)

    Ng, W. C.; Teh, S. Y.; Low, H. C.; Teoh, P. C.

    2017-09-01

    Failure Mode and Effects Analysis (FMEA) is one of the most effective and accepted problem solving (PS) tools for most companies in the world. Since FMEA was first introduced in 1949, practitioners have implemented it in various industries for their quality improvement initiatives. However, studies have shown that there are drawbacks that hinder the effectiveness of FMEA for continuous quality improvement from product design to manufacturing. Therefore, FMEA is integrated with other PS tools, such as the inventive problem solving methodology (TRIZ), Quality Function Deployment (QFD), Root Cause Analysis (RCA) and the seven basic tools of quality, to address these drawbacks. This study begins by identifying the drawbacks in FMEA. A comprehensive literature review on the integration of FMEA with other tools is carried out to categorise the integrations based on the drawbacks identified. The three categories are inefficiency of failure analysis, psychological inertia and neglect of customers’ perspective. This study concludes by discussing the gaps and opportunities in such integration for future research.
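
    FMEA's standard prioritization metric, the Risk Priority Number (RPN = severity x occurrence x detection, each conventionally rated 1-10), is what the integrations above aim to improve upon; the sketch below computes and ranks RPNs for invented failure modes.

      # Classic FMEA prioritization: RPN = severity * occurrence * detection.
      # The failure modes and ratings below are invented for illustration.
      failure_modes = [
          {"mode": "weld crack",     "severity": 9, "occurrence": 3, "detection": 4},
          {"mode": "misalignment",   "severity": 5, "occurrence": 6, "detection": 2},
          {"mode": "surface defect", "severity": 3, "occurrence": 7, "detection": 5},
      ]

      for fm in failure_modes:
          fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

      # Highest-RPN failure modes get corrective action first.
      for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
          print(f'{fm["mode"]:15s} RPN={fm["rpn"]}')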

  14. A RESTful API for accessing microbial community data for MG-RAST

    DOE PAGES

    Wilke, Andreas; Bischof, Jared; Harrison, Travis; ...

    2015-01-08

    Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 Terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programmers interface) we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us) we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase’s microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. This API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service.

  15. A RESTful API for Accessing Microbial Community Data for MG-RAST

    PubMed Central

    Wilke, Andreas; Bischof, Jared; Harrison, Travis; Brettin, Tom; D'Souza, Mark; Gerlach, Wolfgang; Matthews, Hunter; Paczian, Tobias; Wilkening, Jared; Glass, Elizabeth M.; Desai, Narayan; Meyer, Folker

    2015-01-01

    Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 Terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programmers interface) we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us) we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase's microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. This API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service. PMID:25569221
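
    Consuming the API looks like any other RESTful JSON service; the Python sketch below fetches one metagenome record. The base URL, resource path, verbosity parameter, and accession follow the API's documented pattern but should be treated as assumptions to verify against the live documentation.

      import requests

      BASE = "https://api.mg-rast.org"   # assumed base URL; check current docs
      accession = "mgm4440026.3"         # example public metagenome accession

      # Pipeline outputs are exposed as JSON objects, so a plain GET suffices.
      resp = requests.get(f"{BASE}/metagenome/{accession}",
                          params={"verbosity": "minimal"})
      resp.raise_for_status()
      record = resp.json()
      print(record.get("id"), record.get("name"))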

  16. IPMP 2013 - A comprehensive data analysis tool for predictive microbiology

    USDA-ARS?s Scientific Manuscript database

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods undergoing complex environmental changes during processing, transportation, distribution, and storage. It f...

  17. Freeway performance measurement system : an operational analysis tool

    DOT National Transportation Integrated Search

    2001-07-30

    PeMS is a freeway performance measurement system for all of California. It processes 2 GB/day of 30-second loop detector data in real time to produce useful information. Managers at any time can have a uniform and comprehensive assessment of fre...

  18. CARD 2017: expansion and model-centric curation of the comprehensive antibiotic resistance database

    PubMed Central

    Jia, Baofeng; Raphenya, Amogelang R.; Alcock, Brian; Waglechner, Nicholas; Guo, Peiyao; Tsang, Kara K.; Lago, Briony A.; Dave, Biren M.; Pereira, Sheldon; Sharma, Arjun N.; Doshi, Sachin; Courtot, Mélanie; Lo, Raymond; Williams, Laura E.; Frye, Jonathan G.; Elsayegh, Tariq; Sardar, Daim; Westman, Erin L.; Pawlowski, Andrew C.; Johnson, Timothy A.; Brinkman, Fiona S.L.; Wright, Gerard D.; McArthur, Andrew G.

    2017-01-01

    The Comprehensive Antibiotic Resistance Database (CARD; http://arpcard.mcmaster.ca) is a manually curated resource containing high quality reference data on the molecular basis of antimicrobial resistance (AMR), with an emphasis on the genes, proteins and mutations involved in AMR. CARD is ontologically structured, model centric, and spans the breadth of AMR drug classes and resistance mechanisms, including intrinsic, mutation-driven and acquired resistance. It is built upon the Antibiotic Resistance Ontology (ARO), a custom built, interconnected and hierarchical controlled vocabulary allowing advanced data sharing and organization. Its design allows the development of novel genome analysis tools, such as the Resistance Gene Identifier (RGI) for resistome prediction from raw genome sequence. Recent improvements include extensive curation of additional reference sequences and mutations, development of a unique Model Ontology and accompanying AMR detection models to power sequence analysis, new visualization tools, and expansion of the RGI for detection of emergent AMR threats. CARD curation is updated monthly based on an interplay of manual literature curation, computational text mining, and genome analysis. PMID:27789705

  19. Multi-Metric Sustainability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cowlin, Shannon; Heimiller, Donna; Macknick, Jordan

    2014-12-01

    A readily accessible framework that allows for evaluating impacts and comparing tradeoffs among factors in energy policy, expansion planning, and investment decision making is lacking. Recognizing this, the Joint Institute for Strategic Energy Analysis (JISEA) funded an exploration of multi-metric sustainability analysis (MMSA) to provide energy decision makers with a means to make more comprehensive comparisons of energy technologies. The resulting MMSA tool lets decision makers simultaneously compare technologies and potential deployment locations.

  20. QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.

    PubMed

    Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter

    2015-07-01

    Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.
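
    The per-sample parallelism such a workflow system exploits can be sketched with Python's standard library; align_and_count below is a placeholder for a real alignment or quantification step, and the sample names are invented -- this illustrates the scheduling idea, not QuickNGS code.

      from concurrent.futures import ProcessPoolExecutor

      def align_and_count(sample):
          # Placeholder for a real per-sample pipeline stage (alignment,
          # quantification, QC); here it only fabricates an output name.
          return sample, f"{sample}.counts"

      samples = [f"RNA_Seq_{i:02d}" for i in range(1, 11)]

      if __name__ == "__main__":
          # Process independent samples concurrently, as a workflow engine would.
          with ProcessPoolExecutor() as pool:
              for sample, result in pool.map(align_and_count, samples):
                  print(sample, "->", result)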

  1. New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Eric W; Rames, Clement L; Muratori, Matteo

    This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.

  2. Experimental Approaches to Microarray Analysis of Tumor Samples

    ERIC Educational Resources Information Center

    Furge, Laura Lowe; Winter, Michael B.; Meyers, Jacob I.; Furge, Kyle A.

    2008-01-01

    Comprehensive measurement of gene expression using high-density nucleic acid arrays (i.e. microarrays) has become an important tool for investigating the molecular differences in clinical and research samples. Consequently, inclusion of discussion in biochemistry, molecular biology, or other appropriate courses of microarray technologies has…

  3. Tularosa Basin Play Fairway Analysis: Methodology Flow Charts

    DOE Data Explorer

    Adam Brandt

    2015-11-15

    These images show the comprehensive methodology used for the creation of a Play Fairway Analysis to explore the geothermal resource potential of the Tularosa Basin, New Mexico. The deterministic methodology originated in the petroleum industry but was custom-modified to function as a knowledge-based geothermal exploration tool. The stochastic PFA flow chart uses weights of evidence and is data-driven.

  4. Comprehension Tools for Teachers: Reading for Understanding from Prekindergarten through Fourth Grade

    ERIC Educational Resources Information Center

    Connor, Carol McDonald; Phillips, Beth M.; Kaschak, Michael; Apel, Kenn; Kim, Young-Suk; Al Otaiba, Stephanie; Crowe, Elizabeth C.; Thomas-Tate, Shurita; Johnson, Lakeisha Cooper; Lonigan, Christopher J.

    2014-01-01

    This paper describes the theoretical framework, as well as the development and testing of the intervention, "Comprehension Tools for Teachers" (CTT), which is composed of eight component interventions targeting malleable language and reading comprehension skills that emerging research indicates contribute to proficient reading for…

  5. The Environment-Power System Analysis Tool development program. [for spacecraft power supplies

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.

    1989-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.

  6. Performance Evaluation and Online Realization of Data-driven Normalization Methods Used in LC/MS based Untargeted Metabolomics Analysis.

    PubMed

    Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng

    2016-12-13

    In untargeted metabolomics analysis, several factors (e.g., unwanted experimental & biological variations and technical errors) may hamper the identification of differential metabolic features, which requires data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS based metabolomics data. However, the performance and the sample size dependence of those methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. As a result, the 16 methods were categorized into three groups based on their normalization performance across various sample sizes. The VSN, the Log Transformation and the PQN were identified as the methods with the best normalization performance, while the Contrast consistently underperformed across all sub-datasets of different benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of the 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study could serve as useful guidance for the selection of suitable normalization methods in analyzing LC/MS based metabolomics data.
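
    Of the top-performing methods named, probabilistic quotient normalization (PQN) is simple to state: build a reference spectrum (here the median across samples), compute each sample's feature-wise quotients against it, and divide the sample by its median quotient. A minimal numpy sketch follows, with random intensities standing in for a real LC/MS feature table; production implementations typically add an integral normalization step first.

      import numpy as np

      def pqn(x):
          """Probabilistic quotient normalization of a samples-by-features matrix."""
          ref = np.median(x, axis=0)               # reference spectrum across samples
          quotients = x / ref                      # feature-wise quotients per sample
          factors = np.median(quotients, axis=1)   # one dilution factor per sample
          return x / factors[:, None]

      # Random intensities stand in for a real LC/MS feature table.
      rng = np.random.default_rng(0)
      data = rng.lognormal(mean=2.0, sigma=0.5, size=(5, 100))
      print(pqn(data).shape)                       # (5, 100)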

  7. Performance Evaluation and Online Realization of Data-driven Normalization Methods Used in LC/MS based Untargeted Metabolomics Analysis

    PubMed Central

    Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng

    2016-01-01

    In untargeted metabolomics analysis, several factors (e.g., unwanted experimental & biological variations and technical errors) may hamper the identification of differential metabolic features, which requires data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS based metabolomics data. However, the performance and the sample size dependence of those methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. As a result, the 16 methods were categorized into three groups based on their normalization performance across various sample sizes. The VSN, the Log Transformation and the PQN were identified as the methods with the best normalization performance, while the Contrast consistently underperformed across all sub-datasets of different benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of the 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study could serve as useful guidance for the selection of suitable normalization methods in analyzing LC/MS based metabolomics data. PMID:27958387

  8. Advanced Power System Analysis Capabilities

    NASA Technical Reports Server (NTRS)

    1997-01-01

    As a continuing effort to assist in the design and characterization of space power systems, the NASA Lewis Research Center's Power and Propulsion Office developed a powerful computerized analysis tool called System Power Analysis for Capability Evaluation (SPACE). This year, SPACE was used extensively in analyzing detailed operational timelines for the International Space Station (ISS) program. SPACE was developed to analyze the performance of space-based photovoltaic power systems such as that being developed for the ISS. It is a highly integrated tool that combines numerous factors in a single analysis, providing a comprehensive assessment of the power system's capability. Factors particularly critical to the ISS include the orientation of the solar arrays toward the Sun and the shadowing of the arrays by other portions of the station.

  9. Participant comprehension of research for which they volunteer: a systematic review.

    PubMed

    Montalvo, Wanda; Larson, Elaine

    2014-11-01

    Evidence indicates that research participants often do not fully understand the studies for which they have volunteered. The aim of this systematic review was to examine the relationship between the process of obtaining informed consent for research and participant comprehension and satisfaction with the research. Systematic review of published research on informed consent and participant comprehension of research for which they volunteer using the Preferred Reporting Items for Systematic Review and Meta-Analysis (PRISMA) Statement as a guide. PubMed, Cumulative Index for Nursing and Allied Health Literature, Cochrane Central Register of Controlled Trials, and Cochrane Database of Systematic Reviews were used to search the literature for studies meeting the following inclusion criteria: (a) published between January 1, 2006, and December 31, 2013, (b) interventional or descriptive quantitative design, (c) published in a peer-reviewed journal, (d) written in English, and (e) assessed participant comprehension or satisfaction with the research process. Studies were assessed for quality using seven indicators: sampling method, use of controls or comparison groups, response rate, description of intervention, description of outcome, statistical method, and health literacy assessment. Of 176 studies identified, 27 met inclusion criteria: 13 (48%) were randomized interventional designs and 14 (52%) were descriptive. Three categories of studies included projects assessing (a) enhanced consent process or form, (b) multimedia methods, and (c) education to improve participant understanding. Most (78%) used investigator-developed tools to assess participant comprehension, did not assess participant health literacy (74%), or did not assess the readability level of the consent form (89%). Researchers found participants lacked basic understanding of research elements: randomization, placebo, risks, and therapeutic misconception. Findings indicate (a) inconsistent assessment of participant reading or health literacy level, (b) measurement variation associated with use of nonstandardized tools, and (c) continued therapeutic misconception and lack of understanding among research participants of randomization, placebo, benefit, and risk. While the Agency for Healthcare Research and Quality and the National Quality Forum have published informed consent and authorization toolkits, previously published validated tools are underutilized. Informed consent requires the assessment of health literacy, reading level, and comprehension of research participants using validated assessment tools and methods. © 2014 Sigma Theta Tau International.

  10. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use

  11. CHESS (CgHExpreSS): a comprehensive analysis tool for the analysis of genomic alterations and their effects on the expression profile of the genome.

    PubMed

    Lee, Mikyung; Kim, Yangseok

    2009-12-16

    Genomic alterations frequently occur in many cancer patients and play important mechanistic roles in the pathogenesis of cancer. Furthermore, they can modify the expression level of genes due to altered copy number in the corresponding region of the chromosome. An accumulating body of evidence supports the possibility that strong genome-wide correlation exists between DNA content and gene expression. Therefore, more comprehensive analysis is needed to quantify the relationship between genomic alteration and gene expression. A well-designed bioinformatics tool is essential to perform this kind of integrative analysis. A few programs have already been introduced for integrative analysis. However, published software has many limitations for comprehensive integrated analysis because of the algorithms and visualization modules implemented. To address this issue, we have implemented the Java-based program CHESS to allow integrative analysis of two experimental data sets: genomic alteration and genome-wide expression profile. CHESS is composed of a genomic alteration analysis module and an integrative analysis module. The genomic alteration analysis module detects genomic alteration by applying a threshold based method or the SW-ARRAY algorithm and investigates whether the detected alteration is phenotype specific or not. On the other hand, the integrative analysis module measures the genomic alteration's influence on gene expression. It is divided into two separate parts. The first part calculates the overall correlation between comparative genomic hybridization ratio and gene expression level by applying the following three statistical methods: simple linear regression, Spearman rank correlation and Pearson's correlation. In the second part, CHESS detects the genes that are differentially expressed according to the genomic alteration pattern with three alternative statistical approaches: Student's t-test, Fisher's exact test and the Chi square test. By successive operation of the two modules, users can clarify how gene expression levels are affected by phenotype specific genomic alterations. As CHESS was developed in both Java application and web environments, it can be run on a web browser or a local machine. It also supports all experimental platforms if a properly formatted text file is provided that includes the chromosomal position of probes and their gene identifiers. CHESS is a user-friendly tool for investigating disease specific genomic alterations and quantitative relationships between those genomic alterations and genome-wide gene expression profiling.
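
    The overall-correlation step in CHESS's integrative module maps directly onto standard statistics; the sketch below computes Pearson and Spearman correlations between synthetic aCGH log-ratios and expression values for a single gene using scipy (illustrative data, not CHESS's implementation).

      import numpy as np
      from scipy import stats

      # Synthetic paired measurements for one gene across 20 samples.
      rng = np.random.default_rng(1)
      cgh = rng.normal(0.0, 0.4, size=20)                # aCGH log-ratios
      expr = 2.0 * cgh + rng.normal(0.0, 0.3, size=20)   # expression tracks copy number

      r, p_r = stats.pearsonr(cgh, expr)
      rho, p_rho = stats.spearmanr(cgh, expr)
      print(f"Pearson r={r:.2f} (p={p_r:.1e}); Spearman rho={rho:.2f} (p={p_rho:.1e})")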

  12. Java web tools for PCR, in silico PCR, and oligonucleotide assembly and analysis.

    PubMed

    Kalendar, Ruslan; Lee, David; Schulman, Alan H

    2011-08-01

    The polymerase chain reaction is fundamental to molecular biology and is the most important practical molecular technique for the research laboratory. We have developed and tested efficient tools for PCR primer and probe design, which also predict oligonucleotide properties based on experimental studies of PCR efficiency. The tools provide comprehensive facilities for designing primers for most PCR applications and their combinations, including standard, multiplex, long-distance, inverse, real-time, unique, group-specific, bisulphite modification assays, Overlap-Extension PCR Multi-Fragment Assembly, as well as a programme to design oligonucleotide sets for long sequence assembly by ligase chain reaction. The in silico PCR primer or probe search includes comprehensive analyses of individual primers and primer pairs. It calculates the melting temperature for standard and degenerate oligonucleotides including LNA and other modifications, provides analyses for a set of primers with prediction of oligonucleotide properties, dimer and G-quadruplex detection, linguistic complexity, and provides a dilution and resuspension calculator. Copyright © 2011 Elsevier Inc. All rights reserved.
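
    Melting-temperature prediction, one of the oligonucleotide properties these tools report, can be illustrated with the simple Wallace rule for short oligos, Tm ≈ 2(A+T) + 4(G+C) °C; the actual tools use more accurate nearest-neighbor thermodynamics, so this is only a sketch.

      def wallace_tm(primer):
          """Wallace-rule Tm (degrees C) for short (<14 nt) oligos:
          Tm = 2*(A+T) + 4*(G+C). Production primer-design tools use
          nearest-neighbor models instead; this is only illustrative."""
          s = primer.upper()
          at = s.count("A") + s.count("T")
          gc = s.count("G") + s.count("C")
          return 2 * at + 4 * gc

      print(wallace_tm("ATGCATGCATGC"))  # -> 36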

  13. Assessing Comprehension During Reading with the Reading Strategy Assessment Tool (RSAT)

    PubMed Central

    Magliano, Joseph P.; Millis, Keith K.; Levinstein, Irwin

    2011-01-01

    Comprehension emerges as the result of inference and strategic processes that support the construction of a coherent mental model for a text. However, the vast majority of comprehension skills tests adopt a format that does not afford an assessment of these processes as they operate during reading. This study assessed the viability of the Reading Strategy Assessment Tool (RSAT), which is an automated computer-based reading assessment designed to measure readers’ comprehension and spontaneous use of reading strategies while reading texts. In the tool, readers read passages one sentence at a time and are asked either an indirect (“What are your thoughts regarding your understanding of the sentence in the context of the passage?”) or direct (e.g., why X?) question after reading each pre-selected target sentence. The answers to the indirect questions are analyzed for the extent to which they contain words associated with comprehension processes. The answers to direct questions are coded for the number of content words in common with an ideal answer, which is intended to be an assessment of emerging comprehension. In the study, the RSAT approach was shown to predict measures of comprehension comparably to standardized tests. The RSAT variables were also shown to correlate with human ratings. The results of this study constitute a “proof of concept” and demonstrate that it is possible to develop a comprehension skills assessment tool that assesses both comprehension and comprehension strategies. PMID:23901332
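
    The direct-question scoring RSAT uses -- content-word overlap with an ideal answer -- reduces to a short set-intersection computation; in the sketch below the stop-word list and example sentences are invented, and RSAT's actual coding scheme is more elaborate.

      # Content-word overlap between a reader's answer and an ideal answer.
      STOP_WORDS = {"the", "a", "an", "of", "to", "and", "is", "was", "it", "in"}

      def content_words(text):
          return {w.strip(".,!?").lower() for w in text.split()} - STOP_WORDS

      def overlap_score(answer, ideal):
          ideal_words = content_words(ideal)
          return len(content_words(answer) & ideal_words) / len(ideal_words)

      ideal = "The pressure dropped because the valve was left open."
      answer = "Pressure fell since someone left the valve open."
      print(f"{overlap_score(answer, ideal):.2f}")   # 4 of 6 content words -> 0.67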

  14. MannDB: A microbial annotation database for protein characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, C; Lam, M; Smith, J

    2006-05-19

    MannDB was created to meet a need for rapid, comprehensive automated protein sequence analyses to support selection of proteins suitable as targets for driving the development of reagents for pathogen or protein toxin detection. Because a large number of open-source tools were needed, it was necessary to produce a software system to scale the computations for whole-proteome analysis. Thus, we built a fully automated system for executing software tools and for storage, integration, and display of automated protein sequence analysis and annotation data. MannDB is a relational database that organizes data resulting from fully automated, high-throughput protein-sequence analyses using open-source tools. Types of analyses provided include predictions of cleavage, chemical properties, classification, features, functional assignment, post-translational modifications, motifs, antigenicity, and secondary structure. Proteomes (lists of hypothetical and known proteins) are downloaded and parsed from Genbank and then inserted into MannDB, and annotations from SwissProt are downloaded when identifiers are found in the Genbank entry or when identical sequences are identified. Currently 36 open-source tools are run against MannDB protein sequences either on local systems or by means of batch submission to external servers. In addition, BLAST against protein entries in MvirDB, our database of microbial virulence factors, is performed. A web client browser enables viewing of computational results and downloaded annotations, and a query tool enables structured and free-text search capabilities. When available, links to external databases, including MvirDB, are provided. MannDB contains whole-proteome analyses for at least one representative organism from each category of biological threat organism listed by APHIS, CDC, HHS, NIAID, USDA, USFDA, and WHO. MannDB comprises a large number of genomes and comprehensive protein sequence analyses representing organisms listed as high-priority agents on the websites of several governmental organizations concerned with bio-terrorism. MannDB provides the user with a BLAST interface for comparison of native and non-native sequences and a query tool for conveniently selecting proteins of interest. In addition, the user has access to a web-based browser that compiles comprehensive and extensive reports.

  15. Development and validation of a Malawian version of the primary care assessment tool.

    PubMed

    Dullie, Luckson; Meland, Eivind; Hetlevik, Øystein; Mildestvedt, Thomas; Gjesdal, Sturla

    2018-05-16

    Malawi does not have validated tools for assessing primary care performance from patients' experience. The aim of this study was to develop a Malawian version of the Primary Care Assessment Tool (PCAT-Mw) and to evaluate its reliability and validity in the assessment of the core primary care dimensions from adult patients' perspective in Malawi. A team of experts assessed the South African version of the primary care assessment tool (ZA-PCAT) for face and content validity. The adapted questionnaire underwent forward and backward translation and a pilot study. The tool was then used in an interviewer-administered cross-sectional survey in Neno district, Malawi, to test validity and reliability. Exploratory factor analysis was performed on a random half of the sample to evaluate internal consistency, reliability and construct validity of items and scales. The identified constructs were then tested with confirmatory factor analysis. Likert scale assumption testing and descriptive statistics were done on the final factor structure. The PCAT-Mw was further tested for intra-rater and inter-rater reliability. From the responses of 631 patients, a 29-item PCAT-Mw was constructed comprising seven multi-item scales, representing five primary care dimensions (first contact, continuity, comprehensiveness, coordination and community orientation). All seven scales achieved good internal consistency, item-total correlations and construct validity. Cronbach's alpha coefficient ranged from 0.66 to 0.91. A satisfactory goodness-of-fit model was achieved (GFI = 0.90, CFI = 0.91, RMSEA = 0.05, PCLOSE = 0.65). The full range of possible scores was observed for all scales. Scaling assumption tests were met for all except the two comprehensiveness scales. Intra-class correlation coefficient (ICC) was 0.90 (n = 44, 95% CI 0.81-0.94, p < 0.001) for intra-rater reliability and 0.84 (n = 42, 95% CI 0.71-0.96, p < 0.001) for inter-rater reliability. Comprehensive metric analyses supported the reliability and validity of PCAT-Mw in assessing the core concepts of primary care from adult patients' experience. This tool could be used for health service research in primary care in Malawi.
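
    For reference, the Cronbach's alpha coefficients reported above follow from the standard formula α = k/(k−1) · (1 − Σσᵢ²/σ²_total), where k is the number of items. A minimal sketch in Python (the score matrix is hypothetical, not study data):

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, k_items) matrix of item scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_variances = items.var(axis=0, ddof=1).sum()
            total_variance = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_variances / total_variance)

        # Hypothetical 4-item scale scored by 5 respondents on a 1-4 Likert scale.
        scores = np.array([[3, 4, 3, 4],
                           [2, 2, 3, 2],
                           [4, 4, 4, 3],
                           [1, 2, 1, 2],
                           [3, 3, 4, 4]])
        print(round(cronbach_alpha(scores), 2))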

  16. Subsonic Wing Optimization for Handling Qualities Using ACSYNT

    NASA Technical Reports Server (NTRS)

    Soban, Danielle Suzanne

    1996-01-01

    The capability to accurately and rapidly predict aircraft stability derivatives using one comprehensive analysis tool has been created. The PREDAVOR tool has the following capabilities: rapid estimation of stability derivatives using a vortex lattice method, calculation of a longitudinal handling qualities metric, built-in methodology to optimize a given aircraft configuration for longitudinal handling qualities, and an intuitive graphical interface. The PREDAVOR tool may be applied to both subsonic and supersonic designs, as well as conventional and unconventional, symmetric and asymmetric configurations. The workstation-based tool takes as input a three-dimensional model of the configuration generated with a computer-aided design (CAD) package. The PREDAVOR tool was applied to a Lear Jet Model 23 and the North American XB-70 Valkyrie.

  17. E-Learning for Depth in the Semantic Web

    ERIC Educational Resources Information Center

    Shafrir, Uri; Etkind, Masha

    2006-01-01

    In this paper, we describe concept parsing algorithms, a novel semantic analysis methodology at the core of a new pedagogy that focuses learners' attention on deep comprehension of the conceptual content of learned material. Two new e-learning tools are described in some detail: interactive concept discovery learning and meaning equivalence…

  18. AUTOMATED GIS WATERSHED ANALYSIS TOOLS FOR RUSLE/SEDMOD SOIL EROSION AND SEDIMENTATION MODELING

    EPA Science Inventory

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed using a suite of automated Arc Macro Language (AML) scripts and a pair of processing-intensive ANSI C++ executable programs operating on an ESRI ArcGIS 8.x Workstation platform...
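
    RUSLE itself is a simple multiplicative model, A = R·K·LS·C·P (annual soil loss as the product of rainfall erosivity, soil erodibility, slope length-steepness, cover-management, and support-practice factors); the AML/C++ tool chain above automates deriving those factors from GIS layers. A minimal sketch in Python with hypothetical factor values:

        def rusle_soil_loss(R, K, LS, C, P):
            """Annual soil loss A (e.g., t/ha/yr) as the RUSLE factor product."""
            return R * K * LS * C * P

        # Hypothetical factor values for a single raster cell.
        print(rusle_soil_loss(R=1200.0, K=0.03, LS=1.6, C=0.25, P=1.0))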

  19. Data Mining and Knowledge Management in Higher Education -Potential Applications.

    ERIC Educational Resources Information Center

    Luan, Jing

    This paper introduces a new decision support tool, data mining, in the context of knowledge management. The most striking features of data mining techniques are clustering and prediction. The clustering aspect of data mining offers comprehensive characteristics analysis of students, while the predicting function estimates the likelihood for a…

  20. MATREX: A Unifying Modeling and Simulation Architecture for Live-Virtual-Constructive Applications

    DTIC Science & Technology

    2007-05-23

    Extract from briefing charts; the recoverable content is an acronym glossary: CMS2 – Comprehensive Munitions & Sensor Server; CSAT – C4ISR Static Analysis Tool; C4ISR – Command & Control, Communications, Computers…

  1. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip-based high-throughput technologies. Software tools for the joint analysis of such high-dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data.

  2. Simulation-based comprehensive benchmarking of RNA-seq aligners

    PubMed Central

    Baruzzo, Giacomo; Hayer, Katharina E; Kim, Eun Ji; Di Camillo, Barbara; FitzGerald, Garret A; Grant, Gregory R

    2018-01-01

    Alignment is the first step in most RNA-seq analysis pipelines, and the accuracy of downstream analyses depends heavily on it. Unlike most steps in the pipeline, alignment is particularly amenable to benchmarking with simulated data. We performed a comprehensive benchmarking of 14 common splice-aware aligners for base, read, and exon junction-level accuracy and compared default with optimized parameters. We found that performance varied by genome complexity, and accuracy and popularity were poorly correlated. The most widely cited tool underperforms for most metrics, particularly when using default settings. PMID:27941783
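
    The accuracy metrics used in such benchmarks are typically precision and recall over simulated reads whose true positions are known. A minimal sketch in Python (the tallies are hypothetical, not from the study):

        def precision_recall(true_positives, false_positives, false_negatives):
            """Read-level accuracy: TP = reads aligned to their true locus."""
            precision = true_positives / (true_positives + false_positives)
            recall = true_positives / (true_positives + false_negatives)
            return precision, recall

        # Hypothetical tallies for one aligner on one simulated dataset.
        p, r = precision_recall(true_positives=9_400_000,
                                false_positives=250_000,
                                false_negatives=350_000)
        print(f"precision={p:.3f} recall={r:.3f}")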

  3. Falls screening and assessment tools used in acute mental health settings: a review of policies in England and Wales

    PubMed Central

    Narayanan, V.; Dickinson, A.; Victor, C.; Griffiths, C.; Humphrey, D.

    2016-01-01

    Objectives There is an urgent need to improve the care of older people at risk of falls or who experience falls in mental health settings. The aims of this study were to evaluate the individual falls risk assessment tools adopted by National Health Service (NHS) mental health trusts in England and healthcare boards in Wales, to evaluate the comprehensiveness of these tools and to review their predictive validity. Methods All NHS mental health trusts in England (n = 56) and healthcare boards in Wales (n = 6) were invited to supply their falls policies and other relevant documentation (e.g. local falls audits). In order to check the comprehensiveness of tools listed in policy documents, the risk variables of the tools adopted by the mental health trusts’ policies were compared with the 2004 National Institute for Health and Care Excellence (NICE) falls prevention guidelines. A comprehensive analytical literature review was undertaken to evaluate the predictive validity of the tools used in these settings. Results Falls policies were obtained from 46 mental health trusts. Thirty-five policies met the study inclusion criteria and were included in the analysis. The main falls assessment tools used were the St. Thomas’ Risk Assessment Tool in Falling Elderly Inpatients (STRATIFY), Falls Risk Assessment Scale for the Elderly, Morse Falls Scale (MFS) and Falls Risk Assessment Tool (FRAT). On detailed examination, a number of different versions of the FRAT were evident; validated tools had inconsistent predictive validity and none of them had been validated in mental health settings. Conclusions Falls risk assessment is the most commonly used component of risk prevention strategies, but most policies included unvalidated tools, and even well-validated tools such as STRATIFY and the MFS are reported to have inconsistent predictive accuracy. This raises questions about operational usefulness, as none of these tools have been tested in acute mental health settings. The falls risk assessment tools from only four mental health trusts met all the recommendations of the NICE falls guidelines on multifactorial assessment for prevention of falls. The recent NICE (2013) guidance states that tools predicting risk using numeric scales should no longer be used; instead, multifactorial risk assessment and interventions tailored to patient needs are recommended. Trusts will need to update their policies in response to this guidance. PMID:26395210

  4. Data Standards for Flow Cytometry

    PubMed Central

    SPIDLEN, JOSEF; GENTLEMAN, ROBERT C.; HAALAND, PERRY D.; LANGILLE, MORGAN; MEUR, NOLWENN LE; OCHS, MICHAEL F.; SCHMITT, CHARLES; SMITH, CLAYTON A.; TREISTER, ADAM S.; BRINKMAN, RYAN R.

    2009-01-01

    Flow cytometry (FCM) is an analytical tool widely used for cancer and HIV/AIDS research and treatment, stem cell manipulation, and detection of microorganisms in environmental samples. Current data standards do not capture the full scope of FCM experiments and there is a demand for software tools that can assist in the exploration and analysis of large FCM datasets. We are implementing a standardized approach to capturing, analyzing, and disseminating FCM data that will facilitate both more complex analyses and analysis of datasets that could not previously be efficiently studied. Initial work has focused on developing a community-based guideline for recording and reporting the details of FCM experiments. Open source software tools that implement this standard are being created, with an emphasis on facilitating reproducible and extensible data analyses. In addition, tools for electronic collaboration will support integrated access to and comprehension of experiments, empowering users to collaborate on FCM analyses. This coordinated, joint development of bioinformatics standards and software tools for FCM data analysis has the potential to greatly facilitate both basic and clinical research—impacting a notably diverse range of medical and environmental research areas. PMID:16901228

  5. The Effects of Literacy Support Tools on the Comprehension of Informational e-Books and Print-Based Text

    ERIC Educational Resources Information Center

    Herman, Heather A.

    2017-01-01

    This mixed methods research explores the effects of literacy support tools on comprehension strategies when 14 first-grade students read informational e-books and print-based text. This study focused on the following comprehension strategies: annotating connections, annotating "I wonders," and looking back in the text.…

  6. Enrichr: a comprehensive gene set enrichment analysis web server 2016 update

    PubMed Central

    Kuleshov, Maxim V.; Jones, Matthew R.; Rouillard, Andrew D.; Fernandez, Nicolas F.; Duan, Qiaonan; Wang, Zichen; Koplev, Simon; Jenkins, Sherry L.; Jagodnik, Kathleen M.; Lachmann, Alexander; McDermott, Michael G.; Monteiro, Caroline D.; Gundersen, Gregory W.; Ma'ayan, Avi

    2016-01-01

    Enrichment analysis is a popular method for analyzing gene sets generated by genome-wide experiments. Here we present a significant update to one of the tools in this domain called Enrichr. Enrichr currently contains a large collection of diverse gene set libraries available for analysis and download. In total, Enrichr currently contains 180 184 annotated gene sets from 102 gene set libraries. New features have been added to Enrichr including the ability to submit fuzzy sets, upload BED files, improved application programming interface and visualization of the results as clustergrams. Overall, Enrichr is a comprehensive resource for curated gene sets and a search engine that accumulates biological knowledge for further biological discoveries. Enrichr is freely available at: http://amp.pharm.mssm.edu/Enrichr. PMID:27141961
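
    Enrichr also exposes a REST API at the URL above; a minimal sketch using Python's requests library (the addList/enrich endpoints, userListId handle, and backgroundType parameter follow Enrichr's public API documentation as commonly described and should be verified against the current docs; the gene list and library name are hypothetical):

        import requests

        ENRICHR = "http://amp.pharm.mssm.edu/Enrichr"
        genes = ["TP53", "BRCA1", "EGFR", "MYC"]  # hypothetical input list

        # Upload the gene list; the response carries a userListId handle.
        r = requests.post(f"{ENRICHR}/addList",
                          files={"list": (None, "\n".join(genes)),
                                 "description": (None, "demo list")})
        r.raise_for_status()
        list_id = r.json()["userListId"]

        # Query enrichment against one gene set library.
        r = requests.get(f"{ENRICHR}/enrich",
                         params={"userListId": list_id,
                                 "backgroundType": "KEGG_2016"})
        r.raise_for_status()
        for row in r.json()["KEGG_2016"][:5]:
            print(row[1], row[2])  # term name, p-value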

  7. Solid Modeling Aerospace Research Tool (SMART) user's guide, version 2.0

    NASA Technical Reports Server (NTRS)

    Mcmillin, Mark L.; Spangler, Jan L.; Dahmen, Stephen M.; Rehder, John J.

    1993-01-01

    The Solid Modeling Aerospace Research Tool (SMART) software package is used in the conceptual design of aerospace vehicles. It provides a highly interactive and dynamic capability for generating geometries with Bezier cubic patches. Features include automatic generation of commonly used aerospace constructs (e.g., wings and multilobed tanks); cross-section skinning; wireframe and shaded presentation; area, volume, inertia, and center-of-gravity calculations; and interfaces to various aerodynamic and structural analysis programs. A comprehensive description of SMART and how to use it is provided.

  8. R-SWAT-FME user's guide

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shu-Guang

    2012-01-01

    The R programming language-Soil and Water Assessment Tool-Flexible Modeling Environment (R-SWAT-FME) (Wu and Liu, 2012) is a comprehensive modeling framework that adopts an R package, Flexible Modeling Environment (FME) (Soetaert and Petzoldt, 2010), for the Soil and Water Assessment Tool (SWAT) model (Arnold and others, 1998; Neitsch and others, 2005). This framework provides the functionalities of parameter identifiability, model calibration, and sensitivity and uncertainty analysis with instant visualization. This user's guide shows how to apply this framework for a customized SWAT project.

  9. Critical thinking as an educational outcome: an evaluation of current tools of measurement.

    PubMed

    Adams, M H; Whitlow, J F; Stover, L M; Johnson, K W

    1996-01-01

    Critical thinking, an outcome criterion of the National League for Nursing and the Council of Baccalaureate and Higher Degree Programs, is an abstract skill difficult to measure. The authors provide a comprehensive review of four instruments designed to measure critical thinking and summarize research in which the tools were used. Analysis of this information will empower nursing faculty members to select a critical-thinking instrument that is individualized to the needs of their respective nursing programs.

  10. Comprehensive Analysis of DNA Methylation Data with RnBeads

    PubMed Central

    Walter, Jörn; Lengauer, Thomas; Bock, Christoph

    2014-01-01

    RnBeads is a software tool for large-scale analysis and interpretation of DNA methylation data, providing a user-friendly analysis workflow that yields detailed hypertext reports (http://rnbeads.mpi-inf.mpg.de). Supported assays include whole genome bisulfite sequencing, reduced representation bisulfite sequencing, Infinium microarrays, and any other protocol that produces high-resolution DNA methylation data. Important applications of RnBeads include the analysis of epigenome-wide association studies and epigenetic biomarker discovery in cancer cohorts. PMID:25262207

  11. Performance and Sizing Tool for Quadrotor Biplane Tailsitter UAS

    NASA Astrophysics Data System (ADS)

    Strom, Eric

    The Quadrotor-Biplane-Tailsitter (QBT) configuration is the basis for a mechanically simple rotorcraft capable of both long-range, high-speed cruise and hovering flight. This work presents the development and validation of a set of preliminary design tools built specifically for this aircraft to enable its further development, including a QBT weight model, a preliminary sizing framework, and vehicle analysis tools. The preliminary sizing tool presented here shows the advantage afforded by QBT designs in missions with aggressive cruise requirements, such as offshore wind turbine inspection, wherein transition from a quadcopter configuration to a QBT allows for a 5:1 trade of battery weight for wing weight. A 3D, unsteady panel method utilizing a nonlinear implementation of the Kutta-Joukowsky condition is also presented as a means of computing aerodynamic interference effects and, through the implementation of rotor, body, and wing geometry generators, is prepared for coupling with a comprehensive rotor analysis package.

  12. Extraction and Analysis of Display Data

    NASA Technical Reports Server (NTRS)

    Land, Chris; Moye, Kathryn

    2008-01-01

    The Display Audit Suite is an integrated package of software tools that partly automates the detection of Portable Computer System (PCS) display errors. [PCS is a laptop computer used onboard the International Space Station (ISS).] The need for automation stems from the large quantity of PCS displays (6,000+, with 1,000,000+ lines of command and telemetry data). The Display Audit Suite includes data-extraction tools, automatic error-detection tools, and database tools for generating analysis spreadsheets. These spreadsheets allow engineers to more easily identify many different kinds of possible errors. The Suite supports over 40 independent analyses and complements formal testing by being comprehensive (all displays can be checked) and by revealing errors that are difficult to detect via test. In addition, the Suite can be run early in the development cycle to find and correct errors in advance of testing.

  13. A computer-aided approach to nonlinear control synthesis

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Anthony, Tobin

    1988-01-01

    The major objective of this project is to develop a computer-aided approach to nonlinear stability analysis and nonlinear control system design. This goal is to be attained by refining the describing function method as a synthesis tool for nonlinear control design. The interim report outlines the study's approach to meeting these goals, including an introduction to the INteractive Controls Analysis (INCA) program, which was instrumental in meeting the study objectives. A single-input describing function (SIDF) design methodology was developed in this study; coupled with the software constructed in this study, the results of this project provide a comprehensive tool for the design and integration of nonlinear control systems.
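
    As background on the SIDF methodology (a standard textbook illustration, not drawn from the report itself): for an ideal relay with output levels ±M driven by A sin(ωt), the describing function and the harmonic-balance condition used to predict limit cycles are

        \[
          N(A) = \frac{4M}{\pi A},
          \qquad
          1 + N(A)\,G(j\omega) = 0,
        \]

    where G(jω) is the frequency response of the linear plant; solutions (A, ω) of the harmonic-balance equation predict the amplitude and frequency of a limit cycle.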

  14. Evaluation of the sustainability of contrasted pig farming systems: economy.

    PubMed

    Ilari-Antoine, E; Bonneau, M; Klauke, T N; Gonzàlez, J; Dourmad, J Y; De Greef, K; Houwers, H W J; Fabrega, E; Zimmer, C; Hviid, M; Van der Oever, B; Edwards, S A

    2014-12-01

    The aim of this paper is to present an efficient tool for evaluating the economic component of the sustainability of pig farming systems. The selected tool, IDEA, was tested on a sample of farms from 15 contrasted systems in Europe. A statistical analysis was carried out to check the capacity of the indicators to illustrate the variability of the population and to identify which of these indicators contributed the most towards it. The scores obtained for the farms were consistent with the reality of pig production; the variable distributions showed substantial variability across the sample. Principal component analysis and cluster analysis separated the sample into five subgroups, in which the six main indicators differed significantly, underlining the robustness of the tool. The IDEA method proved easy to understand, requires few initial variables, and has an efficient benchmarking system; all six indicators contributed to fully describing a varied and contrasted population.

  15. BFPTool: a software tool for analysis of Biomembrane Force Probe experiments.

    PubMed

    Šmít, Daniel; Fouquet, Coralie; Doulazmi, Mohamed; Pincet, Frédéric; Trembleau, Alain; Zapotocky, Martin

    2017-01-01

    The Biomembrane Force Probe is an accessible experimental technique commonly used for single-molecule force spectroscopy and experiments on biological interfaces. The technique operates in the range of forces from 0.1 pN to 1000 pN. Experiments are typically repeated many times, conditions are often not optimal, and the captured video can be unstable and lose focus; this makes efficient analysis challenging, while out-of-the-box non-proprietary solutions are not freely available. This dedicated tool was developed to integrate and simplify the image processing and analysis of videomicroscopy recordings from BFP experiments. A novel processing feature, allowing tracking of the pipette, was incorporated to address a limitation of preceding methods. Emphasis was placed on versatility and a comprehensible graphical user interface. An integrated analytical tool was implemented to provide a faster, simpler and more convenient way to process and analyse BFP experiments.

  16. metAlignID: a high-throughput software tool set for automated detection of trace level contaminants in comprehensive LECO two-dimensional gas chromatography time-of-flight mass spectrometry data.

    PubMed

    Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J

    2012-11-09

    A new alternative data processing tool set, metAlignID, has been developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software runs multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems and can thus process multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries) (NIST = National Institute of Standards and Technology) and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batchwise. The overall time needed for conversion together with processing and searching of 30 raw data sets for 560 compounds is routinely within an hour. The screening performance is evaluated for detection of pesticides and contaminants in raw data obtained after analysis of soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than the standard processing and also adds a quantitative estimate. The software tool set in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry shows great potential as a highly automated and fast multi-residue instrumental screening method. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Estimating the Diets of Animals Using Stable Isotopes and a Comprehensive Bayesian Mixing Model

    PubMed Central

    Hopkins, John B.; Ferguson, Jake M.

    2012-01-01

    Using stable isotope mixing models (SIMMs) as a tool to investigate the foraging ecology of animals is gaining popularity among researchers. As a result, statistical methods are rapidly evolving and numerous models have been produced to estimate the diets of animals, each with its benefits and limitations. Deciding which SIMM to use is contingent on factors such as the consumer of interest, its food sources, sample size, the familiarity a user has with a particular framework for statistical analysis, and the level of inference the researcher desires to make (e.g., population- or individual-level). In this paper, we provide a review of commonly used SIMMs and describe a comprehensive SIMM that includes all features commonly used in SIMM analysis plus two new features. We used data collected in Yosemite National Park to demonstrate IsotopeR's ability to estimate dietary parameters. We then examined the importance of each feature in the model and compared our results to inferences from commonly used SIMMs. IsotopeR's user interface (in R) will provide researchers a user-friendly tool for SIMM analysis. The model is also applicable for use in paleontology, archaeology, and forensic studies, as well as for estimating pollution inputs. PMID:22235246
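
    Underlying all SIMMs is the linear mixing identity: for one isotope and two sources, δ_mix = p·δ₁ + (1−p)·δ₂, so the proportion of source 1 is p = (δ_mix − δ₂)/(δ₁ − δ₂). IsotopeR places a full Bayesian model over this identity; a minimal deterministic sketch in Python (the δ values are hypothetical):

        def two_source_proportion(d_mix, d_source1, d_source2):
            """Solve d_mix = p*d1 + (1-p)*d2 for the proportion p of source 1."""
            return (d_mix - d_source2) / (d_source1 - d_source2)

        # Hypothetical d13C values (permil): consumer tissue and two food sources.
        p = two_source_proportion(d_mix=-22.0, d_source1=-26.0, d_source2=-14.0)
        print(f"proportion of source 1: {p:.2f}")  # 0.67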

  18. The most common technologies and tools for functional genome analysis.

    PubMed

    Gasperskaja, Evelina; Kučinskas, Vaidutis

    2017-01-01

    Since the sequence of the human genome is complete, the main issue is how to understand the information written in the DNA sequence. Despite the numerous genome-wide studies that have already been performed, the challenge of determining the function of genes, gene products, and their interactions remains open. As changes in the human genome are highly likely to cause pathological conditions, functional analysis is vitally important for human health. For many years a variety of technologies and tools have been used in functional genome analysis. However, only in the past decade has there been rapid, revolutionary progress and improvement in high-throughput methods, ranging from traditional real-time polymerase chain reaction to more complex systems, such as next-generation sequencing or mass spectrometry. Furthermore, not only laboratory investigation but also accurate bioinformatic analysis is required for reliable scientific results. These methods provide an opportunity for accurate and comprehensive functional analysis spanning various fields of study: genomics, epigenomics, proteomics, and interactomics. This is essential for filling the gaps in our knowledge about dynamic biological processes at both the cellular and organismal level. However, each method has both advantages and limitations that should be taken into account before choosing the right method for particular research, in order to ensure a successful study. For this reason, the present review paper aims to describe the most frequent and widely used methods for comprehensive functional analysis.

  19. Irena: tool suite for modeling and analysis of small-angle scattering.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilavsky, J.; Jemian, P.

    2009-04-01

    Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
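
    Of the fits listed, the Guinier approximation is the simplest: I(q) ≈ I₀ exp(−q²R_g²/3) at low q, so ln I is linear in q² with slope −R_g²/3. A minimal sketch in Python on synthetic data (Irena itself runs inside Igor Pro; this only illustrates the underlying fit):

        import numpy as np

        # Synthetic low-q SAS curve for a particle with Rg = 30 (units of 1/q).
        rg_true, i0 = 30.0, 1000.0
        q = np.linspace(0.005, 0.03, 50)
        intensity = i0 * np.exp(-(q * rg_true) ** 2 / 3.0)

        # Guinier fit: ln I = ln I0 - (Rg^2 / 3) q^2, a straight line in q^2.
        slope, intercept = np.polyfit(q ** 2, np.log(intensity), 1)
        rg_fit = np.sqrt(-3.0 * slope)
        print(f"Rg = {rg_fit:.1f}, I0 = {np.exp(intercept):.0f}")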

  20. Strategic Planning and Energy Options Analysis for the Fort Peck Assiniboine and Sioux Tribes. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, Jim S

    2007-03-31

    Strategic Planning and Energy Options Analysis provides the Fort Peck Tribes with a tool to build analytical capabilities and local capacity to extract the natural and energy resource potential for the benefit of the tribal community. Each resource is identified irrespective of its development potential and is treated in absolute terms, resulting in a comprehensive resource assessment for Tribal energy planning.

  1. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.

  2. MIPS: a database for genomes and protein sequences

    PubMed Central

    Mewes, H. W.; Frishman, D.; Güldener, U.; Mannhaupt, G.; Mayer, K.; Mokrejs, M.; Morgenstern, B.; Münsterkötter, M.; Rudd, S.; Weil, B.

    2002-01-01

    The Munich Information Center for Protein Sequences (MIPS-GSF, Neuherberg, Germany) continues to provide genome-related information in a systematic way. MIPS supports both national and European sequencing and functional analysis projects, develops and maintains automatically generated and manually annotated genome-specific databases, develops systematic classification schemes for the functional annotation of protein sequences, and provides tools for the comprehensive analysis of protein sequences. This report updates the information on the yeast genome (CYGD), the Neurospora crassa genome (MNCDB), the databases for the comprehensive set of genomes (PEDANT genomes), the database of annotated human EST clusters (HIB), the database of complete cDNAs from the DHGP (German Human Genome Project), as well as the project-specific databases for the GABI (Genome Analysis in Plants) and HNB (Helmholtz–Netzwerk Bioinformatik) networks. The Arabidopsis thaliana database (MATDB), the database of mitochondrial proteins (MITOP) and our contribution to the PIR International Protein Sequence Database have been described elsewhere [Schoof et al. (2002) Nucleic Acids Res., 30, 91–93; Scharfe et al. (2000) Nucleic Acids Res., 28, 155–158; Barker et al. (2001) Nucleic Acids Res., 29, 29–32]. All databases described, the protein analysis tools provided and the detailed descriptions of our projects can be accessed through the MIPS World Wide Web server (http://mips.gsf.de). PMID:11752246

  3. MIPS: a database for genomes and protein sequences.

    PubMed

    Mewes, H W; Frishman, D; Güldener, U; Mannhaupt, G; Mayer, K; Mokrejs, M; Morgenstern, B; Münsterkötter, M; Rudd, S; Weil, B

    2002-01-01

    The Munich Information Center for Protein Sequences (MIPS-GSF, Neuherberg, Germany) continues to provide genome-related information in a systematic way. MIPS supports both national and European sequencing and functional analysis projects, develops and maintains automatically generated and manually annotated genome-specific databases, develops systematic classification schemes for the functional annotation of protein sequences, and provides tools for the comprehensive analysis of protein sequences. This report updates the information on the yeast genome (CYGD), the Neurospora crassa genome (MNCDB), the databases for the comprehensive set of genomes (PEDANT genomes), the database of annotated human EST clusters (HIB), the database of complete cDNAs from the DHGP (German Human Genome Project), as well as the project-specific databases for the GABI (Genome Analysis in Plants) and HNB (Helmholtz-Netzwerk Bioinformatik) networks. The Arabidopsis thaliana database (MATDB), the database of mitochondrial proteins (MITOP) and our contribution to the PIR International Protein Sequence Database have been described elsewhere [Schoof et al. (2002) Nucleic Acids Res., 30, 91-93; Scharfe et al. (2000) Nucleic Acids Res., 28, 155-158; Barker et al. (2001) Nucleic Acids Res., 29, 29-32]. All databases described, the protein analysis tools provided and the detailed descriptions of our projects can be accessed through the MIPS World Wide Web server (http://mips.gsf.de).

  4. SECIMTools: a suite of metabolomics data analysis tools.

    PubMed

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
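
    As an illustration of the variable selection step, LASSO drives the coefficients of uninformative features to exactly zero. A minimal sketch with scikit-learn on synthetic data (SECIMTools wraps its own implementations for Galaxy; the feature counts and regularization strength here are hypothetical):

        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 50))          # 60 samples, 50 metabolite features
        y = 3 * X[:, 0] - 2 * X[:, 7] + rng.normal(scale=0.5, size=60)

        model = Lasso(alpha=0.1).fit(X, y)
        selected = np.flatnonzero(model.coef_)  # features with nonzero coefficients
        print(selected)                         # expect features 0 and 7 to dominate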

  5. Comprehensive benchmarking and ensemble approaches for metagenomic classifiers.

    PubMed

    McIntyre, Alexa B R; Ounit, Rachid; Afshinnekoo, Ebrahim; Prill, Robert J; Hénaff, Elizabeth; Alexander, Noah; Minot, Samuel S; Danko, David; Foox, Jonathan; Ahsanuddin, Sofia; Tighe, Scott; Hasan, Nur A; Subramanian, Poorani; Moffat, Kelly; Levy, Shawn; Lonardi, Stefano; Greenfield, Nick; Colwell, Rita R; Rosen, Gail L; Mason, Christopher E

    2017-09-21

    One of the main challenges in metagenomics is the identification of microorganisms in clinical and environmental samples. While an extensive and heterogeneous set of computational tools is available to classify microorganisms using whole-genome shotgun sequencing data, comprehensive comparisons of these methods are limited. In this study, we use the largest-to-date set of laboratory-generated and simulated controls across 846 species to evaluate the performance of 11 metagenomic classifiers. Tools were characterized on the basis of their ability to identify taxa at the genus, species, and strain levels, quantify relative abundances of taxa, and classify individual reads to the species level. Strikingly, the number of species identified by the 11 tools can differ by over three orders of magnitude on the same datasets. Various strategies can ameliorate taxonomic misclassification, including abundance filtering, ensemble approaches, and tool intersection. Nevertheless, these strategies were often insufficient to completely eliminate false positives from environmental samples, a particular concern where medically relevant species are involved. Overall, pairing tools with different classification strategies (k-mer, alignment, marker) can combine their respective advantages. This study provides positive and negative controls, titrated standards, and a guide for selecting tools for metagenomic analyses by comparing ranges of precision, accuracy, and recall. We show that proper experimental design and analysis parameters can reduce false positives, provide greater resolution of species in complex metagenomic samples, and improve the interpretation of results.
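
    The tool-intersection strategy mentioned above is straightforward: retain only taxa called by at least m of n classifiers. A minimal sketch in Python (tool names and calls are hypothetical):

        from collections import Counter

        calls = {
            "toolA": {"E. coli", "S. aureus", "B. anthracis"},
            "toolB": {"E. coli", "S. aureus", "P. aeruginosa"},
            "toolC": {"E. coli", "B. anthracis", "S. aureus"},
        }

        votes = Counter(taxon for species in calls.values() for taxon in species)
        consensus = {taxon for taxon, n in votes.items() if n >= 2}
        print(sorted(consensus))  # taxa supported by at least two classifiers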

  6. Benefits and applications of interdisciplinary digital tools for environmental meta-reviews and analyses

    NASA Astrophysics Data System (ADS)

    Grubert, Emily; Siders, Anne

    2016-09-01

    Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
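
    A minimal sketch of the medium-resolution workflow (topic modeling over abstracts) using scikit-learn; the four-document corpus here is a stand-in for the ~8000 abstracts, and the case studies above used their own tooling:

        from sklearn.decomposition import LatentDirichletAllocation
        from sklearn.feature_extraction.text import CountVectorizer

        abstracts = [
            "wind turbine offshore energy policy cost",
            "species habitat conservation field survey",
            "energy grid storage policy incentives",
            "habitat survey biodiversity field sampling",
        ]  # stand-in corpus

        vec = CountVectorizer(stop_words="english")
        counts = vec.fit_transform(abstracts)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

        vocab = vec.get_feature_names_out()
        for k, weights in enumerate(lda.components_):
            top = vocab[weights.argsort()[::-1][:4]]
            print(f"topic {k}:", ", ".join(top))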

  7. A comprehensive scoring system to measure healthy community design in land use plans and regulations.

    PubMed

    Maiden, Kristin M; Kaplan, Marina; Walling, Lee Ann; Miller, Patricia P; Crist, Gina

    2017-02-01

    Comprehensive land use plans and their corresponding regulations play a role in determining the nature of the built environment and community design, which are factors that influence population health and health disparities. To determine the level in which a plan addresses healthy living and active design, there is a need for a systematic, reliable and valid method of analyzing and scoring health-related content in plans and regulations. This paper describes the development and validation of a scoring tool designed to measure the strength and comprehensiveness of health-related content found in land use plans and the corresponding regulations. The measures are scored based on the presence of a specific item and the specificity and action-orientation of language. To establish reliability and validity, 42 land use plans and regulations from across the United States were scored January-April 2016. Results of the psychometric analysis indicate the scorecard is a reliable scoring tool for land use plans and regulations related to healthy living and active design. Intraclass correlation coefficients (ICC) scores showed strong inter-rater reliability for total strength and comprehensiveness. ICC scores for total implementation scores showed acceptable consistency among scorers. Cronbach's alpha values for all focus areas were acceptable. Strong content validity was measured through a committee vetting process. The development of this tool has far-reaching implications, bringing standardization of measurement to the field of land use plan assessment, and paving the way for systematic inclusion of health-related design principles, policies, and requirements in land use plans and their corresponding regulations. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. PANTHER version 11: expanded annotation data from Gene Ontology and Reactome pathways, and data analysis tool enhancements.

    PubMed

    Mi, Huaiyu; Huang, Xiaosong; Muruganujan, Anushya; Tang, Haiming; Mills, Caitlin; Kang, Diane; Thomas, Paul D

    2017-01-04

    The PANTHER database (Protein ANalysis THrough Evolutionary Relationships, http://pantherdb.org) contains comprehensive information on the evolution and function of protein-coding genes from 104 completely sequenced genomes. PANTHER software tools allow users to classify new protein sequences, and to analyze gene lists obtained from large-scale genomics experiments. In the past year, major improvements include a large expansion of classification information available in PANTHER, as well as significant enhancements to the analysis tools. Protein subfamily functional classifications have more than doubled due to progress of the Gene Ontology Phylogenetic Annotation Project. For human genes (as well as a few other organisms), PANTHER now also supports enrichment analysis using pathway classifications from the Reactome resource. The gene list enrichment tools include a new 'hierarchical view' of results, enabling users to leverage the structure of the classifications/ontologies; the tools also allow users to upload genetic variant data directly, rather than requiring prior conversion to a gene list. The updated coding single-nucleotide polymorphisms (SNP) scoring tool uses an improved algorithm. The hidden Markov model (HMM) search tools now use HMMER3, dramatically reducing search times and improving accuracy of E-value statistics. Finally, the PANTHER Tree-Attribute Viewer has been implemented in JavaScript, with new views for exploring protein sequence evolution. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. Tools for Knowledge Analysis, Synthesis, and Sharing

    NASA Astrophysics Data System (ADS)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  10. Multiple Criteria Evaluation of Quality and Optimisation of e-Learning System Components

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Dagiene, Valentina

    2010-01-01

    The main research object of the paper is the investigation and proposal of a comprehensive Learning Object Repositories (LORs) quality evaluation tool suitable for multiple-criteria decision analysis, evaluation and optimisation. Both LORs' "internal quality" and "quality in use" evaluation (decision-making) criteria are analysed in the paper.…

  11. The Textevaluator Tool: Helping Teachers and Test Developers Select Texts for Use in Instruction and Assessment

    ERIC Educational Resources Information Center

    Sheehan, Kathleen M.; Kostin, Irene; Napolitano, Diane; Flor, Michael

    2014-01-01

    This article describes TextEvaluator, a comprehensive text-analysis system designed to help teachers, textbook publishers, test developers, and literacy researchers select reading materials that are consistent with the text complexity goals outlined in the Common Core State Standards. Three particular aspects of the TextEvaluator measurement…

  12. Student Evaluation of Teaching: A Study Exploring Student Rating Instrument Free-Form Text Comments

    ERIC Educational Resources Information Center

    Stupans, Ieva; McGuren, Therese; Babey, Anna Marie

    2016-01-01

    Student rating instruments are recognised to be valid indicators of effective instruction, providing a valuable tool to improve teaching. However, free-form text comments obtained from the open-ended question component of such surveys are only infrequently analysed comprehensively. We employed an innovative, systematic approach to the analysis of…

  13. User Centered Reading Intervention for Individuals with Autism and Intellectual Disability.

    PubMed

    Yakkundi, Anita; Dillenburger, Karola; Goodman, Lizbeth; Dounavi, Katerina

    2017-01-01

    Individuals with autism and intellectual disability (ID) have complex learning needs and often have difficulty acquiring reading comprehension skills using conventional teaching tools. Evidence-based reading interventions for these learners, the use of assistive technology, and the application of behaviour analysis to develop user-centered teaching are discussed in this paper.

  14. Main differences between volatiles of sparkling and base wines accessed through comprehensive two dimensional gas chromatography with time-of-flight mass spectrometric detection and chemometric tools.

    PubMed

    Welke, Juliane Elisa; Zanus, Mauro; Lazzarotto, Marcelo; Pulgati, Fernando Hepp; Zini, Cláudia Alcaraz

    2014-12-01

    The main changes in the volatile profile of base wines and their corresponding sparkling wines produced by the traditional method were evaluated and investigated for the first time using headspace solid-phase microextraction combined with comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometric detection (GC×GC/TOFMS) and chemometric tools. Fisher ratios helped to identify the 119 analytes responsible for the main differences between base and sparkling wines, and principal component analysis explained 93.1% of the total variance related to the 78 selected compounds. It was also possible to observe five subclusters in base wines and four subclusters in sparkling wine samples through hierarchical cluster analysis, which appeared to be organised according to the regions the wines came from. Twenty of the most important volatile compounds co-eluted with other components, and separation of some of them was possible due to GC×GC/TOFMS performance. Copyright © 2014. Published by Elsevier Ltd.
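
    A Fisher ratio ranks each chromatographic feature by between-class versus within-class variance; in the two-class case (base vs. sparkling) a common form is F = (x̄₁ − x̄₂)² / (s₁² + s₂²). A minimal sketch in Python (the peak-area matrices are synthetic, not study data):

        import numpy as np

        def fisher_ratio(class1, class2):
            """Per-feature two-class Fisher ratio for (samples, features) arrays."""
            m1, m2 = class1.mean(axis=0), class2.mean(axis=0)
            v1, v2 = class1.var(axis=0, ddof=1), class2.var(axis=0, ddof=1)
            return (m1 - m2) ** 2 / (v1 + v2)

        rng = np.random.default_rng(1)
        base = rng.normal(loc=1.0, size=(10, 200))   # 10 base wines, 200 peaks
        sparkling = rng.normal(loc=1.0, size=(12, 200))
        sparkling[:, 5] += 2.0                       # one genuinely shifted analyte

        ranked = np.argsort(fisher_ratio(base, sparkling))[::-1]
        print(ranked[:5])  # feature 5 should rank first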

  15. MutAIT: an online genetic toxicology data portal and analysis tools.

    PubMed

    Avancini, Daniele; Menzies, Georgina E; Morgan, Claire; Wills, John; Johnson, George E; White, Paul A; Lewis, Paul D

    2016-05-01

    Assessment of genetic toxicity and/or carcinogenic activity is an essential element of chemical screening programs employed to protect human health. Dose-response and gene mutation data are frequently analysed by industry, academia and governmental agencies for regulatory evaluations and decision making. Over the years, a number of efforts at different institutions have led to the creation and curation of databases to house genetic toxicology data, largely with the aim of providing public access to facilitate research and regulatory assessments. This article provides a brief introduction to a new genetic toxicology portal called Mutation Analysis Informatics Tools (MutAIT) (www.mutait.org) that provides easy access to two of the largest genetic toxicology databases, the Mammalian Gene Mutation Database (MGMD) and TransgenicDB. TransgenicDB is a comprehensive collection of transgenic rodent mutation data initially compiled and collated by Health Canada. The updated MGMD contains approximately 50 000 individual mutation spectral records from the published literature. The portal not only gives access to an enormous quantity of genetic toxicology data, but also provides statistical tools for dose-response analysis and calculation of benchmark dose. Two important R packages for dose-response analysis are provided as web-distributed applications with user-friendly graphical interfaces. The 'drsmooth' package performs dose-response shape analysis and determines various point-of-departure (PoD) metrics, and the 'PROAST' package provides algorithms for dose-response modelling. The MutAIT statistical tools, which are currently being enhanced, provide users with an efficient and comprehensive platform to conduct quantitative dose-response analyses and determine PoD values that can then be used to calculate human exposure limits or margins of exposure. © The Author 2015. Published by Oxford University Press on behalf of the UK Environmental Mutagen Society. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. miRToolsGallery: a tag-based and rankable microRNA bioinformatics resources database portal

    PubMed Central

    Chen, Liang; Heikkinen, Liisa; Wang, ChangLiang; Yang, Yang; Knott, K Emily

    2018-01-01

    Hundreds of bioinformatics tools have been developed for MicroRNA (miRNA) investigations, including those used for identification, target prediction, and structure and expression profile analysis. However, finding the correct tool for a specific application requires the tedious and laborious process of locating, downloading, testing and validating the appropriate tool from a group of nearly a thousand. In order to facilitate this process, we developed a novel database portal named miRToolsGallery. We constructed the portal by manually curating >950 miRNA analysis tools and resources. In the portal, locating the appropriate tool is expedited by search, filter and ranking functions. The ranking feature is vital for quickly identifying and prioritizing the more useful tools over obscure ones. Tools are ranked via different criteria including the PageRank algorithm, date of publication, number of citations, average of votes and number of publications. miRToolsGallery provides links and data for the comprehensive collection of currently available miRNA tools with a ranking function which can be adjusted using different criteria according to specific requirements. Database URL: http://www.mirtoolsgallery.org PMID:29688355
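
    PageRank, one of the ranking criteria above, scores a tool by the importance of the tools linking to it. A minimal sketch with the networkx library (the link graph is hypothetical; the record does not describe how miRToolsGallery builds its graph):

        import networkx as nx

        # Hypothetical "links-to" graph among miRNA tools.
        g = nx.DiGraph([("toolA", "miRDeep"), ("toolB", "miRDeep"),
                        ("miRDeep", "TargetScan"), ("toolB", "TargetScan"),
                        ("TargetScan", "miRDeep")])

        scores = nx.pagerank(g, alpha=0.85)
        for tool, score in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{tool}: {score:.3f}")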

  17. Systems biology: A tool for charting the antiviral landscape.

    PubMed

    Bowen, James R; Ferris, Martin T; Suthar, Mehul S

    2016-06-15

    The host antiviral programs that are initiated following viral infection form a dynamic and complex web of responses that we have collectively termed "the antiviral landscape". Conventional approaches to studying antiviral responses have primarily used reductionist systems to assess the function of a single or a limited subset of molecules. Systems biology is a holistic approach that considers the entire system as a whole, rather than individual components or molecules. Systems biology based approaches facilitate an unbiased and comprehensive analysis of the antiviral landscape, while allowing for the discovery of emergent properties that are missed by conventional approaches. The antiviral landscape can be viewed as a hierarchy of complexity, beginning at the whole organism level and progressing downward to isolated tissues, populations of cells, and single cells. In this review, we will discuss how systems biology has been applied to better understand the antiviral landscape at each of these layers. At the organismal level, the Collaborative Cross is an invaluable genetic resource for assessing how genetic diversity influences the antiviral response. Whole-tissue and isolated bulk-cell transcriptomics serve as critical tools for the comprehensive analysis of antiviral responses at both the tissue and cellular levels of complexity. Finally, new techniques in single-cell analysis are emerging tools that will revolutionize our understanding of how individual cells within a bulk infected cell population contribute to the overall antiviral landscape. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Cross-cultural adaptation and psychometric assessment of the Chinese version of the comprehensive needs assessment tool for cancer caregivers (CNAT-C).

    PubMed

    Zhang, Yin-Ping; Zhao, Xin-Shuang; Zhang, Bei; Zhang, Lu-Lu; Ni, Chun-Ping; Hao, Nan; Shi, Chang-Bei; Porr, Caroline

    2015-07-01

    The comprehensive needs assessment tool for cancer caregivers (CNAT-C) is a systematic and comprehensive needs assessment tool for family caregivers of cancer patients. The purpose of this project was twofold: (1) to adapt the CNAT-C to Mainland China's cultural context and (2) to evaluate the psychometric properties of the newly adapted Chinese CNAT-C. Cross-cultural adaptation of the original CNAT-C was performed according to published guidelines. A pilot study was conducted in Mainland China with 30 Chinese family cancer caregivers. A subsequent validation study was conducted with 205 Chinese cancer caregivers from Mainland China. Construct validity was determined through exploratory and confirmatory factor analyses. Reliability was determined using internal consistency and test-retest reliability. The split-half coefficient for the overall Chinese CNAT-C scale was 0.77. Principal component analysis resulted in an eight-factor structure explaining 68.11% of the total variance. The comparative fit index (CFI) was 0.91 for the modified confirmatory factor analysis model. The Chi-square divided by degrees of freedom was 1.98, and the root mean squared error of approximation (RMSEA) was 0.079. With regard to known-group validity, significant differences were found in the Chinese CNAT-C scale according to various caregiver characteristics. Internal consistency was high for the Chinese CNAT-C, reaching a Cronbach α value of 0.94. Test-retest reliability was 0.85. The newly adapted Chinese CNAT-C scale possesses adequate validity, test-retest reliability, and internal consistency and therefore may be used to ascertain holistic health and support needs of cancer patients' family caregivers in Mainland China.
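
    Several of the reliability figures quoted here (Cronbach α, split-half) are mechanical to compute once item-level scores are in a matrix. A minimal sketch of Cronbach's α in Python on invented Likert responses; real CNAT-C scoring would of course use the instrument's actual items and sample.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items; alpha = k/(k-1) * (1 - sum(item var)/total var)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical responses: 8 caregivers answering 5 Likert items (1-5),
# correlated through a per-respondent baseline so alpha comes out high.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(8, 1))
data = np.clip(base + rng.integers(-1, 2, size=(8, 5)), 1, 5)
print(f"Cronbach's alpha: {cronbach_alpha(data):.2f}")
```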

  19. Enrichr: a comprehensive gene set enrichment analysis web server 2016 update.

    PubMed

    Kuleshov, Maxim V; Jones, Matthew R; Rouillard, Andrew D; Fernandez, Nicolas F; Duan, Qiaonan; Wang, Zichen; Koplev, Simon; Jenkins, Sherry L; Jagodnik, Kathleen M; Lachmann, Alexander; McDermott, Michael G; Monteiro, Caroline D; Gundersen, Gregory W; Ma'ayan, Avi

    2016-07-08

    Enrichment analysis is a popular method for analyzing gene sets generated by genome-wide experiments. Here we present a significant update to one of the tools in this domain called Enrichr. Enrichr currently contains a large collection of diverse gene set libraries available for analysis and download; in total, it holds 180 184 annotated gene sets from 102 gene set libraries. New features have been added to Enrichr including the ability to submit fuzzy sets, upload BED files, an improved application programming interface and visualization of the results as clustergrams. Overall, Enrichr is a comprehensive resource for curated gene sets and a search engine that accumulates biological knowledge for further biological discoveries. Enrichr is freely available at: http://amp.pharm.mssm.edu/Enrichr. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
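
    The application programming interface mentioned above can be scripted directly. A sketch of the documented two-step pattern (upload a list, then query one library), assuming the endpoint paths and response layout of the 2016-era REST API; the service later moved to maayanlab.cloud, so treat the base URL as illustrative.

```python
import requests

ENRICHR = "http://amp.pharm.mssm.edu/Enrichr"
genes = "\n".join(["TP53", "BRCA1", "EGFR", "MYC", "PTEN"])

# Step 1: upload the gene list; the JSON response carries a userListId.
resp = requests.post(f"{ENRICHR}/addList",
                     files={"list": (None, genes),
                            "description": (None, "demo list")})
user_list_id = resp.json()["userListId"]

# Step 2: run enrichment against a single library, e.g. KEGG pathways.
resp = requests.get(f"{ENRICHR}/enrich",
                    params={"userListId": user_list_id,
                            "backgroundType": "KEGG_2016"})
for entry in resp.json()["KEGG_2016"][:5]:
    # Entry layout: [rank, term, p-value, z-score, combined score, genes, ...]
    print(entry[1], f"p = {entry[2]:.2e}")
```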

  20. CoryneBase: Corynebacterium Genomic Resources and Analysis Tools at Your Fingertips

    PubMed Central

    Tan, Mui Fern; Jakubovics, Nick S.; Wee, Wei Yee; Mutha, Naresh V. R.; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh

    2014-01-01

    Corynebacteria are used for a wide variety of industrial purposes, but some species are associated with human diseases. With an increasing number of corynebacterial genomes sequenced, comparative analysis of these strains may provide a better understanding of their biology, phylogeny, virulence and taxonomy, which may lead to the discovery of beneficial industrial strains or contribute to better management of diseases. To support this ongoing research, the corynebacterial research community needs a specialized central repository and analysis platform to host the fast-growing amount of genomic data and to facilitate its analysis. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for the analysis of genomes that aims to provide: (1) annotated genome sequences of Corynebacterium, where 165,918 coding sequences and 4,180 RNAs can be found across 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), a Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers access to a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/. PMID:24466021
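
    The web BLAST services described here mirror what the standalone NCBI BLAST+ suite does locally. A sketch of the same homology search from Python, assuming BLAST+ is installed and a local nucleotide database has been built; the query file and database name are placeholders, not CoryneBase resources.

```python
import subprocess

# Assumes a database built beforehand, e.g.:
#   makeblastdb -dbtype nucl -in corynebacterium_genomes.fasta -out coryne_db
result = subprocess.run(
    ["blastn", "-query", "query_gene.fasta", "-db", "coryne_db",
     "-outfmt", "6 qseqid sseqid pident length evalue bitscore",
     "-max_target_seqs", "5"],
    capture_output=True, text=True, check=True)

# Tabular (-outfmt 6) output: one tab-separated hit per line.
for line in result.stdout.strip().splitlines():
    qseqid, sseqid, pident, length, evalue, bitscore = line.split("\t")
    print(f"{sseqid}: {pident}% identity over {length} bp (E = {evalue})")
```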

  1. Quantification of changes in language-related brain areas in autism spectrum disorders using large-scale network analysis.

    PubMed

    Goch, Caspar J; Stieltjes, Bram; Henze, Romy; Hering, Jan; Poustka, Luise; Meinzer, Hans-Peter; Maier-Hein, Klaus H

    2014-05-01

    Diagnosis of autism spectrum disorders (ASD) is difficult, as symptoms vary greatly and are difficult to quantify objectively. Recent work has focused on the assessment of non-invasive diffusion tensor imaging-based biomarkers that reflect the microstructural characteristics of neuronal pathways in the brain. While tractography-based approaches typically analyze specific structures of interest, a graph-based large-scale network analysis of the connectome can yield comprehensive measures of larger-scale architectural patterns in the brain. Commonly applied global network indices, however, do not provide any specificity with respect to functional areas or anatomical structures. The aim of this work was to assess the concept of network centrality as a tool for performing locally specific analysis without disregarding the global network architecture, and to compare it to other popular network indices. We created connectome networks from fiber tractographies and parcellations of the human brain and computed global network indices as well as local indices for Wernicke's area, Broca's area and the motor cortex. Our approach was evaluated on 18 children suffering from ASD and 18 typically developing controls using magnetic resonance imaging-based cortical parcellations in combination with diffusion tensor imaging tractography. We show that the network centrality of Wernicke's area is significantly (p<0.001) reduced in ASD, while the motor cortex, which was used as a control region, did not show significant alterations. This could reflect the reduced capacity for comprehension of language in ASD. Betweenness centrality could potentially be an important metric in the development of future diagnostic tools in the clinical context of ASD diagnosis. Our results further demonstrate the applicability of large-scale network analysis tools in the domain of region-specific analysis, with potential application in many different psychological disorders.
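
    Betweenness centrality, the local index highlighted here, counts how many shortest paths pass through a node. A minimal sketch on a toy weighted "connectome"; the node names and streamline counts are invented, and the inverse-weight distance is one common convention for tractography graphs, not necessarily the paper's exact choice.

```python
import networkx as nx

# Toy connectome: nodes are cortical parcels, weights are streamline counts.
g = nx.Graph()
g.add_weighted_edges_from([
    ("Wernicke", "Broca", 120), ("Wernicke", "MotorCortex", 40),
    ("Broca", "MotorCortex", 60), ("Wernicke", "ParcelX", 80),
    ("ParcelX", "ParcelY", 30), ("Broca", "ParcelY", 50),
])

# Stronger connections should mean shorter paths, so invert the weights.
for u, v, d in g.edges(data=True):
    d["distance"] = 1.0 / d["weight"]

bc = nx.betweenness_centrality(g, weight="distance", normalized=True)
for node, score in sorted(bc.items(), key=lambda kv: -kv[1]):
    print(f"{node:>12s}  {score:.3f}")
```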

  2. RIPPLELAB: A Comprehensive Application for the Detection, Analysis and Classification of High Frequency Oscillations in Electroencephalographic Signals

    PubMed Central

    Alvarado-Rojas, Catalina; Le Van Quyen, Michel; Valderrama, Mario

    2016-01-01

    High Frequency Oscillations (HFOs) in the brain have been associated with different physiological and pathological processes. In epilepsy, HFOs might reflect a mechanism of epileptic phenomena, serving as a biomarker of epileptogenesis and epileptogenicity. Despite the valuable information provided by HFOs, their correct identification is a challenging task. A comprehensive application, RIPPLELAB, was developed to facilitate the analysis of HFOs. RIPPLELAB provides a wide range of tools for manual and automatic HFO detection and visual validation, all of them accessible from an intuitive graphical user interface. Four methods for automated detection, as well as several options for visualization and validation of detected events, were implemented and integrated in the application. Analysis of multiple files and channels is possible, and new options can be added by users. All features and capabilities implemented in RIPPLELAB for automatic detection were tested through the analysis of simulated signals and intracranial EEG recordings from epileptic patients (n = 16; 3,471 analyzed hours). Visual validation was also tested, and detected events were classified into different categories. Unlike other available software packages for EEG analysis, RIPPLELAB uniquely provides the appropriate graphical and algorithmic environment for HFO detection (visual and automatic) and validation, in such a way that the power of elaborate detection methods is available to a wide range of users (experts and non-experts). We believe that this open-source tool will facilitate and promote collaboration between clinical and research centers working in the HFO field. The tool is available under public license and is accessible through a dedicated web site. PMID:27341033
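
    Automatic HFO detectors typically band-pass the signal into the ripple band and flag segments whose energy exceeds a statistical threshold. A minimal sketch of that idea (an RMS-threshold detector in the 80-250 Hz band on synthetic data); the four detectors implemented in RIPPLELAB are more elaborate than this.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 2000                                  # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
eeg = rng.normal(0, 10, t.size)            # stand-in intracranial EEG trace
eeg[4000:4200] += 25 * np.sin(2 * np.pi * 140 * t[4000:4200])  # injected ripple

# Zero-phase band-pass in the ripple band (80-250 Hz) preserves event timing.
b, a = butter(4, [80, 250], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, eeg)

# Sliding RMS energy; flag samples exceeding mean + 5 SD of the recording.
win = int(0.01 * fs)
rms = np.sqrt(np.convolve(filtered**2, np.ones(win) / win, mode="same"))
threshold = rms.mean() + 5 * rms.std()
events = np.flatnonzero(rms > threshold)
print(f"{events.size} suprathreshold samples, first near t = {events[0] / fs:.2f} s"
      if events.size else "no events detected")
```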

  3. A multimedia comprehensive informatics system with decision support tools for a multi-site collaboration research of stroke rehabilitation

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Documet, Jorge; Garrison, Kathleen A.; Winstein, Carolee J.; Liu, Brent

    2012-02-01

    Stroke is a major cause of adult disability. The Interdisciplinary Comprehensive Arm Rehabilitation Evaluation (I-CARE) clinical trial aims to evaluate a therapy for arm rehabilitation after stroke. A primary outcome measure is a correlative analysis between stroke lesion characteristics and standard measures of rehabilitation progress, from data collected at seven research facilities across the country. Sharing and communication of brain imaging and behavioral data are thus a challenge for collaboration. A solution is proposed as a web-based system with tools supporting imaging and informatics-related data. In this system, users may upload anonymized brain images through a secure internet connection, and the system will sort the imaging data for storage in a centralized database. Users may utilize an annotation tool to mark up images. In addition to imaging informatics, electronic data forms, such as clinical data forms, are also integrated. Clinical information is processed and stored in the database to enable future data-mining-related development. Tele-consultation is facilitated through the development of a thin-client image viewing application. For convenience, the system supports access through desktop PCs, laptops, and iPads, so clinicians may enter data directly into the system while working with participants in the study. Overall, this comprehensive imaging informatics system enables users to collect, organize and analyze stroke cases efficiently.

  4. Necklace: combining reference and assembled transcriptomes for more comprehensive RNA-Seq analysis.

    PubMed

    Davidson, Nadia M; Oshlack, Alicia

    2018-05-01

    RNA sequencing (RNA-seq) analyses can benefit from performing both a genome-guided and a de novo assembly, in particular for species where the reference genome or its annotation is incomplete. However, tools for integrating an assembled transcriptome with reference annotation are lacking. Necklace is a software pipeline that runs genome-guided and de novo assembly and combines the resulting transcriptomes with reference genome annotations. Necklace constructs a compact but comprehensive superTranscriptome out of the assembled and reference data. Reads are subsequently aligned and counted in preparation for differential expression testing. Necklace allows a comprehensive transcriptome to be built from a combination of assembled and annotated transcripts, which results in a more comprehensive transcriptome for the majority of organisms. In addition, RNA-seq data are mapped back to this newly created superTranscript reference to enable differential expression testing with standard methods.

  5. Web-based NGS data analysis using miRMaster: a large-scale meta-analysis of human miRNAs.

    PubMed

    Fehlmann, Tobias; Backes, Christina; Kahraman, Mustafa; Haas, Jan; Ludwig, Nicole; Posch, Andreas E; Würstle, Maximilian L; Hübenthal, Matthias; Franke, Andre; Meder, Benjamin; Meese, Eckart; Keller, Andreas

    2017-09-06

    The analysis of small RNA NGS data together with the discovery of new small RNAs is among the foremost challenges in life science. For the analysis of raw high-throughput sequencing data we implemented the fast, accurate and comprehensive web-based tool miRMaster. Our toolbox provides a wide range of modules for quantification of miRNAs and other non-coding RNAs, and for discovering new miRNAs, isomiRs, mutations, exogenous RNAs and motifs. Use cases comprising hundreds of samples are processed in less than 5 h with an accuracy of 99.4%. An integrative analysis of small RNAs from 1836 data sets (20 billion reads) indicated that context-specific miRNAs (e.g. miRNAs present only in one or a few tissues / cell types) still remain to be discovered, while broadly expressed miRNAs appear to be largely known. In total, our analysis of known and novel miRNAs indicated nearly 22 000 candidate precursors with one or two mature forms. Based on these, we designed a custom microarray comprising 11 872 potential mature miRNAs to assess the quality of our prediction. miRMaster is a convenient-to-use tool for the comprehensive and fast analysis of miRNA NGS data. In addition, our predicted miRNA candidates, provided as a custom array, will allow researchers to perform in-depth validation of candidates of interest. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. Measuring the Closeness of Relationships: A Comprehensive Evaluation of the 'Inclusion of the Other in the Self' Scale

    PubMed Central

    2015-01-01

    Understanding the nature and influence of social relationships is of increasing interest to behavioral economists, and behavioral scientists more generally. In turn, this creates a need for tractable, and reliable, tools for measuring fundamental aspects of social relationships. We provide a comprehensive evaluation of the 'Inclusion of the Other in the Self' (IOS) Scale, a handy pictorial tool for measuring the subjectively perceived closeness of a relationship. The tool is highly portable, very easy for subjects to understand and takes less than 1 minute to administer. Across our three online studies with a diverse adult population (n = 772) we show that six different scales designed to measure relationship closeness are all highly significantly positively correlated with the IOS Scale. We then conduct a Principal Component Analysis to construct an Index of Relationship Closeness and find that it correlates very strongly (ρ = .85) with the IOS Scale. We conclude that the IOS Scale is a psychologically meaningful and highly reliable measure of the subjective closeness of relationships. PMID:26068873
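
    The index construction described here (first principal component of several closeness scales, then a rank correlation with the IOS response) is easy to emulate. A sketch on simulated data; the sample size, scale count and noise levels are invented, and since the PCA sign is arbitrary the absolute correlation is reported.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 200
latent = rng.normal(size=n)  # the "true" closeness driving every measure

# Hypothetical scores on six closeness scales plus a 1-7 IOS response.
six_scales = latent[:, None] + rng.normal(scale=0.6, size=(n, 6))
ios = np.clip(np.round(4 + 1.5 * latent + rng.normal(scale=0.8, size=n)), 1, 7)

# Index of relationship closeness = first principal component of the scales.
index = PCA(n_components=1).fit_transform(
    StandardScaler().fit_transform(six_scales)).ravel()

rho, p = spearmanr(index, ios)
print(f"Spearman rho = {abs(rho):.2f} (p = {p:.1e})")
```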

  7. The Perseus computational platform for comprehensive analysis of (prote)omics data.

    PubMed

    Tyanova, Stefka; Temu, Tikira; Sinitcyn, Pavel; Carlson, Arthur; Hein, Marco Y; Geiger, Tamar; Mann, Matthias; Cox, Jürgen

    2016-09-01

    A main bottleneck in proteomics is the downstream biological analysis of highly multivariate quantitative protein abundance data generated using mass-spectrometry-based analysis. We developed the Perseus software platform (http://www.perseus-framework.org) to support biological and biomedical researchers in interpreting protein quantification, interaction and post-translational modification data. Perseus contains a comprehensive portfolio of statistical tools for high-dimensional omics data analysis covering normalization, pattern recognition, time-series analysis, cross-omics comparisons and multiple-hypothesis testing. A machine learning module supports the classification and validation of patient groups for diagnosis and prognosis, and it also detects predictive protein signatures. Central to Perseus is a user-friendly, interactive workflow environment that provides complete documentation of computational methods used in a publication. All activities in Perseus are realized as plugins, and users can extend the software by programming their own, which can be shared through a plugin store. We anticipate that Perseus's arsenal of algorithms and its intuitive usability will empower interdisciplinary analysis of complex large data sets.
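
    Of the statistical tools listed, multiple-hypothesis testing is the one analysts most often re-implement outside the platform. A minimal sketch of the Benjamini-Hochberg step-up procedure, a standard FDR-control method shown here generically; Perseus's own implementation details may differ.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of hypotheses rejected at FDR level alpha."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    ranked = p[order]
    m = p.size
    # Step-up rule: find the largest k with p_(k) <= (k/m) * alpha,
    # then reject all hypotheses with rank <= k.
    passed = ranked <= (np.arange(1, m + 1) / m) * alpha
    mask = np.zeros(m, dtype=bool)
    if passed.any():
        mask[order[: np.flatnonzero(passed).max() + 1]] = True
    return mask

pvals = [0.001, 0.008, 0.012, 0.041, 0.9]
print(benjamini_hochberg(pvals))  # the first three pass at FDR 0.05
```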

  8. Comprehensive helicopter analysis: A state of the art review

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1978-01-01

    An assessment of the status of helicopter theory and analysis is presented. The technology level embodied in available design tools (computer programs) is examined, considering the problem areas of performance, loads and vibration, handling qualities and simulation, and aeroelastic stability. The effectiveness of the present analyses is discussed. The characteristics of the technology in the analyses are reviewed, including the aerodynamics technology, induced velocity and wake geometry, dynamics technology, and machine limitations.

  9. The development of a multimedia online language assessment tool for young children with autism.

    PubMed

    Lin, Chu-Sui; Chang, Shu-Hui; Liou, Wen-Ying; Tsai, Yu-Show

    2013-10-01

    This study aimed to provide early childhood special education professionals with a standardized and comprehensive language assessment tool for the early identification of language learning characteristics (e.g., hyperlexia) of young children with autism. In this study, we used computer technology to develop a multimedia online language assessment tool that presents auditory or visual stimuli. This online comprehensive language assessment consists of six subtests: decoding, homographs, auditory vocabulary comprehension, visual vocabulary comprehension, auditory sentence comprehension, and visual sentence comprehension. Three hundred typically developing children and 35 children with autism, aged 4-6, from Tao-Yuan County in Taiwan participated in this study. The Cronbach α values of the six subtests ranged from .64 to .97. The variance explained by the six subtests ranged from 14% to 56%, the concurrent validity of each subtest with the Peabody Picture Vocabulary Test-Revised ranged from .21 to .45, and the predictive validity of each subtest with the WISC-III ranged from .47 to .75. This assessment tool was also able to differentiate children with autism with up to 92% accuracy. These results indicate that the assessment tool has both adequate reliability and validity. Additionally, all 35 children with autism completed the entire assessment without exhibiting any extremely troubling behaviors. However, future research is needed to increase the sample size of both typically developing children and young children with autism and to overcome the technical challenges associated with internet issues. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Modeling Criminal Activity in Urban Landscapes

    NASA Astrophysics Data System (ADS)

    Brantingham, Patricia; Glässer, Uwe; Jackson, Piper; Vajihollahi, Mona

    Computational and mathematical methods arguably have enormous potential for serving practical needs in crime analysis and prevention by offering novel tools for crime investigations and experimental platforms for evidence-based policy making. We present a comprehensive formal framework and tool support for mathematical and computational modeling of criminal behavior to facilitate systematic experimental studies of a wide range of criminal activities in urban environments. The focus is on spatial and temporal aspects of different forms of crime, including opportunistic and serial violent crimes. However, the proposed framework provides a basis to push beyond conventional empirical research and engage the use of computational thinking and social simulations in the analysis of terrorism and counter-terrorism.

  11. SEURAT: Visual analytics for the integrated analysis of microarray data

    PubMed Central

    2010-01-01

    Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip-based high-throughput technologies. Software tools for the joint analysis of such high-dimensional data sets together with clinical data are required. Results We have developed an open-source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, bar charts, histograms, event charts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked, so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics such as clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing need of researchers to perform joint analysis of gene expression, genomic and clinical data. PMID:20525257

  12. Development, Sensibility, and Validity of a Systemic Autoimmune Rheumatic Disease Case Ascertainment Tool.

    PubMed

    Armstrong, Susan M; Wither, Joan E; Borowoy, Alan M; Landolt-Marticorena, Carolina; Davis, Aileen M; Johnson, Sindhu R

    2017-01-01

    Case ascertainment through self-report is a convenient but often inaccurate method to collect information. The purposes of this study were to develop, assess the sensibility, and validate a tool to identify cases of systemic autoimmune rheumatic diseases (SARD) in the outpatient setting. The SARD tool was administered to subjects sampled from specialty clinics. Determinants of sensibility - comprehensibility, feasibility, validity, and acceptability - were evaluated using a numeric rating scale from 1-7. Comprehensibility was evaluated using the Flesch Reading Ease and the Flesch-Kincaid Grade Level. Self-reported diagnoses were validated against medical records using Cohen's κ statistic. There were 141 participants [systemic lupus erythematosus (SLE), systemic sclerosis (SSc), rheumatoid arthritis, Sjögren syndrome (SS), inflammatory myositis (polymyositis/dermatomyositis; PM/DM), and controls] who completed the questionnaire. The Flesch Reading Ease score was 77.1 and the Flesch-Kincaid Grade Level was 4.4. Respondents endorsed (mean ± SD) comprehensibility (6.12 ± 0.92), feasibility (5.94 ± 0.81), validity (5.35 ± 1.10), and acceptability (3.10 ± 2.03). The SARD tool had a sensitivity of 0.91 (95% CI 0.88-0.94) and a specificity of 0.99 (95% CI 0.96-1.00). The agreement between the SARD tool and medical record was κ = 0.82 (95% CI 0.77-0.88). Subgroup analysis by SARD found κ coefficients for SLE to be κ = 0.88 (95% CI 0.79-0.97), SSc κ = 1.0 (95% CI 1.0-1.0), PM/DM κ = 0.72 (95% CI 0.49-0.95), and SS κ = 0.85 (95% CI 0.71-0.99). The screening questions had sensitivity ranging from 0.96 to 1.0 and specificity ranging from 0.88 to 1.0. This SARD case ascertainment tool has demonstrable sensibility and validity. The use of both screening and confirmatory questions confers added accuracy.
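
    The chance-corrected agreement statistic used throughout this validation, Cohen's κ, is simple to compute from paired ratings. A minimal sketch with invented self-report versus chart-review labels (1 = disease present); the study's own data are of course not reproduced here.

```python
import numpy as np

def cohens_kappa(a, b):
    """Chance-corrected agreement between two categorical raters."""
    a, b = np.asarray(a), np.asarray(b)
    p_obs = (a == b).mean()
    # Expected chance agreement: product of marginal rates per category.
    p_exp = sum((a == c).mean() * (b == c).mean() for c in np.union1d(a, b))
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical: self-reported SLE versus medical-record diagnosis.
self_report = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]
chart       = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
print(f"kappa = {cohens_kappa(self_report, chart):.2f}")  # 0.80 here
```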

  13. Towards the Integration of APECS with VE-Suite to Create a Comprehensive Virtual Engineering Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCorkle, D.; Yang, C.; Jordan, T.

    2007-06-01

    Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not explore some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that allow disparate sources of information to be integrated if environments are to be constructed where process simulation information can be accessed. At the Department of Energy's (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus wrapped by the CASI library developed by Reaction Engineering International to run the process simulation and query for unit operation results. This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.

  14. A New Approach for Analysing National Innovation Systems in Emerging and Developing Countries

    ERIC Educational Resources Information Center

    Seidel, Uwe; Muller, Lysann; Meier zu Kocker, Gerd; Filho, Guajarino de Araujo

    2013-01-01

    This paper presents a tool for the indicator-based analysis of national innovation systems (ANIS). ANIS identifies the economic strengths and weaknesses of a country-wide, regional or local system and includes a comprehensive examination and evaluation of the status of existing innovation systems. The use of a particular form of expert interviews…

  15. Beginning Korean. Yale Linguistic Series.

    ERIC Educational Resources Information Center

    Martin, Samuel E.; Lee, Young-Sook C.

    A "model of structural linguistic analysis as well as a teaching tool," this text is designed to give the student a comprehensive grasp of the essentials of modern Korean in 25 lessons, with 5 review lessons, leading to advanced levels of proficiency. It is intended to be used by adult students working either in classes or by themselves,…

  16. Investigating the Affective Learning in a 3D Virtual Learning Environment: The Case Study of the Chatterdale Mystery

    ERIC Educational Resources Information Center

    Molka-Danielsen, Judith; Hadjistassou, Stella; Messl-Egghart, Gerhilde

    2016-01-01

    This research is motivated by the emergence of virtual technologies and their potential as engaging pedagogical tools for facilitating comprehension, interaction and collaboration in learning, particularly the learning of second languages (L2). This paper provides a descriptive analysis of a case study that examines affective…

  17. Constructing a Model of Lottery Tax Incidence Measurement: Revisiting the Illinois Lottery Tax for Education

    ERIC Educational Resources Information Center

    Daberkow, Kevin S.; Lin, Wei

    2012-01-01

    Nearly half a century of lottery scholarship has measured lottery tax incidence predominantly through either the Suits Index or regression analysis. The present study builds on historic lottery tax burden measurement to present a comprehensive set of tools to determine the tax incidence of individual games in addition to determining which lottery…

  18. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data.

    PubMed

    Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel G M

    2006-10-13

    Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2 (CMA). Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis.
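
    At the core of any such package is inverse-variance pooling plus heterogeneity statistics. A minimal sketch of a fixed-effect meta-analysis with Cochran's Q and I² on invented per-study log odds ratios; this is not MIX's actual code, which also covers random-effects models, bias tests and graphics.

```python
import numpy as np

# Hypothetical per-study effect sizes (log odds ratios) and standard errors.
effects = np.array([-0.35, -0.10, -0.48, -0.22, -0.30])
se = np.array([0.15, 0.20, 0.25, 0.12, 0.18])

# Fixed-effect (inverse-variance) pooling.
w = 1.0 / se**2
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

# Cochran's Q and I^2 quantify between-study heterogeneity.
q = np.sum(w * (effects - pooled) ** 2)
i2 = max(0.0, (q - (effects.size - 1)) / q) * 100

lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled log OR = {pooled:.3f} [95% CI {lo:.3f}, {hi:.3f}], I^2 = {i2:.0f}%")
```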

  19. An integrated GIS application system for soil moisture data assimilation

    NASA Astrophysics Data System (ADS)

    Wang, Di; Shen, Runping; Huang, Xiaolong; Shi, Chunxiang

    2014-11-01

    The gaps in knowledge and existing challenges in precisely describing the land surface process make it critical to represent the massive soil moisture data visually and to mine the data for further research. This article introduces a comprehensive soil moisture assimilation data analysis system built with C#, IDL, ArcSDE, Visual Studio 2008 and SQL Server 2005. The system provides integrated services for the management, efficient graphical visualization and analysis of land surface data assimilation products. It not only improves the efficiency of data assimilation management, but also integrates the data processing and analysis tools into a GIS development environment, so that analysis of soil moisture assimilation data and GIS spatial analysis can be carried out in the same system. The system provides basic GIS map functions, massive data processing and soil moisture product analysis. It also takes full advantage of a spatial data engine, ArcSDE, to efficiently manage, retrieve and store all kinds of data. In the system, characteristics of the temporal and spatial patterns of soil moisture are plotted. By analyzing the soil moisture impact factors, it is possible to acquire the correlation coefficients between soil moisture values and each individual impact factor. Daily and monthly comparative analyses of soil moisture products among observations, simulation results and assimilations can be made in the system to display the different trends of these products. Furthermore, a soil moisture map production function is provided for operational applications.

  20. Using MetaboAnalyst 3.0 for Comprehensive Metabolomics Data Analysis.

    PubMed

    Xia, Jianguo; Wishart, David S

    2016-09-07

    MetaboAnalyst (http://www.metaboanalyst.ca) is a comprehensive Web application for metabolomic data analysis and interpretation. MetaboAnalyst handles most of the common metabolomic data types from most kinds of metabolomics platforms (MS and NMR) for most kinds of metabolomics experiments (targeted, untargeted, quantitative). In addition to providing a variety of data processing and normalization procedures, MetaboAnalyst also supports a number of data analysis and data visualization tasks using a range of univariate and multivariate methods such as PCA (principal component analysis), PLS-DA (partial least squares discriminant analysis), heatmap clustering and machine learning methods. MetaboAnalyst also offers a variety of tools for metabolomic data interpretation including MSEA (metabolite set enrichment analysis), MetPA (metabolite pathway analysis), and biomarker selection via ROC (receiver operating characteristic) curve analysis, as well as time series and power analysis. This unit provides an overview of the main functional modules and the general workflow of the latest version of MetaboAnalyst (MetaboAnalyst 3.0), followed by eight detailed protocols. © 2016 by John Wiley & Sons, Inc. Copyright © 2016 John Wiley & Sons, Inc.
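
    Among the interpretation tools listed, biomarker selection via ROC curves is straightforward to prototype outside the web application. A sketch on simulated concentrations with one invented disease-associated metabolite; note that the AUC reported here is in-sample, whereas a real biomarker study would estimate it under cross-validation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 120
y = rng.integers(0, 2, n)  # 0 = control, 1 = case

# Hypothetical metabolite concentrations; metabolite 0 is disease-associated.
X = rng.lognormal(mean=0.0, sigma=0.5, size=(n, 10))
X[:, 0] *= np.where(y == 1, 1.6, 1.0)

# Log-transform (a common metabolomics normalization), then score cases.
model = LogisticRegression(max_iter=1000).fit(np.log(X), y)
scores = model.predict_proba(np.log(X))[:, 1]
print(f"in-sample AUC = {roc_auc_score(y, scores):.2f}")
```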

  1. Methods and Practices of Investigators for Determining Participants’ Decisional Capacity and Comprehension of Protocols

    PubMed Central

    Kon, Alexander A.; Klug, Michael

    2010-01-01

    Ethicists recommend that investigators assess subjects' comprehension prior to accepting their consent as valid. Because children represent an at-risk population, ensuring adequate comprehension in pediatric research is vital. We surveyed all corresponding authors of research articles published over a six-month period in five leading adult and pediatric journals. Our goal was to assess how often subjects' comprehension or decisional capacity was assessed in the consent process, whether there was any difference between adult and pediatric research projects, and the rate at which investigators use formal or validated tools to assess capacity. Responses from 102 authors were analyzed (response rate 56%). Approximately two-thirds of respondents stated that they assessed comprehension or decisional capacity prior to accepting consent, and we found no difference between adult and pediatric researchers. Nine investigators used a formal questionnaire, and three used a validated tool. These findings suggest that fewer investigators than expected assess comprehension and decisional capacity, and that the use of standardized and validated tools is the exception rather than the rule. PMID:19385838

  2. Comprehensive lipid analysis: a powerful metanomic tool for predictive and diagnostic medicine.

    PubMed

    Watkins, S M

    2000-09-01

    The power and accuracy of predictive diagnostics stand to improve dramatically as a result of lipid metanomics. The high definition of data obtained with this approach allows multiple rather than single metabolites to be used in markers for a group. Since as many as 40 fatty acids are quantified from each lipid class, and up to 15 lipid classes can be quantified easily, more than 600 individual lipid metabolites can be measured routinely for each sample. Because these analyses are comprehensive, only the most appropriate and unique metabolites are selected for their predictive value. Thus, comprehensive lipid analysis promises to greatly improve predictive diagnostics for phenotypes that directly or peripherally involve lipids. A broader and possibly more exciting aspect of this technology is the generation of metabolic profiles that are not simply markers for disease, but metabolic maps that can be used to identify specific genes or activities that cause or influence the disease state. Metanomics is, in essence, functional genomics from metabolite analysis. By defining the metabolic basis for phenotype, researchers and clinicians will have an extraordinary opportunity to understand and treat disease. Much in the same way that gene chips allow researchers to observe the complex expression response to a stimulus, metanomics will enable researchers to observe the complex metabolic interplay responsible for defining phenotype. By extending this approach beyond the observation of individual dysregulations, medicine will begin to profile not single diseases, but health. As health is the proper balance of all vital metabolic pathways, comprehensive or metanomic analysis lends itself very well to identifying the metabolite distributions necessary for optimum health. Comprehensive and quantitative analysis of lipids would provide this degree of diagnostic power to researchers and clinicians interested in mining metabolic profiles for biological meaning.

  3. Classification of processes involved in sharing individual participant data from clinical trials.

    PubMed

    Ohmann, Christian; Canham, Steve; Banzi, Rita; Kuchinke, Wolfgang; Battaglia, Serena

    2018-01-01

    Background: In recent years, a cultural change in the handling of data from research has resulted in the strong promotion of a culture of openness and increased sharing of data. In the area of clinical trials, sharing of individual participant data involves a complex set of processes and the interaction of many actors and actions. Individual services/tools to support data sharing are available, but what is missing is a detailed, structured and comprehensive list of processes/subprocesses involved and tools/services needed. Methods: Principles and recommendations from a published data sharing consensus document are analysed in detail by a small expert group. Processes/subprocesses involved in data sharing are identified and linked to actors and possible services/tools. Definitions are adapted from the business process model and notation (BPMN) and applied in the analysis. Results: A detailed and comprehensive list of individual processes/subprocesses involved in data sharing, structured according to 9 main processes, is provided. Possible tools/services to support these processes/subprocesses are identified and grouped according to major type of support. Conclusions: The list of individual processes/subprocesses and tools/services identified is a first step towards development of a generic framework or architecture for sharing of data from clinical trials. Such a framework is strongly needed to give an overview of how various actors, research processes and services could form an interoperable system for data sharing.

  5. A survey of tools for variant analysis of next-generation genome sequencing data

    PubMed Central

    Pabinger, Stephan; Dander, Andreas; Fischer, Maria; Snajder, Rene; Sperk, Michael; Efremova, Mirjana; Krabichler, Birgit; Speicher, Michael R.; Zschocke, Johannes

    2014-01-01

    Recent advances in genome sequencing technologies provide unprecedented opportunities to characterize individual genomic landscapes and identify mutations relevant for diagnosis and therapy. Specifically, whole-exome sequencing using next-generation sequencing (NGS) technologies is gaining popularity in the human genetics community due to the moderate costs, manageable data amounts and straightforward interpretation of analysis results. While whole-exome and, in the near future, whole-genome sequencing are becoming commodities, data analysis still poses significant challenges and led to the development of a plethora of tools supporting specific parts of the analysis workflow or providing a complete solution. Here, we surveyed 205 tools for whole-genome/whole-exome sequencing data analysis supporting five distinct analytical steps: quality assessment, alignment, variant identification, variant annotation and visualization. We report an overview of the functionality, features and specific requirements of the individual tools. We then selected 32 programs for variant identification, variant annotation and visualization, which were subjected to hands-on evaluation using four data sets: one set of exome data from two patients with a rare disease for testing identification of germline mutations, two cancer data sets for testing variant callers for somatic mutations, copy number variations and structural variations, and one semi-synthetic data set for testing identification of copy number variations. Our comprehensive survey and evaluation of NGS tools provides a valuable guideline for human geneticists working on Mendelian disorders, complex diseases and cancers. PMID:23341494

  6. Overview of 'Omics Technologies for Military Occupational Health Surveillance and Medicine.

    PubMed

    Bradburne, Christopher; Graham, David; Kingston, H M; Brenner, Ruth; Pamuku, Matt; Carruth, Lucy

    2015-10-01

    Systems biology ('omics) technologies are emerging as tools for the comprehensive analysis and monitoring of human health. In order for these tools to be used in military medicine, clinical sampling and biobanking will need to be optimized to be compatible with downstream processing and analysis for each class of molecule measured. This article provides an overview of 'omics technologies, including instrumentation, tools, and methods, and their potential application for warfighter exposure monitoring. We discuss the current state and the potential utility of personalized data from a variety of 'omics sources including genomics, epigenomics, transcriptomics, metabolomics, proteomics, lipidomics, and efforts to combine their use. Issues in the "sample-to-answer" workflow, including collection and biobanking are discussed, as well as national efforts for standardization and clinical interpretation. Establishment of these emerging capabilities, along with accurate xenobiotic monitoring, for the Department of Defense could provide new and effective tools for environmental health monitoring at all duty stations, including deployed locations. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  7. Validation of an instrument to measure inter-organisational linkages in general practice.

    PubMed

    Amoroso, Cheryl; Proudfoot, Judith; Bubner, Tanya; Jayasinghe, Upali W; Holton, Christine; Winstanley, Julie; Beilby, Justin; Harris, Mark F

    2007-12-03

    Linkages between general medical practices and external services are important for high quality chronic disease care. The purpose of this research is to describe the development, evaluation and use of a brief tool that measures the comprehensiveness and quality of a general practice's linkages with external providers for the management of patients with chronic disease. In this study, clinical linkages are defined as the communication, support, and referral arrangements between services for the care and assistance of patients with chronic disease. An interview to measure surgery-level (rather than individual clinician-level) clinical linkages was developed, piloted, reviewed, and evaluated with 97 Australian general practices. Two validated survey instruments were posted to patients, and a survey of locally available services was developed and posted to participating Divisions of General Practice (support organisations). Hypotheses regarding internal validity, association with local services, and patient satisfaction were tested using factor analysis, logistic regression and multilevel regression models. The resulting General Practice Clinical Linkages Interview (GP-CLI) is a nine-item tool with three underlying factors: referral and advice linkages, shared care and care planning linkages, and community access and awareness linkages. Local availability of chronic disease services had no effect on the comprehensiveness of the services with which practices linked; however, the comprehensiveness of clinical linkages was associated with patient assessment of access, receptionist services, and continuity of care in their general practice. The GP-CLI, possessing both internal and external validity, may be useful to researchers examining comparable health care systems for measuring the comprehensiveness and quality of linkages at a general practice-level with related services. The tool can be used with large samples exploring the impact, outcomes, and facilitators of high quality clinical linkages in general practice.

  8. LoopX: A Graphical User Interface-Based Database for Comprehensive Analysis and Comparative Evaluation of Loops from Protein Structures.

    PubMed

    Kadumuri, Rajashekar Varma; Vadrevu, Ramakrishna

    2017-10-01

    Due to their crucial role in function, folding, and stability, protein loops are being targeted for grafting/designing to create novel or alter existing functionality and improve stability and foldability. With a view to facilitating thorough analysis and effective search options for extracting and comparing loops for sequence and structural compatibility, we developed LoopX, a comprehensively compiled library of sequence and conformational features of ∼700,000 loops from protein structures. The database, equipped with a graphical user interface, provides diverse query tools and search algorithms, with various rendering options to visualize the sequence- and structural-level information along with hydrogen bonding patterns and backbone φ, ψ dihedral angles of both the target and candidate loops. Two new features (i) conservation of the polar/nonpolar environment and (ii) conservation of sequence and conformation of specific residues within the loops have also been incorporated in the search and retrieval of compatible loops for a chosen target loop. Thus, the LoopX server not only serves as a database and visualization tool for sequence and structural analysis of protein loops but also aids in extracting and comparing candidate loops for a given target loop based on user-defined search options.

  9. [Evaluation of the quality of clinical practice guidelines published in the Annales de Biologie Clinique with the help of the EFLM checklist].

    PubMed

    Wils, Julien; Fonfrède, Michèle; Augereau, Christine; Watine, Joseph

    2014-01-01

    Several tools are available to help evaluate the quality of clinical practice guidelines (CPG). The AGREE instrument (Appraisal of guidelines for research & evaluation) is the most widely accepted tool, but it has been designed to assess CPG methodology only. The European federation of laboratory medicine (EFLM) recently designed a checklist dedicated to laboratory medicine that is intended to be comprehensive and therefore to allow a more thorough evaluation of the quality of CPG in laboratory medicine. In the present work we test the comprehensiveness of this checklist on a sample of CPG written in French and published in Annales de biologie clinique (ABC). We show that some work remains to be done before a truly comprehensive checklist is designed. We also show that there is some room for improvement in the CPG published in ABC, for example regarding the fact that some of these CPG do not provide any information about allowed durations of transport and of storage of biological samples before analysis, or about standards of minimal analytical performance, or about the sensitivities or the specificities of the recommended tests.

  10. Mass spectrometry as a quantitative tool in plant metabolomics

    PubMed Central

    Jorge, Tiago F.; Mata, Ana T.

    2016-01-01

    Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967

  11. New multivariable capabilities of the INCA program

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1989-01-01

    The INteractive Controls Analysis (INCA) program was developed at NASA's Goddard Space Flight Center to provide a user-friendly, efficient environment for the design and analysis of control systems, specifically spacecraft control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. The INCA program was initially developed as a comprehensive classical design and analysis tool for small and large order control systems. The latest version of INCA, expected to be released in February of 1990, was expanded to include the capability to perform multivariable controls analysis and design.

  12. Improving diabetic foot care in a nurse-managed safety-net clinic.

    PubMed

    Peterson, Joann M; Virden, Mary D

    2013-05-01

    This article is a description of the development and implementation of a Comprehensive Diabetic Foot Care Program and assessment tool in an academically affiliated, nurse-managed, multidisciplinary, safety-net clinic. The assessment tool parallels parameters identified in the report of the Task Force Foot Care Interest Group of the American Diabetes Association published in 2008, "Comprehensive Foot Examination and Risk Assessment." Data sources included a review of the literature, Silver City Health Center's (SCHC) 2009 Annual Report, and a retrospective chart review. Since the full implementation of SCHC's Comprehensive Diabetic Foot Care Program, there have been no hospitalizations of clinic patients for foot-related complications. The development of the Comprehensive Diabetic Foot Assessment tool and the implementation of the Comprehensive Diabetic Foot Care Program have resulted in positive outcomes for the patients in a nurse-managed safety-net clinic. This article demonstrates that quality healthcare services can successfully be developed and implemented in a safety-net clinic setting. ©2012 The Author(s) Journal compilation ©2012 American Association of Nurse Practitioners.

  13. TROPHI: development of a tool to measure complex, multi-factorial patient handling interventions.

    PubMed

    Fray, Mike; Hignett, Sue

    2013-01-01

    Patient handling interventions are complex and multi-factorial. It has been difficult to make comparisons across different strategies due to the lack of a comprehensive outcome measurement method. The Tool for Risk Outstanding in Patient Handling Interventions (TROPHI) was developed to address this gap by measuring outcomes and comparing performance across interventions. Focus groups were held with expert patient handling practitioners (n = 36) in four European countries (Finland, Italy, Portugal and the UK) to identify the preferred outcomes to be measured for interventions. A systematic literature review identified 598 outcome measures; these were critically appraised and the most appropriate measurement tool was selected for each outcome. TROPHI was evaluated in the four EU countries (eight sites) and by an expert panel (n = 16) from the European Panel of Patient Handling Ergonomics for usability and practical application. This final stage added external validity to the research by exploring transferability potential and presenting the data and analysis to allow respondent (participant) validation.

  14. MANTiS: a program for the analysis of X-ray spectromicroscopy data.

    PubMed

    Lerotic, Mirna; Mak, Rachel; Wirick, Sue; Meirer, Florian; Jacobsen, Chris

    2014-09-01

    Spectromicroscopy combines spectral data with microscopy, where typical datasets consist of a stack of images taken across a range of energies over a microscopic region of the sample. Manual analysis of these complex datasets can be time-consuming and can miss important traits in the data. With this in mind we have developed MANTiS, an open-source tool written in Python for spectromicroscopy data analysis. The backbone of the package involves principal component analysis and cluster analysis, classifying pixels according to spectral similarity. Our goal is to provide a data analysis tool which is comprehensive, yet intuitive and easy to use. MANTiS is designed to lead the user through the analysis using story boards that describe each step in detail so that both experienced users and beginners are able to analyze their own data independently. These capabilities are illustrated through analysis of hard X-ray imaging of iron in Roman ceramics, and soft X-ray imaging of a malaria-infected red blood cell.
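
    The PCA-plus-clustering backbone described here maps naturally onto a few lines of array code. A sketch on a synthetic stack with two spectral phases; the dimensions, spectra and noise levels are invented, and MANTiS itself wraps storyboards, preprocessing and visualization around this core.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

# Synthetic stack: 50 energy points x 64 x 64 pixels, two spectral phases.
rng = np.random.default_rng(3)
energies, ny, nx = 50, 64, 64
spec_a = np.exp(-0.5 * ((np.arange(energies) - 15) / 4) ** 2)
spec_b = np.exp(-0.5 * ((np.arange(energies) - 32) / 6) ** 2)
phase = rng.random((ny, nx)) > 0.5
stack = np.where(phase[None], spec_a[:, None, None], spec_b[:, None, None])
stack = stack + rng.normal(scale=0.05, size=stack.shape)

# Reshape to (pixels, energies), reduce with PCA, then cluster the spectra.
pixels = stack.reshape(energies, -1).T
reduced = PCA(n_components=5).fit_transform(pixels)
labels = KMeans(n_clusters=2, n_init=10).fit_predict(reduced)
cluster_map = labels.reshape(ny, nx)   # spectral-similarity map of the sample
print(f"cluster sizes: {np.bincount(labels)}")
```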

  15. Comprehensive health workforce planning: re-consideration of the primary health care approach as a tool for addressing the human resource for health crisis in low and middle income countries.

    PubMed

    Munga, Michael A; Mwangu, Mughwira A

    2013-04-01

    Although the Human Resources for Health (HRH) crisis is not new on the public health agenda of many countries, few low- and middle-income countries are using Primary Health Care (PHC) as a tool for planning and addressing the crisis in a comprehensive manner. The aim of this paper is to appraise the inadequacies of existing planning approaches in addressing the growing HRH crisis in resource-limited settings. A descriptive literature review of selected case studies in middle- and low-income countries, reinforced with evidence from Tanzania, was used. Consultations with experts in the field were also made. In this review, we propose a conceptual framework suggesting that planning may be effective only if it is structured to embrace the fundamental principles of PHC. We place the core principles of PHC at the centre of HRH planning, acknowledging its central premise that the effectiveness of any public health policy depends on the degree to which it addresses public health problems multi-dimensionally and comprehensively. The proponents of the PHC approach in planning have identified inter-sectoral action and collaboration and a comprehensive approach as the two basic principles that policies and plans should accentuate in order to realize their pre-determined goals. Two conclusions are made: firstly, comprehensive health workforce planning is not widely known and thus not frequently used in HRH planning or in the analysis of health workforce issues; secondly, comprehensiveness in HRH planning is important but not sufficient to ensure that all the ingredients of the HRH crisis are eliminated. In order to be effective and sustainable, the approach needs to invoke three basic values, namely effectiveness, efficiency and equity.

  16. Machine Learning Meta-analysis of Large Metagenomic Datasets: Tools and Biological Insights.

    PubMed

    Pasolli, Edoardo; Truong, Duy Tin; Malik, Faizan; Waldron, Levi; Segata, Nicola

    2016-07-01

    Shotgun metagenomic analysis of the human associated microbiome provides a rich set of microbial features for prediction and biomarker discovery in the context of human diseases and health conditions. However, the use of such high-resolution microbial features presents new challenges, and validated computational tools for learning tasks are lacking. Moreover, classification rules have scarcely been validated in independent studies, raising questions about the generalizability of disease-predictive models across cohorts. In this paper, we comprehensively assess approaches to metagenomics-based prediction tasks and to the quantitative assessment of the strength of potential microbiome-phenotype associations. We develop a computational framework for prediction tasks using quantitative microbiome profiles, including species-level relative abundances and presence of strain-specific markers. A comprehensive meta-analysis, with particular emphasis on generalization across cohorts, was performed in a collection of 2424 publicly available metagenomic samples from eight large-scale studies. Cross-validation revealed good disease-prediction capabilities, which were in general improved by feature selection and use of strain-specific markers instead of species-level taxonomic abundance. In cross-study analysis, models transferred between studies were in some cases less accurate than models tested by within-study cross-validation. Interestingly, the addition of healthy (control) samples from other studies to training sets improved disease prediction capabilities. Some microbial species (most notably Streptococcus anginosus) seem to characterize general dysbiotic states of the microbiome rather than connections with a specific disease. Our results in modelling features of the "healthy" microbiome can be considered a first step toward defining general microbial dysbiosis. The software framework, microbiome profiles, and metadata for thousands of samples are publicly available at http://segatalab.cibio.unitn.it/tools/metaml.
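
    As a rough sketch of the kind of within-study cross-validation described (not the authors' framework, which is available at the URL above), the following evaluates a random-forest classifier on a hypothetical samples-by-species abundance matrix:

      # Sketch of cross-validated disease prediction from microbiome profiles;
      # data below are random placeholders, not the 2424-sample collection.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score, StratifiedKFold

      rng = np.random.default_rng(0)
      X = rng.random((200, 500))    # hypothetical samples x species abundances
      y = rng.integers(0, 2, 200)   # hypothetical disease / control labels

      clf = RandomForestClassifier(n_estimators=500, random_state=0)
      cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
      auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
      print(f"within-study CV AUC: {auc.mean():.2f} +/- {auc.std():.2f}")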

  17. Comprehensive processing of high-throughput small RNA sequencing data including quality checking, normalization, and differential expression analysis using the UEA sRNA Workbench

    PubMed Central

    Beckers, Matthew; Mohorianu, Irina; Stocks, Matthew; Applegate, Christopher; Dalmay, Tamas; Moulton, Vincent

    2017-01-01

    Recently, high-throughput sequencing (HTS) has revealed compelling details about the small RNA (sRNA) population in eukaryotes. These 20 to 25 nt noncoding RNAs can influence gene expression by acting as guides for the sequence-specific regulatory mechanism known as RNA silencing. The increase in sequencing depth and number of samples per project enables a better understanding of the role sRNAs play by facilitating the study of expression patterns. However, the intricacy of the biological hypotheses coupled with a lack of appropriate tools often leads to inadequate mining of the available data and thus, an incomplete description of the biological mechanisms involved. To enable a comprehensive study of differential expression in sRNA data sets, we present a new interactive pipeline that guides researchers through the various stages of data preprocessing and analysis. This includes various tools, some of which we specifically developed for sRNA analysis, for quality checking and normalization of sRNA samples as well as tools for the detection of differentially expressed sRNAs and identification of the resulting expression patterns. The pipeline is available within the UEA sRNA Workbench, a user-friendly software package for the processing of sRNA data sets. We demonstrate the use of the pipeline on a H. sapiens data set; additional examples on a B. terrestris data set and on an A. thaliana data set are described in the Supplemental Information. A comparison with existing approaches is also included, which exemplifies some of the issues that need to be addressed for sRNA analysis and how the new pipeline may be used to do this. PMID:28289155
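
    Two of the preprocessing steps named above, depth normalization and expression comparison, can be sketched generically; the per-million scaling and offset fold change below are illustrative stand-ins, not Workbench code:

      # Generic sketch: per-million scaling so samples of different sequencing
      # depth are comparable, then an offset fold change between two samples.
      # (Illustrative only; the UEA sRNA Workbench implements its own methods.)
      import numpy as np

      counts = np.array([[120, 30, 0],       # hypothetical sRNA read counts,
                         [2400, 590, 12]],   # rows = samples, cols = sRNAs
                        dtype=float)

      libsize = counts.sum(axis=1, keepdims=True)   # total reads per sample
      rpm = counts / libsize * 1e6                  # reads per million

      offset = 20.0                                 # damps ratios of low counts
      log2fc = np.log2((rpm[1] + offset) / (rpm[0] + offset))
      print(rpm.round(1), log2fc.round(2), sep="\n")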

  18. Daily QA of linear accelerators using only EPID and OBI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Baozhou, E-mail: bsun@radonc.wustl.edu; Goddu, S. Murty; Yaddanapudi, Sridhar

    2015-10-15

    Purpose: As treatment delivery becomes more complex, there is a pressing need for robust quality assurance (QA) tools to improve efficiency and comprehensiveness while simultaneously maintaining high accuracy and sensitivity. This work presents the hardware and software tools developed for comprehensive QA of a linear accelerator (LINAC) using only electronic portal imaging devices (EPIDs) and kV flat panel detectors. Methods: A daily QA phantom, which includes two orthogonally positioned phantoms for QA of MV beams and kV onboard imaging (OBI), is suspended from the gantry accessory holder to test both the geometric and dosimetric components of a LINAC and an OBI. The MV component consists of a 0.5 cm water-equivalent plastic sheet incorporating 11 circular steel plugs for transmission measurements through multiple thicknesses and one resolution plug for MV-image quality testing. The kV phantom consists of a Leeds phantom (TOR-18 FG phantom supplied by Varian) for testing low- and high-contrast resolution. In the developed process, existing LINAC tools were used to automate daily acquisition of MV and kV images, and software tools were developed for simultaneous analysis of these images. A method was developed to derive and evaluate traditional QA parameters from these images [output, flatness, symmetry, uniformity, TPR20/10, and positional accuracy of the jaws and multileaf collimators (MLCs)]. The EPID-based daily QA tools were validated by performing measurements on a detuned 6 MV beam to test their effectiveness in detecting errors in output, symmetry, energy, and MLC positions. The developed QA process was clinically commissioned, implemented, and evaluated on a Varian TrueBeam LINAC (Varian Medical Systems, Palo Alto, CA) over a period of three months. Results: Machine output constancy measured with an EPID (as compared against a calibrated ion chamber) is shown to be within ±0.5%. Beam symmetry and flatness deviations measured using an EPID and a 2D ion-chamber array agree within ±0.5% and ±1.2% for crossline and inline profiles, respectively. MLC position errors of 0.5 mm can be detected using a picket fence test. Field size and phantom positioning accuracy can be determined within 0.5 mm. The entire daily QA process takes ∼15 min to perform tests for 5 photon beams, MLC tests, and imaging checks. Conclusions: The exclusive use of EPID-based QA tools, including a QA phantom and simultaneous analysis software, has been demonstrated to be a viable, efficient, and comprehensive process for daily evaluation of LINAC performance.
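
    As an illustration of deriving two of the named parameters from a beam profile, the sketch below computes a variation-type flatness and a point-difference symmetry over the central 80% of a hypothetical profile; exact definitions vary by protocol, and this is not the authors' software:

      # Simplified sketch of deriving flatness and symmetry from a profile;
      # the profile and both formulas are illustrative conventions only.
      import numpy as np

      x = np.linspace(-1.0, 1.0, 201)                    # normalized off-axis
      profile = np.exp(-(x / 0.9) ** 20) * (1 + 0.01 * x)  # hypothetical beam

      core = profile[(x > -0.8) & (x < 0.8)]             # central 80% of field
      dmax, dmin = core.max(), core.min()
      flatness = 100.0 * (dmax - dmin) / (dmax + dmin)   # variation-type (%)

      left = profile[(x > -0.8) & (x < 0.0)]
      right = profile[(x > 0.0) & (x < 0.8)][::-1]       # mirror right half
      n = min(left.size, right.size)
      symmetry = 100.0 * np.max(np.abs(left[:n] - right[:n])) / core.max()

      print(f"flatness {flatness:.2f}%, symmetry {symmetry:.2f}%")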

  19. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data

    PubMed Central

    Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel GM

    2006-01-01

    Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs, and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough, programming-wise, to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate use of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output against two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by its extensive graphical output, its click-and-go (Excel) interface, and its educational features. Conclusion The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge from the program's website. PMID:17038197
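
    As a generic illustration of the core computation such packages perform (not MIX's own code), the sketch below pools hypothetical study effects by inverse-variance weighting and reports Cochran's Q and I^2:

      # Generic fixed-effect (inverse-variance) meta-analysis sketch;
      # effect sizes and standard errors below are hypothetical.
      import numpy as np

      effects = np.array([0.30, 0.15, 0.42, 0.25])   # per-study effect sizes
      se = np.array([0.10, 0.08, 0.15, 0.12])        # their standard errors

      w = 1.0 / se**2                                # inverse-variance weights
      pooled = np.sum(w * effects) / np.sum(w)       # pooled effect
      pooled_se = np.sqrt(1.0 / np.sum(w))
      lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

      # Cochran's Q and I^2 quantify between-study heterogeneity.
      Q = np.sum(w * (effects - pooled) ** 2)
      I2 = max(0.0, (Q - (len(effects) - 1)) / Q) * 100.0

      print(f"pooled {pooled:.3f} (95% CI {lo:.3f} to {hi:.3f}), I^2 {I2:.0f}%")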

  20. A multimedia consent tool for research participants in the Gambia: a randomized controlled trial.

    PubMed

    Afolabi, Muhammed Olanrewaju; McGrath, Nuala; D'Alessandro, Umberto; Kampmann, Beate; Imoukhuede, Egeruan B; Ravinetto, Raffaella M; Alexander, Neal; Larson, Heidi J; Chandramohan, Daniel; Bojang, Kalifa

    2015-05-01

    To assess the effectiveness of a multimedia informed consent tool for adults participating in a clinical trial in the Gambia. Adults eligible for inclusion in a malaria treatment trial (n = 311) were randomized to receive information needed for informed consent using either a multimedia tool (intervention arm) or a standard procedure (control arm). A computerized, audio questionnaire was used to assess participants' comprehension of informed consent. This was done immediately after consent had been obtained (at day 0) and at subsequent follow-up visits (days 7, 14, 21 and 28). The acceptability and ease of use of the multimedia tool were assessed in focus groups. On day 0, the median comprehension score in the intervention arm was 64% compared with 40% in the control arm (P = 0.042). The difference remained significant at all follow-up visits. Poorer comprehension was independently associated with female sex (odds ratio, OR: 0.29; 95% confidence interval, CI: 0.12-0.70) and residing in Jahaly rather than Basse province (OR: 0.33; 95% CI: 0.13-0.82). There was no significant independent association with educational level. The risk that a participant's comprehension score would drop to half of the initial value was lower in the intervention arm (hazard ratio 0.22, 95% CI: 0.16-0.31). Overall, 70% (42/60) of focus group participants from the intervention arm found the multimedia tool clear and easy to understand. A multimedia informed consent tool significantly improved comprehension and retention of consent information by research participants with low levels of literacy.

  1. DASS-GUI: a user interface for identification and analysis of significant patterns in non-sequential data.

    PubMed

    Hollunder, Jens; Friedel, Maik; Kuiper, Martin; Wilhelm, Thomas

    2010-04-01

    Many large 'omics' datasets have been published and many more are expected in the near future. New analysis methods are needed to exploit them fully. We have developed a graphical user interface (GUI) for easy data analysis. Our discovery of all significant substructures (DASS) approach elucidates the underlying modularity, a typical feature of complex biological data. It is related to biclustering and other data mining approaches. Importantly, DASS-GUI also allows handling of multi-sets and calculation of statistical significances. DASS-GUI contains tools for further analysis of the identified patterns: analysis of the pattern hierarchy, enrichment analysis, module validation, analysis of additional numerical data, easy handling of synonymous names, clustering, filtering and merging. Different export options allow easy use of additional tools such as Cytoscape. Source code, pre-compiled binaries for different systems, a comprehensive tutorial, case studies and many additional datasets are freely available at http://www.ifr.ac.uk/dass/gui/. DASS-GUI is implemented in Qt.
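
    The significance calculations behind enrichment analysis of an identified pattern are typically hypergeometric; the sketch below is a generic example with hypothetical counts, not DASS-GUI's exact statistics:

      # Generic enrichment significance sketch: is an annotation
      # overrepresented in an identified module? All counts hypothetical.
      from scipy.stats import hypergeom

      N = 5000   # genes in the background set
      K = 120    # background genes carrying some annotation
      n = 60     # genes in the identified module
      k = 11     # module genes carrying the annotation

      # P(X >= k) under random sampling without replacement.
      p = hypergeom.sf(k - 1, N, K, n)
      print(f"enrichment p-value: {p:.2e}")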

  2. CalFitter: a web server for analysis of protein thermal denaturation data.

    PubMed

    Mazurenko, Stanislav; Stourac, Jan; Kunka, Antonin; Nedeljkovic, Sava; Bednar, David; Prokop, Zbynek; Damborsky, Jiri

    2018-05-14

    Despite significant advances in the understanding of protein structure-function relationships, revealing protein folding pathways still poses a challenge due to the limited number of relevant experimental tools. Widely used experimental techniques, such as calorimetry and spectroscopy, depend critically on proper data analysis. Currently, only separate data analysis tools are available for each type of experiment, with limited model selection. To address this problem, we have developed the CalFitter web server as a unified platform for comprehensive fitting and analysis of protein thermal denaturation data. The server allows simultaneous global data fitting using any combination of input data types and offers 12 protein unfolding pathway models for selection, including irreversible transitions often missing from other tools. The data fitting produces optimal parameter values, their confidence intervals, and statistical information to define unfolding pathways. The server provides an interactive and easy-to-use interface that allows users to directly analyse input datasets and simulate modelled output based on the model parameters. The CalFitter web server is available free of charge at https://loschmidt.chemi.muni.cz/calfitter/.
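
    As an illustration of the simplest member of such a model family, the sketch below fits a two-state reversible unfolding model (van't Hoff form, no heat-capacity term) to a synthetic melting curve; CalFitter's twelve models and global fitting go well beyond this:

      # Sketch: fit a two-state N <-> U unfolding model to synthetic data.
      # All parameter values are hypothetical; this is not CalFitter code.
      import numpy as np
      from scipy.optimize import curve_fit

      R = 8.314  # gas constant, J/(mol K)

      def two_state(T, dHm, Tm, yN, mN, yU, mU):
          """Observed signal for a two-state transition (no dCp term)."""
          K = np.exp(-dHm * (1.0 - T / Tm) / (R * T))  # unfolding constant
          fU = K / (1.0 + K)                           # fraction unfolded
          return (yN + mN * T) * (1 - fU) + (yU + mU * T) * fU

      T = np.linspace(290, 360, 80)
      true = two_state(T, 3.0e5, 330.0, 1.0, -1e-3, 0.2, -5e-4)
      rng = np.random.default_rng(1)
      data = true + rng.normal(0, 0.01, T.size)        # noisy melting curve

      popt, _ = curve_fit(two_state, T, data,
                          p0=[2.5e5, 325.0, 1.0, 0.0, 0.2, 0.0])
      print(f"fitted Tm = {popt[1]:.1f} K, dHm = {popt[0] / 1e3:.0f} kJ/mol")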

  3. Application of enhanced modern structured analysis techniques to Space Station Freedom electric power system requirements

    NASA Technical Reports Server (NTRS)

    Biernacki, John; Juhasz, John; Sadler, Gerald

    1991-01-01

    A team of Space Station Freedom (SSF) system engineers is performing an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.

  4. MetaDP: a comprehensive web server for disease prediction of 16S rRNA metagenomic datasets.

    PubMed

    Xu, Xilin; Wu, Aiping; Zhang, Xinlei; Su, Mingming; Jiang, Taijiao; Yuan, Zhe-Ming

    2016-01-01

    High-throughput sequencing-based metagenomics has garnered considerable interest in recent years. Numerous methods and tools have been developed for the analysis of metagenomic data. However, it is still a daunting task to install a large number of tools and complete a complicated analysis, especially for researchers with minimal bioinformatics backgrounds. To address this problem, we constructed an automated software package named MetaDP for 16S rRNA sequencing data analysis, including data quality control, operational taxonomic unit clustering, diversity analysis, and disease risk prediction modeling. Furthermore, a support vector machine-based prediction model for irritable bowel syndrome (IBS) was built by applying MetaDP to microbial 16S sequencing data from 108 children. The success of the IBS prediction model suggests that the platform may also be applied to other diseases related to gut microbes, such as obesity, metabolic syndrome, or intestinal cancer, among others (http://metadp.cn:7001/).

  5. Preliminary design of mesoscale turbocompressor and rotordynamics tests of rotor bearing system

    NASA Astrophysics Data System (ADS)

    Hossain, Md Saddam

    2011-12-01

    A mesoscale turbocompressor spinning above 500,000 RPM is an evolutionary technology for micro turbochargers, turbo blowers, turbo compressors, micro-gas turbines, auxiliary power units, etc., in the automotive, aerospace, and fuel cell industries. The objectives of this work are: (1) to evaluate different air foil bearings designed for the intended applications, and (2) to design and perform CFD analysis of a micro-compressor. CFD analysis of a shrouded 3-D micro compressor was conducted using Ansys BladeGen as the blade generation tool, ICEM CFD as the mesh generation tool, and CFX as the main solver, for different design and off-design cases and for different numbers of blades. Comprehensive experimental facilities for testing the turbocompressor system have also been designed and proposed for future work.

  6. Advanced Launch Technology Life Cycle Analysis Using the Architectural Comparison Tool (ACT)

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.

    2015-01-01

    Life cycle technology impact comparisons for nanolauncher technology concepts were performed using an Affordability Comparison Tool (ACT) prototype. The analysis examined cost drivers and whether technology investments can dramatically affect life cycle characteristics. Primary among the selected applications was the prospect of improving nanolauncher systems. As a result, findings and conclusions are documented for ways of creating more productive and affordable nanolauncher systems; e.g., an Express Lane-Flex Lane concept is put forward, and the beneficial effect of incorporating advanced integrated avionics is explored. Also, a Functional Systems Breakdown Structure (F-SBS) was developed to derive consistent definitions of the flight and ground systems for both system performance and life cycle analysis. Further, a comprehensive catalog of ground segment functions was created.

  7. Multimedia and Textual Reading Comprehension: Multimedia as Personal Learning Environment's Enriching Format

    ERIC Educational Resources Information Center

    García, J. Daniel; Rigo, Eduardo; Jiménez, Rafael

    2017-01-01

    In this article we will discuss part of a piece of research that was conducted with two 4ESO groups. Textual learning is opposed to multimedia learning within the context of PLE's (Personal Learning Environment) reading tools and strategies. In the research an analysis was made of whether it would be possible to improve the reading process through…

  8. NetMap: a new tool in support of watershed science and resource management.

    Treesearch

    L. Benda; D. Miller; K. Andras; P. Bigelow; G. Reeves; D. Michael

    2007-01-01

    In this paper, we show how application of principles of river ecology can guide use of a comprehensive terrain database within geographic information system (GIS) to facilitate watershed analysis relevant to natural resource management. We present a unique arrangement of a terrain database, GIS, and principles of riverine ecology for the purpose of advancing watershed...

  9. Envelope analysis of rotating machine vibrations in variable speed conditions: A comprehensive treatment

    NASA Astrophysics Data System (ADS)

    Abboud, D.; Antoni, J.; Sieg-Zieba, S.; Eltabach, M.

    2017-02-01

    Nowadays, the vibration analysis of rotating machine signals is a well-established methodology, rooted in powerful tools offered, in particular, by the theory of cyclostationary (CS) processes. Among them, the squared envelope spectrum (SES) is probably the most popular for detecting random CS components, which are typical symptoms of, for instance, rolling element bearing faults. Recent research has shifted towards extending existing CS tools, originally devised for constant speed conditions, to variable speed conditions. Many of these works combine the SES with computed order tracking after some preprocessing steps. The principal objective of this paper is to organize this dispersed research into a structured, comprehensive framework. Three original contributions are made. First, a model of rotating machine signals is introduced which sheds light on the various components to be expected in the SES. Second, a critical comparison is made of three sophisticated methods, namely the improved synchronous average, cepstrum prewhitening, and the generalized synchronous average, used for suppressing the deterministic part. Also, a general envelope enhancement methodology which combines the latter two techniques with a time-domain filtering operation is revisited. All theoretical findings are experimentally validated on simulated and real-world vibration signals.
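
    For reference, the SES itself is straightforward to compute at constant speed; the sketch below uses a hypothetical bearing-fault-like signal, and deliberately omits the order-tracking step that the variable-speed extensions add:

      # Squared envelope spectrum (SES) sketch at constant speed; the signal
      # is a synthetic stand-in: 97 Hz impacts exciting a 3 kHz resonance.
      import numpy as np
      from scipy.signal import hilbert

      fs = 20_000                                   # sampling rate (Hz)
      t = np.arange(0, 1.0, 1 / fs)
      fault_hz, resonance_hz = 97.0, 3_000.0

      rng = np.random.default_rng(0)
      impacts = (np.sin(2 * np.pi * fault_hz * t) > 0.999).astype(float)
      burst = np.sin(2 * np.pi * resonance_hz * t[:200]) * np.exp(-2e3 * t[:200])
      x = np.convolve(impacts, burst, mode="same") \
          + 0.05 * rng.standard_normal(t.size)

      env2 = np.abs(hilbert(x)) ** 2                # squared envelope
      env2 -= env2.mean()                           # drop the DC term
      ses = np.abs(np.fft.rfft(env2)) / t.size      # squared envelope spectrum
      freqs = np.fft.rfftfreq(t.size, 1 / fs)
      print(f"dominant envelope frequency: {freqs[ses.argmax()]:.0f} Hz")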

  10. Protocol for a prospective, school-based standardisation study of a digital social skills assessment tool for children: The Paediatric Evaluation of Emotions, Relationships, and Socialisation (PEERS) study

    PubMed Central

    Thompson, Emma J; Beauchamp, Miriam H; Darling, Simone J; Hearps, Stephen J C; Brown, Amy; Charalambous, George; Crossley, Louise; Darby, David; Dooley, Julian J; Greenham, Mardee; Jaimangal, Mohinder; McDonald, Skye; Muscara, Frank; Turkstra, Lyn; Anderson, Vicki A

    2018-01-01

    Background Humans are by nature a social species, with much of human experience spent in social interaction. Unsurprisingly, social functioning is crucial to well-being and quality of life across the lifespan. While early intervention for social problems appears promising, our ability to identify the specific impairments underlying their social problems (eg, social communication) is restricted by a dearth of accurate, ecologically valid and comprehensive child-direct assessment tools. Current tools are largely limited to parent and teacher ratings scales, which may identify social dysfunction, but not its underlying cause, or adult-based experimental tools, which lack age-appropriate norms. The present study describes the development and standardisation of Paediatric Evaluation of Emotions, Relationships, and Socialisation (PEERS®), an iPad-based social skills assessment tool. Methods The PEERS project is a cross-sectional study involving two groups: (1) a normative group, recruited from early childhood, primary and secondary schools across metropolitan and regional Victoria, Australia; and (2) a clinical group, ascertained from outpatient services at The Royal Children’s Hospital Melbourne (RCH). The project aims to establish normative data for PEERS®, a novel and comprehensive app-delivered child-direct measure of social skills for children and youth. The project involves recruiting and assessing 1000 children aged 4.0–17.11 years. Assessments consist of an intellectual screen, PEERS® subtests, and PEERS-Q, a self-report questionnaire of social skills. Parents and teachers also complete questionnaires relating to participants’ social skills. Main analyses will comprise regression-based continuous norming, factor analysis and psychometric analysis of PEERS® and PEERS-Q. Ethics and dissemination Ethics approval has been obtained through the RCH Human Research Ethics Committee (34046), the Victorian Government Department of Education and Early Childhood Development (002318), and Catholic Education Melbourne (2166). Findings will be disseminated through international conferences and peer-reviewed journals. Following standardisation of PEERS®, the tool will be made commercially available. PMID:29439065

  11. Generating community-built tools for data sharing and analysis in environmental networks

    USGS Publications Warehouse

    Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David

    2016-01-01

    Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.

  12. Optimization Issues with Complex Rotorcraft Comprehensive Analysis

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.; Young, Katherine C.; Tarzanin, Frank J.; Hirsh, Joel E.; Young, Darrell K.

    1998-01-01

    This paper investigates the use of the general purpose automatic differentiation (AD) tool called Automatic Differentiation of FORTRAN (ADIFOR) as a means of generating sensitivity derivatives for use in Boeing Helicopter's proprietary comprehensive rotor analysis code (VII). ADIFOR transforms an existing computer program into a new program that performs a sensitivity analysis in addition to the original analysis. In this study both the pros (exact derivatives, no step-size problems) and cons (more CPU, more memory) of ADIFOR are discussed. The size (based on the number of lines) of the VII code after ADIFOR processing increased by 70 percent and resulted in substantial computer memory requirements at execution. The ADIFOR derivatives took about 75 percent longer to compute than the finite-difference derivatives. However, the ADIFOR derivatives are exact and are not functions of step-size. The VII sensitivity derivatives generated by ADIFOR are compared with finite-difference derivatives. The ADIFOR and finite-difference derivatives are used in three optimization schemes to solve a low vibration rotor design problem.
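
    The principle that AD propagates exact derivatives alongside values (rather than approximating them with a step size) can be illustrated with forward-mode dual numbers; this is a conceptual Python sketch, not how ADIFOR transforms FORTRAN source:

      # Conceptual sketch of forward-mode automatic differentiation via dual
      # numbers: exact derivatives, no finite-difference step size.
      import math
      from dataclasses import dataclass

      @dataclass
      class Dual:
          val: float   # function value
          dot: float   # derivative w.r.t. the chosen input

          def __add__(self, o):
              return Dual(self.val + o.val, self.dot + o.dot)

          def __mul__(self, o):   # product rule
              return Dual(self.val * o.val,
                          self.val * o.dot + self.dot * o.val)

      def sin(d: Dual) -> Dual:   # chain rule for sin
          return Dual(math.sin(d.val), math.cos(d.val) * d.dot)

      # f(x) = x * sin(x); seed dx/dx = 1 to get df/dx exactly.
      x = Dual(1.3, 1.0)
      f = x * sin(x)
      print(f.val, f.dot)         # value and exact derivative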

  13. Integrated web visualizations for protein-protein interaction databases.

    PubMed

    Jeanquartier, Fleur; Jean-Quartier, Claire; Holzinger, Andreas

    2015-06-16

    Understanding living systems is crucial for curing diseases. To achieve this task we have to understand biological networks based on protein-protein interactions. Bioinformatics has produced a great number of databases and tools that support analysts in exploring protein-protein interactions on an integrated level for knowledge discovery. They provide predictions and correlations, indicate possibilities for future experimental research and fill the gaps to complete the picture of biochemical processes. There are numerous and huge databases of protein-protein interactions used to gain insights into answering some of the many questions of systems biology. Many computational resources integrate interaction data with additional information on molecular background. However, the vast number of diverse bioinformatics resources poses an obstacle to the goal of understanding. We present a survey of databases that enable the visual analysis of protein networks. We selected 10 out of 53 resources supporting visualization and tested them against the following set of criteria: interoperability, data integration, quantity of possible interactions, data visualization quality and data coverage. The study reveals differences in usability, visualization features and quality as well as the quantity of interactions. StringDB is the recommended first choice. CPDB presents a comprehensive dataset and IntAct lets the user change the network layout. A comprehensive comparison table is available on the web; the supplementary table can be accessed at http://tinyurl.com/PPI-DB-Comparison-2015. Only some web resources featuring graph visualization can be successfully applied to interactive visual analysis of protein-protein interactions. Study results underline the necessity for further enhancements of visualization integration in biochemical analysis tools. Identified challenges are data comprehensiveness, confidence, interactive features and visualization maturity.

  14. A comprehensive linear programming tool to optimize formulations of ready-to-use therapeutic foods: An application to Ethiopia

    USDA-ARS?s Scientific Manuscript database

    Ready-to-use therapeutic food (RUTF) is the standard of care for children suffering from noncomplicated severe acute malnutrition (SAM). The objective was to develop a comprehensive linear programming (LP) tool to create novel RUTF formulations for Ethiopia. A systematic approach that surveyed inter...
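
    The core of such a tool is a standard LP: minimize formulation cost subject to nutrient constraints. The sketch below shows the structure with hypothetical ingredients, nutrient contents, and targets, not the Ethiopian data:

      # LP sketch: choose ingredient amounts minimizing cost subject to
      # nutrient minimums. All numbers are hypothetical placeholders.
      import numpy as np
      from scipy.optimize import linprog

      cost = np.array([0.8, 2.5, 1.2])       # $/100 g: cereal, milk powder, oil

      # Rows: energy (kcal) and protein (g) per 100 g of each ingredient.
      nutrients = np.array([[350.0, 500.0, 880.0],
                            [10.0, 26.0, 0.0]])
      minimum = np.array([520.0, 13.0])      # per-100 g formulation targets

      # linprog enforces A_ub @ x <= b_ub, so negate to express "at least".
      res = linprog(cost, A_ub=-nutrients, b_ub=-minimum,
                    bounds=[(0.0, 1.0)] * 3, method="highs")
      print(res.x, res.fun)                  # ingredient fractions and cost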

  15. Databases and Web Tools for Cancer Genomics Study

    PubMed Central

    Yang, Yadong; Dong, Xunong; Xie, Bingbing; Ding, Nan; Chen, Juan; Li, Yongjun; Zhang, Qian; Qu, Hongzhu; Fang, Xiangdong

    2015-01-01

    Publicly-accessible resources have promoted the advance of scientific discovery. The era of genomics and big data has brought the need for collaboration and data sharing in order to make effective use of this new knowledge. Here, we describe the web resources for cancer genomics research and rate them on the basis of the diversity of cancer types, sample size, omics data comprehensiveness, and user experience. The resources reviewed include data repository and analysis tools; and we hope such introduction will promote the awareness and facilitate the usage of these resources in the cancer research community. PMID:25707591

  16. Functional Analysis of OMICs Data and Small Molecule Compounds in an Integrated "Knowledge-Based" Platform.

    PubMed

    Dubovenko, Alexey; Nikolsky, Yuri; Rakhmatulin, Eugene; Nikolskaya, Tatiana

    2017-01-01

    Analysis of NGS and other sequencing data, gene variants, gene expression, proteomics, and other high-throughput (OMICs) data is challenging because of its biological complexity and high level of technical and biological noise. One way to deal with both problems is to perform analysis with a high fidelity annotated knowledgebase of protein interactions, pathways, and functional ontologies. This knowledgebase has to be structured in a computer-readable format and must include software tools for managing experimental data, analysis, and reporting. Here, we present MetaCore™ and Key Pathway Advisor (KPA), an integrated platform for functional data analysis. On the content side, MetaCore and KPA encompass a comprehensive database of molecular interactions of different types, pathways, network models, and ten functional ontologies covering human, mouse, and rat genes. The analytical toolkit includes tools for gene/protein list enrichment analysis, a statistical "interactome" tool for the identification of over- and under-connected proteins in the dataset, and a biological network analysis module made up of network generation algorithms and filters. The suite also features Advanced Search, an application for combinatorial search of the database content, as well as a Java-based tool called Pathway Map Creator for drawing and editing custom pathway maps. Applications of MetaCore and KPA include research into the molecular mode of action of disease, identification of potential biomarkers and drug targets, pathway hypothesis generation, analysis of biological effects for novel small molecule compounds, and clinical applications (analysis of large cohorts of patients, and translational and personalized medicine).

  17. Differential gene and transcript expression analysis of RNA-seq experiments with TopHat and Cufflinks

    PubMed Central

    Trapnell, Cole; Roberts, Adam; Goff, Loyal; Pertea, Geo; Kim, Daehwan; Kelley, David R; Pimentel, Harold; Salzberg, Steven L; Rinn, John L; Pachter, Lior

    2012-01-01

    Recent advances in high-throughput cDNA sequencing (RNA-seq) can reveal new genes and splice variants and quantify expression genome-wide in a single assay. The volume and complexity of data from RNA-seq experiments necessitate scalable, fast and mathematically principled analysis software. TopHat and Cufflinks are free, open-source software tools for gene discovery and comprehensive expression analysis of high-throughput mRNA sequencing (RNA-seq) data. Together, they allow biologists to identify new genes and new splice variants of known ones, as well as compare gene and transcript expression under two or more conditions. This protocol describes in detail how to use TopHat and Cufflinks to perform such analyses. It also covers several accessory tools and utilities that aid in managing data, including CummeRbund, a tool for visualizing RNA-seq analysis results. Although the procedure assumes basic informatics skills, these tools assume little to no background with RNA-seq analysis and are meant for novices and experts alike. The protocol begins with raw sequencing reads and produces a transcriptome assembly, lists of differentially expressed and regulated genes and transcripts, and publication-quality visualizations of analysis results. The protocol's execution time depends on the volume of transcriptome sequencing data and available computing resources but takes less than 1 d of computer time for typical experiments and ~1 h of hands-on time. PMID:22383036

  18. miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.

    PubMed

    Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M

    2009-07-01

    Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. Given this situation, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The web server tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/.

  19. Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.

    2007-01-01

    Space science models are an essential component of an integrated data environment. They are indispensable tools for making effective use of a wide variety of distributed scientific sources and for placing multi-point local measurements into global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them into the required format. The CCMC provides a tailored web-based visualization interface for model output, the capability to download simulation output in a portable standard format with comprehensive metadata, and a user-friendly library of model-output analysis routines that can be called from any language that supports C. The CCMC is also developing data interpolation tools that enable model output to be presented in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for an integrated data environment.

  20. Role of Knowledge Management in Development and Lifecycle Management of Biopharmaceuticals.

    PubMed

    Rathore, Anurag S; Garcia-Aponte, Oscar Fabián; Golabgir, Aydin; Vallejo-Diaz, Bibiana Margarita; Herwig, Christoph

    2017-02-01

    Knowledge Management (KM) is a key enabler for achieving quality in a lifecycle approach for production of biopharmaceuticals. Due to the important role that it plays towards successful implementation of Quality by Design (QbD), an analysis of KM solutions is needed. This work provides a comprehensive review of the interface between KM and QbD-driven biopharmaceutical production systems from both academic and industrial viewpoints. A comprehensive set of 356 publications addressing the applications of KM tools to QbD-related tasks was screened, and a query to gather industrial inputs from 17 major biopharmaceutical organizations was performed. Three KM tool classes were identified as having high relevance for biopharmaceutical production systems and have been further explored: knowledge indicators, ontologies, and process modeling. A proposed categorization of 16 distinct KM tool classes allowed for the identification of holistic technologies supporting QbD. In addition, the classification allowed for addressing the disparity between industrial and academic expectations regarding the application of KM methodologies. This is a first-of-its-kind attempt, and thus we think this paper will be of considerable interest to those in academia and industry who are engaged in accelerating the development and commercialization of biopharmaceuticals.

  1. Stakeholders' Perspectives towards the Use of the Comprehensive Health Assessment Program (CHAP) for Adults with Intellectual Disabilities in Manitoba.

    PubMed

    Shooshtari, Shahin; Temple, Beverley; Waldman, Celeste; Abraham, Sneha; Ouellette-Kuntz, Héléne; Lennox, Nicholas

    2017-07-01

    No standardized tool is used in Canada for comprehensive health assessments of adults with intellectual disabilities. This study was conducted to determine the feasibility of implementing the Comprehensive Health Assessment Program (CHAP) in Manitoba, Canada. This was a qualitative study using a purposive sample of physicians, nurse practitioners, support workers and families. Data were collected through individual interviews and focus groups and were analysed using content analysis. Use of the CHAP was perceived as beneficial for persons with intellectual disabilities; improved continuity of care was one of the reported benefits. Six barriers to future implementation of the CHAP were identified, including the time required to complete the CHAP and a perceived lack of willingness among physicians to do comprehensive assessments. Future implementation of the CHAP was strongly supported. For its successful implementation, training of healthcare professionals and support staff and changes in regulations and policies were recommended. © 2016 John Wiley & Sons Ltd.

  2. Air Data Report Improves Flight Safety

    NASA Technical Reports Server (NTRS)

    2007-01-01

    NASA's Aviation Safety Program in the NASA Aeronautics Research Mission Directorate, which seeks to make aviation safer by developing tools for flight data analysis and interpretation and then by transferring these tools to the aviation industry, sponsored the development of the Morning Report software. The software, created at Ames Research Center with the assistance of the Pacific Northwest National Laboratory, seeks to detect atypicalities without any predefined parameters: it spots deviations and highlights them. In 2004, Sagem Avionics Inc. entered a licensing agreement with NASA for the commercialization of the Morning Report software, and also licensed the NASA Aviation Data Integration System (ADIS) tool, which allows for the integration of data from disparate sources into the flight data analysis process. Sagem Avionics incorporated the Morning Report tool into its AGS product, a comprehensive flight operations monitoring system that helps users detect irregular or divergent practices, technical flaws, and problems that might develop when aircraft operate outside of normal procedures. Sagem developed AGS in collaboration with airlines, so that the system takes into account their technical evolutions and needs, and each airline is able to easily perform specific treatments and to build its own flight data analysis system. Further, the AGS is designed to support any aircraft and flight data recorders.
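
    Atypicality detection without predefined exceedance parameters can be sketched with a generic unsupervised detector; this illustrates the spirit of the approach, not the Morning Report algorithm:

      # Generic unsupervised atypicality sketch: flag flights that deviate
      # broadly from the rest of the fleet, with no predefined thresholds.
      # (Illustrative only; not the Morning Report algorithm.)
      import numpy as np
      from sklearn.ensemble import IsolationForest

      rng = np.random.default_rng(0)
      flights = rng.normal(size=(500, 12))     # hypothetical per-flight features
      flights[7] += 6.0                        # one flight that deviates broadly

      detector = IsolationForest(random_state=0).fit(flights)
      scores = detector.score_samples(flights) # lower = more atypical
      print("most atypical flights:", np.argsort(scores)[:3])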

  3. System level airworthiness tool: A comprehensive approach to small unmanned aircraft system airworthiness

    NASA Astrophysics Data System (ADS)

    Burke, David A.

    One of the pillars of aviation safety is assuring sound engineering practices through airworthiness certification. As Unmanned Aircraft Systems (UAS) grow in popularity, the need for airworthiness standards and verification methods tailored for UAS becomes critical. While airworthiness practices for large UAS may be similar to manned aircraft, it is clear that small UAS require a paradigm shift from the airworthiness practices of manned aircraft. Although small in comparison to manned aircraft these aircraft are not merely remote controlled toys. Small UAS may be complex aircraft flying in the National Airspace System (NAS) over populated areas for extended durations and beyond line of sight of the operators. A comprehensive systems engineering framework for certifying small UAS at the system level is needed. This work presents a point based tool that evaluates small UAS by rewarding good engineering practices in design, analysis, and testing. The airworthiness requirements scale with vehicle size and operational area, while allowing flexibility for new technologies and unique configurations.

  4. Combining qualitative and quantitative methods to analyze serious games outcomes: A pilot study for a new cognitive screening tool.

    PubMed

    Vallejo, Vanessa; Mitache, Andrei V; Tarnanas, Ioannis; Muri, Rene; Mosimann, Urs P; Nef, Tobias

    2015-08-01

    Computer games for a serious purpose, so-called serious games, can provide additional information for the screening and diagnosis of cognitive impairment. Moreover, they have the advantage of being an ecological tool, since they involve tasks of daily living. However, there is a need for more comprehensive designs regarding the acceptance of this technology, as the target population is older adults who are not used to interacting with novel technologies. Moreover, given the complexity of the diagnosis and the need for precise assessment, an evaluation of the best approach to analyzing the performance data is required. The present study examines the usability of a new screening tool and proposes several new outlines for data analysis.

  5. Comprehension Tools for Teachers: Reading for Understanding from Prekindergarten through Fourth Grade

    PubMed Central

    Connor, Carol McDonald; Phillips, Beth M.; Kaschak, Michael; Apel, Kenn; Kim, Young-Suk; Al Otaiba, Stephanie; Crowe, Elizabeth C.; Thomas-Tate, Shurita; Johnson, Lakeisha Cooper; Lonigan, Christopher J.

    2015-01-01

    This paper describes the theoretical framework, as well as the development and testing of the intervention, Comprehension Tools for Teachers (CTT), which is composed of eight component interventions targeting malleable language and reading comprehension skills that emerging research indicates contribute to proficient reading for understanding for prekindergarteners through fourth graders. Component interventions target processes considered largely automatic as well as more reflective processes, with interacting and reciprocal effects. Specifically, we present component interventions targeting cognitive, linguistic, and text-specific processes, including morphological awareness, syntax, mental-state verbs, comprehension monitoring, narrative and expository text structure, enacted comprehension, academic knowledge, and reading to learn from informational text. Our aim was to develop a tool set composed of intensive meaningful individualized small group interventions. We improved feasibility in regular classrooms through the use of design-based iterative research methods including careful lesson planning, targeted scripting, pre- and postintervention proximal assessments, and technology. In addition to the overall framework, we discuss seven of the component interventions and general results of design and efficacy studies. PMID:26500420

  6. A multimedia consent tool for research participants in the Gambia: a randomized controlled trial

    PubMed Central

    McGrath, Nuala; D’Alessandro, Umberto; Kampmann, Beate; Imoukhuede, Egeruan B; Ravinetto, Raffaella M; Alexander, Neal; Larson, Heidi J; Chandramohan, Daniel; Bojang, Kalifa

    2015-01-01

    Abstract Objective To assess the effectiveness of a multimedia informed consent tool for adults participating in a clinical trial in the Gambia. Methods Adults eligible for inclusion in a malaria treatment trial (n = 311) were randomized to receive information needed for informed consent using either a multimedia tool (intervention arm) or a standard procedure (control arm). A computerized, audio questionnaire was used to assess participants’ comprehension of informed consent. This was done immediately after consent had been obtained (at day 0) and at subsequent follow-up visits (days 7, 14, 21 and 28). The acceptability and ease of use of the multimedia tool were assessed in focus groups. Findings On day 0, the median comprehension score in the intervention arm was 64% compared with 40% in the control arm (P = 0.042). The difference remained significant at all follow-up visits. Poorer comprehension was independently associated with female sex (odds ratio, OR: 0.29; 95% confidence interval, CI: 0.12–0.70) and residing in Jahaly rather than Basse province (OR: 0.33; 95% CI: 0.13–0.82). There was no significant independent association with educational level. The risk that a participant’s comprehension score would drop to half of the initial value was lower in the intervention arm (hazard ratio 0.22, 95% CI: 0.16–0.31). Overall, 70% (42/60) of focus group participants from the intervention arm found the multimedia tool clear and easy to understand. Conclusion A multimedia informed consent tool significantly improved comprehension and retention of consent information by research participants with low levels of literacy. PMID:26229203

  7. [How to evaluate the application of Clinical Governance tools in the management of hospitalized hyperglycemic patients: results of a multicentric study].

    PubMed

    De Belvis, Antonio Giulio; Specchia, Maria Lucia; Ferriero, Anna Maria; Capizzi, Silvio

    2017-01-01

    Risk management is a key tool in Clinical Governance. Our project aimed to define, share, apply and measure the impact of tools and methodologies for the continuous improvement of quality of care, especially in relation to the multi-disciplinary and integrated management of the hyperglycemic patient in hospital settings. A training project, coordinated by a scientific board of experts in diabetes and health management and an Expert Meeting with representatives of all the participating centers was launched in 2014. The project involved eight hospitals through the organization of meetings with five managers and 25 speakers, including diabetologists, internists, pharmacists and nurses. The analysis showed a wide variability in the adoption of tools and processes towards a comprehensive and coordinated management of hyperglycemic patients.

  8. Comprehensiveness of care from the patient perspective: comparison of primary healthcare evaluation instruments.

    PubMed

    Haggerty, Jeannie L; Beaulieu, Marie-Dominique; Pineault, Raynald; Burge, Frederick; Lévesque, Jean-Frédéric; Santor, Darcy A; Bouharaoui, Fatima; Beaulieu, Christine

    2011-12-01

    Comprehensiveness relates both to scope of services offered and to a whole-person clinical approach. Comprehensive services are defined as "the provision, either directly or indirectly, of a full range of services to meet most patients' healthcare needs"; whole-person care is "the extent to which a provider elicits and considers the physical, emotional and social aspects of a patient's health and considers the community context in their care." Among instruments that evaluate primary healthcare, two had subscales that mapped to comprehensive services and to the community component of whole-person care: the Primary Care Assessment Tool - Short Form (PCAT-S) and the Components of Primary Care Index (CPCI, a limited measure of whole-person care). To examine how well comprehensiveness is captured in validated instruments that evaluate primary healthcare from the patient's perspective, 645 adults with at least one healthcare contact in the previous 12 months responded to six instruments that evaluate primary healthcare. Scores were normalized for descriptive comparison. Exploratory and confirmatory (structural equation modelling) factor analysis examined fit to the operational definition, and item response theory analysis examined item performance on common constructs. Over one-quarter of respondents had missing responses on services offered or the doctor's knowledge of the community. The subscales did not load on a single factor; comprehensive services and community orientation were examined separately. The community orientation subscales did not perform satisfactorily. The three comprehensive services subscales fit very modestly onto two factors: (1) most healthcare needs (from one provider) (CPCI Comprehensive Care, PCAT-S First-Contact Utilization) and (2) range of services (PCAT-S Comprehensive Services Available). Individual item performance revealed several problems. Measurement of comprehensiveness is problematic, making this attribute a priority for measure development. Range of services offered is best obtained from providers. Whole-person care is not addressed as a separate construct, but some dimensions are covered by attributes such as interpersonal communication and relational continuity.

  9. OPUS: A Comprehensive Search Tool for Remote Sensing Observations of the Outer Planets. Now with Enhanced Geometric Metadata for Cassini and New Horizons Optical Remote Sensing Instruments.

    NASA Astrophysics Data System (ADS)

    Gordon, M. K.; Showalter, M. R.; Ballard, L.; Tiscareno, M.; French, R. S.; Olson, D.

    2017-06-01

    The PDS RMS Node hosts OPUS - an accurate, comprehensive search tool for spacecraft remote sensing observations. OPUS supports Cassini: CIRS, ISS, UVIS, VIMS; New Horizons: LORRI, MVIC; Galileo SSI; Voyager ISS; and Hubble: ACS, STIS, WFC3, WFPC2.

  10. Measuring New Media Literacies: Towards the Development of a Comprehensive Assessment Tool

    ERIC Educational Resources Information Center

    Literat, Ioana

    2014-01-01

    This study assesses the psychometric properties of a newly tested self-report assessment tool for media literacy, based on the twelve new media literacy skills (NMLs) developed by Jenkins et al. (2006). The sample (N = 327) consisted of normal volunteers who completed a comprehensive online survey that measured their NML skills, media exposure,…

  11. Quantitative characterization of galectin-3-C affinity mass spectrometry measurements: Comprehensive data analysis, obstacles, shortcuts and robustness.

    PubMed

    Haramija, Marko; Peter-Katalinić, Jasna

    2017-10-30

    Affinity mass spectrometry (AMS) is an emerging tool in the study of protein•carbohydrate complexes. However, experimental obstacles and data analysis are preventing faster integration of AMS methods into the glycoscience field. Here we show how analysis of direct electrospray ionization mass spectrometry (ESI-MS) AMS data can be simplified for screening purposes, even for complex AMS spectra. A direct ESI-MS assay was tested in this study and binding data for the galectin-3C•lactose complex were analyzed using a comprehensive and simplified data analysis approach. In the comprehensive data analysis approach, noise, all protein charge states, alkali ion adducts and signal overlap were taken into account. In a simplified approach, only the intensities of the fully protonated free protein and the protein•carbohydrate complex for the main protein charge state were taken into account. In our study, for high intensity signals, noise was negligible, sodiated protein and sodiated complex signals cancelled each other out when calculating the Kd value, and signal overlap influenced the Kd value only to a minor extent. The influence of these parameters on low intensity signals was much higher. However, low intensity protein charge states should be avoided in quantitative AMS analyses due to poor ion statistics. The results indicate that noise, alkali ion adducts, signal overlap, as well as low intensity protein charge states, can be neglected for preliminary experiments, as well as in screening assays. One comprehensive data analysis performed as a control should be sufficient to validate this hypothesis for other binding systems as well. Copyright © 2017 John Wiley & Sons, Ltd.
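
    Under the 1:1 binding assumptions of the direct assay, the simplified Kd calculation follows directly from the intensity ratio R = I(PL)/I(P); the sketch below uses hypothetical concentrations, not the paper's galectin-3C data:

      # Simplified Kd sketch for a 1:1 protein-ligand system in a direct
      # ESI-MS assay. Taking R = I(PL)/I(P) as [PL]/[P]:
      #   [P]  = P0 / (1 + R),  [PL] = R * P0 / (1 + R),  [L] = L0 - [PL]
      #   Kd   = [P][L] / [PL] = (L0 - R * P0 / (1 + R)) / R
      # Concentrations below are hypothetical placeholders.
      def kd_from_ratio(R: float, P0: float, L0: float) -> float:
          bound = R * P0 / (1.0 + R)     # [PL] at equilibrium
          free_ligand = L0 - bound       # [L] remaining in solution
          return free_ligand / R         # Kd in the same units as P0, L0

      print(kd_from_ratio(R=0.8, P0=10e-6, L0=50e-6))   # ~5.7e-5 M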

  12. GenoBase: comprehensive resource database of Escherichia coli K-12

    PubMed Central

    Otsuka, Yuta; Muto, Ai; Takeuchi, Rikiya; Okada, Chihiro; Ishikawa, Motokazu; Nakamura, Koichiro; Yamamoto, Natsuko; Dose, Hitomi; Nakahigashi, Kenji; Tanishima, Shigeki; Suharnan, Sivasundaram; Nomura, Wataru; Nakayashiki, Toru; Aref, Walid G.; Bochner, Barry R.; Conway, Tyrrell; Gribskov, Michael; Kihara, Daisuke; Rudd, Kenneth E.; Tohsato, Yukako; Wanner, Barry L.; Mori, Hirotada

    2015-01-01

    Comprehensive experimental resources, such as ORFeome clone libraries and deletion mutant collections, are fundamental tools for elucidation of gene function. Data sets by omics analysis using these resources provide key information for functional analysis, modeling and simulation both in individual and systematic approaches. With the long-term goal of complete understanding of a cell, we have over the past decade created a variety of clone and mutant sets for functional genomics studies of Escherichia coli K-12. We have made these experimental resources freely available to the academic community worldwide. Accordingly, these resources have now been used in numerous investigations of a multitude of cell processes. Quality control is extremely important for evaluating results generated by these resources. Because the annotation we originally used for construction has changed since 2005, we have updated these genomic resources accordingly. Here, we describe GenoBase (http://ecoli.naist.jp/GB/), which contains key information about comprehensive experimental resources of E. coli K-12, their quality control and several omics data sets generated using these resources. PMID:25399415

  13. Development of a Comprehensive Digital Avionics Curriculum for the Aeronautical Engineer

    DTIC Science & Technology

    2006-03-01

    Excerpts from the curriculum description: graduates will be able to analyze and design aircraft and missile guidance and control systems, including feedback stabilization schemes and stochastic processes. Topics include uncertainty modeling for robust control; robust closed-loop stability and performance; robust H-infinity control; and robustness checks using mu-analysis. A further excerpt lists controlled feedback (reduces noise) and statistical group response (reduces pressure toward conformity) among the properties of a tool used to study a complex problem.

  14. PET-Tool: a software suite for comprehensive processing and managing of Paired-End diTag (PET) sequence data.

    PubMed

    Chiu, Kuo Ping; Wong, Chee-Hong; Chen, Qiongyu; Ariyaratne, Pramila; Ooi, Hong Sain; Wei, Chia-Lin; Sung, Wing-Kin Ken; Ruan, Yijun

    2006-08-25

    We recently developed the paired-end ditag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired-end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads and how to map PETs correctly yet efficiently to reference genome sequences. To accommodate and streamline data analysis of the large volume of PET sequences generated from each PET experiment, an automated PET data processing pipeline is desirable. We designed an integrated computational program package, PET-Tool, to automatically process PET sequences and map them to the genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the Project Manager module for data organization. The performance of PET-Tool was evaluated through the analyses of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. On a 2.4 GHz Linux machine, it takes approximately six hours to process one million PETs from extraction to mapping. Its speed, accuracy, and comprehensiveness have proved that PET-Tool is an important and useful component of PET experiments, and it can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checks and a system for multi-layer data management.
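
    The extraction step can be pictured with the hypothetical sketch below, which splits a concatemer read into 5'/3' tag pairs at known linker sequences. The linker sequence and the 18-bp tag length here are stand-ins for illustration only, not PET-Tool's actual parameters.

      LINKER = "GTCGGAGGCC"   # hypothetical linker sequence, not PET-Tool's actual one
      TAG_LEN = 18            # illustrative tag length

      def extract_pets(read):
          """Split a concatemer read into (5' tag, 3' tag) pairs between linkers."""
          pets = []
          for segment in read.split(LINKER):
              if len(segment) >= 2 * TAG_LEN:
                  pets.append((segment[:TAG_LEN], segment[-TAG_LEN:]))
          return pets

      ditag = "A" * TAG_LEN + "C" * TAG_LEN
      print(extract_pets(ditag + LINKER + ditag))   # two 5'/3' tag pairs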

  15. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    NASA Astrophysics Data System (ADS)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  16. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    DOE PAGES

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; ...

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  17. Widely-targeted quantitative lipidomics methodology by supercritical fluid chromatography coupled with fast-scanning triple quadrupole mass spectrometry.

    PubMed

    Takeda, Hiroaki; Izumi, Yoshihiro; Takahashi, Masatomo; Paxton, Thanai; Tamura, Shohei; Koike, Tomonari; Yu, Ying; Kato, Noriko; Nagase, Katsutoshi; Shiomi, Masashi; Bamba, Takeshi

    2018-05-03

    Lipidomics, the mass spectrometry-based comprehensive analysis of lipids, has attracted attention as an analytical approach that provides novel insight into lipid metabolism and supports the search for biomarkers. However, an ideal method for both comprehensive and quantitative analysis of lipids has not been fully developed. Herein, we propose a practical methodology for widely-targeted quantitative lipidome analysis using supercritical fluid chromatography fast-scanning triple-quadrupole mass spectrometry (SFC/QqQMS) and a theoretically calculated comprehensive lipid multiple reaction monitoring (MRM) library. Lipid classes can be separated by SFC on a normal-phase diethylamine-bonded silica column with high resolution, high throughput, and good repeatability. Structural isomers of phospholipids can be monitored by mass spectrometric separation with fatty acyl-based MRM transitions. SFC/QqQMS analysis with an internal standard-dilution method offers quantitative information both for the lipid class and for individual lipid molecular species within the same class. Additionally, data acquired using this method have advantages including a reduction of misidentification and accelerated data analysis. Using the SFC/QqQMS system, alteration of plasma lipid levels in myocardial infarction-prone rabbits in response to supplementation with eicosapentaenoic acid was observed for the first time. Our SFC/QqQMS method represents a potentially useful tool for in-depth studies of complex lipid metabolism and biomarker discovery. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
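
    As a sketch of what a theoretically calculated MRM library involves, the snippet below builds positive-mode transitions for two phosphatidylcholines, pairing the [M+H]+ precursor with the diagnostic m/z 184.07 phosphocholine head-group fragment. The species list is illustrative; the published library covers many lipid classes and also uses fatty acyl-based transitions in negative mode.

      PROTON = 1.007276
      PC_HEAD_FRAGMENT = 184.0733   # phosphocholine fragment, positive mode

      # neutral monoisotopic masses of two PC species (values for illustration)
      species = {"PC 16:0/18:1": 759.5778, "PC 18:0/18:2": 785.5935}

      mrm_library = [{"lipid": name, "Q1": mass + PROTON, "Q3": PC_HEAD_FRAGMENT}
                     for name, mass in species.items()]

      for t in mrm_library:
          print(f'{t["lipid"]}: Q1 {t["Q1"]:.4f} -> Q3 {t["Q3"]:.4f}')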

  18. Global catalogue of microorganisms (gcm): a comprehensive database and information retrieval, analysis, and visualization system for microbial resources

    PubMed Central

    2013-01-01

    Background Throughout the long history of industrial and academic research, many microbes have been isolated, characterized and preserved (whenever possible) in culture collections. With the steady accumulation of biodiversity observation data as well as microbial sequencing data, bio-resource centers have to function as data and information repositories to serve academia, industry, and regulators on behalf of and for the general public. Hence, the World Data Centre for Microorganisms (WDCM) started to take responsibility for constructing an effective information environment that would promote and sustain microbial research data activities, and bridge the gaps currently present within and outside the microbiology communities. Description Strain catalogue information was collected from collections by online submission. We developed tools for automatic extraction of strain numbers and species names from various sources, including GenBank, PubMed, and Swiss-Prot. These new tools connect strain catalogue information with the corresponding nucleotide and protein sequences, as well as with genome sequences and references citing a particular strain. All information has been processed and compiled in order to create a comprehensive database of microbial resources, named the Global Catalogue of Microorganisms (GCM). The current version of GCM contains information on over 273,933 strains, covering 43,436 bacterial, fungal and archaeal species from 52 collections in 25 countries and regions. A number of online analysis and statistical tools have been integrated, together with advanced search functions, which should greatly facilitate the exploration of the content of GCM. Conclusion A comprehensive dynamic database of microbial resources has been created, which unveils the resources preserved in culture collections, especially those whose informatics infrastructures are still under development. It should foster cumulative research and facilitate the activities of microbiologists worldwide who work in both public and industrial research centres. This database is available from http://gcm.wfcc.info. PMID:24377417
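
    The kind of text mining described for linking records can be approximated with a simple pattern matcher; the sketch below pulls culture-collection identifiers out of free text. The acronym list and the pattern are illustrative assumptions, not GCM's actual extraction rules.

      import re

      # hypothetical pattern: collection acronym + accession number + optional 'T' (type strain)
      STRAIN_RE = re.compile(r"\b(ATCC|DSM|JCM|LMG|NBRC|CGMCC)\s?(\d+)(T?)\b")

      text = ("The type strain Stenotrophomonas maltophilia LMG 958T is also "
              "deposited as ATCC 13637 and DSM 50170.")

      for acronym, number, type_flag in STRAIN_RE.findall(text):
          print(acronym, number, "(type strain)" if type_flag else "")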

  19. DGCA: A comprehensive R package for Differential Gene Correlation Analysis.

    PubMed

    McKenzie, Andrew T; Katsyv, Igor; Song, Won-Min; Wang, Minghui; Zhang, Bin

    2016-11-15

    Dissecting the regulatory relationships between genes is a critical step towards building accurate predictive models of biological systems. A powerful approach towards this end is to systematically study the differences in correlation between gene pairs in more than one distinct condition. In this study we develop an R package, DGCA (for Differential Gene Correlation Analysis), which offers a suite of tools for computing and analyzing differential correlations between gene pairs across multiple conditions. To minimize parametric assumptions, DGCA computes empirical p-values via permutation testing. To understand differential correlations at a systems level, DGCA performs higher-order analyses such as measuring the average difference in correlation and multiscale clustering analysis of differential correlation networks. Through a simulation study, we show that the straightforward z-score based method that DGCA employs significantly outperforms the existing alternative methods for calculating differential correlation. Application of DGCA to the TCGA RNA-seq data in breast cancer not only identifies key changes in the regulatory relationships between TP53 and PTEN and their target genes in the presence of inactivating mutations, but also reveals an immune-related differential correlation module that is specific to triple negative breast cancer (TNBC). DGCA is an R package for systematically assessing the difference in gene-gene regulatory relationships under different conditions. This user-friendly, effective, and comprehensive software tool will greatly facilitate the application of differential correlation analysis in many biological studies and thus will help identification of novel signaling pathways, biomarkers, and targets in complex biological systems and diseases.
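
    The z-score method at DGCA's core is the classical Fisher-transform test for a difference of two correlations; a minimal Python version is sketched below (DGCA itself is an R package and additionally derives empirical p-values by permutation).

      import numpy as np
      from scipy import stats

      def diff_corr_z(r1, n1, r2, n2):
          """z-score and two-sided p-value for the difference of two correlations."""
          z1, z2 = np.arctanh(r1), np.arctanh(r2)         # Fisher transform
          se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))   # standard error of z1 - z2
          z = (z1 - z2) / se
          return z, 2.0 * stats.norm.sf(abs(z))

      # a gene pair correlated at r=0.7 in condition A (n=50) but r=0.1 in B (n=60)
      print(diff_corr_z(0.7, 50, 0.1, 60))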

  20. ReMatch: a web-based tool to construct, store and share stoichiometric metabolic models with carbon maps for metabolic flux analysis.

    PubMed

    Pitkänen, Esa; Akerlund, Arto; Rantanen, Ari; Jouhten, Paula; Ukkonen, Esko

    2008-08-25

    ReMatch is a web-based, user-friendly tool that constructs stoichiometric network models for metabolic flux analysis, integrating user-developed models into a database collected from several comprehensive metabolic data resources, including KEGG, MetaCyc and CheBI. Particularly, ReMatch augments the metabolic reactions of the model with carbon mappings to facilitate (13)C metabolic flux analysis. The construction of a network model consisting of biochemical reactions is the first step in most metabolic modelling tasks. This model construction can be a tedious task as the required information is usually scattered to many separate databases whose interoperability is suboptimal, due to the heterogeneous naming conventions of metabolites in different databases. Another, particularly severe data integration problem is faced in (13)C metabolic flux analysis, where the mappings of carbon atoms from substrates into products in the model are required. ReMatch has been developed to solve the above data integration problems. First, ReMatch matches the imported user-developed model against the internal ReMatch database while considering a comprehensive metabolite name thesaurus. This, together with wild card support, allows the user to specify the model quickly without having to look the names up manually. Second, ReMatch is able to augment reactions of the model with carbon mappings, obtained either from the internal database or given by the user with an easy-touse tool. The constructed models can be exported into 13C-FLUX and SBML file formats. Further, a stoichiometric matrix and visualizations of the network model can be generated. The constructed models of metabolic networks can be optionally made available to the other users of ReMatch. Thus, ReMatch provides a common repository for metabolic network models with carbon mappings for the needs of metabolic flux analysis community. ReMatch is freely available for academic use at http://www.cs.helsinki.fi/group/sysfys/software/rematch/.

  1. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    PubMed

    Periwal, Vinita

    2017-07-01

    Genome editing with engineered nucleases (zinc finger nucleases, transcription activator-like effector nucleases, and clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated systems) has recently been shown to hold great promise in a variety of therapeutic and biotechnological applications. However, their exploitation in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limit the specificity of genome editing tools and can result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools can be used efficiently to guide the design of constructs for engineered nucleases and to evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it describes tools that have been developed to analyse post-genome-editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
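
    At its simplest, off-target prediction is a mismatch-tolerant scan of the genome for near-copies of the guide sequence. The naive sketch below illustrates the idea only; the reviewed tools use indexed search, PAM constraints and empirical scoring models rather than this brute-force loop.

      def mismatches(a, b):
          """Hamming distance between two equal-length sequences."""
          return sum(x != y for x, y in zip(a, b))

      def off_target_sites(genome, guide, max_mm=2):
          """Report (position, mismatch count) for windows within max_mm of the guide."""
          k = len(guide)
          hits = []
          for i in range(len(genome) - k + 1):
              mm = mismatches(genome[i:i + k], guide)
              if mm <= max_mm:
                  hits.append((i, mm))
          return hits

      genome = "ACGTACGGACGTTCGGACGAACGG"   # toy sequence
      print(off_target_sites(genome, "ACGTACGG"))   # exact site plus near-matches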

  2. Evidence-based patient information about treatment of multiple sclerosis--a phase one study on comprehension and emotional responses.

    PubMed

    Kasper, Jürgen; Köpke, Sascha; Mühlhauser, Ingrid; Heesen, Christoph

    2006-07-01

    This study analyses the comprehension and emotional responses of people suffering from multiple sclerosis when provided with an evidence-based information module, the core module of a comprehensive decision aid about immunotherapy. The core module is designed to enable patients to process scientific uncertainty without adverse effects, and it follows existing standards for risk communication and presentation of data. Using a mailing approach, we investigated 169 patients with differing courses of disease in a before-after design. Items addressed competence in processing relative and absolute risk information and patients' emotional response to the tool, comprising the grade of familiarity with the information, understanding, relevance, emotional arousal, and certainty. Overall, numeracy improved (p < 0.001), although 99 of 169 patients did not complete the numeracy task correctly. Understanding depended on the relevance related to the course of disease. A moderate level of uncertainty was induced. No adverse emotional responses could be shown, either in those who comprehended the information or in those who did not develop numeracy skills. In conclusion, the tool supports people suffering from multiple sclerosis in processing evidence-based medical information and scientific uncertainty without burdening them emotionally. This study is an example of the documentation of an important step in the development process of a complex intervention.

  3. Tools, strategies and qualitative approach in relation to suicidal attempts and ideation in the elderly.

    PubMed

    Cavalcante, Fátima Gonçalves; Minayo, Maria Cecília de Souza; Gutierrez, Denise Machado Duran; de Sousa, Girliani Silva; da Silva, Raimunda Magalhães; Moura, Rosylaine; Meneghel, Stela Nazareth; Grubits, Sonia; Conte, Marta; Cavalcante, Ana Célia Sousa; Figueiredo, Ana Elisa Bastos; Mangas, Raimunda Matilde do Nascimento; Fachola, María Cristina Heuguerot; Izquierdo, Giovane Mendieta

    2015-06-01

    The article analyses the quality and consistency of a comprehensive interview guide, adapted to study attempted suicide and suicidal ideation among the elderly, and describes the method followed in applying this tool. The objective is to show how the use of a semi-structured interview and the organization and set-up of data analysis were tested and perfected by a network of researchers from twelve universities and research centers in Brazil, Uruguay and Colombia. The method involved application and evaluation of the tool and joint production of an instruction manual on data collection, systematization and analysis. The methodology was followed in 67 interviews with elderly people aged 60 or older and in 34 interviews with health professionals in thirteen Brazilian municipalities and in Montevideo and Bogotá, allowing the consistency of the tool and the applicability of the method to be checked both during the process and at its end. The enhanced guide and the instructions for reproducing it are presented herein. The results indicate the suitability and credibility of this methodological approach, tested and certified in interdisciplinary and interinstitutional terms.

  4. Learning to Estimate Slide Comprehension in Classrooms with Support Vector Machines

    ERIC Educational Resources Information Center

    Pattanasri, N.; Mukunoki, M.; Minoh, M.

    2012-01-01

    Comprehension assessment is an essential tool in classroom learning. However, the judgment often relies on experience of an instructor who makes observation of students' behavior during the lessons. We argue that students should report their own comprehension explicitly in a classroom. With students' comprehension made available at the slide…

  5. Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets

    PubMed Central

    Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-01-01

    Selected or multiple reaction monitoring (SRM/MRM) is a liquid chromatography (LC)/tandem mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC through well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine them for a comprehensive targeted proteomics workflow. PMID:23702368

  6. Analysis of bacterial fatty acids by flow modulated comprehensive two-dimensional gas chromatography with parallel flame ionization detector/mass spectrometry.

    PubMed

    Gu, Qun; David, Frank; Lynen, Frédéric; Rumpel, Klaus; Xu, Guowang; De Vos, Paul; Sandra, Pat

    2010-06-25

    Comprehensive two-dimensional gas chromatography (GCxGC) offers an interesting tool for profiling bacterial fatty acids. Flow-modulated GCxGC using a commercially available system was evaluated, and parameters such as column flows and modulation time were optimized. The method was tested on bacterial fatty acid methyl esters (BAMEs) from Stenotrophomonas maltophilia LMG 958T using parallel flame ionization detection (FID)/mass spectrometry (MS). The results were compared to data obtained using a thermally modulated GCxGC system. The data show that the flow-modulated GCxGC-FID/MS method can be applied in a routine environment and offers interesting perspectives for the chemotaxonomy of bacteria.

  7. Architectural Visualization of C/C++ Source Code for Program Comprehension

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panas, T; Epperly, T W; Quinlan, D

    2006-09-01

    Structural and behavioral visualization of large-scale legacy systems to aid program comprehension is still a major challenge. The challenge is even greater when applications are implemented in flexible and expressive languages such as C and C++. In this paper, we consider visualization of static and dynamic aspects of large-scale scientific C/C++ applications. For our investigation, we reuse and integrate specialized analysis and visualization tools. Furthermore, we present a novel layout algorithm that permits a compressive architectural view of a large-scale software system. Our layout is unique in that it allows traditional program visualizations, i.e., graph structures, to be seen in relation to the application's file structure.

  8. Network Analytical Tool for Monitoring Global Food Safety Highlights China

    PubMed Central

    Nepusz, Tamás; Petróczi, Andrea; Naughton, Declan P.

    2009-01-01

    Background The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles allowing poor opportunities for integration. Methodology/Principal Findings We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 – August 2008 were processed using network analysis to i) capture complexity, ii) analyze trends, and iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries which are ranked using i) Google's PageRank algorithm and ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. Conclusions/Significance This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios. PMID:19688088
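
    The ranking step can be reproduced on toy data with standard graph libraries; in the sketch below, edges run from detector to transgressor and are weighted by alert counts (the countries and numbers are invented for illustration, not RASFF figures).

      import networkx as nx

      # toy detector -> transgressor edges weighted by alert counts (invented numbers)
      alerts = [("Italy", "China", 42), ("Germany", "China", 35),
                ("Italy", "Turkey", 18), ("UK", "Iran", 27), ("Germany", "Iran", 12)]

      G = nx.DiGraph()
      G.add_weighted_edges_from(alerts)

      ranks = nx.pagerank(G, weight="weight")   # overall importance in the alert network
      hubs, authorities = nx.hits(G)            # Kleinberg: hubs ~ detectors, authorities ~ transgressors

      print(sorted(authorities.items(), key=lambda kv: -kv[1]))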

  9. Browsing Space Weather Data and Models with the Integrated Space Weather Analysis (iSWA) System

    NASA Technical Reports Server (NTRS)

    Maddox, Marlo M.; Mullinix, Richard E.; Berrios, David H.; Hesse, Michael; Rastaetter, Lutz; Pulkkinen, Antti; Hourcle, Joseph A.; Thompson, Barbara J.

    2011-01-01

    The Integrated Space Weather Analysis (iSWA) System is a comprehensive web-based platform for space weather information that combines data from solar, heliospheric and geospace observatories with forecasts based on the most advanced space weather models. The iSWA system collects, generates, and presents a wide array of space weather resources in an intuitive, user-configurable, and adaptable format - thus enabling users to respond to current and future space weather impacts as well as enabling post-impact analysis. iSWA currently provides over 200 data and modeling products, and features a variety of tools that allow the user to browse, combine, and examine data and models from various sources. This presentation will consist of a summary of the iSWA products and an overview of the customizable user interfaces, and will feature several tutorial demonstrations highlighting the interactive tools and advanced capabilities.

  10. Research on criticality analysis method of CNC machine tools components under fault rate correlation

    NASA Astrophysics Data System (ADS)

    Gui-xiang, Shen; Xian-zhuo, Zhao; Zhang, Ying-zhi; Chen-yu, Han

    2018-02-01

    In order to determine the key components of CNC machine tools under fault rate correlation, a system component criticality analysis method is proposed. Based on fault mechanism analysis, the component fault relations are determined, and an adjacency matrix is introduced to describe them. The fault structure relations are then organized hierarchically using the interpretive structure model (ISM). Assuming that fault propagation obeys a Markov process, the fault association matrix is described and transformed, and the PageRank algorithm is used to determine the relative influence values; combined with the component fault rate under time correlation, a comprehensive fault rate is obtained. Based on fault mode frequency and fault influence, the criticality of the components under fault rate correlation is determined and the key components are identified, providing a sound basis for formulating reliability assurance measures. Finally, taking machining centers as an example, the effectiveness of the method is verified.
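
    The ISM step rests on the reachability matrix, the Boolean transitive closure of the fault adjacency matrix; a minimal sketch follows. The 4-component fault chain is invented for illustration, and the subsequent level partitioning and PageRank weighting are omitted.

      import numpy as np

      def reachability(adj):
          """Reachability matrix: Boolean closure of (A + I), the core ISM computation."""
          n = len(adj)
          m = (np.asarray(adj, dtype=int) | np.eye(n, dtype=int))
          for _ in range(n):
              m = ((m @ m) > 0).astype(int)
          return m

      # toy fault adjacency: component i's fault propagates to component i+1
      A = [[0, 1, 0, 0],
           [0, 0, 1, 0],
           [0, 0, 0, 1],
           [0, 0, 0, 0]]
      print(reachability(A))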

  11. TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.

    PubMed

    Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D

    2018-05-08

    Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.
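
    A single step of such a pipeline, segmenting one frame and measuring spheroid size, can be sketched with scikit-image as below; Otsu thresholding plus a largest-component rule is a stand-in for TASI's actual spatiotemporal segmentation, and the synthetic frame is invented for the demo.

      import numpy as np
      from skimage import filters, measure

      def spheroid_area(frame):
          """Area (in pixels) of the largest bright object in one time-lapse frame."""
          mask = frame > filters.threshold_otsu(frame)
          labels = measure.label(mask)
          return max((r.area for r in measure.regionprops(labels)), default=0)

      # synthetic frame: a bright disc on a dark background stands in for a spheroid
      yy, xx = np.mgrid[:200, :200]
      frame = ((yy - 100) ** 2 + (xx - 100) ** 2 < 40 ** 2).astype(float)
      print(spheroid_area(frame))   # ~ pi * 40**2 pixels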

  12. LipidMiner: A Software for Automated Identification and Quantification of Lipids from Multiple Liquid Chromatography-Mass Spectrometry Data Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Da; Zhang, Qibin; Gao, Xiaoli

    2014-04-30

    We have developed a tool for automated, high-throughput analysis of LC-MS/MS data files, which greatly simplifies LC-MS based lipidomics analysis. Our results showed that LipidMiner is accurate and comprehensive in the identification and quantification of lipid molecular species. In addition, the workflow implemented in LipidMiner is not limited to the identification and quantification of lipids. If a suitable metabolite library is implemented in the library matching module, LipidMiner could be reconfigured as a tool for general metabolomics data analysis. It is of note that LipidMiner is currently limited to singly charged ions, although this is adequate for the purpose of lipidomics, since lipids are rarely multiply charged [14], even the polyphosphoinositides. LipidMiner also only processes the .RAW file format generated by Thermo mass spectrometers. In the future, we plan to accommodate file formats generated by mass spectrometers from other major instrument vendors to make this tool more universal.

  13. Environmental Camp as a Comprehensive Communication Tool to Promote the RRR Concept to Elementary Education Students at Koh Si Chang School

    ERIC Educational Resources Information Center

    Supakata, Nuta; Puangthongthub, Sitthichok; Srithongouthai, Sarawut; Kanokkantapong, Vorapot; Chaikaew, Pasicha

    2016-01-01

    The objective of this study was to develop and implement a Reduce-Reuse-Recycle (RRR) communication strategy through environmental camp as a comprehensive communication tool to promote the RRR concept to elementary school students. Various activities from five learning bases including the folding milk carton game, waste separation relay, recycling…

  14. The IDEA Assessment Tool: Assessing the Reporting, Diagnostic Reasoning, and Decision-Making Skills Demonstrated in Medical Students' Hospital Admission Notes.

    PubMed

    Baker, Elizabeth A; Ledford, Cynthia H; Fogg, Louis; Way, David P; Park, Yoon Soo

    2015-01-01

    Construct: Clinical skills are used in the care of patients, including reporting, diagnostic reasoning, and decision-making skills. Written comprehensive new patient admission notes (H&Ps) are a ubiquitous part of student education but are underutilized in the assessment of clinical skills. The interpretive summary, differential diagnosis, explanation of reasoning, and alternatives (IDEA) assessment tool was developed to assess students' clinical skills using written comprehensive new patient admission notes. The validity evidence for assessment of clinical skills using clinical documentation following authentic patient encounters has not been well documented. Diagnostic justification tools and postencounter notes are described in the literature (1,2) but are based on standardized patient encounters. To our knowledge, the IDEA assessment tool is the first published tool that uses medical students' H&Ps to rate students' clinical skills. The IDEA assessment tool is a 15-item instrument that asks evaluators to rate students' reporting, diagnostic reasoning, and decision-making skills based on medical students' new patient admission notes. This study presents validity evidence in support of the IDEA assessment tool using Messick's unified framework, including content (theoretical framework), response process (interrater reliability), internal structure (factor analysis and internal-consistency reliability), and relationship to other variables. Validity evidence is based on results from four studies conducted between 2010 and 2013. First, the factor analysis (2010, n = 216) yielded a three-factor solution, measuring patient story, IDEA, and completeness, with reliabilities of .79, .88, and .79, respectively. Second, an initial interrater reliability study (2010) involving two raters demonstrated fair to moderate consensus (κ = .21-.56, ρ =.42-.79). Third, a second interrater reliability study (2011) with 22 trained raters also demonstrated fair to moderate agreement (intraclass correlations [ICCs] = .29-.67). There was moderate reliability for all three skill domains, including reporting skills (ICC = .53), diagnostic reasoning skills (ICC = .64), and decision-making skills (ICC = .63). Fourth, there was a significant correlation between IDEA rating scores (2010-2013) and final Internal Medicine clerkship grades (r = .24), 95% confidence interval (CI) [.15, .33]. The IDEA assessment tool is a novel tool with validity evidence to support its use in the assessment of students' reporting, diagnostic reasoning, and decision-making skills. The moderate reliability achieved supports formative or lower stakes summative uses rather than high-stakes summative judgments.

  15. Development of a Peer Teaching-Assessment Program and a Peer Observation and Evaluation Tool

    PubMed Central

    Trujillo, Jennifer M.; Barr, Judith; Gonyeau, Michael; Van Amburgh, Jenny A.; Matthews, S. James; Qualters, Donna

    2008-01-01

    Objectives To develop a formalized, comprehensive, peer-driven teaching assessment program and a valid and reliable assessment tool. Methods A volunteer taskforce was formed and a peer-assessment program was developed using a multistep, sequential approach and the Peer Observation and Evaluation Tool (POET). A pilot study was conducted to evaluate the efficiency and practicality of the process and to establish interrater reliability of the tool. Intra-class correlation coefficients (ICC) were calculated. Results ICCs for 8 separate lectures evaluated by 2-3 observers ranged from 0.66 to 0.97, indicating good interrater reliability of the tool. Conclusion Our peer assessment program for large classroom teaching, which includes a valid and reliable evaluation tool, is comprehensive, feasible, and can be adopted by other schools of pharmacy. PMID:19325963

  16. Non-targeted analysis of electronics waste by comprehensive two-dimensional gas chromatography combined with high-resolution mass spectrometry: Using accurate mass information and mass defect analysis to explore the data.

    PubMed

    Ubukata, Masaaki; Jobst, Karl J; Reiner, Eric J; Reichenbach, Stephen E; Tao, Qingping; Hang, Jiliang; Wu, Zhanpin; Dane, A John; Cody, Robert B

    2015-05-22

    Comprehensive two-dimensional gas chromatography (GC×GC) and high-resolution mass spectrometry (HRMS) offer the best possible separation of their respective techniques. Recent commercialization of combined GC×GC-HRMS systems offers new possibilities for the analysis of complex mixtures. However, such experiments yield enormous data sets that require new informatics tools to facilitate the interpretation of the rich information content. This study reports on the analysis of dust obtained from an electronics recycling facility using GC×GC in combination with a new high-resolution time-of-flight (TOF) mass spectrometer. New software tools for (non-traditional) Kendrick mass defect analysis were developed in this research and greatly aided the identification of compounds containing chlorine and bromine, elements that feature in most persistent organic pollutants (POPs). In essence, the mass defect plot serves as a visual aid from which halogenated compounds are recognizable on the basis of their mass defect and isotope patterns. Mass chromatograms were generated based on specific ions identified in the plots as well as regions of the plot predominantly occupied by halogenated contaminants. Tentative identification was aided by database searches, complementary electron-capture negative ionization experiments and elemental composition determinations from the exact mass data. These included known and emerging flame retardants, such as polybrominated diphenyl ethers (PBDEs), hexabromobenzene, tetrabromobisphenol A and tris(1-chloro-2-propyl) phosphate (TCPP), as well as other legacy contaminants such as polychlorinated biphenyls (PCBs) and polychlorinated terphenyls (PCTs). Copyright © 2015 Elsevier B.V. All rights reserved.
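
    The Kendrick mass defect underlying these plots is a one-line rescaling; the sketch below uses the traditional CH2 base, and the 'non-traditional' plots in the paper simply swap in a different base unit. The example m/z is an approximate value for a tetrabrominated diphenyl ether, for illustration only.

      CH2_MASS, CH2_NOMINAL = 14.01565, 14

      def kendrick_mass_defect(mz, base_mass=CH2_MASS, base_nominal=CH2_NOMINAL):
          """Kendrick mass and mass defect; changing the base gives non-traditional plots."""
          km = mz * (base_nominal / base_mass)
          return km, round(km) - km     # (Kendrick mass, Kendrick mass defect)

      # approximate monoisotopic m/z of a tetra-BDE: strongly negative KMD flags bromine
      print(kendrick_mass_defect(481.7153))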

  17. Transforming beliefs and practices: Elementary teacher candidates' development through shared authentic teaching and reflection experiences within an innovative science methods course

    NASA Astrophysics Data System (ADS)

    Naidoo, Kara

    Elementary teachers are criticized for failing to incorporate meaningful science instruction in their classrooms or for avoiding science instruction altogether. The lack of adequate science instruction in elementary schools is partially attributed to teacher candidates' anxiety, poor content and pedagogical preparation, and low science teaching self-efficacy. The central premise of this study was that many of these issues could be alleviated through course modifications designed to address them. The design tested and presented here provided prospective elementary educators with authentic science teaching experiences with elementary students in a low-stakes environment, with the collaboration of peers and science teacher educators. The process of comprehensive reflection was developed for and tested in this study. Comprehensive reflection is individual and collective, written and set in dialogic discourse, focused on past and future behavior, and utilizes video recordings from shared teaching experiences. To test the central premise, an innovative science methods course was designed, implemented and evaluated using a one-group mixed-method design. The focus of the analysis was on changes in self-efficacy, identity and teaching practices as a function of authentic science teaching experiences and comprehensive reflection. The quantitative tools for analysis were t-tests and repeated-measures ANOVA on the Science Teaching Efficacy Belief Instrument-B (STEBI-B) and on weekly self-ratings of confidence as a learner and a teacher of science, respectively. The tools used to analyze qualitative data included thematic analysis and interpretative phenomenological analysis. In addition, theoretically grounded tools were developed and used in a case study to determine the ways one prospective educator's science teaching identity was influenced by experiences in the course. The innovative course structure led to the development of teacher candidates' science teaching identity, supported science teaching self-efficacy development, positioned teacher candidates as agents in their own learning and development, provided the opportunity for teacher candidates to problematize teaching experiences to improve practice, developed teacher candidates who were able to critically question and create science curricula with the primary purpose of mediating student learning, and improved teacher candidates' questioning skills and assistance of student performance. Implications for teacher education and future directions for research are discussed.

  18. RIEMS: a software pipeline for sensitive and comprehensive taxonomic classification of reads from metagenomics datasets.

    PubMed

    Scheuch, Matthias; Höper, Dirk; Beer, Martin

    2015-03-03

    Fuelled by the advent and subsequent development of next generation sequencing technologies, metagenomics became a powerful tool for the analysis of microbial communities both scientifically and diagnostically. The biggest challenge is the extraction of relevant information from the huge sequence datasets generated for metagenomics studies. Although a plethora of tools are available, data analysis is still a bottleneck. To overcome the bottleneck of data analysis, we developed an automated computational workflow called RIEMS - Reliable Information Extraction from Metagenomic Sequence datasets. RIEMS assigns every individual read sequence within a dataset taxonomically by cascading different sequence analyses with decreasing stringency of the assignments using various software applications. After completion of the analyses, the results are summarised in a clearly structured result protocol organised taxonomically. The high accuracy and performance of RIEMS analyses were proven in comparison with other tools for metagenomics data analysis using simulated sequencing read datasets. RIEMS has the potential to fill the gap that still exists with regard to data analysis for metagenomics studies. The usefulness and power of RIEMS for the analysis of genuine sequencing datasets was demonstrated with an early version of RIEMS in 2011 when it was used to detect the orthobunyavirus sequences leading to the discovery of Schmallenberg virus.
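
    The cascading control flow, assign with the most stringent analysis first and pass the remainder on, can be sketched generically as below; in RIEMS the stages are external alignment tools run at decreasing stringency, not Python functions.

      def cascade_classify(reads, stages):
          """stages: ordered (name, classify) pairs, most stringent first.
          classify(read) returns a taxon or None; unassigned reads fall through."""
          assignments, remaining = {}, list(reads)
          for name, classify in stages:
              leftover = []
              for read in remaining:
                  taxon = classify(read)
                  if taxon is None:
                      leftover.append(read)
                  else:
                      assignments[read] = (name, taxon)
              remaining = leftover
          return assignments, remaining

      # toy usage: an exact-match stage followed by a permissive fallback
      ref = {"ACGT": "virus X"}
      stages = [("strict", ref.get), ("lenient", lambda r: "unclassified bacteria")]
      print(cascade_classify(["ACGT", "TTTT"], stages))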

  19. Comprehensive Analysis of Immunological Synapse Phenotypes Using Supported Lipid Bilayers.

    PubMed

    Valvo, Salvatore; Mayya, Viveka; Seraia, Elena; Afrose, Jehan; Novak-Kotzer, Hila; Ebner, Daniel; Dustin, Michael L

    2017-01-01

    Supported lipid bilayers (SLB) formed on glass substrates have been a useful tool for the study of immune cell signaling since the early 1980s. The mobility of lipid-anchored proteins in the system, first described for antibodies binding to synthetic phospholipid head groups, allows for the measurement of two-dimensional binding reactions and signaling processes in a single imaging plane over time or for fixed samples. The fragility of SLB and the challenges of building and validating individual substrates limit most experimenters to ~10 samples per day, perhaps increasing this few-fold when examining fixed samples. We present methods for automating many steps of SLB formation, imaging in 96-well glass-bottom plates, and analysis, which together enable a >100-fold increase in throughput for fixed samples and wide-field fluorescence. This increased throughput will allow better coverage of relevant parameters and more comprehensive analysis of aspects of the immunological synapse that are well reconstituted by SLB.

  20. Method Development for Clinical Comprehensive Evaluation of Pediatric Drugs Based on Multi-Criteria Decision Analysis: Application to Inhaled Corticosteroids for Children with Asthma.

    PubMed

    Yu, Yuncui; Jia, Lulu; Meng, Yao; Hu, Lihua; Liu, Yiwei; Nie, Xiaolu; Zhang, Meng; Zhang, Xuan; Han, Sheng; Peng, Xiaoxia; Wang, Xiaoling

    2018-04-01

    Establishing a comprehensive clinical evaluation system is critical for enacting national drug policy and promoting rational drug use. In China, the 'Clinical Comprehensive Evaluation System for Pediatric Drugs' (CCES-P) project, which aims to compare drugs based on clinical efficacy and cost effectiveness to support decision makers, was recently proposed; a systematic and objective method is therefore required to guide the process. An evidence-based multi-criteria decision analysis model involving an analytic hierarchy process (AHP) was developed, consisting of nine steps: (1) select the drugs to be reviewed; (2) establish the evaluation criterion system; (3) determine the criterion weights based on the AHP; (4) construct the evidence body for each drug under evaluation; (5) select comparative measures and calculate the original utility score; (6) place scores on a common utility scale and calculate the standardized utility score; (7) calculate the comprehensive utility score; (8) rank the drugs; and (9) perform a sensitivity analysis. The model was applied to the evaluation of three different inhaled corticosteroids (ICSs) used for asthma management in children (a total of 16 drugs with different dosage forms and strengths or different manufacturers). By applying the drug analysis model, the 16 ICSs under review were successfully scored and evaluated. Budesonide suspension for inhalation (drug ID number: 7) ranked highest, with a comprehensive utility score of 80.23, followed by fluticasone propionate inhaled aerosol (drug ID number: 16), with a score of 79.59, and budesonide inhalation powder (drug ID number: 6), with a score of 78.98. In the sensitivity analysis, the ranking of the top five and lowest five drugs remained unchanged, suggesting the model is generally robust. An evidence-based drug evaluation model based on the AHP was successfully developed. The model incorporates sufficient utility and flexibility to aid the decision-making process, and can be a useful tool for the CCES-P.
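
    Step 3, deriving criterion weights with the AHP, is conventionally done by taking the principal eigenvector of the pairwise comparison matrix; a minimal sketch with an invented three-criterion judgment matrix follows.

      import numpy as np

      def ahp_weights(pairwise):
          """Criterion weights = normalized principal eigenvector of the comparison matrix."""
          vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
          w = np.real(vecs[:, np.argmax(np.real(vals))])
          return np.abs(w) / np.abs(w).sum()

      # invented 3-criterion judgment matrix (e.g. efficacy vs safety vs cost)
      P = [[1.0, 3.0, 5.0],
           [1/3, 1.0, 3.0],
           [1/5, 1/3, 1.0]]
      print(ahp_weights(P))   # roughly [0.64, 0.26, 0.10]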

  1. Progress on the Multiphysics Capabilities of the Parallel Electromagnetic ACE3P Simulation Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kononenko, Oleksiy

    2015-03-26

    ACE3P is a 3D parallel simulation suite that is being developed at SLAC National Accelerator Laboratory. Effectively utilizing supercomputer resources, ACE3P has become a key tool for the coupled electromagnetic, thermal and mechanical research and design of particle accelerators. Based on the existing finite-element infrastructure, a massively parallel eigensolver has been developed for modal analysis of mechanical structures. It complements the set of multiphysics tools in ACE3P and, in particular, can be used for the comprehensive study of microphonics in accelerating cavities, ensuring the operational reliability of a particle accelerator.

  2. The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences.

    PubMed

    Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker

    2016-01-01

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant's platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses.

  3. IDEOM: an Excel interface for analysis of LC-MS-based metabolomics data.

    PubMed

    Creek, Darren J; Jankevics, Andris; Burgess, Karl E V; Breitling, Rainer; Barrett, Michael P

    2012-04-01

    The application of emerging metabolomics technologies to the comprehensive investigation of cellular biochemistry has been limited by bottlenecks in data processing, particularly noise filtering and metabolite identification. IDEOM provides a user-friendly data processing application that automates filtering and identification of metabolite peaks, paying particular attention to common sources of noise and false identifications generated by liquid chromatography-mass spectrometry (LC-MS) platforms. Building on advanced processing tools such as mzMatch and XCMS, it allows users to run a comprehensive pipeline for data analysis and visualization from a graphical user interface within Microsoft Excel, a familiar program for most biological scientists. IDEOM is provided free of charge at http://mzmatch.sourceforge.net/ideom.html, as a macro-enabled spreadsheet (.xlsb). Implementation requires Microsoft Excel (2007 or later); R is also required for full functionality. Contact: michael.barrett@glasgow.ac.uk. Supplementary data are available at Bioinformatics online.

  4. Understanding online health information: Evaluation, tools, and strategies.

    PubMed

    Beaunoyer, Elisabeth; Arsenault, Marianne; Lomanowska, Anna M; Guitton, Matthieu J

    2017-02-01

    Considering the status of the Internet as a prominent source of health information, assessing online health material has become a central issue in patient education. We describe the strategies available for evaluating the characteristics of online health information, including readability, emotional content, understandability, and usability. Popular tools used in the assessment of readability, emotional content and comprehensibility of online health information were reviewed; tools designed to evaluate both printed and online material were considered. Readability tools are widely used in online health material evaluation and are highly covariant. Assessment of the emotional content of online health-related communications via sentiment analysis tools is becoming more popular. Understandability and usability tools have been developed specifically for health-related material, but each tool has important limitations and has been tested on a limited number of health issues. Despite the availability of numerous assessment tools, their overall reliability differs between readability (high) and understandability (low). Approaches combining multiple assessment tools and involving both quantitative and qualitative observations would optimize assessment strategies. Effective assessment of online health information should rely on mixed strategies combining quantitative and qualitative evaluations. Assessment tools should be selected according to their functional properties and compatibility with the target material. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
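
    Most of the readability tools reviewed are variants of a word/sentence/syllable count formula; the sketch below implements the classic Flesch Reading Ease score with a crude vowel-group syllable counter (adequate for a demo, not for validated scoring).

      import re

      def syllables(word):
          """Crude syllable estimate: count vowel groups (demo quality only)."""
          return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

      def flesch_reading_ease(text):
          words = re.findall(r"[A-Za-z']+", text)
          sentences = max(1, len(re.findall(r"[.!?]+", text)))
          syll = sum(syllables(w) for w in words)
          return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syll / len(words))

      print(flesch_reading_ease("Take one tablet twice a day. Call us if pain persists."))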

  5. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  6. Discover Space Weather and Sun's Superpowers: Using CCMC's innovative tools and applications

    NASA Astrophysics Data System (ADS)

    Mendoza, A. M. M.; Maddox, M. M.; Kuznetsova, M. M.; Chulaki, A.; Rastaetter, L.; Mullinix, R.; Weigand, C.; Boblitt, J.; Taktakishvili, A.; MacNeice, P. J.; Pulkkinen, A. A.; Pembroke, A. D.; Mays, M. L.; Zheng, Y.; Shim, J. S.

    2015-12-01

    The Community Coordinated Modeling Center (CCMC) has developed a comprehensive set of tools and applications that are directly applicable to space weather and space science education. These tools, some of which were developed by our student interns, are capable of serving a wide range of student audiences, from middle school to postgraduate research. They include a web-based point of access to sophisticated space physics models and visualizations, and a powerful space weather information dissemination system, available on the web and as a mobile app. In this demonstration, we will use CCMC's innovative tools to engage the audience in real-time space weather analysis and forecasting, and will share some of our interns' hands-on experiences while being trained as junior space weather forecasters. The main portals to CCMC's educational material are ccmc.gsfc.nasa.gov and iswa.gsfc.nasa.gov.

  7. Avanti lipid tools: connecting lipids, technology, and cell biology.

    PubMed

    Sims, Kacee H; Tytler, Ewan M; Tipton, John; Hill, Kasey L; Burgess, Stephen W; Shaw, Walter A

    2014-08-01

    Lipid research is challenging owing to the complexity and diversity of the lipidome. Here we review a set of experimental tools developed for the seasoned lipid researcher as well as those who are new to the field of lipid research. Novel tools for probing protein-lipid interactions, applications for lipid-binding antibodies, enhanced systems for the cellular delivery of lipids, improved visualization of lipid membranes using gold-labeled lipids, and advances in mass spectrometric analysis techniques will be discussed. Because lipid mediators are known to participate in a host of signal transduction and trafficking pathways within the cell, a comprehensive lipid toolbox that aids the science of lipidomics research is essential to better understand the molecular mechanisms of interactions between cellular components. This article is part of a Special Issue entitled Tools to study lipid functions. Copyright © 2014. Published by Elsevier B.V.

  8. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis

    PubMed Central

    2013-01-01

    Background Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. Methods We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. Results In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. Conclusions There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed. PMID:24044807

  9. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis.

    PubMed

    Armijo-Olivo, Susan; Fuentes, Jorge; Ospina, Maria; Saltaji, Humam; Hartling, Lisa

    2013-09-17

    Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed.
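
    For readers who want to reproduce the style of comparison reported above, the sketch below runs the same kind of test on counts quoted in the abstract (sequence generation: 12/19 general health tools vs 5/7 PT tools). It is an illustration only, not the authors' code, and with cells this small Fisher's exact test is arguably safer than chi-squared.

    ```python
    # Sketch only (not the authors' code): comparing how often an item
    # (sequence generation) is included in general health vs PT tools,
    # using the counts quoted in the abstract above.
    from scipy.stats import chi2_contingency, fisher_exact

    # rows: tool family; columns: [item included, item not included]
    table = [[12, 7],  # general health tools: 12 of 19 include the item
             [5, 2]]   # physical therapy tools: 5 of 7 include the item

    chi2, p, dof, _ = chi2_contingency(table)
    _, p_exact = fisher_exact(table)  # safer with cells this small
    print(f"chi2 = {chi2:.2f}, p = {p:.3f} (Fisher exact p = {p_exact:.3f})")
    ```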

  10. Sensory Evaluation as a Tool in Determining Acceptability of Innovative Products Developed by Undergraduate Students in Food Science and Technology at the University of Trinidad and Tobago

    ERIC Educational Resources Information Center

    Singh-Ackbarali, Dimple; Maharaj, Rohanie

    2014-01-01

    This paper discusses the comprehensive and practical training that was delivered to students in a university classroom on how sensory evaluation can be used to determine acceptability of food products. The report presents how students used their training on sensory evaluation methods and analysis and applied it to improving and predicting…

  11. Solving PDEs with Intrepid

    DOE PAGES

    Bochev, P.; Edwards, H. C.; Kirby, R. C.; ...

    2012-01-01

    Intrepid is a Trilinos package for advanced discretizations of Partial Differential Equations (PDEs). The package provides a comprehensive set of tools for local, cell-based construction of a wide range of numerical methods for PDEs. This paper describes the mathematical ideas and software design principles incorporated in the package. We also provide representative examples showcasing the use of Intrepid both in the context of numerical PDEs and the more general context of data analysis.
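
    Intrepid itself is a C++ Trilinos package; the snippet below is not its API but a minimal numpy illustration of the core idea it generalizes: cell-based construction of local operators that are then assembled into a global discretization, here for the 1D Poisson problem with linear elements.

    ```python
    # Not Intrepid's C++ API: a numpy toy of the idea it generalizes,
    # cell-by-cell construction of local operators assembled into a global
    # discretization, here the stiffness matrix of -u'' = f with P1 elements.
    import numpy as np

    n_cells, h = 4, 0.25                       # uniform mesh of [0, 1]
    K = np.zeros((n_cells + 1, n_cells + 1))   # global stiffness matrix
    k_local = (1.0 / h) * np.array([[1.0, -1.0],
                                    [-1.0, 1.0]])  # exact local P1 stiffness

    for c in range(n_cells):                   # loop over cells
        dofs = [c, c + 1]                      # the cell's two vertex DOFs
        K[np.ix_(dofs, dofs)] += k_local       # scatter-add into global matrix

    print(K)
    ```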

  12. Rosetta CONSERT operations and data analysis preparation: simulation software tools.

    NASA Astrophysics Data System (ADS)

    Rogez, Yves; Hérique, Alain; Cardiet, Maël; Zine, Sonia; Westphal, Mathieu; Micallef, Mickael; Berquin, Yann; Kofman, Wlodek

    2014-05-01

The CONSERT experiment onboard Rosetta and Philae will perform the tomography of the 67P/CG comet nucleus by measuring radio wave transmission from the Rosetta S/C to the Philae Lander. The accurate analysis of travel time measurements will deliver unique knowledge of the dielectric properties of the nucleus interior. The challenging complexity of CONSERT operations requirements, involving both Rosetta and Philae, allows only a limited set of opportunities to acquire data. Thus, we need a fine analysis of the impact of the Rosetta trajectory, the Philae position and the comet shape on CONSERT measurements, in order to make optimal decisions in a short time. The integration of simulation results and mission parameters provides synthetic information to evaluate performances and risks for each opportunity. The preparation of CONSERT measurements before space operations is key to achieving the best science return of the experiment. In addition, during Rosetta space operations, these software tools will allow a "real-time" first analysis of the latest measurements to improve the next acquisition sequences. The software tools themselves are built around a 3D electromagnetic radio wave simulation that takes the signal polarization into account. It is based on ray-tracing algorithms specifically designed for quick orbit analysis and radar signal generation, which allows computation on domains that are large relative to the wavelength. The extensive use of 3D visualization tools provides comprehensive and synthetic views of the results. The software suite is designed to be extended, after Rosetta operations, to full 3D measurement data analysis using inversion methods.
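
    The travel-time sensitivity that CONSERT inverts can be illustrated with a few lines of physics; this is not the mission software, and the permittivity and path length below are assumed values.

    ```python
    # Illustration only (not the CONSERT flight or analysis software): how a
    # nucleus' relative permittivity maps to the excess travel time that the
    # experiment inverts. eps_r and the path length are assumed values.
    import math

    c = 299_792_458.0     # speed of light in vacuum, m/s
    eps_r = 1.9           # assumed relative permittivity of cometary material
    path = 2300.0         # assumed ray path length through the nucleus, m

    v = c / math.sqrt(eps_r)      # propagation speed inside the medium
    delay = path / v - path / c   # excess delay relative to free space
    print(f"excess delay: {delay * 1e6:.2f} microseconds")
    ```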

  13. Interprofessional partnerships in chronic illness care: a conceptual model for measuring partnership effectiveness

    PubMed Central

    Butt, Gail; Markle-Reid, Maureen; Browne, Gina

    2008-01-01

    Introduction Interprofessional health and social service partnerships (IHSSP) are internationally acknowledged as integral for comprehensive chronic illness care. However, the evidence-base for partnership effectiveness is lacking. This paper aims to clarify partnership measurement issues, conceptualize IHSSP at the front-line staff level, and identify tools valid for group process measurement. Theory and methods A systematic literature review utilizing three interrelated searches was conducted. Thematic analysis techniques were supported by NVivo 7 software. Complexity theory was used to guide the analysis, ground the new conceptualization and validate the selected measures. Other properties of the measures were critiqued using established criteria. Results There is a need for a convergent view of what constitutes a partnership and its measurement. The salient attributes of IHSSP and their interorganizational context were described and grounded within complexity theory. Two measures were selected and validated for measurement of proximal group outcomes. Conclusion This paper depicts a novel complexity theory-based conceptual model for IHSSP of front-line staff who provide chronic illness care. The conceptualization provides the underpinnings for a comprehensive evaluative framework for partnerships. Two partnership process measurement tools, the PSAT and TCI are valid for IHSSP process measurement with consideration of their strengths and limitations. PMID:18493591

  14. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    PubMed

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as by the limited skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified the resources, tools, and skillsets required, as well as associated criticality scores, to meet terrestrial, U.S.-specific ideal medical solutions for conditions of concern for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning the likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future versions, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions. Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.
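
    The abstract does not publish the tool's internals, but the weighting it describes, resources scored against the probabilities of mission medical events, can be sketched as follows with hypothetical names and numbers:

    ```python
    # Hypothetical names and numbers throughout: scoring each resource by
    # summing (medical event probability x criticality) over the conditions
    # it supports, as the abstract's weighting scheme suggests.
    event_probability = {"kidney_stone": 0.04, "burn": 0.10, "dental": 0.07}

    criticality = {  # per-condition criticality of each resource (0-5)
        "ultrasound": {"kidney_stone": 5, "burn": 1},
        "analgesics": {"kidney_stone": 4, "burn": 5, "dental": 4},
    }

    def weighted_utility(resource):
        """Probability-weighted criticality summed over supported conditions."""
        return sum(event_probability[cond] * score
                   for cond, score in criticality[resource].items())

    for name in criticality:
        print(name, round(weighted_utility(name), 3))
    ```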

  15. At-TAX: a whole genome tiling array resource for developmental expression analysis and transcript identification in Arabidopsis thaliana

    PubMed Central

    Laubinger, Sascha; Zeller, Georg; Henz, Stefan R; Sachsenberg, Timo; Widmer, Christian K; Naouar, Naïra; Vuylsteke, Marnik; Schölkopf, Bernhard; Rätsch, Gunnar; Weigel, Detlef

    2008-01-01

    Gene expression maps for model organisms, including Arabidopsis thaliana, have typically been created using gene-centric expression arrays. Here, we describe a comprehensive expression atlas, Arabidopsis thaliana Tiling Array Express (At-TAX), which is based on whole-genome tiling arrays. We demonstrate that tiling arrays are accurate tools for gene expression analysis and identified more than 1,000 unannotated transcribed regions. Visualizations of gene expression estimates, transcribed regions, and tiling probe measurements are accessible online at the At-TAX homepage. PMID:18613972

  16. Correlates of lower comprehension of informed consent among participants enrolled in a cohort study in Pune, India.

    PubMed

    Joglekar, Neelam S; Deshpande, Swapna S; Sahay, Seema; Ghate, Manisha V; Bollinger, Robert C; Mehendale, Sanjay M

    2013-03-01

Optimum comprehension of informed consent by research participants is essential yet challenging. This study explored correlates of lower comprehension of informed consent among 1334 participants of a cohort study aimed at estimating HIV incidence in Pune, India. As part of the informed consent process, a structured comprehension tool was administered to study participants. Participants scoring ≥90% were categorised into the 'optimal comprehension group', whilst those scoring 80-89% were categorised into the 'lower comprehension group'. Data were analysed to identify sociodemographic and behavioural correlates of lower consent comprehension. The mean ± SD comprehension score was 94.4 ± 5.00%. Information pertaining to study-related risks was not comprehended by 61.7% of participants. HIV-negative men (adjusted OR [AOR] = 4.36, 95% CI 1.71-11.05), HIV-negative women (AOR = 13.54, 95% CI 6.42-28.55), illiterate participants (AOR = 1.65, 95% CI 1.19-2.30), those with a history of multiple partners (AOR = 1.73, 95% CI 1.12-2.66) and those never using condoms (AOR = 1.35, 95% CI 1.01-1.82) were more likely to have lower consent comprehension. We recommend exploration of domains of lower consent comprehension using a validated consent comprehension tool. Improved education in these specific domains would optimise consent comprehension among research participants.
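
    Adjusted odds ratios like those reported above come from a multivariable logistic regression; the following sketch shows the mechanics on synthetic data (standard statsmodels usage, not the study's actual analysis).

    ```python
    # Synthetic stand-in data, not the Pune cohort: a multivariable logistic
    # regression whose exponentiated coefficients are the adjusted ORs.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    illiterate = rng.integers(0, 2, n)    # binary covariates
    never_condom = rng.integers(0, 2, n)
    lin = -1.5 + 0.5 * illiterate + 0.3 * never_condom
    y = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(float)  # 1 = lower comprehension

    X = sm.add_constant(np.column_stack([illiterate, never_condom]).astype(float))
    fit = sm.Logit(y, X).fit(disp=False)
    print("adjusted ORs:", np.round(np.exp(fit.params[1:]), 2))  # skip intercept
    ```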

  17. Performance evaluation of non-targeted peak-based cross-sample analysis for comprehensive two-dimensional gas chromatography-mass spectrometry data and application to processed hazelnut profiling.

    PubMed

    Kiefl, Johannes; Cordero, Chiara; Nicolotti, Luca; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo

    2012-06-22

The continuous interest in non-targeted profiling has induced the development of tools for automated cross-sample analysis. Such tools were found to be either selective or not comprehensive, thus delivering a biased view of the qualitative/quantitative peak distribution across 2D sample chromatograms. Therefore, the performance of non-targeted approaches needs to be critically evaluated. This study focused on the development of a validation procedure for non-targeted, peak-based, GC×GC-MS data profiling. The procedure introduced performance parameters such as specificity, precision, accuracy, and uncertainty for a profiling method known as Comprehensive Template Matching. The performance was assessed by applying a three-week validation protocol based on CITAC/EURACHEM guidelines. Optimized ¹D and ²D retention time search windows, MS match factor threshold, detection threshold, and template threshold were derived from two training sets by a semi-automated learning process. The effectiveness of the proposed settings in consistently matching 2D peak patterns was established by evaluating the rate of mismatched peaks and was expressed in terms of results accuracy. The study utilized 23 different 2D peak patterns providing the chemical fingerprints of raw and roasted hazelnuts (Corylus avellana L.) from different geographical origins, of diverse varieties and different roasting degrees. The validation results show that non-targeted peak-based profiling can be reliable with error rates lower than 10% independent of the degree of analytical variance. The optimized Comprehensive Template Matching procedure was employed to study hazelnut roasting profiles, in particular to find marker compounds strongly dependent on the thermal treatment, to establish the correlation of potential marker compounds to geographical origin and variety/cultivar, and finally to reveal the characteristic release of aroma-active compounds. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Recent development in mass spectrometry and its hyphenated techniques for the analysis of medicinal plants.

    PubMed

    Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan

    2018-04-23

Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and these are highly dependent on the comprehensive analysis of chemical components in the medicinal plants. With the advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of medicinal plant analysis. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and their applications for the comprehensive analysis of medicinal plants. Applications of various MS and hyphenated MS techniques for the analysis of medicinal plants, including but not limited to one-dimensional and multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, have been reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a major role as the principal platform for further research into both their empirical therapeutic efficacy and their quality control. Copyright © 2018 John Wiley & Sons, Ltd.

  19. 3. How comprehensive can we be in the economic assessment of vaccines?

    PubMed Central

    2017-01-01

In two previous papers we argued that the current economic assessment of vaccines is not fully comprehensive when using the incremental cost-utility analysis normally applied to treatments. Many differences exist between vaccines and drug treatments, making the economic evaluation of vaccines more cumbersome. Four challenges overwhelmingly present in vaccine assessment are less important for treatments: requirements for population and societal perspectives, budget impact evaluation, and time-focused objectives (control or elimination). Because of this, the economic analysis of vaccines may need to be presented to many different stakeholders with various evaluation preferences, in addition to the current stakeholders involved in drug treatment assessment. We may therefore need a tool that makes the inventory of the different vaccine health economic assessment programmes more comprehensive. The cauliflower value toolbox has been developed with that aim, and its use is illustrated here with rotavirus vaccine. Given the broader perspectives for vaccine assessment, it provides better value and cost evaluations. Cost-benefit analysis may be the preferred economic assessment method when considering substitution from treatment to active medical prevention. Other economic evaluation methods can be selected (i.e. optimisation modelling, return on investment) when project prioritisation is the main focus and when stakeholders would like to influence the development of the healthcare programme. PMID:29785253
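
    As a toy illustration of the cost-benefit framing the authors discuss for prevention, the snippet below discounts a hypothetical 10-year stream of programme costs and monetised benefits; all numbers are invented.

    ```python
    # Toy numbers: net present value and benefit-cost ratio of a
    # hypothetical vaccination programme over a 10-year horizon.
    costs = [5.0] + [1.0] * 9      # M$: up-front rollout, then maintenance
    benefits = [0.5] + [2.0] * 9   # M$: monetised averted illness
    r = 0.03                       # annual discount rate

    def pv(stream):
        """Present value of a yearly cash-flow stream starting at year 0."""
        return sum(x / (1 + r) ** t for t, x in enumerate(stream))

    npv = pv(benefits) - pv(costs)
    bcr = pv(benefits) / pv(costs)
    print(f"NPV = {npv:.2f} M$, benefit-cost ratio = {bcr:.2f}")
    ```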

  20. Obesity Policy Action framework and analysis grids for a comprehensive policy approach to reducing obesity.

    PubMed

    Sacks, G; Swinburn, B; Lawrence, M

    2009-01-01

    A comprehensive policy approach is needed to control the growing obesity epidemic. This paper proposes the Obesity Policy Action (OPA) framework, modified from the World Health Organization framework for the implementation of the Global Strategy on Diet, Physical Activity and Health, to provide specific guidance for governments to systematically identify areas for obesity policy action. The proposed framework incorporates three different public health approaches to addressing obesity: (i) 'upstream' policies influence either the broad social and economic conditions of society (e.g. taxation, education, social security) or the food and physical activity environments to make healthy eating and physical activity choices easier; (ii) 'midstream' policies are aimed at directly influencing population behaviours; and (iii) 'downstream' policies support health services and clinical interventions. A set of grids for analysing potential policies to support obesity prevention and management is presented. The general pattern that emerges from populating the analysis grids as they relate to the Australian context is that all sectors and levels of government, non-governmental organizations and private businesses have multiple opportunities to contribute to reducing obesity. The proposed framework and analysis grids provide a comprehensive approach to mapping the policy environment related to obesity, and a tool for identifying policy gaps, barriers and opportunities.

  1. Two chemically distinct light-absorbing pools of urban organic aerosols: A comprehensive multidimensional analysis of trends.

    PubMed

    Paula, Andreia S; Matos, João T V; Duarte, Regina M B O; Duarte, Armando C

    2016-02-01

    The chemical and light-absorption dynamics of organic aerosols (OAs), a master variable in the atmosphere, have yet to be resolved. This study uses a comprehensive multidimensional analysis approach for exploiting simultaneously the compositional changes over a molecular size continuum and associated light-absorption (ultraviolet absorbance and fluorescence) properties of two chemically distinct pools of urban OAs chromophores. Up to 45% of aerosol organic carbon (OC) is soluble in water and consists of a complex mixture of fluorescent and UV-absorbing constituents, with diverse relative abundances, hydrophobic, and molecular weight (Mw) characteristics between warm and cold periods. In contrast, the refractory alkaline-soluble OC pool (up to 18%) is represented along a similar Mw and light-absorption continuum throughout the different seasons. Results suggest that these alkaline-soluble chromophores may actually originate from primary OAs sources in the urban site. This work shows that the comprehensive multidimensional analysis method is a powerful and complementary tool for the characterization of OAs fractions. The great diversity in the chemical composition and optical properties of OAs chromophores, including both water-soluble and alkaline-soluble OC, may be an important contribution to explain the contrasting photo-reactivity and atmospheric behavior of OAs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Two-dimensional liquid chromatography consisting of twelve second-dimension columns for comprehensive analysis of intact proteins.

    PubMed

    Ren, Jiangtao; Beckner, Matthew A; Lynch, Kyle B; Chen, Huang; Zhu, Zaifang; Yang, Yu; Chen, Apeng; Qiao, Zhenzhen; Liu, Shaorong; Lu, Joann J

    2018-05-15

A comprehensive two-dimensional liquid chromatography (LCxLC) system consisting of twelve columns in the second dimension was developed for comprehensive analysis of intact proteins in complex biological samples. The system consisted of an ion-exchange column in the first dimension and the twelve reverse-phase columns in the second dimension; all thirteen columns were monolithic and prepared inside 250 µm i.d. capillaries. These columns were assembled together through the use of three valves and an innovative configuration. The effluent from the first dimension was continuously fractionated and sequentially transferred into the twelve second-dimension columns, while the second-dimension separations were carried out in a series of batches (six columns per batch). This LCxLC system was tested first using standard proteins followed by real-world samples from E. coli. Baseline separation was observed for eleven standard proteins and hundreds of peaks were observed for the real-world sample analysis. Two-dimensional liquid chromatography, often considered an effective tool for mapping proteins, is seen as laborious and time-consuming when configured offline. Our online LCxLC system with increased second-dimension columns promises to provide a solution to overcome these hindrances. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Improving text comprehension: scaffolding adolescents into strategic reading.

    PubMed

    Ukrainetz, Teresa A

    2015-02-01

Understanding and learning from academic texts involves purposeful, strategic reading. Adolescent readers, particularly poor readers, benefit from explicit instruction in text comprehension strategies, such as text preview, summarization, and comprehension monitoring, as part of a comprehensive reading program. However, strategies are difficult to teach within subject area lessons where content instruction must take primacy. Speech-language pathologists (SLPs) have the expertise and service delivery options to support middle and high school students in learning to use comprehension strategies in their academic reading and learning. This article presents the research evidence on what strategies to teach and how best to teach them, including the use of explicit instruction, spoken interactions around text, cognitive modeling, peer learning, classroom connections, and disciplinary literacy. The article focuses on how to move comprehension strategies from being teaching tools of the SLP to becoming learning tools of the student. SLPs can provide the instruction and support needed for students to learn and apply this important component of academic reading.

  4. The children's menu assessment: development, evaluation, and relevance of a tool for evaluating children's menus.

    PubMed

    Krukowski, Rebecca A; Eddings, Kenya; West, Delia Smith

    2011-06-01

Restaurant foods represent a substantial portion of children's dietary intake, and consumption of foods away from home has been shown to contribute to excess adiposity. This descriptive study aimed to pilot-test and establish the reliability of a standardized and comprehensive assessment tool, the Children's Menu Assessment, for evaluating the restaurant food environment for children. The tool is an expansion of the Nutrition Environment Measures Survey-Restaurant. In 2009-2010, a randomly selected sample of 130 local and chain restaurants was drawn from within 20 miles of Little Rock, AR, to examine the availability of children's menus and to conduct initial calibration of the Children's Menu Assessment tool (final sample: n=46). Independent raters completed the Children's Menu Assessment in order to determine inter-rater reliability. Test-retest reliability was also examined. Inter-rater reliability was high: percent agreement was 97% and the Spearman correlation was 0.90. Test-retest reliability was also high: percent agreement was 91% and the Spearman correlation was 0.96. Mean Children's Menu Assessment completion time was 14 minutes, 56 seconds ± 10 minutes, 21 seconds. Analysis of Children's Menu Assessment findings revealed that few healthier options were available on children's menus, and most menus did not provide parents with information for making healthy choices, including nutrition information or identification of healthier options. The Children's Menu Assessment tool allows for comprehensive, rapid measurement of the restaurant food environment for children with high inter-rater reliability. This tool has the potential to contribute to public health efforts to develop and evaluate targeted environmental interventions and/or policy changes regarding restaurant foods. Copyright © 2011 American Dietetic Association. Published by Elsevier Inc. All rights reserved.
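
    The two reliability statistics reported above are straightforward to compute; the sketch below does so on hypothetical paired rater scores, not the study's data.

    ```python
    # Hypothetical paired scores from two raters; the two statistics are the
    # ones the abstract reports (percent agreement and Spearman correlation).
    import numpy as np
    from scipy.stats import spearmanr

    rater_a = np.array([12, 8, 15, 10, 9, 14, 7, 11])
    rater_b = np.array([12, 8, 14, 10, 9, 14, 7, 12])

    percent_agreement = np.mean(rater_a == rater_b) * 100
    rho, _ = spearmanr(rater_a, rater_b)
    print(f"agreement: {percent_agreement:.0f}%, Spearman rho: {rho:.2f}")
    ```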

  5. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    PubMed

    Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E

    2015-01-01

    Infants recovering from anesthesia are at risk of life threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorer ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for longitudinal and multicenter studies.
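
    The abstract emphasizes intra- and inter-scorer repeatability; a common statistic for that purpose (not necessarily the one used here) is Cohen's kappa on paired pattern labels, sketched below with hypothetical labels.

    ```python
    # Hypothetical labels standing in for the six respiratory patterns;
    # Cohen's kappa corrects raw agreement for chance.
    from sklearn.metrics import cohen_kappa_score

    scorer_1 = ["pause", "sync", "async", "sync", "movement", "sync"]
    scorer_2 = ["pause", "sync", "async", "async", "movement", "sync"]

    print(f"Cohen's kappa: {cohen_kappa_score(scorer_1, scorer_2):.2f}")
    ```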

  6. NDARC - NASA Design and Analysis of Rotorcraft Validation and Demonstration

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2010-01-01

    Validation and demonstration results from the development of the conceptual design tool NDARC (NASA Design and Analysis of Rotorcraft) are presented. The principal tasks of NDARC are to design a rotorcraft to satisfy specified design conditions and missions, and then analyze the performance of the aircraft for a set of off-design missions and point operating conditions. The aircraft chosen as NDARC development test cases are the UH-60A single main-rotor and tail-rotor helicopter, the CH-47D tandem helicopter, the XH-59A coaxial lift-offset helicopter, and the XV-15 tiltrotor. These aircraft were selected because flight performance data, a weight statement, detailed geometry information, and a correlated comprehensive analysis model are available for each. Validation consists of developing the NDARC models for these aircraft by using geometry and weight information, airframe wind tunnel test data, engine decks, rotor performance tests, and comprehensive analysis results; and then comparing the NDARC results for aircraft and component performance with flight test data. Based on the calibrated models, the capability of the code to size rotorcraft is explored.

  7. Insect transformation with piggyBac: getting the number of injections just right

    PubMed Central

    Morrison, N. I.; Shimeld, S. M.

    2016-01-01

The insertion of exogenous genetic cargo into insects using transposable elements is a powerful research tool with potential applications in meeting food security and public health challenges facing humanity. piggyBac is the transposable element most commonly utilized for insect germline transformation. The described efficiency of this process varies in the published literature, and a comprehensive review of transformation efficiency in insects is lacking. This study compared and contrasted all available published data with a comprehensive data set provided by a biotechnology group specializing in insect transformation. Based on analysis of these data, with particular focus on the more complete observational data from the biotechnology group, we designed a decision tool to aid researchers' decision-making when using piggyBac to transform insects by microinjection. A combination of statistical techniques was used to define appropriate summary statistics of piggyBac transformation efficiency by species and insect order. Publication bias was assessed by comparing the data sets, using strategies co-opted from the medical literature. The work culminated in building the Goldilocks decision tool, a Markov-Chain Monte-Carlo simulation operated via a graphical interface and providing guidance on best practice for those seeking to transform insects using piggyBac. PMID:27027400
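
    The Goldilocks tool itself is interactive, but the core question it addresses can be sketched analytically and checked by Monte Carlo; the per-injection success rate below is an assumed placeholder, and this does not reproduce the tool's MCMC internals.

    ```python
    # Assumed per-injection transformation rate; the analytic answer is
    # checked with a quick Monte-Carlo, in the spirit of (but not
    # reproducing) the Goldilocks decision tool.
    import numpy as np

    p_success = 0.03   # assumed chance one injection yields a transformant line
    target = 0.95      # desired probability of at least one success

    # smallest n with 1 - (1 - p)^n >= target
    n = int(np.ceil(np.log(1 - target) / np.log(1 - p_success)))

    rng = np.random.default_rng(42)
    sims = (rng.random((20_000, n)) < p_success).any(axis=1).mean()
    print(f"n = {n} injections -> simulated P(>=1 transformant) = {sims:.3f}")
    ```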

  8. Towards better process understanding: chemometrics and multivariate measurements in manufacturing of solid dosage forms.

    PubMed

    Matero, Sanni; van Den Berg, Frans; Poutiainen, Sami; Rantanen, Jukka; Pajander, Jari

    2013-05-01

The manufacturing of tablets involves many unit operations that possess multivariate and complex characteristics. The interactions between material characteristics and process-related variation are presently not comprehensively analyzed due to univariate detection methods. As a consequence, current best practice to control a typical process is to not allow process-related factors to vary, i.e. to lock the production parameters. The underlying problem, a lack of sufficient process understanding, remains: the variation within process and material properties is an intrinsic feature and cannot be compensated for with constant process parameters. Instead, a more comprehensive approach based on the use of multivariate tools for investigating processes should be applied. In the pharmaceutical field these methods are referred to as Process Analytical Technology (PAT) tools, which aim to achieve thorough understanding and control of the production process. PAT provides the framework for measurement as well as data analysis and control for in-depth understanding, leading to more consistent and safer drug products with fewer batch rejections. In the optimal situation, by applying these techniques, destructive end-product testing could be avoided. In this paper the most prominent multivariate measurement and data analysis tools within tablet manufacturing and its underlying unit operations are reviewed. Copyright © 2013 Wiley Periodicals, Inc.
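
    One widely used multivariate monitoring statistic in the PAT toolbox reviewed above is Hotelling's T-squared on PCA scores of in-process data; the sketch below flags an aberrant synthetic batch and is illustrative rather than taken from the paper.

    ```python
    # Synthetic process data: Hotelling's T-squared on PCA scores, a common
    # multivariate statistical process monitoring technique.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    X = rng.normal(size=(50, 8))   # 50 batches x 8 in-process variables
    X[-1] += 4.0                   # one deliberately aberrant batch

    pca = PCA(n_components=3).fit(X)
    scores = pca.transform(X)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)  # T^2 per batch

    limit = np.percentile(t2[:-1], 99)  # crude empirical control limit
    print("flagged batches:", np.where(t2 > limit)[0])
    ```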

  9. Nut-cracking behaviour in wild-born, rehabilitated bonobos (Pan paniscus): a comprehensive study of hand-preference, hand grips and efficiency.

    PubMed

    Neufuss, Johanna; Humle, Tatyana; Cremaschi, Andrea; Kivell, Tracy L

    2017-02-01

    There has been an enduring interest in primate tool-use and manipulative abilities, most often with the goal of providing insight into the evolution of human manual dexterity, right-hand preference, and what behaviours make humans unique. Chimpanzees (Pan troglodytes) are arguably the most well-studied tool-users amongst non-human primates, and are particularly well-known for their complex nut-cracking behaviour, which has been documented in several West African populations. However, their sister-taxon, the bonobos (Pan paniscus), rarely engage in even simple tool-use and are not known to nut-crack in the wild. Only a few studies have reported tool-use in captive bonobos, including their ability to crack nuts, but details of this complex tool-use behaviour have not been documented before. Here, we fill this gap with the first comprehensive analysis of bonobo nut-cracking in a natural environment at the Lola ya Bonobo sanctuary, Democratic Republic of the Congo. Eighteen bonobos were studied as they cracked oil palm nuts using stone hammers. Individual bonobos showed exclusive laterality for using the hammerstone and there was a significant group-level right-hand bias. The study revealed 15 hand grips for holding differently sized and weighted hammerstones, 10 of which had not been previously described in the literature. Our findings also demonstrated that bonobos select the most effective hammerstones when nut-cracking. Bonobos are efficient nut-crackers and not that different from the renowned nut-cracking chimpanzees of Bossou, Guinea, which also crack oil palm nuts using stones. © 2016 Wiley Periodicals, Inc.
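
    Laterality findings like the group-level right-hand bias above typically rest on per-individual binomial tests and a handedness index; the sketch below shows both on hypothetical counts, without claiming these are the authors' exact statistics.

    ```python
    # Hypothetical counts for one individual: an exact binomial test against
    # chance use of either hand, plus the handedness index HI = (R-L)/(R+L).
    from scipy.stats import binomtest

    right, left = 48, 2                      # hammerstone uses per hand
    hi = (right - left) / (right + left)
    result = binomtest(right, right + left, p=0.5)
    print(f"HI = {hi:.2f}, exact binomial p = {result.pvalue:.2e}")
    ```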

  10. THE SMALL BODY GEOPHYSICAL ANALYSIS TOOL

    NASA Astrophysics Data System (ADS)

    Bercovici, Benjamin; McMahon, Jay

    2017-10-01

The Small Body Geophysical Analysis Tool (SBGAT) that we are developing aims at providing scientists and mission designers with a comprehensive, easy-to-use, open-source analysis tool. SBGAT is meant for seamless generation of valuable simulated data originating from small-body shape models, combined with advanced shape-modification capabilities. The current status of SBGAT is as follows. The modular software architecture specified in the original SBGAT proposal was implemented in the form of two distinct packages: a dynamic library, SBGAT Core, containing the data structures and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK/Qt user interface to facilitate user/data interaction. This modular design facilitates maintenance and the addition of new features; note that SBGAT Core can be utilized independently of SBGAT Gui. SBGAT is presently hosted in a public GitHub repository owned by SBGAT's main developer, accessible at https://github.com/bbercovici/SBGAT. Along with the commented code, the code documentation can be found at https://bbercovici.github.io/sbgat-doc/index.html; it is constantly updated to reflect new functionality. SBGAT's user manual is available at https://github.com/bbercovici/SBGAT/wiki and contains a comprehensive tutorial indicating how to retrieve, compile and run SBGAT from scratch. Some of the upcoming development goals are as follows. First, SBGAT's dynamics module will be extended: the PGM algorithm is the only analysis method currently implemented, so future work will consist of broadening SBGAT's capabilities with the spherical harmonics expansion of the gravity field and the calculation of YORP coefficients. Second, synthetic measurements will soon be available within SBGAT: the software should be able to generate synthetic observations of different types (radar, lightcurve, point clouds, ...) from the shape model currently manipulated. Finally, shape interaction capabilities will be added to SBGAT Gui using built-in VTK interaction methods.

  11. The Assessment of Reading Comprehension Difficulties for Reading Intervention

    ERIC Educational Resources Information Center

    Woolley, Gary

    2008-01-01

    There are many environmental and personal factors that contribute to reading success. Reading comprehension is a complex interaction of language, sensory perception, memory, and motivational aspects. However, most existing assessment tools have not adequately reflected the complex nature of reading comprehension. Good assessment requires a…

  12. Protocol for a prospective, school-based standardisation study of a digital social skills assessment tool for children: The Paediatric Evaluation of Emotions, Relationships, and Socialisation (PEERS) study.

    PubMed

    Thompson, Emma J; Beauchamp, Miriam H; Darling, Simone J; Hearps, Stephen J C; Brown, Amy; Charalambous, George; Crossley, Louise; Darby, David; Dooley, Julian J; Greenham, Mardee; Jaimangal, Mohinder; McDonald, Skye; Muscara, Frank; Turkstra, Lyn; Anderson, Vicki A

    2018-02-08

Humans are by nature a social species, with much of human experience spent in social interaction. Unsurprisingly, social functioning is crucial to well-being and quality of life across the lifespan. While early intervention for social problems appears promising, our ability to identify the specific impairments underlying children's social problems (eg, social communication) is restricted by a dearth of accurate, ecologically valid and comprehensive child-direct assessment tools. Current tools are largely limited to parent and teacher rating scales, which may identify social dysfunction but not its underlying cause, or adult-based experimental tools, which lack age-appropriate norms. The present study describes the development and standardisation of Paediatric Evaluation of Emotions, Relationships, and Socialisation (PEERS®), an iPad-based social skills assessment tool. The PEERS project is a cross-sectional study involving two groups: (1) a normative group, recruited from early childhood, primary and secondary schools across metropolitan and regional Victoria, Australia; and (2) a clinical group, ascertained from outpatient services at The Royal Children's Hospital Melbourne (RCH). The project aims to establish normative data for PEERS®, a novel and comprehensive app-delivered child-direct measure of social skills for children and youth. The project involves recruiting and assessing 1000 children aged 4.0-17.11 years. Assessments consist of an intellectual screen, PEERS® subtests, and PEERS-Q, a self-report questionnaire of social skills. Parents and teachers also complete questionnaires relating to participants' social skills. Main analyses will comprise regression-based continuous norming, factor analysis and psychometric analysis of PEERS® and PEERS-Q. Ethics approval has been obtained through the RCH Human Research Ethics Committee (34046), the Victorian Government Department of Education and Early Childhood Development (002318), and Catholic Education Melbourne (2166). Findings will be disseminated through international conferences and peer-reviewed journals. Following standardisation of PEERS®, the tool will be made commercially available. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Pathway enrichment analysis approach based on topological structure and updated annotation of pathway.

    PubMed

    Yang, Qian; Wang, Shuyuan; Dai, Enyu; Zhou, Shunheng; Liu, Dianming; Liu, Haizhou; Meng, Qianqian; Jiang, Bin; Jiang, Wei

    2017-08-16

Pathway enrichment analysis has been widely used to identify cancer risk pathways, and contributes to elucidating the mechanisms of tumorigenesis. However, most of the existing approaches use outdated pathway information and neglect the complex gene interactions within a pathway. Here, we first briefly review the existing, widely used pathway enrichment analysis approaches, and then we propose a novel topology-based pathway enrichment analysis (TPEA) method, which integrates topological properties and global upstream/downstream positions of genes in pathways. We compared TPEA with four widely used pathway enrichment analysis tools, including the database for annotation, visualization and integrated discovery (DAVID), gene set enrichment analysis (GSEA), centrality-based pathway enrichment (CePa) and signaling pathway impact analysis (SPIA), by analyzing six gene expression profiles of three tumor types (colorectal cancer, thyroid cancer and endometrial cancer). As a result, we identified several well-known cancer risk pathways that could not be obtained by the existing tools, and the results of TPEA were more stable than those of the other tools when analyzing different data sets of the same cancer. Ultimately, we developed an R package to implement TPEA, which can update KEGG pathway information online and is available at the Comprehensive R Archive Network (CRAN): https://cran.r-project.org/web/packages/TPEA/. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
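
    TPEA itself is distributed as an R package (CRAN link above); for orientation, the snippet below shows the classical over-representation test that topology-aware methods refine, a hypergeometric tail probability, with made-up gene counts.

    ```python
    # Made-up counts: the classical over-representation test that
    # topology-based methods such as TPEA refine.
    from scipy.stats import hypergeom

    M = 20_000   # background genes
    n = 120      # genes annotated to the pathway
    N = 500      # differentially expressed genes
    k = 12       # pathway genes among the differentially expressed

    p = hypergeom.sf(k - 1, M, n, N)   # P(X >= k)
    print(f"enrichment p-value: {p:.3g}")
    ```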

  14. The Value of Reliable Data: Interactive Data Tools from the National Comprehensive Center for Teacher Quality. Policy-to-Practice Brief. Number 1

    ERIC Educational Resources Information Center

    National Comprehensive Center for Teacher Quality, 2008

    2008-01-01

    The National Comprehensive Center for Teacher Quality (TQ Center) designed the Interactive Data Tools to provide users with access to state and national data that can be helpful in assessing the qualifications of teachers in the states and the extent to which a state's teacher policy climate generally supports teacher quality. The Interactive Data…

  15. Chapter 11. Community analysis-based methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Y.; Wu, C.H.; Andersen, G.L.

    2010-05-01

Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  16. A tool for assessment of heart failure prescribing quality: A systematic review and meta-analysis.

    PubMed

    El Hadidi, Seif; Darweesh, Ebtissam; Byrne, Stephen; Bermingham, Margaret

    2018-04-16

    Heart failure (HF) guidelines aim to standardise patient care. Internationally, prescribing practice in HF may deviate from guidelines and so a standardised tool is required to assess prescribing quality. A systematic review and meta-analysis were performed to identify a quantitative tool for measuring adherence to HF guidelines and its clinical implications. Eleven electronic databases were searched to include studies reporting a comprehensive tool for measuring adherence to prescribing guidelines in HF patients aged ≥18 years. Qualitative studies or studies measuring prescription rates alone were excluded. Study quality was assessed using the Good ReseArch for Comparative Effectiveness Checklist. In total, 2455 studies were identified. Sixteen eligible full-text articles were included (n = 14 354 patients, mean age 69 ± 8 y). The Guideline Adherence Index (GAI), and its modified versions, was the most frequently cited tool (n = 13). Other tools identified were the Individualised Reconciled Evidence Recommendations, the Composite Heart Failure Performance, and the Heart Failure Scale. The meta-analysis included the GAI studies of good to high quality. The average GAI-3 was 62%. Compared to low GAI, high GAI patients had lower mortality rate (7.6% vs 33.9%) and lower rehospitalisation rates (23.5% vs 24.5%); both P ≤ .05. High GAI was associated with reduced risk of mortality (hazard ratio = 0.29, 95% confidence interval, 0.06-0.51) and rehospitalisation (hazard ratio = 0.64, 95% confidence interval, 0.41-1.00). No tool was used to improve prescribing quality. The GAI is the most frequently used tool to assess guideline adherence in HF. High GAI is associated with improved HF outcomes. Copyright © 2018 John Wiley & Sons, Ltd.
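
    The abstract does not spell out the GAI formula, but it is conventionally computed as guideline-recommended drug classes prescribed over those indicated, with the GAI-3 commonly built from ACE-I/ARB, beta-blocker, and MRA; the sketch below makes that assumption explicit and uses hypothetical patient data.

    ```python
    # Assumption, stated plainly: GAI = recommended drug classes prescribed
    # divided by classes indicated; the GAI-3 is commonly built from
    # ACE-I/ARB, beta-blocker, and MRA. Patient data are hypothetical.
    def gai(indicated, prescribed):
        """Percentage of indicated guideline drug classes actually prescribed."""
        if not indicated:
            return None  # undefined when nothing is indicated
        return 100 * len(prescribed & indicated) / len(indicated)

    indicated = {"ACEI_ARB", "beta_blocker", "MRA"}
    prescribed = {"ACEI_ARB", "beta_blocker"}
    print(f"GAI-3 = {gai(indicated, prescribed):.0f}%")  # -> 67%
    ```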

  17. Screening for adolescents' internalizing symptoms in primary care: item response theory analysis of the behavior health screen depression, anxiety, and suicidal risk scales.

    PubMed

    Bevans, Katherine B; Diamond, Guy; Levy, Suzanne

    2012-05-01

    To apply a modern psychometric approach to validate the Behavioral Health Screen (BHS) Depression, Anxiety, and Suicidal Risk Scales among adolescents in primary care. Psychometric analyses were conducted using data collected from 426 adolescents aged 12 to 21 years (mean = 15.8, SD = 2.2). Rasch-Masters partial credit models were fit to the data to determine whether items supported the comprehensive measurement of internalizing symptoms with minimal gaps and redundancies. Scales were reduced to ensure that they measured singular dimensions of generalized anxiety, depressed affect, and suicidal risk both comprehensively and efficiently. Although gender bias was observed for some depression and anxiety items, differential item functioning did not impact overall subscale scores. Future revisions to the BHS should include additional items that assess low-level internalizing symptoms. The BHS is an accurate and efficient tool for identifying adolescents with internalizing symptoms in primary care settings. Access to psychometrically sound and cost-effective behavioral health screening tools is essential for meeting the increasing demands for adolescent behavioral health screening in primary/ambulatory care.
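
    The partial credit models fitted above generalize the dichotomous Rasch model, whose item response function is a one-line computation; the theta and b values below are arbitrary illustrations.

    ```python
    # The dichotomous Rasch model underlying the partial credit extension:
    # probability of endorsing an item of difficulty b at latent level theta.
    import math

    def rasch_p(theta, b):
        """Item response probability under the 1-parameter logistic model."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    # arbitrary values: a low-severity respondent, a high-difficulty item
    print(f"P(endorse) = {rasch_p(theta=-0.5, b=1.5):.2f}")
    ```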

  18. The FaceBase Consortium: A comprehensive program to facilitate craniofacial research

    PubMed Central

    Hochheiser, Harry; Aronow, Bruce J.; Artinger, Kristin; Beaty, Terri H.; Brinkley, James F.; Chai, Yang; Clouthier, David; Cunningham, Michael L.; Dixon, Michael; Donahue, Leah Rae; Fraser, Scott E.; Hallgrimsson, Benedikt; Iwata, Junichi; Klein, Ophir; Marazita, Mary L.; Murray, Jeffrey C.; Murray, Stephen; de Villena, Fernando Pardo-Manuel; Postlethwait, John; Potter, Steven; Shapiro, Linda; Spritz, Richard; Visel, Axel; Weinberg, Seth M.; Trainor, Paul A.

    2012-01-01

The FaceBase Consortium consists of ten interlinked research and technology projects whose goal is to generate craniofacial research data and technology for use by the research community through a central data management and integrated bioinformatics hub. Funded by the National Institute of Dental and Craniofacial Research (NIDCR) and currently focused on studying the development of the middle region of the face, the Consortium will produce comprehensive datasets of global gene expression patterns, regulatory elements and sequencing; will generate anatomical and molecular atlases; will provide human normative facial data and other phenotypes; will conduct follow-up studies of a completed genome-wide association study; will generate independent data on the genetics of craniofacial development; will build repositories of animal models and of human samples and data for community access and analysis; and will develop software tools and animal models for analyzing and functionally testing and integrating these data. The FaceBase website (http://www.facebase.org) will serve as a web home for these efforts, providing interactive tools for exploring these datasets, together with discussion forums and other services to support and foster collaboration within the craniofacial research community. PMID:21458441

  19. Technical Advances and Fifth Grade Reading Comprehension: Do Students Benefit?

    ERIC Educational Resources Information Center

    Fountaine, Drew

    This paper takes a look at some recent studies on utilization of technical tools, primarily personal computers and software, for improving fifth-grade students' reading comprehension. Specifically, the paper asks what benefits an educator can expect students to derive from closed-captioning and computer-assisted reading comprehension products. It…

  20. A Low Vision Reading Comprehension Test.

    ERIC Educational Resources Information Center

    Watson, G. R.; And Others

    1996-01-01

    Fifty adults (ages 28-86) with macular degeneration were given the Low Vision Reading Comprehension Assessment (LVRCA) to test its reliability and validity in evaluating the reading comprehension of those with vision impairments. The LVRCA was found to take only nine minutes to administer and was a valid and reliable tool. (CR)

  1. Detailed Measurements of the Aeroelastic Response of a Rigid Coaxial Rotor in Hover

    DTIC Science & Technology

    2017-08-11

included: hover testing of single and CCR rotors (Year 1), deformation measurement and modal identification of rotor blades in the non-rotating and...the rotor blades, as well as the detailed experimental data were shared with Dr. Rajneesh Singh and Dr. Hao Kang at Vehicle Technology Directorate...(VTD), Aberdeen Proving Grounds, MD. In this way, the experimental data could be used to validate US Army comprehensive analysis tools, specifically

  2. Metabolomic fingerprinting employing DART-TOFMS for authentication of tomatoes and peppers from organic and conventional farming.

    PubMed

    Novotná, H; Kmiecik, O; Gałązka, M; Krtková, V; Hurajová, A; Schulzová, V; Hallmann, E; Rembiałkowska, E; Hajšlová, J

    2012-01-01

    The rapidly growing demand for organic food requires the availability of analytical tools enabling their authentication. Recently, metabolomic fingerprinting/profiling has been demonstrated as a challenging option for a comprehensive characterisation of small molecules occurring in plants, since their pattern may reflect the impact of various external factors. In a two-year pilot study, concerned with the classification of organic versus conventional crops, ambient mass spectrometry consisting of a direct analysis in real time (DART) ion source and a time-of-flight mass spectrometer (TOFMS) was employed. This novel methodology was tested on 40 tomato and 24 pepper samples grown under specified conditions. To calculate statistical models, the obtained data (mass spectra) were processed by the principal component analysis (PCA) followed by linear discriminant analysis (LDA). The results from the positive ionisation mode enabled better differentiation between organic and conventional samples than the results from the negative mode. In this case, the recognition ability obtained by LDA was 97.5% for tomato and 100% for pepper samples and the prediction abilities were above 80% for both sample sets. The results suggest that the year of production had stronger influence on the metabolomic fingerprints compared with the type of farming (organic versus conventional). In any case, DART-TOFMS is a promising tool for rapid screening of samples. Establishing comprehensive (multi-sample) long-term databases may further help to improve the quality of statistical classification models.
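
    The PCA-then-LDA pipeline named in the abstract maps directly onto standard library calls; the sketch below runs it on synthetic stand-ins for DART-TOFMS fingerprints rather than the study's data.

    ```python
    # Synthetic stand-ins for DART-TOFMS fingerprints (rows = samples,
    # columns = m/z bins); the pipeline mirrors the abstract's PCA -> LDA.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(7)
    X = rng.normal(size=(64, 300))
    y = np.repeat(["organic", "conventional"], 32)
    X[y == "organic", :10] += 0.8   # inject a small class difference

    model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
    acc = cross_val_score(model, X, y, cv=4).mean()
    print(f"cross-validated recognition ability: {acc:.0%}")
    ```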

  3. Correlates of lower comprehension of informed consent among participants enrolled in a cohort study in Pune, India

    PubMed Central

    Joglekar, Neelam S.; Deshpande, Swapna S.; Sahay, Seema; Ghate, Manisha V.; Bollinger, Robert C.; Mehendale, Sanjay M.

    2013-01-01

Background Optimum comprehension of informed consent by research participants is essential yet challenging. This study explored correlates of lower comprehension of informed consent among 1334 participants of a cohort study aimed at estimating HIV incidence in Pune, India. Methods As part of the informed consent process, a structured comprehension tool was administered to study participants. Participants scoring ≥90% were categorised into the ‘optimal comprehension group’, whilst those scoring 80–89% were categorised into the ‘lower comprehension group’. Data were analysed to identify sociodemographic and behavioural correlates of lower consent comprehension. Results The mean ± SD comprehension score was 94.4 ± 5.00%. Information pertaining to study-related risks was not comprehended by 61.7% of participants. HIV-negative men (adjusted OR [AOR] = 4.36, 95% CI 1.71–11.05), HIV-negative women (AOR = 13.54, 95% CI 6.42–28.55), illiterate participants (AOR = 1.65, 95% CI 1.19–2.30), those with a history of multiple partners (AOR = 1.73, 95% CI 1.12–2.66) and those never using condoms (AOR = 1.35, 95% CI 1.01–1.82) were more likely to have lower consent comprehension. Conclusions We recommend exploration of domains of lower consent comprehension using a validated consent comprehension tool. Improved education in these specific domains would optimise consent comprehension among research participants. PMID:24029848

  4. GenoBase: comprehensive resource database of Escherichia coli K-12.

    PubMed

    Otsuka, Yuta; Muto, Ai; Takeuchi, Rikiya; Okada, Chihiro; Ishikawa, Motokazu; Nakamura, Koichiro; Yamamoto, Natsuko; Dose, Hitomi; Nakahigashi, Kenji; Tanishima, Shigeki; Suharnan, Sivasundaram; Nomura, Wataru; Nakayashiki, Toru; Aref, Walid G; Bochner, Barry R; Conway, Tyrrell; Gribskov, Michael; Kihara, Daisuke; Rudd, Kenneth E; Tohsato, Yukako; Wanner, Barry L; Mori, Hirotada

    2015-01-01

Comprehensive experimental resources, such as ORFeome clone libraries and deletion mutant collections, are fundamental tools for elucidation of gene function. Data sets from omics analyses using these resources provide key information for functional analysis, modeling and simulation in both individual and systematic approaches. With the long-term goal of complete understanding of a cell, we have over the past decade created a variety of clone and mutant sets for functional genomics studies of Escherichia coli K-12. We have made these experimental resources freely available to the academic community worldwide. Accordingly, these resources have now been used in numerous investigations of a multitude of cell processes. Quality control is extremely important for evaluating results generated by these resources. Because the genome annotation we originally used for their construction in 2005 has since been revised, we have updated these genomic resources accordingly. Here, we describe GenoBase (http://ecoli.naist.jp/GB/), which contains key information about comprehensive experimental resources of E. coli K-12, their quality control and several omics data sets generated using these resources. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. Energy and Exergy Analysis of Vapour Absorption Refrigeration Cycle—A Review

    NASA Astrophysics Data System (ADS)

    Kanabar, Bhaveshkumar Kantilal; Ramani, Bharatkumar Maganbhai

    2016-07-01

    In recent years, the energy crisis and rising energy consumption have become global problems that restrict sustainable growth. In this context, the scientific recovery and utilization of various kinds of waste heat become very important. Waste heat can be utilized in many ways, and one of the best practices is to use it to drive a vapour absorption refrigeration system. To ensure efficient operation of the absorption cycle and optimal utilization of heat, exergy analysis is the best tool. This paper provides a comprehensive picture of the research and development of absorption refrigeration technology, with practical and theoretical analyses of different arrangements of the cycle.
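
    For orientation, the figures of merit at the heart of such an analysis are the energetic COP and the exergetic efficiency. The sketch below evaluates both for a single-effect absorption chiller using the standard Carnot-factor expressions; all temperatures and heat duties are assumed example values, not results from the review.

      # dead-state (ambient), evaporator and generator temperatures [K]
      T0, Te, Tg = 298.15, 278.15, 363.15
      Q_evap, Q_gen, W_pump = 10.0, 14.0, 0.1        # heat duties and pump work [kW]

      cop = Q_evap / (Q_gen + W_pump)                # energetic coefficient of performance
      ex_cooling = Q_evap * (T0 / Te - 1.0)          # exergy of the cooling effect
      ex_input = Q_gen * (1.0 - T0 / Tg) + W_pump    # exergy supplied to the cycle
      print(f"COP = {cop:.3f}, exergetic efficiency = {ex_cooling / ex_input:.3f}")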

  6. Systems biology of embryonic development: Prospects for a complete understanding of the Caenorhabditis elegans embryo.

    PubMed

    Murray, John Isaac

    2018-05-01

    The convergence of developmental biology and modern genomics tools brings the potential for a comprehensive understanding of developmental systems. This is especially true for the Caenorhabditis elegans embryo because its small size, invariant developmental lineage, and powerful genetic and genomic tools provide the prospect of a cellular resolution understanding of messenger RNA (mRNA) expression and regulation across the organism. We describe here how a systems biology framework might allow large-scale determination of the embryonic regulatory relationships encoded in the C. elegans genome. This framework consists of two broad steps: (a) defining the "parts list"-all genes expressed in all cells at each time during development and (b) iterative steps of computational modeling and refinement of these models by experimental perturbation. Substantial progress has been made towards defining the parts list through imaging methods such as large-scale green fluorescent protein (GFP) reporter analysis. Imaging results are now being augmented by high-resolution transcriptome methods such as single-cell RNA sequencing, and it is likely the complete expression patterns of all genes across the embryo will be known within the next few years. In contrast, the modeling and perturbation experiments performed so far have focused largely on individual cell types or genes, and improved methods will be needed to expand them to the full genome and organism. This emerging comprehensive map of embryonic expression and regulatory function will provide a powerful resource for developmental biologists, and would also allow scientists to ask questions not accessible without a comprehensive picture. This article is categorized under: Invertebrate Organogenesis > Worms Technologies > Analysis of the Transcriptome Gene Expression and Transcriptional Hierarchies > Gene Networks and Genomics. © 2018 Wiley Periodicals, Inc.

  7. [Tools to assess the impact on health of public health programmes and community interventions from an equity perspective].

    PubMed

    Suárez Álvarez, Óscar; Fernández-Feito, Ana; Vallina Crespo, Henar; Aldasoro Unamuno, Elena; Cofiño, Rafael

    2018-05-11

    It is essential to develop a comprehensive approach to institutionally promoted interventions to assess their impact on health from the perspective of the social determinants of health and equity. Simple, adapted tools must be developed to carry out these assessments. The aim of this paper is to present two tools to assess the impact of programmes and community-based interventions on the social determinants of health. The first tool is intended to assess health programmes through interviews and analysis of information provided by the assessment team. The second tool, by means of online assessments of community-based interventions, also enables a report on inequality issues that includes recommendations for improvement. In addition to reducing health-related social inequities, the implementation of these tools can also help to improve the efficiency of public health interventions. Copyright © 2018 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.

  8. A survey on annotation tools for the biomedical literature.

    PubMed

    Neves, Mariana; Leser, Ulf

    2014-03-01

    New approaches to biomedical text mining crucially depend on the existence of comprehensive annotated corpora. Such corpora, commonly called gold standards, are important for learning patterns or models during the training phase, for evaluating and comparing the performance of algorithms and also for better understanding, by means of examples, the information sought. Gold standards depend on human understanding and manual annotation of natural language text. This process is very time-consuming and expensive because it requires high intellectual effort from domain experts. Accordingly, the lack of gold standards is considered one of the main bottlenecks for developing novel text mining methods. This situation has led to the development of tools that support humans in annotating texts. Such tools should be intuitive to use, should support a range of different input formats, should include visualization of annotated texts and should generate an easy-to-parse output format. Today, a range of tools implementing some of these functionalities is available. Here, we present a comprehensive survey of tools for supporting annotation of biomedical texts. Altogether, we considered almost 30 tools, 13 of which were selected for an in-depth comparison. The comparison was performed using predefined criteria and was accompanied by hands-on experience whenever possible. Our survey shows that current tools can support many of the tasks in biomedical text annotation in a satisfying manner, but also that no tool can be considered a true comprehensive solution.

  9. Review of software tools for design and analysis of large scale MRM proteomic datasets.

    PubMed

    Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-06-15

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine them for a comprehensive targeted proteomics workflow. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
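
    Conceptually, an MRM transition is just a (precursor m/z, product m/z) pair, and scheduling attaches a retention-time window so each transition is monitored only while its peptide elutes. The data-structure sketch below illustrates this idea; the peptide, m/z values and window width are illustrative placeholders, not entries from a validated assay.

      from dataclasses import dataclass

      @dataclass
      class Transition:
          peptide: str
          precursor_mz: float
          product_mz: float
          rt_min: float          # expected retention time [min]
          window: float = 2.0    # +/- scheduling window [min]

          def active(self, rt: float) -> bool:
              """Is this transition monitored at chromatographic time rt?"""
              return abs(rt - self.rt_min) <= self.window

      transitions = [
          Transition("LVNEVTEFAK", 575.31, 937.46, 18.4),
          Transition("LVNEVTEFAK", 575.31, 694.37, 18.4),
      ]
      # the number of concurrently active transitions drives the instrument duty cycle
      print(sum(t.active(18.0) for t in transitions))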

  10. Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base

    NASA Technical Reports Server (NTRS)

    Bryant, Richard B., Jr.; Carrelli, David J.

    2006-01-01

    The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base which was used primarily for safety loads prediction, design of the closed loop compensator and development of the motion base safety systems [1]. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time history displays, strip chart displays, data storage, and initializing of function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base, simulate the end-to-end operation of the motion base system providing facilities for software-in-the-loop testing, mechanical geometry and sensor data visualizations, and function generator setup and evaluation.

  11. Ecosystem Risk Assessment Using the Comprehensive Assessment of Risk to Ecosystems (CARE) Tool

    NASA Astrophysics Data System (ADS)

    Battista, W.; Fujita, R.; Karr, K.

    2016-12-01

    Effective Ecosystem Based Management requires a localized understanding of the health and functioning of a given system as well as of the various factors that may threaten the ongoing ability of the system to support the provision of valued services. Several risk assessment models are available that can provide a scientific basis for understanding these factors and for guiding management action, but these models focus mainly on single species and evaluate only the impacts of fishing in detail. We have developed a new ecosystem risk assessment model - the Comprehensive Assessment of Risk to Ecosystems (CARE) - that allows analysts to consider the cumulative impact of multiple threats, interactions among multiple threats that may result in synergistic or antagonistic impacts, and the impacts of a suite of threats on whole-ecosystem productivity and functioning, as well as on specific ecosystem services. The CARE model was designed to be completed in as little as two hours, and uses local and expert knowledge where data are lacking. The CARE tool can be used to evaluate risks facing a single site; to compare multiple sites for the suitability or necessity of different management options; or to evaluate the effects of a proposed management action aimed at reducing one or more risks. This analysis can help users identify which threats are the most important at a given site, and therefore where limited management resources should be targeted. CARE can be applied to virtually any system, and can be modified as knowledge is gained or to better match different site characteristics. CARE builds on previous ecosystem risk assessment tools to provide a comprehensive assessment of fishing and non-fishing threats that can be used to inform environmental management decisions across a broad range of systems.

  13. Culture shock and healthcare workers in remote Indigenous communities of Australia: what do we know and how can we measure it?

    PubMed

    Muecke, A; Lenthall, S; Lindeman, M

    2011-01-01

    Culture shock or cultural adaptation is a significant issue confronting non-Indigenous health professionals working in remote Indigenous communities in Australia. This article is presented in two parts. The first part provides a thorough background in the theory of culture shock and cultural adaptation, and a comprehensive analysis of the consequences, causes, and current issues around the phenomenon in the remote Australian healthcare context. Second, the article presents the results of a comprehensive literature review undertaken to determine if existing studies provide tools which may measure the cultural adaptation of remote health professionals. A comprehensive literature review was conducted utilising the meta-databases CINAHL and Ovid Medline. While there is a plethora of descriptive literature about culture shock and cultural adaptation, empirical evidence is lacking. In particular, no empirical evidence was found relating to the cultural adaptation of non-Indigenous health professionals working in Indigenous communities in Australia. In all, 15 international articles were found that provided empirical evidence to support the concept of culture shock. Of these, only 2 articles contained tools that met the pre-determined selection criteria to measure the stages of culture shock. The 2 instruments identified were the Culture Shock Profile (CSP) by Zapf and the Culture Shock Adaptation Inventory (CSAI) by Juffer. There is sufficient evidence to determine that culture shock is a significant issue for non-Indigenous health professionals working in Indigenous communities in Australia. However, further research in this area is needed. The available empirical evidence indicates that a measurement tool is possible but needs further development to be suitable for use in remote Indigenous communities in Australia.

  14. Optimization of Surface Roughness Parameters of Al-6351 Alloy in EDC Process: A Taguchi Coupled Fuzzy Logic Approach

    NASA Astrophysics Data System (ADS)

    Kar, Siddhartha; Chakraborty, Sujoy; Dey, Vidyut; Ghosh, Subrata Kumar

    2017-10-01

    This paper investigates the application of the Taguchi method with fuzzy logic for multi-objective optimization of roughness parameters in the electro discharge coating process of Al-6351 alloy with a powder metallurgically compacted SiC/Cu tool. A Taguchi L16 orthogonal array was employed to investigate the roughness parameters by varying tool parameters like composition and compaction load and electro discharge machining parameters like pulse-on time and peak current. Crucial roughness parameters, namely centre-line average roughness, average maximum height of the profile and mean spacing of local peaks of the profile, were measured on the coated specimens. The signal-to-noise ratios were fuzzified to optimize the roughness parameters through a single comprehensive output measure (COM). The best COM was obtained with lower values of compaction load, pulse-on time and current, and a 30:70 (SiC:Cu) tool composition. Analysis of variance was carried out, and a significant COM model was observed, with peak current yielding the highest contribution, followed by pulse-on time, compaction load and composition. The deposited layer was characterised by X-ray diffraction analysis, which confirmed the presence of tool materials on the workpiece surface.
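
    The signal-to-noise (S/N) ratio that feeds the fuzzification step is Taguchi's standard transform of the replicate measurements; for roughness responses, the smaller-the-better form applies. A small sketch of that single step, with made-up replicate readings:

      import numpy as np

      def sn_smaller_the_better(y):
          """Taguchi smaller-the-better S/N ratio: -10 * log10(mean(y^2))."""
          y = np.asarray(y, dtype=float)
          return -10.0 * np.log10(np.mean(y ** 2))

      # hypothetical replicate roughness readings (um) for two L16 runs
      print(sn_smaller_the_better([3.2, 3.5, 3.1]))
      print(sn_smaller_the_better([5.0, 5.4, 4.8]))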

  15. Report on Automated Semantic Analysis of Scientific and Engineering Codes

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.; Follen, Greg (Technical Monitor)

    2001-01-01

    The loss of the Mars Climate Orbiter due to a software error reveals what insiders know: software development is difficult and risky because, in part, current practices do not readily handle the complex details of software. Yet, for scientific software development the MCO mishap represents the tip of the iceberg; few errors are so public, and many errors are avoided with a combination of expertise, care, and testing during development and modification. Further, this effort consumes valuable time and resources even when hardware costs and execution time continually decrease. Software development could use better tools! This lack of tools has motivated the semantic analysis work explained in this report. However, this work has a distinguishing emphasis; the tool focuses on automated recognition of the fundamental mathematical and physical meaning of scientific code. Further, its comprehension is measured by quantitatively evaluating overall recognition with practical codes. This emphasis is necessary if software errors-like the MCO error-are to be quickly and inexpensively avoided in the future. This report evaluates the progress made with this problem. It presents recommendations, describes the approach, the tool's status, the challenges, related research, and a development strategy.

  16. GAMES II Project: a general architecture for medical knowledge-based systems.

    PubMed

    Bruno, F; Kindler, H; Leaning, M; Moustakis, V; Scherrer, J R; Schreiber, G; Stefanelli, M

    1994-10-01

    GAMES II aims at developing a comprehensive and commercially viable methodology to avoid problems ordinarily occurring in KBS development. The GAMES II methodology proposes to design a KBS starting from an epistemological model of medical reasoning (the Select and Test Model). The design is viewed as a process of adding symbol-level information to the epistemological model. The architectural framework provided by GAMES II integrates the use of different formalisms and techniques, providing a large set of tools. The user can select the most suitable one for representing a piece of knowledge after a careful analysis of its epistemological characteristics. Special attention is devoted to the tools dealing with knowledge acquisition (both manual and automatic). A panel of practicing physicians is assessing the medical value of such a framework and its related tools by using it in a practical application.

  17. PaintOmics 3: a web resource for the pathway analysis and visualization of multi-omics data.

    PubMed

    Hernández-de-Diego, Rafael; Tarazona, Sonia; Martínez-Mira, Carlos; Balzano-Nogueira, Leandro; Furió-Tarí, Pedro; Pappas, Georgios J; Conesa, Ana

    2018-05-25

    The increasing availability of multi-omic platforms poses new challenges to data analysis. Joint visualization of multi-omics data is instrumental in better understanding interconnections across molecular layers and in fully utilizing the multi-omic resources available to make biological discoveries. We present here PaintOmics 3, a web-based resource for the integrated visualization of multiple omic data types onto KEGG pathway diagrams. PaintOmics 3 combines server-end capabilities for data analysis with the potential of modern web resources for data visualization, providing researchers with a powerful framework for interactive exploration of their multi-omics information. Unlike other visualization tools, PaintOmics 3 covers a comprehensive pathway analysis workflow, including automatic feature name/identifier conversion, multi-layered feature matching, pathway enrichment, network analysis, interactive heatmaps, trend charts, and more. It accepts a wide variety of omic types, including transcriptomics, proteomics and metabolomics, as well as region-based approaches such as ATAC-seq or ChIP-seq data. The tool is freely available at www.paintomics.org.
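
    The pathway-enrichment step in such workflows is commonly a hypergeometric (one-sided Fisher) test per pathway. The sketch below shows that generic calculation; the gene counts are invented, and this is not necessarily the exact statistic PaintOmics 3 implements.

      from scipy.stats import hypergeom

      # N genes in the background, K of them in the pathway,
      # n significant genes in the experiment, k of those in the pathway
      N, K, n, k = 20000, 150, 500, 12
      p = hypergeom.sf(k - 1, N, K, n)   # P(X >= k) under the hypergeometric null
      print(f"enrichment p-value = {p:.3g}")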

  18. The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences

    PubMed Central

    Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker

    2016-01-01

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant’s platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses. PMID:26752627

  19. A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery

    DTIC Science & Technology

    2014-10-20

    [Text garbled in extraction; recoverable details: report AFRL-RH-WP-TR-2014-0131, "A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery"; Distribution A, approved for public release. The surviving fragments describe plotting data by attribute tags, e.g. Strain (with tags AKR, B6, BALB_B) and MUP Protein (with tags Intact, Denatured).]

  20. The Comprehensive, Powerful, Academic Database (CPAD): An Evaluative Study of a Predictive Tool Designed for Elementary School Personnel in Identifying At-Risk Students through Progress, Curriculum, and Performance Monitoring

    ERIC Educational Resources Information Center

    Chavez-Gibson, Sarah

    2013-01-01

    The purpose of this study is to examine, in depth, the Comprehensive, Powerful, Academic Database (CPAD), a data decision-making tool that identifies students at risk of dropping out of school, and how the CPAD assists administrators and teachers at an elementary campus in monitoring progress, curriculum, and performance to improve student…

  1. Parameter optimization, sensitivity, and uncertainty analysis of an ecosystem model at a forest flux tower site in the United States

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende

    2014-01-01

    Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes, from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY that considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our EDCM-Auto tool, which incorporates the comprehensive R package Flexible Modeling Framework (FME) and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the model cost function is most sensitive to the plant production-related parameters (e.g., PPDF1 and PRDX). Both SCE and FME performed well and comparably in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicates that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R packages such as FME can be feasible and attractive for conducting comprehensive parameter analysis in ecosystem modeling.
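
    The local sensitivity measure used in such studies is typically a normalized finite-difference derivative of the model output (or cost function) with respect to each parameter. A minimal sketch of that calculation, with a toy response function standing in for EDCM and the parameter names PPDF1/PRDX borrowed from the abstract:

      import numpy as np

      def model(params):
          """Toy stand-in for an ecosystem model response (not EDCM itself)."""
          ppdf1, prdx = params
          return prdx * np.exp(-0.5 * ((25.0 - ppdf1) / 5.0) ** 2)

      def local_sensitivity(f, p0, rel_step=0.01):
          """Normalized sensitivities S_i = (p_i / f) * df/dp_i by finite differences."""
          p0 = np.asarray(p0, dtype=float)
          f0 = f(p0)
          sens = []
          for i in range(p0.size):
              p = p0.copy()
              h = rel_step * p[i]
              p[i] += h
              sens.append((p0[i] / f0) * (f(p) - f0) / h)
          return np.array(sens)

      print(local_sensitivity(model, [22.0, 3.0]))   # [S_PPDF1, S_PRDX]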

  2. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    1990-01-01

    A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite bonded joints below the cryogenic temperature of 30K (-405 F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings and composite gusset/clip joints all bonded to M55J/954-6 and T300/954-6 hybrid composite tubes (75mm square). Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty in testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on developed techniques and properties.

  3. TopoMS: Comprehensive topological exploration for molecular and condensed-matter systems.

    PubMed

    Bhatia, Harsh; Gyulassy, Attila G; Lordi, Vincenzo; Pask, John E; Pascucci, Valerio; Bremer, Peer-Timo

    2018-06-15

    We introduce TopoMS, a computational tool enabling detailed topological analysis of molecular and condensed-matter systems, including the computation of atomic volumes and charges through the quantum theory of atoms in molecules, as well as the complete molecular graph. With roots in techniques from computational topology, and using a shared-memory parallel approach, TopoMS provides scalable, numerically robust, and topologically consistent analysis. TopoMS can be used as a command-line tool or with a GUI (graphical user interface), where the latter also enables an interactive exploration of the molecular graph. This paper presents algorithmic details of TopoMS and compares it with state-of-the-art tools: Bader charge analysis v1.0 (Arnaldsson et al., 01/11/17) and molecular graph extraction using Critic2 (Otero-de-la-Roza et al., Comput. Phys. Commun. 2014, 185, 1007). TopoMS not only combines the functionality of these individual codes but also demonstrates up to 4× performance gain on a standard laptop, faster convergence to fine-grid solution, robustness against lattice bias, and topological consistency. TopoMS is released publicly under BSD License. © 2018 Wiley Periodicals, Inc.

  4. Screening in toddlers and preschoolers at risk for autism spectrum disorder: Evaluating a novel mobile-health screening tool.

    PubMed

    Kanne, Stephen M; Carpenter, Laura Arnstein; Warren, Zachary

    2018-05-07

    There are many available tools with varying levels of accuracy designed to screen for Autism Spectrum Disorder (ASD) in young children, both in the general population and specifically among those referred for developmental concerns. With burgeoning waitlists for comprehensive diagnostic ASD assessments, finding accurate methods and tools for advancing diagnostic triage becomes increasingly important. The current study compares the efficacy of four often-used paper-and-pencil measures (the Modified Checklist for Autism in Toddlers, Revised with Follow-up; the Social Responsiveness Scale, Second Edition; the Social Communication Questionnaire; and the Child Behavior Checklist) to a novel mobile-health screening tool developed by Cognoa, Inc. (Cognoa) in a group of children 18-72 months of age. The Cognoa tool may have potential benefits, as it integrates a series of parent-report questions with remote clinical ratings of brief video segments uploaded via parents' smartphones to calculate level of ASD risk. Participants were referred to one of three tertiary care diagnostic centers for ASD-related concerns (n = 230) and received a best-estimate ASD diagnosis. Analysis and comparison of psychometric properties indicated potential advantages for Cognoa within this clinical sample across age ranges not often covered by another single measure/tool. Autism Res 2018. © 2018 International Society for Autism Research, Wiley Periodicals, Inc. Lay summary: With wait times getting longer for comprehensive Autism Spectrum Disorder (ASD) diagnostic assessments, it is becoming increasingly important to find accurate tools to screen for ASD. The current study compares four screening measures that have been in use for some time to a novel mobile-health screening tool, called Cognoa, which integrates parent-report questions with clinical ratings of brief video segments uploaded via parents' smartphones to calculate ASD risk. Two hundred thirty children referred to one of three ASD specialty diagnostic centers participated in the study. A direct comparison indicated potential advantages for Cognoa not often covered by another single measure/tool. © 2018 International Society for Autism Research, Wiley Periodicals, Inc.

  5. Reliability analysis in the Office of Safety, Environmental, and Mission Assurance (OSEMA)

    NASA Astrophysics Data System (ADS)

    Kauffmann, Paul J.

    1994-12-01

    The technical personnel in the SEMA office are working to provide the highest degree of value-added activities to their support of the NASA Langley Research Center mission. Management perceives that reliability analysis tools and an understanding of a comprehensive systems approach to reliability will be a foundation of this change process. Since the office is involved in a broad range of activities supporting space mission projects and operating activities (such as wind tunnels and facilities), it was not clear what reliability tools the office should be familiar with and how these tools could serve as a flexible knowledge base for organizational growth. Interviews and discussions with the office personnel (both technicians and engineers) revealed that job responsibilities ranged from incoming inspection to component or system analysis to safety and risk. It was apparent that a broad base in applied probability and reliability along with tools for practical application was required by the office. A series of ten class sessions with a duration of two hours each was organized and scheduled. Hand-out materials were developed and practical examples based on the type of work performed by the office personnel were included. Topics covered were: Reliability Systems - a broad system oriented approach to reliability; Probability Distributions - discrete and continuous distributions; Sampling and Confidence Intervals - random sampling and sampling plans; Data Analysis and Estimation - Model selection and parameter estimates; and Reliability Tools - block diagrams, fault trees, event trees, FMEA. In the future, this information will be used to review and assess existing equipment and processes from a reliability system perspective. An analysis of incoming materials sampling plans was also completed. This study looked at the issues associated with Mil Std 105 and changes for a zero defect acceptance sampling plan.

  7. Power processing methodology. [computerized design of spacecraft electric power systems

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hansen, I. G.; Hayden, J. H.

    1974-01-01

    Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.

  8. Establishing a Measurement Tool for a Nursing Work Environment in Taiwan.

    PubMed

    Lin, Li-Chiu; Lee, Huan-Fang; Yen, Miaofen

    2017-02-01

    The nursing work environment is a critical global health care problem. Many health care providers are concerned about the associations between the nursing work environment and the outcomes of organizations, nurses, and patients. Nursing work environment instruments have been assessed in the West but have not been considered in Asia. However, different cultures will affect the factorial structure of the tool. Using a stratified nationwide random sample, we created a measurement tool for the nursing work environment in Taiwan. The Nursing Work Environment Index-Revised Scale and the Essentials of Magnetism scale were used to examine the factorial structure. Item analysis, exploratory factor analysis, and confirmatory factor analysis were used to examine the hypothesized model and generate a new factorial structure. The Taiwan Nursing Work Environment Index (TNWEI) was established to evaluate the nursing work environment in Taiwan. The four factors were labeled "Organizational Support" (7 items), "Nurse Staffing and Resources" (4 items), "Nurse-Physician Collaboration" (4 items), and "Support for Continuing Education" (4 items). The 19 items explained 58.5% of the variance. Confirmatory factor analysis showed a good fit to the model (χ²/df = 5.99, p < .05; goodness-of-fit index [GFI] = .90; RMSEA = .07). The TNWEI provides a comprehensive and efficient method for measuring the nurses' work environment in Taiwan.

  9. Effect-directed analysis supporting monitoring of aquatic ...

    EPA Pesticide Factsheets

    Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone, but tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and faces increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxi…

  10. In Silico PCR Tools for a Fast Primer, Probe, and Advanced Searching.

    PubMed

    Kalendar, Ruslan; Muterko, Alexandr; Shamekova, Malika; Zhambakin, Kabyl

    2017-01-01

    The polymerase chain reaction (PCR) is fundamental to molecular biology and is the most important practical molecular technique for the research laboratory. Its principle has been further used and applied in many other simple or complex nucleic acid amplification technologies (NAAT). In parallel to laboratory "wet bench" experiments for nucleic acid amplification technologies, in silico or virtual (bioinformatics) approaches have been developed, among them in silico PCR analysis. In silico NAAT analysis is a useful and efficient complementary method for ensuring the specificity of primers or probes across an extensive range of PCR applications, from homology gene discovery, molecular diagnosis and DNA fingerprinting to repeat searching. Predicting the sensitivity and specificity of primers and probes requires a search to determine whether they match a database with an optimal number of mismatches, similarity, and stability. In the development of in silico bioinformatics tools for nucleic acid amplification technologies, the prospects for the development of new NAAT or similar approaches should be taken into account, including forward-looking and comprehensive analysis that is not limited to only one PCR technique variant. The software FastPCR and the online Java web tool are integrated tools for in silico PCR of linear and circular DNA, multiple primer or probe searches in large or small databases, and advanced searches. These tools are suitable for processing batch files, which is essential for automation when working with large amounts of data. The FastPCR software is available for download at http://primerdigital.com/fastpcr.html and the online Java version at http://primerdigital.com/tools/pcr.html.
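
    The core of any in silico PCR search is locating primer binding sites on a template while tolerating a bounded number of mismatches, then pairing forward and reverse hits into candidate amplicons. The sketch below shows only that kernel on a toy sequence; real tools such as FastPCR additionally score 3'-end stability, melting temperature and probe similarity, all omitted here.

      def revcomp(seq):
          """Reverse complement of an uppercase DNA string."""
          return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

      def find_sites(template, primer, max_mismatch=1):
          """Start positions where primer matches template with <= max_mismatch."""
          return [i for i in range(len(template) - len(primer) + 1)
                  if sum(a != b for a, b in zip(template[i:i + len(primer)], primer))
                  <= max_mismatch]

      template = "ATGCGTACGTTAGCATCGGATCCGTACGATCGTTAGCTAGCATCG"   # toy sequence
      fwd, rev = "ATGCGTACGT", "CGATGCTAGC"                       # both given 5'->3'
      for f in find_sites(template, fwd):
          for r in find_sites(template, revcomp(rev)):            # map reverse primer onto top strand
              end = r + len(rev)
              if end > f:
                  print(f"candidate amplicon: {f}..{end}, length {end - f}")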

  11. Points of attention in designing tools for regional brownfield prioritization.

    PubMed

    Limasset, Elsa; Pizzol, Lisa; Merly, Corinne; Gatchett, Annette M; Le Guern, Cécile; Martinát, Stanislav; Klusáček, Petr; Bartke, Stephan

    2018-05-01

    The regeneration of brownfields has been increasingly recognized as a key instrument in sustainable land management, since freely developable land (so-called "greenfields") has become a scarce and more expensive resource, especially in densely populated areas. However, the complexity of these sites requires considerable effort to complete revitalization projects successfully, and thus appropriate tools are needed to support decision makers in selecting promising sites to which limited financial resources can be efficiently allocated. The design of effective prioritization tools is a complex process, which requires the analysis and consideration of critical points of attention (PoAs) identified from the state of the art in the literature and from lessons learned in previous developments of regional brownfield (BF) prioritization processes, frameworks and tools. Accordingly, we identified 5 PoAs, namely 1) Assessing end-user needs and orientation discussions, 2) Availability and quality of the data needed for the BF prioritization tool, 3) Communication and stakeholder engagement, 4) Drivers of regeneration success, and 5) Financing and application costs. To deepen and collate the most recent knowledge on these topics from scientists and practitioners, we organized a focus group discussion within a special session at the AquaConSoil (ACS) conference 2017, where participants were asked to add their experience and thoughts to the discussion in order to identify the most significant and urgent points of attention in BF prioritization tool design. The result of this assessment is a comprehensive table (Table 2), which can support problem owners, investors, service providers, regulators, public and private land managers, decision makers, etc. in identifying the main aspects (sub-topics) to be considered and their relative influences, and in understanding the general patterns and challenges to be faced when developing BF prioritization tools. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Development and evaluation of a comprehensive clinical decision support taxonomy: comparison of front-end tools in commercial and internally developed electronic health record systems

    PubMed Central

    Sittig, Dean F; Ash, Joan S; Feblowitz, Joshua; Meltzer, Seth; McMullen, Carmit; Guappone, Ken; Carpenter, Jim; Richardson, Joshua; Simonaitis, Linas; Evans, R Scott; Nichol, W Paul; Middleton, Blackford

    2011-01-01

    Background Clinical decision support (CDS) is a valuable tool for improving healthcare quality and lowering costs. However, there is no comprehensive taxonomy of types of CDS and there has been limited research on the availability of various CDS tools across current electronic health record (EHR) systems. Objective To develop and validate a taxonomy of front-end CDS tools and to assess support for these tools in major commercial and internally developed EHRs. Study design and methods We used a modified Delphi approach with a panel of 11 decision support experts to develop a taxonomy of 53 front-end CDS tools. Based on this taxonomy, a survey on CDS tools was sent to a purposive sample of commercial EHR vendors (n=9) and leading healthcare institutions with internally developed state-of-the-art EHRs (n=4). Results Responses were received from all healthcare institutions and 7 of 9 EHR vendors (response rate: 85%). All 53 types of CDS tools identified in the taxonomy were found in at least one surveyed EHR system, but only 8 functions were present in all EHRs. Medication dosing support and order facilitators were the most commonly available classes of decision support, while expert systems (eg, diagnostic decision support, ventilator management suggestions) were the least common. Conclusion We developed and validated a comprehensive taxonomy of front-end CDS tools. A subsequent survey of commercial EHR vendors and leading healthcare institutions revealed a small core set of common CDS tools, but identified significant variability in the remainder of clinical decision support content. PMID:21415065

  13. Review of nutritional screening and assessment tools and clinical outcomes in heart failure.

    PubMed

    Lin, Hong; Zhang, Haifeng; Lin, Zheng; Li, Xinli; Kong, Xiangqin; Sun, Gouzhen

    2016-09-01

    Recent studies have suggested that undernutrition as defined using multidimensional nutritional evaluation tools may affect clinical outcomes in heart failure (HF). The evidence supporting this correlation is unclear. Therefore, we conducted this systematic review to critically appraise the use of multidimensional evaluation tools in the prediction of clinical outcomes in HF. We performed descriptive analyses of all identified articles involving qualitative analyses. We used STATA to conduct meta-analyses when at least three studies that tested the same type of nutritional assessment or screening tools and used the same outcome were identified. Sensitivity analyses were conducted to validate our positive results. We identified 17 articles with qualitative analyses and 11 with quantitative analysis after comprehensive literature searching and screening. We determined that the prevalence of malnutrition is high in HF (range 16-90 %), particularly in advanced and acute decompensated HF (approximate range 75-90 %). Undernutrition as identified by multidimensional evaluation tools may be significantly associated with hospitalization, length of stay and complications and is particularly strongly associated with high mortality. The meta-analysis revealed that compared with other tools, Mini Nutritional Assessment (MNA) scores were the strongest predictors of mortality in HF [HR (4.32, 95 % CI 2.30-8.11)]. Our results remained reliable after conducting sensitivity analyses. The prevalence of malnutrition is high in HF, particularly in advanced and acute decompensated HF. Moreover, undernutrition as identified by multidimensional evaluation tools is significantly associated with unfavourable prognoses and high mortality in HF.
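
    The pooled estimate quoted for the MNA is the kind of figure produced by inverse-variance random-effects meta-analysis of log hazard ratios. The sketch below implements the standard DerSimonian-Laird estimator; the first study's numbers match the HR quoted above, while the other two are invented placeholders.

      import numpy as np

      def pool_random_effects(hr, lo, hi):
          """DerSimonian-Laird random-effects pooling of hazard ratios with 95% CIs."""
          y = np.log(hr)                                # log hazard ratios
          se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE recovered from CI width
          w = 1 / se ** 2
          q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
          w_star = 1 / (se ** 2 + tau2)
          mu = np.sum(w_star * y) / np.sum(w_star)
          se_mu = np.sqrt(1 / np.sum(w_star))
          return np.exp([mu, mu - 1.96 * se_mu, mu + 1.96 * se_mu])  # pooled HR, CI

      print(pool_random_effects(np.array([4.32, 2.10, 3.00]),
                                np.array([2.30, 1.20, 1.50]),
                                np.array([8.11, 3.70, 6.00])))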

  14. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion systems mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  15. Kinetic Simulation and Energetic Neutral Atom Imaging of the Magnetosphere

    NASA Technical Reports Server (NTRS)

    Fok, Mei-Ching H.

    2011-01-01

    Advanced simulation tools and measurement techniques have been developed to study the dynamic magnetosphere and its response to drivers in the solar wind. The Comprehensive Ring Current Model (CRCM) is a kinetic code that solves for the 3D distribution in space, energy and pitch angle of energetic ions and electrons. Energetic Neutral Atom (ENA) imagers have been carried on past and current satellite missions, and the global morphology of energetic ions was revealed by the observed ENA images. We have combined simulation and ENA analysis techniques to study the development of ring current ions during magnetic storms and substorms. We identify the timing and location of particle injection and loss. We examine the evolution of ion energy and pitch-angle distribution during different phases of a storm. In this talk we will discuss the findings from our ring current studies and how our simulation and ENA analysis tools can be applied to the upcoming TRIO-CINEMA mission.

  16. Iterative Development of an Application to Support Nuclear Magnetic Resonance Data Analysis of Proteins.

    PubMed

    Ellis, Heidi J C; Nowling, Ronald J; Vyas, Jay; Martyn, Timothy O; Gryk, Michael R

    2011-04-11

    The CONNecticut Joint University Research (CONNJUR) team is a group of biochemical and software engineering researchers at multiple institutions. The vision of the team is to develop a comprehensive application that integrates a variety of existing analysis tools with workflow and data management to support the process of protein structure determination using Nuclear Magnetic Resonance (NMR). The use of multiple disparate tools and lack of data management, currently the norm in NMR data processing, provides strong motivation for such an integrated environment. This manuscript briefly describes the domain of NMR as used for protein structure determination and explains the formation of the CONNJUR team and its operation in developing the CONNJUR application. The manuscript also describes the evolution of the CONNJUR application through four prototypes and describes the challenges faced while developing the CONNJUR application and how those challenges were met.

  17. T7 lytic phage-displayed peptide libraries: construction and diversity characterization.

    PubMed

    Krumpe, Lauren R H; Mori, Toshiyuki

    2014-01-01

    In this chapter, we describe the construction of T7 bacteriophage (phage)-displayed peptide libraries and the diversity analyses of random amino acid sequences obtained from the libraries. We used commercially available reagents, Novagen's T7Select system, to construct the libraries. Using a combination of biotinylated extension primer and streptavidin-coupled magnetic beads, we were able to prepare library DNA without applying gel purification, resulting in extremely high ligation efficiencies. Further, we describe the use of bioinformatics tools to characterize library diversity. Amino acid frequency and positional amino acid diversity and hydropathy are estimated using the REceptor LIgand Contacts website http://relic.bio.anl.gov. Peptide net charge analysis and peptide hydropathy analysis are conducted using the Genetics Computer Group Wisconsin Package computational tools. A comprehensive collection of the estimated number of recombinants and titers of T7 phage-displayed peptide libraries constructed in our lab is included.
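
    The diversity analyses mentioned (amino acid frequency and positional diversity) reduce to counting residues per position and computing an entropy. A minimal sketch follows, with three made-up 7-mer clones standing in for sequenced library inserts.

      import math
      from collections import Counter

      peptides = ["ACDEFGH", "AKDWFGY", "ASDEFRH"]   # placeholder clone sequences

      for pos in range(len(peptides[0])):
          counts = Counter(p[pos] for p in peptides)
          total = sum(counts.values())
          # Shannon diversity (bits) at this randomized position
          h = -sum((c / total) * math.log2(c / total) for c in counts.values())
          freqs = {aa: round(c / total, 2) for aa, c in counts.items()}
          print(f"position {pos + 1}: H = {h:.2f} bits, frequencies = {freqs}")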

  18. The guideline implementability research and application network (GIRAnet): an international collaborative to support knowledge exchange: study protocol.

    PubMed

    Gagliardi, Anna R; Brouwers, Melissa C; Bhattacharyya, Onil K

    2012-04-02

    Modifying the format and content of guidelines may facilitate their use and lead to improved quality of care. We reviewed the medical literature to identify features desired by different users and associated with guideline use to develop a framework of implementability and found that most guidelines do not contain these elements. Further research is needed to develop and evaluate implementability tools. We are launching the Guideline Implementability Research and Application Network (GIRAnet) to enable the development and testing of implementability tools in three domains: Resource Implications, Implementation, and Evaluation. Partners include the Guidelines International Network (G-I-N) and its member guideline developers, implementers, and researchers. In phase one, international guidelines will be examined to identify and describe exemplar tools. Indication-specific and generic tools will populate a searchable repository. In phase two, qualitative analysis of cognitive interviews will be used to understand how developers can best integrate implementability tools in guidelines and how health professionals use them for interpreting and applying guidelines. In phase three, a small-scale pilot test will assess the impact of implementability tools based on quantitative analysis of chart-based behavioural outcomes and qualitative analysis of interviews with participants. The findings will be used to plan a more comprehensive future evaluation of implementability tools. Infrastructure funding to establish GIRAnet will be leveraged with the in-kind contributions of collaborating national and international guideline developers to advance our knowledge of implementation practice and science. Needs assessment and evaluation of GIRAnet will provide a greater understanding of how to develop and sustain such knowledge-exchange networks. Ultimately, by facilitating use of guidelines, this research may lead to improved delivery and outcomes of patient care.

  19. Sighten Final Technical Report DEEE0006690 Deploying an integrated and comprehensive solar financing software platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Conlan

    Over the project, Sighten built a comprehensive software-as-a-service (SaaS) platform to automate and streamline the residential solar financing workflow. Before the project period, significant time and money were spent by companies on front-end tools related to system design and proposal creation, but comparatively few resources were available to support the many back-end calculations and data management processes that underpin third party financing. Without a tool like Sighten, the solar financing processes involved passing information from the homeowner prospect into separate tools for system design, financing, and then later to reporting tools including Microsoft Excel, CRM software, in-house software, outside software, and offline, manual processes. Passing data between tools and attempting to connect disparate systems results in inefficiency and inaccuracy for the industry. Sighten was built to consolidate all financial and solar-related calculations in a single software platform. It significantly improves upon the accuracy of these calculations and exposes sophisticated new analysis tools resulting in a rigorous, efficient and cost-effective toolset for scaling residential solar. Widely deploying a platform like Sighten's significantly and immediately impacts the residential solar space in several important ways: 1) standardizing and improving the quality of all quantitative calculations involved in the residential financing process, most notably project finance, system production and reporting calculations; 2) representing a true step change in terms of reporting and analysis capabilities by maintaining more accurate data and exposing sophisticated tools around simulation, tranching, and financial reporting, among others, to all stakeholders in the space; 3) allowing a broader group of developers/installers/finance companies to access the capital markets by providing an out-of-the-box toolset that handles the execution of running investor capital through a rooftop solar financing program. Standardizing and improving all calculations, improving data quality, and exposing new analysis tools previously unavailable affects investment in the residential space in several important ways: 1) lowering the cost of capital for existing capital providers by mitigating uncertainty and de-risking the solar asset class; 2) attracting new, lower cost investors to the solar asset class as reporting and data quality resemble standards of more mature asset classes; 3) increasing the prevalence of liquidity options for investors through back leverage, securitization, or secondary sale by providing the tools necessary for lenders, ratings agencies, etc. to properly understand a portfolio of residential solar assets. During the project period, Sighten successfully built and scaled a commercially ready tool for the residential solar market. The software solution built by Sighten has been deployed with key target customer segments identified in the award deliverables: solar installers, solar developers/channel managers, and solar financiers, including lenders. Each of these segments greatly benefits from the availability of the Sighten toolset.

  20. Association of indoor smoke-free air laws with hospital admissions for acute myocardial infarction and stroke in three states.

    PubMed

    Loomis, Brett R; Juster, Harlan R

    2012-01-01

    To examine whether comprehensive smoke-free air laws enacted in Florida, New York, and Oregon are associated with reductions in hospital admissions for acute myocardial infarction (AMI) and stroke. We analyzed trends in county-level, age-adjusted hospital admission rates for AMI and stroke from 1990 to 2006 (quarterly) for Florida, 1995 to 2006 (monthly) for New York, and 1998 to 2006 (monthly) for Oregon to identify any association between admission rates and passage of comprehensive smoke-free air laws. Interrupted time series analysis was used to adjust for the effects of preexisting moderate local-level laws, seasonal variation in hospital admissions, differences across counties, and a secular time trend. More than 3 years after passage of statewide comprehensive smoke-free air laws, rates of hospitalization for AMI were reduced by 18.4% (95% CI: 8.8-28.0%) in Florida and 15.5% (95% CI: 11.0-20.1%) in New York. Rates of hospitalization for stroke were reduced by 18.1% (95% CI: 9.3-30.0%) in Florida. The few local comprehensive laws in Oregon were not associated with reductions in AMI or stroke statewide. Comprehensive smoke-free air laws are an effective policy tool for reducing the burden of AMI and stroke.
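
    The interrupted time series design used in this study can be sketched as a segmented regression. Below is a minimal illustration in Python with statsmodels on synthetic monthly admission data and an invented law-effective month; it omits the study's county effects and moderate-law adjustments, so it shows the technique rather than the authors' exact model.

    ```python
    # Segmented-regression sketch of an interrupted time series analysis.
    # All data are synthetic; the 15% post-law reduction is built in by hand.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_months = 144                        # 12 years of monthly data
    t = np.arange(n_months)
    law_month = 96                        # hypothetical effective month of the law
    post = (t >= law_month).astype(float)

    season = 5 * np.sin(2 * np.pi * t / 12)            # seasonal variation
    baseline = 200 - 0.1 * t + season                  # secular downward trend
    rate = baseline * np.where(post == 1, 0.85, 1.0)   # 15% reduction after the law
    rate += rng.normal(0, 5, n_months)

    # Level change (post) and slope change (post * time since law), plus seasonality
    X = sm.add_constant(np.column_stack([
        t, post, post * (t - law_month),
        np.sin(2 * np.pi * t / 12), np.cos(2 * np.pi * t / 12)]))
    fit = sm.OLS(rate, X).fit()
    print("Estimated level change at law passage:", round(fit.params[2], 2))
    ```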

  1. High-throughput DNA microarray detection of pathogenic bacteria in shallow well groundwater in the Kathmandu Valley, Nepal.

    PubMed

    Inoue, Daisuke; Hinoura, Takuji; Suzuki, Noriko; Pang, Junqin; Malla, Rabin; Shrestha, Sadhana; Chapagain, Saroj Kumar; Matsuzawa, Hiroaki; Nakamura, Takashi; Tanaka, Yasuhiro; Ike, Michihiko; Nishida, Kei; Sei, Kazunari

    2015-01-01

    Because of heavy dependence on groundwater for drinking water and other domestic use, microbial contamination of groundwater is a serious problem in the Kathmandu Valley, Nepal. This study comprehensively investigated the occurrence of pathogenic bacteria in shallow well groundwater in the Kathmandu Valley by applying DNA microarray analysis targeting 941 pathogenic bacterial species/groups. Water quality measurements found significant coliform (fecal) contamination in 10 of the 11 investigated groundwater samples and significant nitrogen contamination in some samples. The results of DNA microarray analysis revealed the presence of 1-37 pathogen species/groups, including 1-27 classified as biosafety level 2, in 9 of the 11 groundwater samples. While the detected pathogens included several feces- and animal-related species, those belonging to Legionella and Arthrobacter, which are not considered to be directly associated with feces, were detected most prevalently. This study provides a rough picture of overall pathogenic bacterial contamination in the Kathmandu Valley and demonstrates the usefulness of DNA microarray analysis as a comprehensive screening tool for a wide variety of pathogenic bacteria.

  2. MIPS: a database for genomes and protein sequences.

    PubMed Central

    Mewes, H W; Heumann, K; Kaps, A; Mayer, K; Pfeiffer, F; Stocker, S; Frishman, D

    1999-01-01

    The Munich Information Center for Protein Sequences (MIPS-GSF), Martinsried near Munich, Germany, develops and maintains genome-oriented databases. The amount of available sequence data is increasing rapidly, but the capacity for qualified manual annotation at the sequence databases is not. Therefore, our strategy aims to cope with the data stream through the comprehensive application of analysis tools to sequences of complete genomes, the systematic classification of protein sequences, and the active support of sequence analysis and functional genomics projects. This report describes the systematic and up-to-date analysis of genomes (PEDANT), a comprehensive database of the yeast genome (MYGD), a database reflecting the progress in sequencing the Arabidopsis thaliana genome (MATD), the database of assembled, annotated human EST clusters (MEST), and the collection of protein sequence data within the framework of the PIR-International Protein Sequence Database (described elsewhere in this volume). MIPS provides access through its WWW server (http://www.mips.biochem.mpg.de) to a spectrum of generic databases, including those mentioned above as well as a database of protein families (PROTFAM), the MITOP database, and the all-against-all FASTA database. PMID:9847138

  3. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    PubMed

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade, sensitivity analysis techniques have proven very useful for analysing complex and high-dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, focused on a restricted set of model observables of interest, studied optimisation of an objective function, or not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses the analysis of oscillatory systems. It supports sensitivity analysis of models under perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting: the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models, including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix, it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems. It allows for simulation and analysis of models under a variety of environmental conditions and for experimental optimisation of complex combined experiments. With its unique set of tools it makes a valuable addition to the current library of sensitivity analysis toolboxes. We believe that this software will be of great use to the wider biological, systems biology and modelling communities.
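
    PeTTSy itself is a MATLAB package; as a rough, language-shifted illustration of the kind of experiment it automates, the Python sketch below applies a temporary parameter perturbation to a toy oscillator and reads out the resulting peak-timing (phase) shift. The model, window, and all values are invented.

    ```python
    # Temporary parameter perturbation of a van der Pol oscillator, with a crude
    # phase-shift readout. Illustrative only; not part of PeTTSy.
    import numpy as np
    from scipy.integrate import solve_ivp

    def van_der_pol(t, y, mu_base, bump, t_on, t_off):
        # mu is raised by `bump` only inside the perturbation window [t_on, t_off]
        mu = mu_base + (bump if t_on <= t <= t_off else 0.0)
        x, v = y
        return [v, mu * (1 - x**2) * v - x]

    t_span, y0 = (0.0, 100.0), [2.0, 0.0]
    t_eval = np.linspace(*t_span, 5000)

    ref = solve_ivp(van_der_pol, t_span, y0, t_eval=t_eval,
                    args=(1.0, 0.0, 0.0, 0.0), rtol=1e-8)
    pert = solve_ivp(van_der_pol, t_span, y0, t_eval=t_eval,
                     args=(1.0, 0.5, 20.0, 30.0), rtol=1e-8)

    def last_peak_time(sol):
        # Timing of the maximum in the second half of the record
        half = len(t_eval) // 2
        return t_eval[half + np.argmax(sol.y[0][half:])]

    print("Peak-timing shift due to perturbation:",
          round(last_peak_time(pert) - last_peak_time(ref), 3))
    ```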

  4. Updates in metabolomics tools and resources: 2014-2015.

    PubMed

    Misra, Biswapriya B; van der Hooft, Justin J J

    2016-01-01

    Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platform (MS or NMR spectroscopy based) used for data acquisition. Improved instrumentation in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources--in the form of tools, software, and databases--is currently lacking. Thus, here we provide an overview of freely available, open-source tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of recent developments, in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and resources for data processing, data annotation, and data visualization in MS- and NMR-based metabolomics. Most of the tools described in this review are dedicated to untargeted metabolomics workflows; however, some more specialist tools are described as well. All tools and resources described, including their analytical and computational platform dependencies, are summarized in an overview table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Analysis of Advanced Rotorcraft Configurations

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2000-01-01

    Advanced rotorcraft configurations are being investigated with the objectives of identifying vehicles that are larger, quieter, and faster than current-generation rotorcraft. A large rotorcraft, carrying perhaps 150 passengers, could do much to alleviate airport capacity limitations, and a quiet rotorcraft is essential for community acceptance of the benefits of VTOL operations. A fast, long-range, long-endurance rotorcraft, notably the tilt-rotor configuration, will improve rotorcraft economics through productivity increases. A major part of the investigation of advanced rotorcraft configurations consists of conducting comprehensive analyses of vehicle behavior for the purpose of assessing vehicle potential and feasibility, as well as establishing the analytical models required to support vehicle development. The analytical work of FY99 included applications to tilt-rotor aircraft. Tilt Rotor Aeroacoustic Model (TRAM) wind tunnel measurements are being compared with calculations performed using the comprehensive analysis tool CAMRAD II (Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics). The objective is to establish the wing and wake aerodynamic models that are required for tilt-rotor analysis and design. The TRAM test in the German-Dutch Wind Tunnel (DNW) produced extensive measurements. This is the first test to encompass air loads, performance, and structural load measurements on tilt rotors, as well as acoustic and flow visualization data. The correlation of measurements and calculations includes helicopter-mode operation (performance, air loads, and blade structural loads), hover (performance and air loads), and airplane-mode operation (performance).

  6. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms.

    PubMed

    Jaffe, Jacob D; Feeney, Caitlin M; Patel, Jinal; Lu, Xiaodong; Mani, D R

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.

  7. Factors influencing subjects' comprehension of a set of medicine package inserts.

    PubMed

    Pires, Carla; Vigário, Marina; Cavaco, Afonso

    2016-08-01

    Background Package inserts (PIs) should promote the safe and effective use of medicines. The comprehension of PIs is related to socio-demographic features, such as education. Objectives To evaluate the participants' comprehension of a sample of PIs and to build an explanatory model of subjects' understanding of the content of these documents. Setting The data were collected from municipalities, city halls, firefighters, the military, schools and charities from two Portuguese regions. Methods Cross-sectional descriptive survey: 503 participants, homogeneously distributed by education and gender. The self-administered tool comprised questions on socio-demographic data, literacy tasks and comprehension evaluation of 12 purposively selected PIs. A logistic regression analysis was used. Main outcome measures Scores of numeracy tasks and comprehension. Results The average comprehension score for the PIs was 63 % (±32 %), with 48 % (n = 239) of the participants scoring <75 %. The most important predictors in explaining a comprehension score ≥75 % were having >12 years of education and correctly performing a numeracy task [respectively, OR 49.6 (CI 95 %: 22.8-108) and OR 2.48 (CI 95 %: 1.5-4.2)]. Conclusion An explanatory model of subjects' knowledge about the content of the tested PIs was built. Given that a high level of education and literacy were found to be the most relevant predictors for acceptable comprehension rates, PIs should be clearly written to assure that they are understood by all potential users, including the less educated. The evaluated PIs may thus need to be simplified.
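
    The reported odds ratios come from a logistic regression of adequate comprehension on the candidate predictors. The sketch below fits such a model on simulated data (the effect sizes are invented, not the study's) and converts coefficients to odds ratios.

    ```python
    # Logistic regression of adequate comprehension (score >= 75%) on education
    # and numeracy, using simulated data. Coefficients are illustrative only.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 503
    educ_gt12 = rng.integers(0, 2, n)      # >12 years of education (binary)
    numeracy_ok = rng.integers(0, 2, n)    # numeracy task performed correctly

    # Simulated outcome with a strong education effect and a weaker numeracy effect
    logit_p = -1.5 + 2.5 * educ_gt12 + 0.9 * numeracy_ok
    good = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    X = sm.add_constant(np.column_stack([educ_gt12, numeracy_ok]))
    fit = sm.Logit(good, X).fit(disp=0)
    ors = np.exp(fit.params)
    print("OR (>12y education):", round(ors[1], 2))
    print("OR (numeracy task): ", round(ors[2], 2))
    ```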

  8. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Jaffe, Jacob D.; Feeney, Caitlin M.; Patel, Jinal; Lu, Xiaodong; Mani, D. R.

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
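
    To make the selection step concrete, the toy sketch below evolves a subset of transitions whose summed comprehensive-mode signal best correlates with the targeted measurement. The data, fitness metric, and GA settings are synthetic stand-ins, not the authors' implementation.

    ```python
    # Genetic algorithm selecting fragment transitions that maximize agreement
    # between a summed "comprehensive" signal and a targeted measurement.
    import numpy as np

    rng = np.random.default_rng(1)
    n_samples, n_trans = 40, 12
    targeted = rng.uniform(1, 100, n_samples)              # targeted-assay values
    noise = rng.uniform(0.05, 2.0, n_trans)                # per-transition interference
    dia = (targeted[:, None] * rng.uniform(0.5, 1.5, n_trans)
           + rng.normal(0, 1, (n_samples, n_trans)) * noise * targeted[:, None])

    def fitness(mask):
        if mask.sum() == 0:
            return -np.inf
        return np.corrcoef(dia[:, mask.astype(bool)].sum(axis=1), targeted)[0, 1]

    pop = rng.integers(0, 2, (50, n_trans))                # population of subsets
    for _ in range(200):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-25:]]            # truncation selection
        cuts = rng.integers(1, n_trans, 50)                # one-point crossover
        kids = np.array([np.concatenate([parents[rng.integers(25)][:c],
                                         parents[rng.integers(25)][c:]])
                         for c in cuts])
        flips = rng.random(kids.shape) < 0.02              # mutation
        pop = np.where(flips, 1 - kids, kids)

    best = pop[np.argmax([fitness(ind) for ind in pop])]
    print("Selected transitions:", np.flatnonzero(best),
          "r =", round(fitness(best), 3))
    ```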

  9. User-centered design in clinical handover: exploring post-implementation outcomes for clinicians.

    PubMed

    Wong, Ming Chao; Cummings, Elizabeth; Turner, Paul

    2013-01-01

    This paper examines the outcomes for clinicians from their involvement in the development of an electronic clinical handover tool developed using principles of user-centered design. Conventional e-health post-implementation evaluations tend to emphasize technology-related (mostly positive) outcomes. More recently, unintended (mostly negative) consequences arising from the implementation of e-health technologies have also been reported. There remains limited focus on the post-implementation outcomes for users, particularly those directly involved in e-health design processes. This paper presents detailed analysis and insights into the outcomes experienced post-implementation by a cohort of junior clinicians involved in developing an electronic clinical handover tool in Tasmania, Australia. The qualitative methods used included observations, semi-structured interviews and analysis of clinical handover notes. Significantly, a number of unanticipated flow-on effects were identified that mitigated some of the challenges arising during the design and implementation of the tool. The paper concludes by highlighting the importance of identifying post-implementation user outcomes beyond conventional system adoption and use and also points to the need for more comprehensive evaluative frameworks to encapsulate these broader socio-technical user outcomes.

  10. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    PubMed

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than lifetime prevalence. This study was designed to assess the usability of the DSM-IV-criteria-based Berlin Inventory of Gambling Behavior Screening tool in a clinical sample and adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of the DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated its concise and comprehensible operationalization of the current DSM-5 criteria in a clinical setting.
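
    The sensitivity and specificity above correspond to a cutoff on the screening score; how such a cutoff is derived can be sketched with scikit-learn's ROC utilities. The group sizes below mirror the study, but the score distributions are simulated.

    ```python
    # ROC analysis of a simulated screening score: 300 cases vs. 132 comparisons.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(3)
    y_true = np.concatenate([np.ones(300), np.zeros(132)])
    scores = np.concatenate([rng.normal(8, 1.5, 300), rng.normal(2, 1.5, 132)])

    fpr, tpr, thresholds = roc_curve(y_true, scores)
    print("AUC:", round(roc_auc_score(y_true, scores), 3))

    # Cutoff maximizing Youden's J = sensitivity + specificity - 1
    best = np.argmax(tpr - fpr)
    print(f"cutoff={thresholds[best]:.2f}  sensitivity={tpr[best]:.3f}  "
          f"specificity={1 - fpr[best]:.3f}")
    ```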

  11. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
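
    The essence of the approach, condensing many runs of a complex model into one readily evaluated regression, can be sketched as follows. The "CGE model" below is a stand-in function with an invented form, and the log-linear reduced form is likewise only illustrative.

    ```python
    # Reduced-form surrogate of an expensive model: sample it, then regress.
    import numpy as np

    rng = np.random.default_rng(4)

    def expensive_model(duration, severity, resilience):
        # Placeholder for a CGE run (invented functional form plus noise)
        return (1000 * severity * duration**0.8 * (1 - 0.6 * resilience)
                + rng.normal(0, 50))

    # Step 1: many simulation runs across the threat-characteristic space
    n = 500
    duration = rng.uniform(1, 30, n)        # e.g., weeks of disruption
    severity = rng.uniform(0.1, 1.0, n)     # fraction of capacity lost
    resilience = rng.uniform(0.0, 1.0, n)   # resilience/behavioral adjustment
    loss = np.array([expensive_model(d, s, r)
                     for d, s, r in zip(duration, severity, resilience)])

    # Step 2: fit a single log-linear equation to the synthetic data
    X = np.column_stack([np.ones(n), np.log(duration), np.log(severity), resilience])
    beta, *_ = np.linalg.lstsq(X, np.log(np.clip(loss, 1.0, None)), rcond=None)

    # Step 3: the cheap "reduced form" any analyst can evaluate instantly
    def rapid_estimate(d, s, r):
        return np.exp(beta @ [1.0, np.log(d), np.log(s), r])

    print("Rapid loss estimate (10 wk, severity 0.5, resilience 0.3):",
          round(float(rapid_estimate(10, 0.5, 0.3)), 1))
    ```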

  12. A Comprehensive Look at Polypharmacy and Medication Screening Tools for the Older Cancer Patient

    PubMed Central

    DeGregory, Kathlene A.; Morris, Amy L.; Ramsdale, Erika E.

    2016-01-01

    Inappropriate medication use and polypharmacy are extremely common among older adults. Numerous studies have discussed the importance of a comprehensive medication assessment in the general geriatric population. However, only a handful of studies have evaluated inappropriate medication use in the geriatric oncology patient. Almost a dozen medication screening tools exist for the older adult. Each available tool has the potential to improve aspects of the care of older cancer patients, but no single tool has been developed for this population. We extensively reviewed the literature (MEDLINE, PubMed) to evaluate and summarize the most relevant medication screening tools for older patients with cancer. Findings of this review support the use of several screening tools concurrently for the elderly patient with cancer. A deprescribing tool should be developed and included in a comprehensive geriatric oncology assessment. Finally, prospective studies are needed to evaluate such a tool to determine its feasibility and impact in older patients with cancer. Implications for Practice: The prevalence of polypharmacy increases with advancing age. Older adults are more susceptible to adverse effects of medications. “Prescribing cascades” are common, whereas “deprescribing” remains uncommon; thus, older patients tend to accumulate medications over time. Older patients with cancer are at high risk for adverse drug events, in part because of the complexity and intensity of cancer treatment. Additionally, a cancer diagnosis often alters assessments of life expectancy, clinical status, and competing risk. Screening for polypharmacy and potentially inappropriate medications could reduce the risk for adverse drug events, enhance quality of life, and reduce health care spending for older cancer patients. PMID:27151653

  13. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    PubMed

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.
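
    A top-down per-slide cost of the kind the tool computes can be illustrated in a few lines: allocate consumables, hands-on labor, instrument depreciation, and overhead to each subarea. All volumes and rates below are invented figures, not data from the study.

    ```python
    # Top-down cost-per-slide allocation across laboratory subareas.
    from dataclasses import dataclass

    @dataclass
    class Subarea:
        name: str
        annual_slides: int
        consumables_per_slide: float    # reagents, glass, etc. ($)
        labor_hours_per_slide: float    # hands-on technologist time
        instrument_annual_cost: float   # depreciation + service ($/yr)

    LABOR_RATE = 38.0         # $/hour, fully loaded (invented)
    OVERHEAD_FRACTION = 0.35  # facility/admin overhead on direct costs (invented)

    subareas = [
        Subarea("routine H&E",         120_000, 1.10, 0.05,  90_000),
        Subarea("special stains",       15_000, 4.50, 0.12,  40_000),
        Subarea("immunohistochemistry", 25_000, 9.00, 0.15, 150_000),
    ]

    for s in subareas:
        direct = (s.consumables_per_slide
                  + s.labor_hours_per_slide * LABOR_RATE
                  + s.instrument_annual_cost / s.annual_slides)
        total = direct * (1 + OVERHEAD_FRACTION)
        print(f"{s.name:22s} direct=${direct:6.2f}  loaded=${total:6.2f} per slide")
    ```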

  14. Enhancing Literacy Skills of Students with Congenital and Profound Hearing Impairment in Nigeria Using Babudoh's Comprehension Therapy

    ERIC Educational Resources Information Center

    Babudoh, Gladys B.

    2014-01-01

    This study reports the effect of a treatment tool called "Babudoh's comprehension therapy" in enhancing the comprehension and writing skills of 10 junior secondary school students with congenital and profound hearing impairment in Plateau State, Nigeria. The study adopted the single group pretest-posttest quasi-experimental research…

  15. The GMOseek matrix: a decision support tool for optimizing the detection of genetically modified plants.

    PubMed

    Block, Annette; Debode, Frédéric; Grohmann, Lutz; Hulin, Julie; Taverniers, Isabel; Kluga, Linda; Barbau-Piednoir, Elodie; Broeders, Sylvia; Huber, Ingrid; Van den Bulcke, Marc; Heinze, Petra; Berben, Gilbert; Busch, Ulrich; Roosens, Nancy; Janssen, Eric; Žel, Jana; Gruden, Kristina; Morisset, Dany

    2013-08-22

    Since their first commercialization, the diversity of taxa and the genetic composition of transgene sequences in genetically modified plants (GMOs) are constantly increasing. To date, the detection of GMOs and derived products is commonly performed by PCR-based methods targeting specific DNA sequences introduced into the host genome. Information available regarding the GMOs' molecular characterization is dispersed and not appropriately organized. For this reason, GMO testing is very challenging and requires more complex screening strategies and decision making schemes, demanding in return the use of efficient bioinformatics tools relying on reliable information. The GMOseek matrix was built as a comprehensive, online open-access tabulated database which provides a reliable, comprehensive and user-friendly overview of 328 GMO events and 247 different genetic elements (status: 18/07/2013). The GMOseek matrix aims to facilitate the detection of GMOs of plant origin at different phases of the analysis. It assists in selecting the targets for a screening analysis, interpreting the screening results, checking the occurrence of a screening element in a group of selected GMOs, identifying gaps in the available pool of GMO detection methods, and designing a decision tree. The GMOseek matrix is an independent database with effective functionalities in a format facilitating transferability to other platforms. Data were collected from all available sources and experimentally tested where detection methods and certified reference materials (CRMs) were available. The GMOseek matrix is currently a unique and very valuable tool with reliable information on GMOs of plant origin and the genetic elements they contain, enabling further development of appropriate strategies for GMO detection. It is flexible enough to be further updated with new information and integrated in different applications and platforms.

  16. The GMOseek matrix: a decision support tool for optimizing the detection of genetically modified plants

    PubMed Central

    2013-01-01

    Background Since their first commercialization, the diversity of taxa and the genetic composition of transgene sequences in genetically modified plants (GMOs) are constantly increasing. To date, the detection of GMOs and derived products is commonly performed by PCR-based methods targeting specific DNA sequences introduced into the host genome. Information available regarding the GMOs’ molecular characterization is dispersed and not appropriately organized. For this reason, GMO testing is very challenging and requires more complex screening strategies and decision making schemes, demanding in return the use of efficient bioinformatics tools relying on reliable information. Description The GMOseek matrix was built as a comprehensive, online open-access tabulated database which provides a reliable, comprehensive and user-friendly overview of 328 GMO events and 247 different genetic elements (status: 18/07/2013). The GMOseek matrix aims to facilitate the detection of GMOs of plant origin at different phases of the analysis. It assists in selecting the targets for a screening analysis, interpreting the screening results, checking the occurrence of a screening element in a group of selected GMOs, identifying gaps in the available pool of GMO detection methods, and designing a decision tree. The GMOseek matrix is an independent database with effective functionalities in a format facilitating transferability to other platforms. Data were collected from all available sources and experimentally tested where detection methods and certified reference materials (CRMs) were available. Conclusions The GMOseek matrix is currently a unique and very valuable tool with reliable information on GMOs of plant origin and the genetic elements they contain, enabling further development of appropriate strategies for GMO detection. It is flexible enough to be further updated with new information and integrated in different applications and platforms. PMID:23965170
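
    The screening logic such a matrix supports can be illustrated with a toy excerpt: any event containing an element that tested negative is ruled out, and detected elements not covered by any surviving candidate flag a gap. The matrix content below is invented, not GMOseek data.

    ```python
    # Matrix-based GMO screening on a tiny invented element/event table.
    SCREENING_MATRIX = {
        "EventA-maize": {"P-35S", "T-nos", "cry1Ab"},
        "EventB-soy":   {"P-35S", "CP4-EPSPS"},
        "EventC-maize": {"T-nos", "pat", "P-35S"},
        "EventD-rape":  {"CP4-EPSPS", "T-nos"},
    }

    def consistent_events(negative):
        # An event present in the sample would make all of its elements test
        # positive, so any event containing a tested-negative element is out.
        return [e for e, elems in SCREENING_MATRIX.items() if not (elems & negative)]

    def unexplained(detected, events):
        # Detected elements that no surviving candidate accounts for
        covered = set().union(*(SCREENING_MATRIX[e] for e in events)) if events else set()
        return detected - covered

    detected, negative = {"P-35S", "T-nos"}, {"CP4-EPSPS"}
    events = consistent_events(negative)
    print("Candidate events:", events)                  # EventA-maize, EventC-maize
    print("Unexplained detections:", unexplained(detected, events))
    ```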

  17. A new cross-correlation algorithm for the analysis of "in vitro" neuronal network activity aimed at pharmacological studies.

    PubMed

    Biffi, E; Menegon, A; Regalia, G; Maida, S; Ferrigno, G; Pedrocchi, A

    2011-08-15

    Modern drug discovery for Central Nervous System pathologies has recently focused its attention on in vitro neuronal networks as models for the study of neuronal activity. Micro Electrode Arrays (MEAs), a widely recognized tool for pharmacological investigations, enable the simultaneous study of the spiking activity of discrete regions of a neuronal culture, providing an insight into the dynamics of networks. Taking advantage of the features of MEAs and making the most of cross-correlation analysis to assess internal parameters of a neuronal system, we provide an efficient method for the comprehensive evaluation of neuronal network activity. We developed an intra-network burst correlation algorithm, evaluated its sensitivity and explored its potential use in pharmacological studies. Our results demonstrate the high sensitivity of this algorithm and the efficacy of this methodology in pharmacological dose-response studies, with the advantage of analyzing the effect of drugs on the comprehensive correlative properties of integrated neuronal networks. Copyright © 2011 Elsevier B.V. All rights reserved.
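
    A bare-bones version of channel-to-channel cross-correlation on binned spike trains is sketched below; the spike data are simulated and the code is not the authors' intra-network burst correlation algorithm, only the generic operation underlying it.

    ```python
    # Cross-correlate two binned MEA spike trains over a range of lags.
    import numpy as np

    rng = np.random.default_rng(5)
    duration, bin_s = 60.0, 0.01                      # 60 s record, 10 ms bins
    bursts = np.sort(rng.uniform(0, duration, 30))    # shared network bursts

    def spikes(jitter, per_burst=8):
        # Each burst triggers spikes with a channel-specific latency
        return np.concatenate([b + rng.normal(jitter, 0.02, per_burst)
                               for b in bursts])

    bins = np.arange(0, duration + bin_s, bin_s)
    a, _ = np.histogram(spikes(0.00), bins)           # channel A
    b, _ = np.histogram(spikes(0.05), bins)           # channel B lags A by ~50 ms

    max_lag = int(0.2 / bin_s)                        # lags up to +/- 200 ms
    lags = np.arange(-max_lag, max_lag + 1)
    cc = [np.corrcoef(a[max(0, -l):len(a) - max(0, l)],
                      b[max(0, l):len(b) - max(0, -l)])[0, 1] for l in lags]
    print("Peak correlation at lag (ms):", lags[int(np.argmax(cc))] * bin_s * 1e3)
    ```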

  18. Open reading frames associated with cancer in the dark matter of the human genome.

    PubMed

    Delgado, Ana Paula; Brandao, Pamela; Chapado, Maria Julia; Hamid, Sheilin; Narayanan, Ramaswamy

    2014-01-01

    The uncharacterized proteins (open reading frames, ORFs) in the human genome offer an opportunity to discover novel targets for cancer. A systematic analysis of the dark matter of the human proteome for druggability and biomarker discovery is crucial to mining the genome. Numerous data mining tools are available to mine these ORFs to develop a comprehensive knowledge base for future target discovery and validation. Using the Genetic Association Database, the ORFs of the human dark matter proteome were screened for evidence of association with neoplasms. The Phenome-Genome Integrator tool was used to establish phenotypic association with disease traits including cancer. Batch analysis of the tools for protein expression analysis, gene ontology and motifs and domains was used to characterize the ORFs. Sixty-two ORFs were identified for neoplasm association. The expression Quantitative Trait Loci (eQTL) analysis identified thirteen ORFs related to cancer traits. Protein expression, motifs and domain analysis and genome-wide association studies verified the relevance of these OncoORFs in diverse tumors. The OncoORFs are also associated with a wide variety of human diseases and disorders. Our results link the OncoORFs to diverse diseases and disorders. This suggests a complex landscape of the uncharacterized proteome in human diseases. These results open the dark matter of the proteome to novel cancer target research. Copyright© 2014, International Institute of Anticancer Research (Dr. John G. Delinasios), All rights reserved.

  19. Evaluation of the Effectiveness of Stormwater Decision Support Tools for Infrastructure Selection and the Barriers to Implementation

    NASA Astrophysics Data System (ADS)

    Spahr, K.; Hogue, T. S.

    2016-12-01

    Selecting the most appropriate green, gray, and/or hybrid system for stormwater treatment and conveyance can prove challenging to decision makers across all scales, from site managers to large municipalities. To help streamline the selection process, a multi-disciplinary team of academics and professionals is developing an industry standard for selecting and evaluating the most appropriate stormwater management technology for different regions. To make the tool more robust and comprehensive, life-cycle cost assessment and optimization modules will be included to evaluate non-monetized and ecosystem benefits of selected technologies. Initial work includes surveying advisory board members based in cities that use existing decision support tools in their infrastructure planning process. These surveys will characterize the decisions currently being made and identify challenges within the current planning process across a range of hydroclimatic regions and city sizes. Analysis of social and other non-technical barriers to adoption of the existing tools is also being performed, with identification of regional differences and institutional challenges. Surveys will also gauge the regional appropriateness of certain stormwater technologies based on experience in implementing stormwater treatment and conveyance plans. In addition to compiling qualitative data on existing decision support tools, a technical review of the components of each decision support tool will be performed. Gaps in each tool's analysis, such as the lack of certain critical functionalities, will be identified, and ease of use will be evaluated. Conclusions drawn from both the qualitative and quantitative analyses will be used to inform the development of the new decision support tool and its eventual dissemination.

  20. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  1. SCALE Code System 6.2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  2. Space station electrical power distribution analysis using a load flow approach

    NASA Technical Reports Server (NTRS)

    Emanuel, Ervin M.

    1987-01-01

    The space station's electrical power system will evolve and grow in a manner much like that of present terrestrial electrical power utilities. The initial baseline reference configuration will contain more than 50 nodes or busses, inverters, transformers, overcurrent protection devices, distribution lines, solar arrays, and/or solar dynamic power generating sources. The system is designed to manage and distribute 75 kW of single-phase or three-phase power at 20 kHz, grow to a level of 300 kW steady state, and be capable of operating at a peak of 450 kW for 5 to 10 min. In order to plan far into the future and keep pace with load growth, a load flow power system analysis approach must be developed and utilized. This method is a well-known energy assessment and management tool that is widely used throughout the electrical power utility industry. The results of a comprehensive evaluation and assessment of an Electrical Distribution System Analysis Program (EDSA) are discussed. Its potential use as an analysis and design tool for the 20 kHz space station electrical power system is addressed.
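
    For readers unfamiliar with the technique, a load flow solves for bus voltages given scheduled power injections and the network admittance matrix. The Gauss-Seidel sketch below does this for a toy three-bus network; the per-unit values are arbitrary, not space station parameters.

    ```python
    # Gauss-Seidel load flow on a 3-bus network (bus 0 is the slack source).
    import numpy as np

    y01 = 1 / (0.02 + 0.06j)                 # per-unit line admittances
    y12 = 1 / (0.03 + 0.09j)
    Y = np.array([[ y01,      -y01,        0   ],
                  [-y01,  y01 + y12,      -y12 ],
                  [ 0,        -y12,        y12 ]])

    S = np.array([0, -(0.5 + 0.2j), -(0.3 + 0.1j)])   # loads at buses 1 and 2
    V = np.ones(3, dtype=complex)                     # slack bus fixed at 1.0 pu

    for _ in range(100):                              # Gauss-Seidel sweeps
        for i in (1, 2):
            sigma = sum(Y[i, j] * V[j] for j in range(3) if j != i)
            V[i] = (np.conj(S[i] / V[i]) - sigma) / Y[i, i]

    print("Bus voltages (pu):", np.round(V, 4))
    print("Slack power injection (pu):", np.round(V[0] * np.conj(Y[0] @ V), 4))
    ```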

  3. Lagrangian ocean analysis: Fundamentals and practices

    DOE PAGES

    van Sebille, Erik; Griffies, Stephen M.; Abernathey, Ryan; ...

    2017-11-24

    Lagrangian analysis is a powerful way to analyse the output of ocean circulation models and other ocean velocity data such as from altimetry. In the Lagrangian approach, large sets of virtual particles are integrated within the three-dimensional, time-evolving velocity fields. A variety of tools and methods for this purpose have emerged over several decades. Here, we review the state of the art in the field of Lagrangian analysis of ocean velocity data, starting from a fundamental kinematic framework and with a focus on large-scale open ocean applications. Beyond the use of explicit velocity fields, we consider the influence of unresolved physics and dynamics on particle trajectories. We comprehensively list and discuss the tools currently available for tracking virtual particles. We then showcase some of the innovative applications of trajectory data, and conclude with some open questions and an outlook. The overall goal of this review paper is to reconcile some of the different techniques and methods in Lagrangian ocean analysis, while recognising the rich diversity of codes that have and continue to emerge, and the challenges of the coming age of petascale computing.

  4. Lagrangian ocean analysis: Fundamentals and practices

    NASA Astrophysics Data System (ADS)

    van Sebille, Erik; Griffies, Stephen M.; Abernathey, Ryan; Adams, Thomas P.; Berloff, Pavel; Biastoch, Arne; Blanke, Bruno; Chassignet, Eric P.; Cheng, Yu; Cotter, Colin J.; Deleersnijder, Eric; Döös, Kristofer; Drake, Henri F.; Drijfhout, Sybren; Gary, Stefan F.; Heemink, Arnold W.; Kjellsson, Joakim; Koszalka, Inga Monika; Lange, Michael; Lique, Camille; MacGilchrist, Graeme A.; Marsh, Robert; Mayorga Adame, C. Gabriela; McAdam, Ronan; Nencioli, Francesco; Paris, Claire B.; Piggott, Matthew D.; Polton, Jeff A.; Rühs, Siren; Shah, Syed H. A. M.; Thomas, Matthew D.; Wang, Jinbo; Wolfram, Phillip J.; Zanna, Laure; Zika, Jan D.

    2018-01-01

    Lagrangian analysis is a powerful way to analyse the output of ocean circulation models and other ocean velocity data such as from altimetry. In the Lagrangian approach, large sets of virtual particles are integrated within the three-dimensional, time-evolving velocity fields. Over several decades, a variety of tools and methods for this purpose have emerged. Here, we review the state of the art in the field of Lagrangian analysis of ocean velocity data, starting from a fundamental kinematic framework and with a focus on large-scale open ocean applications. Beyond the use of explicit velocity fields, we consider the influence of unresolved physics and dynamics on particle trajectories. We comprehensively list and discuss the tools currently available for tracking virtual particles. We then showcase some of the innovative applications of trajectory data, and conclude with some open questions and an outlook. The overall goal of this review paper is to reconcile some of the different techniques and methods in Lagrangian ocean analysis, while recognising the rich diversity of codes that have and continue to emerge, and the challenges of the coming age of petascale computing.

  5. Lagrangian ocean analysis: Fundamentals and practices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Sebille, Erik; Griffies, Stephen M.; Abernathey, Ryan

    Lagrangian analysis is a powerful way to analyse the output of ocean circulation models and other ocean velocity data such as from altimetry. In the Lagrangian approach, large sets of virtual particles are integrated within the three-dimensional, time-evolving velocity fields. A variety of tools and methods for this purpose have emerged over several decades. Here, we review the state of the art in the field of Lagrangian analysis of ocean velocity data, starting from a fundamental kinematic framework and with a focus on large-scale open ocean applications. Beyond the use of explicit velocity fields, we consider the influence of unresolved physics and dynamics on particle trajectories. We comprehensively list and discuss the tools currently available for tracking virtual particles. We then showcase some of the innovative applications of trajectory data, and conclude with some open questions and an outlook. The overall goal of this review paper is to reconcile some of the different techniques and methods in Lagrangian ocean analysis, while recognising the rich diversity of codes that have and continue to emerge, and the challenges of the coming age of petascale computing.
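
    The core computation behind the tools reviewed here, integrating virtual particles through a velocity field, is simple to sketch. The example below advects particles through an analytic double-gyre flow with a fourth-order Runge-Kutta step; production codes instead interpolate gridded, time-evolving model velocities.

    ```python
    # RK4 advection of virtual particles in a steady analytic 2-D flow.
    import numpy as np

    def velocity(x, y, t):
        # Steady double-gyre-like flow on the domain [0, 2] x [0, 1]
        u = -np.pi * np.sin(np.pi * x / 2) * np.cos(np.pi * y)
        v = (np.pi / 2) * np.cos(np.pi * x / 2) * np.sin(np.pi * y)
        return u, v

    def rk4_step(x, y, t, dt):
        k1 = velocity(x, y, t)
        k2 = velocity(x + 0.5 * dt * k1[0], y + 0.5 * dt * k1[1], t + 0.5 * dt)
        k3 = velocity(x + 0.5 * dt * k2[0], y + 0.5 * dt * k2[1], t + 0.5 * dt)
        k4 = velocity(x + dt * k3[0], y + dt * k3[1], t + dt)
        x = x + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        y = y + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        return x, y

    rng = np.random.default_rng(6)
    xs = rng.uniform(0.2, 1.8, 10)            # seed ten virtual particles
    ys = rng.uniform(0.2, 0.8, 10)
    t, dt = 0.0, 0.01
    for _ in range(1000):                     # advect for 10 time units
        xs, ys = rk4_step(xs, ys, t, dt)
        t += dt
    print(np.column_stack([xs, ys]).round(3))
    ```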

  6. On the road to a stronger public health workforce: visual tools to address complex challenges.

    PubMed

    Drehobl, Patricia; Stover, Beth H; Koo, Denise

    2014-11-01

    The public health workforce is vital to protecting the health and safety of the public, yet for years, state and local governmental public health agencies have reported substantial workforce losses and other challenges to the workforce that threaten the public's health. These challenges are complex, often involve multiple influencing or related causal factors, and demand comprehensive solutions. However, proposed solutions often focus on selected factors and might be fragmented rather than comprehensive. This paper describes approaches to characterizing the situation more comprehensively and includes two visual tools: (1) a fishbone, or Ishikawa, diagram that depicts multiple factors affecting the public health workforce; and (2) a roadmap that displays key elements (goals and strategies) to strengthen the public health workforce, thus moving from the problems depicted in the fishbone toward solutions. The visual tools aid thinking about ways to strengthen the public health workforce through collective solutions and help partners leverage resources and build on each other's work. The strategic roadmap is intended to serve as a dynamic tool for partnership, prioritization, and gap assessment. These tools reflect and support CDC's commitment to working with partners on the highest priorities for strengthening the workforce to improve the public's health. Published by Elsevier Inc.

  7. Introduction of blended learning in a master program: Developing an integrative mixed method evaluation framework.

    PubMed

    Chmiel, Aviva S; Shaha, Maya; Schneider, Daniel K

    2017-01-01

    The aim of this research is to develop a comprehensive evaluation framework involving all actors in a higher education blended learning (BL) program. BL evaluation usually focuses on students, faculty, or technological or institutional aspects. Currently, no validated comprehensive monitoring tool exists that can support the introduction and further implementation of BL in a higher education context. Starting from established evaluation principles and standards, the concepts to be evaluated were first identified and grouped. In a second step, related BL evaluation tools referring to the student, faculty and institutional levels were selected. This allowed setting up and implementing an evaluation framework to monitor the introduction of BL during two succeeding recurrences of the program. The results of the evaluation allowed documenting the strengths and weaknesses of the BL format in a comprehensive way, involving all actors. It has led to improvements at the program, faculty and course levels. The evaluation process and the reporting of the results proved to be demanding in time and personal resources. The evaluation framework allows measuring the most significant dimensions influencing the success of a BL implementation at the program level. However, this comprehensive evaluation is resource intensive. Further steps will be to refine the framework towards a sustainable and transferable BL monitoring tool that finds a balance between comprehensiveness and efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Models for residential-and commercial-sector energy conservation analysis: Applications, limitations, and future potential

    NASA Astrophysics Data System (ADS)

    Cole, H. E.; Fuller, R. E.

    1980-09-01

    Four of the major models used by DOE for energy conservation analyses in the residential and commercial building sectors are reviewed and critically analyzed to determine how these models can serve as tools for DOE and its Conservation Policy Office in evaluating and quantifying their policy and program requirements. The most effective role for each model in addressing future issues of buildings energy conservation policy and analysis is assessed. The four models covered are: Oak Ridge Residential Energy Model; Micro Analysis of Transfers to Households/Comprehensive Human Resources Data System (MATH/CHRDS) Model; Oak Ridge Commercial Energy Model; and Brookhaven Buildings Energy Conservation Optimization Model (BECOM).

  9. Spreadsheets for Analyzing and Optimizing Space Missions

    NASA Technical Reports Server (NTRS)

    Some, Raphael R.; Agrawal, Anil K.; Czikmantory, Akos J.; Weisbin, Charles R.; Hua, Hook; Neff, Jon M.; Cowdin, Mark A.; Lewis, Brian S.; Iroz, Juana; Ross, Rick

    2009-01-01

    XCALIBR (XML Capability Analysis LIBRary) is a set of Extensible Markup Language (XML) database and spreadsheet-based analysis software tools designed to assist in technology-return-on-investment analysis and optimization of technology portfolios pertaining to outer-space missions. XCALIBR is also being examined for use in planning, tracking, and documentation of projects. An XCALIBR database contains information on mission requirements and technological capabilities, which are related by use of an XML taxonomy. XCALIBR incorporates a standardized interface for exporting data and analysis templates to an Excel spreadsheet. Unique features of XCALIBR include the following: It is inherently hierarchical by virtue of its XML basis. The XML taxonomy codifies a comprehensive data structure and data dictionary that includes performance metrics for spacecraft, sensors, and spacecraft systems other than sensors. The taxonomy contains >700 nodes representing all levels, from system through subsystem to individual parts. All entries are searchable and machine readable. There is an intuitive Web-based user interface. The software automatically matches technologies to mission requirements. The software automatically generates, and makes the required entries in, an Excel return-on-investment analysis software tool. The results of an analysis are presented in both tabular and graphical displays.

  10. Enlight: A Comprehensive Quality and Therapeutic Potential Evaluation Tool for Mobile and Web-Based eHealth Interventions

    PubMed Central

    Faber, Keren; Mathur, Nandita; Kane, John M; Muench, Fred

    2017-01-01

    Background Studies of criteria-based assessment tools have demonstrated the feasibility of objectively evaluating eHealth interventions independent of empirical testing. However, current tools have not included some quality constructs associated with intervention outcome, such as persuasive design, behavior change, or therapeutic alliance. In addition, the generalizability of such tools has not been explicitly examined. Objective The aim is to introduce the development and further analysis of the Enlight suite of measures, developed to incorporate the aforementioned concepts and address generalizability aspects. Methods As a first step, a comprehensive systematic review was performed to identify relevant quality rating criteria in line with the PRISMA statement. These criteria were then categorized to create Enlight. The second step involved testing Enlight on 42 mobile apps and 42 Web-based programs (delivery mediums) targeting modifiable behaviors related to medical illness or mental health (clinical aims). Results A total of 476 criteria from 99 identified sources were used to build Enlight. The rating measures were divided into two sections: quality assessments and checklists. Quality assessments included usability, visual design, user engagement, content, therapeutic persuasiveness, therapeutic alliance, and general subjective evaluation. The checklists included credibility, privacy explanation, basic security, and evidence-based program ranking. The quality constructs exhibited excellent interrater reliability (intraclass correlations=.77-.98, median .91) and internal consistency (Cronbach alphas=.83-.90, median .88), with similar results when separated into delivery mediums or clinical aims. Conditional probability analysis revealed that 100% of the programs that received a score of fair or above (≥3.0) in therapeutic persuasiveness or therapeutic alliance received the same range of scores in user engagement and content—a pattern that did not appear in the opposite direction. Preliminary concurrent validity analysis pointed to positive correlations of combined quality scores with selected variables. The combined score that did not include therapeutic persuasiveness and therapeutic alliance descriptively underperformed the other combined scores. Conclusions This paper provides empirical evidence supporting the importance of persuasive design and therapeutic alliance within the context of a program’s evaluation. Reliability metrics and preliminary concurrent validity analysis indicate the potential of Enlight in examining eHealth programs regardless of delivery mediums and clinical aims. PMID:28325712

  11. Enlight: A Comprehensive Quality and Therapeutic Potential Evaluation Tool for Mobile and Web-Based eHealth Interventions.

    PubMed

    Baumel, Amit; Faber, Keren; Mathur, Nandita; Kane, John M; Muench, Fred

    2017-03-21

    Studies of criteria-based assessment tools have demonstrated the feasibility of objectively evaluating eHealth interventions independent of empirical testing. However, current tools have not included some quality constructs associated with intervention outcome, such as persuasive design, behavior change, or therapeutic alliance. In addition, the generalizability of such tools has not been explicitly examined. The aim is to introduce the development and further analysis of the Enlight suite of measures, developed to incorporate the aforementioned concepts and address generalizability aspects. As a first step, a comprehensive systematic review was performed to identify relevant quality rating criteria in line with the PRISMA statement. These criteria were then categorized to create Enlight. The second step involved testing Enlight on 42 mobile apps and 42 Web-based programs (delivery mediums) targeting modifiable behaviors related to medical illness or mental health (clinical aims). A total of 476 criteria from 99 identified sources were used to build Enlight. The rating measures were divided into two sections: quality assessments and checklists. Quality assessments included usability, visual design, user engagement, content, therapeutic persuasiveness, therapeutic alliance, and general subjective evaluation. The checklists included credibility, privacy explanation, basic security, and evidence-based program ranking. The quality constructs exhibited excellent interrater reliability (intraclass correlations=.77-.98, median .91) and internal consistency (Cronbach alphas=.83-.90, median .88), with similar results when separated into delivery mediums or clinical aims. Conditional probability analysis revealed that 100% of the programs that received a score of fair or above (≥3.0) in therapeutic persuasiveness or therapeutic alliance received the same range of scores in user engagement and content, a pattern that did not appear in the opposite direction. Preliminary concurrent validity analysis pointed to positive correlations of combined quality scores with selected variables. The combined score that did not include therapeutic persuasiveness and therapeutic alliance descriptively underperformed the other combined scores. This paper provides empirical evidence supporting the importance of persuasive design and therapeutic alliance within the context of a program's evaluation. Reliability metrics and preliminary concurrent validity analysis indicate the potential of Enlight in examining eHealth programs regardless of delivery mediums and clinical aims. ©Amit Baumel, Keren Faber, Nandita Mathur, John M Kane, Fred Muench. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 21.03.2017.
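
    The conditional-probability pattern reported above is straightforward to compute from a table of quality scores; the sketch below reproduces the calculation on fabricated ratings with a built-in dependence, purely to show the mechanics.

    ```python
    # P(score A >= cutoff | score B >= cutoff) on fabricated 1-5 quality ratings.
    import numpy as np

    rng = np.random.default_rng(8)
    n = 84  # 42 apps + 42 web programs, as in the evaluation above

    persuasiveness = np.clip(rng.normal(3.0, 1.0, n), 1, 5)
    engagement = np.clip(persuasiveness + rng.normal(0.5, 0.5, n), 1, 5)

    def cond_prob(a, b, cut=3.0):
        """P(a >= cut | b >= cut)."""
        given = b >= cut
        return (a[given] >= cut).mean()

    print("P(engagement fair+ | persuasiveness fair+):",
          round(cond_prob(engagement, persuasiveness), 2))
    print("P(persuasiveness fair+ | engagement fair+):",
          round(cond_prob(persuasiveness, engagement), 2))
    ```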

  12. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    NASA Astrophysics Data System (ADS)

    Seebauer, Matthias

    2014-03-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate whether existing GHG quantification tools can comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was carried out using the VCS SALM methodology, complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation in the magnitude of the estimated GHG emissions per ha both between different smallholder farm typologies and between the two accounting tools applied. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals, and the mitigation benefits range between 4 and 6.5 tCO2 ha-1 yr-1, with significantly different mitigation benefits depending on the typologies of the crop-livestock systems, their different agricultural practices, as well as adoption rates of improved practices. However, the inherent uncertainty related to the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty related to activity data, the assessment confirms the high variability within different farm types as well as between different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms.

  13. MVIAeval: a web tool for comprehensively evaluating the performance of a new missing value imputation algorithm.

    PubMed

    Wu, Wei-Sheng; Jhou, Meng-Jhun

    2017-01-13

    Missing value imputation is important for microarray data analyses because missing values can significantly degrade the performance of downstream analyses. Although many microarray missing value imputation algorithms have been developed, an objective and comprehensive performance comparison framework is still lacking. To solve this problem, we previously proposed a framework which can perform a comprehensive performance comparison of different existing algorithms; the same framework can also evaluate the performance of a new algorithm. However, constructing the framework is not an easy task for interested researchers. To save researchers' time and effort, here we present an easy-to-use web tool named MVIAeval (Missing Value Imputation Algorithm evaluator) which implements our performance comparison framework. MVIAeval provides a user-friendly interface allowing users to upload the R code of their new algorithm and to select (i) the test datasets among 20 benchmark microarray (time series and non-time series) datasets, (ii) the compared algorithms among 12 existing algorithms, (iii) the performance indices among three existing indices, (iv) the comprehensive performance scores from two possible choices, and (v) the number of simulation runs. The comprehensive performance comparison results are then generated and shown as both figures and tables. MVIAeval is a useful tool for researchers to easily conduct a comprehensive and objective performance evaluation of a newly developed missing value imputation algorithm for microarray data, or for any data which can be represented in matrix form (e.g. NGS data or proteomics data). Thus, MVIAeval will greatly expedite progress in the research of missing value imputation algorithms.
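
    The core of such a comparison framework is simple: hide known entries, impute them, and score the result against the truth. The sketch below illustrates this with a single simulated run and a naive row-mean imputer scored by normalized RMSE; MVIAeval's actual benchmark datasets, twelve reference algorithms, and scoring options are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)
        truth = rng.normal(size=(200, 30))          # complete "benchmark" matrix
        missing = rng.random(truth.shape) < 0.1     # simulate 10% missing entries
        observed = np.where(missing, np.nan, truth)

        def impute_row_mean(x):
            # Naive baseline: replace each missing value with its row mean
            filled = x.copy()
            row_means = np.nanmean(x, axis=1, keepdims=True)
            holes = np.isnan(filled)
            filled[holes] = np.broadcast_to(row_means, x.shape)[holes]
            return filled

        def nrmse(truth, imputed, mask):
            err = truth[mask] - imputed[mask]
            return np.sqrt(np.mean(err ** 2)) / np.std(truth[mask])

        print(f"row-mean NRMSE: {nrmse(truth, impute_row_mean(observed), missing):.3f}")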

  14. Earth-Science Data Co-Locating Tool

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Pan, Lei; Block, Gary L.

    2012-01-01

    This software locates Earth-science satellite data and climate-model analysis outputs in space and time, enabling the direct comparison of data sets with different spatial and temporal resolutions. It is written as three modules whose functionality and interfaces are cleanly separated, which makes it fast to add support for a new data set. In this updated version of the tool, several new front ends were developed for new products. The software finds co-locatable data pairs for given sets of data products and creates new data products that share the same spatial and temporal coordinates. This facilitates the direct comparison of two heterogeneous datasets and the comprehensive and synergistic use of the datasets.
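
    The essential co-location step can be illustrated with a brute-force matcher that pairs records from two data sets whenever they fall within spatial and temporal tolerances. The tolerances, the flat-earth distance approximation, and the toy records below are illustrative assumptions, not the software's actual algorithm or thresholds.

        import numpy as np

        def colocate(a, b, max_km=50.0, max_hours=1.0):
            # Records are (lat_deg, lon_deg, time_h); flat-earth distance approximation
            pairs = []
            for i, (lat1, lon1, t1) in enumerate(a):
                for j, (lat2, lon2, t2) in enumerate(b):
                    dx = 111.0 * (lat1 - lat2)
                    dy = 111.0 * (lon1 - lon2) * np.cos(np.radians((lat1 + lat2) / 2))
                    if np.hypot(dx, dy) <= max_km and abs(t1 - t2) <= max_hours:
                        pairs.append((i, j))
            return pairs

        satellite = [(10.0, 20.0, 0.0), (11.0, 21.0, 3.0)]
        model = [(10.1, 20.2, 0.5), (40.0, 5.0, 0.4)]
        print(colocate(satellite, model))   # -> [(0, 0)]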

  15. Bioinformatics-based tools in drug discovery: the cartography from single gene to integrative biological networks.

    PubMed

    Ramharack, Pritika; Soliman, Mahmoud E S

    2018-06-01

    Originally developed for the analysis of biological sequences, bioinformatics has advanced into one of the most widely recognized domains in the scientific community. Despite this technological evolution, there is still an urgent need for nontoxic and efficient drugs. The onus now falls on the 'omics domain to meet this need by implementing bioinformatics techniques that will allow for the introduction of pioneering approaches in the rational drug design process. Here, we categorize an updated list of informatics tools and explore the capabilities of integrative bioinformatics in disease control. We believe that our review will serve as a comprehensive guide toward bioinformatics-oriented disease and drug discovery research. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. NONMEMory: a run management tool for NONMEM.

    PubMed

    Wilkins, Justin J

    2005-06-01

    NONMEM is an extremely powerful tool for nonlinear mixed-effect modelling and simulation of pharmacokinetic and pharmacodynamic data. However, it is a console-based application whose output does not lend itself to rapid interpretation or efficient management. NONMEMory has been created to be a comprehensive project manager for NONMEM, providing detailed summary, comparison and overview of the runs comprising a given project, including the display of output data, simple post-run processing, fast diagnostic plots and run output management, complementary to other available modelling aids. Analysis time ought not to be spent on trivial tasks, and NONMEMory's role is to eliminate these as far as possible by increasing the efficiency of the modelling process. NONMEMory is freely available from http://www.uct.ac.za/depts/pha/nonmemory.php.

  17. Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)

    NASA Astrophysics Data System (ADS)

    Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.

    2014-04-01

    A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimates are based on numerical model results, which provide an appropriate spatio-temporal analysis framework to support an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approach as a comprehensive and practical management tool.

  18. Design and evaluation of low-cost stainless steel fiberglass foam blades for large wind driven generating systems

    NASA Technical Reports Server (NTRS)

    Eggert, W. S.

    1982-01-01

    A low-cost wind turbine blade based on a stainless steel/fiberglass/foam Budd blade design concept was evaluated for its principal characteristics, its low-cost features, and its advantages and disadvantages. A blade structure was designed, and construction methods and materials were selected. Complete blade tooling concepts were developed, and various technical and economic analyses and evaluations of the blade design were performed. A comprehensive fatigue test program was conducted to provide data to verify the design stress allowables.

  19. The Current Status of the Philosophy of Biology

    NASA Astrophysics Data System (ADS)

    Takacs, Peter; Ruse, Michael

    2013-01-01

    The philosophy of biology today is one of the most exciting areas of philosophy. It looks critically across the life sciences, teasing out conceptual issues and difficulties and bringing the tools of philosophical analysis to bear to achieve clarification and understanding. This essay surveys work in all of the major directions of research: evolutionary theory and the units/levels of selection; evolutionary developmental biology; reductionism; ecology; the species problem; teleology; evolutionary epistemology; evolutionary ethics; and progress. A comprehensive bibliography is included.

  20. Comprehensive Approach Training Toolkit: Training Needs Analysis

    DTIC Science & Technology

    2013-03-01

  1. Design and evaluation of low-cost stainless steel fiberglass foam blades for large wind driven generating systems

    NASA Astrophysics Data System (ADS)

    Eggert, W. S.

    1982-10-01

    A low-cost wind turbine blade based on a stainless steel/fiberglass/foam Budd blade design concept was evaluated for its principal characteristics, its low-cost features, and its advantages and disadvantages. A blade structure was designed, and construction methods and materials were selected. Complete blade tooling concepts were developed, and various technical and economic analyses and evaluations of the blade design were performed. A comprehensive fatigue test program was conducted to provide data to verify the design stress allowables.

  2. Delamination Assessment Tool for Spacecraft Composite Structures

    NASA Astrophysics Data System (ADS)

    Portela, Pedro; Preller, Fabian; Wittke, Henrik; Sinnema, Gerben; Camanho, Pedro; Turon, Albert

    2012-07-01

    Fortunately, only a few cases are known in which failure of spacecraft structures due to undetected damage has resulted in the loss of a spacecraft or launcher mission. However, several problems related to damage tolerance, and in particular to delamination of composite materials, have been encountered during structure development and qualification testing in various ESA projects. To avoid such costly failures during the development, launch, or service of spacecraft, launcher, and reusable launch vehicle (RLV) structures, a comprehensive damage tolerance verification approach is needed. In 2009, the European Space Agency (ESA) initiated an activity called “Delamination Assessment Tool”, which is led by the Portuguese company HPS Lda and includes academic and industrial partners. The goal of this study is the development of a comprehensive damage tolerance verification approach for launcher and RLV structures, addressing analytical and numerical methodologies; material-, subcomponent-, and component-level testing; and non-destructive inspection. The study includes a comprehensive review of current industrial damage tolerance practice resulting from ECSS and NASA standards, the development of new Best Practice Guidelines for analysis, test, and inspection methods, and the validation of these with a real industrial case study. The paper describes the main findings of this activity so far and presents a first iteration of a Damage Tolerance Verification Approach, which includes the introduction of novel analytical and numerical tools at an industrial level. This new approach is being put to the test using real industrial case studies provided by the industrial partners, MT Aerospace, RUAG Space and INVENT GmbH.

  3. Can Early Years Professionals Determine Which Preschoolers Have Comprehension Delays? A Comparison of Two Screening Tools

    ERIC Educational Resources Information Center

    Seager, Emily; Abbot-Smith, Kirsten

    2017-01-01

    Language comprehension delays in pre-schoolers are predictive of difficulties in a range of developmental domains. In England, early years practitioners are required to assess the language comprehension of 2-year-olds in their care. Many use a format based on the Early Years Foundation Stage Unique Child Communication Sheet (EYFS:UCCS) in which…

  4. Stakeholders' Perspectives towards the Use of the Comprehensive Health Assessment Program (CHAP) for Adults with Intellectual Disabilities in Manitoba

    ERIC Educational Resources Information Center

    Shooshtari, Shahin; Temple, Beverley; Waldman, Celeste; Abraham, Sneha; Ouellette-Kuntz, Héléne; Lennox, Nicholas

    2017-01-01

    Background: No standardized tool is used in Canada for comprehensive health assessments of adults with intellectual disabilities. This study was conducted to determine the feasibility of implementing the Comprehensive Health Assessment Program (CHAP) in Manitoba, Canada. Method: This was a qualitative study using a purposive sample of physicians,…

  5. Engineering the fitness of older patients for chemotherapy: an exploration of Comprehensive Geriatric Assessment in practice.

    PubMed

    McCarthy, Alexandra L; Cook, Peta S; Yates, Patsy

    2014-03-01

    Clinicians often report that currently available methods to assess older patients, including standard clinical consultations, do not elicit the information necessary to make an appropriate cancer treatment recommendation for older cancer patients. An increasingly popular way of assessing the potential of older patients to cope with chemotherapy is a Comprehensive Geriatric Assessment. What constitutes Comprehensive Geriatric Assessment, however, is open to interpretation and varies from one setting to another. Furthermore, Comprehensive Geriatric Assessment's usefulness as a predictor of fitness for chemotherapy and as a determinant of actual treatment is not well understood. In this article, we analyse how Comprehensive Geriatric Assessment was developed for use in a large cancer service in an Australian capital city. Drawing upon Actor-Network Theory, our findings reveal how, during its development, Comprehensive Geriatric Assessment was made both a tool and a science. Furthermore, we briefly explore the tensions that we experienced as scholars who analyse medico-scientific practices and as practitioner-designers charged with improving the very tools we critique. Our study contributes towards geriatric oncology by scrutinising the medicalisation of ageing, unravelling the practices of standardisation and illuminating the multiplicity of 'fitness for chemotherapy'.

  6. atBioNet--an integrated network analysis tool for genomics and biomarker discovery.

    PubMed

    Ding, Yijun; Chen, Minjun; Liu, Zhichao; Ding, Don; Ye, Yanbin; Zhang, Min; Kelly, Reagan; Guo, Li; Su, Zhenqiang; Harris, Stephen C; Qian, Feng; Ge, Weigong; Fang, Hong; Xu, Xiaowei; Tong, Weida

    2012-07-20

    Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format, with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrate not only that atBioNet can identify functional modules and pathways related to the studied diseases, but also that this information can be used to hypothesize novel biomarkers for future analysis. atBioNet is a free web-based network analysis tool that provides systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and for biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm.
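
    A rough sketch of the expand-then-cluster idea is shown below using the networkx library. Greedy modularity clustering stands in for SCAN (which networkx does not provide), and the edge list and seed gene are invented; the point is only to show seed-list expansion followed by module detection on the expanded subnetwork.

        import networkx as nx
        from networkx.algorithms import community

        # Toy PPI edge list with hypothetical gene symbols
        G = nx.Graph([("TP53", "MDM2"), ("TP53", "ATM"), ("MDM2", "ATM"),
                      ("BRCA1", "BARD1"), ("BRCA1", "RAD51"), ("BARD1", "RAD51"),
                      ("ATM", "BRCA1")])

        # Knowledge expansion: seed genes plus their direct interactors
        seeds = {"TP53"}
        expanded = set(seeds)
        for s in seeds:
            expanded |= set(G.neighbors(s))

        # Module detection on the expanded subnetwork (greedy modularity, not SCAN)
        modules = community.greedy_modularity_communities(G.subgraph(expanded))
        print([sorted(m) for m in modules])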

  7. NuGO contributions to GenePattern

    PubMed Central

    Reiff, C.; Mayer, C.; Müller, M.

    2008-01-01

    NuGO, the European Nutrigenomics Organization, utilizes 31 powerful computers for, e.g., data storage and analysis. These so-called black boxes (NBXses) are located at the sites of different partners. NuGO decided to use GenePattern as the preferred genomic analysis tool on each NBX. To handle the custom-made Affymetrix NuGO arrays, new NuGO modules were added to GenePattern. These NuGO modules execute the latest Bioconductor version, ensuring up-to-date annotations and access to the latest scientific developments. The following GenePattern modules are provided by NuGO: NuGOArrayQualityAnalysis for comprehensive quality control, NuGOExpressionFileCreator for import and normalization of data, LimmaAnalysis for identification of differentially expressed genes, TopGoAnalysis for calculation of GO enrichment, and GetResultForGo for retrieval of information on genes associated with specific GO terms. Altogether, these NuGO modules allow comprehensive, up-to-date, and user-friendly analysis of Affymetrix data. A special feature of the NuGO modules is that they allow analysis with either the standard Affymetrix or the MBNI custom CDF-files, which remap probes based on current knowledge. In both cases a .chip-file is created to enable GSEA analysis. The NuGO GenePattern installations are distributed as binary Ubuntu (.deb) packages via the NuGO repository. PMID:19034553

  8. NuGO contributions to GenePattern.

    PubMed

    De Groot, P J; Reiff, C; Mayer, C; Müller, M

    2008-12-01

    NuGO, the European Nutrigenomics Organization, utilizes 31 powerful computers for, e.g., data storage and analysis. These so-called black boxes (NBXses) are located at the sites of different partners. NuGO decided to use GenePattern as the preferred genomic analysis tool on each NBX. To handle the custom-made Affymetrix NuGO arrays, new NuGO modules were added to GenePattern. These NuGO modules execute the latest Bioconductor version, ensuring up-to-date annotations and access to the latest scientific developments. The following GenePattern modules are provided by NuGO: NuGOArrayQualityAnalysis for comprehensive quality control, NuGOExpressionFileCreator for import and normalization of data, LimmaAnalysis for identification of differentially expressed genes, TopGoAnalysis for calculation of GO enrichment, and GetResultForGo for retrieval of information on genes associated with specific GO terms. Altogether, these NuGO modules allow comprehensive, up-to-date, and user-friendly analysis of Affymetrix data. A special feature of the NuGO modules is that they allow analysis with either the standard Affymetrix or the MBNI custom CDF-files, which remap probes based on current knowledge. In both cases a .chip-file is created to enable GSEA analysis. The NuGO GenePattern installations are distributed as binary Ubuntu (.deb) packages via the NuGO repository.

  9. A situational analysis methodology to inform comprehensive HIV prevention and treatment programming, applied in rural South Africa.

    PubMed

    Treves-Kagan, Sarah; Naidoo, Evasen; Gilvydis, Jennifer M; Raphela, Elsie; Barnhart, Scott; Lippman, Sheri A

    2017-09-01

    Successful HIV prevention programming requires engaging communities in the planning process and responding to the social environmental factors that shape health and behaviour in a specific local context. We conducted two community-based situational analyses to inform a large, comprehensive HIV prevention programme in two rural districts of North West Province South Africa in 2012. The methodology includes: initial partnership building, goal setting and background research; 1 week of field work; in-field and subsequent data analysis; and community dissemination and programmatic incorporation of results. We describe the methodology and a case study of the approach in rural South Africa; assess if the methodology generated data with sufficient saturation, breadth and utility for programming purposes; and evaluate if this process successfully engaged the community. Between the two sites, 87 men and 105 women consented to in-depth interviews; 17 focus groups were conducted; and 13 health facilities and 7 NGOs were assessed. The methodology succeeded in quickly collecting high-quality data relevant to tailoring a comprehensive HIV programme and created a strong foundation for community engagement and integration with local health services. This methodology can be an accessible tool in guiding community engagement and tailoring future combination HIV prevention and care programmes.

  10. A diagnostic biochip for the comprehensive analysis of MLL translocations in acute leukemia.

    PubMed

    Maroc, N; Morel, A; Beillard, E; De La Chapelle, A L; Fund, X; Mozziconacci, M-J; Dupont, M; Cayuela, J-M; Gabert, J; Koki, A; Fert, V; Hermitte, F

    2004-09-01

    Reciprocal rearrangements of the MLL gene are among the most common chromosomal abnormalities in both acute lymphoblastic and acute myeloid leukemia. The MLL gene, located on the 11q23 chromosomal band, is involved in more than 40 recurrent translocations. In the present study, we describe the development and validation of a biochip-based assay designed to provide a comprehensive molecular analysis of MLL rearrangements when used in a standard clinical pathology laboratory. A retrospective blind study was run with cell lines (n=5) and MLL-positive and -negative patient samples (n=31) to evaluate assay performance. The limit of detection determined on cell line data was 10^-1, and the precision studies yielded 100% repeatability and 98% reproducibility. The study shows that the device can detect frequent (AF4, AF6, AF10, ELL or ENL) as well as rare partner genes (AF17, MSF). The identified fusion transcripts can then be used as molecular phenotypic markers of disease for the precise evaluation of minimal residual disease by RQ-PCR. This biochip-based molecular diagnostic tool allows, in a single experiment, rapid and accurate identification of MLL gene rearrangements among 32 different fusion gene (FG) partners, precise breakpoint positioning, and comprehensive screening of all currently characterized MLL FGs.

  11. Development of self-compressing BLSOM for comprehensive analysis of big sequence data.

    PubMed

    Kikuchi, Akihito; Ikemura, Toshimichi; Abe, Takashi

    2015-01-01

    With the remarkable increase in genomic sequence data from various organisms, novel tools are needed for comprehensive analyses of the available big sequence data. We previously developed a Batch-Learning Self-Organizing Map (BLSOM), which can cluster genomic fragment sequences according to phylotype based solely on oligonucleotide composition, and applied it to genome and metagenomic studies. BLSOM is suitable for high-performance parallel computing and can analyze big data simultaneously, but a large-scale BLSOM requires large computational resources. We have developed the Self-Compressing BLSOM (SC-BLSOM), which reduces computation time and allows comprehensive analysis of big sequence data without the use of high-performance supercomputers. The strategy of SC-BLSOM is to construct BLSOMs hierarchically according to data class, such as phylotype. The first-layer BLSOMs were constructed from the divided input data pieces, each representing a data subclass such as a phylotype division, thereby compressing the number of data pieces. The second-layer BLSOM was then constructed from the weight vectors obtained in the first-layer BLSOMs. We compared SC-BLSOM with the conventional BLSOM by analyzing bacterial genome sequences. SC-BLSOM could be constructed faster than BLSOM and clustered the sequences according to phylotype with high accuracy, showing the method's suitability for efficient knowledge discovery from big sequence data.
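
    The hierarchical compression strategy can be sketched with a minimal self-organizing map: train one small SOM per data subclass, then train a second-layer SOM on the pooled first-layer weight vectors instead of the raw data. The toy SOM implementation and Gaussian data below are illustrative assumptions, not the BLSOM algorithm's actual batch-learning scheme.

        import numpy as np

        def train_som(data, grid=(6, 6), iters=2000, lr=0.5, sigma=2.0, seed=0):
            # Minimal online SOM; returns a (gx, gy, dim) grid of weight vectors
            rng = np.random.default_rng(seed)
            gx, gy = grid
            w = rng.random((gx, gy, data.shape[1]))
            coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy),
                                          indexing="ij"), axis=-1)
            for t in range(iters):
                x = data[rng.integers(len(data))]
                bmu = np.unravel_index(
                    np.argmin(np.linalg.norm(w - x, axis=-1)), (gx, gy))
                decay = 1.0 - t / iters
                h = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                           / (2 * (sigma * decay + 1e-3) ** 2))
                w += lr * decay * h[..., None] * (x - w)
            return w

        # First layer: one map per subclass compresses each subclass to its weights
        rng = np.random.default_rng(1)
        subclasses = [rng.normal(loc=m, size=(300, 16)) for m in (0.0, 1.0, 2.0)]
        first_layer = [train_som(d).reshape(-1, 16) for d in subclasses]

        # Second layer: trained on pooled weight vectors instead of raw sequences
        final_map = train_som(np.vstack(first_layer), grid=(8, 8))
        print(final_map.shape)   # (8, 8, 16)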

  12. Refocusing on physical health: Community psychiatric nurses' perceptions of using enhanced health checks for people with severe mental illness.

    PubMed

    Bressington, Daniel; Mui, Jolene; Wells, Harvey; Chien, Wai Tong; Lam, Claire; White, Jacquie; Gray, Richard

    2016-06-01

    In the present qualitative, descriptive study, we explored Hong Kong community psychiatric nurses' (CPN) perceptions of using comprehensive physical health checks for service users diagnosed with severe mental illness (SMI). Research interviews were conducted with a purposive sample of 11 CPN in order to explore their perceptions about the use of the Health Improvement Profile (HIP) over a 1-year period. Interview data were analysed using inductive thematic analysis. The analysis revealed that the majority of CPN appreciated the comprehensive focus on the physical health of their clients and reported positive changes in their clinical practice. Many of them observed an increase in the motivation of their clients to improve their physical health, and also noted observable benefits in service users' well-being. The use of the HIP also helped the CPN identify implementation barriers, and highlighted areas of the tool that required modifications to suit the local cultural and clinical context. To our knowledge, this is the first study conducted in an Asian mental health service that explores nurses' views about using comprehensive health checks for people with SMI. The findings suggest that such approaches are viewed as being acceptable, feasible, and potentially beneficial in the community mental health setting. © 2016 Australian College of Mental Health Nurses Inc.

  13. [Test set for the evaluation of hearing and speech development after cochlear implantation in children].

    PubMed

    Lamprecht-Dinnesen, A; Sick, U; Sandrieser, P; Illg, A; Lesinski-Schiedat, A; Döring, W H; Müller-Deile, J; Kiefer, J; Matthias, K; Wüst, A; Konradi, E; Riebandt, M; Matulat, P; Von Der Haar-Heise, S; Swart, J; Elixmann, K; Neumann, K; Hildmann, A; Coninx, F; Meyer, V; Gross, M; Kruse, E; Lenarz, T

    2002-10-01

    Since autumn 1998, the multicenter interdisciplinary study group "Test Materials for CI Children" has been compiling a uniform examination tool for evaluating speech and hearing development after cochlear implantation in childhood. After a review of the relevant literature, suitable materials were checked for practical applicability, modified, and provided with criteria for administration and discontinuation. For data acquisition, observation forms suitable for conversion to a PC version were developed. The evaluation set contains forms for master data, with supplements relating to postoperative processes. The hearing tests check supra-threshold hearing with loudness scaling for children, speech comprehension in silence (Mainz and Göttingen Test for Speech Comprehension in Childhood) and phonemic differentiation (Oldenburg Rhyme Test for Children), the central auditory processes of detection, discrimination, identification and recognition (a modification of the "Frankfurt Functional Hearing Test for Children"), and audiovisual speech perception (Open Paragraph Tracking, Kiel Speech Track Program). The materials for speech and language development cover phonetics-phonology, lexicon and semantics (LOGO Pronunciation Test), syntax and morphology (analysis of spontaneous speech), language comprehension (Reynell Scales), and communication and pragmatics (observation forms). The modified MAIS and MUSS questionnaires are integrated. The evaluation set serves quality assurance and permits factor analysis as well as controls for regularity through multicenter comparison of long-term developmental trends after cochlear implantation.

  14. MIPS: analysis and annotation of proteins from whole genomes in 2005

    PubMed Central

    Mewes, H. W.; Frishman, D.; Mayer, K. F. X.; Münsterkötter, M.; Noubibou, O.; Pagel, P.; Rattei, T.; Oesterheld, M.; Ruepp, A.; Stümpflen, V.

    2006-01-01

    The Munich Information Center for Protein Sequences (MIPS at the GSF), Neuherberg, Germany, provides resources related to genome information. Manually curated databases for several reference organisms are maintained. Several of these databases are described elsewhere in this and other recent NAR database issues. In a complementary effort, a comprehensive set of >400 genomes automatically annotated with the PEDANT system is maintained. The main goal of our current work on creating and maintaining genome databases is to extend gene-centered information to information on interactions within a generic comprehensive framework. We have concentrated our efforts along three lines: (i) the development of suitable comprehensive data structures and database technology, together with communication and query tools, to include a wide range of different types of information, enabling the representation of complex information such as functional modules or networks (the Genome Research Environment System); (ii) the development of databases covering computable information, such as the basic evolutionary relations among all genes, namely SIMAP, the sequence similarity matrix, and the CABiNet network analysis framework; and (iii) the compilation and manual annotation of information related to interactions, such as protein–protein interactions or other types of relations (e.g. MPCDB, MPPI, CYGD). All databases described, and detailed descriptions of our projects, can be accessed through the MIPS WWW server (http://mips.gsf.de). PMID:16381839

  15. MIPS: analysis and annotation of proteins from whole genomes in 2005.

    PubMed

    Mewes, H W; Frishman, D; Mayer, K F X; Münsterkötter, M; Noubibou, O; Pagel, P; Rattei, T; Oesterheld, M; Ruepp, A; Stümpflen, V

    2006-01-01

    The Munich Information Center for Protein Sequences (MIPS at the GSF), Neuherberg, Germany, provides resources related to genome information. Manually curated databases for several reference organisms are maintained. Several of these databases are described elsewhere in this and other recent NAR database issues. In a complementary effort, a comprehensive set of >400 genomes automatically annotated with the PEDANT system is maintained. The main goal of our current work on creating and maintaining genome databases is to extend gene-centered information to information on interactions within a generic comprehensive framework. We have concentrated our efforts along three lines: (i) the development of suitable comprehensive data structures and database technology, together with communication and query tools, to include a wide range of different types of information, enabling the representation of complex information such as functional modules or networks (the Genome Research Environment System); (ii) the development of databases covering computable information, such as the basic evolutionary relations among all genes, namely SIMAP, the sequence similarity matrix, and the CABiNet network analysis framework; and (iii) the compilation and manual annotation of information related to interactions, such as protein-protein interactions or other types of relations (e.g. MPCDB, MPPI, CYGD). All databases described, and detailed descriptions of our projects, can be accessed through the MIPS WWW server (http://mips.gsf.de).

  16. Enhanced terahertz imaging system performance analysis and design tool for concealed weapon identification

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Franck, Charmaine C.; Espinola, Richard L.; Petkie, Douglas T.; De Lucia, Frank C.; Jacobs, Eddie L.

    2011-11-01

    The U.S. Army Research Laboratory (ARL) and the U.S. Army Night Vision and Electronic Sensors Directorate (NVESD) have developed a terahertz-band imaging system performance model/tool for detection and identification of concealed weaponry. The details of the MATLAB-based model, which accounts for the effects of all critical sensor and display components and for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported at the 2005 SPIE Europe Security & Defence Symposium (Brugge). An advanced version of the base model, which accounts both for the dramatic impact that target and background orientation can have on target observability (via specular and Lambertian reflections captured by an active-illumination-based imaging system) and for the impact of target and background thermal emission, was reported at the 2007 SPIE Defense and Security Symposium (Orlando). This paper provides a comprehensive review of an enhanced, user-friendly, Windows-executable, terahertz-band imaging system performance analysis and design tool that now includes additional features such as a MODTRAN-based atmospheric attenuation calculator and advanced system architecture configuration inputs that allow straightforward performance analysis of active or passive systems based on scanning (single- or line-array detector element(s)) or staring (focal-plane-array detector elements) imaging architectures. This newly enhanced THz imaging system design tool is an extension of the advanced THz imaging system performance model that was developed under the Defense Advanced Research Projects Agency's (DARPA) Terahertz Imaging Focal-Plane Technology (TIFT) program. This paper also provides example system component (active-illumination source and detector) trade-study analyses using the new features of this user-friendly THz imaging system performance analysis and design tool.
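
    To give a flavor of the trade-space such a design tool explores, the sketch below applies a simple one-way link-budget calculation: path loss in dB from an atmospheric attenuation coefficient plus a fixed concealment-material loss. The coefficient and barrier values are illustrative guesses, not outputs of the ARL/NVESD model or MODTRAN.

        def received_fraction(range_m, atm_db_per_km, barrier_db=0.0):
            # One-way path loss: atmospheric attenuation (dB/km) plus a fixed
            # concealment-material loss (dB); returns fraction of power received
            path_db = atm_db_per_km * (range_m / 1000.0) + barrier_db
            return 10 ** (-path_db / 10.0)

        # Illustrative values only: 50 dB/km atmosphere, 5 dB clothing loss
        for r in (5, 10, 25):
            print(f"{r:2d} m: {received_fraction(r, atm_db_per_km=50.0, barrier_db=5.0):.3f}")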

  17. Effect-directed analysis supporting monitoring of aquatic environments--An in-depth overview.

    PubMed

    Brack, Werner; Ait-Aissa, Selim; Burgess, Robert M; Busch, Wibke; Creusot, Nicolas; Di Paolo, Carolina; Escher, Beate I; Mark Hewitt, L; Hilscherova, Klara; Hollender, Juliane; Hollert, Henner; Jonker, Willem; Kool, Jeroen; Lamoree, Marja; Muschket, Matthias; Neumann, Steffen; Rostkowski, Pawel; Ruttkies, Christoph; Schollee, Jennifer; Schymanski, Emma L; Schulze, Tobias; Seiler, Thomas-Benjamin; Tindall, Andrew J; De Aragão Umbuzeiro, Gisela; Vrana, Branislav; Krauss, Martin

    2016-02-15

    Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone; tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and is attracting increasing interest in water and sediment quality monitoring. The present paper therefore summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given, as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction, and bioassay dosing, since they strongly impact the prioritization of toxicants in EDA. Reduction of sample complexity relies mainly on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting, and chemical analysis using so-called hyphenated tools can enhance throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxicant identification. The latest approaches, tools, software, and databases for target, suspect, and non-target screening, as well as for unknown identification, are discussed together with analytical and toxicological confirmation approaches. A better understanding of the optimal use and combination of EDA tools will help to design efficient and successful toxicant identification studies in the context of quality monitoring in multiply stressed environments. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Introduction of the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Costing Tool: a user-friendly spreadsheet program to estimate costs of providing patient-centered interventions.

    PubMed

    Reed, Shelby D; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L; Bowers, Margaret T; Samsa, Gregory P; Paul, Sara; Schulman, Kevin A; Whellan, David J; Riegel, Barbara J

    2012-01-01

    Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers and health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions.
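
    The tool's core accounting, summing unit costs times quantities and normalizing per patient and per week, can be mimicked in a few lines. The cost items and figures below are invented for illustration and do not come from the TEAM-HF tool or the cited exercise training study.

        from dataclasses import dataclass

        @dataclass
        class CostItem:
            name: str
            unit_cost: float   # e.g. dollars per hour or per item
            quantity: float    # units consumed over the whole program

        def program_costs(items, n_patients, n_weeks):
            total = sum(i.unit_cost * i.quantity for i in items)
            return total, total / n_patients, total / n_weeks

        # Invented line items for a hypothetical 12-week, 60-patient program
        items = [CostItem("nurse time (h)", 45.0, 320),
                 CostItem("exercise equipment", 1200.0, 2),
                 CostItem("patient incentives", 25.0, 60)]
        total, per_patient, per_week = program_costs(items, n_patients=60, n_weeks=12)
        print(f"total ${total:,.0f}, per patient ${per_patient:,.0f}, per week ${per_week:,.0f}")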

  19. Using standardized tools to improve immunization costing data for program planning: the cost of the Colombian Expanded Program on Immunization.

    PubMed

    Castañeda-Orjuela, Carlos; Romero, Martin; Arce, Patricia; Resch, Stephen; Janusz, Cara B; Toscano, Cristiana M; De la Hoz-Restrepo, Fernando

    2013-07-02

    The cost of Expanded Programs on Immunization (EPI) is an important aspect of the economic and financial analysis needed for planning purposes. Costs also are needed for cost-effectiveness analysis of introducing new vaccines. We describe a costing tool that improves the speed, accuracy, and availability of EPI costs and that was piloted in Colombia. The ProVac CostVac Tool is a spreadsheet-based tool that estimates overall EPI costs considering program inputs (personnel, cold chain, vaccines, supplies, etc.) at three administrative levels (central, departmental, and municipal) and one service delivery level (health facilities). It uses various costing methods. The tool was evaluated through a pilot exercise in Colombia. In addition to the costs obtained from the central and intermediate administrative levels, a survey of 112 local health facilities was conducted to collect vaccination costs. Total cost of the EPI, cost per dose of vaccine delivered, and cost per fully vaccinated child with the recommended immunization schedule in Colombia in 2009 were estimated. The ProVac CostVac Tool is a novel, user-friendly tool, which allows users to conduct an EPI costing study following guidelines for cost studies. The total costs of the Colombian EPI were estimated at US$ 107.8 million in 2009. The cost for a fully immunized child with the recommended schedule was estimated at US$ 153.62. Vaccines and vaccination supplies accounted for 58% of total costs, personnel for 21%, cold chain for 18%, and transportation for 2%. Most EPI costs are incurred at the central level (62%). The major cost driver at the department and municipal levels is personnel costs. The ProVac CostVac Tool proved to be a comprehensive and useful tool that will allow researchers and health officials to estimate the actual cost for national immunization programs. The present analysis shows that personnel, cold chain, and transportation are important components of EPI and should be carefully estimated in the cost analysis, particularly when evaluating new vaccine introduction. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. The validity of a professional competence tool for physiotherapy students in simulation-based clinical education: a Rasch analysis.

    PubMed

    Judd, Belinda K; Scanlan, Justin N; Alison, Jennifer A; Waters, Donna; Gordon, Christopher J

    2016-08-05

    Despite the recent widespread adoption of simulation in clinical education in physiotherapy, there is a lack of validated tools for assessment in this setting. The Assessment of Physiotherapy Practice (APP) is a comprehensive tool used in clinical placement settings in Australia to measure the professional competence of physiotherapy students. The aim of the study was to evaluate the validity of the APP for student assessment in simulation settings. A total of 1260 APPs were collected, 971 from students in simulation and 289 from students in clinical placements. Rasch analysis was used to examine the construct validity of the APP tool in three different simulation assessment formats: longitudinal assessment over 1 week of simulation; longitudinal assessment over 2 weeks; and a short-form (25 min) assessment of a single simulation scenario. Comparisons with APPs from 5-week clinical placements in hospital and clinic-based settings were also conducted. The APP demonstrated acceptable fit to the expectations of the Rasch model for the 1- and 2-week clinical simulations, exhibiting unidimensional properties that were able to distinguish different levels of student performance. For the short-form simulation, nine of the 20 items recorded more than 25% of scores as 'not assessed' by clinical educators, which limited the suitability of the APP tool in this format. The APP was a valid assessment tool when used in longitudinal simulation formats. A revised APP may be required for assessment in short-form simulation scenarios.

  1. PlantTFDB 4.0: toward a central hub for transcription factors and regulatory interactions in plants.

    PubMed

    Jin, Jinpu; Tian, Feng; Yang, De-Chang; Meng, Yu-Qi; Kong, Lei; Luo, Jingchu; Gao, Ge

    2017-01-04

    With the goal of providing a comprehensive, high-quality resource for both plant transcription factors (TFs) and their regulatory interactions with target genes, we upgraded the plant TF database PlantTFDB to version 4.0 (http://planttfdb.cbi.pku.edu.cn/). In the new version, we identified 320 370 TFs from 165 species, presenting a more comprehensive genomic TF repertoire of green plants. Besides updating the pre-existing abundant functional and evolutionary annotation for identified TFs, we generated three new types of annotation which provide more direct clues for investigating the underlying functional mechanisms: (i) a set of high-quality, non-redundant TF binding motifs derived from experiments; (ii) multiple types of regulatory elements identified from high-throughput sequencing data; (iii) regulatory interactions curated from the literature and inferred by combining TF binding motifs and regulatory elements. In addition, we upgraded the previous TF prediction server, and set up four novel tools for regulation prediction and functional enrichment analyses. Finally, we set up a novel companion portal, PlantRegMap (http://plantregmap.cbi.pku.edu.cn), for users to access the regulation resource and analysis tools conveniently. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Comprehensive verticality analysis and web-based rehabilitation system for people with multiple sclerosis with supervised medical monitoring.

    PubMed

    Eguiluz-Perez, Gonzalo; Garcia-Zapirain, Begonya

    2014-01-01

    People with Multiple Sclerosis (MS) need regular physical activity along with medical treatment, regardless of their ability or disability level. However, poorly performed exercises can aggravate muscle imbalances and worsen their health. The goal of our work is to create a comprehensive system that combines face-to-face sessions, performed by MS patients one day a week at the medical center, with exercises at home the rest of the week through a web platform, together with a tracking tool that analyzes patients' positions during exercise and corrects them in real time. The whole system is currently being tested over six months with ten participants: five people with MS and five professionals working with MS. Two tests, the Functional Independence Measure and the Berg Balance Scale, will act as a barometer for measuring the degree of independence achieved by the people with MS and also the validity of the whole system as a rehabilitation tool. Preliminary usability results on the SUS scale, 72 and 76 points out of 100 for patients and professionals respectively, indicate that the system is usable for both patients and professionals.

  3. Design/analysis of the JWST ISIM bonded joints for survivability at cryogenic temperatures

    NASA Astrophysics Data System (ADS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    2005-08-01

    A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite adhesively bonded joints at the cryogenic temperature of 30K (-405°F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings and composite gusset/clip joints all bonded to hybrid composite tubes (75mm square) made with M55J/954-6 and T300/954-6 prepregs. Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty in testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on developed techniques and properties.

  4. Assessing team performance in the operating room: development and use of a "black-box" recorder and other tools for the intraoperative environment.

    PubMed

    Guerlain, Stephanie; Adams, Reid B; Turrentine, F Beth; Shin, Thomas; Guo, Hui; Collins, Stephen R; Calland, J Forrest

    2005-01-01

    The objective of this research was to develop a digital system to archive the complete operative environment, along with assessment tools for analysis of these data, allowing prospective studies of operative performance, intraoperative errors, team performance, and communication. The ability to study this environment will yield new insights, allowing the design of systems to avoid preventable errors that contribute to perioperative complications. A multitrack, synchronized, digital audio-visual recording system (RATE tool) was developed to monitor intraoperative performance, including software to synchronize data and allow assignment of independent observational scores. Cases were scored for technical performance, participants' situational awareness (knowledge of critical information), and their comfort and satisfaction with the conduct of the procedure. Laparoscopic cholecystectomy (n = 10) was studied. Technical performance of the RATE tool was excellent. The RATE tool allowed real-time, multitrack data collection of all aspects of the operative environment, while permitting digital recording of the objective assessment data in a time-synchronized and annotated fashion during the procedure. The mean technical performance score was 73% +/- 28% of maximum (perfect) performance. Situational awareness varied widely among team members, with the attending surgeon typically the only team member having comprehensive knowledge of critical case information. The RATE tool allows prospective analysis of performance measures such as technical judgments, team performance, and communication patterns; offers the opportunity to conduct prospective intraoperative studies of human performance; and allows for postoperative discussion, review, and teaching. This study also suggests that gaps in situational awareness might be an underappreciated source of operative adverse events. Future uses of this system will aid teaching, failure or adverse event analysis, and intervention research.

  5. Comprehensive feedback on trainee surgeons’ non-technical skills

    PubMed Central

    Dieckmann, Peter; Beier-Holgersen, Randi; Rosenberg, Jacob; Oestergaard, Doris

    2015-01-01

    Objectives This study aimed to explore the content of conversations, feedback style, and perceived usefulness of feedback to trainee surgeons when conversations were stimulated by a tool for assessing surgeons' non-technical skills. Methods Trainee surgeons and their supervisors used the Non-Technical Skills for Surgeons in Denmark tool to stimulate feedback conversations. Audio recordings of post-operation feedback conversations were collected. Trainees and supervisors provided questionnaire responses on the usefulness and comprehensiveness of the feedback. The feedback conversations were qualitatively analyzed for content and feedback style. Usefulness was investigated using a scale from 1 to 5, and written comments were qualitatively analyzed. Results Six trainees and six supervisors participated in eight feedback conversations. Eighty questionnaires (response rate 83 percent) were collected from 13 trainees and 12 supervisors. Conversations lasted a median of eight minutes (range 2-15). Supervisors used the elements and categories in the tool to structure the content of the conversations. Supervisors tended to talk about the trainees' actions and their own frames rather than attempting to understand the trainees' perceptions. Supervisors and trainees welcomed the feedback opportunity and agreed that the conversations were useful and comprehensive. Conclusions The content of the feedback conversations reflected the contents of the tool, and the feedback was considered useful and comprehensive. However, supervisors talked primarily about their own frames, so in order for the feedback to reach its full potential, supervisors may benefit from training in techniques to stimulate deeper reflection among trainees. PMID:25602262

  6. Perceived Breastfeeding Support Assessment Tool (PBSAT): development and testing of psychometric properties with Pakistani urban working mothers.

    PubMed

    Hirani, Shela Akbar Ali; Karmaliani, Rozina; Christie, Thomas; Parpio, Yasmin; Rafique, Ghazala

    2013-06-01

    Breast feeding is an essential source of nutrition for young babies; however, in Pakistan a gradual decline in the prevalence of breast feeding, especially among urban working mothers, has been reported. Previous studies among Pakistani urban working mothers have revealed that ensuring exclusivity and continuation of breast feeding is challenging if social and/or workplace environmental support is minimal or absent. This problem indicated a crucial need to assess the availability of breast-feeding support for Pakistani urban working mothers by using a comprehensive, reliable, and validated tool in their national language (Urdu). The aim was to develop and test the psychometric properties of the 'Perceived Breastfeeding Support Assessment Tool' (PBSAT), which can gauge Pakistani urban working mothers' perceptions of breast-feeding support. This methodological research was undertaken in five phases. During Phase I, a preliminary draft of the PBSAT was developed by using the socio-ecological model, reviewing the literature, and referring to two United States-based tools. In Phase II, the instrument was evaluated by seven different experts, and, in Phase III, the instrument was revised, translated, and back-translated. In Phase IV, the tool was pilot tested among 20 participants and then modified on the basis of statistical analysis. In Phase V, the refined instrument was tested on 200 breast-feeding working mothers recruited through purposive sampling from government and private health-care settings in Karachi, Pakistan. Approvals were received from the Ethical Review Committees of the identified settings. The 29-item PBSAT revealed an acceptable inter-rater reliability of 0.95 and an internal consistency reliability coefficient (Cronbach's alpha) of 0.85. A construct validity assessment through exploratory factor analysis revealed that the PBSAT has two dimensions, 'workplace environmental support' (12 items; α=0.86) and 'social environmental support' (17 items; α=0.77). The study thus developed a 29-item, two-dimensional tool (in Urdu) with acceptable psychometric properties. The PBSAT is context-specific, comprehensive, and user-friendly, so it can be administered by health-care workers, employers, policy makers, and researchers to improve the quality of services for breast-feeding urban working mothers, and could ultimately improve child health in Pakistan. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing.

    PubMed

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D'Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-04-05

    International guidelines recommend the use of appropriate informed consent procedures in low-literacy research settings because written information alone is not known to guarantee comprehension of study information. This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations, and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants' comprehension of the study information was measured by using a validated digitised audio questionnaire. The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation, and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants' 'recall' and 'understanding' between the first and second visits were statistically significant (F(1,41)=25.38, p<0.00001 and F(1,41)=31.61, p<0.00001, respectively). Our locally developed multimedia tool was acceptable and easy to administer among low-literacy participants in The Gambia. It also proved effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings.

  8. Plant Architecture: A Dynamic, Multilevel and Comprehensive Approach to Plant Form, Structure and Ontogeny

    PubMed Central

    Barthélémy, Daniel; Caraglio, Yves

    2007-01-01

    Background and Aims The architecture of a plant depends on the nature and relative arrangement of each of its parts; it is, at any given time, the expression of an equilibrium between endogenous growth processes and exogenous constraints exerted by the environment. The aim of architectural analysis is, by means of observation and sometimes experimentation, to identify and understand these endogenous processes and to separate them from the plasticity of their expression resulting from external influences. Scope Using the identification of several morphological criteria and considering the plant as a whole, from germination to death, architectural analysis is essentially a detailed, multilevel, comprehensive and dynamic approach to plant development. Despite their recent origin, architectural concepts and analysis methods provide a powerful tool for studying plant form and ontogeny. Complemented by precise morphological observations and appropriate quantitative methods of analysis, recent research in this field has greatly increased our understanding of plant structure and development and has led to the establishment of a real conceptual and methodological framework for plant form and structure analysis and representation. This paper is a summarized update of current knowledge on plant architecture and morphology; its implications and possible role in various aspects of modern plant biology are also discussed. PMID:17218346

  9. Two-dimensional chromatographic analysis using three second-dimension columns for continuous comprehensive analysis of intact proteins.

    PubMed

    Zhu, Zaifang; Chen, Huang; Ren, Jiangtao; Lu, Juan J; Gu, Congying; Lynch, Kyle B; Wu, Si; Wang, Zhe; Cao, Chengxi; Liu, Shaorong

    2018-03-01

    We develop a new two-dimensional (2D) high performance liquid chromatography (HPLC) approach for intact protein analysis. Development of 2D HPLC faces a bottleneck problem: limited second-dimension (second-D) separation speed. We solve this problem by incorporating multiple second-D columns, allowing several second-D separations to proceed in parallel. To demonstrate the feasibility of using this approach for comprehensive protein analysis, we select ion-exchange chromatography as the first dimension and reverse-phase chromatography as the second-D. We incorporate three second-D columns so that three reverse-phase separations can be performed simultaneously. We test this system by separating both standard proteins and E. coli lysates, achieving baseline resolution for eleven standard proteins and obtaining more than 500 peaks for E. coli lysates, an indication that sample complexity is greatly reduced. We see fewer than 10 bands when each fraction of the second-D effluent is analyzed by sodium dodecyl sulfate - polyacrylamide gel electrophoresis (SDS-PAGE), compared to the hundreds of SDS-PAGE bands observed when the original sample is analyzed. This approach could potentially be an excellent and general tool for protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
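
    The gain here comes from overlapping second-D runs: while one column separates a fraction, the next fraction is loaded onto another column. A minimal Python sketch of that round-robin scheduling logic (the column count and timings are hypothetical, not the paper's instrument parameters):

        def schedule_fractions(n_fractions, n_columns=3, run_time=6.0, sample_interval=2.0):
            """Assign each first-dimension fraction to a second-D column, round-robin.

            Full overlap is possible whenever run_time <= n_columns * sample_interval.
            Returns (fraction, column, start_time) tuples.
            """
            schedule = []
            next_free = [0.0] * n_columns          # when each column is next available
            for k in range(n_fractions):
                col = k % n_columns                # round-robin assignment
                start = max(k * sample_interval, next_free[col])
                next_free[col] = start + run_time
                schedule.append((k, col, start))
            return schedule

        for frac, col, t in schedule_fractions(6):
            print(f"fraction {frac} -> column {col} at t = {t:.1f} min")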

  10. A comprehensive quality control workflow for paired tumor-normal NGS experiments.

    PubMed

    Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc

    2017-06-01

    Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads and variant lists). Multi-sample experiments, such as sequencing of tumor-normal pairs, require additional QC metrics to ensure the validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run under both Linux and Windows. Plotting requires Python 2.7 and matplotlib. The software is available under the 'GNU General Public License version 2' as part of the ngs-bits project: https://github.com/imgag/ngs-bits. christopher.schroeder@med.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
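
    Since qcML is plain XML, downstream scripts can aggregate metrics across samples with any XML parser. A minimal Python sketch follows; the element and attribute names (qualityParameter, name, value) are assumptions based on the qcML schema rather than taken from the paper, so adjust them to the files ngs-bits actually writes.

        import xml.etree.ElementTree as ET

        def read_metrics(path):
            """Collect {metric name: value} pairs from a qcML file (sketch)."""
            metrics = {}
            for elem in ET.parse(path).getroot().iter():
                # endswith() sidesteps XML namespace prefixes on the tag.
                if elem.tag.endswith("qualityParameter"):
                    name, value = elem.get("name"), elem.get("value")
                    if name is not None and value is not None:
                        metrics[name] = value
            return metrics

        # Example: compare one tumor-normal metric across a batch of files.
        # for f in ["sample1_stats.qcML", "sample2_stats.qcML"]:
        #     print(f, read_metrics(f).get("sample correlation"))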

  11. PATIKA: an integrated visual environment for collaborative construction and analysis of cellular pathways.

    PubMed

    Demir, E; Babur, O; Dogrusoz, U; Gursoy, A; Nisanci, G; Cetin-Atalay, R; Ozturk, M

    2002-07-01

    The availability of entire genome sequences is shifting scientific attention toward large-scale identification of genome function. In the near future, data about cellular processes at the molecular level will accumulate at an accelerating rate as a result of proteomics studies. In this regard, it is essential to develop tools for storing, integrating, accessing, and analyzing these data effectively. We define an ontology for a comprehensive representation of cellular events. The ontology presented here enables integration of fragmented or incomplete pathway information and supports manipulation and incorporation of the stored data, as well as multiple levels of abstraction. Based on this ontology, we present the architecture of an integrated environment named Patika (Pathway Analysis Tool for Integration and Knowledge Acquisition). Patika is composed of a server-side, scalable, object-oriented database and client-side editors that provide an integrated, multi-user environment for visualizing and manipulating networks of cellular events. This tool features automated pathway layout, functional computation support, advanced querying and a user-friendly graphical interface. We expect that Patika will be a valuable tool for rapid knowledge acquisition, interpretation of large-scale microarray data, disease gene identification, and drug development. A prototype of Patika is available upon request from the authors.

  12. Enhancing knowledge discovery from cancer genomics data with Galaxy

    PubMed Central

    Albuquerque, Marco A.; Grande, Bruno M.; Ritch, Elie J.; Pararajalingam, Prasath; Jessa, Selin; Krzywinski, Martin; Grewal, Jasleen K.; Shah, Sohrab P.; Boutros, Paul C.

    2017-01-01

    The field of cancer genomics has demonstrated the power of massively parallel sequencing techniques to inform on the genes and specific alterations that drive tumor onset and progression. Although large comprehensive sequence data sets continue to be made increasingly available, data analysis remains an ongoing challenge, particularly for laboratories lacking dedicated resources and bioinformatics expertise. To address this, we have produced a collection of Galaxy tools that represent many popular algorithms for detecting somatic genetic alterations from cancer genome and exome data. We developed new methods for parallelization of these tools within Galaxy to accelerate runtime and have demonstrated their usability and summarized their runtimes on multiple cloud service providers. Some tools represent extensions or refinements of existing toolkits to yield visualizations suited to cohort-wide cancer genomic analysis. For example, we present Oncocircos and Oncoprintplus, which generate data-rich summaries of exome-derived somatic mutations. Workflows that combine these tools to achieve data integration and visualization are demonstrated on a cohort of 96 diffuse large B-cell lymphomas and enabled the discovery of multiple candidate lymphoma-related genes. Our toolkit is available from our GitHub repository as Galaxy tool and dependency definitions and has been deployed using virtualization on multiple platforms including Docker. PMID:28327945

  13. Enhancing knowledge discovery from cancer genomics data with Galaxy.

    PubMed

    Albuquerque, Marco A; Grande, Bruno M; Ritch, Elie J; Pararajalingam, Prasath; Jessa, Selin; Krzywinski, Martin; Grewal, Jasleen K; Shah, Sohrab P; Boutros, Paul C; Morin, Ryan D

    2017-05-01

    The field of cancer genomics has demonstrated the power of massively parallel sequencing techniques to inform on the genes and specific alterations that drive tumor onset and progression. Although large comprehensive sequence data sets continue to be made increasingly available, data analysis remains an ongoing challenge, particularly for laboratories lacking dedicated resources and bioinformatics expertise. To address this, we have produced a collection of Galaxy tools that represent many popular algorithms for detecting somatic genetic alterations from cancer genome and exome data. We developed new methods for parallelization of these tools within Galaxy to accelerate runtime and have demonstrated their usability and summarized their runtimes on multiple cloud service providers. Some tools represent extensions or refinements of existing toolkits to yield visualizations suited to cohort-wide cancer genomic analysis. For example, we present Oncocircos and Oncoprintplus, which generate data-rich summaries of exome-derived somatic mutations. Workflows that combine these tools to achieve data integration and visualization are demonstrated on a cohort of 96 diffuse large B-cell lymphomas and enabled the discovery of multiple candidate lymphoma-related genes. Our toolkit is available from our GitHub repository as Galaxy tool and dependency definitions and has been deployed using virtualization on multiple platforms including Docker. © The Author 2017. Published by Oxford University Press.

  14. Relation between different metal pollution criteria in sediments and its contribution on assessing toxicity.

    PubMed

    Alves, Cristina M; Ferreira, Carlos M H; Soares, Helena M V M

    2018-05-14

    Several tools have been developed and applied to evaluate the metal pollution status of sediments and predict their potential ecological risk. To date, a comprehensive relationship between the information given by these sediment tools for predicting metal bioavailability and the effective toxicity observed has been lacking. In this work, we studied the possible inter-correlations between the data resulting from several qualitative evaluation tools of sediment contamination (the contamination factor, CF, the enrichment factor, EF, and the geoaccumulation index, Igeo), metal speciation in sediments (evaluated by the modified BCR sequential extraction procedure) and free metal concentrations in pore waters. It was also our aim to evaluate whether these assessment tools could predict pore water toxicity as a toxicity proxy. Principal component analysis and cluster analysis revealed that the two quality indices used (CF and EF) were highly correlated with the more labile fractions from BCR sediment speciation. However, neither parameter correlated with the toxicity of pore waters as measured by a chronic (72 h) toxicity assay with Pseudokirchneriella subcapitata. In contrast, the toxic effects of a given total metal load in sediments were better evaluated by an additive metal approach based on pore water free metal concentrations. Copyright © 2018 Elsevier Ltd. All rights reserved.
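
    The three indices named above have standard textbook definitions: CF is the ratio of a metal's concentration to its background value, EF normalizes that ratio by a conservative reference element (often Al or Fe), and Igeo = log2(C / 1.5B). A small Python sketch under those standard definitions (the sample values are made up):

        import math

        def contamination_factor(c_sample, c_background):
            return c_sample / c_background

        def enrichment_factor(c_sample, ref_sample, c_background, ref_background):
            # Normalization by a conservative reference element (e.g., Al or Fe).
            return (c_sample / ref_sample) / (c_background / ref_background)

        def geoaccumulation_index(c_sample, c_background):
            # The 1.5 factor allows for natural fluctuation of the background.
            return math.log2(c_sample / (1.5 * c_background))

        # Hypothetical Pb measurement (mg/kg) against a background value:
        print(contamination_factor(45.0, 20.0))             # CF ~ 2.25
        print(enrichment_factor(45.0, 7.1e4, 20.0, 8.0e4))  # EF ~ 2.5
        print(geoaccumulation_index(45.0, 20.0))            # Igeo ~ 0.58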

  15. RaMP: A Comprehensive Relational Database of Metabolomics Pathways for Pathway Enrichment Analysis of Genes and Metabolites

    PubMed Central

    Zhang, Bofei; Hu, Senyang; Baskin, Elizabeth; Patt, Andrew; Siddiqui, Jalal K.

    2018-01-01

    The value of metabolomics in translational research is undeniable, and metabolomics data are increasingly generated in large cohorts. The functional interpretation of disease-associated metabolites, however, is difficult, and the biological mechanisms that underlie cell type or disease-specific metabolomics profiles are oftentimes unknown. To help fully exploit metabolomics data and to aid in its interpretation, analysis of metabolomics data with other complementary omics data, including transcriptomics, is helpful. To facilitate such analyses at a pathway level, we have developed RaMP (Relational database of Metabolomics Pathways), which combines biological pathways from the Kyoto Encyclopedia of Genes and Genomes (KEGG), Reactome, WikiPathways, and the Human Metabolome DataBase (HMDB). To the best of our knowledge, an off-the-shelf, public database that maps genes and metabolites to biochemical/disease pathways and can readily be integrated into other existing software is currently lacking. For consistent and comprehensive analysis, RaMP enables batch and complex queries (e.g., list all metabolites involved in glycolysis and lung cancer), can readily be integrated into pathway analysis tools, and supports pathway overrepresentation analysis given a list of genes and/or metabolites of interest. For usability, we have developed a RaMP R package (https://github.com/Mathelab/RaMP-DB), including a user-friendly RShiny web application, that supports basic simple and batch queries, pathway overrepresentation analysis given a list of genes or metabolites of interest, and network visualization of gene-metabolite relationships. The package also includes the raw database file (mysql dump), thereby providing a stand-alone downloadable framework for public use and integration with other tools. In addition, the Python code needed to recreate the database on another system is also publicly available (https://github.com/Mathelab/RaMP-BackEnd). Updates for databases in RaMP will be checked multiple times a year and RaMP will be updated accordingly. PMID:29470400

  16. RaMP: A Comprehensive Relational Database of Metabolomics Pathways for Pathway Enrichment Analysis of Genes and Metabolites.

    PubMed

    Zhang, Bofei; Hu, Senyang; Baskin, Elizabeth; Patt, Andrew; Siddiqui, Jalal K; Mathé, Ewy A

    2018-02-22

    The value of metabolomics in translational research is undeniable, and metabolomics data are increasingly generated in large cohorts. The functional interpretation of disease-associated metabolites, however, is difficult, and the biological mechanisms that underlie cell type or disease-specific metabolomics profiles are oftentimes unknown. To help fully exploit metabolomics data and to aid in its interpretation, analysis of metabolomics data with other complementary omics data, including transcriptomics, is helpful. To facilitate such analyses at a pathway level, we have developed RaMP (Relational database of Metabolomics Pathways), which combines biological pathways from the Kyoto Encyclopedia of Genes and Genomes (KEGG), Reactome, WikiPathways, and the Human Metabolome DataBase (HMDB). To the best of our knowledge, an off-the-shelf, public database that maps genes and metabolites to biochemical/disease pathways and can readily be integrated into other existing software is currently lacking. For consistent and comprehensive analysis, RaMP enables batch and complex queries (e.g., list all metabolites involved in glycolysis and lung cancer), can readily be integrated into pathway analysis tools, and supports pathway overrepresentation analysis given a list of genes and/or metabolites of interest. For usability, we have developed a RaMP R package (https://github.com/Mathelab/RaMP-DB), including a user-friendly RShiny web application, that supports basic simple and batch queries, pathway overrepresentation analysis given a list of genes or metabolites of interest, and network visualization of gene-metabolite relationships. The package also includes the raw database file (mysql dump), thereby providing a stand-alone downloadable framework for public use and integration with other tools. In addition, the Python code needed to recreate the database on another system is also publicly available (https://github.com/Mathelab/RaMP-BackEnd). Updates for databases in RaMP will be checked multiple times a year and RaMP will be updated accordingly.
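
    Pathway overrepresentation analysis of the kind RaMP supports is typically a hypergeometric test: given a query list, how surprising is its overlap with each pathway's gene/metabolite set? A minimal sketch of that computation (not RaMP's actual code; the numbers are illustrative):

        from scipy.stats import hypergeom

        def overrepresentation_p(total_annotated, pathway_size, query_size, overlap):
            """P(overlap >= observed) under random sampling without replacement."""
            return hypergeom.sf(overlap - 1, total_annotated, pathway_size, query_size)

        # Hypothetical example: 20000 annotated analytes, a 150-member pathway,
        # a 300-analyte query list, 12 of which fall in the pathway.
        p = overrepresentation_p(20000, 150, 300, 12)
        print(f"p = {p:.3e}")  # a small p suggests enrichment; correct for multiple tests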

  17. Exposure Assessment Tools by Approaches - Exposure Reconstruction (Biomonitoring and Reverse Dosimetry)

    EPA Pesticide Factsheets

    This page provides access to a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  18. Comprehensive metabolomic profiling and incident cardiovascular disease: a systematic review

    USDA-ARS?s Scientific Manuscript database

    Background: Metabolomics is a promising tool for cardiovascular biomarker discovery. We systematically reviewed the literature on comprehensive metabolomic profiling in association with incident cardiovascular disease (CVD). Methods and Results: We searched MEDLINE and EMBASE from inception to Janua...

  19. Graphical Modeling Meets Systems Pharmacology.

    PubMed

    Lombardo, Rosario; Priami, Corrado

    2017-01-01

    A main source of failure in systems projects (including systems pharmacology) is poor communication and mismatched expectations among the stakeholders. A common, unambiguous language that is naturally comprehensible by all the players involved is a boost to success. We present bStyle, a modeling tool that adopts a graphical language close enough to cartoons to be a common medium for exchanging ideas and data, and at the same time formal enough to enable modeling, analysis, and dynamic simulation of a system. Data analysis and simulation integrated in the same application are fundamental to understanding the mechanisms of action of drugs: a core aspect of systems pharmacology.

  20. Graphical Modeling Meets Systems Pharmacology

    PubMed Central

    Lombardo, Rosario; Priami, Corrado

    2017-01-01

    A main source of failure in systems projects (including systems pharmacology) is poor communication and mismatched expectations among the stakeholders. A common, unambiguous language that is naturally comprehensible by all the players involved is a boost to success. We present bStyle, a modeling tool that adopts a graphical language close enough to cartoons to be a common medium for exchanging ideas and data, and at the same time formal enough to enable modeling, analysis, and dynamic simulation of a system. Data analysis and simulation integrated in the same application are fundamental to understanding the mechanisms of action of drugs: a core aspect of systems pharmacology. PMID:28469411

  1. Reusable, extensible, and modifiable R scripts and Kepler workflows for comprehensive single set ChIP-seq analysis.

    PubMed

    Cormier, Nathan; Kolisnik, Tyler; Bieda, Mark

    2016-07-05

    There has been an enormous expansion of use of chromatin immunoprecipitation followed by sequencing (ChIP-seq) technologies. Analysis of large-scale ChIP-seq datasets involves a complex series of steps and production of several specialized graphical outputs. A number of systems have emphasized custom development of ChIP-seq pipelines. These systems are primarily based on custom programming of a single, complex pipeline or supply libraries of modules and do not produce the full range of outputs commonly produced for ChIP-seq datasets. It is desirable to have more comprehensive pipelines, in particular ones addressing common metadata tasks, such as pathway analysis, and pipelines producing standard complex graphical outputs. It is advantageous if these are highly modular systems, available as both turnkey pipelines and individual modules, that are easily comprehensible, modifiable and extensible to allow rapid alteration in response to new analysis developments in this growing area. Furthermore, it is advantageous if these pipelines allow data provenance tracking. We present a set of 20 ChIP-seq analysis software modules implemented in the Kepler workflow system; most (18/20) were also implemented as standalone, fully functional R scripts. The set consists of four full turnkey pipelines and 16 component modules. The turnkey pipelines in Kepler allow data provenance tracking. Implementation emphasized use of common R packages and widely-used external tools (e.g., MACS for peak finding), along with custom programming. This software presents comprehensive solutions and easily repurposed code blocks for ChIP-seq analysis and pipeline creation. Tasks include mapping raw reads, peakfinding via MACS, summary statistics, peak location statistics, summary plots centered on the transcription start site (TSS), gene ontology, pathway analysis, and de novo motif finding, among others. These pipelines range from those performing a single task to those performing full analyses of ChIP-seq data. The pipelines are supplied as both Kepler workflows, which allow data provenance tracking, and, in the majority of cases, as standalone R scripts. These pipelines are designed for ease of modification and repurposing.
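
    As a flavor of how such a module wraps an external tool, here is a minimal Python sketch that shells out to MACS (macs2) for the peak-finding step. The file names are placeholders, and the flags shown are standard macs2 callpeak options rather than these modules' exact invocation:

        import subprocess
        from pathlib import Path

        def call_peaks(treatment_bam, control_bam, name, outdir="peaks", genome="hs"):
            """Run MACS2 peak calling on a ChIP/input BAM pair (sketch)."""
            Path(outdir).mkdir(exist_ok=True)
            cmd = [
                "macs2", "callpeak",
                "-t", treatment_bam,   # ChIP (treatment) reads
                "-c", control_bam,     # input/control reads
                "-n", name,            # prefix for output files
                "-g", genome,          # effective genome size shortcut
                "--outdir", outdir,
                "-q", "0.05",          # FDR cutoff
            ]
            subprocess.run(cmd, check=True)

        # call_peaks("chip.bam", "input.bam", "sample1")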

  2. Mutational Signatures in Cancer (MuSiCa): a web application to implement mutational signatures analysis in cancer samples.

    PubMed

    Díaz-Gay, Marcos; Vila-Casadesús, Maria; Franch-Expósito, Sebastià; Hernández-Illán, Eva; Lozano, Juan José; Castellví-Bel, Sergi

    2018-06-14

    Mutational signatures have proven to be a valuable pattern in somatic genomics, mainly regarding cancer, with potential application as a biomarker in clinical practice. Up to now, several bioinformatic packages addressing this topic have been developed in different languages/platforms. MutationalPatterns has arisen as the most efficient tool for comparison with the signatures currently reported in the Catalogue of Somatic Mutations in Cancer (COSMIC) database. However, the analysis of mutational signatures is nowadays restricted to a small community of bioinformatics experts. In this work we present Mutational Signatures in Cancer (MuSiCa), a new web tool based on MutationalPatterns and built using the Shiny framework in R. By means of a simple interface suited to non-specialized researchers, it provides a comprehensive analysis of the somatic mutational status of the supplied cancer samples. It permits characterizing the profile and burden of mutations, as well as quantifying COSMIC-reported mutational signatures. It also allows classifying samples according to the above signature contributions. MuSiCa is a helpful web application to characterize mutational signatures in cancer samples. It is accessible online at http://bioinfo.ciberehd.org/GPtoCRC/en/tools.html and source code is freely available at https://github.com/marcos-diazg/musica .
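
    Comparison against COSMIC signatures usually reduces to cosine similarity between a sample's 96-channel mutation profile and each reference signature. A small numpy sketch of that measure (toy vectors, not COSMIC data):

        import numpy as np

        def cosine_similarity(profile, signature):
            """Cosine similarity between two 96-channel mutation count vectors."""
            p, s = np.asarray(profile, float), np.asarray(signature, float)
            return p @ s / (np.linalg.norm(p) * np.linalg.norm(s))

        rng = np.random.default_rng(0)
        sample = rng.poisson(5, size=96)                   # toy trinucleotide-context counts
        signatures = rng.dirichlet(np.ones(96), size=30)   # 30 toy reference signatures

        best = max(range(30), key=lambda i: cosine_similarity(sample, signatures[i]))
        print("closest signature:", best)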

  3. EuPathDB: the eukaryotic pathogen genomics database resource

    PubMed Central

    Aurrecoechea, Cristina; Barreto, Ana; Basenko, Evelina Y.; Brestelli, John; Brunk, Brian P.; Cade, Shon; Crouch, Kathryn; Doherty, Ryan; Falke, Dave; Fischer, Steve; Gajria, Bindu; Harb, Omar S.; Heiges, Mark; Hertz-Fowler, Christiane; Hu, Sufen; Iodice, John; Kissinger, Jessica C.; Lawrence, Cris; Li, Wei; Pinney, Deborah F.; Pulman, Jane A.; Roos, David S.; Shanmugasundram, Achchuthan; Silva-Franco, Fatima; Steinbiss, Sascha; Stoeckert, Christian J.; Spruill, Drew; Wang, Haiming; Warrenfeltz, Susanne; Zheng, Jie

    2017-01-01

    The Eukaryotic Pathogen Genomics Database Resource (EuPathDB, http://eupathdb.org) is a collection of databases covering 170+ eukaryotic pathogens (protists & fungi), along with relevant free-living and non-pathogenic species, and select pathogen hosts. To facilitate the discovery of meaningful biological relationships, the databases couple preconfigured searches with visualization and analysis tools for comprehensive data mining via intuitive graphical interfaces and APIs. All data are analyzed with the same workflows, including creation of gene orthology profiles, so data are easily compared across data sets, data types and organisms. EuPathDB is updated with numerous new analysis tools, features, data sets and data types. New tools include GO, metabolic pathway and word enrichment analyses plus an online workspace for analysis of personal, non-public, large-scale data. Expanded data content is mostly genomic and functional genomic data while new data types include protein microarray, metabolic pathways, compounds, quantitative proteomics, copy number variation, and polysomal transcriptomics. New features include consistent categorization of searches, data sets and genome browser tracks; redesigned gene pages; effective integration of alternative transcripts; and a EuPathDB Galaxy instance for private analyses of a user's data. Forthcoming upgrades include user workspaces for private integration of data with existing EuPathDB data and improved integration and presentation of host–pathogen interactions. PMID:27903906

  4. Primary care physicians' perspectives on computer-based health risk assessment tools for chronic diseases: a mixed methods study.

    PubMed

    Voruganti, Teja R; O'Brien, Mary Ann; Straus, Sharon E; McLaughlin, John R; Grunfeld, Eva

    2015-09-24

    Health risk assessment tools compute an individual's risk of developing a disease. Routine use of such tools by primary care physicians (PCPs) is potentially useful in chronic disease prevention. We sought physicians' awareness and perceptions of the usefulness, usability and feasibility of performing assessments with computer-based risk assessment tools in primary care settings. Focus groups and usability testing with a computer-based risk assessment tool were conducted with PCPs from both university-affiliated and community-based practices. Analysis was derived from grounded theory methodology. PCPs (n = 30) were aware of several risk assessment tools although only select tools were used routinely. The decision to use a tool depended on how use impacted practice workflow and whether the tool had credibility. Participants felt that embedding tools in the electronic medical records (EMRs) system might allow for health information from the medical record to auto-populate into the tool. User comprehension of risk could also be improved with computer-based interfaces that present risk in different formats. In this study, PCPs chose to use certain tools more regularly because of usability and credibility. Despite there being differences in the particular tools a clinical practice used, there was general appreciation for the usefulness of tools for different clinical situations. Participants characterised particular features of an ideal tool, feeling strongly that embedding risk assessment tools in the EMR would maximise accessibility and use of the tool for chronic disease management. However, appropriate practice workflow integration and features that facilitate patient understanding at point-of-care are also essential.

  5. Application of a novel metabolomic approach based on atmospheric pressure photoionization mass spectrometry using flow injection analysis for the study of Alzheimer's disease.

    PubMed

    González-Domínguez, Raúl; García-Barrera, Tamara; Gómez-Ariza, José Luis

    2015-01-01

    The use of atmospheric pressure photoionization is not widespread in metabolomics, despite its considerable potential for the simultaneous analysis of compounds with diverse polarities. This work considers the development of a novel analytical approach based on flow injection analysis and atmospheric pressure photoionization mass spectrometry for rapid metabolic screening of serum samples. Several experimental parameters were optimized, such as type of dopant, flow injection solvent, and their flows, given that a careful selection of these variables is mandatory for a comprehensive analysis of metabolites. Toluene and methanol were the most suitable dopant and flow injection solvent, respectively. Moreover, analysis in negative mode required higher solvent and dopant flows (100 µl min⁻¹ and 40 µl min⁻¹, respectively) compared to positive mode (50 µl min⁻¹ and 20 µl min⁻¹). Then, the optimized approach was used to elucidate metabolic alterations associated with Alzheimer's disease. Thereby, results confirm the increase of diacylglycerols, ceramides, ceramide-1-phosphate and free fatty acids, indicating membrane destabilization processes, and reduction of fatty acid amides and several neurotransmitters related to impairments in neuronal transmission, among others. Therefore, it could be concluded that this metabolomic tool presents a great potential for analysis of biological samples, considering its high-throughput screening capability, fast analysis and comprehensive metabolite coverage. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Confocal fluorescence techniques in industrial application

    NASA Astrophysics Data System (ADS)

    Eggeling, Christian; Gall, Karsten; Palo, Kaupo; Kask, Peet; Brand, Leif

    2003-06-01

    The FCS+plus family of evaluation tools for confocal fluorescence spectroscopy, developed over recent years, offers a comprehensive view of a series of fluorescence properties. Originating in fluorescence correlation spectroscopy (FCS) and using similar experimental equipment, a system of signal processing methods such as fluorescence intensity distribution analysis (FIDA) was created to analyze in detail the fluctuation behavior of fluorescent particles within a small detection area. Giving simultaneous access to molecular parameters like concentration, translational and rotational diffusion, molecular brightness, and multicolor coincidence, this portfolio was enhanced by the more traditional techniques of fluorescence lifetime and time-resolved anisotropy determination. The cornerstones of the FCS+plus methodology are briefly described. The inhibition of a phosphatase enzyme activity provides a representative industrial application that demonstrates the versatility of FCS+plus and its potential for pharmaceutical drug discovery.
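
    For reference, FCS fits the measured intensity autocorrelation to a diffusion model. For a single species diffusing through a 3-D Gaussian focus, the standard textbook form (a generic expression, not specific to FCS+plus) is

        G(\tau) = 1 + \frac{1}{\langle N \rangle}
                  \left(1 + \frac{\tau}{\tau_D}\right)^{-1}
                  \left(1 + \frac{\tau}{\kappa^{2}\,\tau_D}\right)^{-1/2}

    where ⟨N⟩ is the mean number of particles in the detection volume, τ_D the diffusion time, and κ the axial-to-lateral extension ratio of the focal volume.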

  7. Visualizing Java uncertainty

    NASA Astrophysics Data System (ADS)

    Knight, Claire; Munro, Malcolm

    2001-07-01

    Distributed component-based systems seem to be the immediate future of software development. The use of such techniques, object-oriented languages, and the combination with ever more powerful higher-level frameworks has led to the rapid creation and deployment of systems catering to the demand for internet- and service-driven business systems. This diversity of solutions, through both the components utilised and the physical/virtual locations of those components, can provide powerful resolutions to the new demand. The problem lies in the comprehension and maintenance of such systems, because they have inherent uncertainty: the components combined at any given time for a solution may differ, the messages generated, sent, and/or received may differ, and the physical/virtual locations cannot be guaranteed. Accounting for this uncertainty, and building it into analysis and comprehension tools, is important for both development and maintenance activities.

  8. Improved serial analysis of V1 ribosomal sequence tags (SARST-V1) provides a rapid, comprehensive, sequence-based characterization of bacterial diversity and community composition.

    PubMed

    Yu, Zhongtang; Yu, Marie; Morrison, Mark

    2006-04-01

    Serial analysis of ribosomal sequence tags (SARST) is a recently developed technology that can generate large 16S rRNA gene (rrs) sequence data sets from microbiomes, but there are numerous enzymatic and purification steps required to construct the ribosomal sequence tag (RST) clone libraries. We report here an improved SARST method, which still targets the V1 hypervariable region of rrs genes, but reduces the number of enzymes, oligonucleotides, reagents, and technical steps needed to produce the RST clone libraries. The new method, hereafter referred to as SARST-V1, was used to examine the eubacterial diversity present in community DNA recovered from the microbiome resident in the ovine rumen. The 190 sequenced clones contained 1055 RSTs and no less than 236 unique phylotypes (based on ≥95% sequence identity) that were assigned to eight different eubacterial phyla. Rarefaction and monomolecular curve analyses predicted that the complete RST clone library contains 99% of the 353 unique phylotypes predicted to exist in this microbiome. When compared with ribosomal intergenic spacer analysis (RISA) of the same community DNA sample, as well as a compilation of nine previously published conventional rrs clone libraries prepared from the same type of samples, the RST clone library provided a more comprehensive characterization of the eubacterial diversity present in rumen microbiomes. As such, SARST-V1 should be a useful tool applicable to comprehensive examination of diversity and composition in microbiomes and offers an affordable, sequence-based method for diversity analysis.
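
    Rarefaction, used above to judge library coverage, estimates the expected number of unique phylotypes in a random subsample of n tags. A small Python sketch of the standard estimator (the counts are toy values, not the paper's data):

        from math import comb

        def rarefaction(counts, n):
            """Expected number of unique phylotypes in a random subsample of size n.

            counts: tag counts per phylotype. Implements the standard formula
            E[S_n] = sum_i (1 - C(N - N_i, n) / C(N, n)), with N the total tags.
            """
            N = sum(counts)
            return sum(1 - comb(N - Ni, n) / comb(N, n) for Ni in counts)

        # Toy library: 5 phylotypes with uneven abundances.
        print(rarefaction([50, 20, 10, 5, 1], 10))  # expected richness at depth 10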

  9. Association of Indoor Smoke-Free Air Laws with Hospital Admissions for Acute Myocardial Infarction and Stroke in Three States

    PubMed Central

    Loomis, Brett R.; Juster, Harlan R.

    2012-01-01

    Objective. To examine whether comprehensive smoke-free air laws enacted in Florida, New York, and Oregon are associated with reductions in hospital admissions for acute myocardial infarction (AMI) and stroke. Methods. We analyzed trends in county-level, age-adjusted hospital admission rates for AMI and stroke from 1990 to 2006 (quarterly) for Florida, 1995 to 2006 (monthly) for New York, and 1998 to 2006 (monthly) for Oregon to identify any association between admission rates and passage of comprehensive smoke-free air laws. Interrupted time series analysis was used to adjust for the effects of preexisting moderate local-level laws, seasonal variation in hospital admissions, differences across counties, and a secular time trend. Results. More than 3 years after passage of statewide comprehensive smoke-free air laws, rates of hospitalization for AMI were reduced by 18.4% (95% CI: 8.8–28.0%) in Florida and 15.5% (95% CI: 11.0–20.1%) in New York. Rates of hospitalization for stroke were reduced by 18.1% (95% CI: 9.3–30.0%) in Florida. The few local comprehensive laws in Oregon were not associated with reductions in AMI or stroke statewide. Conclusion. Comprehensive smoke-free air laws are an effective policy tool for reducing the burden of AMI and stroke. PMID:22778759
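
    Interrupted time series analysis of this kind is commonly fit as a segmented regression: a secular trend, a level change at the law's effective date, and a post-law trend change. A minimal, hypothetical sketch with statsmodels (variable names invented; the study's actual model also adjusted for counties, seasonality, and preexisting local laws):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        t = np.arange(120)                      # months
        law = (t >= 60).astype(int)             # law takes effect at month 60
        rate = 50 - 0.05 * t - 6 * law + rng.normal(0, 2, 120)  # synthetic AMI rate

        df = pd.DataFrame({
            "rate": rate,
            "time": t,
            "law": law,
            "time_since_law": np.where(t >= 60, t - 60, 0),
        })

        # 'law' captures the level change; 'time_since_law' the slope change.
        fit = smf.ols("rate ~ time + law + time_since_law", data=df).fit()
        print(fit.params)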

  10. SpacePy - a Python-based library of tools for the space sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morley, Steven K; Welling, Daniel T; Koller, Josef

    Space science deals with the bodies within the solar system and the interplanetary medium; the primary focus is on atmospheres and above - at Earth, the short-timescale variation in the geomagnetic field, the Van Allen radiation belts and the deposition of energy into the upper atmosphere are key areas of investigation. SpacePy is a package for Python, targeted at the space sciences, that aims to make basic data analysis, modeling and visualization easier. It builds on the capabilities of the well-known NumPy and MatPlotLib packages. Publication-quality output direct from analyses is emphasized. The SpacePy project seeks to promote accurate and open research standards by providing an open environment for code development. In the space physics community there has long been a significant reliance on proprietary languages that restrict free transfer of data and reproducibility of results. By providing a comprehensive, open-source library of widely used analysis and visualization tools in a free, modern and intuitive language, we hope that this reliance will be diminished. SpacePy includes implementations of widely used empirical models, statistical techniques used frequently in space science (e.g. superposed epoch analysis), and interfaces to advanced tools such as electron drift shell calculations for radiation belt studies. SpacePy also provides analysis and visualization tools for components of the Space Weather Modeling Framework - currently this only includes the BATS-R-US 3-D magnetohydrodynamic model and the RAM ring current model - including streamline tracing in vector fields. Further development is currently underway. External libraries, which include well-known magnetic field models, high-precision time conversions and coordinate transformations, are wrapped for access from Python using SWIG and f2py. The rest of the tools have been implemented directly in Python. The provision of open-source tools to perform common tasks will provide openness in the analysis methods employed in scientific studies and will give access to advanced tools to all space scientists regardless of affiliation or circumstance.
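
    Superposed epoch analysis, one of the techniques named above, aligns a time series on a list of event epochs and averages across events. SpacePy ships its own implementation; the plain-numpy sketch below only illustrates the idea, with synthetic data:

        import numpy as np

        def superposed_epoch(series, epochs, window):
            """Average `series` over windows of +/- `window` samples around each epoch."""
            segments = [
                series[e - window : e + window + 1]
                for e in epochs
                if e - window >= 0 and e + window < len(series)
            ]
            return np.mean(segments, axis=0)   # one mean profile, epoch at the center

        rng = np.random.default_rng(2)
        x = rng.normal(0, 1, 5000)
        events = rng.integers(100, 4900, size=40)
        for e in events:
            x[e : e + 20] += 3.0               # inject a response after each event

        profile = superposed_epoch(x, events, window=50)
        print(profile[45:60].round(2))         # the enhancement appears after index 50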

  11. [Adaptations of psychotropic drugs in patients aged 75 years and older in a department of geriatric internal medicine: report of 100 cases].

    PubMed

    Couderc, Anne-Laure; Bailly-Agaledes, Cindy; Camalet, Joëlle; Capriz-Ribière, Françoise; Gary, André; Robert, Philippe; Brocker, Patrice; Guérin, Olivier

    2011-06-01

    The elderly, often with multiple diseases, are particularly at risk of adverse drug reactions. Nearly half of iatrogenic drug events in the elderly are preventable. Some medications, such as psychotropic drugs, are particularly involved in iatrogenic accidents. We wanted to know whether the tools of the comprehensive geriatric assessment, or other factors, could influence changes of psychotropic drugs in a geriatric department. Our four-month prospective study of 100 patients aged 75 years and older hospitalized in the Geriatric Internal Medicine Department of the University Hospital of Nice investigated the clinical or biological reasons for, and the tools used during, changes of psychotropic drugs. We compared these changes according to the comprehensive geriatric assessment tools and analyzed them against the lists of potentially inappropriate medications by Laroche et al. and the STOPP/START instrument. The Mini Mental State Examination (MMSE) was the tool that most influenced the changes in psychotropics, including a tendency to increase and introduce anxiolytics when MMSE < 20 (p = 0.007), while neuroleptics were instead stopped or decreased (p = 0.012). The comprehensive geriatric assessment has its place in decision support for potentially iatrogenic prescriptions of drugs such as psychotropics, and new tools such as STOPP/START can also assist the informed prescriber.

  12. School environment assessment tools to address behavioural risk factors of non-communicable diseases: A scoping review.

    PubMed

    Saluja, Kiran; Rawal, Tina; Bassi, Shalini; Bhaumik, Soumyadeep; Singh, Ankur; Park, Min Hae; Kinra, Sanjay; Arora, Monika

    2018-06-01

    We aimed to identify, describe and analyse school environment assessment (SEA) tools that address behavioural risk factors (unhealthy diet, physical inactivity, tobacco and alcohol consumption) for non-communicable diseases (NCD). We searched MEDLINE and Web of Science, hand-searched reference lists and contacted experts. Basic characteristics, measures assessed and measurement properties (validity, reliability, usability) of identified tools were extracted. We narratively synthesized the data and used content analysis to develop a list of measures used in the SEA tools. Twenty-four SEA tools were identified, mostly from developed countries. Of these, 15 were questionnaire-based, 8 were checklist- or observation-based tools, and one tool combined a checklist/observation-based approach with a telephone questionnaire. Only 1 SEA tool had components related to all four NCD risk factors, 2 SEA tools assessed three NCD risk factors (diet/nutrition, physical activity, tobacco), 10 SEA tools assessed two NCD risk factors (diet/nutrition and physical activity) and 11 SEA tools assessed only one NCD risk factor. Several measures were used in the tools to assess the four NCD risk factors, but tobacco and alcohol were only sparingly included. Measurement properties were reported for 14 tools. The review provides a comprehensive list of measures used in SEA tools, which could be a valuable resource to guide future development of such tools. A valid and reliable SEA tool that could simultaneously evaluate all NCD risk factors, tested in different settings with varying resource availability, is needed.

  13. Research resource: Update and extension of a glycoprotein hormone receptors web application.

    PubMed

    Kreuchwig, Annika; Kleinau, Gunnar; Kreuchwig, Franziska; Worth, Catherine L; Krause, Gerd

    2011-04-01

    The SSFA-GPHR (Sequence-Structure-Function-Analysis of Glycoprotein Hormone Receptors) database provides a comprehensive set of mutation data for the glycoprotein hormone receptors (covering the lutropin, FSH, and TSH receptors). Moreover, it provides a platform for comparison and investigation of these homologous receptors and helps in understanding protein malfunctions associated with several diseases. Besides the extension of the data set (>1100 mutations), the database has been completely redesigned and several novel features and analysis tools have been added to the web site. These tools allow the focused extraction of semiquantitative mutant data from the GPHR subtypes and different experimental approaches. Functional and structural data of the GPHRs are now linked interactively at the web interface, and new tools for data visualization (on three-dimensional protein structures) are provided. The interpretation of functional findings is supported by receptor morphings simulating intramolecular changes during the activation process, which thus help to trace the potential function of each amino acid and provide clues to the local structural environment, including potentially relocated spatial counterpart residues. Furthermore, double and triple mutations are newly included to allow the analysis of their functional effects in relation to their spatial interrelationship in structures or homology models. An important new feature is the search option and data visualization via interactive, user-defined snake plots. These new tools allow fast and easy searches for specific functional data and thereby give deeper insights into the mechanisms of hormone binding, signal transduction, and signaling regulation. The web application "Sequence-Structure-Function-Analysis of GPHRs" is accessible on the internet at http://www.ssfa-gphr.de/.

  14. Phases of ERA - Tools

    EPA Pesticide Factsheets

    Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  15. Status of Cognitive Testing of Adults in India

    PubMed Central

    Porrselvi, A. P.; Shankar, V.

    2017-01-01

    The assessment of cognitive function is a challenging yet integral component of psychological, psychiatric, and neurological evaluation. Cognitive assessment tools can either be administered quickly to screen for neurocognitive disorders, or be comprehensive and detailed to identify cognitive deficits for the purposes of localization, diagnosis, and rehabilitation. This article is a comprehensive review of published research that discusses the current challenges for cognitive testing in India, the available tools used for the assessment of cognitive function in India, and future directions for cognitive testing in India. PMID:29184333

  16. A Comprehensive Strategy for Accurate Mutation Detection of the Highly Homologous PMS2.

    PubMed

    Li, Jianli; Dai, Hongzheng; Feng, Yanming; Tang, Jia; Chen, Stella; Tian, Xia; Gorman, Elizabeth; Schmitt, Eric S; Hansen, Terah A A; Wang, Jing; Plon, Sharon E; Zhang, Victor Wei; Wong, Lee-Jun C

    2015-09-01

    Germline mutations in the DNA mismatch repair gene PMS2 underlie the cancer susceptibility syndrome, Lynch syndrome. However, accurate molecular testing of PMS2 is complicated by a large number of highly homologous sequences. To establish a comprehensive approach for mutation detection of PMS2, we have designed a strategy combining targeted capture next-generation sequencing (NGS), multiplex ligation-dependent probe amplification, and long-range PCR followed by NGS to simultaneously detect point mutations and copy number changes of PMS2. Exonic deletions (E2 to E9, E5 to E9, E8, E10, E14, and E1 to E15), duplications (E11 to E12), and a nonsense mutation, p.S22*, were identified. Traditional multiplex ligation-dependent probe amplification and Sanger sequencing approaches cannot differentiate the origin of the exonic deletions in the 3' region when PMS2 and PMS2CL share identical sequences as a result of gene conversion. Our approach allows unambiguous identification of mutations in the active gene with a straightforward long-range-PCR/NGS method. Breakpoint analysis of multiple samples revealed that recurrent exon 14 deletions are mediated by homologous Alu sequences. Our comprehensive approach provides a reliable tool for accurate molecular analysis of genes containing multiple copies of highly homologous sequences and should improve PMS2 molecular analysis for patients with Lynch syndrome. Copyright © 2015 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  17. Automated and comprehensive link engineering supporting branched, ring, and mesh network topologies

    NASA Astrophysics Data System (ADS)

    Farina, J.; Khomchenko, D.; Yevseyenko, D.; Meester, J.; Richter, A.

    2016-02-01

    Link design, while relatively easy in the past, can become quite cumbersome with complex channel plans and equipment configurations. The task of designing optical transport systems and selecting equipment is often performed by an applications or sales engineer using simple tools, such as custom Excel spreadsheets. Eventually, every individual has their own version of the spreadsheet as well as their own methodology for building the network. This approach becomes unmanageable very quickly and leads to mistakes, bending of the engineering rules and installations that do not perform as expected. We demonstrate a comprehensive planning environment, which offers an efficient approach to unify, control and expedite the design process by controlling libraries of equipment and engineering methodologies, automating the process and providing the analysis tools necessary to predict system performance throughout the system and for all channels. In addition to the placement of EDFAs and DCEs, performance analysis metrics are provided at every step of the way. Metrics that can be tracked include power, CD and OSNR, SPM, XPM, FWM and SBS. Automated routine steps assist in design aspects such as equalization, padding and gain setting for EDFAs, the placement of ROADMs and transceivers, and creating regeneration points. DWDM networks consisting of a large number of nodes and repeater huts, interconnected in linear, branched, mesh and ring network topologies, can be designed much faster when compared with conventional design methods. Using flexible templates for all major optical components, our technology-agnostic planning approach supports the constant advances in optical communications.
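
    One of the metrics tracked above, OSNR through a chain of amplified spans, has a widely used textbook approximation (0.1 nm reference bandwidth, N identical spans). The sketch below applies that generic formula, which is an assumption here rather than the tool's actual analysis engine:

        import math

        def cascade_osnr_db(p_ch_dbm, span_loss_db, nf_db, n_spans):
            """Classic link-budget estimate: OSNR = 58 + Pch - loss - NF - 10log10(N).

            p_ch_dbm: per-channel launch power into each span (dBm)
            span_loss_db: loss of one span (dB), fully compensated by the EDFA
            nf_db: EDFA noise figure (dB); the 58 dB constant comes from
            -10log10(h*nu*B_ref) at 1550 nm with a 0.1 nm reference bandwidth.
            """
            return 58.0 + p_ch_dbm - span_loss_db - nf_db - 10 * math.log10(n_spans)

        # Hypothetical design: 0 dBm/channel, 22 dB spans, NF 5.5 dB, 10 spans.
        print(f"{cascade_osnr_db(0.0, 22.0, 5.5, 10):.1f} dB OSNR")  # ~20.5 dB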

  18. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which aims to (i) provide a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) apply the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and rich-featured Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for instance, the global standards defined in the frame of the GEM project for seismic hazard and risk) will grant interoperability with other FOSS software and tools and, at the same time, keep the software within easy reach of the geo-scientific community. An already available example of connection is the BET_VH(**) tool, whose probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be used for the case study of the municipality of Naples, considering three different natural hazards (volcanic eruptions, earthquakes and tsunamis) and carrying out the consequent long-term risk evaluation. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is a probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to be run on the Vhub cyber-infrastructure, a free web-based collaborative tool in volcanology research (see http://vhub.org/resources/betvh).
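
    At its core, a single-hazard risk estimate of the kind BYMUR aggregates is an expected loss: hazard (occurrence probabilities over intensity bins), times fragility (probability of damage given intensity), times exposure. A generic numpy sketch of that composition, purely illustrative of the arithmetic rather than BYMUR's Bayesian machinery (all values invented):

        import numpy as np

        def expected_loss(occurrence_prob, damage_prob, exposure_value):
            """Expected loss = sum over intensity bins of
            P(intensity) * P(damage | intensity) * value exposed."""
            return float(np.sum(occurrence_prob * damage_prob) * exposure_value)

        # Hypothetical hazard discretized over 4 intensity bins:
        p_intensity = np.array([0.10, 0.05, 0.02, 0.005])   # annual occurrence probs
        fragility   = np.array([0.01, 0.10, 0.40, 0.80])    # P(damage | intensity)
        value       = 1.0e8                                 # exposed value (EUR)

        seismic = expected_loss(p_intensity, fragility, value)
        volcanic = expected_loss(p_intensity / 2, fragility, value)
        print(f"seismic: {seismic:.3g} EUR/yr, volcanic: {volcanic:.3g} EUR/yr")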

  19. General Pressurization Model in Simscape

    NASA Technical Reports Server (NTRS)

    Servin, Mario; Garcia, Vicky

    2010-01-01

    System integration is an essential part of the engineering design process. The Ares I Upper Stage (US) is a complex system made up of thousands of components assembled into subsystems, including a J2-X engine, liquid hydrogen (LH2) and liquid oxygen (LO2) tanks, avionics, thrust vector control, motors, etc. System integration is the task of connecting all of the subsystems into one large system. To ensure that all the components will "fit together", as well as to ensure safety and quality, integration analysis is required. Integration analysis verifies that, as an integrated system, the system will behave as designed. Models that represent the actual subsystems are built for more comprehensive analysis. Matlab has been an instrument widely used by engineers to construct mathematical models of systems. Simulink, one of the tools offered by Matlab, provides a multi-domain graphical environment to simulate and design time-varying systems. Simulink is a powerful tool for analyzing the dynamic behavior of systems over time. Furthermore, Simscape, a tool provided by Simulink, allows users to model physical (such as mechanical, thermal and hydraulic) systems using physical networks. Using Simscape, a model representing an inflow of gas to a pressurized tank was created, in which the temperature and pressure of the tank are measured over time to show the behavior of the gas. By further incorporating Simscape into model building, the full potential of this software can be discovered, and it can hopefully become a more widely utilized tool.
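
    The physics behind such a tank-charging model is a small ODE system: mass conservation plus an energy balance for an adiabatic vessel filled with an ideal gas, with pressure recovered from P = mRT/V. A Python sketch of that textbook model (all numbers hypothetical, and no Simscape involved):

        from scipy.integrate import solve_ivp

        R, GAMMA = 287.0, 1.4                # air: gas constant (J/kg/K), cp/cv
        V, T_IN, MDOT = 0.5, 300.0, 0.01     # tank volume (m^3), inlet temp (K), kg/s

        def rhs(t, y):
            m, T = y
            dm = MDOT
            # Adiabatic charging: d(m*cv*T)/dt = mdot*cp*T_in  =>  dT as below.
            dT = (MDOT / m) * (GAMMA * T_IN - T)
            return [dm, dT]

        sol = solve_ivp(rhs, (0, 60), y0=[0.1, 300.0])
        m, T = sol.y
        P = m * R * T / V                    # ideal-gas pressure history (Pa)
        print(f"after 60 s: T = {T[-1]:.0f} K, P = {P[-1] / 1e5:.2f} bar")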

  20. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    NASA Astrophysics Data System (ADS)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
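
    The equation such a tool solves is the 1-D advection-dispersion-reaction equation, dc/dt = -v dc/dx + D d2c/dx2 + r(c). A minimal explicit finite-difference sketch in Python for a single species with first-order decay (a generic illustration, not RT1D's VBA implementation):

        import numpy as np

        def advect_disperse_react(c, v, D, k, dx, dt, steps):
            """Explicit upwind/FTCS update for dc/dt = -v dc/dx + D d2c/dx2 - k c.

            Stable only for small dt (v*dt/dx <= 1 and D*dt/dx**2 <= 0.5).
            """
            c[0] = 1.0                                          # constant-concentration inlet
            for _ in range(steps):
                adv = -v * (c[1:-1] - c[:-2]) / dx              # upwind advection
                disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
                c[1:-1] += dt * (adv + disp - k * c[1:-1])
                c[-1] = c[-2]                                   # zero-gradient outlet
            return c

        c = np.zeros(101)   # 1 m column, dx = 1 cm
        c = advect_disperse_react(c, v=1e-4, D=1e-6, k=1e-5, dx=0.01, dt=10.0, steps=3600)
        print(c[::20].round(3))   # concentration profile after 10 h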

  1. Causal Influence of Articulatory Motor Cortex on Comprehending Single Spoken Words: TMS Evidence.

    PubMed

    Schomers, Malte R; Kirilina, Evgeniya; Weigand, Anne; Bajbouj, Malek; Pulvermüller, Friedemann

    2015-10-01

    Classic wisdom had been that motor and premotor cortex contribute to motor execution but not to higher cognition and language comprehension. In contrast, mounting evidence from neuroimaging, patient research, and transcranial magnetic stimulation (TMS) suggests sensorimotor interaction and, specifically, that the articulatory motor cortex is important for classifying meaningless speech sounds into phonemic categories. However, whether these findings speak to the comprehension issue is unclear, because language comprehension does not require explicit phonemic classification, and previous results may therefore relate to factors alien to semantic understanding. We here used the standard psycholinguistic test of spoken word comprehension, the word-to-picture-matching task, and concordant TMS to articulatory motor cortex. TMS pulses were applied to primary motor cortex controlling either the lips or the tongue as subjects heard critical word stimuli starting with bilabial lip-related or alveolar tongue-related stop consonants (e.g., "pool" or "tool"). A significant cross-over interaction showed that articulatory motor cortex stimulation delayed comprehension responses for phonologically incongruent words relative to congruent ones (i.e., lip-area TMS delayed "tool" relative to "pool" responses). As local TMS to articulatory motor areas differentially delays the comprehension of phonologically incongruent spoken words, we conclude that motor systems can take a causal role in semantic comprehension and, hence, higher cognition. © The Author 2014. Published by Oxford University Press.

  2. Information transfer satellite concept study. Volume 4: computer manual

    NASA Technical Reports Server (NTRS)

    Bergin, P.; Kincade, C.; Kurpiewski, D.; Leinhaupel, F.; Millican, F.; Onstad, R.

    1971-01-01

    The Satellite Telecommunications Analysis and Modeling Program (STAMP) provides the user with a flexible and comprehensive tool for the analysis of ITS system requirements. While obtaining minimum cost design points, the program enables the user to perform studies over a wide range of user requirements and parametric demands. The program utilizes a total system approach wherein the ground uplink and downlink, the spacecraft, and the launch vehicle are simultaneously synthesized. A steepest descent algorithm is employed to determine the minimum total system cost design subject to the fixed user requirements and imposed constraints. In the process of converging to the solution, the pertinent subsystem tradeoffs are resolved. This report documents STAMP through a technical analysis and a description of the principal techniques employed in the program.
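
    A steepest-descent minimization with constraints folded in as penalties, as described above, can be sketched in a few lines of Python; the cost function and numbers below are hypothetical stand-ins for STAMP's system-cost model:

        import numpy as np

        def descend(cost, x0, lr=0.02, tol=1e-8, max_iter=5000, h=1e-6):
            """Minimize cost(x) by steepest descent with a finite-difference gradient."""
            x = np.asarray(x0, float)
            for _ in range(max_iter):
                grad = np.array([
                    (cost(x + h * e) - cost(x - h * e)) / (2 * h)
                    for e in np.eye(len(x))
                ])
                if np.linalg.norm(grad) < tol:
                    break
                x -= lr * grad
            return x

        # Toy system cost: a two-variable power/antenna-size trade, plus a quadratic
        # penalty keeping a stand-in link-margin constraint (x*y >= 1) satisfied.
        cost = lambda x: x[0]**2 + 2 * x[1]**2 + 10.0 * max(0.0, 1.0 - x[0] * x[1])**2
        print(descend(cost, [0.5, 0.5]).round(3))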

  3. Identification of metabolic pathways using pathfinding approaches: a systematic review.

    PubMed

    Abd Algfoor, Zeyad; Shahrizal Sunar, Mohd; Abdullah, Afnizanfaizal; Kolivand, Hoshang

    2017-03-01

    Metabolic pathways have become increasingly available for various microorganisms. Such pathways have spurred the development of a wide array of computational tools, in particular mathematical pathfinding approaches. This article facilitates the understanding of computational analysis of metabolic pathways in genomics. Stoichiometric and pathfinding approaches in metabolic pathway analysis are discussed, and three major types of studies are elaborated: stoichiometric identification models, pathway-based graph analysis and pathfinding approaches in cellular metabolism. Furthermore, an evaluation of the outcomes of the pathways with mathematical benchmarking metrics is provided. This review should lead to a better comprehension of metabolic behavior in living cells, in terms of the computed pathfinding approaches. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
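
    In graph-based pathfinding, a metabolic network is a directed graph of metabolites connected by reactions, and the simplest question is the shortest conversion route between two compounds. A toy breadth-first search sketch (the miniature network is invented for illustration):

        from collections import deque

        def shortest_path(graph, source, target):
            """Breadth-first search over a metabolite graph; returns one shortest path."""
            queue, seen = deque([[source]]), {source}
            while queue:
                path = queue.popleft()
                if path[-1] == target:
                    return path
                for nxt in graph.get(path[-1], []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(path + [nxt])
            return None

        # Miniature (hypothetical) reaction network: metabolite -> reachable metabolites.
        toy_network = {
            "glucose": ["g6p"],
            "g6p": ["f6p", "6pg"],
            "f6p": ["fbp"],
            "fbp": ["dhap", "g3p"],
            "6pg": ["ru5p"],
        }
        print(shortest_path(toy_network, "glucose", "g3p"))
        # ['glucose', 'g6p', 'f6p', 'fbp', 'g3p']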

  4. MFV-class: a multi-faceted visualization tool of object classes.

    PubMed

    Zhang, Zhi-meng; Pan, Yun-he; Zhuang, Yue-ting

    2004-11-01

    Classes are key software components in an object-oriented software system. In many industrial OO software systems, there are classes that have a complicated structure and complicated relationships, so in software maintenance, testing, reengineering, reuse and restructuring it is a challenge for software engineers to understand these classes thoroughly. This paper proposes a class comprehension model based on constructivist learning theory and implements a software visualization tool (MFV-Class) to help in the comprehension of a class. The tool provides multiple views of a class to uncover manifold facets of its contents. It enables visualizing three object-oriented metrics of classes to help users focus the understanding process. A case study was conducted to evaluate our approach and the toolkit.
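
    Class-level OO metrics of the kind visualized here are simple counts over the class structure; for instance, a weighted-methods-per-class style count can be computed from source with Python's ast module. The snippet below is a generic illustration (which three metrics MFV-Class actually shows is not specified here):

        import ast

        def class_metrics(source):
            """Per-class counts: methods, attributes assigned in __init__, base classes."""
            metrics = {}
            for node in ast.walk(ast.parse(source)):
                if isinstance(node, ast.ClassDef):
                    methods = [n for n in node.body if isinstance(n, ast.FunctionDef)]
                    attrs = {
                        t.attr
                        for m in methods if m.name == "__init__"
                        for a in ast.walk(m) if isinstance(a, ast.Assign)
                        for t in a.targets if isinstance(t, ast.Attribute)
                    }
                    metrics[node.name] = {
                        "methods": len(methods),
                        "attributes": len(attrs),
                        "bases": len(node.bases),
                    }
            return metrics

        src = "class Point:\n    def __init__(self):\n        self.x = 0\n        self.y = 0\n"
        print(class_metrics(src))  # {'Point': {'methods': 1, 'attributes': 2, 'bases': 0}}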

  5. Beyond Readability: Investigating Coherence of Clinical Text for Consumers

    PubMed Central

    Hetzel, Scott; Dalrymple, Prudence; Keselman, Alla

    2011-01-01

    Background A basic tenet of consumer health informatics is that understandable health resources empower the public. Text comprehension holds great promise for helping to characterize consumer problems in understanding health texts. The need for efficient ways to assess consumer-oriented health texts and the availability of computationally supported tools led us to explore the effect of various text characteristics on readers’ understanding of health texts, as well as to develop novel approaches to assessing these characteristics. Objective The goal of this study was to compare the impact of two different approaches to enhancing readability, implemented as three interventions, on individuals’ comprehension of short, complex passages of health text. Methods Participants were 80 university staff, faculty, or students. Each participant was asked to “retell” the content of two health texts: one a clinical trial in the domain of diabetes mellitus, and the other a typical Visit Note. These texts were transformed for the intervention arms of the study. Two interventions provided terminology support via (1) a standard dictionary or (2) contextualized vocabulary definitions. The third intervention provided coherence improvement. We assessed participants’ comprehension of the clinical texts through propositional analysis, an open-ended questionnaire, and analysis of the number of errors made. Results For the clinical trial text, the effect of text condition was not significant in any of the comparisons, suggesting no differences in recall, despite the varying levels of support (P = .84). For the Visit Note, however, the difference in the median total propositions recalled between the Coherent and the (Original + Dictionary) conditions was significant (P = .04). This suggests that participants in the Coherent condition recalled more of the original Visit Notes content than did participants in the Original and the Dictionary conditions combined. However, no difference was seen between (Original + Dictionary) and Vocabulary (P = .36) or between Coherent and Vocabulary (P = .62). No statistically significant effect of any document transformation was found either in the open-ended questionnaire (clinical trial: P = .86, Visit Note: P = .20) or in the error rate (clinical trial: P = .47, Visit Note: P = .25). However, post hoc power analysis suggested that increasing the sample size by approximately 6 participants per condition would result in a significant difference for the Visit Note, but not for the clinical trial text. Conclusions Statistically, the results of this study indicate that improving coherence has a small effect on consumer comprehension of clinical text, but the task is extremely labor intensive and not scalable. Further research is needed using texts from more diverse clinical domains and more heterogeneous participants, including actual patients. Since comprehensibility of clinical text appears difficult to automate, informatics support tools may most productively support the health care professionals tasked with making clinical information understandable to patients. PMID:22138127
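
    For readers unfamiliar with the statistics reported above, the following sketch shows how such a median-recall comparison might be run as a nonparametric Mann-Whitney U test. The recall counts and group sizes are simulated; the study's exact analysis may have differed.

```python
# Compare simulated proposition-recall counts between a Coherent group
# and a pooled Original + Dictionary group with a Mann-Whitney U test.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
coherent = rng.poisson(14, 27)            # simulated recall counts
original_plus_dict = rng.poisson(11, 53)  # pooled comparison group

stat, p = mannwhitneyu(coherent, original_plus_dict, alternative="two-sided")
print(f"U = {stat}, P = {p:.3f}")
```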

  6. Self-report pain and symptom measures for primary dysmenorrhoea: a critical review.

    PubMed

    Chen, C X; Kwekkeboom, K L; Ward, S E

    2015-03-01

    Primary dysmenorrhoea (PD) is highly prevalent among women of reproductive age and can have significant short- and long-term consequences for both women and society as a whole. Validated symptom measures are fundamental for researchers to understand women's symptom experience of PD and to test symptom interventions. The objective of this paper was to critically review the content and psychometric properties of self-report tools used to measure symptoms of PD. Databases including PubMed, PsycINFO, the Cumulative Index to Nursing and Allied Health Literature, and Health and Psychosocial Instruments were searched for self-report symptom measures that had been used among women with either PD or perimenstrual symptoms. A total of 15 measures met the inclusion criteria and were included in the final analysis. The measures were categorized into generic pain measures, dysmenorrhoea-specific measures, and tools designed to measure perimenstrual symptoms. These measures had varying degrees of comprehensiveness of the symptoms measured, relevance to PD, multidimensionality, and psychometric soundness. No single measure was found to be optimal for use, but some dysmenorrhoea-specific measures could be recommended if revised and further tested. Key issues in symptom measurement for PD are discussed. Future research needs to strengthen dysmenorrhoea-specific symptom measures by including a comprehensive list of symptoms based on the pathogenesis of PD, exploring relevant symptom dimensions beyond symptom severity (e.g., frequency, duration, symptom distress), and testing the psychometric properties of the adapted tools using sound methodology and diverse samples. © 2014 European Pain Federation - EFIC®

  7. Thermal Analysis of Magnetically-Coupled Pump for Cryogenic Applications

    NASA Technical Reports Server (NTRS)

    Senocak, Inanc; Udaykumar, H. S.; Ndri, Narcisse; Francois, Marianne; Shyy, Wei

    1999-01-01

    A magnetically-coupled pump is under evaluation at Kennedy Space Center for possible cryogenic applications. A major concern is the impact of low-temperature fluid flows on pump performance. As a first step toward addressing this and related issues, a computational fluid dynamics and heat transfer tool has been applied to the pump geometry. The computational tool includes (i) a commercial grid generator to handle multiple grid blocks and complicated geometric definitions, and (ii) in-house computational fluid dynamics and heat transfer software developed in the Principal Investigator's group at the University of Florida. Both pure-conduction and combined convection-conduction computations have been conducted. The pure-conduction analysis gives insufficient information about the overall thermal distribution, whereas the combined convection-conduction analysis indicates the significant influence of the coolant over the entire flow path. Since 2-D simulation is of limited help, future work on full 3-D, multi-material modeling of the pump is needed; such a comprehensive and accurate model can take into account the effect of multi-phase flow in the cooling flow loop and the magnetic interactions.
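
    As a minimal illustration of a pure-conduction analysis, the sketch below solves a steady 2-D Laplace problem by Jacobi iteration on an invented rectangular domain with a fixed warm wall and a fixed cryogenic wall. The geometry and boundary temperatures are ours for illustration; the actual pump model used multi-block grids and conjugate convection-conduction.

```python
# Toy 2-D steady heat-conduction solve (Jacobi iteration, Laplace eq.).
import numpy as np

T = np.zeros((50, 50))
T[0, :] = 300.0     # warm housing wall (K)
T[-1, :] = 80.0     # cryogenic fluid side (K)
# Side walls stay at 0 K here purely for simplicity of the sketch.

for _ in range(5000):
    T_new = T.copy()
    # Each interior node relaxes toward the average of its neighbors.
    T_new[1:-1, 1:-1] = 0.25 * (T[2:, 1:-1] + T[:-2, 1:-1] +
                                T[1:-1, 2:] + T[1:-1, :-2])
    if np.max(np.abs(T_new - T)) < 1e-4:   # crude convergence check
        T = T_new
        break
    T = T_new

print("interior temperature range:", T[1:-1, 1:-1].min(), T[1:-1, 1:-1].max())
```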

  8. Query2Question: Translating Visualization Interaction into Natural Language.

    PubMed

    Nafari, Maryam; Weaver, Chris

    2015-06-01

    Richly interactive visualization tools are increasingly popular for data exploration and analysis in a wide variety of domains. Existing systems and techniques for recording provenance of interaction focus either on comprehensive automated recording of low-level interaction events or on idiosyncratic manual transcription of high-level analysis activities. In this paper, we present the architecture and translation design of a query-to-question (Q2Q) system that automatically records user interactions and presents them semantically using natural language (written English). Q2Q takes advantage of domain knowledge and uses natural language generation (NLG) techniques to translate and transcribe a progression of interactive visualization states into a visual log of styled text that complements and effectively extends the functionality of visualization tools. We present Q2Q as a means to support a cross-examination process in which questions, rather than interactions, are the focus of analytic reasoning and action. We describe the architecture and implementation of the Q2Q system, discuss key design factors and variations that affect question generation, and present several visualizations that incorporate Q2Q for analysis in a variety of knowledge domains.
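
    A drastically simplified version of the query-to-question idea can be written as template-based generation over logged interaction events, as below. The event names and templates are invented for illustration; Q2Q's actual NLG pipeline is domain-aware and far richer.

```python
# Map logged visualization interactions to English questions via templates.
TEMPLATES = {
    "filter":  "What are the {item}s whose {field} is {value}?",
    "sort":    "Which {item}s rank highest by {field}?",
    "details": "What are the details of {item} '{value}'?",
}

def to_question(event: dict) -> str:
    # Fill the template for this action with the event's parameters.
    return TEMPLATES[event["action"]].format(**event["params"])

log = [
    {"action": "filter",
     "params": {"item": "county", "field": "population", "value": "> 100000"}},
    {"action": "sort",
     "params": {"item": "county", "field": "median income"}},
]
for event in log:
    print(to_question(event))
```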

  9. Metabolomics study of Saw palmetto extracts based on 1H NMR spectroscopy.

    PubMed

    de Combarieu, Eric; Martinelli, Ernesto Marco; Pace, Roberto; Sardone, Nicola

    2015-04-01

    Preparations containing Saw palmetto extracts are used in traditional medicine to treat benign prostatic hyperplasia. According to the European and American Pharmacopoeias, the extract is obtained from comminuted Saw palmetto berries by a suitable extraction procedure using ethanol, supercritical carbon dioxide, or a mixture of n-hexane and methylpentanes. In the present study, a metabolomics profiling approach based on nuclear magnetic resonance (NMR) was used as a fingerprinting tool to assess the overall composition of the extracts. The phytochemical analysis, coupled with principal component analysis (PCA), showed that the extracts obtained with carbon dioxide and hexane have the same composition, with minor, non-significant differences for the extracts obtained with ethanol; these differences are in any case smaller than the batch-to-batch variability ascribable to the naturally occurring variability in the phytochemical composition of Saw palmetto fruits. Fingerprinting analysis combined with chemometric methods thus provides a tool for comprehensive quality control of Saw palmetto extracts. Copyright © 2015 Elsevier B.V. All rights reserved.
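
    The chemometric step can be illustrated with a small PCA on simulated binned 1H NMR spectra: rows are extract batches, columns are spectral bins, and clustering in PC space suggests compositional groups. The data below are simulated; real analyses normalize and bin measured spectra first.

```python
# PCA fingerprinting of simulated NMR spectra with scikit-learn.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
base = rng.random(200)                       # shared spectral profile
spectra = np.vstack([
    base + rng.normal(0, 0.02, 200)          # batch-to-batch noise
    for _ in range(12)
])
spectra[8:] += 0.05 * rng.random(200)        # e.g., a different solvent class

scores = PCA(n_components=2).fit_transform(spectra)
print(scores.round(2))  # clusters in PC space suggest compositional groups
```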

  10. Primary Health Care Evaluation: the view of clients and professionals about the Family Health Strategy.

    PubMed

    da Silva, Simone Albino; Baitelo, Tamara Cristina; Fracolli, Lislaine Aparecida

    2015-01-01

    To evaluate the attributes of primary health care (access, longitudinality, comprehensiveness, coordination, family counseling, and community counseling) in the Family Health Strategy, triangulating and comparing the views of the stakeholders involved in the care process. This was an evaluative study with a quantitative approach and cross-sectional design. Data were collected using the Primary Care Assessment Tool in interviews with 527 adult clients, 34 health professionals, and 330 parents of children up to two years old, covering 33 family health teams in eleven municipalities. Analysis was conducted in the Statistical Package for the Social Sciences software, with a 95% confidence interval and an error of 0.1. All three groups gave low scores to first-contact access (accessibility). Professionals gave high scores to the other attributes, whereas clients gave low scores to community counseling, family counseling, comprehensiveness (services rendered), and comprehensiveness (available services). The quality of performance self-reported by Family Health Strategy professionals is therefore not perceived or valued by clients, and the actions and services may have been developed inappropriately or insufficiently to register in clients' experience.

  11. A comprehensive map of the mTOR signaling network

    PubMed Central

    Caron, Etienne; Ghosh, Samik; Matsuoka, Yukiko; Ashton-Beaucage, Dariel; Therrien, Marc; Lemieux, Sébastien; Perreault, Claude; Roux, Philippe P; Kitano, Hiroaki

    2010-01-01

    The mammalian target of rapamycin (mTOR) is a central regulator of cell growth and proliferation. mTOR signaling is frequently dysregulated in oncogenic cells and is thus an attractive target for anticancer therapy. Using CellDesigner, modeling support software for graphical notation, we present herein a comprehensive map of the mTOR signaling network, which includes 964 species connected by 777 reactions. The map complies with the systems biology markup language (SBML) for computational analysis and with the systems biology graphical notation (SBGN) for graphical representation. As captured in the mTOR map, we review and discuss our current understanding of the mTOR signaling network and highlight the impact of mTOR feedback and crosstalk regulation on drug-based cancer therapy. The map is available on the Payao platform, which supports a Web 2.0-based, community-wide interactive process for creating more accurate and information-rich databases. Thus, this comprehensive map of the mTOR network will serve as a tool to facilitate systems-level study of up-to-date mTOR network components and signaling events toward the discovery of novel regulatory processes and therapeutic strategies for cancer. PMID:21179025
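
    Since the map is distributed in SBML, it can be inspected programmatically. The hedged sketch below uses python-libsbml to count species and reactions, assuming the map has been downloaded locally as 'mtor_map.xml' (the filename is ours, not the authors').

```python
# Inspect an SBML-encoded network with python-libsbml.
import libsbml

doc = libsbml.readSBML("mtor_map.xml")   # hypothetical local copy of the map
model = doc.getModel()
if model is None:
    raise SystemExit("could not parse SBML document")

# For the published map this should report 964 species and 777 reactions.
print("species:  ", model.getNumSpecies())
print("reactions:", model.getNumReactions())
for i in range(min(5, model.getNumReactions())):
    rxn = model.getReaction(i)
    print(rxn.getId(), "->", rxn.getNumProducts(), "product(s)")
```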

  12. Missing Value Imputation Approach for Mass Spectrometry-based Metabolomics Data.

    PubMed

    Wei, Runmin; Wang, Jingye; Su, Mingming; Jia, Erik; Chen, Shaoqiu; Chen, Tianlu; Ni, Yan

    2018-01-12

    Missing values are widespread in mass spectrometry (MS)-based metabolomics data. Various methods have been applied to handle them, but the choice of method can significantly affect downstream data analyses. Typically, there are three types of missing values: missing not at random (MNAR), missing at random (MAR), and missing completely at random (MCAR). Our study comprehensively compared eight imputation methods (zero, half minimum (HM), mean, median, random forest (RF), singular value decomposition (SVD), k-nearest neighbors (kNN), and quantile regression imputation of left-censored data (QRILC)) for different types of missing values using four metabolomics datasets. Normalized root mean squared error (NRMSE) and NRMSE-based sum of ranks (SOR) were applied to evaluate imputation accuracy. Principal component analysis (PCA)/partial least squares (PLS)-Procrustes analysis was used to evaluate the overall sample distribution, and Student's t-test followed by correlation analysis was conducted to evaluate the effects on univariate statistics. Our findings demonstrate that RF performed best for MCAR/MAR and that QRILC was favored for left-censored MNAR. Finally, we propose a comprehensive strategy and have developed a publicly accessible web tool for missing value imputation in metabolomics ( https://metabolomics.cc.hawaii.edu/software/MetImp/ ).
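
    The evaluation scheme described above can be sketched as follows: mask observed values completely at random (MCAR), impute with several candidate methods, and rank them by NRMSE. Only a subset of the eight methods is shown (zero, half-minimum, mean, kNN), and this is an illustration under our own toy data, not the authors' code.

```python
# Compare imputation methods on simulated metabolomics data by NRMSE.
import numpy as np
from sklearn.impute import KNNImputer

rng = np.random.default_rng(3)
X = np.exp(rng.normal(0, 1, (100, 20)))        # toy metabolite intensities
mask = rng.random(X.shape) < 0.15              # 15% MCAR missingness
X_missing = np.where(mask, np.nan, X)

def nrmse(X_imp):
    # Root mean squared error over masked entries, normalized by the
    # standard deviation of the true (held-out) values.
    diff = (X_imp - X)[mask]
    return np.sqrt(np.mean(diff**2)) / np.std(X[mask])

candidates = {
    "zero": np.nan_to_num(X_missing, nan=0.0),
    "half-min": np.where(mask, np.nanmin(X_missing, axis=0) / 2, X_missing),
    "mean": np.where(mask, np.nanmean(X_missing, axis=0), X_missing),
    "kNN": KNNImputer(n_neighbors=5).fit_transform(X_missing),
}
for name, X_imp in sorted(candidates.items(), key=lambda kv: nrmse(kv[1])):
    print(f"{name:9s} NRMSE = {nrmse(X_imp):.3f}")
```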

  13. MIPS: analysis and annotation of proteins from whole genomes

    PubMed Central

    Mewes, H. W.; Amid, C.; Arnold, R.; Frishman, D.; Güldener, U.; Mannhaupt, G.; Münsterkötter, M.; Pagel, P.; Strack, N.; Stümpflen, V.; Warfsmann, J.; Ruepp, A.

    2004-01-01

    The Munich Information Center for Protein Sequences (MIPS-GSF), Neuherberg, Germany, provides protein sequence-related information based on whole-genome analysis. The main focus of the work is directed toward the systematic organization of sequence-related attributes as gathered by a variety of algorithms, primary information from experimental data together with information compiled from the scientific literature. MIPS maintains automatically generated and manually annotated genome-specific databases, develops systematic classification schemes for the functional annotation of protein sequences and provides tools for the comprehensive analysis of protein sequences. This report updates the information on the yeast genome (CYGD), the Neurospora crassa genome (MNCDB), the database of complete cDNAs (German Human Genome Project, NGFN), the database of mammalian protein–protein interactions (MPPI), the database of FASTA homologies (SIMAP), and the interface for the fast retrieval of protein-associated information (QUIPOS). The Arabidopsis thaliana database, the rice database, the plant EST databases (MATDB, MOsDB, SPUTNIK), as well as the databases for the comprehensive set of genomes (PEDANT genomes) are described elsewhere in the 2003 and 2004 NAR database issues, respectively. All databases described, and the detailed descriptions of our projects can be accessed through the MIPS web server (http://mips.gsf.de). PMID:14681354

  14. The Development of a Visual-Perceptual Chemistry Specific (VPCS) Assessment Tool

    ERIC Educational Resources Information Center

    Oliver-Hoyo, Maria; Sloan, Caroline

    2014-01-01

    The development of the Visual-Perceptual Chemistry Specific (VPCS) assessment tool is based on items that align to eight visual-perceptual skills considered as needed by chemistry students. This tool includes a comprehensive range of visual operations and presents items within a chemistry context without requiring content knowledge to solve…

  15. Interactive and Authentic e-Learning Tools for Criminal Justice Education

    ERIC Educational Resources Information Center

    Miner-Romanoff, Karen; McCombs, Jonathan; Chongwony, Lewis

    2017-01-01

    This mixed-method study tested the effectiveness of two experiential e-learning tools for criminal justice courses. The first tool was a comprehensive video series, including a criminal trial and interviews with the judge, defense counsel, prosecution, investigators and court director (virtual trial), in order to enhance course and learning…

  16. Exposure Assessment Tools by Media - Air

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  17. Aviation Environmental Design Tool (AEDT) System Architecture

    DOT National Transportation Integrated Search

    2007-01-29

    The Federal Aviation Administration's Office of Environment and Energy (FAA-AEE) is developing a comprehensive suite of software tools that will allow for thorough assessment of the environmental effects of aviation. The main goal of the effort is ...

  18. Exposure Assessment Tools by Routes - Inhalation

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  19. Exposure Assessment Tools by Chemical Classes

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  1. Exposure Assessment Tools by Routes

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  2. Exposure Assessment Tools by Media - Food

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  3. Exposure Assessment Tools by Media

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  4. Exposure Assessment Tools by Routes - Ingestion

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  5. Exposure Assessment Tools by Approaches

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  6. A Multi-Pronged Plan

    ERIC Educational Resources Information Center

    Starkman, Neal

    2007-01-01

    As schools adopt new and varied technologies to protect the campus community, the need to look at security tools in terms of a comprehensive, layered, and integrated strategy becomes clear. This article discusses how schools are using these security tools.

  7. Assessment of the Aviation Environmental Design Tool

    DOT National Transportation Integrated Search

    2009-06-29

    A comprehensive Tools Suite to allow for thorough evaluation of the environmental effects and impacts of aviation is currently being developed by the U.S. This suite consists of the Environmental Design Space (EDS), the Aviation Environmental...

  8. EPA EcoBox Tools by Stressors - Biological

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  9. Exposure Assessment Tools by Routes - Dermal

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  10. EPA EcoBox Tools by Stressors - Physical

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  11. EPA EcoBox Tools by Stressors

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  12. EPA EcoBox Tools by Effects - Aquatic

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  13. EPA EcoBox Tools by Exposure Pathways

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  14. EPA EcoBox Tools by Effects - References

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  15. EPA EcoBox Tools by Stressors - References

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  16. EPA EcoBox Tools by Stressors - Chemical

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  17. EPA EcoBox Tools by Effects

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  18. EPA EcoBox Tools by Receptors - Biota

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  19. EPA EcoBox Tools by Receptors

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  20. EPA EcoBox Tools by Receptors - References

    EPA Pesticide Factsheets

    EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
