Sample records for integrated tool set

  1. GARNET--gene set analysis with exploration of annotation relations.

    PubMed

    Rho, Kyoohyoung; Kim, Bumjin; Jang, Youngjun; Lee, Sanghyun; Bae, Taejeong; Seo, Jihae; Seo, Chaehwa; Lee, Jihyun; Kang, Hyunjung; Yu, Ungsik; Kim, Sunghoon; Lee, Sanghyuk; Kim, Wan Kyu

    2011-02-15

    Gene set analysis is a powerful method of deducing biological meaning for an a priori defined set of genes. Numerous tools have been developed to test statistical enrichment or depletion in specific pathways or gene ontology (GO) terms. Major difficulties towards biological interpretation are integrating diverse types of annotation categories and exploring the relationships between annotation terms of similar information. GARNET (Gene Annotation Relationship NEtwork Tools) is an integrative platform for gene set analysis with many novel features. It includes tools for retrieval of genes from annotation database, statistical analysis & visualization of annotation relationships, and managing gene sets. In an effort to allow access to a full spectrum of amassed biological knowledge, we have integrated a variety of annotation data that include the GO, domain, disease, drug, chromosomal location, and custom-defined annotations. Diverse types of molecular networks (pathways, transcription and microRNA regulations, protein-protein interaction) are also included. The pair-wise relationship between annotation gene sets was calculated using kappa statistics. GARNET consists of three modules--gene set manager, gene set analysis and gene set retrieval, which are tightly integrated to provide virtually automatic analysis for gene sets. A dedicated viewer for annotation network has been developed to facilitate exploration of the related annotations. GARNET (gene annotation relationship network tools) is an integrative platform for diverse types of gene set analysis, where complex relationships among gene annotations can be easily explored with an intuitive network visualization tool (http://garnet.isysbio.org/ or http://ercsb.ewha.ac.kr/garnet/).
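
    The kappa-based scoring of pairwise relationships between annotation gene sets mentioned above can be illustrated with a small, self-contained sketch. This is not GARNET's code; the helper name annotation_kappa, the gene identifiers, and the set sizes are assumptions made purely for illustration.

    ```python
    # Minimal sketch (not GARNET's code) of the kappa statistic used to score
    # the pairwise overlap between two annotation gene sets over a gene universe.
    # Gene identifiers and set contents below are made-up illustrations.

    def annotation_kappa(set_a, set_b, universe):
        """Cohen's kappa between two gene sets viewed as binary labels over a gene universe."""
        genes = list(universe)
        n = len(genes)
        a = [g in set_a for g in genes]
        b = [g in set_b for g in genes]
        both = sum(1 for x, y in zip(a, b) if x and y)
        neither = sum(1 for x, y in zip(a, b) if not x and not y)
        po = (both + neither) / n                       # observed agreement
        pa, pb = sum(a) / n, sum(b) / n
        pe = pa * pb + (1 - pa) * (1 - pb)              # agreement expected by chance
        return (po - pe) / (1 - pe) if pe < 1 else 1.0

    universe = {f"gene{i}" for i in range(1000)}
    go_term = {f"gene{i}" for i in range(0, 120)}       # hypothetical GO annotation set
    pathway = {f"gene{i}" for i in range(40, 160)}      # hypothetical pathway annotation set
    print(round(annotation_kappa(go_term, pathway, universe), 3))
    ```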

  2. Toolpack mathematical software development environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osterweil, L.

    1982-07-21

    The purpose of this research project was to produce a well integrated set of tools for the support of numerical computation. The project entailed the specification, design and implementation of both a diversity of tools and an innovative tool integration mechanism. This large configuration of tightly integrated tools comprises an environment for numerical software development, and has been named Toolpack/IST (Integrated System of Tools). Following the creation of this environment in prototype form, the environment software was readied for widespread distribution by transitioning it to a development organization for systematization, documentation and distribution. It is expected that public release of Toolpack/IST will begin imminently and will provide a basis for evaluation of the innovative software approaches taken as well as a uniform set of development tools for the numerical software community.

  3. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  4. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
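
    MetaboTools itself is Matlab code built on the COBRA Toolbox, so the snippet below is only a hedged Python illustration of the underlying idea described in the abstract: constraining exchange-reaction bounds with measured extracellular uptake and secretion rates and then optimizing the model. It uses the COBRApy package; the SBML file name, reaction identifiers, and rates are assumptions.

    ```python
    # Hedged sketch of the general idea behind integrating extracellular
    # metabolomic measurements into a constraint-based model. This is not
    # MetaboTools (which is Matlab/COBRA Toolbox code); it uses COBRApy only to
    # illustrate the concept. File name, exchange IDs, and rates are assumptions.
    import cobra

    model = cobra.io.read_sbml_model("ecoli_core.xml")   # hypothetical model file

    # Measured uptake (negative) and secretion (positive) rates, mmol/gDW/h (made up)
    measured_rates = {
        "EX_glc__D_e": (-8.0, -6.0),   # glucose uptake between 6 and 8
        "EX_lac__D_e": (1.0, 3.0),     # lactate secretion between 1 and 3
    }

    for rxn_id, (lower, upper) in measured_rates.items():
        rxn = model.reactions.get_by_id(rxn_id)
        rxn.lower_bound, rxn.upper_bound = lower, upper

    solution = model.optimize()                           # flux balance analysis
    print("Predicted growth rate:", round(solution.objective_value, 3))
    ```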

  5. Integrated decision support tools for Puget Sound salmon recovery planning

    EPA Science Inventory

    We developed a set of tools to provide decision support for community-based salmon recovery planning in Salish Sea watersheds. Here we describe how these tools are being integrated and applied in collaboration with Puget Sound tribes and community stakeholders to address restora...

  6. PACOM: A Versatile Tool for Integrating, Filtering, Visualizing, and Comparing Multiple Large Mass Spectrometry Proteomics Data Sets.

    PubMed

    Martínez-Bartolomé, Salvador; Medina-Aunon, J Alberto; López-García, Miguel Ángel; González-Tejedo, Carmen; Prieto, Gorka; Navajas, Rosana; Salazar-Donate, Emilio; Fernández-Costa, Carolina; Yates, John R; Albar, Juan Pablo

    2018-04-06

    Mass-spectrometry-based proteomics has evolved into a high-throughput technology in which numerous large-scale data sets are generated from diverse analytical platforms. Furthermore, several scientific journals and funding agencies have emphasized the storage of proteomics data in public repositories to facilitate its evaluation, inspection, and reanalysis. (1) As a consequence, public proteomics data repositories are growing rapidly. However, tools are needed to integrate multiple proteomics data sets to compare different experimental features or to perform quality control analysis. Here, we present a new Java stand-alone tool, Proteomics Assay COMparator (PACOM), that is able to import, combine, and simultaneously compare numerous proteomics experiments to check the integrity of the proteomic data as well as verify data quality. With PACOM, the user can detect sources of errors that may have been introduced in any step of a proteomics workflow and that influence the final results. Data sets can be easily compared and integrated, and data quality and reproducibility can be visually assessed through a rich set of graphical representations of proteomics data features as well as a wide variety of data filters. Its flexibility and easy-to-use interface make PACOM a unique tool for daily use in a proteomics laboratory. PACOM is available at https://github.com/smdb21/pacom.

  7. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations.

    PubMed

    Dwivedi, Bhakti; Kowalski, Jeanne

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, there is no one single tool that defines gene sets based on merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to help facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/.

  8. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations

    PubMed Central

    Dwivedi, Bhakti

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, there is no one single tool that defines gene sets based on merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to help facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/. PMID:29415010

  9. Research-IQ: Development and Evaluation of an Ontology-anchored Integrative Query Tool

    PubMed Central

    Borlawsky, Tara B.; Lele, Omkar; Payne, Philip R. O.

    2011-01-01

    Investigators in the translational research and systems medicine domains require highly usable, efficient and integrative tools and methods that allow for the navigation of and reasoning over emerging large-scale data sets. Such resources must cover a spectrum of granularity from bio-molecules to population phenotypes. Given such information needs, we report upon the initial design and evaluation of an ontology-anchored integrative query tool, Research-IQ, which employs a combination of conceptual knowledge engineering and information retrieval techniques to enable the intuitive and rapid construction of queries, in terms of semi-structured textual propositions, that can subsequently be applied to integrative data sets. Our initial results, based upon both quantitative and qualitative evaluations of the efficacy and usability of Research-IQ, demonstrate its potential to increase clinical and translational research throughput. PMID:21821150

  10. Integrated performance and reliability specification for digital avionics systems

    NASA Technical Reports Server (NTRS)

    Brehm, Eric W.; Goettge, Robert T.

    1995-01-01

    This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.

  11. SNPConvert: SNP Array Standardization and Integration in Livestock Species.

    PubMed

    Nicolazzi, Ezequiel Luis; Marras, Gabriele; Stella, Alessandra

    2016-06-09

    One of the main advantages of single nucleotide polymorphism (SNP) array technology is providing genotype calls for a specific number of SNP markers at a relatively low cost. Since its first application in animal genetics, the number of available SNP arrays for each species has been constantly increasing. However, conversely to that observed in whole genome sequence data analysis, SNP array data does not have a common set of file formats or coding conventions for allele calling. Therefore, the standardization and integration of SNP array data from multiple sources have become an obstacle, especially for users with basic or no programming skills. Here, we describe the difficulties related to handling SNP array data, focusing on file formats, SNP allele coding, and mapping. We also present SNPConvert suite, a multi-platform, open-source, and user-friendly set of tools to overcome these issues. This tool, which can be integrated with open-source and open-access tools already available, is a first step towards an integrated system to standardize and integrate any type of raw SNP array data. The tool is available at: https://github.com/nicolazzie/SNPConvert.git.
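
    The allele-coding standardization problem described above can be illustrated with a short sketch that is not SNPConvert's code: it translates Illumina-style A/B genotype calls into nucleotide calls using a per-SNP allele map. The SNP names, the map, and the genotype strings are assumptions.

    ```python
    # Minimal sketch (not SNPConvert's code) of the kind of allele-coding
    # standardization the abstract describes: translating A/B calls into
    # nucleotide calls via a per-SNP allele map. All identifiers are made up.
    AB_TO_ALLELE = {
        "SNP0001": {"A": "G", "B": "T"},
        "SNP0002": {"A": "C", "B": "A"},
    }

    def convert_genotype(snp_id, ab_call, allele_map=AB_TO_ALLELE):
        """Convert a two-character A/B genotype (e.g. 'AB') to nucleotides (e.g. 'GT')."""
        if ab_call in ("--", "NoCall"):
            return "00"                      # a common missing-genotype convention
        mapping = allele_map[snp_id]
        return "".join(mapping[allele] for allele in ab_call)

    print(convert_genotype("SNP0001", "AB"))   # -> GT
    print(convert_genotype("SNP0002", "BB"))   # -> AA
    ```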

  12. Evaluating the Development of Science Research Skills in Work-Integrated Learning through the Use of Workplace Science Tools

    ERIC Educational Resources Information Center

    McCurdy, Susan M.; Zegwaard, Karsten E.; Dalgety, Jacinta

    2013-01-01

    Concept understanding, the development of analytical skills and a research mind set are explored through the use of academic tools common in a tertiary science education and relevant work-integrated learning (WIL) experiences. The use and development of the tools; laboratory book, technical report, and literature review are examined by way of…

  13. SECIMTools: a suite of metabolomics data analysis tools.

    PubMed

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high throughput technology for untargeted analysis of metabolites. Open access, easy to use, analytic tools that are broadly accessible to the biological community need to be developed. While technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares - discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator LASSO and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
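
    As a hedged illustration of the kind of building-block analyses the suite wraps for Galaxy (not the suite's actual code), the sketch below runs a PCA for quality control and a per-feature two-sample t-test on a wide-format metabolite matrix. The sample names, group labels, and random data are assumptions.

    ```python
    # Hedged sketch of two basic metabolomics building blocks: PCA of samples for
    # quality control and a per-feature t-test between two groups. Not SECIMTools
    # code; the matrix, sample names, and the spiked signal are made up.
    import numpy as np
    import pandas as pd
    from scipy import stats
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    samples = [f"ctrl_{i}" for i in range(6)] + [f"case_{i}" for i in range(6)]
    data = pd.DataFrame(rng.normal(size=(50, 12)),
                        index=[f"feature_{i}" for i in range(50)], columns=samples)
    data.loc["feature_0", data.columns.str.startswith("case")] += 2.0  # spiked signal

    # Quality-control view: PCA of samples (metabolite features as variables)
    scores = PCA(n_components=2).fit_transform(data.T.values)
    print("PC1 range:", scores[:, 0].min().round(2), "to", scores[:, 0].max().round(2))

    # Basic statistics: two-sample t-test per metabolite feature
    ctrl = data.loc[:, data.columns.str.startswith("ctrl")]
    case = data.loc[:, data.columns.str.startswith("case")]
    tstat, pval = stats.ttest_ind(case, ctrl, axis=1)
    print("Top feature by p-value:", data.index[np.argmin(pval)])
    ```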

  14. FORTRAN tools

    NASA Technical Reports Server (NTRS)

    Presser, L.

    1978-01-01

    An integrated set of FORTRAN tools that are commercially available is described. The basic purpose of various tools is summarized and their economic impact highlighted. The areas addressed by these tools include: code auditing, error detection, program portability, program instrumentation, documentation, clerical aids, and quality assurance.

  15. An integrated set of UNIX based system tools at control room level

    NASA Astrophysics Data System (ADS)

    Potepan, F.; Scafuri, C.; Bortolotto, C.; Surace, G.

    1994-12-01

    The design effort of providing a simple point-and-click approach to the equipment access has led to the definition and realization of a modular set of software tools to be used at the ELETTRA control room level. Point-to-point equipment access requires neither programming nor specific knowledge of the control system architecture. The development and integration of communication, graphic, editing and global database modules are described in depth, followed by a report of their use in the first commissioning period.

  16. TARA: Tool Assisted Requirements Analysis

    DTIC Science & Technology

    1988-05-01

    provided during the project and to aid tool integration. Chapter 6 provides a brief discussion of the experience of specifying the ASET case study in CORE ... set of Prolog clauses. This includes the context-free grammar rules depicted in Figure 2.1, integrity constraints such as those defining the binding ... Jeremaes (1986). This was developed originally for specifying database management semantics (for example, the preservation of integrity constraints

  17. Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet

    NASA Technical Reports Server (NTRS)

    Muss, J. A.; Johnson, C. W.; Gotchy, M. B.

    2000-01-01

    The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.

  18. Calculation of Coupled Vibroacoustics Response Estimates from a Library of Available Uncoupled Transfer Function Sets

    NASA Technical Reports Server (NTRS)

    Smith, Andrew; LaVerde, Bruce; Hunt, Ron; Fulcher, Clay; Towner, Robert; McDonald, Emmett

    2012-01-01

    The design and theoretical basis of a new database tool that quickly generates vibroacoustic response estimates using a library of transfer functions (TFs) is discussed. During the early stages of a launch vehicle development program, these response estimates can be used to provide vibration environment specification to hardware vendors. The tool accesses TFs from a database, combines the TFs, and multiplies these by input excitations to estimate vibration responses. The database is populated with two sets of uncoupled TFs; the first set representing vibration response of a bare panel, designated as H^s, and the second set representing the response of the free-free component equipment by itself, designated as H^c. For a particular configuration undergoing analysis, the appropriate H^s and H^c are selected and coupled to generate an integrated TF, designated as H^(s+c). This integrated TF is then used with the appropriate input excitations to estimate vibration responses. This simple yet powerful tool enables a user to estimate vibration responses without directly using finite element models, so long as suitable H^s and H^c sets are defined in the database libraries. The paper discusses the preparation of the database tool and provides the assumptions and methodologies necessary to combine H^s and H^c sets into an integrated H^(s+c). An experimental validation of the approach is also presented.
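
    A hedged sketch of the database workflow described above follows; it is not the paper's method. The abstract does not give the formula for coupling H^s and H^c, so the combine_tfs function below is an explicit placeholder, and the frequencies, transfer functions, and input spectrum are made-up numbers; only the look-up, combine, and multiply-by-excitation flow is illustrated.

    ```python
    # Hedged sketch of the TF-library workflow (not the paper's method): look up
    # an uncoupled panel TF (H_s) and component TF (H_c), combine them, and apply
    # an input excitation spectrum. The coupling formula is NOT given in the
    # abstract, so combine_tfs is a placeholder; all numbers are made up.
    import numpy as np

    freqs = np.linspace(20.0, 2000.0, 200)                            # Hz
    tf_library = {
        ("panel_A", "bare"): 1.0 / (1.0 + 1j * freqs / 300.0),        # assumed H_s
        ("component_X", "free-free"): 1.0 / (1.0 + 1j * freqs / 800.0),  # assumed H_c
    }

    def combine_tfs(h_s, h_c):
        """Placeholder coupling of uncoupled TFs into an integrated H_(s+c)."""
        return h_s * h_c                       # assumption, not the paper's formula

    h_sc = combine_tfs(tf_library[("panel_A", "bare")],
                       tf_library[("component_X", "free-free")])
    input_psd = np.full_like(freqs, 0.01)      # acoustic input PSD, g^2/Hz (made up)
    response_psd = np.abs(h_sc) ** 2 * input_psd   # standard |H|^2 scaling of a PSD
    print("Peak response PSD:", response_psd.max())
    ```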

  19. Integration of g4tools in Geant4

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, Ivana

    2014-06-01

    g4tools, which is originally part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It allows users to create and manipulate histograms and ntuples, and write them in supported file formats (ROOT, AIDA XML, CSV and HBOOK). It is integrated in Geant4 through analysis manager classes, thus providing a uniform interface to the g4tools objects and also hiding the differences between the classes for different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes following user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example, and it is already used in a majority of the Geant4 examples within the Geant4 9.6 release. In this paper, we give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.

  20. Eportfolios as Evidence of Standards and Outcomes in Work-Integrated Learning

    ERIC Educational Resources Information Center

    Ferns, Sonia; Comfort, Jude

    2014-01-01

    Electronic portfolios (ePortfolios) are a student focused tool which support and evidence work-integrated learning (WIL) experiences and capabilities in a tertiary education setting. Such settings are increasingly faced by a regulatory framework requiring evidence of student competency and skill acquisition. The commitment of educational…

  1. Watershed Health Assessment Tools Investigating Fisheries

    EPA Science Inventory

    WHATIF is software that integrates a number of calculators, tools, and models for assessing the health of watersheds and streams with an emphasis on fish communities. The tool set consists of hydrologic and stream geometry calculators, a fish assemblage predictor, a fish habitat ...

  2. Integrative Functional Genomics for Systems Genetics in GeneWeaver.org.

    PubMed

    Bubier, Jason A; Langston, Michael A; Baker, Erich J; Chesler, Elissa J

    2017-01-01

    The abundance of existing functional genomics studies permits an integrative approach to interpreting and resolving the results of diverse systems genetics studies. However, a major challenge lies in assembling and harmonizing heterogeneous data sets across species for facile comparison to the positional candidate genes and coexpression networks that come from systems genetic studies. GeneWeaver is an online database and suite of tools at www.geneweaver.org that allows for fast aggregation and analysis of gene set-centric data. GeneWeaver contains curated experimental data together with resource-level data such as GO annotations, MP annotations, and KEGG pathways, along with persistent stores of user-entered data sets. These can be entered directly into GeneWeaver or transferred from widely used resources such as GeneNetwork.org. Data are analyzed using statistical tools and advanced graph algorithms to discover new relations, prioritize candidate genes, and generate function hypotheses. Here we use GeneWeaver to find genes common to multiple gene sets, prioritize candidate genes from a quantitative trait locus, and characterize a set of differentially expressed genes. Coupling a large multispecies repository of curated and empirical functional genomics data to fast computational tools allows for the rapid integrative analysis of heterogeneous data for interpreting and extrapolating systems genetics results.
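
    Two of the gene set-centric operations named above, finding genes common to several sets and ranking positional candidates by supporting evidence, can be sketched in a few lines. This is not GeneWeaver's code or API; the gene symbols and set contents are illustrative assumptions.

    ```python
    # Minimal sketch (not GeneWeaver's code) of two gene set-centric operations:
    # intersecting evidence sets and ranking hypothetical QTL-interval candidates
    # by how many evidence sets contain them. All gene symbols are made up.
    from collections import Counter

    evidence_sets = {
        "expression_study": {"Drd2", "Comt", "Bdnf", "Gria1"},
        "behavior_qtl_lit": {"Drd2", "Gria1", "Snca"},
        "kegg_pathway": {"Drd2", "Comt", "Snca", "Gabra2"},
    }

    common = set.intersection(*evidence_sets.values())
    print("Genes in every set:", sorted(common))

    qtl_candidates = ["Drd2", "Gabra2", "Fto", "Gria1"]      # hypothetical interval genes
    support = Counter(g for genes in evidence_sets.values() for g in genes)
    ranked = sorted(qtl_candidates, key=lambda g: support[g], reverse=True)
    print("Candidates ranked by evidence count:", ranked)
    ```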

  3. Unlocking the potential of publicly available microarray data using inSilicoDb and inSilicoMerging R/Bioconductor packages.

    PubMed

    Taminau, Jonatan; Meganck, Stijn; Lazar, Cosmin; Steenhoff, David; Coletta, Alain; Molter, Colin; Duque, Robin; de Schaetzen, Virginie; Weiss Solís, David Y; Bersini, Hugues; Nowé, Ann

    2012-12-24

    With an abundant amount of microarray gene expression data sets available through public repositories, new possibilities lie in combining multiple existing data sets. In this new context, analysis itself is no longer the problem, but retrieving and consistently integrating all this data before delivering it to the wide variety of existing analysis tools becomes the new bottleneck. We present the newly released inSilicoMerging R/Bioconductor package which, together with the earlier released inSilicoDb R/Bioconductor package, allows consistent retrieval, integration and analysis of publicly available microarray gene expression data sets. Inside the inSilicoMerging package a set of five visual and six quantitative validation measures are available as well. By providing (i) access to uniformly curated and preprocessed data, (ii) a collection of techniques to remove the batch effects between data sets from different sources, and (iii) several validation tools enabling the inspection of the integration process, these packages enable researchers to fully explore the potential of combining gene expression data for downstream analysis. The power of using both packages is demonstrated by programmatically retrieving and integrating gene expression studies from the InSilico DB repository [https://insilicodb.org/app/].

  4. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
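
    The graph-analysis step described above (task 3) can be illustrated with a hedged sketch that is not NASA's tool: a directed model of system-software connections is searched for paths from a hazard source to a vulnerable function using networkx. The node names and edges are made up.

    ```python
    # Hedged sketch of the path-finding idea (not the integrated tool set itself):
    # search a directed system-software model for paths from a hazard source to a
    # vulnerable function. Node names and edges are illustrative assumptions.
    import networkx as nx

    model = nx.DiGraph()
    model.add_edges_from([
        ("propellant_leak", "sensor_bus"),
        ("sensor_bus", "flight_software"),
        ("flight_software", "abort_decision"),
        ("sensor_bus", "telemetry"),          # nominal path, not hazardous by itself
    ])

    for path in nx.all_simple_paths(model, source="propellant_leak", target="abort_decision"):
        print(" -> ".join(path))
    ```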

  5. MatchingTools: A Python library for symbolic effective field theory calculations

    NASA Astrophysics Data System (ADS)

    Criado, Juan C.

    2018-06-01

    MatchingTools is a Python library for doing symbolic calculations in effective field theory. It provides the tools to construct general models by defining their field content and their interaction Lagrangian. Once a model is given, the heavy particles can be integrated out at the tree level to obtain an effective Lagrangian in which only the light particles appear. After integration, some of the terms of the resulting Lagrangian might not be independent. MatchingTools contains functions for transforming these terms to rewrite them in terms of any chosen set of operators.
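
    Rather than guessing at MatchingTools' own API, the sketch below uses SymPy to illustrate the tree-level idea the abstract describes: solve the heavy field's equation of motion and substitute it back into the Lagrangian to obtain an effective interaction among the light fields only. The toy Lagrangian, fields, and couplings are symbolic placeholders.

    ```python
    # NOT the MatchingTools API; a SymPy sketch of tree-level integrating out.
    # Toy model: heavy real scalar S of mass M coupled to a light scalar phi,
    # with the heavy field's derivatives dropped (leading approximation).
    import sympy as sp

    phi, S, g, M = sp.symbols("phi S g M", real=True)

    L = -sp.Rational(1, 2) * M**2 * S**2 + g * S * phi**2

    eom = sp.Eq(sp.diff(L, S), 0)              # heavy-field equation of motion
    S_sol = sp.solve(eom, S)[0]                # S = g*phi**2/M**2
    L_eff = sp.expand(L.subs(S, S_sol))        # effective Lagrangian for phi only

    print("S =", S_sol)
    print("L_eff =", sp.simplify(L_eff))       # -> g**2*phi**4/(2*M**2)
    ```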

  6. Whole Watershed Restoration Planning Tools for Estimating Tradeoffs Among Multiple Objectives

    EPA Science Inventory

    We developed a set of decision support tools to assist whole watershed restoration planning in the Pacific Northwest. Here we describe how these tools are being integrated and applied in collaboration with tribes and community stakeholders to address restoration of hydrological ...

  7. Virtual Worlds and Gamification to Increase Integration of International Students in Higher Education: An Inclusive Design Approach

    ERIC Educational Resources Information Center

    Zhang, Bo; Robb, Nigel; Eyerman, Joe; Goodman, Lizbeth

    2017-01-01

    In response to the growing trend of internationalisation in education, it is important to consider approaches to help international students integrate in their new settings. One possible approach involves the use of e-Learning tools, such as virtual worlds and gamification. To maximise the potential effectiveness of such tools, it may be…

  8. The ontology life cycle: Integrated tools for editing, publishing, peer review, and evolution of ontologies

    PubMed Central

    Noy, Natalya; Tudorache, Tania; Nyulas, Csongor; Musen, Mark

    2010-01-01

    Ontologies have become a critical component of many applications in biomedical informatics. However, the landscape of the ontology tools today is largely fragmented, with independent tools for ontology editing, publishing, and peer review: users develop an ontology in an ontology editor, such as Protégé; and publish it on a Web server or in an ontology library, such as BioPortal, in order to share it with the community; they use the tools provided by the library or mailing lists and bug trackers to collect feedback from users. In this paper, we present a set of tools that bring the ontology editing and publishing closer together, in an integrated platform for the entire ontology lifecycle. This integration streamlines the workflow for collaborative development and increases integration between the ontologies themselves through the reuse of terms. PMID:21347039

  9. Debating Life on Mars: The Knowledge Integration Environment (KIE) in Varied School Settings.

    ERIC Educational Resources Information Center

    Shear, Linda

    Technology-enabled learning environments are beginning to come of age. Tools and frameworks are now available that have been shown to improve learning and are being deployed more widely in varied school settings. Teachers are now faced with the formidable challenge of integrating these promising new environments with the everyday context in which…

  10. Facility Composer (Trademark) and PACES (Trademark) Integration: Development of an XML Interface Based on Industry Foundation Classes

    DTIC Science & Technology

    2007-11-01

    Engineering Research Laboratory is currently developing a set of facility 'architectural' programming tools, called Facility Composer (Trademark) (FC). FC ... requirements in the early phases of project development. As the facility program, criteria, and requirements are chosen, these tools populate the IFC ... developing a set of facility "architectural" programming tools, called Facility Composer (FC), to support the capture and tracking of facility criteria

  11. Laughter Filled the Classroom: Outcomes of Professional Development in Arts Integration for Elementary Teachers in Inclusion Settings

    ERIC Educational Resources Information Center

    Koch, Katherine A.; Thompson, Janna Chevon

    2017-01-01

    This qualitative study examined teachers' experiences with an arts integration curriculum. This study considered the teachers' perceptions of arts integrations before and after being introduced to the concepts of arts integration. The teachers were provided with knowledge and tools to integrate the arts into general education curriculum and…

  12. CAS-viewer: web-based tool for splicing-guided integrative analysis of multi-omics cancer data.

    PubMed

    Han, Seonggyun; Kim, Dongwook; Kim, Youngjun; Choi, Kanghoon; Miller, Jason E; Kim, Dokyoon; Lee, Younghee

    2018-04-20

    The Cancer Genome Atlas (TCGA) project is a public resource that provides transcriptomic, DNA sequence, methylation, and clinical data for 33 cancer types. Transforming the large size and high complexity of TCGA cancer genome data into integrated knowledge can be useful to promote cancer research. Alternative splicing (AS) is a key regulatory mechanism of genes in human cancer development and in the interaction with epigenetic factors. Therefore, AS-guided integration of existing TCGA data sets will make it easier to gain insight into the genetic architecture of cancer risk and related outcomes. There are already existing tools analyzing and visualizing alternative mRNA splicing patterns for large-scale RNA-seq experiments. However, these existing web-based tools are limited to the analysis of individual TCGA data sets at a time, such as only transcriptomic information. We implemented CAS-viewer (integrative analysis of Cancer genome data based on Alternative Splicing), a web-based tool leveraging multi-cancer omics data from TCGA. It illustrates alternative mRNA splicing patterns along with methylation, miRNAs, and SNPs, and then provides an analysis tool to link differential transcript expression ratio to methylation, miRNA, and splicing regulatory elements for 33 cancer types. Moreover, one can analyze AS patterns with clinical data to identify potential transcripts associated with different survival outcome for each cancer. CAS-viewer is a web-based application for transcript isoform-driven integration of multi-omics data in multiple cancer types and will aid in the visualization and possible discovery of biomarkers for cancer by integrating multi-omics data from TCGA.

  13. Methylation Integration (Mint) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    A comprehensive software pipeline and set of Galaxy tools/workflows for integrative analysis of genome-wide DNA methylation and hydroxymethylation data. Data types can be bisulfite sequencing and/or pull-down methods.

  14. Coordinating complex decision support activities across distributed applications

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (API's), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level API's to implement the desired interactions between distributed applications.

  15. Empowering districts to target priorities for improving child health service in Uganda using change management and rapid assessment methods.

    PubMed

    Odaga, John; Henriksson, Dorcus K; Nkolo, Charles; Tibeihaho, Hector; Musabe, Richard; Katusiime, Margaret; Sinabulya, Zaccheus; Mucunguzi, Stephen; Mbonye, Anthony K; Valadez, Joseph J

    2016-01-01

    Local health system managers in low- and middle-income countries have the responsibility to set health priorities and allocate resources accordingly. Although tools exist to aid this process, they are not widely applied for various reasons including non-availability, poor knowledge of the tools, and poor adaptability into the local context. In Uganda, delivery of basic services is devolved to the District Local Governments through the District Health Teams (DHTs). The Community and District Empowerment for Scale-up (CODES) project aims to provide a set of management tools that aid contextualised priority setting, fund allocation, and problem-solving in a systematic way to improve effective coverage and quality of child survival interventions. Although the various tools have previously been used at the national level, the project aims to combine them in an integral way for implementation at the district level. These tools include Lot Quality Assurance Sampling (LQAS) surveys to generate local evidence, Bottleneck analysis and Causal analysis as analytical tools, Continuous Quality Improvement, and Community Dialogues based on Citizen Report Cards and U reports. The tools enable identification of gaps, prioritisation of possible solutions, and allocation of resources accordingly. This paper presents some of the tools used by the project in five districts in Uganda during the proof-of-concept phase of the project. All five districts were trained and participated in LQAS surveys and readily adopted the tools for priority setting and resource allocation. All districts developed health operational work plans, which were based on the evidence and each of the districts implemented more than three of the priority activities which were included in their work plans. In the five districts, the CODES project demonstrated that DHTs can adopt and integrate these tools in the planning process by systematically identifying gaps and setting priority interventions for child survival.

  16. Empowering districts to target priorities for improving child health service in Uganda using change management and rapid assessment methods

    PubMed Central

    Odaga, John; Henriksson, Dorcus K.; Nkolo, Charles; Tibeihaho, Hector; Musabe, Richard; Katusiime, Margaret; Sinabulya, Zaccheus; Mucunguzi, Stephen; Mbonye, Anthony K.; Valadez, Joseph J.

    2016-01-01

    Background Local health system managers in low- and middle-income countries have the responsibility to set health priorities and allocate resources accordingly. Although tools exist to aid this process, they are not widely applied for various reasons including non-availability, poor knowledge of the tools, and poor adaptability into the local context. In Uganda, delivery of basic services is devolved to the District Local Governments through the District Health Teams (DHTs). The Community and District Empowerment for Scale-up (CODES) project aims to provide a set of management tools that aid contextualised priority setting, fund allocation, and problem-solving in a systematic way to improve effective coverage and quality of child survival interventions. Design Although the various tools have previously been used at the national level, the project aims to combine them in an integral way for implementation at the district level. These tools include Lot Quality Assurance Sampling (LQAS) surveys to generate local evidence, Bottleneck analysis and Causal analysis as analytical tools, Continuous Quality Improvement, and Community Dialogues based on Citizen Report Cards and U reports. The tools enable identification of gaps, prioritisation of possible solutions, and allocation of resources accordingly. This paper presents some of the tools used by the project in five districts in Uganda during the proof-of-concept phase of the project. Results All five districts were trained and participated in LQAS surveys and readily adopted the tools for priority setting and resource allocation. All districts developed health operational work plans, which were based on the evidence and each of the districts implemented more than three of the priority activities which were included in their work plans. Conclusions In the five districts, the CODES project demonstrated that DHTs can adopt and integrate these tools in the planning process by systematically identifying gaps and setting priority interventions for child survival. PMID:27225791

  17. Integration of relational and textual biomedical sources. A pilot experiment using a semi-automated method for logical schema acquisition.

    PubMed

    García-Remesal, M; Maojo, V; Billhardt, H; Crespo, J

    2010-01-01

    Bringing together structured and text-based sources is an exciting challenge for biomedical informaticians, since most relevant biomedical sources belong to one of these categories. In this paper we evaluate the feasibility of integrating relational and text-based biomedical sources using: i) an original logical schema acquisition method for textual databases developed by the authors, and ii) OntoFusion, a system originally designed by the authors for the integration of relational sources. We conducted an integration experiment involving a test set of seven differently structured sources covering the domain of genetic diseases. We used our logical schema acquisition method to generate schemas for all textual sources. The sources were integrated using the methods and tools provided by OntoFusion. The integration was validated using a test set of 500 queries. A panel of experts answered a questionnaire to evaluate i) the quality of the extracted schemas, ii) the query processing performance of the integrated set of sources, and iii) the relevance of the retrieved results. The results of the survey show that our method extracts coherent and representative logical schemas. Experts' feedback on the performance of the integrated system and the relevance of the retrieved results was also positive. Regarding the validation of the integration, the system successfully provided correct results for all queries in the test set. The results of the experiment suggest that text-based sources including a logical schema can be regarded as equivalent to structured databases. Using our method, previous research and existing tools designed for the integration of structured databases can be reused - possibly subject to minor modifications - to integrate differently structured sources.

  18. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  19. Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project

    NASA Astrophysics Data System (ADS)

    van Eck, T.; Giardini, D.

    2010-12-01

    The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive European integrated RI for earthquake seismological data that is scalable and sustainable. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, implemented and produced advanced analysis tools and advanced software packages and tools. A single seismic data portal provides a single access point and overview for European seismological data available for the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include among others: - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from > 53 observatories. This data is continuously monitored, quality controlled and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries centrally accessible in a homogeneous format, thus forming the core comprehensive European acceleration database. Standardized parameter analysis and actual software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000 - 1963, M ≥5.8), including analysis tools. - Data from three one-year OBS deployments at three sites (Atlantic, Ionian and Ligurian Sea) within the general SEED format, thus creating the core integrated data base for ocean, sea and land based seismological observatories. Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models including a standardized model description with several visualisation tools currently adapted on a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community accepted forecasting testing and model validation approach and the core hazard portal developed along the same technologies as the NERIES data portal. - Implemented homogeneous shakemap estimation tools at several large European observatories and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization with relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange and information management in seismology as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010 - 2014), is funded and will integrate the seismological and the earthquake engineering infrastructures. NERIES further provided the proof of concept for the ESFRI2008 initiative: the European Plate Observing System (EPOS). Its preparatory phase (2010 - 2014) is also funded by the EC.

  20. Salmon recovery planning using the VELMA model

    EPA Science Inventory

    We developed a set of tools to provide decision support for community-based salmon recovery planning in Pacific Northwest watersheds. This seminar describes how these tools are being integrated and applied in collaboration with Puget Sound tribes and community stakeholders to add...

  1. An introduction to Space Weather Integrated Modeling

    NASA Astrophysics Data System (ADS)

    Zhong, D.; Feng, X.

    2012-12-01

    The need for a software toolkit that integrates space weather models and data is one of many challenges we face when applying the models to space weather forecasting. To meet this challenge, we have developed Space Weather Integrated Modeling (SWIM), which is capable of analysis and visualization of the results from a diverse set of space weather models. SWIM has a modular design and is written in Python, using NumPy, matplotlib, and the Visualization ToolKit (VTK). SWIM provides a data management module to read a variety of spacecraft data products and the specific data format of the Solar-Interplanetary Conservation Element/Solution Element MHD model (SIP-CESE MHD model) for the study of solar-terrestrial phenomena. Data analysis, visualization and graphical user interface modules are also provided in a user-friendly way to run the integrated models and visualize the 2-D and 3-D data sets interactively. With these tools we can analyze the model results rapidly, locally or remotely, for example by extracting data at specific locations in time-sequence data sets, plotting interplanetary magnetic field lines, multi-slicing solar wind speed, volume rendering solar wind density, animating time-sequence data sets, and comparing model results with observational data. To speed up the analysis, an in-situ visualization interface is used to support visualizing the data 'on the fly'. We also modified some critical, time-consuming analysis and visualization methods with the aid of GPUs and multi-core CPUs. We have used this tool to visualize the data of the SIP-CESE MHD model in real time, and integrated the database model of shock arrival, the Shock Propagation Model, the Dst forecasting model and the SIP-CESE MHD model developed by the SIGMA Weather Group at the State Key Laboratory of Space Weather/CAS.
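
    One of the analysis tasks listed above, multi-slicing of solar wind speed, can be illustrated with a hedged sketch that is not SWIM's code: a synthetic 3-D speed field is sliced at the z = 0 plane and displayed with matplotlib. The grid, the analytic speed field, and the output file name are assumptions.

    ```python
    # Hedged sketch (not SWIM's code) of slicing a 3-D solar wind speed field and
    # plotting a 2-D cut. The synthetic data and coordinate ranges are made up.
    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic 3-D speed field on a 64^3 grid (km/s), standing in for model output
    x = y = z = np.linspace(-1.0, 1.0, 64)
    X, Y, Z = np.meshgrid(x, y, z, indexing="ij")
    speed = 400.0 + 300.0 * np.exp(-(X**2 + Y**2 + Z**2) / 0.2)

    # Extract the equatorial slice (z = 0 plane) and display it
    k = np.argmin(np.abs(z))
    plt.imshow(speed[:, :, k].T, origin="lower", extent=[-1, 1, -1, 1], cmap="viridis")
    plt.colorbar(label="solar wind speed (km/s)")
    plt.title("Equatorial slice of synthetic speed field")
    plt.savefig("speed_slice.png")
    ```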

  2. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    PubMed

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

    Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing testing thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge on individual proteins.

  3. Integrating the hospital library with patient care, teaching and research: model and Web 2.0 tools to create a social and collaborative community of clinical research in a hospital setting.

    PubMed

    Montano, Blanca San José; Garcia Carretero, Rafael; Varela Entrecanales, Manuel; Pozuelo, Paz Martin

    2010-09-01

    Research in hospital settings faces several difficulties. Information technologies and certain Web 2.0 tools may provide new models to tackle these problems, allowing for a collaborative approach and bridging the gap between clinical practice, teaching and research. We aim to gather a community of researchers involved in the development of a network of learning and investigation resources in a hospital setting. A multi-disciplinary work group analysed the needs of the research community. We studied the opportunities provided by Web 2.0 tools and finally we defined the spaces that would be developed, describing their elements, members and different access levels. WIKINVESTIGACION is a collaborative web space with the aim of integrating the management of all the hospital's teaching and research resources. It is composed of five spaces, with different access privileges. The spaces are: Research Group Space 'wiki for each individual research group', Learning Resources Centre devoted to the Library, News Space, Forum and Repositories. The Internet, and most notably the Web 2.0 movement, is introducing some overwhelming changes in our society. Research and teaching in the hospital setting will join this current and take advantage of these tools to socialise and improve knowledge management.

  4. Integrated management of timber and deer: coastal forests of British Columbia and Alaska.

    Treesearch

    J.B. Nyberg; R.S. McNay; M.D. Kirchhoff [and others]

    1989-01-01

    Current techniques for integrating timber and deer management in coastal British Columbia and Alaska are reviewed and evaluated. Integration can be improved by setting objectives for deer habitat and timber, improving managers' knowledge of interactions, and providing planning tools to analyze alternative programs of forest management. A handbook designed to...

  5. Designed tools for analysis of lithography patterns and nanostructures

    NASA Astrophysics Data System (ADS)

    Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann

    2017-03-01

    We introduce a set of designed tools for the analysis of lithography patterns and nano structures. The classical metrological analysis of these objects has the drawbacks of being time consuming, requiring manual tuning and lacking robustness and user friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi automatic, automatic and machine-learning enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nano processes at the research and development level by accelerating the access to knowledge and hence speed up the implementation in product lines.
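
    As a hedged illustration of the kind of automatic measurement such tools perform (not the authors' platform), the sketch below estimates a line width, a critical dimension, from a synthetic intensity profile by thresholding at 50% between the background and line levels. The profile shape and pixel size are assumptions.

    ```python
    # Hedged sketch of a basic automatic critical-dimension measurement: threshold
    # a (synthetic) line-scan intensity profile at 50% between background and line
    # levels and count the pixels above it. Profile and pixel size are made up.
    import numpy as np

    pixel_nm = 2.0                                     # assumed pixel size in nm
    xs = np.arange(200)
    profile = 0.1 + 0.8 / (1 + np.exp(-(xs - 80))) - 0.8 / (1 + np.exp(-(xs - 120)))

    threshold = 0.5 * (profile.min() + profile.max())  # 50% threshold
    above = profile > threshold
    width_px = np.count_nonzero(above)
    print("Estimated line width:", width_px * pixel_nm, "nm")
    ```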

  6. The Visual Geophysical Exploration Environment: A Multi-dimensional Scientific Visualization

    NASA Astrophysics Data System (ADS)

    Pandya, R. E.; Domenico, B.; Murray, D.; Marlino, M. R.

    2003-12-01

    The Visual Geophysical Exploration Environment (VGEE) is an online learning environment designed to help undergraduate students understand fundamental Earth system science concepts. The guiding principle of the VGEE is the importance of hands-on interaction with scientific visualization and data. The VGEE consists of four elements: 1) an online, inquiry-based curriculum for guiding student exploration; 2) a suite of El Nino-related data sets adapted for student use; 3) a learner-centered interface to a scientific visualization tool; and 4) a set of concept models (interactive tools that help students understand fundamental scientific concepts). There are two key innovations featured in this interactive poster session. One is the integration of concept models and the visualization tool. Concept models are simple, interactive, Java-based illustrations of fundamental physical principles. We developed eight concept models and integrated them into the visualization tool to enable students to probe data. The ability to probe data using a concept model addresses the common problem of transfer: the difficulty students have in applying theoretical knowledge to everyday phenomenon. The other innovation is a visualization environment and data that are discoverable in digital libraries, and installed, configured, and used for investigations over the web. By collaborating with the Integrated Data Viewer developers, we were able to embed a web-launchable visualization tool and access to distributed data sets into the online curricula. The Thematic Real-time Environmental Data Distributed Services (THREDDS) project is working to provide catalogs of datasets that can be used in new VGEE curricula under development. By cataloging this curricula in the Digital Library for Earth System Education (DLESE), learners and educators can discover the data and visualization tool within a framework that guides their use.

  7. Meeting report: applied biopharmaceutics and quality by design for dissolution/release specification setting: product quality for patient benefit.

    PubMed

    Selen, Arzu; Cruañes, Maria T; Müllertz, Anette; Dickinson, Paul A; Cook, Jack A; Polli, James E; Kesisoglou, Filippos; Crison, John; Johnson, Kevin C; Muirhead, Gordon T; Schofield, Timothy; Tsong, Yi

    2010-09-01

    A biopharmaceutics and Quality by Design (QbD) conference was held on June 10-12, 2009 in Rockville, Maryland, USA to provide a forum and identify approaches for enhancing product quality for patient benefit. Presentations concerned the current biopharmaceutical toolbox (i.e., in vitro, in silico, pre-clinical, in vivo, and statistical approaches), as well as case studies, and reflections on new paradigms. Plenary and breakout session discussions evaluated the current state and envisioned a future state that more effectively integrates QbD and biopharmaceutics. Breakout groups discussed the following four topics: Integrating Biopharmaceutical Assessment into the QbD Paradigm, Predictive Statistical Tools, Predictive Mechanistic Tools, and Predictive Analytical Tools. Nine priority areas, further described in this report, were identified for advancing the integration of biopharmaceutics and supporting a more fundamentally based, integrated approach to setting product dissolution/release acceptance criteria. Collaboration among a broad range of disciplines and fostering a knowledge-sharing environment that places the patient's needs as the focus of drug development, consistent with the science- and risk-based spirit of QbD, were identified as key components of the path forward.

  8. Integration of SimSET photon history generator in GATE for efficient Monte Carlo simulations of pinhole SPECT.

    PubMed

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J S; Tsui, Benjamin M W

    2008-07-01

    The authors developed and validated an efficient Monte Carlo simulation (MCS) workflow to facilitate small animal pinhole SPECT imaging research. This workflow seamlessly integrates two existing MCS tools: the simulation system for emission tomography (SimSET) and the GEANT4 application for emission tomography (GATE). Specifically, we retained the strength of GATE in describing complex collimator/detector configurations to meet the anticipated needs for studying advanced pinhole collimation (e.g., multipinhole) geometry, while inserting the fast SimSET photon history generator (PHG) to circumvent the relatively slow GEANT4 MCS code used by GATE in simulating photon interactions inside voxelized phantoms. For validation, data generated from this new SimSET-GATE workflow were compared with those from GATE-only simulations as well as experimental measurements obtained using a commercial small animal pinhole SPECT system. Our results showed excellent agreement (e.g., in system point response functions and energy spectra) between SimSET-GATE and GATE-only simulations, and, more importantly, a significant computational speedup (up to approximately 10-fold) provided by the new workflow. Satisfactory agreement between MCS results and experimental data was also observed. In conclusion, the authors have successfully integrated the SimSET photon history generator into GATE for fast and realistic pinhole SPECT simulations, which can facilitate research in, for example, the development and application of quantitative pinhole and multipinhole SPECT for small animal imaging. This integrated simulation tool can also be adapted for studying other preclinical and clinical SPECT techniques.

  9. Development of a comprehensive software engineering environment

    NASA Technical Reports Server (NTRS)

    Hartrum, Thomas C.; Lamont, Gary B.

    1987-01-01

    The generation of a set of tools for the software lifecycle is a recurring theme in the software engineering literature. The development of such tools and their integration into a software development environment is a difficult task because of the magnitude (number of variables) and the complexity (combinatorics) of the software lifecycle process. An initial development of a global approach was initiated in 1982 as the Software Development Workbench (SDW). Continuing efforts focus on tool development, tool integration, human interfacing, data dictionaries, and testing algorithms. Current efforts are emphasizing natural language interfaces, expert system software development associates, and distributed environments with Ada as the target language. The current implementation of the SDW is on a VAX-11/780. Other software development tools are being networked through engineering workstations.

  10. EuPathDB: the eukaryotic pathogen genomics database resource

    PubMed Central

    Aurrecoechea, Cristina; Barreto, Ana; Basenko, Evelina Y.; Brestelli, John; Brunk, Brian P.; Cade, Shon; Crouch, Kathryn; Doherty, Ryan; Falke, Dave; Fischer, Steve; Gajria, Bindu; Harb, Omar S.; Heiges, Mark; Hertz-Fowler, Christiane; Hu, Sufen; Iodice, John; Kissinger, Jessica C.; Lawrence, Cris; Li, Wei; Pinney, Deborah F.; Pulman, Jane A.; Roos, David S.; Shanmugasundram, Achchuthan; Silva-Franco, Fatima; Steinbiss, Sascha; Stoeckert, Christian J.; Spruill, Drew; Wang, Haiming; Warrenfeltz, Susanne; Zheng, Jie

    2017-01-01

    The Eukaryotic Pathogen Genomics Database Resource (EuPathDB, http://eupathdb.org) is a collection of databases covering 170+ eukaryotic pathogens (protists & fungi), along with relevant free-living and non-pathogenic species, and select pathogen hosts. To facilitate the discovery of meaningful biological relationships, the databases couple preconfigured searches with visualization and analysis tools for comprehensive data mining via intuitive graphical interfaces and APIs. All data are analyzed with the same workflows, including creation of gene orthology profiles, so data are easily compared across data sets, data types and organisms. EuPathDB is updated with numerous new analysis tools, features, data sets and data types. New tools include GO, metabolic pathway and word enrichment analyses plus an online workspace for analysis of personal, non-public, large-scale data. Expanded data content is mostly genomic and functional genomic data while new data types include protein microarray, metabolic pathways, compounds, quantitative proteomics, copy number variation, and polysomal transcriptomics. New features include consistent categorization of searches, data sets and genome browser tracks; redesigned gene pages; effective integration of alternative transcripts; and a EuPathDB Galaxy instance for private analyses of a user's data. Forthcoming upgrades include user workspaces for private integration of data with existing EuPathDB data and improved integration and presentation of host–pathogen interactions. PMID:27903906

  11. ATD-1 ATM Technology Demonstration-1 and Integrated Scheduling

    NASA Technical Reports Server (NTRS)

    Quon, Leighton

    2014-01-01

    ATD-1 enables efficient arrivals for the NextGen Air Traffic Management System and develops a set of integrated decision support tools to reduce high cognitive workload, so that controllers can simultaneously achieve safe, efficient, and expedient operations at high traffic demand levels.

  12. Using Docker Compose for the Simple Deployment of an Integrated Drug Target Screening Platform.

    PubMed

    List, Markus

    2017-06-10

    Docker virtualization allows for software tools to be executed in an isolated and controlled environment referred to as a container. In Docker containers, dependencies are provided exactly as intended by the developer and, consequently, they simplify the distribution of scientific software and foster reproducible research. The Docker paradigm is that each container encapsulates one particular software tool. However, to analyze complex biomedical data sets, it is often necessary to combine several software tools into elaborate workflows. To address this challenge, several Docker containers need to be instantiated and properly integrated, which complicates the software deployment process unnecessarily. Here, we demonstrate how an extension to Docker, Docker Compose, can be used to mitigate these problems by providing a unified setup routine that deploys several tools in an integrated fashion. We demonstrate the power of this approach by example of a Docker Compose setup for a drug target screening platform consisting of five integrated web applications and shared infrastructure, deployable in just two lines of code.
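
    As a hedged illustration of the pattern this record describes (several containers deployed as one integrated stack), the sketch below generates a hypothetical two-service docker-compose.yml from Python and brings the stack up with a single Docker Compose invocation. The image names and service layout are invented for the example and are not the platform's actual configuration.

```python
# Sketch: declare a hypothetical two-service stack and deploy it with Docker Compose.
# Requires Docker with the Compose plugin and the PyYAML package.
import subprocess
import yaml

compose = {
    "services": {
        "webapp": {  # hypothetical screening front end
            "image": "example/drug-screening-webapp:latest",
            "ports": ["8080:8080"],
            "depends_on": ["db"],
        },
        "db": {      # shared infrastructure used by the integrated tools
            "image": "postgres:15",
            "environment": {"POSTGRES_PASSWORD": "example"},
            "volumes": ["dbdata:/var/lib/postgresql/data"],
        },
    },
    "volumes": {"dbdata": {}},
}

with open("docker-compose.yml", "w") as fh:
    yaml.safe_dump(compose, fh, sort_keys=False)

# The whole integrated stack then comes up with one command.
subprocess.run(["docker", "compose", "up", "-d"], check=True)
```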

  13. The Past, Present, and Future of Configuration Management

    DTIC Science & Technology

    1992-07-01

    surveying the tools and environments, it is possible to extract a set of 15 CM concepts [12] that capture the essence of automated support for CM. These...tools in maintaining the configuration's integrity, as in Jasmine [20]. 9. Subsystem: provide a means to limit the effect of changes and recompilation...workspace facility. Thus, the services model, in essence, is intended to provide plug in/plug out, "black box" capabilities. The initial set of 50

  14. INVESTIGATING UNCERTAINTY AND SENSITIVITY IN INTEGRATED MULTIMEDIA ENVIRONMENTAL MODELS: TOOLS FOR 3MRA

    EPA Science Inventory

    Sufficiently elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The ensuing challenge of examining ever more complex, integrated, higher-ord...

  15. INVESTIGATING UNCERTAINTY AND SENSITIVITY IN INTEGRATED, MULTIMEDIA ENVIRONMENTAL MODELS: TOOLS FOR FRAMES-3MRA

    EPA Science Inventory

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites...

  16. Improving K-12 STEM Education Outcomes through Technological Integration

    ERIC Educational Resources Information Center

    Urban, Michael J., Ed.; Falvo, David A., Ed.

    2016-01-01

    The application of technology in classroom settings has equipped educators with innovative tools and techniques for effective teaching practice. Integrating digital technologies at the elementary and secondary levels helps to enrich the students' learning experience and maximize competency in the areas of science, technology, engineering, and…

  17. A Set of Web-based Tools for Integrating Scientific Research and Decision-Making through Systems Thinking

    EPA Science Inventory

    Currently, many policy and management decisions are made without considering the goods and services humans derive from ecosystems and the costs associated with protecting them. This approach is unlikely to be sustainable. Conceptual frameworks provide a tool for capturing, visual...

  18. Minimizing Security Vulnerabilities in High-Tech Classrooms

    ERIC Educational Resources Information Center

    Ozkan, Betul C.; Gunay, Vedat

    2004-01-01

    Emerging technologies are quickly becoming part of daily learning and teaching endeavors in academia. Given access to certain high-tech tools, educators must learn how to integrate these tools into educational settings. However, many also encounter problems and weaknesses in the same high-tech environment that uses and delivers information…

  19. Stimulating a Culture of Improvement: Introducing an Integrated Quality Tool for Organizational Self-Assessment.

    PubMed

    Coleman, Cathy

    2015-06-01

    As leaders and systems-level agents of change, oncology nurses are challenged by opportunities to guide organizational transformation from the front line to the board room. Across all care settings, reform and change initiatives are constants in the quest to optimize quality and healthcare outcomes for individuals, teams, populations, and organizations. This article describes a practical, evidence-based, integrated quality tool for initiating organizational self-assessment to prioritize issues and stimulate a culture of continuous improvement.

  20. Listeriomics: an Interactive Web Platform for Systems Biology of Listeria

    PubMed Central

    Koutero, Mikael; Tchitchek, Nicolas; Cerutti, Franck; Lechat, Pierre; Maillet, Nicolas; Hoede, Claire; Chiapello, Hélène; Gaspin, Christine

    2017-01-01

    As for many model organisms, the amount of Listeria omics data produced has recently increased exponentially. There are now >80 published complete Listeria genomes, around 350 different transcriptomic data sets, and 25 proteomic data sets available. The analysis of these data sets through a systems biology approach and the generation of tools for biologists to browse these various data are a challenge for bioinformaticians. We have developed a web-based platform, named Listeriomics, that integrates different tools for omics data analyses, i.e., (i) an interactive genome viewer to display gene expression arrays, tiling arrays, and sequencing data sets along with proteomics and genomics data sets; (ii) an expression and protein atlas that connects every gene, small RNA, antisense RNA, or protein with the most relevant omics data; (iii) a specific tool for exploring protein conservation through the Listeria phylogenomic tree; and (iv) a coexpression network tool for the discovery of potential new regulations. Our platform integrates all the complete Listeria species genomes, transcriptomes, and proteomes published to date. This website allows navigation among all these data sets with enriched metadata in a user-friendly format and can be used as a central database for systems biology analysis. IMPORTANCE: In recent decades, Listeria has become a key model organism for the study of host-pathogen interactions, noncoding RNA regulation, and bacterial adaptation to stress. To study these mechanisms, several genomics, transcriptomics, and proteomics data sets have been produced. We have developed Listeriomics, an interactive web platform to browse and correlate these heterogeneous sources of information. Our website will allow listeriologists and microbiologists to decipher key regulation mechanisms by using a systems biology approach. PMID:28317029

  1. The center for causal discovery of biomedical knowledge from big data

    PubMed Central

    Bahar, Ivet; Becich, Michael J; Benos, Panayiotis V; Berg, Jeremy; Espino, Jeremy U; Glymour, Clark; Jacobson, Rebecca Crowley; Kienholz, Michelle; Lee, Adrian V; Lu, Xinghua; Scheines, Richard

    2015-01-01

    The Big Data to Knowledge (BD2K) Center for Causal Discovery is developing and disseminating an integrated set of open source tools that support causal modeling and discovery of biomedical knowledge from large and complex biomedical datasets. The Center integrates teams of biomedical and data scientists focused on the refinement of existing and the development of new constraint-based and Bayesian algorithms based on causal Bayesian networks, the optimization of software for efficient operation in a supercomputing environment, and the testing of algorithms and software developed using real data from 3 representative driving biomedical projects: cancer driver mutations, lung disease, and the functional connectome of the human brain. Associated training activities provide both biomedical and data scientists with the knowledge and skills needed to apply and extend these tools. Collaborative activities with the BD2K Consortium further advance causal discovery tools and integrate tools and resources developed by other centers. PMID:26138794

  2. Visuals, Path Control, and Knowledge Gain: Variables that Affect Students' Approval and Enjoyment of a Multimedia Text as a Learning Tool

    ERIC Educational Resources Information Center

    George-Palilonis, Jennifer; Filak, Vincent

    2010-01-01

    As graphically driven, animated, interactive applications offer educators new opportunities for shaping course content, new avenues for research arise as well. Along with these developments comes a need to study the effectiveness of the individual tools at our disposal as well as various methods for integrating those tools in a classroom setting.…

  3. Formal verification and testing: An integrated approach to validating Ada programs

    NASA Technical Reports Server (NTRS)

    Cohen, Norman H.

    1986-01-01

    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.

  4. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  5. Preparing for the future: a review of tools and strategies to support autonomous goal setting for children and youth with autism spectrum disorders.

    PubMed

    Hodgetts, Sandra; Park, Elly

    2017-03-01

    Despite recognized benefits, current clinical practice rarely includes direct input from children and youth with autism spectrum disorder (ASD) in setting rehabilitation goals. This study reviews tools and evidence-based strategies to assist with autonomous goal setting for children and youth with ASD. This study included two components: (1) a scoping review of existing tools and strategies to assist with autonomous goal setting in individuals with ASD and (2) a chart review of inter-disciplinary service plan goals for children and youth with ASD. Eleven data sources, evaluating five different tools to assist with autonomous goal setting for children and youth with ASD, were found. Three themes emerged from the integration of the scoping review and chart review, which are discussed in the paper: (1) generalizability of findings, (2) adaptations to support participation and (3) practice implications. Children and youth with ASD can participate in setting rehabilitation goals, but few tools to support their participation have been evaluated, and those tools that do exist do not align well with current service foci. Visual aids appear to be one effective support, but further research on effective strategies for meaningful engagement in autonomous goal setting for children and youth with ASD is warranted. Implications for rehabilitation: Persons with ASD are less self-determined than their peers. Input into one's own rehabilitation goals and priorities is an important component of self-determination. Few tools exist to help engage children and youth with ASD in setting their own rehabilitation goals. An increased focus on identifying, developing and evaluating effective tools and strategies to facilitate engagement of children and youth with ASD in setting their own rehabilitation goals is warranted.

  6. Improving hydrologic disaster forecasting and response for transportation by assimilating and fusing NASA and other data sets : final report.

    DOT National Transportation Integrated Search

    2017-04-15

    In this 3-year project, the research team developed the Hydrologic Disaster Forecast and Response (HDFR) system, a set of integrated software tools for end users that streamlines hydrologic prediction workflows involving automated retrieval of hetero...

  7. Towards the integration, annotation and association of historical microarray experiments with RNA-seq.

    PubMed

    Chavan, Shweta S; Bauer, Michael A; Peterson, Erich A; Heuck, Christoph J; Johann, Donald J

    2013-01-01

    Transcriptome analysis by microarrays has produced important advances in biomedicine. For instance, in multiple myeloma (MM), microarray approaches led to the development of an effective disease subtyping via cluster assignment, and a 70-gene risk score. Both enabled an improved molecular understanding of MM, and have provided prognostic information for the purposes of clinical management. Many researchers are now transitioning to Next Generation Sequencing (NGS) approaches, and RNA-seq in particular, due to its discovery-based nature, improved sensitivity, and dynamic range. Additionally, RNA-seq allows for the analysis of gene isoforms, splice variants, and novel gene fusions. Given the voluminous amounts of historical microarray data, there is now a need to associate and integrate microarray and RNA-seq data via advanced bioinformatic approaches. Custom software was developed following a model-view-controller (MVC) approach to integrate Affymetrix probe set IDs and gene annotation information from a variety of sources. The tool/approach employs an assortment of strategies to integrate, cross-reference, and associate microarray and RNA-seq datasets. Output from a variety of transcriptome reconstruction and quantitation tools (e.g., Cufflinks) can be directly integrated and/or associated with Affymetrix probe set data, as well as the necessary gene identifiers and/or symbols from a diversity of sources. Strategies are employed to maximize the annotation and cross-referencing process. Custom gene sets (e.g., the MM 70-gene risk score (GEP-70)) can be specified, and the tool can be directly assimilated into an RNA-seq pipeline. This novel bioinformatic approach, which facilitates both the annotation and the association of historical microarray data with richer RNA-seq data, is now assisting with the study of MM cancer biology.
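
    The record above describes associating Affymetrix probe set IDs with RNA-seq quantification; the sketch below shows one generic way such a join could be done with pandas. File names, column layouts, and the gene-set file are hypothetical, and the Cufflinks column names reflect my understanding of its genes.fpkm_tracking output rather than the authors' tool.

```python
# Associate array probe set IDs with RNA-seq gene-level expression via gene symbols.
import pandas as pd

# Probe set ID -> gene symbol annotation (hypothetical export of an array annotation file)
probes = pd.read_csv("affy_probe_annotation.csv")       # columns: probe_set_id, gene_symbol

# Gene-level quantification from a transcriptome tool such as Cufflinks
rnaseq = pd.read_csv("genes.fpkm_tracking", sep="\t")   # includes gene_short_name, FPKM

merged = probes.merge(
    rnaseq[["gene_short_name", "FPKM"]],
    left_on="gene_symbol",
    right_on="gene_short_name",
    how="left",
)

# Restrict to a custom gene set, e.g. a 70-gene risk signature (hypothetical file)
gep70 = pd.read_csv("gep70_gene_list.csv")["gene_symbol"]
risk_subset = merged[merged["gene_symbol"].isin(gep70)]
print(risk_subset.head())
```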

  8. Electronic laboratory notebooks progress and challenges in implementation.

    PubMed

    Machina, Hari K; Wild, David J

    2013-08-01

    Electronic laboratory notebooks (ELNs) are increasingly replacing paper notebooks in life science laboratories, including those in industry, academic settings, and hospitals. ELNs offer significant advantages over paper notebooks, but adopting them in a predominantly paper-based environment is usually disruptive. The benefits of ELN increase when they are integrated with other laboratory informatics tools such as laboratory information management systems, chromatography data systems, analytical instrumentation, and scientific data management systems, but there is no well-established path for effective integration of these tools. In this article, we review and evaluate some of the approaches that have been taken thus far and also some radical new methods of integration that are emerging.

  9. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools are passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  10. Integrated Authoring Tool for Mobile Augmented Reality-Based E-Learning Applications

    ERIC Educational Resources Information Center

    Lobo, Marcos Fermin; Álvarez García, Víctor Manuel; del Puerto Paule Ruiz, María

    2013-01-01

    Learning management systems are increasingly being used to complement classroom teaching and learning and in some instances even replace traditional classroom settings with online educational tools. Mobile augmented reality is an innovative trend in e-learning that is creating new opportunities for teaching and learning. This article proposes a…

  11. Linking climate change and fish conservation efforts using spatially explicit decision support tools

    Treesearch

    Douglas P. Peterson; Seth J. Wenger; Bruce E. Rieman; Daniel J. Isaak

    2013-01-01

    Fisheries professionals are increasingly tasked with incorporating climate change projections into their decisions. Here we demonstrate how a structured decision framework, coupled with analytical tools and spatial data sets, can help integrate climate and biological information to evaluate management alternatives. We present examples that link downscaled climate...

  12. Software engineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan

    1993-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.

  13. Final Report on the Creation of the Wind Integration National Dataset (WIND) Toolkit and API: October 1, 2013 - September 30, 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Bri-Mathias

    2016-04-08

    The primary objective of this work was to create a state-of-the-art national wind resource data set and to provide detailed wind plant output data for specific sites based on that data set. Corresponding retrospective wind forecasts were also included at all selected locations. The combined information from these activities was used to create the Wind Integration National Dataset (WIND), and an extraction tool was developed to allow web-based data access.
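
    As a small downstream example of how such an extracted data set might be used (not part of the WIND Toolkit itself), the sketch below loads a site power time series, assumed to have been downloaded as CSV through the web extraction tool, and computes a monthly capacity factor. The file name, column names, and plant rating are hypothetical.

```python
# Monthly capacity factor from an extracted wind plant power time series (illustrative).
import pandas as pd

NAMEPLATE_MW = 16.0  # hypothetical plant rating

ts = pd.read_csv("wind_site_12345_power.csv", parse_dates=["datetime"])
ts = ts.set_index("datetime")

# Capacity factor = mean generated power / nameplate capacity
monthly_cf = ts["power_mw"].resample("MS").mean() / NAMEPLATE_MW
print(monthly_cf.head())
```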

  14. 3D Slicer as a tool for interactive brain tumor segmentation.

    PubMed

    Kikinis, Ron; Pieper, Steve

    2011-01-01

    User interaction is required for reliable segmentation of brain tumors in clinical practice and in clinical research. By incorporating current research tools, 3D Slicer provides a set of interactive, easy-to-use tools that can be efficiently used for this purpose. One of the modules of 3D Slicer is an interactive editor tool, which contains a variety of interactive segmentation effects. Use of these effects for fast and reproducible segmentation of a single glioblastoma from magnetic resonance imaging data is demonstrated. The innovation in this work lies not in the algorithm, but in the accessibility of the algorithm because of its integration into a software platform that is practical for research in a clinical setting.

  15. Tablets in K-12 Education: Integrated Experiences and Implications

    ERIC Educational Resources Information Center

    An, Heejung, Ed.; Alon, Sandra, Ed.; Fuentes, David, Ed.

    2015-01-01

    The inclusion of new and emerging technologies in the education sector has been a topic of interest to researchers, educators, and software developers alike in recent years. Utilizing the proper tools in a classroom setting is a critical factor in student success. "Tablets in K-12 Education: Integrated Experiences and Implications"…

  16. Integrating Cost as an Independent Variable Analysis with Evolutionary Acquisition - A Multiattribute Design Evaluation Approach

    DTIC Science & Technology

    2003-03-01

    within the Automated Cost Estimating Integrated Tools (ACEIT) software suite (version 5.x). With this capability, one can set cost targets or time...not allow the user to vary more than one decision variable. This limitation of the ACEIT approach thus hinders a holistic view when attempting to

  17. Technology Integration by General Education Teachers of English Language Learners

    ERIC Educational Resources Information Center

    Anglin, Marie Simone

    2017-01-01

    There is a growing population of English language learners (ELLs) in elementary schools across the United States, and a current academic achievement gap between ELLs and non-ELLs. Researchers have found that integration of Web 2.0 tools has benefitted ELLs in language learning settings, outside of the general classroom. The research problem…

  18. Serving the enterprise and beyond with informatics for integrating biology and the bedside (i2b2)

    PubMed Central

    Weber, Griffin; Mendis, Michael; Gainer, Vivian; Chueh, Henry C; Churchill, Susanne; Kohane, Isaac

    2010-01-01

    Informatics for Integrating Biology and the Bedside (i2b2) is one of seven projects sponsored by the NIH Roadmap National Centers for Biomedical Computing (http://www.ncbcs.org). Its mission is to provide clinical investigators with the tools necessary to integrate medical record and clinical research data in the genomics age, a software suite to construct and integrate the modern clinical research chart. i2b2 software may be used by an enterprise's research community to find sets of interesting patients from electronic patient medical record data, while preserving patient privacy through a query tool interface. Project-specific mini-databases (“data marts”) can be created from these sets to make highly detailed data available on these specific patients to the investigators on the i2b2 platform, as reviewed and restricted by the Institutional Review Board. The current version of this software has been released into the public domain and is available at the URL: http://www.i2b2.org/software. PMID:20190053

  19. Closeout of CRADA JSA 2012S004: Chapter 5, Integrated Control System, of the document of the ESS Conceptual Design Report, publicly available at https://europeanspallationsource.se/accelerator-documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Satogata, Todd

    2013-04-22

    The integrated control system (ICS) is responsible for the whole ESS machine and facility: accelerator, target, neutron scattering instruments and conventional facilities. This unified approach keeps the costs of development, maintenance and support relatively low. ESS has selected a standardised, field-proven controls framework, the Experimental Physics and Industrial Control System (EPICS), which was originally developed jointly by Argonne and Los Alamos National Laboratories. Complementing this selection are best practices and experience from similar facilities regarding platform standardisation, control system development and device integration and commissioning. The components of ICS include the control system core, the control boxes, the BLED database management system, and the human-machine interface. The control system core is a set of systems and tools that make it possible for the control system to provide required data, information and services to engineers, operators, physicists and the facility itself. The core components are the timing system that makes possible clock synchronisation across the facility, the machine protection system (MPS) and the personnel protection system (PPS) that prevent damage to the machine and personnel, and a set of control system services. Control boxes are servers that control a collection of equipment (for example a radio frequency cavity). The integrated control system will include many control boxes that can be assigned to one supplier, such as an internal team, a collaborating institute or a commercial vendor. This approach facilitates a clear division of responsibilities and makes integration much easier. A control box is composed of a standardised hardware platform, components, development tools and services. On the top level, it interfaces with the core control system components (timing, MPS, PPS) and with the human-machine interface. At the bottom, it interfaces with the equipment and parts of the facility through a set of analog and digital signals, real-time control loops and other communication buses. The ICS central data management system is named BLED (beam line element databases). BLED is a set of databases, tools and services that is used to store, manage and access data. It holds vital control system configuration and physics-related (lattice) information about the accelerator, target and instruments. It facilitates control system configuration by bringing together direct input-output controller (IOC) configuration and real-time data from proton and neutron beam line models. BLED also simplifies development and speeds up the code-test-debug cycle. The set of tools that access BLED will be tailored to the needs of different categories of users, such as ESS staff physicists, engineers, and operators; external partner laboratories; and visiting experimental instrument users. The human-machine interface is vital to providing a high-quality experience to ICS users. It encompasses a wide array of devices and software tools, from control room screens to engineer terminal windows; from beam physics data tools to post-mortem data analysis tools. It serves users with a wide range of skills from widely varied backgrounds. The Controls Group is developing a set of user profiles to accommodate this diverse range of use-cases and users.
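
    Because the control boxes expose their equipment as EPICS process variables, a client-side interaction can be sketched with the pyepics package. The PV names below are invented for illustration and do not follow ESS's actual naming convention; a reachable IOC is assumed.

```python
# Reading, writing, and monitoring EPICS process variables with pyepics (illustrative PV names).
from epics import PV, caget, caput

amplitude = caget("RF:CAV01:AMPL_RB")           # read-back served by a control box IOC
print("Cavity 1 amplitude read-back:", amplitude)

caput("RF:CAV01:AMPL_SP", 5.0, wait=True)       # write a new setpoint

# Subscribe to updates, as an operator display or archiver front end might
pv = PV("RF:CAV01:AMPL_RB")
pv.add_callback(lambda **kw: print(kw["pvname"], "->", kw["value"]))
```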

  20. Using Dialogic Methods as a Participatory Knowledge Translation Approach to Promote Integration of Nurse Practitioners in Primary Healthcare Settings.

    PubMed

    Oelke, Nelly D; Plamondon, Katrina M; Mendel, Donna

    2016-01-01

    Background: Nurse practitioners (NPs) were introduced in British Columbia (BC) in 2005 as a new category of health provider. Given the newness of NPs in our health system, it is not unexpected that continued work is required to better integrate NPs in healthcare in BC. Aim: This paper will focus on a research study using dialogic methods as a participatory knowledge translation approach to facilitate integration of NPs in primary healthcare (PHC) settings. Methods: Deliberative dialogue (DD) is a useful knowledge translation tool in health services delivery. Through facilitated conversations with stakeholders, invited to consider research evidence in the context of their experience and tacit knowledge, collective data are generated. DD is a powerful tool to engage stakeholders in the development and implementation of evidence-informed policies and services through discussion of issues, consideration of priorities and development of concrete actions that can be implemented by policy makers and decision-makers. Two DD sessions were held with stakeholders involved in supporting NP integration in a health authority in southern interior BC. Stakeholders were provided syntheses of a literature review and interview results. The first session resulted in the collective development of 10 actions to promote NP integration in PHC settings. The second session was conducted six months later to discuss progress and revisions to actions. Discussion: The use of the dialogic methods used in studying NP integration in PHC settings proved useful in promoting real conversation about the implications of research evidence in living contexts, enabling diverse stakeholders to co-create collaborative actions for further NP integration. The conversations and actions were used to support further NP integration during the study and beyond. Conclusion: DD is a useful approach for transforming health services policy and delivery. It has the potential to move change forward with co-created solutions by the stakeholders involved.

  1. Collaborative Writing with Web 2.0 Technologies: Education Students' Perceptions

    ERIC Educational Resources Information Center

    Brodahl, Cornelia; Hadjerrouit, Said; Hansen, Nils Kristian

    2011-01-01

    Web 2.0 technologies are becoming popular in teaching and learning environments. Among them several online collaborative writing tools, like wikis and blogs, have been integrated into educational settings. Research has been carried out on a wide range of subjects related to wikis, while other, comparable tools like Google Docs and EtherPad remain…

  2. A Practical Guide to the Technology and Adoption of Software Process Automation

    DTIC Science & Technology

    1994-03-01

    IDE's integration of Software through Pictures, CodeCenter, and FrameMaker). However, successful use of integrated tools, as reflected in actual...tool for a specific platform. Thus, when a Work Context calls for a word processor, the weaver.tis file can be set up to call FrameMaker for the Sun4

  3. Decision support and data warehousing tools boost competitive advantage.

    PubMed

    Waldo, B H

    1998-01-01

    The ability to communicate across the care continuum is fast becoming an integral component of the successful health enterprise. As integrated delivery systems are formed and patient care delivery is restructured, health care professionals must be able to distribute, access, and evaluate information across departments and care settings. The Aberdeen Group, a computer and communications research and consulting organization, believes that "the single biggest challenge for next-generation health care providers is to improve on how they consolidate and manage information across the continuum of care. This involves building a strategic warehouse of clinical and financial information that can be shared and leveraged by health care professionals, regardless of the location or type of care setting" (Aberdeen Group, Inc., 1997). The value and importance of data and systems integration are growing. Organizations that create a strategy and implement DSS tools to provide decision-makers with the critical information they need to face the competition and maintain quality and costs will have the advantage.

  4. Telehealth in the School Setting: An Integrative Review

    ERIC Educational Resources Information Center

    Reynolds, Cori A.; Maughan, Erin D.

    2015-01-01

    Telehealth, the provision of health care through long-distance telecommunications technology, is a tool that can be used by school nurses to address and improve the health status of schoolchildren. The purpose of this literature review is to examine research related to implementation of telehealth in the school setting. A review of the literature…

  5. Integrating Digital and STEM Practices

    ERIC Educational Resources Information Center

    White, Tobin; Martin, Lee

    2012-01-01

    As mobile devices become increasingly pervasive among youth, the gap between students with and without access to personal computers at home may soon be replaced by a new digital divide: between one set of informal ways of using those tools that are familiar, personally meaningful, and relevant to their out-of-school lives, and another set of uses…

  6. Integrating teaching and authentic research in the field and laboratory settings

    NASA Astrophysics Data System (ADS)

    Daryanto, S.; Wang, L.; Kaseke, K. F.; Ravi, S.

    2016-12-01

    Typically, authentic research activities are separated from rigorous classroom teaching. Here we assessed the potential of integrating teaching and research activities both in the field and in the laboratory. We worked with students from both the US and abroad, without strong science backgrounds, to use advanced environmental sensors and statistical tools to conduct innovative projects. The students included one from Namibia and two local high school students in Indianapolis (through Project SEED, Summer Experience for the Economically Disadvantaged). They conducted leaf potential measurements, isotope measurements, and meta-analysis. The experience showed us the great potential of integrating teaching and research in both field and laboratory settings.

  7. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    NASA Astrophysics Data System (ADS)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extent of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: a developer-oriented source code library and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance are guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: In the absence of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014, Bishkek, Kyrgyz Republic. [2] UNISDR, "Living with Risk", Geneva, Switzerland, 2004. [3] P. Bisch, E. Carvalho, H. Degree, P. Fajfar, M. Fardis, P. Franchin, M. Kreslin, A. Pecker, "Eurocode 8: Seismic Design of Buildings", Lisbon, 2011. (SENSUM: www.sensum-project.eu, grant number: 312972) (RASOR: www.rasor-project.eu, grant number: 606888)
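
    In the spirit of the building-regularity proxy cited in this record, the sketch below derives a simple footprint compactness measure from a binary built-up raster using GDAL and OpenCV. It is a generic illustration under assumed inputs (file name, thresholds), not the SENSUM implementation.

```python
# Footprint compactness from a binary built-up mask (illustrative, not the SENSUM code).
# Assumes OpenCV >= 4 (findContours returns two values) and GDAL Python bindings.
import cv2
import numpy as np
from osgeo import gdal

ds = gdal.Open("builtup_mask.tif")                     # hypothetical binary built-up raster
mask = ds.GetRasterBand(1).ReadAsArray().astype(np.uint8) * 255

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    area = cv2.contourArea(c)
    perimeter = cv2.arcLength(c, True)
    if area < 50:                                      # skip speckle
        continue
    # Compactness is 1.0 for a circle and lower for irregular footprints.
    compactness = 4 * np.pi * area / (perimeter ** 2)
    print(f"footprint area={area:.0f}px  compactness={compactness:.2f}")
```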

  8. A hospital-wide clinical findings dictionary based on an extension of the International Classification of Diseases (ICD).

    PubMed

    Bréant, C; Borst, F; Campi, D; Griesser, V; Momjian, S

    1999-01-01

    The use of a controlled vocabulary set in a hospital-wide clinical information system is of crucial importance for many departmental database systems to communicate and exchange information. In the absence of an internationally recognized clinical controlled vocabulary set, a new extension of the International statistical Classification of Diseases (ICD) is proposed. It expands the scope of the standard ICD beyond diagnosis and procedures to clinical terminology. In addition, the common Clinical Findings Dictionary (CFD) further records the definition of clinical entities. The construction of the vocabulary set and the CFD is incremental and manual. Tools have been implemented to facilitate the tasks of defining/maintaining/publishing dictionary versions. The design of database applications in the integrated clinical information system is driven by the CFD which is part of the Medical Questionnaire Designer tool. Several integrated clinical database applications in the field of diabetes and neuro-surgery have been developed at the HUG.

  9. A hospital-wide clinical findings dictionary based on an extension of the International Classification of Diseases (ICD).

    PubMed Central

    Bréant, C.; Borst, F.; Campi, D.; Griesser, V.; Momjian, S.

    1999-01-01

    The use of a controlled vocabulary set in a hospital-wide clinical information system is of crucial importance for many departmental database systems to communicate and exchange information. In the absence of an internationally recognized clinical controlled vocabulary set, a new extension of the International statistical Classification of Diseases (ICD) is proposed. It expands the scope of the standard ICD beyond diagnosis and procedures to clinical terminology. In addition, the common Clinical Findings Dictionary (CFD) further records the definition of clinical entities. The construction of the vocabulary set and the CFD is incremental and manual. Tools have been implemented to facilitate the tasks of defining/maintaining/publishing dictionary versions. The design of database applications in the integrated clinical information system is driven by the CFD which is part of the Medical Questionnaire Designer tool. Several integrated clinical database applications in the field of diabetes and neuro-surgery have been developed at the HUG. PMID:10566451

  10. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
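
    A minimal sketch of the kind of programmatic access this record describes, using the public AllenSDK as I understand it (this is not the authors' NeuroManager tooling, and the calls should be checked against the current AllenSDK documentation):

```python
# Download Cell Types Database metadata, features, and one raw recording via AllenSDK.
from allensdk.core.cell_types_cache import CellTypesCache

ctc = CellTypesCache(manifest_file="cell_types/manifest.json")

cells = ctc.get_cells()                 # metadata for the characterized cells
features = ctc.get_ephys_features()     # precomputed sweep/spike features
print(len(cells), "cells,", len(features), "feature records")

# Raw whole-cell patch clamp traces for one specimen (choice of specimen is arbitrary)
specimen_id = cells[0]["id"]
data_set = ctc.get_ephys_data(specimen_id)
sweep = data_set.get_sweep(data_set.get_sweep_numbers()[0])
print(sweep["stimulus"].shape, sweep["response"].shape)
```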

  11. Understanding and reduction of defects on finished EUV masks

    NASA Astrophysics Data System (ADS)

    Liang, Ted; Sanchez, Peter; Zhang, Guojing; Shu, Emily; Nagpal, Rajesh; Stivers, Alan

    2005-05-01

    To reduce the risk of EUV lithography adaptation for the 32nm technology node in 2009, Intel has operated an EUV mask Pilot Line since early 2004. The Pilot Line integrates all the necessary process modules, including common tool sets shared with current photomask production as well as EUV-specific tools. This integrated endeavor ensures a comprehensive understanding of any issues and development of solutions for the eventual fabrication of defect-free EUV masks. Two enabling modules for "defect-free" masks are pattern inspection and repair, which have been integrated into the Pilot Line. This is the first time we are able to look at real defects originating from multilayer blanks and the patterning process on finished masks over the entire mask area. In this paper, we describe our efforts in the qualification of DUV pattern inspection and electron beam mask repair tools for Pilot Line operation, including inspection tool sensitivity, defect classification and characterization, and defect repair. We will discuss the origins of each of the five classes of defects as seen by the DUV pattern inspection tool on finished masks, and present solutions for eliminating and mitigating them.

  12. Desiderata for Healthcare Integrated Data Repositories Based on Architectural Comparison of Three Public Repositories

    PubMed Central

    Huser, Vojtech; Cimino, James J.

    2013-01-01

    Integrated data repositories (IDRs) are indispensable tools for numerous biomedical research studies. We compare three large IDRs (Informatics for Integrating Biology and the Bedside (i2b2), HMO Research Network’s Virtual Data Warehouse (VDW) and Observational Medical Outcomes Partnership (OMOP) repository) in order to identify common architectural features that enable efficient storage and organization of large amounts of clinical data. We define three high-level classes of underlying data storage models and we analyze each repository using this classification. We look at how a set of sample facts is represented in each repository and conclude with a list of desiderata for IDRs that deal with the information storage model, terminology model, data integration and value-sets management. PMID:24551366

  13. Desiderata for healthcare integrated data repositories based on architectural comparison of three public repositories.

    PubMed

    Huser, Vojtech; Cimino, James J

    2013-01-01

    Integrated data repositories (IDRs) are indispensable tools for numerous biomedical research studies. We compare three large IDRs (Informatics for Integrating Biology and the Bedside (i2b2), HMO Research Network's Virtual Data Warehouse (VDW) and Observational Medical Outcomes Partnership (OMOP) repository) in order to identify common architectural features that enable efficient storage and organization of large amounts of clinical data. We define three high-level classes of underlying data storage models and we analyze each repository using this classification. We look at how a set of sample facts is represented in each repository and conclude with a list of desiderata for IDRs that deal with the information storage model, terminology model, data integration and value-sets management.
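
    To make the notion of different underlying storage models concrete, here is a purely didactic contrast between an entity-attribute-value (EAV) style fact row, of the kind used by i2b2's fact table, and a wide one-column-per-variable layout. The field names are illustrative, not the actual i2b2, VDW, or OMOP schemas compared in the paper.

```python
# Same clinical observation in two storage models (field names are illustrative).

# EAV style: one row per fact; the measured concept is data, not schema.
eav_fact = {
    "patient_id": 1234,
    "concept_code": "LOINC:2345-7",   # serum glucose
    "value_num": 5.8,
    "unit": "mmol/L",
    "start_date": "2013-01-15",
}

# Wide layout: one column per variable; adding a new measurement changes the schema.
wide_row = {
    "patient_id": 1234,
    "visit_date": "2013-01-15",
    "glucose_mmol_l": 5.8,
    "hba1c_pct": None,
}

print(eav_fact, wide_row)
```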

  14. Language Motivation in a Reconfigured Europe: Access, Identity, Autonomy

    ERIC Educational Resources Information Center

    Ushioda, Ema

    2006-01-01

    In this paper, I propose that we need to develop an appropriate set of conceptual tools for examining motivational issues pertaining to linguistic diversity, mobility and social integration in a rapidly changing and expanding Europe. I begin by drawing on research that has begun to reframe the concept of integrative motivation in the context of…

  15. Integration of Multimedia Courseware into ESP Instruction for Technological Purposes in Higher Technical Education

    ERIC Educational Resources Information Center

    Tsai, Shu-Chiao

    2012-01-01

    This study reports on integrating ESP (English for specific purposes) multimedia courseware for semiconductor technology into instruction of three different language programs in higher education by using it as a silent partner. It focuses primarily on techniques and tools to motivate retention of under-prepared students in an EFL setting. The…

  16. An Application of Variational Theory to an Integrated Walrasian Model of Exchange, Consumption and Production

    NASA Astrophysics Data System (ADS)

    Donato, M. B.; Milasi, M.; Vitanza, C.

    2010-09-01

    An existence result of a Walrasian equilibrium for an integrated model of exchange, consumption and production is obtained. The equilibrium model is characterized in terms of a suitable generalized quasi-variational inequality; so the existence result comes from an original technique which takes into account tools of convex and set-valued analysis.
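
    For orientation, a generalized quasi-variational inequality of the kind invoked in this record can be written in the following generic textbook form (this is not necessarily the authors' exact operator or constraint map):

```latex
% Generic generalized quasi-variational inequality (QVI):
% the constraint set K(x) itself depends on the solution x.
\[
  \text{find } x^{\ast} \in K(x^{\ast}) \ \text{such that} \quad
  \langle F(x^{\ast}),\, y - x^{\ast} \rangle \;\geq\; 0
  \qquad \forall\, y \in K(x^{\ast}),
\]
% where F encodes the equilibrium conditions and K(.) is a set-valued
% constraint map (e.g. budget and production constraints).
```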

  17. Vertical and Horizontal Integration of Laboratory Curricula and Course Projects across the Electronic Engineering Technology Program

    ERIC Educational Resources Information Center

    Zhan, Wei; Goulart, Ana; Morgan, Joseph A.; Porter, Jay R.

    2011-01-01

    This paper discusses the details of the curricular development effort with a focus on the vertical and horizontal integration of laboratory curricula and course projects within the Electronic Engineering Technology (EET) program at Texas A&M University. Both software and hardware aspects are addressed. A common set of software tools are…

  18. BUSCA: an integrative web server to predict subcellular localization of proteins.

    PubMed

    Savojardo, Castrense; Martelli, Pier Luigi; Fariselli, Piero; Profiti, Giuseppe; Casadio, Rita

    2018-04-30

    Here, we present BUSCA (http://busca.biocomp.unibo.it), a novel web server that integrates different computational tools for predicting protein subcellular localization. BUSCA combines methods for identifying signal and transit peptides (DeepSig and TPpred3), GPI-anchors (PredGPI) and transmembrane domains (ENSEMBLE3.0 and BetAware) with tools for discriminating the subcellular localization of both globular and membrane proteins (BaCelLo, MemLoci and SChloro). Outcomes from the different tools are processed and integrated for annotating the subcellular localization of both eukaryotic and bacterial protein sequences. We benchmark BUSCA against protein targets derived from recent CAFA experiments and other specific data sets, reporting state-of-the-art performance. BUSCA scores better than all other evaluated methods on 2732 targets from CAFA2, with an F1 value of 0.49, and is among the best methods when predicting targets from CAFA3. We propose BUSCA as an integrated and accurate resource for the annotation of protein subcellular localization.

  19. The Plant Genome Integrative Explorer Resource: PlantGenIE.org.

    PubMed

    Sundell, David; Mannapperuma, Chanaka; Netotea, Sergiu; Delhomme, Nicolas; Lin, Yao-Cheng; Sjödin, Andreas; Van de Peer, Yves; Jansson, Stefan; Hvidsten, Torgeir R; Street, Nathaniel R

    2015-12-01

    Accessing and exploring large-scale genomics data sets remains a significant challenge to researchers without specialist bioinformatics training. We present the integrated PlantGenIE.org platform for exploration of Populus, conifer and Arabidopsis genomics data, which includes expression networks and associated visualization tools. Standard features of a model organism database are provided, including genome browsers, gene list annotation, Blast homology searches and gene information pages. Community annotation updating is supported via integration of WebApollo. We have produced an RNA-sequencing (RNA-Seq) expression atlas for Populus tremula and have integrated these data within the expression tools. An updated version of the ComPlEx resource for performing comparative plant expression analyses of gene coexpression network conservation between species has also been integrated. The PlantGenIE.org platform provides intuitive access to large-scale and genome-wide genomics data from model forest tree species, facilitating both community contributions to annotation improvement and tools supporting use of the included data resources to inform biological insight. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.

  20. Web-TCGA: an online platform for integrated analysis of molecular cancer data sets.

    PubMed

    Deng, Mario; Brägelmann, Johannes; Schultze, Joachim L; Perner, Sven

    2016-02-06

    The Cancer Genome Atlas (TCGA) is a pool of molecular data sets publicly accessible and freely available to cancer researchers anywhere around the world. However, widespread use is limited, since advanced knowledge of statistics and statistical software is required. In order to improve accessibility, we created Web-TCGA, a web-based, freely accessible online tool, which can also be run in a private instance, for integrated analysis of molecular cancer data sets provided by TCGA. In contrast to already available tools, Web-TCGA utilizes different methods for analysis and visualization of TCGA data, allowing users to generate global molecular profiles across different cancer entities simultaneously. In addition to global molecular profiles, Web-TCGA offers highly detailed gene- and tumor-entity-centric analysis by providing interactive tables and views. As a supplement to other already available tools, such as cBioPortal (Sci Signal 6:pl1, 2013, Cancer Discov 2:401-4, 2012), Web-TCGA offers an analysis service, which does not require any installation or configuration, for molecular data sets available at the TCGA. Individual processing requests (queries) are generated by the user for mutation, methylation, expression and copy number variation (CNV) analyses. The user can focus analyses on results from single genes and cancer entities or perform a global analysis (multiple cancer entities and genes simultaneously).

  1. Bioenergy Knowledge Discovery Framework Fact Sheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The Bioenergy Knowledge Discovery Framework (KDF) supports the development of a sustainable bioenergy industry by providing access to a variety of data sets, publications, and collaboration and mapping tools that support bioenergy research, analysis, and decision making. In the KDF, users can search for information, contribute data, and use the tools and map interface to synthesize, analyze, and visualize information in a spatially integrated manner.

  2. Youth Mental Health, Family Practice, and Knowledge Translation Video Games about Psychosis: Family Physicians' Perspectives.

    PubMed

    Ferrari, Manuela; Suzanne, Archie

    2017-01-01

    Family practitioners face many challenges providing mental healthcare to youth. Digital technology may offer solutions, but the products often need to be adapted for primary care. This study reports on family physicians' perspectives on the relevance and feasibility of a digital knowledge translation (KT) tool, a set of video games, designed to raise awareness about psychosis, marijuana use, and facilitate access to mental health services among youth. As part of an integrated knowledge translation project, five family physicians from a family health team participated in a focus group. The focus group delved into their perspectives on treating youth with mental health concerns while exploring their views on implementing the digital KT tool in their practice. Qualitative data was analyzed using thematic analysis to identify patterns, concepts, and themes in the transcripts. Three themes were identified: (a) challenges in assessing youth with mental health concerns related to training, time constraints, and navigating the system; (b) feedback on the KT tool; and, (c) ideas on how to integrate it into a primary care practice. Family practitioners felt that the proposed video game KT tool could be used to address youth's mental health and addictions issues in primary care settings.

  3. Pilot study of digital tools to support multimodal hand hygiene in a clinical setting.

    PubMed

    Thirkell, Gary; Chambers, Joanne; Gilbart, Wayne; Thornhill, Kerrill; Arbogast, James; Lacey, Gerard

    2018-03-01

    Digital tools for hand hygiene do not share data, limiting their potential to support multimodal programs. The Christie NHS Foundation Trust, United Kingdom, worked with GOJO (in the United States), MEG (in Ireland), and SureWash (in Ireland) to integrate their systems and pilot their combined use in a clinical setting. A 28-bed medical oncology unit piloted the system for 5 weeks. Live data from the tools were combined to create a novel combined risk status metric that was displayed publicly and via a management Web site. The combined risk status decreased over the pilot period. However, larger and longer-duration studies are required to reach statistical significance. Staff and, especially, patient reactions were positive: 70% of the hand hygiene training events were completed by patients. The digital tools did not negatively impact clinical workflow and received positive engagement from staff and patients. The combined risk status did not change significantly over the short pilot period because there was also no specific hand hygiene improvement campaign underway at the time of the pilot study. The results indicate that integrated digital tools can provide both rich data and novel tools that both measure impact and provide feedback to support the implementation of multimodal hand hygiene campaigns, reducing the need for significant additional personnel resources. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. All rights reserved.
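
    The article does not publish the formula behind its combined risk status, so the sketch below is purely illustrative: it assumes three normalized inputs (electronic dispenser compliance, audit score, training coverage) and combines them with invented weights into a single 0-1 risk value. It should not be read as the metric used in the pilot.

    ```python
    # Illustrative only: the article does not publish its combined risk formula,
    # so the inputs and weights below are assumptions, not the pilot's metric.
    def combined_risk_status(dispenser_compliance: float,
                             audit_score: float,
                             training_coverage: float,
                             weights=(0.5, 0.3, 0.2)) -> float:
        """Return a 0-1 risk value (higher = more risk) from three 0-1 inputs:
        electronic dispenser compliance, observational audit score, and the
        fraction of staff/patients who completed hand-hygiene technique training."""
        for v in (dispenser_compliance, audit_score, training_coverage):
            if not 0.0 <= v <= 1.0:
                raise ValueError("inputs must be fractions in [0, 1]")
        performance = (weights[0] * dispenser_compliance
                       + weights[1] * audit_score
                       + weights[2] * training_coverage)
        return 1.0 - performance   # good performance -> low risk

    if __name__ == "__main__":
        print(round(combined_risk_status(0.72, 0.85, 0.60), 3))
    ```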

  4. Ergonomics action research II: a framework for integrating HF into work system design.

    PubMed

    Neumann, W P; Village, J

    2012-01-01

    This paper presents a conceptual framework that can support efforts to integrate human factors (HF) into the work system design process, where improved and cost-effective application of HF is possible. The framework advocates strategies of broad stakeholder participation, linking of performance and health goals, and process-focussed change tools that can help practitioners engage in improvements to embed HF into a firm's work system design process. Recommended tools include business process mapping of the design process, implementing design criteria, using cognitive mapping to connect to managers' strategic goals, tactical use of training and adopting virtual HF (VHF) tools to support the integration effort. Consistent with organisational change research, the framework provides guidance but does not suggest a strict set of steps. This allows more adaptability for the practitioner who must navigate within a particular organisational context to secure support for embedding HF into the design process for improved operator wellbeing and system performance. There has been little scientific literature about how a practitioner might integrate HF into a company's work system design process. This paper proposes a framework for this effort by presenting a coherent conceptual framework, process tools, design tools and procedural advice that can be adapted for a target organisation.

  5. Grid Data and Tools | Grid Modernization | NREL

    Science.gov Websites

    Technologies and strategies, including renewable resource data sets and models of the electric power system. Renewable Resource Data: a library of resource information to inform the design of efficient, integrated…

  6. A Tropical Marine Microbial Natural Products Geobibliography as an Example of Desktop Exploration of Current Research Using Web Visualisation Tools

    PubMed Central

    Mukherjee, Joydeep; Llewellyn, Lyndon E; Evans-Illidge, Elizabeth A

    2008-01-01

    Microbial marine biodiscovery is a recent scientific endeavour developing at a time when information and other technologies are also making great technical strides. Global visualisation of datasets is now becoming available to the world through powerful and readily available software such as Worldwind™, ArcGIS Explorer™ and Google Earth™. Overlaying custom information upon these tools is in the hands of every scientist, and more and more scientific organisations are making data available that can also be integrated into these global visualisation tools. The integrated global view that these tools enable provides a powerful desktop exploration tool. Here we demonstrate the value of this approach to marine microbial biodiscovery by developing a geobibliography that incorporates citations on tropical and near-tropical marine microbial natural products research with Google Earth™ and additional ancillary global data sets. The tools and software used are all readily available and the reader is able to use and install the material described in this article. PMID:19172194
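
    As a hedged example of the overlay approach described above, the snippet below writes citation records as KML placemarks that Google Earth (or other virtual globes) can open. The records, coordinates, and output file name are invented; the KML 2.2 namespace and the lon,lat,alt coordinate order are standard.

    ```python
    # Minimal sketch: write citation records as KML placemarks that Google Earth
    # (or other virtual globes) can open. The records below are invented examples.
    from xml.sax.saxutils import escape

    records = [
        {"title": "Example sponge-derived actinomycete study", "lat": -18.29, "lon": 147.70},
        {"title": "Example mangrove sediment fungus study", "lat": 1.29, "lon": 103.85},
    ]

    def to_kml(recs) -> str:
        placemarks = "\n".join(
            "  <Placemark>\n"
            f"    <name>{escape(r['title'])}</name>\n"
            f"    <Point><coordinates>{r['lon']},{r['lat']},0</coordinates></Point>\n"
            "  </Placemark>"
            for r in recs
        )
        return ('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
                f"{placemarks}\n</Document>\n</kml>\n")

    with open("geobibliography.kml", "w", encoding="utf-8") as fh:
        fh.write(to_kml(records))
    ```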

  7. Rapid analysis of protein backbone resonance assignments using cryogenic probes, a distributed Linux-based computing architecture, and an integrated set of spectral analysis tools.

    PubMed

    Monleón, Daniel; Colson, Kimberly; Moseley, Hunter N B; Anklin, Clemens; Oswald, Robert; Szyperski, Thomas; Montelione, Gaetano T

    2002-01-01

    Rapid data collection, spectral referencing, processing by time domain deconvolution, peak picking and editing, and assignment of NMR spectra are necessary components of any efficient integrated system for protein NMR structure analysis. We have developed a set of software tools designated AutoProc, AutoPeak, and AutoAssign, which function together with the data processing and peak-picking programs NMRPipe and Sparky, to provide an integrated software system for rapid analysis of protein backbone resonance assignments. In this paper we demonstrate that these tools, together with high-sensitivity triple resonance NMR cryoprobes for data collection and a Linux-based computer cluster architecture, can be combined to provide nearly complete backbone resonance assignments and secondary structures (based on chemical shift data) for a 59-residue protein in less than 30 hours of data collection and processing time. In this optimum case of a small protein providing excellent spectra, extensive backbone resonance assignments could also be obtained using less than 6 hours of data collection and processing time. These results demonstrate the feasibility of high throughput triple resonance NMR for determining resonance assignments and secondary structures of small proteins, and the potential for applying NMR in large scale structural proteomics projects.

  8. CMS Configuration Editor: GUI based application for user analysis job

    NASA Astrophysics Data System (ADS)

    de Cosa, A.

    2011-12-01

    We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way within the CMS framework, which organizes user analysis code flexibly. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. Developing analysis jobs and managing the configuration of the many required modules can be a challenging task for users, especially newcomers. For this reason a graphical tool has been conceived to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analyses while visualizing in real time the effects of their actions. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependences existing among the modules and check the data flow. They can see the values to which parameters are set and change them as required by their analysis task. The integration of common tools in the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.
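
    The following standalone sketch is not the CMSSW/PAT configuration API; it only illustrates, with invented module names and parameters, the kind of introspection the Config Editor performs over a Python job configuration: listing the workflow order implied by module dependencies and editing a parameter value.

    ```python
    # Toy illustration of configuration introspection (not the CMSSW/PAT API):
    # modules declare parameters and input dependencies; the "editor" lists the
    # workflow order and lets the user change a parameter value.
    class Module:
        def __init__(self, name, depends_on=(), **params):
            self.name, self.depends_on, self.params = name, list(depends_on), dict(params)

    def workflow_order(modules):
        by_name, order, seen = {m.name: m for m in modules}, [], set()
        def visit(m):
            if m.name in seen:
                return
            seen.add(m.name)
            for dep in m.depends_on:        # visit dependencies first
                visit(by_name[dep])
            order.append(m.name)
        for m in modules:
            visit(m)
        return order

    config = [
        Module("jetSelector", ptMin=25.0, etaMax=2.4),
        Module("jetCleaner", depends_on=["jetSelector"], deltaR=0.4),
        Module("analyzer", depends_on=["jetCleaner"], histogramBins=50),
    ]

    print("workflow:", " -> ".join(workflow_order(config)))
    config[0].params["ptMin"] = 30.0        # edit a parameter, as the GUI would
    print("jetSelector parameters:", config[0].params)
    ```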

  9. Enhancing Competence in Health Social Work Education through Simulation-Based Learning: Strategies from a Case Study of a Family Session

    ERIC Educational Resources Information Center

    Craig, Shelley L.; McInroy, Lauren B.; Bogo, Marion; Thompson, Michelle

    2017-01-01

    Simulation-based learning (SBL) is a powerful tool for social work education, preparing students to practice in integrated health care settings. In an educational environment addressing patient health using an integrated care model, there is growing emphasis on students developing clinical competencies prior to entering clinical placements or…

  10. Integrating iPad Technology in Earth Science K-12 Outreach Courses: Field and Classroom Applications

    ERIC Educational Resources Information Center

    Wallace, Davin J.; Witus, Alexandra E.

    2013-01-01

    Incorporating technology into courses is becoming a common practice in universities. However, in the geosciences, it is difficult to find technology that can easily be transferred between classroom- and field-based settings. The iPad is ideally suited to bridge this gap. Here, we fully integrate the iPad as an educational tool into two…

  11. Pathview: an R/Bioconductor package for pathway-based data integration and visualization.

    PubMed

    Luo, Weijun; Brouwer, Cory

    2013-07-15

    Pathview is a novel tool set for pathway-based data integration and visualization. It maps and renders user data on relevant pathway graphs. Users only need to supply their data and specify the target pathway. Pathview automatically downloads the pathway graph data, parses the data file, maps and integrates user data onto the pathway and renders pathway graphs with the mapped data. Although built as a stand-alone program, Pathview may seamlessly integrate with pathway and functional analysis tools for large-scale and fully automated analysis pipelines. The package is freely available under the GPLv3 license through Bioconductor and R-Forge. It is available at http://bioconductor.org/packages/release/bioc/html/pathview.html and at http://Pathview.r-forge.r-project.org/. Contact: luo_weijun@yahoo.com. Supplementary data are available at Bioinformatics online.

  12. Outdoor environmental assessment of attention promoting settings for preschool children.

    PubMed

    Mårtensson, F; Boldemann, C; Söderström, M; Blennow, M; Englund, J-E; Grahn, P

    2009-12-01

    The restorative potential of green outdoor environments for children in preschool settings was investigated by measuring the attention of children playing in settings with different environmental features. Eleven preschools with outdoor environments typical for the Stockholm area were assessed using the outdoor play environment categories (OPEC) and the fraction of visible sky from play structures (sky view factor), and 198 children, aged 4.5-6.5 years, were rated by the staff for inattentive, hyperactive and impulsive behaviors with the ECADDES tool. Children playing in large and integrated outdoor areas containing large areas of trees, shrubbery and a hilly terrain less often showed behaviors of inattention (p<.05). The choice of tool for assessment of attention is discussed in relation to outdoor stay and play characteristics in Swedish preschool settings. The results indicate that the restorative potential of green outdoor environments applies also to preschool children and that environmental assessment tools such as OPEC can be useful when locating and developing health-promoting land adjacent to preschools.

  13. Adapting a Technology-Based Implementation Support Tool for Community Mental Health: Challenges and Lessons Learned.

    PubMed

    Livet, Melanie; Fixsen, Amanda

    2018-01-01

    With mental health services shifting to community-based settings, community mental health (CMH) organizations are under increasing pressure to deliver effective services. Despite availability of evidence-based interventions, there is a gap between effective mental health practices and the care that is routinely delivered. Bridging this gap requires availability of easily tailorable implementation support tools to assist providers in implementing evidence-based intervention with quality, thereby increasing the likelihood of achieving the desired client outcomes. This study documents the process and lessons learned from exploring the feasibility of adapting such a technology-based tool, Centervention, as the example innovation, for use in CMH settings. Mixed-methods data on core features, innovation-provider fit, and organizational capacity were collected from 44 CMH providers. Lessons learned included the need to augment delivery through technology with more personal interactions, the importance of customizing and integrating the tool with existing technologies, and the need to incorporate a number of strategies to assist with adoption and use of Centervention-like tools in CMH contexts. This study adds to the current body of literature on the adaptation process for technology-based tools and provides information that can guide additional innovations for CMH settings.

  14. Smart Grid Interoperability Maturity Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Levinson, Alex; Mater, J.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  15. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis

    PubMed Central

    Lal, Aparna

    2016-01-01

    Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change. PMID:26848669

  16. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis.

    PubMed

    Lal, Aparna

    2016-02-02

    Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change.

  17. Integrated national energy planning and management: methodology and application to Sri Lanka. World Bank technical paper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munasinghe, M.; Meier, P.

    1988-01-01

    Given the importance of energy in modern economies, the first part of the volume is devoted to examining some of the key conceptual and analytical tools available for energy-policy analysis and planning. Policy tools and institutional frameworks that will facilitate better energy management are also discussed. Energy-policy analysis is explained, while effective energy management techniques are discussed to achieve desirable national objectives, using a selected set of policies and policy instruments. In the second part of the volume, the actual application of the principles set out earlier is explained through a case study of Sri Lanka. The monograph integrates the many aspects of the short-term programs already begun with the options for the medium to long term, and ends with the outline of a long-term strategy for Sri Lanka.

  18. Digital Sketch Modelling: Integrating Digital Sketching as a Transition between Sketching and CAD in Industrial Design Education

    ERIC Educational Resources Information Center

    Ranscombe, Charlie; Bissett-Johnson, Katherine

    2017-01-01

    Literature on the use of design tools in educational settings notes an uneasy relationship between student use of traditional hand sketching and digital modelling tools (CAD) during the industrial design process. This is often manifested in the transition from sketching to CAD and exacerbated by a preference of current students to use CAD. In this…

  19. Comprehensive Yet Scalable Health Information Systems for Low Resource Settings: A Collaborative Effort in Sierra Leone

    PubMed Central

    Braa, Jørn; Kanter, Andrew S.; Lesh, Neal; Crichton, Ryan; Jolliffe, Bob; Sæbø, Johan; Kossi, Edem; Seebregts, Christopher J.

    2010-01-01

    We address the problem of how to integrate health information systems in low-income African countries in which technical infrastructure and human resources vary widely within countries. We describe a set of tools to meet the needs of different service areas including managing aggregate indicators, patient level record systems, and mobile tools for community outreach. We present the case of Sierra Leone and use this case to motivate and illustrate an architecture that allows us to provide services at each level of the health system (national, regional, facility and community) and provide different configurations of the tools as appropriate for the individual area. Finally, we present a collaborative implementation of this approach in Sierra Leone. PMID:21347003

  20. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    PubMed

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.
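
    MONGKIE itself is a Java/NetBeans application, so the snippet below is only an illustration of the statistic that over-representation analyses of the kind mentioned above typically rely on: a one-sided hypergeometric test for enrichment of an annotation within a gene module. The example numbers are invented.

    ```python
    # One-sided hypergeometric enrichment test, the statistic typically behind
    # over-representation analysis. Example numbers are invented; this is not
    # MONGKIE's Java implementation.
    from math import comb

    def hypergeom_pval(k: int, n: int, K: int, N: int) -> float:
        """P(X >= k) for X ~ Hypergeometric(N genes total, K annotated, n in module)."""
        return sum(comb(K, i) * comb(N - K, n - i)
                   for i in range(k, min(n, K) + 1)) / comb(N, n)

    # e.g. a 40-gene module contains 8 genes of a 100-gene pathway, background 2000 genes
    print(f"{hypergeom_pval(8, 40, 100, 2000):.2e}")
    ```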

  1. KA-SB: from data integration to large scale reasoning

    PubMed Central

    Roldán-García, María del Mar; Navas-Delgado, Ismael; Kerzazi, Amine; Chniber, Othmane; Molina-Castro, Joaquín; Aldana-Montes, José F

    2009-01-01

    Background The analysis of information in the biological domain is usually focused on the analysis of data from single on-line data sources. Unfortunately, studying a biological process requires having access to disperse, heterogeneous, autonomous data sources. In this context, an analysis of the information is not possible without the integration of such data. Methods KA-SB is a querying and analysis system for final users based on combining a data integration solution with a reasoner. Thus, the tool has been created with a process divided into two steps: 1) KOMF, the Khaos Ontology-based Mediator Framework, is used to retrieve information from heterogeneous and distributed databases; 2) the integrated information is crystallized in a (persistent and high performance) reasoner (DBOWL). This information could be further analyzed later (by means of querying and reasoning). Results In this paper we present a novel system that combines the use of a mediation system with the reasoning capabilities of a large scale reasoner to provide a way of finding new knowledge and of analyzing the integrated information from different databases, which is retrieved as a set of ontology instances. This tool provides a graphical query interface that shows a graphical representation of the ontology and allows users to build queries easily by clicking on the ontology concepts. Conclusion These kinds of systems (based on KOMF) will provide users with very large amounts of information (interpreted as ontology instances once retrieved), which cannot be managed using traditional main memory-based reasoners. We propose a process for creating persistent and scalable knowledgebases from sets of OWL instances obtained by integrating heterogeneous data sources with KOMF. This process has been applied to develop a demo tool, which uses the BioPax Level 3 ontology as the integration schema, and integrates UNIPROT, KEGG, CHEBI, BRENDA and SABIORK databases. PMID:19796402

  2. Integrative Genomics Viewer (IGV): high-performance genomics data visualization and exploration

    PubMed Central

    Thorvaldsdóttir, Helga; Mesirov, Jill P.

    2013-01-01

    Data visualization is an essential component of genomic data analysis. However, the size and diversity of the data sets produced by today’s sequencing and array-based profiling methods present major challenges to visualization tools. The Integrative Genomics Viewer (IGV) is a high-performance viewer that efficiently handles large heterogeneous data sets, while providing a smooth and intuitive user experience at all levels of genome resolution. A key characteristic of IGV is its focus on the integrative nature of genomic studies, with support for both array-based and next-generation sequencing data, and the integration of clinical and phenotypic data. Although IGV is often used to view genomic data from public sources, its primary emphasis is to support researchers who wish to visualize and explore their own data sets or those from colleagues. To that end, IGV supports flexible loading of local and remote data sets, and is optimized to provide high-performance data visualization and exploration on standard desktop systems. IGV is freely available for download from http://www.broadinstitute.org/igv, under a GNU LGPL open-source license. PMID:22517427

  3. Integrative Genomics Viewer (IGV): high-performance genomics data visualization and exploration.

    PubMed

    Thorvaldsdóttir, Helga; Robinson, James T; Mesirov, Jill P

    2013-03-01

    Data visualization is an essential component of genomic data analysis. However, the size and diversity of the data sets produced by today's sequencing and array-based profiling methods present major challenges to visualization tools. The Integrative Genomics Viewer (IGV) is a high-performance viewer that efficiently handles large heterogeneous data sets, while providing a smooth and intuitive user experience at all levels of genome resolution. A key characteristic of IGV is its focus on the integrative nature of genomic studies, with support for both array-based and next-generation sequencing data, and the integration of clinical and phenotypic data. Although IGV is often used to view genomic data from public sources, its primary emphasis is to support researchers who wish to visualize and explore their own data sets or those from colleagues. To that end, IGV supports flexible loading of local and remote data sets, and is optimized to provide high-performance data visualization and exploration on standard desktop systems. IGV is freely available for download from http://www.broadinstitute.org/igv, under a GNU LGPL open-source license.

  4. Integrated Network Decompositions and Dynamic Programming for Graph Optimization (INDDGO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The INDDGO software package offers a set of tools for finding exact solutions to graph optimization problems via tree decompositions and dynamic programming algorithms. Currently the framework offers serial and parallel (distributed memory) algorithms for finding tree decompositions and solving the maximum weighted independent set problem. The parallel dynamic programming algorithm is implemented on top of the MADNESS task-based runtime.
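
    As a simplified illustration of the dynamic-programming idea behind INDDGO (not its implementation), the sketch below solves maximum weighted independent set on a tree, where the tree itself serves as a width-1 tree decomposition; INDDGO generalizes this to arbitrary graphs via tree decompositions.

    ```python
    # Simplified illustration of dynamic programming for maximum weighted
    # independent set (MWIS) on a tree; a tree is its own width-1 tree
    # decomposition. This is not the INDDGO implementation, which handles
    # general graphs through tree decompositions.
    def mwis_on_tree(adj, weight, root=0):
        """adj: {node: [neighbors]} of a tree; weight: {node: float}."""
        incl, excl = {}, {}

        def dfs(u, parent):
            incl[u], excl[u] = weight[u], 0.0
            for v in adj[u]:
                if v == parent:
                    continue
                dfs(v, u)
                incl[u] += excl[v]                 # u in the set -> children excluded
                excl[u] += max(incl[v], excl[v])   # u excluded -> take the better option

        dfs(root, None)
        return max(incl[root], excl[root])

    if __name__ == "__main__":
        #      0
        #     / \
        #    1   2
        #   / \
        #  3   4
        adj = {0: [1, 2], 1: [0, 3, 4], 2: [0], 3: [1], 4: [1]}
        w = {0: 3.0, 1: 5.0, 2: 2.0, 3: 4.0, 4: 4.0}
        print(mwis_on_tree(adj, w))   # best independent set {0, 3, 4} -> 11.0
    ```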

  5. Protein Sequence Annotation Tool (PSAT): A centralized web-based meta-server for high-throughput sequence annotations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, Elo; Huang, Amy; Cadag, Eithon

    In this study, we introduce the Protein Sequence Annotation Tool (PSAT), a web-based, sequence annotation meta-server for performing integrated, high-throughput, genome-wide sequence analyses. Our goals in building PSAT were to (1) create an extensible platform for integration of multiple sequence-based bioinformatics tools, (2) enable functional annotations and enzyme predictions over large input protein FASTA data sets, and (3) provide a web interface for convenient execution of the tools. In this paper, we demonstrate the utility of PSAT by annotating the predicted peptide gene products of Herbaspirillum sp. strain RV1423, importing the results of PSAT into EC2KEGG, and using the resulting functional comparisons to identify a putative catabolic pathway, thereby distinguishing RV1423 from a well annotated Herbaspirillum species. This analysis demonstrates that high-throughput enzyme predictions, provided by PSAT processing, can be used to identify metabolic potential in an otherwise poorly annotated genome. Lastly, PSAT is a meta-server that combines the results from several sequence-based annotation and function prediction codes, and is available at http://psat.llnl.gov/psat/. PSAT stands apart from other sequence-based genome annotation systems in providing a high-throughput platform for rapid de novo enzyme predictions and sequence annotations over large input protein sequence data sets in FASTA format. PSAT is most appropriately applied in annotation of large protein FASTA sets that may or may not be associated with a single genome.

  6. Protein Sequence Annotation Tool (PSAT): A centralized web-based meta-server for high-throughput sequence annotations

    DOE PAGES

    Leung, Elo; Huang, Amy; Cadag, Eithon; ...

    2016-01-20

    In this study, we introduce the Protein Sequence Annotation Tool (PSAT), a web-based, sequence annotation meta-server for performing integrated, high-throughput, genome-wide sequence analyses. Our goals in building PSAT were to (1) create an extensible platform for integration of multiple sequence-based bioinformatics tools, (2) enable functional annotations and enzyme predictions over large input protein FASTA data sets, and (3) provide a web interface for convenient execution of the tools. In this paper, we demonstrate the utility of PSAT by annotating the predicted peptide gene products of Herbaspirillum sp. strain RV1423, importing the results of PSAT into EC2KEGG, and using the resulting functional comparisons to identify a putative catabolic pathway, thereby distinguishing RV1423 from a well annotated Herbaspirillum species. This analysis demonstrates that high-throughput enzyme predictions, provided by PSAT processing, can be used to identify metabolic potential in an otherwise poorly annotated genome. Lastly, PSAT is a meta-server that combines the results from several sequence-based annotation and function prediction codes, and is available at http://psat.llnl.gov/psat/. PSAT stands apart from other sequence-based genome annotation systems in providing a high-throughput platform for rapid de novo enzyme predictions and sequence annotations over large input protein sequence data sets in FASTA format. PSAT is most appropriately applied in annotation of large protein FASTA sets that may or may not be associated with a single genome.
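
    PSAT is operated through its web interface, so the sketch below only illustrates the client-side preparation implied by "large input protein FASTA data sets": parsing a FASTA file and splitting it into batches. The submit_batch() function and the input file name are placeholders, not part of PSAT.

    ```python
    # Client-side sketch only: split a large protein FASTA file into batches
    # before uploading to an annotation service. submit_batch() and the input
    # file name are placeholders; PSAT itself is used through its web interface.
    def read_fasta(path):
        header, seq = None, []
        with open(path) as fh:
            for line in fh:
                line = line.rstrip()
                if line.startswith(">"):
                    if header is not None:
                        yield header, "".join(seq)
                    header, seq = line[1:], []
                elif line:
                    seq.append(line)
        if header is not None:
            yield header, "".join(seq)

    def batches(records, size=500):
        batch = []
        for rec in records:
            batch.append(rec)
            if len(batch) == size:
                yield batch
                batch = []
        if batch:
            yield batch

    def submit_batch(batch):
        # Placeholder: in practice each batch would be saved and uploaded.
        print(f"prepared batch of {len(batch)} sequences")

    if __name__ == "__main__":
        for b in batches(read_fasta("RV1423_proteins.faa"), size=500):
            submit_batch(b)
    ```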

  7. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology that is described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and will present sample results from the application of the strategic analysis methodology to the Constellation Program lunar architecture.

  8. Modification site localization scoring integrated into a search engine.

    PubMed

    Baker, Peter R; Trinidad, Jonathan C; Chalkley, Robert J

    2011-07-01

    Large proteomic data sets identifying hundreds or thousands of modified peptides are becoming increasingly common in the literature. Several methods for assessing the reliability of peptide identifications both at the individual peptide or data set level have become established. However, tools for measuring the confidence of modification site assignments are sparse and are not often employed. A few tools for estimating phosphorylation site assignment reliabilities have been developed, but these are not integral to a search engine, so require a particular search engine output for a second step of processing. They may also require use of a particular fragmentation method and are mostly only applicable for phosphorylation analysis, rather than post-translational modifications analysis in general. In this study, we present the performance of site assignment scoring that is directly integrated into the search engine Protein Prospector, which allows site assignment reliability to be automatically reported for all modifications present in an identified peptide. It clearly indicates when a site assignment is ambiguous (and if so, between which residues), and reports an assignment score that can be translated into a reliability measure for individual site assignments.
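
    Protein Prospector's actual site-assignment scoring is not reproduced in the record above, so the following is a toy illustration of the general idea: candidate modification-site placements are compared by how many of their site-determining fragment ions are observed, and the margin between the top two candidates indicates whether the assignment is ambiguous. The ion labels and sets are invented.

    ```python
    # Toy illustration of site-localization scoring (not Protein Prospector's
    # scoring): candidate site placements are ranked by how many of their
    # site-determining fragment ions are observed; a zero margin means the
    # assignment is ambiguous between the top candidates.
    def localization_scores(candidates, observed):
        """candidates: {site: set of site-determining ion labels};
        observed: set of ion labels matched in the spectrum."""
        scores = {site: len(ions & observed) for site, ions in candidates.items()}
        ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
        runner_up_score = ranked[1][1] if len(ranked) > 1 else 0
        return scores, ranked[0][0], ranked[0][1] - runner_up_score

    if __name__ == "__main__":
        candidates = {
            "S3": {"b2+80", "b3+80", "y5", "y6"},   # invented ion labels
            "T5": {"b4+80", "b5+80", "y3", "y4"},
        }
        observed = {"b3+80", "y5", "y6", "y3"}
        scores, site, margin = localization_scores(candidates, observed)
        print(scores, "-> assigned", site, "with margin", margin)
    ```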

  9. The center for causal discovery of biomedical knowledge from big data.

    PubMed

    Cooper, Gregory F; Bahar, Ivet; Becich, Michael J; Benos, Panayiotis V; Berg, Jeremy; Espino, Jeremy U; Glymour, Clark; Jacobson, Rebecca Crowley; Kienholz, Michelle; Lee, Adrian V; Lu, Xinghua; Scheines, Richard

    2015-11-01

    The Big Data to Knowledge (BD2K) Center for Causal Discovery is developing and disseminating an integrated set of open source tools that support causal modeling and discovery of biomedical knowledge from large and complex biomedical datasets. The Center integrates teams of biomedical and data scientists focused on the refinement of existing and the development of new constraint-based and Bayesian algorithms based on causal Bayesian networks, the optimization of software for efficient operation in a supercomputing environment, and the testing of algorithms and software developed using real data from 3 representative driving biomedical projects: cancer driver mutations, lung disease, and the functional connectome of the human brain. Associated training activities provide both biomedical and data scientists with the knowledge and skills needed to apply and extend these tools. Collaborative activities with the BD2K Consortium further advance causal discovery tools and integrate tools and resources developed by other centers. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. Initial development of prototype performance model for highway design

    DOT National Transportation Integrated Search

    1997-12-01

    The Federal Highway Administration (FHWA) has undertaken a multiyear project to develop the Interactive Highway Safety Design Model (IHSDM), which is a CADD-based integrated set of software tools to analyze a highway design to identify safety issues ...

  11. Integrated Analysis of Pharmacologic, Clinical, and SNP Microarray Data using Projection onto the Most Interesting Statistical Evidence with Adaptive Permutation Testing

    PubMed Central

    Pounds, Stan; Cao, Xueyuan; Cheng, Cheng; Yang, Jun; Campana, Dario; Evans, William E.; Pui, Ching-Hon; Relling, Mary V.

    2010-01-01

    Powerful methods for integrated analysis of multiple biological data sets are needed to maximize interpretation capacity and acquire meaningful knowledge. We recently developed Projection Onto the Most Interesting Statistical Evidence (PROMISE). PROMISE is a statistical procedure that incorporates prior knowledge about the biological relationships among endpoint variables into an integrated analysis of microarray gene expression data with multiple biological and clinical endpoints. Here, PROMISE is adapted to the integrated analysis of pharmacologic, clinical, and genome-wide genotype data, incorporating knowledge about the biological relationships among pharmacologic and clinical response data. An efficient permutation-testing algorithm is introduced so that statistical calculations are computationally feasible in this higher-dimensional setting. The new method is applied to a pediatric leukemia data set. The results clearly indicate that PROMISE is a powerful statistical tool for identifying genomic features that exhibit a biologically meaningful pattern of association with multiple endpoint variables. PMID:21516175
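
    The published PROMISE procedure and its permutation scheme are not reproduced here; the sketch below only shows the generic idea of adaptive (early-stopping) permutation testing for a single statistic, where permutation stops early once the estimated p-value is clearly above the significance threshold. The stopping rule and thresholds are assumptions for illustration.

    ```python
    # Generic adaptive permutation test (illustrative; not the published PROMISE
    # algorithm): stop permuting early once the estimated p-value is clearly
    # above the significance threshold, saving work on uninteresting features.
    import numpy as np

    def adaptive_perm_test(x, y, stat=lambda a, b: abs(a.mean() - b.mean()),
                           alpha=0.01, max_perms=10000, check_every=200, seed=None):
        rng = np.random.default_rng(seed)
        observed = stat(x, y)
        pooled = np.concatenate([x, y])
        n_x, exceed = len(x), 0
        for i in range(1, max_perms + 1):
            rng.shuffle(pooled)
            if stat(pooled[:n_x], pooled[n_x:]) >= observed:
                exceed += 1
            if i % check_every == 0:
                p_hat = (exceed + 1) / (i + 1)
                # heuristic stop: lower bound of p clearly above 5 * alpha
                if p_hat - 2 * np.sqrt(p_hat * (1 - p_hat) / i) > 5 * alpha:
                    return p_hat, i
        return (exceed + 1) / (max_perms + 1), max_perms

    if __name__ == "__main__":
        g = np.random.default_rng(0)
        x, y = g.normal(0.0, 1.0, 30), g.normal(0.8, 1.0, 30)
        print(adaptive_perm_test(x, y, seed=1))   # small p, no early stop expected
    ```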

  12. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software

    PubMed Central

    Mejías, Andrés; Herrera, Reyes S.; Márquez, Marco A.; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-01

    There are several specific solutions for accessing sensors and actuators present in any process or system through a TCP/IP network, either local or a wide area type like the Internet. The usage of sensors and actuators of different nature and diverse interfaces (SPI, I2C, analogue, etc.) makes access to them from a network in a homogeneous and secure way more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning the access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS) solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called convergence subsystem in this paper, and the Human Machine Interface (HMI)—this one designed using the intuitive graphical system of EJS—located on the user’s computer. The proposed hardware architectures and software tools are described and experimental implementations with the proposed tools are presented. PMID:28067801

  13. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software.

    PubMed

    Mejías, Andrés; Herrera, Reyes S; Márquez, Marco A; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-05

    There are several specific solutions for accessing sensors and actuators present in any process or system through a TCP/IP network, either local or a wide area type like the Internet. The usage of sensors and actuators of different nature and diverse interfaces (SPI, I2C, analogue, etc.) makes access to them from a network in a homogeneous and secure way more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning the access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS) solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called convergence subsystem in this paper, and the Human Machine Interface (HMI)-this one designed using the intuitive graphical system of EJS-located on the user's computer. The proposed hardware architectures and software tools are described and experimental implementations with the proposed tools are presented.
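
    The articles describe Java/Javascript (EJS) tooling; the minimal sketch below is not that software but a generic, standard-library Python illustration of the homogeneous request/response access they discuss: a simulated sensor behind a TCP socket and a client reading it as JSON. The port number and message format are invented.

    ```python
    # Generic, standard-library illustration of reading a (simulated) sensor over
    # TCP; not the EJS tooling from the article. Port and message format invented.
    import json, random, socket, threading

    HOST, PORT = "127.0.0.1", 9750          # hypothetical port
    ready = threading.Event()

    def sensor_server():
        with socket.create_server((HOST, PORT)) as srv:
            ready.set()                     # server is listening
            conn, _ = srv.accept()
            with conn:
                request = conn.recv(1024).decode().strip()   # e.g. "READ temperature"
                reading = {"sensor": request.split()[-1],
                           "value": round(random.uniform(18.0, 25.0), 2)}
                conn.sendall((json.dumps(reading) + "\n").encode())

    threading.Thread(target=sensor_server, daemon=True).start()
    ready.wait()

    with socket.create_connection((HOST, PORT)) as client:
        client.sendall(b"READ temperature\n")
        print(json.loads(client.recv(1024).decode()))
    ```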

  14. A novel way of integrating rule-based knowledge into a web ontology language framework.

    PubMed

    Gamberger, Dragan; Krstaçić, Goran; Jović, Alan

    2013-01-01

    Web ontology language (OWL), used in combination with the Protégé visual interface, is a modern standard for development and maintenance of ontologies and a powerful tool for knowledge presentation. In this work, we describe a novel way to use OWL also for the conceptualization of knowledge presented by a set of rules. In this approach, rules are represented as a hierarchy of actionable classes with necessary and sufficient conditions defined by the description logic formalism. The advantages are that the set of rules is no longer unordered, that concepts defined in descriptive ontologies can be used directly in the bodies of rules, and that Protégé provides an intuitive tool for editing the rule set. Standard ontology reasoning processes are not applicable in this framework, but experiments conducted on the rule sets have demonstrated that the reasoning problems can be successfully solved.

  15. Integrative modules for efficient genome engineering in yeast

    PubMed Central

    Amen, Triana; Kaganovich, Daniel

    2017-01-01

    We present a set of vectors containing integrative modules for efficient genome integration into the commonly used selection marker loci of the yeast Saccharomyces cerevisiae. A fragment for genome integration is generated via PCR with a unique set of short primers and integrated into HIS3, URA3, ADE2, and TRP1 loci. The desired level of expression can be achieved by using constitutive (TEF1p, GPD1p), inducible (CUP1p, GAL1/10p), and daughter-specific (DSE4p) promoters available in the modules. The reduced size of the integrative module compared to conventional integrative plasmids allows efficient integration of multiple fragments. We demonstrate the efficiency of this tool by simultaneously tagging markers of the nucleus, vacuole, actin, and peroxisomes with genomically integrated fluorophores. Improved integration of our new pDK plasmid series allows stable introduction of several genes and can be used for multi-color imaging. New bidirectional promoters (TEF1p-GPD1p, TEF1p-CUP1p, and TEF1p-DSE4p) allow tractable metabolic engineering. PMID:28660202

  16. Multi-criteria development and incorporation into decision tools for health technology adoption.

    PubMed

    Poulin, Paule; Austen, Lea; Scott, Catherine M; Waddell, Cameron D; Dixon, Elijah; Poulin, Michelle; Lafrenière, René

    2013-01-01

    When introducing new health technologies, decision makers must integrate research evidence with local operational management information to guide decisions about whether and under what conditions the technology will be used. Multi-criteria decision analysis can support the adoption or prioritization of health interventions by using criteria to explicitly articulate the health organization's needs, limitations, and values in addition to evaluating evidence for safety and effectiveness. This paper describes the development of a framework to create agreed-upon criteria and decision tools to enhance a pre-existing local health technology assessment (HTA) decision support program. The authors compiled a list of published criteria from the literature, consulted with experts to refine the criteria list, and used a modified Delphi process with a group of key stakeholders to review, modify, and validate each criterion. In a workshop setting, the criteria were used to create decision tools. A set of user-validated criteria for new health technology evaluation and adoption was developed and integrated into the local HTA decision support program. Technology evaluation and decision guideline tools were created using these criteria to ensure that the decision process is systematic, consistent, and transparent. This framework can be used by others to develop decision-making criteria and tools to enhance similar technology adoption programs. The development of clear, user-validated criteria for evaluating new technologies adds a critical element to improve decision-making on technology adoption, and the decision tools ensure consistency, transparency, and real-world relevance.
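
    As an illustration of how agreed-upon criteria can be turned into a transparent scoring tool, the sketch below applies invented criteria, weights, and a decision threshold in a simple weighted-sum model. These are not the criteria or decision rules developed by the program described above.

    ```python
    # Illustrative weighted-scoring sketch for technology-adoption decisions.
    # Criteria, weights, and the decision threshold are invented examples, not
    # the criteria developed by the local HTA program described above.
    CRITERIA = {                        # weights sum to 1.0
        "safety": 0.30,
        "clinical_effectiveness": 0.30,
        "budget_impact": 0.20,
        "operational_feasibility": 0.20,
    }

    def score_technology(ratings: dict) -> float:
        """ratings: criterion -> panel-consensus score on a 0-5 scale."""
        missing = set(CRITERIA) - set(ratings)
        if missing:
            raise ValueError(f"missing ratings for: {sorted(missing)}")
        return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

    if __name__ == "__main__":
        candidate = {"safety": 4, "clinical_effectiveness": 3,
                     "budget_impact": 2, "operational_feasibility": 4}
        total = score_technology(candidate)
        print(f"weighted score: {total:.2f} / 5 ->",
              "recommend pilot" if total >= 3.0 else "defer")
    ```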

  17. Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools

    NASA Astrophysics Data System (ADS)

    Tonini, F.; Liu, J.

    2016-12-01

    Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.

  18. Dissemination and implementation of an educational tool for veterans on complementary and alternative medicine: a case study.

    PubMed

    Held, Rachel Forster; Santos, Susan; Marki, Michelle; Helmer, Drew

    2016-09-02

    We developed and disseminated an educational DVD to introduce U.S. Veterans to independently practiced complementary and alternative medicine (CAM) techniques and encourage CAM experimentation. The project's goal was to determine optimal dissemination methods to facilitate implementation within the Veterans Health Administration. In the first phase, the DVD was disseminated using four methods: passive, provider-mediated, active, and peer-mediated. In the second, implementation phase, "champion" providers who supported CAM integrated dissemination into clinical practice. Qualitative data came from Veteran focus groups and semi-structured provider interviews. Data from both phases were triangulated to identify common themes. Effective dissemination requires engaging patients. Providers who most successfully integrated the DVD into practice already had CAM knowledge, and worked in settings where CAM was accepted clinical practice, or with leadership or infrastructure that supported a culture of CAM use. Institutional buy-in allowed for provider networking and effective implementation of the tool. Providers were given autonomy to determine the most appropriate dissemination strategies, which increased enthusiasm and use. Many of the lessons learned from this project can be applied to dissemination of any new educational tool within a healthcare setting. Results reiterate the importance of utilizing best practices for introducing educational tools within the healthcare context and the need for thoughtful, multi-faceted dissemination strategies.

  19. Youth Mental Health, Family Practice, and Knowledge Translation Video Games about Psychosis: Family Physicians’ Perspectives

    PubMed Central

    Ferrari, Manuela; Suzanne, Archie

    2017-01-01

    Objective Family practitioners face many challenges providing mental healthcare to youth. Digital technology may offer solutions, but the products often need to be adapted for primary care. This study reports on family physicians’ perspectives on the relevance and feasibility of a digital knowledge translation (KT) tool, a set of video games, designed to raise awareness about psychosis, marijuana use, and facilitate access to mental health services among youth. Method As part of an integrated knowledge translation project, five family physicians from a family health team participated in a focus group. The focus group delved into their perspectives on treating youth with mental health concerns while exploring their views on implementing the digital KT tool in their practice. Qualitative data was analyzed using thematic analysis to identify patterns, concepts, and themes in the transcripts. Results Three themes were identified: (a) challenges in assessing youth with mental health concerns related to training, time constraints, and navigating the system; (b) feedback on the KT tool; and, (c) ideas on how to integrate it into a primary care practice. Conclusions Family practitioners felt that the proposed video game KT tool could be used to address youth’s mental health and addictions issues in primary care settings. PMID:29056980

  20. Improving transcriptome construction in non-model organisms: integrating manual and automated gene definition in Emiliania huxleyi.

    PubMed

    Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra

    2014-02-22

    The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly where genomes are unavailable or incomplete. Despite the large numbers of transcriptome assemblies that have been performed, quality control of the transcript building process, particularly on the protein level, is rarely, if ever, performed. To test and improve the quality of the automated transcriptome reconstruction, we used manually defined and curated genes, several of them experimentally validated. Several approaches to transcript construction were utilized, based on the available data: a draft genome, high quality RNAseq reads, and ESTs. In order to maximize the contribution of the various data, we integrated methods including de novo and genome-based assembly, as well as EST clustering. After each step a set of manually curated genes was used for quality assessment of the transcripts. The interplay between the automated pipeline and the quality control indicated which additional processes were required to improve the transcriptome reconstruction. We discovered that E. huxleyi has a very high percentage of non-canonical splice junctions, and relatively high rates of intron retention, which caused unique issues with the currently available tools. While individual tools missed genes and artificially joined overlapping transcripts, combining the results of several tools improved the completeness and quality considerably. The final collection, created from the integration of several quality control and improvement rounds, was compared to the manually defined set both on the DNA and protein levels, and resulted in an improvement of 20% versus any of the read-based approaches alone. To the best of our knowledge, this is the first time that an automated transcript definition has been subjected to quality control using manually defined and curated genes and the process then improved accordingly. We recommend using a set of manually curated genes to troubleshoot transcriptome reconstruction.

  1. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high-level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  2. Interactive Electronic Decision Trees for the Integrated Primary Care Management of Febrile Children in Low Resource Settings - Review of existing tools.

    PubMed

    Keitel, Kristina; D'Acremont, Valérie

    2018-04-20

    The lack of effective, integrated diagnostic tools poses a major challenge to the primary care management of febrile childhood illnesses. These limitations are especially evident in low-resource settings and are often inappropriately compensated by antimicrobial over-prescription. Interactive electronic decision trees (IEDTs) have the potential to close these gaps: guiding antibiotic use and better identifying serious disease. This narrative review summarizes existing IEDTs, to provide an overview of their degree of validation, as well as to identify gaps in current knowledge and prospects for future innovation. Structured literature review in PubMed and Embase complemented by Google search and contact with developers. Ten integrated IEDTs were identified: three (eIMCI, REC, and Bangladesh digital IMCI) based on Integrated Management of Childhood Illnesses (IMCI); four (SL eCCM, MEDSINC, e-iCCM, and D-Tree eCCM) on Integrated Community Case Management (iCCM); two (ALMANACH, MSFeCARE) with a modified IMCI content; and one (ePOCT) that integrates novel content with biomarker testing. The types of publications and evaluation studies varied greatly: the content and evidence base was published for two (ALMANACH and ePOCT), and ALMANACH and ePOCT were also validated in efficacy studies. Other types of evaluations, such as compliance and acceptability, were available for D-Tree eCCM, eIMCI, and ALMANACH. Several evaluations are still ongoing. Future prospects include conducting effectiveness and impact studies using data gathered through larger studies to adapt the medical content to local epidemiology, improving the software and sensors, and assessing factors that influence compliance and scale-up. IEDTs are valuable tools that have the potential to improve management of febrile children in primary care and increase the rational use of diagnostics and antimicrobials. Next steps in the evidence pathway should be larger effectiveness and impact studies (including cost analysis) and continuous integration of clinically useful diagnostic and treatment innovations. Copyright © 2018. Published by Elsevier Ltd.

  3. An Assessment of IMPAC - Integrated Methodology for Propulsion and Airframe Controls

    NASA Technical Reports Server (NTRS)

    Walker, G. P.; Wagner, E. A.; Bodden, D. S.

    1996-01-01

    This report documents the work done under a NASA sponsored contract to transition to industry technologies developed under the NASA Lewis Research Center IMPAC (Integrated Methodology for Propulsion and Airframe Control) program. The critical steps in IMPAC are exercised on an example integrated flight/propulsion control design for linear airframe/engine models of a conceptual STOVL (Short Take-Off and Vertical Landing) aircraft, and MATRIXX (TM) executive files to implement each step are developed. The results from the example study are analyzed and lessons learned are listed along with recommendations that will improve the application of each design step. The end product of this research is a set of software requirements for developing a user-friendly control design tool which will automate the steps in the IMPAC methodology. Prototypes for a graphical user interface (GUI) are sketched to specify how the tool will interact with the user, and it is recommended to build the tool around existing computer aided control design software packages.

  4. Increasing efficacy of primary care-based counseling for diabetes prevention: rationale and design of the ADAPT (Avoiding Diabetes Thru Action Plan Targeting) trial.

    PubMed

    Mann, Devin M; Lin, Jenny J

    2012-01-23

    Studies have shown that lifestyle behavior changes are the most effective way to prevent the onset of diabetes in high-risk patients. Primary care providers are charged with encouraging behavior change among their patients at risk for diabetes, yet the practice environment and training in primary care often do not support effective provider counseling. The goal of this study is to develop an electronic health record-embedded tool to facilitate shared patient-provider goal setting to promote behavioral change and prevent diabetes. The ADAPT (Avoiding Diabetes Thru Action Plan Targeting) trial leverages an innovative system that integrates evidence-based interventions for behavioral change with already-existing technology to enhance primary care providers' effectiveness in counseling about lifestyle behavior changes. Using principles of behavior change theory, the multidisciplinary design team utilized in-depth interviews and in vivo usability testing to produce a prototype diabetes prevention counseling system embedded in the electronic health record. The core element of the tool is a streamlined, shared goal-setting module within the electronic health record system. The team then conducted a series of innovative, "near-live" usability testing simulations to refine the tool and enhance workflow integration. The system also incorporates a pre-encounter survey to elicit patients' behavior-change goals to help tailor patient-provider goal setting during the clinical encounter and to encourage shared decision making. Lastly, the patients interact with a website that collects their longitudinal behavior data and allows them to visualize their progress over time and compare their progress with that of other study members. The finalized ADAPT system is now being piloted in a small randomized controlled trial of providers using the system with prediabetes patients over a six-month period. The ADAPT system combines the influential powers of shared goal setting and feedback, tailoring, modeling, contracting, reminders, and social comparisons to integrate evidence-based behavior-change principles into the electronic health record to maximize provider counseling efficacy during routine primary care clinical encounters. If successful, the ADAPT system may represent an adaptable and scalable technology-enabled behavior-change tool for all primary care providers. ClinicalTrials.gov Identifier NCT01473654.

  5. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    NASA Astrophysics Data System (ADS)

    Kadow, C.; Illing, S.; Kunst, O.; Cubasch, U.

    2014-12-01

    The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), as part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard using the meta information of the self-describing model, reanalysis and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via shell or web-system. Therefore, plugged-in tools automatically gain transparency and reproducibility. Furthermore, when configurations match while starting an evaluation tool, the system suggests reusing results already produced by other users, saving CPU time, I/O and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. website: www-miklip.dkrz.de visitor-login: guest password: miklip
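
    The result-reuse behaviour described above (matching configurations trigger a suggestion to reuse earlier output) can be sketched as follows; this is an illustrative Python outline, not the MiKlip/FREVA code, and the table and column names are hypothetical.

        # Illustrative sketch of configuration-based result reuse: hash a tool's
        # configuration and look up whether an identical run was already recorded,
        # so CPU time, I/O and disk space can be saved.
        import hashlib
        import json
        import sqlite3

        def config_fingerprint(tool_name, config):
            """Canonical, order-independent hash of a tool configuration."""
            payload = json.dumps({"tool": tool_name, "config": config}, sort_keys=True)
            return hashlib.sha256(payload.encode("utf-8")).hexdigest()

        def find_previous_result(db_path, tool_name, config):
            """Return the stored output path of a matching earlier analysis, or None."""
            fingerprint = config_fingerprint(tool_name, config)
            with sqlite3.connect(db_path) as db:
                db.execute("CREATE TABLE IF NOT EXISTS history "
                           "(fingerprint TEXT PRIMARY KEY, output_path TEXT)")
                row = db.execute("SELECT output_path FROM history WHERE fingerprint = ?",
                                 (fingerprint,)).fetchone()
            return row[0] if row else None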

  6. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    NASA Astrophysics Data System (ADS)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich

    2015-04-01

    The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), as part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard using the meta information of the self-describing model, reanalysis and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via shell or web-system. Therefore, plugged-in tools automatically gain transparency and reproducibility. Furthermore, when configurations match while starting an evaluation tool, the system suggests reusing results already produced by other users, saving CPU time, I/O and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. website: www-miklip.dkrz.de visitor-login: click on "Guest"

  7. Translating person-centered care into practice: A comparative analysis of motivational interviewing, illness-integration support, and guided self-determination.

    PubMed

    Zoffmann, Vibeke; Hörnsten, Åsa; Storbækken, Solveig; Graue, Marit; Rasmussen, Bodil; Wahl, Astrid; Kirkevold, Marit

    2016-03-01

    Person-centred care [PCC] can engage people in living well with a chronic condition. However, translating PCC into practice is challenging. We aimed to compare the translational potentials of three approaches: motivational interviewing [MI], illness integration support [IIS] and guided self-determination [GSD]. Comparative analysis included eight components: (1) philosophical origin; (2) development in original clinical setting; (3) theoretical underpinnings; (4) overarching goal and supportive processes; (5) general principles, strategies or tools for engaging people; (6) health care professionals' background and training; (7) fidelity assessment; (8) reported effects. Although all approaches promoted autonomous motivation, they differed in other ways. Their original settings explain why IIS and GSD strive for life-illness integration, whereas MI focuses on managing ambivalence. IIS and GSD were based on grounded theories, and MI was intuitively developed. All apply processes and strategies to advance professionals' communication skills and engagement; GSD includes context-specific reflection sheets. All offer training programs; MI and GSD include fidelity tools. Each approach has a primary application: MI, when ambivalence threatens positive change; IIS, when integrating newly diagnosed chronic conditions; and GSD, when problem solving is difficult or deadlocked. Professionals must critically consider the context in their choice of approach. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Proceedings Papers of the AFSC (Air Force Systems Command) Avionics Standardization Conference (2nd) Held at Dayton, Ohio on 30 November-2 December 1982. Volume 1.

    DTIC Science & Technology

    1982-11-01

    Avionic Systems Integration Facilities, Mark van den Broek and Paul M. Vicen, AFLC/LOE... Planning of Operational Software Implementation Tool... classified as software tools, including: Operating System; Language Processors (compilers, assemblers, link editors); Source Editors; Debug Systems; Data Base Systems; Utilities; etc. This talk addresses itself to the current set of tools provided to JOVIAL J73/1750A application programmers by...

  9. NASA PDS IMG: Accessing Your Planetary Image Data

    NASA Astrophysics Data System (ADS)

    Padams, J.; Grimes, K.; Hollins, G.; Lavoie, S.; Stanboli, A.; Wagstaff, K.

    2018-04-01

    The Planetary Data System Cartography and Imaging Sciences Node provides a number of tools and services to integrate the 700+ TB of image data so information can be correlated across missions, instruments, and data sets and easily accessed by the science community.

  10. Software Management Environment (SME) concepts and architecture, revision 1

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1992-01-01

    This document presents the concepts and architecture of the Software Management Environment (SME), developed for the Software Engineering Branch of the Flight Dynamics Division (FDD) of GSFC. The SME provides an integrated set of experience-based management tools that can assist software development managers in managing and planning flight dynamics software development projects. This document provides a high-level description of the types of information required to implement such an automated management tool.

  11. Tool Indicates Contact Angles In Bearing Raceways

    NASA Technical Reports Server (NTRS)

    Akian, Richard A.; Butner, Myles F.

    1995-01-01

    Tool devised for use in measuring contact angles between balls and races in previously operated ball bearings. Used on both inner and outer raceways of bearings having cross-sectional widths between approximately 0.5 and 2.0 in. Consists of integral protractor mounted in vertical plane on bracket equipped with leveling screws and circular level indicator. Protractor includes rotatable indicator needle and set of disks of various sizes to fit various raceway curvatures.

  12. Community-based field implementation scenarios of a short message service reporting tool for lymphatic filariasis case estimates in Africa and Asia.

    PubMed

    Mableson, Hayley E; Martindale, Sarah; Stanton, Michelle C; Mackenzie, Charles; Kelly-Hope, Louise A

    2017-01-01

    Lymphatic filariasis (LF) is a neglected tropical disease (NTD) targeted for global elimination by 2020. Currently there is considerable international effort to scale-up morbidity management activities in endemic countries, however there remains a need for rapid, cost-effective methods and adaptable tools for obtaining estimates of people presenting with clinical manifestations of LF, namely lymphoedema and hydrocele. The mHealth tool ' MeasureSMS-Morbidity ' allows health workers in endemic areas to use their own mobile phones to send clinical information in a simple format using short message service (SMS). The experience gained through programmatic use of the tool in five endemic countries across a diversity of settings in Africa and Asia is used here to present implementation scenarios that are suitable for adapting the tool for use in a range of different programmatic, endemic, demographic and health system settings. A checklist of five key factors and sub-questions was used to determine and define specific community-based field implementation scenarios for using the MeasureSMS-Morbidity tool in a range of settings. These factors included: (I) tool feasibility (acceptability; community access and ownership); (II) LF endemicity (high; low prevalence); (III) population demography (urban; rural); (IV) health system structure (human resources; community access); and (V) integration with other diseases (co-endemicity). Based on experiences in Bangladesh, Ethiopia, Malawi, Nepal and Tanzania, four implementation scenarios were identified as suitable for using the MeasureSMS-Morbidity tool for searching and reporting LF clinical case data across a range of programmatic, endemic, demographic and health system settings. These include: (I) urban, high endemic setting with two-tier reporting; (II) rural, high endemic setting with one-tier reporting; (III) rural, high endemic setting with two-tier reporting; and (IV) low-endemic, urban and rural setting with one-tier reporting. A decision-making framework built from the key factors and questions, and the resulting four implementation scenarios is proposed as a means of using the MeasureSMS-Morbidity tool. This framework will help national LF programmes consider appropriate methods to implement a survey using this tool to improve estimates of the clinical burden of LF. Obtaining LF case estimates is a vital step towards the elimination of LF as a public health problem in endemic countries.

  13. Effective use of metadata in the integration and analysis of multi-dimensional optical data

    NASA Astrophysics Data System (ADS)

    Pastorello, G. Z.; Gamon, J. A.

    2012-12-01

    Data discovery and integration relies on adequate metadata. However, creating and maintaining metadata is time consuming and often poorly addressed or avoided altogether, leading to problems in later data analysis and exchange. This is particularly true for research fields in which metadata standards do not yet exist or are under development, or within smaller research groups without enough resources. Vegetation monitoring using in-situ and remote optical sensing is an example of such a domain. In this area, data are inherently multi-dimensional, with spatial, temporal and spectral dimensions usually being well characterized. Other equally important aspects, however, might be inadequately translated into metadata. Examples include equipment specifications and calibrations, field/lab notes and field/lab protocols (e.g., sampling regimen, spectral calibration, atmospheric correction, sensor view angle, illumination angle), data processing choices (e.g., methods for gap filling, filtering and aggregation of data), quality assurance, and documentation of data sources, ownership and licensing. Each of these aspects can be important as metadata for search and discovery, but they can also be used as key data fields in their own right. If each of these aspects is also understood as an "extra dimension," it is possible to take advantage of them to simplify the data acquisition, integration, analysis, visualization and exchange cycle. Simple examples include selecting data sets of interest early in the integration process (e.g., only data collected according to a specific field sampling protocol) or applying appropriate data processing operations to different parts of a data set (e.g., adaptive processing for data collected under different sky conditions). More interesting scenarios involve guided navigation and visualization of data sets based on these extra dimensions, as well as partitioning data sets to highlight relevant subsets to be made available for exchange. The DAX (Data Acquisition to eXchange) Web-based tool uses a flexible metadata representation model and takes advantage of multi-dimensional data structures to translate metadata types into data dimensions, effectively reshaping data sets according to available metadata. With that, metadata is tightly integrated into the acquisition-to-exchange cycle, allowing for more focused exploration of data sets while also increasing the value of, and incentives for, keeping good metadata. The tool is being developed and tested with optical data collected in different settings, including laboratory, field, airborne, and satellite platforms.
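
    A minimal sketch of the "metadata as an extra dimension" idea, assuming metadata fields such as sampling protocol and sky condition are carried as columns next to the measurements; this is not the DAX tool, and all column names and values are invented.

        # Once metadata travels with the measurements, subsetting and adaptive
        # processing become simple column-based selections.
        import pandas as pd

        records = pd.DataFrame({
            "wavelength_nm": [550, 680, 550, 680],
            "reflectance":   [0.21, 0.35, 0.19, 0.33],
            "protocol":      ["field_A", "field_A", "field_B", "field_B"],
            "sky":           ["clear", "clear", "overcast", "overcast"],
        })

        # Select only data collected under a specific field sampling protocol.
        subset = records[records["protocol"] == "field_A"]

        # Apply different processing depending on sky conditions (toy correction factors).
        factors = records["sky"].map({"clear": 1.00, "overcast": 1.05})
        corrected = records.assign(reflectance=records["reflectance"] * factors)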

  14. Teaching Tectonics to Undergraduates with Web GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curriculum can support spatial learning. We present Web-based visualization and analysis tools developed with Javascript APIs to enhance tectonic curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in exploration of earthquake and volcano data sets, a subduction and elevation profile tool that facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS is platform-independent and can be used on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation using two- and three-dimensional visualization and analytical software. Coverages that allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  15. Understanding Kidney Disease: Toward the Integration of Regulatory Networks Across Species

    PubMed Central

    Ju, Wenjun; Brosius, Frank C.

    2010-01-01

    Animal models have long been useful in investigating both normal and abnormal human physiology. Systems biology provides a relatively new set of approaches to identify similarities and differences between animal models and humans that may lead to a more comprehensive understanding of human kidney pathophysiology. In this review, we briefly describe how genome-wide analyses of mouse models have helped elucidate features of human kidney diseases, discuss strategies to achieve effective network integration, and summarize currently available web-based tools that may facilitate integration of data across species. The rapid progress in systems biology and orthology, as well as the advent of web-based tools to facilitate these processes, now make it possible to take advantage of knowledge from distant animal species in targeted identification of regulatory networks that may have clinical relevance for human kidney diseases. PMID:21044762

  16. An Integrative Review of Pediatric Fall Risk Assessment Tools.

    PubMed

    DiGerolamo, Kimberly; Davis, Katherine Finn

    Patient fall prevention begins with accurate risk assessment. However, sustained improvements in prevention and quality of care include use of validated fall risk assessment tools (FRATs). The goal of FRATs is to identify patients at highest risk. Tools for pediatric patients are often adapted from adult FRATs. Though factors associated with pediatric falls in the hospital setting are similar to those in adults, such as mobility, medication use, and cognitive impairment, adult FRATs and the factors associated with them do not adequately assess risk in children. Articles were limited to English language, ages 0-21 years, and publication date 2006-2015. The search yielded 22 articles. Ten were excluded because the population was primarily adult or the article lacked discussion of a FRAT. Critical appraisal and findings were synthesized using the Johns Hopkins Nursing evidence appraisal system. Twelve articles relevant to fall prevention in the pediatric hospital setting that discussed fall risk assessment and use of a FRAT were reviewed. Comparison between FRATs and assessment of their accuracy are challenging when different classifications, definitions, risk stratifications, and inclusion criteria are used. Though there are several pediatric FRATs published in the literature, none have been found to be reliable and valid across institutions and diverse populations. This integrative review highlights the importance of choosing a FRAT based on an institution's identified risk factors and validating the tool for one's own patient population as well as using the tool in conjunction with nursing clinical judgment to guide interventions. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Evaluation and Verification of Decadal Predictions using the MiKlip Central Evaluation System - a Case Study using the MiKlip Prototype Model Data

    NASA Astrophysics Data System (ADS)

    Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich

    2016-04-01

    MiKlip is a project for medium-term climate prediction funded by the Federal Ministry of Education and Research in Germany (BMBF) and aims to create a model system that is able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools - so-called plugins - for the CES. The main focus of these plugins is on the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope. They target a wide range of scientific questions, ranging from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET), which were developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.
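
    As an illustration of the deterministic skill metrics such plugins compute (this is not the MurCSS code), a mean squared error skill score of a hindcast against a reference forecast such as climatology can be written as MSSS = 1 - MSE_hindcast / MSE_reference; a minimal sketch with toy data follows.

        # Mean squared error skill score for a decadal hindcast; values > 0 mean the
        # hindcast beats the reference forecast (e.g. climatology).
        import numpy as np

        def msss(hindcast, reference, observations):
            """MSSS = 1 - MSE(hindcast, obs) / MSE(reference, obs)."""
            mse_hindcast = np.mean((np.asarray(hindcast) - np.asarray(observations)) ** 2)
            mse_reference = np.mean((np.asarray(reference) - np.asarray(observations)) ** 2)
            return 1.0 - mse_hindcast / mse_reference

        # Example with invented annual-mean temperature anomalies:
        obs = np.array([0.10, 0.30, 0.20, 0.40])
        climatology = np.zeros_like(obs)
        hindcast = np.array([0.15, 0.25, 0.30, 0.35])
        print(f"MSSS = {msss(hindcast, climatology, obs):.2f}")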

  18. ENVIRONMENTAL SYSTEMS MANAGEMENT AS APPLIED TO WATERSHEDS, UTILIZING REMOTE SENSING, DECISION SUPPORT AND VISUALIZATION

    EPA Science Inventory

    Environmental Systems Management as a conceptual framework and as a set of interdisciplinary analytical approaches will be described within the context of sustainable watershed management, within divergent complex ecosystems. A specific subset of integrated tools is deployed to...

  19. Making Stories, Making Sense.

    ERIC Educational Resources Information Center

    Rubin, Andee

    1980-01-01

    Describes a set of tools (called Story Maker, Pre-Fab Story Maker, and Story Maker Maker) for teaching creative writing that takes advantage of the potential power of the social situation in the classroom, focuses on higher-level structures in text, and integrates reading and writing in school. (AEA)

  20. Transdisciplinary Research on Cancer-Healing Systems Between Biomedicine and the Maya of Guatemala: A Tool for Reciprocal Reflexivity in a Multi-Epistemological Setting.

    PubMed

    Berger-González, Mónica; Stauffacher, Michael; Zinsstag, Jakob; Edwards, Peter; Krütli, Pius

    2016-01-01

    Transdisciplinarity (TD) is a participatory research approach in which actors from science and society work closely together. It offers means for promoting knowledge integration and finding solutions to complex societal problems, and can be applied within a multiplicity of epistemic systems. We conducted a TD process from 2011 to 2014 between indigenous Mayan medical specialists from Guatemala and Western biomedical physicians and scientists to study cancer. Given the immense cultural gap between the partners, it was necessary to develop new methods to overcome biases induced by ethnocentric behaviors and power differentials. This article describes this intercultural cooperation and presents a method of reciprocal reflexivity (Bidirectional Emic-Etic tool) developed to overcome them. As a result of application, researchers observed successful knowledge integration at the epistemic level, the social-organizational level, and the communicative level throughout the study. This approach may prove beneficial to others engaged in facilitating participatory health research in complex intercultural settings. © The Author(s) 2015.

  1. A comparison of participation outcome measures and the International Classification of Functioning, Disability and Health Core Sets for traumatic brain injury.

    PubMed

    Chung, Pearl; Yun, Sarah Jin; Khan, Fary

    2014-02-01

    To compare the contents of participation outcome measures in traumatic brain injury with the International Classification of Functioning, Disability and Health (ICF) Core Sets for traumatic brain injury. A systematic search with an independent review process selected relevant articles to identify outcome measures in participation in traumatic brain injury. Instruments used in two or more studies were linked to the ICF categories, which identified categories in participation for comparison with the ICF Core Sets for traumatic brain injury. Selected articles (n = 101) identified participation instruments used in two or more studies (n = 9): Community Integration Questionnaire, Craig Handicap Assessment and Reporting Technique, Mayo-Portland Adaptability Inventory-4 Participation Index, Sydney Psychosocial Reintegration Scale Version-2, Participation Assessment with Recombined Tool-Objective, Community Integration Measure, Participation Objective Participation Subjective, Community Integration Questionnaire-2, and Quality of Community Integration Questionnaire. Each instrument was linked to 4-35 unique second-level ICF categories, of which 39-100% related to participation. Instruments addressed 86-100% and 50-100% of the participation categories in the Comprehensive and Brief ICF Core Sets for traumatic brain injury, respectively. Participation measures in traumatic brain injury were compared with the ICF Core Sets for traumatic brain injury. The ICF Core Sets for traumatic brain injury could contribute to the development and selection of participation measures.

  2. NGS-based approach to determine the presence of HPV and their sites of integration in human cancer genome.

    PubMed

    Chandrani, P; Kulkarni, V; Iyer, P; Upadhyay, P; Chaubal, R; Das, P; Mulherkar, R; Singh, R; Dutt, A

    2015-06-09

    Human papilloma virus (HPV) accounts for the most common cause of all virus-associated human cancers. Here, we describe the first graphical user interface (GUI)-based automated tool 'HPVDetector', for non-computational biologists, exclusively for detection and annotation of the HPV genome based on next-generation sequencing data sets. We developed a custom-made reference genome that comprises human chromosomes along with the annotated genomes of 143 HPV types as pseudochromosomes. The tool runs in a dual mode as defined by the user: a 'quick mode' to identify the presence of HPV types and an 'integration mode' to determine the genomic location of the site of integration. The input data can be a paired-end whole-exome, whole-genome or whole-transcriptome data set. The HPVDetector is available in the public domain for download: http://www.actrec.gov.in/pi-webpages/AmitDutt/HPVdetector/HPVDetector.html. On the basis of our evaluation of 116 whole-exome, 23 whole-transcriptome and 2 whole-genome data sets, we were able to identify the presence of HPV in 20 exomes and 4 transcriptomes of cervical and head and neck cancer tumour samples. Using the inbuilt annotation module of HPVDetector, we found predominant integration of the viral gene E7, a known oncogene, at the known sites 17q21, 3q27, 7q35 and Xq28, and at novel sites of integration in the human genome. Furthermore, co-infection with high-risk HPVs such as 16 and 31 was found to be mutually exclusive compared with low-risk HPV71. HPVDetector is a simple yet precise and robust tool for detecting HPV from tumour samples using a variety of next-generation sequencing platforms including whole-genome, whole-exome and transcriptome data. Two different modes (quick detection and integration mode) along with a GUI widen the usability of HPVDetector for biologists and clinicians with minimal computational knowledge.
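
    The general idea behind an integration mode can be sketched as follows: scan alignments against a combined human + HPV reference and collect read pairs with one mate on a human chromosome and the other on an HPV pseudochromosome. This is a hypothetical outline, not the HPVDetector implementation; the pseudochromosome naming convention and window size are assumptions.

        # Scan a BAM aligned to a combined human + HPV reference for discordant read
        # pairs whose mate maps to an HPV pseudochromosome, and report the most
        # frequently hit human windows as candidate integration sites.
        import pysam
        from collections import Counter

        def candidate_integration_sites(bam_path, hpv_prefix="HPV"):
            """Count human 10 kb windows whose mate maps to an HPV pseudochromosome."""
            sites = Counter()
            with pysam.AlignmentFile(bam_path, "rb") as bam:
                for read in bam.fetch(until_eof=True):
                    if read.is_unmapped or read.mate_is_unmapped or not read.is_paired:
                        continue
                    if (not read.reference_name.startswith(hpv_prefix)
                            and read.next_reference_name.startswith(hpv_prefix)):
                        window = read.reference_start // 10_000 * 10_000
                        sites[(read.reference_name, window, read.next_reference_name)] += 1
            return sites.most_common(20)

        # Example usage (hypothetical file name):
        # for (chrom, window, hpv_type), count in candidate_integration_sites("tumour.bam"):
        #     print(chrom, window, hpv_type, count)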

  3. Exploring Challenges and Opportunities of Coproduction: USDA Climate Hub Efforts to Integrate Coproduction with Applied Research and Decision Support Tool Development in the Northwest

    NASA Astrophysics Data System (ADS)

    Roesch-McNally, G.; Prendeville, H. R.

    2017-12-01

    A lack of coproduction, the joint production of new technologies or knowledge among technical experts and other groups, is arguably one of the reasons why much scientific information and resulting decision support systems are not very usable. Increasingly, public agencies and academic institutions are emphasizing the importance of coproduction of scientific knowledge and decision support systems in order to facilitate greater engagement between the scientific community and key stakeholder groups. Coproduction has been embraced as a way for the scientific community to develop actionable scientific information that will assist end users in solving real-world problems. Increasing the level of engagement and stakeholder buy-in to the scientific process is increasingly necessary, particularly in the context of growing politicization of science and the scientific process. Coproduction can be an effective way to build trust and can build-on and integrate local and traditional knowledge. Employing coproduction strategies may enable the development of more relevant and useful information and decision support tools that address stakeholder challenges at relevant scales. The USDA Northwest Climate Hub has increasingly sought ways to integrate coproduction in the development of both applied research projects and the development of decision support systems. Integrating coproduction, however, within existing institutions is not always simple, given that coproduction is often more focused on process than products and products are, for better or worse, often the primary focus of applied research and tool development projects. The USDA Northwest Climate Hub sought to integrate coproduction into our FY2017 call for proposal process. As a result we have a set of proposals and fledgling projects that fall along the engagement continuum (see Figure 1- attached). We will share the challenges and opportunities that emerged from this purposeful integration of coproduction into the work that we prioritized for funding. This effort highlights strategies for how federal agencies might consider how and whether to codify coproduction tenets into their collaborations and agenda setting.

  4. The potential impact of integrated malaria transmission control on entomologic inoculation rate in highly endemic areas.

    PubMed

    Killeen, G F; McKenzie, F E; Foy, B D; Schieffelin, C; Billingsley, P F; Beier, J C

    2000-05-01

    We have used a relatively simple but accurate model for predicting the impact of integrated transmission control on the malaria entomologic inoculation rate (EIR) at four endemic sites from across sub-Saharan Africa and the southwest Pacific. The simulated campaign incorporated modestly effective vaccine coverage, bed net use, and larval control. The results indicate that such campaigns would reduce EIRs at all four sites by 30- to 50-fold. Even without the vaccine, 15- to 25-fold reductions of EIR were predicted, implying that integrated control with a few modestly effective tools can meaningfully reduce malaria transmission in a range of endemic settings. The model accurately predicts the effects of bed nets and indoor spraying and demonstrates that they are the most effective tools available for reducing EIR. However, the impact of domestic adult vector control is amplified by measures for reducing the rate of emergence of vectors or the level of infectiousness of the human reservoir. We conclude that available tools, including currently neglected methods for larval control, can reduce malaria transmission intensity enough to alleviate mortality. Integrated control programs should be implemented to the fullest extent possible, even in areas of intense transmission, using simple models as decision-making tools. However, we also conclude that to eliminate malaria in many areas of intense transmission is beyond the scope of methods which developing nations can currently afford. New, cost-effective, practical tools are needed if malaria is ever to be eliminated from highly endemic areas.

  5. PH5 for integrating and archiving different data types

    NASA Astrophysics Data System (ADS)

    Azevedo, Steve; Hess, Derick; Beaudoin, Bruce

    2016-04-01

    PH5 is IRIS PASSCAL's file organization of HDF5 used for seismic data. The extensibility and portability of HDF5 allows the PH5 format to evolve and operate on a variety of platforms and interfaces. To make PH5 even more flexible, the seismic metadata is separated from the time series data in order to achieve gains in performance as well as ease of use and to simplify user interaction. This separation affords easy updates to metadata after the data are archived without having to access waveform data. PH5 is currently used for integrating and archiving active source, passive source, and onshore-offshore seismic data sets with the IRIS Data Management Center (DMC). Active development to make PH5 fully compatible with FDSN web services and deliver StationXML is near completion. We are also exploring the feasibility of utilizing QuakeML for active seismic source representation. The PH5 software suite, PIC KITCHEN, comprises in-field tools that include data ingestion (e.g. RefTek format, SEG-Y, and SEG-D), meta-data management tools including QC, and a waveform review tool. These tools enable building archive-ready data in the field during active source experiments, greatly decreasing the time to produce research-ready data sets. Once data are archived, our online request page generates a unique web form and pre-populates much of it based on the metadata provided from the PH5 file. The data requester can then intuitively select the extraction parameters as well as the data subsets they wish to receive (current output formats include SEG-Y, SAC, mseed). The web interface then passes this on to the PH5 processing tools to generate the requested seismic data, and e-mails the requester a link to the data set automatically as soon as the data are ready. PH5 file organization was originally designed to hold seismic time series data and meta-data from controlled source experiments using RefTek data loggers. The flexibility of HDF5 has enabled us to extend the use of PH5 in several areas, one of which is using PH5 to handle very large data sets. PH5 is also good at integrating data from various types of seismic experiments such as OBS, onshore-offshore, controlled source, and passive recording. HDF5 is capable of holding practically any type of digital data, so integrating GPS data with seismic data is possible. Since PH5 is a common format and data contained in HDF5 are randomly accessible, it has been easy to extend PH5 to include new input and output data formats as community needs arise.
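
    A minimal h5py sketch (not the PH5 format itself) of the separation described above: metadata stored apart from the waveform arrays so it can be updated without rewriting the time series; group and attribute names are made up.

        # Keep large waveform datasets and editable station metadata in separate groups
        # of one HDF5 file, so metadata corrections never touch the time-series arrays.
        import h5py
        import numpy as np

        with h5py.File("experiment.h5", "w") as f:
            waveforms = f.create_group("waveforms")
            waveforms.create_dataset("station_001", data=np.random.randn(100_000),
                                     compression="gzip")

            meta = f.create_group("metadata").create_group("station_001")
            meta.attrs["latitude"] = 34.05
            meta.attrs["longitude"] = -106.90
            meta.attrs["sample_rate_hz"] = 250.0

        # Later: correct a metadata value without rewriting the waveform dataset.
        with h5py.File("experiment.h5", "r+") as f:
            f["metadata/station_001"].attrs["sample_rate_hz"] = 500.0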

  6. Model-based setup assistant for progressive tools

    NASA Astrophysics Data System (ADS)

    Springer, Robert; Gräler, Manuel; Homberg, Werner; Henke, Christian; Trächtler, Ansgar

    2018-05-01

    In the field of production systems, globalization and technological progress lead to increasing requirements regarding part quality, delivery time and costs. Hence, today's production is challenged much more than a few years ago: it has to be very flexible and economically produce small batch sizes to satisfy consumers' demands and avoid unnecessary stock. Furthermore, a trend towards increasing functional integration continues to lead to an ongoing miniaturization of sheet metal components. In the electric connectivity industry, for example, miniaturized connectors are manufactured by progressive tools, which are usually used for very large batches. These tools are installed in mechanical presses and then set up by a technician, who has to manually adjust a wide range of punch-bending operations. Disturbances like material thickness, temperatures, lubrication or tool wear complicate the setup procedure. In view of the increasing demand for production flexibility, this time-consuming process has to be handled more and more often. In this paper, a new approach for a model-based setup assistant is proposed as a solution, which is exemplarily applied in combination with a progressive tool. First, progressive tools, and more specifically their setup process, are described and, based on that, the challenges are pointed out. As a result, a systematic process to set up the machines is introduced. Subsequently, the process is investigated with an FE analysis regarding the effects of the disturbances. In the next step, design of experiments is used to systematically develop a regression model of the system's behaviour. This model is integrated within an optimization in order to calculate optimal machine parameters and the necessary subsequent adjustment of the progressive tool due to the disturbances. Finally, the assistant is tested in a production environment and the results are discussed.
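
    The fit-then-optimize step described above can be sketched as follows; this is an illustrative outline under assumed variable names, a quadratic model form and an invented target angle, not the authors' assistant.

        # Fit a response-surface model from designed experiments, then search for the
        # machine parameter that compensates a measured disturbance.
        import numpy as np
        from scipy.optimize import minimize

        # Designed experiments: (punch stroke [mm], material thickness [mm]) -> bend angle [deg].
        X = np.array([[1.0, 0.30], [1.2, 0.30], [1.0, 0.35], [1.2, 0.35], [1.1, 0.32]])
        y = np.array([88.5, 90.2, 87.9, 89.6, 89.1])

        def features(x):
            s, t = x
            return np.array([1.0, s, t, s * t, s**2, t**2])   # quadratic model terms

        coeffs, *_ = np.linalg.lstsq(np.vstack([features(x) for x in X]), y, rcond=None)
        predict = lambda x: features(x) @ coeffs

        # Given the measured thickness of the current coil, find the stroke that hits 90 deg.
        measured_thickness = 0.33
        objective = lambda s: (predict([s[0], measured_thickness]) - 90.0) ** 2
        best = minimize(objective, x0=[1.1], bounds=[(0.9, 1.3)])
        print(f"suggested punch stroke: {best.x[0]:.3f} mm")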

  7. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    PubMed

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/

  8. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110

  9. Indico central - events organisation, ergonomics and collaboration tools integration

    NASA Astrophysics Data System (ADS)

    Benito González López, José; Ferreira, José Pedro; Baron, Thomas

    2010-04-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, which is the result of a careful maturation process, includes improvements that will set a new reference in its domain. The presentation will focus on the description of the new features of the tool and the user feedback process, which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration e-payment, audiovisual recording, webcast, room booking, and videoconference support).

  10. geneCommittee: a web-based tool for extensively testing the discriminatory power of biologically relevant gene sets in microarray data classification.

    PubMed

    Reboiro-Jato, Miguel; Arrais, Joel P; Oliveira, José Luis; Fdez-Riverola, Florentino

    2014-01-30

    The diagnosis and prognosis of several diseases can be shortened through the use of different large-scale genome experiments. In this context, microarrays can generate expression data for a huge set of genes. However, to obtain solid statistical evidence from the resulting data, it is necessary to train and to validate many classification techniques in order to find the best discriminative method. This is a time-consuming process that normally depends on intricate statistical tools. geneCommittee is a web-based interactive tool for routinely evaluating the discriminative classification power of custom hypotheses in the form of biologically relevant gene sets. While the user can work with different gene set collections and several microarray data files to configure specific classification experiments, the tool is able to run several tests in parallel. Provided with a straightforward and intuitive interface, geneCommittee is able to render valuable information for diagnostic analyses and clinical management decisions based on systematically evaluating custom hypotheses over different data sets using complementary classifiers, a key aspect in clinical research. geneCommittee allows the enrichment of raw microarray data with gene functional annotations, producing integrated datasets that simplify the construction of better discriminative hypotheses, and allows the creation of a set of complementary classifiers. The trained committees can then be used for clinical research and diagnosis. Full documentation including common use cases and guided analysis workflows is freely available at http://sing.ei.uvigo.es/GC/.
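
    The committee idea can be illustrated with a small sketch (not geneCommittee's implementation): one classifier per biologically relevant gene set, restricted to that set's genes, combined by majority vote; the gene sets and data below are synthetic.

        # Train one classifier per gene set on synthetic expression data and combine
        # their predictions by majority vote.
        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        expression = rng.normal(size=(60, 200))          # 60 samples x 200 genes (toy data)
        labels = rng.integers(0, 2, size=60)             # two diagnostic classes (toy labels)
        gene_sets = {"pathway_A": list(range(0, 20)),    # column indices per gene set
                     "pathway_B": list(range(50, 80))}

        # One classifier per gene set, trained only on that set's genes.
        committee = {name: SVC(kernel="linear").fit(expression[:, cols], labels)
                     for name, cols in gene_sets.items()}

        def committee_predict(profile):
            """Majority vote of the per-gene-set classifiers for one expression profile."""
            votes = [committee[name].predict(profile[cols].reshape(1, -1))[0]
                     for name, cols in gene_sets.items()]
            return int(round(sum(votes) / len(votes)))

        print(committee_predict(expression[0]))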

  11. NEEMO 21: Tools, Techniques, Technologies & Training for Science Exploration EVA

    NASA Technical Reports Server (NTRS)

    Graff, Trevor

    2016-01-01

    The 21st mission of the NASA Extreme Environment Mission Operations (NEEMO) was a highly integrated operational test and evaluation of tools, techniques, technologies, and training for science driven exploration during Extravehicular Activity (EVA).The 16-day mission was conducted from the Aquarius habitat, an underwater laboratory, off the coast of Key Largo, FL. The unique facility, authentic science objectives, and diverse skill-sets of the crew/team facilitate the planning and design for future space exploration.

  12. Sharing clinical information across care settings: the birth of an integrated assessment system

    PubMed Central

    Gray, Leonard C; Berg, Katherine; Fries, Brant E; Henrard, Jean-Claude; Hirdes, John P; Steel, Knight; Morris, John N

    2009-01-01

    Background Population ageing, the emergence of chronic illness, and the shift away from institutional care challenge conventional approaches to assessment systems which traditionally are problem and setting specific. Methods From 2002, the interRAI research collaborative undertook development of a suite of assessment tools to support assessment and care planning of persons with chronic illness, frailty, disability, or mental health problems across care settings. The suite constitutes an early example of a "third generation" assessment system. Results The rationale and development strategy for the suite is described, together with a description of potential applications. To date, ten instruments comprise the suite, each comprising "core" items shared among the majority of instruments and "optional" items that are specific to particular care settings or situations. Conclusion This comprehensive suite offers the opportunity for integrated multi-domain assessment, enabling electronic clinical records, data transfer, ease of interpretation and streamlined training. PMID:19402891

  13. Hymenoptera Genome Database: integrating genome annotations in HymenopteraMine

    PubMed Central

    Elsik, Christine G.; Tayal, Aditi; Diesh, Colin M.; Unni, Deepak R.; Emery, Marianne L.; Nguyen, Hung N.; Hagen, Darren E.

    2016-01-01

    We report an update of the Hymenoptera Genome Database (HGD) (http://HymenopteraGenome.org), a model organism database for insect species of the order Hymenoptera (ants, bees and wasps). HGD maintains genomic data for 9 bee species, 10 ant species and 1 wasp, including the versions of genome and annotation data sets published by the genome sequencing consortiums and those provided by NCBI. A new data-mining warehouse, HymenopteraMine, based on the InterMine data warehousing system, integrates the genome data with data from external sources and facilitates cross-species analyses based on orthology. New genome browsers and annotation tools based on JBrowse/WebApollo provide easy genome navigation, and viewing of high throughput sequence data sets and can be used for collaborative genome annotation. All of the genomes and annotation data sets are combined into a single BLAST server that allows users to select and combine sequence data sets to search. PMID:26578564

  14. ParCAT: A Parallel Climate Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single run data sets increasing into 10's, 100's and even 1000's of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather, reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. ParCAT is implemented in C to provide efficient file IO. The file IO operations in the toolkit use the parallel-netcdf library; this enables the code to use the parallel IO capabilities of modern HPC systems. Analysis that currently requires an estimated 12+ hours with the traditional CCSM Land Model Diagnostics Package can now be performed in as little as 30 minutes on a single desktop workstation and a few minutes for relatively small jobs completed on modern HPC systems such as ORNL's Jaguar.
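
    The kinds of reductions ParCAT parallelizes can be illustrated with a short serial sketch; ParCAT itself is a C command-line tool, and the variable and file names here are hypothetical.

        # Serial example of two basic reductions: a time-mean field and the difference
        # of the time-mean fields of two simulation runs, read from NetCDF output.
        import numpy as np
        from netCDF4 import Dataset

        def temporal_mean(path, varname="tas"):
            """Mean over the time axis, returning a (lat, lon) field."""
            with Dataset(path) as nc:
                data = nc.variables[varname][:]        # shape (time, lat, lon)
                return np.ma.mean(data, axis=0)

        def run_difference(path_a, path_b, varname="tas"):
            """Difference of the time-mean fields of two runs."""
            return temporal_mean(path_a, varname) - temporal_mean(path_b, varname)

        # Example usage (hypothetical files):
        # diff = run_difference("run1_tas.nc", "run2_tas.nc")
        # print("global mean difference:", float(diff.mean()))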

  15. Software packager user's guide

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.

  16. Anatomical information in radiation treatment planning.

    PubMed

    Kalet, I J; Wu, J; Lease, M; Austin-Seymour, M M; Brinkley, J F; Rosse, C

    1999-01-01

    We report on experience and insights gained from prototyping, for clinical radiation oncologists, a new access tool for the University of Washington Digital Anatomist information resources. This access tool is designed to integrate with a radiation therapy planning (RTP) system in use in a clinical setting. We hypothesize that the needs of practitioners in a clinical setting are different from the needs of students, the original targeted users of the Digital Anatomist system, but that a common knowledge resource can serve both. Our prototype was designed to help define those differences and study the feasibility of a full anatomic reference system that will support both clinical radiation therapy and all the existing educational applications.

  17. Customer-experienced rapid prototyping

    NASA Astrophysics Data System (ADS)

    Zhang, Lijuan; Zhang, Fu; Li, Anbo

    2008-12-01

    To describe GIS requirements accurately and comprehend them quickly, this article integrates the ideas of QFD (Quality Function Deployment) and UML (Unified Modeling Language), analyzes the deficiencies of the prototype development model, proposes the idea of Customer-Experienced Rapid Prototyping (CE-RP), and describes the process and framework of the CE-RP in detail from the perspective of the characteristics of Modern-GIS. The CE-RP is mainly composed of Customer Tool-Sets (CTS), Developer Tool-Sets (DTS) and a Barrier-Free Semantic Interpreter (BF-SI), and is performed by two roles: customer and developer. The main purpose of the CE-RP is to produce unified and authorized requirements data models shared between the customer and the software developer.

  18. Pedagogical Approaches for Technology-Integrated Science Teaching

    ERIC Educational Resources Information Center

    Hennessy, Sara; Wishart, Jocelyn; Whitelock, Denise; Deaney, Rosemary; Brawn, Richard; la Velle, Linda; McFarlane, Angela; Ruthven, Kenneth; Winterbottom, Mark

    2007-01-01

    The two separate projects described have examined how teachers exploit computer-based technologies in supporting learning of science at secondary level. This paper examines how pedagogical approaches associated with these technological tools are adapted to both the cognitive and structuring resources available in the classroom setting. Four…

  19. Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program

    NASA Astrophysics Data System (ADS)

    Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas

    2016-04-01

    The recently adopted Sustainable Development Goals (SDGs) - a set of 17 measurable and time-bound goals with 169 associated targets for 2030 - are highly inclusive challenges for the world community, ranging from eliminating poverty to human rights, inequality, a secure world and protection of the environment. Each individual goal or target by itself presents an enormous task; taken together they are overwhelming. There are strong and weak interlinkages, and hence trade-offs and complementarities, among goals and targets. Some targets may affect several goals, while other goals and targets may conflict or be mutually exclusive (Ref). Meeting each of these requires the judicious exploitation of resources, with energy playing an important role. Such complexity demands an integrated approach using systems analysis tools to support informed policy formulation, planning, allocation of scarce resources, monitoring of progress and effectiveness, and review at different scales. There is no one-size-fits-all methodology that could conceivably include all goals and targets simultaneously. But there are methodologies encapsulating critical subsets of the goals and targets with strong interlinkages, with a 'soft' reflection on the weak interlinkages. Universal food security or sustainable energy for all inherently supports goals and targets on human rights and equality, but possibly at the cost of biodiversity or desertification. Integrated analysis and planning tools are not yet commonplace at national universities - or indeed in many policy making organs. What is needed is a fundamental realignment of institutions and integration of their planning processes and decision making. We introduce a series of open source tools to support the SDG planning and implementation process. The Global User-friendly CLEW Open Source (GLUCOSE) tool optimizes resource interactions and constraints; the Global Electrification Tool kit (GETit) provides the first global spatially explicit electrification simulator; a national CLEW tool allows for the optimization of national-level integrated resource use; and Macro-CLEW presents the same while allowing for detailed economic-biophysical interactions. Finally, the open Model Management Infrastructure (MoManI) is presented, which allows for the rapid prototyping of additions to the existing tools, or of entirely new resource optimization tools. Collectively these tools provide insights into some fifteen of the SDGs and are made publicly available with support to governments and academic institutions.

  20. Software Management Environment (SME): Components and algorithms

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1994-01-01

    This document presents the components and algorithms of the Software Management Environment (SME), a management tool developed for the Software Engineering Branch (Code 552) of the Flight Dynamics Division (FDD) of the Goddard Space Flight Center (GSFC). The SME provides an integrated set of visually oriented, experience-based tools that can assist software development managers in managing and planning software development projects. This document describes and illustrates the analysis functions that underlie the SME's project monitoring, estimation, and planning tools. 'SME Components and Algorithms' is a companion reference to 'SME Concepts and Architecture' and 'Software Engineering Laboratory (SEL) Relationships, Models, and Management Rules.'

  1. CuGene as a tool to view and explore genomic data

    NASA Astrophysics Data System (ADS)

    Haponiuk, Michał; Pawełkowicz, Magdalena; Przybecki, Zbigniew; Nowak, Robert M.

    2017-08-01

    Integrated CuGene is an easy-to-use, open-source, on-line tool that can be used to browse, analyze, and query genomic data and annotations. It places annotation tracks beneath genome coordinate positions, allowing rapid visual correlation of different types of information. It also allows users to upload and display their own experimental results or annotation sets. An important feature of the application is the ability to find similarities between sequences by applying four algorithms of differing accuracy. The presented tool was tested on real genomic data and is used extensively by the Polish Consortium of Cucumber Genome Sequencing.

  2. kpLogo: positional k-mer analysis reveals hidden specificity in biological sequences

    PubMed Central

    2017-01-01

    Motifs of only 1–4 letters can play important roles when present at key locations within macromolecules. Because existing motif-discovery tools typically miss these position-specific short motifs, we developed kpLogo, a probability-based logo tool for integrated detection and visualization of position-specific ultra-short motifs from a set of aligned sequences. kpLogo also overcomes the limitations of conventional motif-visualization tools in handling positional interdependencies and utilizing ranked or weighted sequences increasingly available from high-throughput assays. kpLogo can be found at http://kplogo.wi.mit.edu/. PMID:28460012
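
    The sketch below illustrates the general idea of positional k-mer scoring - counting each k-mer at each column of a set of aligned sequences and testing its enrichment against a uniform background with a binomial tail. It is a simplification, not kpLogo's actual statistics, and the example sequences are invented.

```python
# Simplified positional k-mer scoring in the spirit of kpLogo (not its actual
# algorithm): count each k-mer at each position of aligned, equal-length
# sequences and score enrichment against a uniform background.
from collections import Counter
from math import log10

from scipy.stats import binom


def positional_kmer_scores(seqs, k=2, alphabet="ACGU"):
    n, length = len(seqs), len(seqs[0])
    background = 1.0 / len(alphabet) ** k            # uniform expectation
    scores = {}                                      # (position, kmer) -> -log10 p
    for pos in range(length - k + 1):
        counts = Counter(s[pos:pos + k] for s in seqs)
        for kmer, c in counts.items():
            p = binom.sf(c - 1, n, background)       # one-sided binomial tail
            scores[(pos, kmer)] = -log10(max(p, 1e-300))
    return scores


if __name__ == "__main__":
    toy = ["ACGUACGU", "ACGAACGU", "ACGUUCGU", "ACGUACGA"]   # invented sequences
    best = sorted(positional_kmer_scores(toy).items(), key=lambda kv: -kv[1])[:5]
    for (pos, kmer), score in best:
        print(f"position {pos}: {kmer}  -log10(p) = {score:.2f}")
```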

  3. iVAX: An integrated toolkit for the selection and optimization of antigens and the design of epitope-driven vaccines.

    PubMed

    Moise, Leonard; Gutierrez, Andres; Kibria, Farzana; Martin, Rebecca; Tassone, Ryan; Liu, Rui; Terry, Frances; Martin, Bill; De Groot, Anne S

    2015-01-01

    Computational vaccine design, also known as computational vaccinology, encompasses epitope mapping, antigen selection and immunogen design using computational tools. The iVAX toolkit is an integrated set of tools that has been in development since 1998 by De Groot and Martin. It comprises a suite of immunoinformatics algorithms for triaging candidate antigens, selecting immunogenic and conserved T cell epitopes, eliminating regulatory T cell epitopes, and optimizing antigens for immunogenicity and protection against disease. iVAX has been applied to vaccine development programs for emerging infectious diseases, cancer antigens and biodefense targets. Several iVAX vaccine design projects have had success in pre-clinical studies in animal models and are progressing toward clinical studies. The toolkit now incorporates a range of immunoinformatics tools for infectious disease and cancer immunotherapy vaccine design. This article will provide a guide to the iVAX approach to computational vaccinology.

  4. BioVLAB-MMIA: a cloud environment for microRNA and mRNA integrated analysis (MMIA) on Amazon EC2.

    PubMed

    Lee, Hyungro; Yang, Youngik; Chae, Heejoon; Nam, Seungyoon; Choi, Donghoon; Tangchaisin, Patanachai; Herath, Chathura; Marru, Suresh; Nephew, Kenneth P; Kim, Sun

    2012-09-01

    MicroRNAs, by regulating the expression of hundreds of target genes, play critical roles in developmental biology and the etiology of numerous diseases, including cancer. As a vast amount of microRNA expression profile data is now publicly available, the integration of microRNA expression data sets with gene expression profiles is a key research problem in life science research. However, conducting genome-wide microRNA-mRNA (gene) integration currently requires sophisticated, high-end informatics tools and significant expertise in bioinformatics and computer science to carry out the complex integration analysis. In addition, increased computing infrastructure capabilities are essential to accommodate large data sets. In this study, we have extended the BioVLAB cloud workbench to develop an environment for the integrated analysis of microRNA and mRNA expression data, named BioVLAB-MMIA. The workbench facilitates computations on Amazon EC2 and S3 resources orchestrated by the XBaya Workflow Suite. The advantages of BioVLAB-MMIA over the web-based MMIA system include: 1) it is readily expanded as new computational tools become available; 2) it is easily modified by re-configuring graphic icons in the workflow; 3) on-demand cloud computing resources can be used on an "as needed" basis; and 4) distributed orchestration supports complex, long-running workflows asynchronously. We believe that BioVLAB-MMIA will be an easy-to-use computing environment for researchers who plan to perform genome-wide microRNA-mRNA (gene) integrated analysis tasks.

  5. Using Laboratory Homework to Facilitate Skill Integration and Assess Understanding in Intermediate Physics Courses

    NASA Astrophysics Data System (ADS)

    Johnston, Marty; Jalkio, Jeffrey

    2013-04-01

    By the time students reach intermediate-level physics courses, they have been exposed to a broad set of analytical, experimental, and computational skills. However, their ability to independently integrate these skills into the study of a physical system is often weak. To address this weakness and assess their understanding of the underlying physical concepts, we have introduced laboratory homework into lecture-based, junior-level theoretical mechanics and electromagnetics courses. A laboratory homework set replaces a traditional one and emphasizes the analysis of a single system. In an exercise, students use analytical and computational tools to predict the behavior of a system and design a simple measurement to test their model. The laboratory portion of the exercises is straightforward, and the emphasis is on concept integration and application. The short student reports we collect have revealed misconceptions that were not apparent in reviewing traditional homework and test problems. Work continues on refining the current problems and expanding the problem sets.

  6. Managing and Communicating Operational Workflow: Designing and Implementing an Electronic Outpatient Whiteboard.

    PubMed

    Steitz, Bryan D; Weinberg, Stuart T; Danciu, Ioana; Unertl, Kim M

    2016-01-01

    Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings.

  7. Current Development Status of an Integrated Tool for Modeling Quasi-static Deformation in the Solid Earth

    NASA Astrophysics Data System (ADS)

    Williams, C. A.; Dicaprio, C.; Simons, M.

    2003-12-01

    With the advent of projects such as the Plate Boundary Observatory and future InSAR missions, spatially dense geodetic data of high quality will provide an increasingly detailed picture of the movement of the earth's surface. To interpret such information, powerful and easily accessible modeling tools are required. We are presently developing such a tool that we feel will meet many of the needs for evaluating quasi-static earth deformation. As a starting point, we begin with a modified version of the finite element code TECTON, which has been specifically designed to solve tectonic problems involving faulting and viscoelastic/plastic earth behavior. As our first priority, we are integrating the code into the GeoFramework, which is an extension of the Python-based Pyre modeling framework. The goal of this framework is to provide simplified user interfaces for powerful modeling codes, to provide easy access to utilities such as meshers and visualization tools, and to provide a tight integration between different modeling tools so they can interact with each other. The initial integration of the code into this framework is essentially complete, and a more thorough integration, where Python-based drivers control the entire solution, will be completed in the near future. We have an evolving set of priorities that we expect to solidify as we receive more input from the modeling community. Current priorities include the development of linear and quadratic tetrahedral elements, the development of a parallelized version of the code using the PETSc libraries, the addition of more complex rheologies, realistic fault friction models, adaptive time stepping, and spherical geometries. In this presentation we describe current progress toward our various priorities, briefly describe the structure of the code within the GeoFramework, and demonstrate some sample applications.

  8. Simulation of networks of spiking neurons: A review of tools and strategies

    PubMed Central

    Brette, Romain; Rudolph, Michelle; Carnevale, Ted; Hines, Michael; Beeman, David; Bower, James M.; Diesmann, Markus; Morrison, Abigail; Goodman, Philip H.; Harris, Frederick C.; Zirpe, Milind; Natschläger, Thomas; Pecevski, Dejan; Ermentrout, Bard; Djurfeldt, Mikael; Lansner, Anders; Rochel, Olivier; Vieville, Thierry; Muller, Eilif; Davison, Andrew P.; El Boustani, Sami

    2009-01-01

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview different simulators and simulation environments presently available (restricted to those freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with an aim to allow the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin–Huxley type and integrate-and-fire models, interacting with current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks. PMID:17629781
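
    As a minimal, self-contained example of the clock-driven strategy discussed in the review (not code taken from any of the simulators surveyed), the sketch below advances a small network of leaky integrate-and-fire neurons with current-based synapses on a fixed time grid; all parameter values are arbitrary.

```python
# Minimal clock-driven simulation of leaky integrate-and-fire neurons with
# current-based synapses; parameters are arbitrary illustrative choices.
import numpy as np


def simulate_lif(n=100, t_stop=0.5, dt=1e-4, seed=0):
    rng = np.random.default_rng(seed)
    tau, v_rest, v_thresh, v_reset = 0.02, -65e-3, -50e-3, -65e-3
    w = rng.normal(0.0, 0.5e-3, size=(n, n))      # synaptic weights (V), mixed sign
    drive = 16e-3 + 2e-3 * rng.random(n)          # constant external drive (V)
    v = np.full(n, v_rest)
    spikes = []                                   # list of (time, spiking indices)
    for step in range(int(t_stop / dt)):
        v += dt / tau * (v_rest - v + drive)      # Euler step of the membrane equation
        fired = v >= v_thresh
        if fired.any():
            spikes.append((step * dt, np.flatnonzero(fired)))
            v += w[:, fired].sum(axis=1)          # deliver spikes instantaneously
            v[fired] = v_reset                    # reset; no refractory period here
    return spikes


if __name__ == "__main__":
    events = simulate_lif()
    print(sum(len(idx) for _, idx in events), "spikes in 0.5 s of simulated time")
```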

  9. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenneth D. Luff

    2002-06-30

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). Luff Exploration Company is applying these tools for analysis of carbonate reservoirs in the southern Williston Basin. The integrated software programs are designed to be used by a small team consisting of an engineer, a geologist and a geophysicist. The software tools are flexible and robust, allowing application in many environments for hydrocarbon reservoirs. Keystone elements of the software tools include clustering and neural-network techniques. The tools are used to transform seismic attribute data to reservoir characteristics such as storage (phi-h), probable oil-water contacts, structural depths and structural growth history. When these reservoir characteristics are combined with neural network or fuzzy logic solvers, they can provide a more complete description of the reservoir. This leads to better estimates of hydrocarbons in place, areal limits and potential for infill or step-out drilling. These tools were developed and tested using seismic, geologic and well data from the Red River Play in Bowman County, North Dakota and Harding County, South Dakota. The geologic setting for the Red River Formation is shallow-shelf carbonate at depths of 8,000 to 10,000 ft.

  10. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenneth D. Luff

    2002-09-30

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). Luff Exploration Company is applying these tools for analysis of carbonate reservoirs in the southern Williston Basin. The integrated software programs are designed to be used by a small team consisting of an engineer, a geologist and a geophysicist. The software tools are flexible and robust, allowing application in many environments for hydrocarbon reservoirs. Keystone elements of the software tools include clustering and neural-network techniques. The tools are used to transform seismic attribute data to reservoir characteristics such as storage (phi-h), probable oil-water contacts, structural depths and structural growth history. When these reservoir characteristics are combined with neural network or fuzzy logic solvers, they can provide a more complete description of the reservoir. This leads to better estimates of hydrocarbons in place, areal limits and potential for infill or step-out drilling. These tools were developed and tested using seismic, geologic and well data from the Red River Play in Bowman County, North Dakota and Harding County, South Dakota. The geologic setting for the Red River Formation is shallow-shelf carbonate at depths of 8,000 to 10,000 ft.

  11. ISAAC - InterSpecies Analysing Application using Containers.

    PubMed

    Baier, Herbert; Schultz, Jörg

    2014-01-15

    Information about genes, transcripts and proteins is spread over a wide variety of databases. Different tools have been developed using these databases to identify biological signals in gene lists from large-scale analyses. Mostly, they search for enrichment of specific features. These tools, however, do not allow an exploratory walk through different views or changing the gene lists as new hypotheses emerge. To fill this niche, we have developed ISAAC, the InterSpecies Analysing Application using Containers. The central idea of this web-based tool is to enable the analysis of sets of genes, transcripts and proteins under different biological viewpoints and to interactively modify these sets at any point of the analysis. Detailed history and snapshot information allows tracing each action. Furthermore, one can easily switch back to previous states and perform new analyses. Currently, sets can be viewed in the context of genomes, protein functions, protein interactions, pathways, regulation, diseases and drugs. Additionally, users can switch between species with an automatic, orthology-based translation of existing gene sets. As today's research is usually performed in larger teams and consortia, ISAAC provides group-based functionality: sets as well as results of analyses can be exchanged between members of groups. ISAAC fills the gap between primary databases and tools for the analysis of large gene lists. With its highly modular, JavaEE-based design, the implementation of new modules is straightforward. Furthermore, ISAAC comes with an extensive web-based administration interface including tools for the integration of third-party data. Thus, a local installation is easily feasible. In summary, ISAAC is tailor-made for highly exploratory, interactive analyses of gene, transcript and protein sets in a collaborative environment.

  12. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel

    USDA-ARS?s Scientific Manuscript database

    Interactive modules for data exploration and visualization (imDEV) is a Microsoft Excel spreadsheet embedded application providing an integrated environment for the analysis of omics data sets with a user-friendly interface. Individual modules were designed to provide toolsets to enable interactive ...

  13. Integrated description of agricultural field experiments and production: the ICASA version 2.0 data standards

    USDA-ARS?s Scientific Manuscript database

    Agricultural research increasingly seeks to quantify complex interactions of processes for a wide range of environmental conditions and crop management scenarios, leading to investigation where multiple sets of experimental data are examined using tools such as simulation and regression. The use of ...

  14. Methods and Strategies: Science Notebooks as Learning Tools

    ERIC Educational Resources Information Center

    Fulton, Lori

    2017-01-01

    Writing in science is a natural way to integrate science and literacy and meet the goals set by the "Next Generation Science Standards" ("NGSS") and the "Common Core State Standards" ("CCSS"), which call for learners to be engaged with the language of science. This means that students should record…

  15. Automated support for experience-based software management

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest but also the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. To provide these functions, the tool utilizes a vast corporate memory that includes a database of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed by all software development organizations.
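
    The sketch below is only a schematic of the "compare against the typical project" idea described above; the metric, the historical values, and the one-standard-deviation band are all invented for illustration and are not the SME's actual models or rules.

```python
# Schematic comparison of an ongoing project against the "typical" project.
# The metric (cumulative defects), the history, and the banding rule are
# invented for illustration.
import numpy as np

# rows = completed historical projects, columns = successive project phases
historical_defects = np.array([
    [2.0, 10.0, 25.0],
    [1.0,  8.0, 20.0],
    [3.0, 12.0, 30.0],
])


def assess(current, history, n_sigma=1.0):
    """Flag phases where the ongoing project leaves the typical band."""
    mean = history.mean(axis=0)
    std = history.std(axis=0, ddof=1)
    for phase, value in enumerate(current):
        low = mean[phase] - n_sigma * std[phase]
        high = mean[phase] + n_sigma * std[phase]
        status = "within the typical range"
        if value < low:
            status = "below the typical range"
        elif value > high:
            status = "above the typical range"
        print(f"phase {phase + 1}: {value:5.1f} cumulative defects ({status})")


assess([2.0, 11.0, 36.0], historical_defects)
```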

  16. Development approach to an enterprise-wide medication reconciliation tool in a free-standing pediatric hospital with commercial best-of-breed systems.

    PubMed

    Yu, Feliciano B; Leising, Scott; Turner, Scott

    2007-10-11

    Medication reconciliation is essential to providing a safer patient environment during transitions of care in the clinical setting. Current solutions include a mixed-bag of paper and electronic processes. Best-of-breed health information systems architecture poses a specific challenge to organizations that have limited software development resources. Using readily available service-oriented technology, a prototype for an integrated medication reconciliation tool is developed for use in an academic pediatric hospital with commercial systems.

  17. Reducing Information Overload in Large Seismic Data Sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.

    2000-08-02

    Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g., single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially provide a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics are identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets. The tools could also be used as a means to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. All of these software tools would provide the researcher with unprecedented power without having to learn the intricacies and complexities of relational database systems.
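
    As a toy illustration of the waveform-clustering idea behind the dendrogram tool (not the MatSeis implementation, and omitting phase windowing, filtering, enveloping, and multi-station combination), the sketch below clusters synthetic waveforms by one minus their maximum normalized cross-correlation.

```python
# Toy version of clustering events by waveform similarity; the synthetic
# waveforms stand in for recorded data.
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.signal import correlate


def _normalize(w):
    w = w - w.mean()
    return w / np.linalg.norm(w)


def correlation_distances(waveforms):
    """Condensed pairwise distance vector for scipy's linkage()."""
    dist = []
    for i in range(len(waveforms)):
        for j in range(i + 1, len(waveforms)):
            a, b = _normalize(waveforms[i]), _normalize(waveforms[j])
            dist.append(1.0 - float(np.max(correlate(a, b, mode="full"))))
    return np.array(dist)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 1.0, 500)
    # two synthetic "event families" plus noise
    events = [np.sin(2 * np.pi * (5 + k % 2) * t) + 0.2 * rng.standard_normal(t.size)
              for k in range(6)]
    merges = linkage(correlation_distances(events), method="complete")
    print(merges)   # scipy.cluster.hierarchy.dendrogram(merges) would draw the diagram
```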

  18. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    NASA Astrophysics Data System (ADS)

    Nomaguch, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures the design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording design snapshots across the design tools. The argumentation level captures the process of setting problems and alternatives. Linking the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis-and-verification processes through design operations over design tools. In DRIFT, such a linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and the design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A TMS (Truth Maintenance System) mechanism is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. It concludes with a discussion of future issues.

  19. A case study on the application of International Classification of Functioning, Disability and Health (ICF)-based tools for vocational rehabilitation in spinal cord injury.

    PubMed

    Glässel, Andrea; Rauch, Alexandra; Selb, Melissa; Emmenegger, Karl; Lückenkemper, Miriam; Escorpizo, Reuben

    2012-01-01

    Vocational rehabilitation (VR) plays a key role in bringing persons with acquired disabilities back to work, while encouraging employment participation. The purpose of this case study is to illustrate the systematic application of International Classification of Functioning, Disability, and Health (ICF)-based documentation tools by using ICF Core Sets in VR shown with a case example of a client with traumatic spinal cord injury (SCI). The client was a 26-year-old male with paraplegia (7th thoracic level), working in the past as a mover. This case study describes the integration of the ICF Core Sets for VR into an interdisciplinary rehabilitation program by using ICF-based documentation tools. Improvements in the client's impairments, activity limitations, and participation restrictions were observed following rehabilitation. Goals in different areas of functioning were achieved. The use of the ICF Core Sets in VR allows a comprehensive assessment of the client's level of functioning and intervention planning. Specifically, the Brief ICF Core Set in VR can provide domains for intervention relevant to each member of an interdisciplinary team and hence, can facilitate the VR management process in a SCI center in Switzerland.

  20. Boutiques: a flexible framework to integrate command-line applications in computing platforms.

    PubMed

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-05-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science.
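
    A schematic of the descriptor-plus-launcher idea is sketched below: a JSON document describes a tool's command-line template and its inputs, and a thin launcher substitutes concrete values for the value-keys. The field names follow the spirit of Boutiques descriptors but are not guaranteed to match the exact schema, and the "smooth" command is hypothetical.

```python
# Schematic descriptor and launcher; field names are illustrative only and
# the "smooth" tool does not exist.
import json

descriptor_json = """
{
  "name": "example-smoothing-tool",
  "command-line": "smooth [INPUT_FILE] --fwhm [FWHM] -o [OUTPUT_FILE]",
  "inputs": [
    {"id": "input_file",  "value-key": "[INPUT_FILE]"},
    {"id": "fwhm",        "value-key": "[FWHM]"},
    {"id": "output_file", "value-key": "[OUTPUT_FILE]"}
  ]
}
"""


def build_command(descriptor: dict, invocation: dict) -> str:
    """Substitute each input's value-key in the command-line template."""
    cmd = descriptor["command-line"]
    for inp in descriptor["inputs"]:
        cmd = cmd.replace(inp["value-key"], str(invocation[inp["id"]]))
    return cmd


if __name__ == "__main__":
    desc = json.loads(descriptor_json)
    print(build_command(desc, {"input_file": "sub-01_T1w.nii.gz",
                               "fwhm": 6,
                               "output_file": "sub-01_smoothed.nii.gz"}))
```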

  1. A Prototype for the Support of Integrated Software Process Development and Improvement

    NASA Astrophysics Data System (ADS)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process contribute to this efficiency. This paper therefore proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  2. Integrated Decision Tools for Sustainable Watershed/Ground Water and Crop Health using Predictive Weather, Remote Sensing, and Irrigation Decision Tools

    NASA Astrophysics Data System (ADS)

    Jones, A. S.; Andales, A.; McGovern, C.; Smith, G. E. B.; David, O.; Fletcher, S. J.

    2017-12-01

    US agricultural and government lands have a unique co-dependent relationship, particularly in the Western US. More than 30% of all irrigated US agricultural output comes from lands sustained by the Ogallala Aquifer in the western Great Plains. Six US Forest Service National Grasslands reside within the aquifer region, consisting of over 375,000 ha (3,750 km2) of USFS-managed lands. Likewise, National Forest lands are the headwaters of many intensive agricultural regions. Our Ogallala Aquifer team is enhancing crop irrigation decision tools with predictive weather and remote sensing data to better manage water for irrigated crops within these regions. An integrated multi-model software framework is used to link irrigation decision tools, resulting in positive management benefits for natural water resources. Teams and teams-of-teams can build upon these multi-disciplinary, multi-faceted modeling capabilities. For example, the CSU Catalyst for Innovative Partnerships program has formed a new multidisciplinary team that will address "Rural Wealth Creation," focusing on the many integrated links between economics, agricultural production and management, natural resource availability, and key social aspects of government policy recommendations. By enhancing tools like these with predictive weather and other related data (such as in situ measurements, hydrologic models, remotely sensed data sets, and, in the near future, links to agro-economic and life cycle assessment models), this work demonstrates an integrated, data-driven vision of inter-meshed dynamic systems that can address challenging multi-system problems. We present the current state of the work and opportunities for future involvement.

  3. THE POTENTIAL IMPACT OF INTEGRATED MALARIA TRANSMISSION CONTROL ON ENTOMOLOGIC INOCULATION RATE IN HIGHLY ENDEMIC AREAS

    PubMed Central

    KILLEEN, GERRY F.; McKENZIE, F. ELLIS; FOY, BRIAN D.; SCHIEFFELIN, CATHERINE; BILLINGSLEY, PETER F.; BEIER, JOHN C.

    2008-01-01

    We have used a relatively simple but accurate model for predicting the impact of integrated transmission control on the malaria entomologic inoculation rate (EIR) at four endemic sites from across sub-Saharan Africa and the southwest Pacific. The simulated campaign incorporated modestly effective vaccine coverage, bed net use, and larval control. The results indicate that such campaigns would reduce EIRs at all four sites by 30- to 50-fold. Even without the vaccine, 15- to 25-fold reductions of EIR were predicted, implying that integrated control with a few modestly effective tools can meaningfully reduce malaria transmission in a range of endemic settings. The model accurately predicts the effects of bed nets and indoor spraying and demonstrates that they are the most effective tools available for reducing EIR. However, the impact of domestic adult vector control is amplified by measures for reducing the rate of emergence of vectors or the level of infectiousness of the human reservoir. We conclude that available tools, including currently neglected methods for larval control, can reduce malaria transmission intensity enough to alleviate mortality. Integrated control programs should be implemented to the fullest extent possible, even in areas of intense transmission, using simple models as decision-making tools. However, we also conclude that to eliminate malaria in many areas of intense transmission is beyond the scope of methods which developing nations can currently afford. New, cost-effective, practical tools are needed if malaria is ever to be eliminated from highly endemic areas. PMID:11289662

  4. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa, including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational load of this analysis, it does not fully automate the use of the replicate STRUCTURE runs required for downstream inference of the optimal K. There is a pressing need for a tool that can deploy population structure analysis on high-performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach, in addition to implementing parallel computation - a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and is available to download from http://strauto.popgen.org .
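
    To make the post-processing step concrete, the sketch below computes the Evanno ΔK statistic from replicate log-likelihood values as it is commonly defined (mean absolute second difference across replicates divided by the standard deviation at K); the numbers are invented, and this is an illustration, not StrAuto's own code.

```python
# Evanno delta-K from replicate log-likelihoods:
# delta K = mean(|L(K+1) - 2 L(K) + L(K-1)|) / sd(L(K)) across replicates.
import numpy as np

# K -> ln P(D) from each replicate STRUCTURE run at that K (toy values)
lnP = {
    1: np.array([-5200.0, -5210.0, -5195.0]),
    2: np.array([-4300.0, -4310.0, -4295.0]),
    3: np.array([-4250.0, -4260.0, -4240.0]),
    4: np.array([-4245.0, -4270.0, -4230.0]),
}


def evanno_delta_k(lnp):
    ks = sorted(lnp)
    result = {}
    for k in ks[1:-1]:                      # delta K is undefined at the end points
        second_diff = np.abs(lnp[k + 1] - 2 * lnp[k] + lnp[k - 1])
        result[k] = second_diff.mean() / lnp[k].std(ddof=1)
    return result


if __name__ == "__main__":
    for k, dk in evanno_delta_k(lnP).items():
        print(f"K = {k}: delta K = {dk:.2f}")
```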

  5. Caries risk assessment tool and prevention protocol for public health nurses in mother and child health centers, Israel.

    PubMed

    Natapov, Lena; Dekel-Markovich, Dan; Granit-Palmon, Hadas; Aflalo, Efrat; Zusman, Shlomo Paul

    2018-01-01

    Dental caries is the most prevalent chronic disease in children. Caries risk assessment tools enable dentists, physicians, and nondental health care providers to assess an individual's risk. Intervention by nurses in primary care settings can contribute to the establishment of oral health habits and the prevention of dental disease. In Israel, Mother and Child Health Centers provide free preventive services, delivered by public health nurses, for pregnant women and children. A caries prevention program in the health centers started in 2015. Nurses underwent special training regarding caries prevention. A customized Caries Risk Assessment tool and Prevention Protocol for nurses, based on the AAPD tool, was introduced. A two-step evaluation was conducted, consisting of a questionnaire and in-depth phone interviews. Twenty-eight (out of 46) health centers returned a completed questionnaire. Most nurses believed that oral health preventive services should be incorporated into their daily work. In the in-depth phone interviews, nurses stated that the integration of the program into their busy daily schedule was realistic and appropriate. The lack of a dedicated dental module in the computer program was mentioned as an implementation difficulty. The wide use of our tool by nurses supports its simplicity and feasibility, enabling quick calculation and informed decision making. The nurses readily embraced the tool, and it became an integral part of their toolkit. We provide public health nurses with a caries risk assessment tool and prevention protocol, thus integrating oral health into the general health care of infants and toddlers. © 2017 Wiley Periodicals, Inc.

  6. Use of social media in graduate-level medical humanities education: two pilot studies from Penn State College of Medicine.

    PubMed

    George, Daniel R; Dellasega, Cheryl

    2011-01-01

    Social media strategies in education have gained attention for undergraduate students, but there has been relatively little application with graduate populations in medicine. To use and evaluate the integration of new social media tools into the curricula of two graduate-level medical humanities electives offered to 4th-year students at Penn State College of Medicine. Instructors selected five social media tools--Twitter, YouTube, Flickr, blogging and Skype--to promote student learning. At the conclusion of each course, students provided quantitative and qualitative course evaluation. Students gave high favourability ratings to both courses, and expressed that the integration of social media into coursework augmented learning and collaboration. Others identified challenges including: demands on time, concerns about privacy and lack of facility with technology. Integrating social media tools into class activities appeared to offer manifold benefits over traditional classroom methods, including real-time communication outside of the classroom, connecting with medical experts, collaborative opportunities and enhanced creativity. Social media can augment learning opportunities within humanities curriculum in medical schools, and help students acquire tools and skill-sets for problem solving, networking, and collaboration. Command of technologies will be increasingly important to the practice of medicine in the twenty-first century.

  7. The Implementation of a Multi-Backend Database System (MDBS). Part I. Software Engineering Strategies and Efforts Towards a Prototype MDBS.

    DTIC Science & Technology

    1983-06-01

    for DEC PDP-11 systems. MAINSAIL was developed and is marketed with a set of integrated tools for program development. The syntax of the language is...stack, and to test for stack-full and stack-empty conditions. This technique is useful in enforcing data integrity and in controlling concurrent...and market MAINSAIL. The language is distinguished by its portability. The same compiler and runtime system, both written in MAINSAIL, are the basis

  8. A New Simulation Framework for Autonomy in Robotic Missions

    NASA Technical Reports Server (NTRS)

    Flueckiger, Lorenzo; Neukom, Christian

    2003-01-01

    Autonomy is a key factor in remote robotic exploration, and there is significant activity addressing the application of autonomy to remote robots. It has become increasingly important to have simulation tools available to test the autonomy algorithms. While industrial robotics benefits from a variety of high-quality simulation tools, researchers developing autonomous software are still dependent primarily on block-world simulations. The Mission Simulation Facility (MSF) project addresses this shortcoming with a simulation toolkit that will enable developers of autonomous control systems to test their system's performance against a set of integrated, standardized simulations of NASA mission scenarios. MSF provides a distributed architecture that connects the autonomous system to a set of simulated components replacing the robot hardware and its environment.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arion is a library and tool set that enables researchers to holistically define test system models. Defining a complex system for testing an algorithm or control requires expertise across multiple domains. Simulating a complex system requires the integration of multiple simulators and test hardware, each with their own specification languages and concepts. This requires an extensive set of knowledge and capabilities. Arion was developed to alleviate this challenge. Arion is a collection of Java libraries that abstracts the concepts from supported simulators into a cohesive model language, allowing someone to build models to their needed level of fidelity and expertise. Arion is also a software tool that translates the user's model back into the specification languages of the simulators and test hardware needed for execution.

  10. Integrative structural annotation of de novo RNA-Seq provides an accurate reference gene set of the enormous genome of the onion (Allium cepa L.)

    PubMed Central

    Kim, Seungill; Kim, Myung-Shin; Kim, Yong-Min; Yeom, Seon-In; Cheong, Kyeongchae; Kim, Ki-Tae; Jeon, Jongbum; Kim, Sunggil; Kim, Do-Sun; Sohn, Seong-Han; Lee, Yong-Hwan; Choi, Doil

    2015-01-01

    The onion (Allium cepa L.) is one of the most widely cultivated and consumed vegetable crops in the world. Although a considerable amount of onion transcriptome data has been deposited into public databases, the sequences of the protein-coding genes are not accurate enough to be used, owing to non-coding sequences intermixed with the coding sequences. We generated a high-quality, annotated onion transcriptome from de novo sequence assembly and intensive structural annotation using the integrated structural gene annotation pipeline (ISGAP), which identified 54,165 protein-coding genes among 165,179 assembled transcripts totalling 203.0 Mb by eliminating the intron sequences. ISGAP performed reliable annotation, recognizing accurate gene structures based on reference proteins, and ab initio gene models of the assembled transcripts. Integrative functional annotation and gene-based SNP analysis revealed a whole biological repertoire of genes and transcriptomic variation in the onion. The method developed in this study provides a powerful tool for the construction of reference gene sets for organisms based solely on de novo transcriptome data. Furthermore, the reference genes and their variation described here for the onion represent essential tools for molecular breeding and gene cloning in Allium spp. PMID:25362073

  11. Modelling pH evolution and lactic acid production in the growth medium of a lactic acid bacterium: application to set a biological TTI.

    PubMed

    Ellouze, M; Pichaud, M; Bonaiti, C; Coroller, L; Couvert, O; Thuault, D; Vaillant, R

    2008-11-30

    Time temperature integrators or indicators (TTIs) are effective tools that make continuous monitoring of the time-temperature history of chilled products possible throughout the cold chain. Their correct setting is of critical importance to ensure food quality. The objective of this study was to develop a model to facilitate accurate settings of the CRYOLOG biological TTI, TRACEO. Experimental designs were used to investigate and model the effects of temperature, TTI inoculum size, pH, and water activity on its response time. The modelling process went through several steps addressing growth, acidification and inhibition phenomena under dynamic conditions. The model showed satisfactory results, and validations in industrial conditions gave clear evidence that such a model is a valuable tool, not only to predict accurate response times of TRACEO, but also to propose precise settings for manufacturing the appropriate TTI to trace a particular food according to a given time temperature scenario.

  12. Application of the GEM Inventory Data Capture Tools for Dynamic Vulnerability Assessment and Recovery Modelling

    NASA Astrophysics Data System (ADS)

    Verrucci, Enrica; Bevington, John; Vicini, Alessandro

    2014-05-01

    A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely-sensed imagery with statistically-sampled in-situ field data on buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project, and these are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event) and 2) recovery monitoring and evaluation (post-event) are discussed, along with strategies for using the IDC Tools for these purposes. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, the IDCT tools clearly have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities, and resilience trends can be inferred. Lastly, this work draws attention to the use of the IDCT suite as an education resource for inspiring and training new students and engineers in the field of disaster risk reduction.

  13. Semi-automatic Data Integration using Karma

    NASA Astrophysics Data System (ADS)

    Garijo, D.; Kejriwal, M.; Pierce, S. A.; Houser, P. I. Q.; Peckham, S. D.; Stanko, Z.; Hardesty Lewis, D.; Gil, Y.; Pennington, D. D.; Knoblock, C.

    2017-12-01

    Data integration applications are ubiquitous in scientific disciplines. A state-of-the-art data integration system accepts both a set of data sources and a target ontology as input, and semi-automatically maps the data sources in terms of concepts and relationships in the target ontology. Mappings can be both complex and highly domain-specific. Once such a semantic model, expressing the mapping using community-wide standard, is acquired, the source data can be stored in a single repository or database using the semantics of the target ontology. However, acquiring the mapping is a labor-prone process, and state-of-the-art artificial intelligence systems are unable to fully automate the process using heuristics and algorithms alone. Instead, a more realistic goal is to develop adaptive tools that minimize user feedback (e.g., by offering good mapping recommendations), while at the same time making it intuitive and easy for the user to both correct errors and to define complex mappings. We present Karma, a data integration system that has been developed over multiple years in the information integration group at the Information Sciences Institute, a research institute at the University of Southern California's Viterbi School of Engineering. Karma is a state-of-the-art data integration tool that supports an interactive graphical user interface, and has been featured in multiple domains over the last five years, including geospatial, biological, humanities and bibliographic applications. Karma allows a user to import their own ontology and datasets using widely used formats such as RDF, XML, CSV and JSON, can be set up either locally or on a server, supports a native backend database for prototyping queries, and can even be seamlessly integrated into external computational pipelines, including those ingesting data via streaming data sources, Web APIs and SQL databases. We illustrate a Karma workflow at a conceptual level, along with a live demo, and show use cases of Karma specifically for the geosciences. In particular, we show how Karma can be used intuitively to obtain the mapping model between case study data sources and a publicly available and expressive target ontology that has been designed to capture a broad set of concepts in geoscience with standardized, easily searchable names.
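
    The sketch below hand-codes the kind of mapping Karma helps a user build semi-automatically: rows of a CSV source are expressed as instances and properties of a target ontology and serialized as RDF. It uses rdflib rather than Karma's API, and the column names and ontology URI are hypothetical.

```python
# Hand-coded mapping of CSV rows onto a target ontology; rdflib stands in for
# the semi-automatic mapping a tool like Karma would recommend.
import csv
import io

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

GEO = Namespace("http://example.org/geo-ontology#")      # stand-in target ontology

source = io.StringIO("station_id,temperature_c\nWELL-7,18.4\nWELL-9,21.0\n")

g = Graph()
for row in csv.DictReader(source):
    station = URIRef(f"http://example.org/data/{row['station_id']}")
    g.add((station, RDF.type, GEO.MonitoringStation))     # column -> ontology class
    g.add((station, GEO.hasTemperature,                   # column -> ontology property
           Literal(float(row["temperature_c"]))))

print(g.serialize(format="turtle"))
```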

  14. Knowledge-based approach for generating target system specifications from a domain model

    NASA Technical Reports Server (NTRS)

    Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan

    1992-01-01

    Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.

  15. Approaches to Fungal Genome Annotation

    PubMed Central

    Haas, Brian J.; Zeng, Qiandong; Pearson, Matthew D.; Cuomo, Christina A.; Wortman, Jennifer R.

    2011-01-01

    Fungal genome annotation is the starting point for analysis of genome content. This generally involves the application of diverse methods to identify features on a genome assembly such as protein-coding and non-coding genes, repeats and transposable elements, and pseudogenes. Here we describe tools and methods leveraged for eukaryotic genome annotation with a focus on the annotation of fungal nuclear and mitochondrial genomes. We highlight the application of the latest technologies and tools to improve the quality of predicted gene sets. The Broad Institute eukaryotic genome annotation pipeline is described as one example of how such methods and tools are integrated into a sequencing center’s production genome annotation environment. PMID:22059117

  16. Internationalization Assessment in Schools: Theoretical Contributions and Practical Implications

    ERIC Educational Resources Information Center

    Yemini, Miri

    2012-01-01

    Cosmopolitan, international capital has become an integral ingredient in the set of competencies considered to provide a competitive edge and to be required for effective citizenship in the 21st century. Recently, internationalization of education has become a more common phenomenon in local schools around the world, serving as a tool to provide…

  17. Internationalization of Secondary Education--Lessons from Israeli Palestinian-Arab Schools in Tel Aviv-Jaffa

    ERIC Educational Resources Information Center

    Yemini, Miri

    2014-01-01

    Cosmopolitan capital became an integral ingredient in the set of competencies considered to provide a competitive edge for effective citizenship in the 21st century. Recently, internationalization of education became a more common phenomenon in secondary schools, serving as a tool to provide youth with cosmopolitan capital and relevant…

  18. Outdoor Learning: Curriculum Imperatives and Community Relevance in a Rural Setting

    ERIC Educational Resources Information Center

    Maposah-Kandemiri, Myra; Higgins, Peter; McLaughlin, Pat

    2009-01-01

    This paper presents a review of practice in the use of the outdoors, and its potential in the teaching of environmental education at Muenzaniso, a Zimbabwean primary school. The school uses permaculture and Integrated Land Use Design as tools for sustainable environmental management. Evidence suggests that pressing community and curriculum…

  19. Integration of Computers into a Course on Biostatistics

    ERIC Educational Resources Information Center

    Gjerde, Craig L.

    1977-01-01

    The biostatistics course for undergraduate medical and dental students at the University of Connecticut Health Center is taught by the Keller Plan, and students can use computers to analyze data sets and to score their unit tests. The computer is an essential tool for data analysis and an attractive option for test scoring. (LBH)

  20. APMS 3.0 Flight Analyst Guide: Aviation Performance Measuring System

    NASA Technical Reports Server (NTRS)

    Jay, Griff; Prothero, Gary; Romanowski, Timothy; Lynch, Robert; Lawrence, Robert; Rosenthal, Loren

    2004-01-01

    The Aviation Performance Measuring System (APMS) is a method, embodied in software, that uses mathematical algorithms and related procedures to analyze digital flight data extracted from aircraft flight data recorders. APMS consists of an integrated set of tools used to perform two primary functions: a) Flight Data Importation; b) Flight Data Analysis.

  1. Interferometric surface mapping with variable sensitivity.

    PubMed

    Jaerisch, W; Makosch, G

    1978-03-01

    In the photolithographic process, presently employed for the production of integrated circuits, sets of correlated masks are used for exposing the photoresist on silicon wafers. Various sets of masks, which are printed in different printing tools, must be aligned correctly with respect to the structures produced on the wafer in previous process steps. Even when perfect alignment is considered, displacements and distortions of the printed wafer patterns occur. They are caused by imperfections of the printing tools and/or wafer deformations resulting from high temperature processes. Since the electrical properties of the final integrated circuits, and therefore the manufacturing yield, depend to a great extent on the precision with which such patterns are superimposed, simple and fast overlay and flatness measurements are very important in IC manufacturing. A simple optical interference method for flatness measurements that can be used under manufacturing conditions is described. This method permits testing of surface height variations at nearly grazing light incidence without a physical reference plane. It can be applied to both polished and rough surfaces.

  2. Space Communications and Navigation (SCaN) Network Simulation Tool Development and Its Use Cases

    NASA Technical Reports Server (NTRS)

    Jennings, Esther; Borgen, Richard; Nguyen, Sam; Segui, John; Stoenescu, Tudor; Wang, Shin-Ywan; Woo, Simon; Barritt, Brian; Chevalier, Christine; Eddy, Wesley

    2009-01-01

    In this work, we focus on the development of a simulation tool to assist in analysis of current and future (proposed) network architectures for NASA. Specifically, the Space Communications and Navigation (SCaN) Network is being architected as an integrated set of new assets and a federation of upgraded legacy systems. The SCaN architecture for the initial missions for returning humans to the moon and beyond will include the Space Network (SN) and the Near-Earth Network (NEN). In addition to SCaN, the initial mission scenario involves a Crew Exploration Vehicle (CEV), the International Space Station (ISS) and NASA Integrated Services Network (NISN). We call the tool being developed the SCaN Network Integration and Engineering (SCaN NI&E) Simulator. The intended uses of such a simulator are: (1) to characterize performance of particular protocols and configurations in mission planning phases; (2) to optimize system configurations by testing a larger parameter space than may be feasible in either production networks or an emulated environment; (3) to test solutions in order to find issues/risks before committing more significant resources needed to produce real hardware or flight software systems. We describe two use cases of the tool: (1) standalone simulation of CEV to ISS baseline scenario to determine network performance, (2) participation in Distributed Simulation Integration Laboratory (DSIL) tests to perform function testing and verify interface and interoperability of geographically dispersed simulations/emulations.

  3. Measuring food and nutrition security: tools and considerations for use among people living with HIV.

    PubMed

    Fielden, Sarah J; Anema, Aranka; Fergusson, Pamela; Muldoon, Katherine; Grede, Nils; de Pee, Saskia

    2014-10-01

    As an increasing number of countries implement integrated food and nutrition security (FNS) and HIV programs, global stakeholders need clarity on how to best measure FNS at the individual and household level. This paper reviews prominent FNS measurement tools, and describes considerations for interpretation in the context of HIV. There exist a range of FNS measurement tools and many have been adapted for use in HIV-endemic settings. Considerations in selecting appropriate tools include sub-types (food sufficiency, dietary diversity and food safety); scope/level of application; and available resources. Tools need to reflect both the needs of PLHIV and affected households and FNS program objectives. Generalized food sufficiency and dietary diversity tools may provide adequate measures of FNS in PLHIV for programmatic applications. Food consumption measurement tools provide further data for clinical or research applications. Measurement of food safety is an important, but underdeveloped aspect of assessment, especially for PLHIV.

  4. Meta-analysis of screening and case finding tools for depression in cancer: evidence based recommendations for clinical practice on behalf of the Depression in Cancer Care consensus group.

    PubMed

    Mitchell, Alex J; Meader, Nick; Davies, Evan; Clover, Kerrie; Carter, Gregory L; Loscalzo, Matthew J; Linden, Wolfgang; Grassi, Luigi; Johansen, Christoffer; Carlson, Linda E; Zabora, James

    2012-10-01

    To examine the validity of screening and case-finding tools used in the identification of depression as defined by an ICD10/DSM-IV criterion standard. We identified 63 studies involving 19 tools (in 33 publications) designed to help clinicians identify depression in cancer settings. We used a standardized rating system. We excluded 11 tools without at least two independent studies, leaving 8 tools for comparison. Across all cancer stages there were 56 diagnostic validity studies (n=10,009). For case-finding, one stem question, two stem questions and the BDI-II all had level 2 evidence (2a, 2b and 2c respectively) and given their better acceptability we gave the stem questions a grade B recommendation. For screening, two stem questions had level 1b evidence (with high acceptability) and the BDI-II had level 2c evidence. For every 100 people screened in advanced cancer, the two questions would accurately detect 18 cases, while missing only 1 and correctly reassure 74 with 7 falsely identified. For every 100 people screened in non-palliative settings the BDI-II would accurately detect 17 cases, missing 2 and correctly re-assure 70, with 11 falsely identified as cases. The main cautions are the reliance on DSM-IV definitions of major depression, the large number of small studies and the paucity of data for many tools in specific settings. Although no single tool could be offered unqualified support, several tools are likely to improve upon unassisted clinical recognition. In clinical practice, all tools should form part of an integrated approach involving further follow-up, clinical assessment and evidence based therapy. Copyright © 2012 Elsevier B.V. All rights reserved.
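
    The per-100-screened figures above translate directly into standard diagnostic accuracy metrics; the rough sketch below recomputes sensitivity, specificity and predictive values from the counts reported in the abstract.

    ```python
    # Rough sketch: converting the per-100-screened figures reported above into
    # standard accuracy metrics (counts taken directly from the abstract).
    def metrics(tp, fn, tn, fp):
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
        }

    two_questions_advanced = metrics(tp=18, fn=1, tn=74, fp=7)
    bdi_ii_non_palliative = metrics(tp=17, fn=2, tn=70, fp=11)

    for name, m in [("two stem questions (advanced cancer)", two_questions_advanced),
                    ("BDI-II (non-palliative settings)", bdi_ii_non_palliative)]:
        print(name, {k: round(v, 3) for k, v in m.items()})
    ```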

  5. A Research Agenda for Malaria Eradication: Modeling

    PubMed Central

    2011-01-01

    Malaria modeling can inform policy and guide research for malaria elimination and eradication from local implementation to global policy. A research and development agenda for malaria modeling is proposed, to support operations and to enhance the broader eradication research agenda. Models are envisioned as an integral part of research, planning, and evaluation, and modelers should ideally be integrated into multidisciplinary teams to update the models iteratively, communicate their appropriate use, and serve the needs of other research scientists, public health specialists, and government officials. A competitive and collaborative framework will result in policy recommendations from multiple, independently derived models and model systems that share harmonized databases. As planned, modeling results will be produced in five priority areas: (1) strategic planning to determine where and when resources should be optimally allocated to achieve eradication; (2) management plans to minimize the evolution of drug and pesticide resistance; (3) impact assessments of new and needed tools to interrupt transmission; (4) technical feasibility assessments to determine appropriate combinations of tools, an associated set of target intervention coverage levels, and the expected timelines for achieving a set of goals in different socio-ecological settings and different health systems; and (5) operational feasibility assessments to weigh the economic costs, capital investments, and human resource capacities required. PMID:21283605

  6. Decoding the genome with an integrative analysis tool: combinatorial CRM Decoder.

    PubMed

    Kang, Keunsoo; Kim, Joomyeong; Chung, Jae Hoon; Lee, Daeyoup

    2011-09-01

    The identification of genome-wide cis-regulatory modules (CRMs) and characterization of their associated epigenetic features are fundamental steps toward the understanding of gene regulatory networks. Although integrative analysis of available genome-wide information can provide new biological insights, the lack of novel methodologies has become a major bottleneck. Here, we present a comprehensive analysis tool called combinatorial CRM decoder (CCD), which utilizes the publicly available information to identify and characterize genome-wide CRMs in a species of interest. CCD first defines a set of the epigenetic features which is significantly associated with a set of known CRMs as a code called 'trace code', and subsequently uses the trace code to pinpoint putative CRMs throughout the genome. Using 61 genome-wide data sets obtained from 17 independent mouse studies, CCD successfully catalogued ∼12 600 CRMs (five distinct classes) including polycomb repressive complex 2 target sites as well as imprinting control regions. Interestingly, we discovered that ∼4% of the identified CRMs belong to at least two different classes named 'multi-functional CRM', suggesting their functional importance for regulating spatiotemporal gene expression. From these examples, we show that CCD can be applied to any potential genome-wide datasets and therefore will shed light on unveiling genome-wide CRMs in various species.
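
    A conceptual sketch of the 'trace code' idea follows (it is not CCD's actual algorithm or data): each epigenetic feature is weighted by its enrichment at known CRMs, and candidate windows are then ranked by how strongly they match that signature; the feature names and windows are hypothetical.

    ```python
    # Conceptual sketch of the "trace code" idea (not CCD's actual algorithm):
    # score each epigenetic feature by its enrichment at known CRMs, then rank
    # genomic windows by how well they match the learned feature signature.
    known_crm_windows = {"w1", "w2", "w3"}
    all_windows = {"w1", "w2", "w3", "w4", "w5", "w6"}
    feature_hits = {                       # windows where each feature is observed
        "H3K4me1": {"w1", "w2", "w4"},
        "H3K27ac": {"w1", "w3", "w5"},
        "DNase":   {"w2", "w3", "w4", "w6"},
    }

    def enrichment(feature_windows):
        in_crm = len(feature_windows & known_crm_windows) / len(known_crm_windows)
        background = len(feature_windows) / len(all_windows)
        return in_crm / background

    trace_code = {f: enrichment(w) for f, w in feature_hits.items()}

    def window_score(window):
        return sum(weight for f, weight in trace_code.items() if window in feature_hits[f])

    candidates = sorted(all_windows - known_crm_windows, key=window_score, reverse=True)
    print(trace_code)
    print("putative CRMs (ranked):", candidates)
    ```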

  7. Web scraping technologies in an API world.

    PubMed

    Glez-Peña, Daniel; Lourenço, Anália; López-Fernández, Hugo; Reboiro-Jato, Miguel; Fdez-Riverola, Florentino

    2014-09-01

    Web services are the de facto standard in biomedical data integration. However, there are data integration scenarios that cannot be fully covered by Web services. A number of Web databases and tools do not support Web services, and existing Web services do not cover all possible user data demands. As a consequence, Web data scraping, one of the oldest techniques for extracting Web contents, is still in a position to offer a valid and valuable service to a wide range of bioinformatics applications, ranging from simple extraction robots to online meta-servers. This article reviews existing scraping frameworks and tools, identifying their strengths and limitations in terms of extraction capabilities. The main focus is on showing how straightforward it is today to set up a data scraping pipeline, with minimal programming effort, and answer a number of practical needs. For exemplification purposes, we introduce a biomedical data extraction scenario where the desired data sources, well-known in clinical microbiology and similar domains, do not offer programmatic interfaces yet. Moreover, we describe the operation of WhichGenes and PathJam, two bioinformatics meta-servers that use scraping as a means to cope with gene set enrichment analysis. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
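
    In the spirit of the minimal-effort pipelines discussed above (and independent of WhichGenes or PathJam), a generic scraping step can be set up with requests and BeautifulSoup; the URL and CSS selector below are placeholders, not a real resource.

    ```python
    # Minimal scraping pipeline sketch (requests + BeautifulSoup).
    # The URL and CSS selector are placeholders, not a real resource.
    import requests
    from bs4 import BeautifulSoup

    def scrape_table(url, selector="table#results tr"):
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        rows = []
        for tr in soup.select(selector):
            cells = [td.get_text(strip=True) for td in tr.find_all(["td", "th"])]
            if cells:
                rows.append(cells)
        return rows

    # Example usage (placeholder URL):
    # for row in scrape_table("https://example.org/strains?species=ecoli"):
    #     print(row)
    ```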

  8. Synthetic biology: applying biological circuits beyond novel therapies.

    PubMed

    Dobrin, Anton; Saxena, Pratik; Fussenegger, Martin

    2016-04-18

    Synthetic biology, an engineering, circuit-driven approach to biology, has developed whole new classes of therapeutics. Unfortunately, these advances have thus far been undercapitalized upon by basic researchers. As discussed herein, using synthetic circuits, one can undertake exhaustive investigations of the endogenous circuitry found in nature, develop novel detectors and better temporally and spatially controlled inducers. One could detect changes in DNA, RNA, protein or even transient signaling events, in cell-based systems, in live mice, and in humans. Synthetic biology has also developed inducible systems that can be induced chemically, optically or using radio waves. This induction has been re-wired to lead to changes in gene expression, RNA stability and splicing, protein stability and splicing, and signaling via endogenous pathways. Beyond simple detectors and inducible systems, one can combine these modalities and develop novel signal integration circuits that can react to a very precise pre-programmed set of conditions or even to multiple sets of precise conditions. In this review, we highlight some tools that were developed in which these circuits were combined such that the detection of a particular event automatically triggered a specific output. Furthermore, using novel circuit-design strategies, circuits have been developed that can integrate multiple inputs together in Boolean logic gates composed of up to 6 inputs. We highlight the tools available and what has been developed thus far, and highlight how some clinical tools can be very useful in basic science. Most of the systems that are presented can be integrated together; and the possibilities far exceed the number of currently developed strategies.
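
    As a purely computational toy, the multi-input signal-integration idea can be pictured as a programmable Boolean gate over detector outputs; the gate definition and input names below are hypothetical and do not correspond to any biological implementation.

    ```python
    # Toy abstraction of the signal-integration idea: the output fires only when a
    # programmed Boolean combination of detector inputs is met. Names are hypothetical.
    def and_or_gate(inputs, required_all, required_any):
        """Fire only if every 'required_all' input is on and at least one 'required_any' is."""
        return all(inputs[name] for name in required_all) and any(inputs[name] for name in required_any)

    detectors = {"hypoxia": True, "miR21_high": True, "glucose_low": False, "heat": True}
    fire = and_or_gate(detectors,
                       required_all=("hypoxia", "miR21_high"),
                       required_any=("glucose_low", "heat"))
    print("trigger output:", fire)
    ```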

  9. Integrated Tools for Future Distributed Engine Control Technologies

    NASA Technical Reports Server (NTRS)

    Culley, Dennis; Thomas, Randy; Saus, Joseph

    2013-01-01

    Turbine engines are highly complex mechanical systems that are becoming increasingly dependent on control technologies to achieve system performance and safety metrics. However, the contribution of controls to these measurable system objectives is difficult to quantify due to a lack of tools capable of informing the decision makers. This shortcoming hinders technology insertion in the engine design process. NASA Glenn Research Center is developing a Hardware-in-the-Loop (HIL) platform and analysis tool set that will serve as a focal point for new control technologies, especially those related to the hardware development and integration of distributed engine control. The HIL platform is intended to enable rapid and detailed evaluation of new engine control applications, from conceptual design through hardware development, in order to quantify their impact on engine systems. This paper discusses the complex interactions of the control system, within the context of the larger engine system, and how new control technologies are changing that paradigm. The conceptual design of the new HIL platform is then described as a primary tool to address those interactions and how it will help feed the insertion of new technologies into future engine systems.

  10. Toward Treatment Integrity: Developing an Approach to Measure the Treatment Integrity of a Dialectical Behavior Therapy Intervention With Homeless Youth in the Community.

    PubMed

    McCay, Elizabeth; Carter, Celina; Aiello, Andria; Quesnel, Susan; Howes, Carol; Johansson, Bjorn

    2016-10-01

    The current paper discusses an approach to measuring treatment integrity of dialectical behavioral therapy (DBT) when implemented within two programs providing services to street-involved youth in the community. Measuring treatment integrity is a critical component of effective implementation of evidence-based interventions in clinical practice, since sound treatment integrity increases confidence in client outcomes and intervention replicability. Despite being an essential part of implementation science, few studies report on treatment integrity, with limited research addressing either measurement tools or maintenance of treatment integrity. To address the lack of available treatment integrity measures, researchers in the current study developed and piloted a treatment integrity measure that pertains to the individual and group components of DBT. A total of 20 recordings were assessed using the treatment integrity measure. Results indicate that the community agency staff (e.g. youth workers, social workers & nurses) implemented the intervention as intended, increasing confidence in the outcome variables, the staff's training and the replicability of the intervention. This article offers one approach to addressing treatment integrity when implementing evidence-based interventions, such as DBT in a community setting, and discusses the need for effective and feasible integrity measures that can be adopted in order to strengthen mental health practice in community settings. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. A computer vision system for the recognition of trees in aerial photographs

    NASA Technical Reports Server (NTRS)

    Pinz, Axel J.

    1991-01-01

    Increasing problems of forest damage in Central Europe set the demand for an appropriate forest damage assessment tool. The Vision Expert System (VES) is presented which is capable of finding trees in color infrared aerial photographs. Concept and architecture of VES are discussed briefly. The system is applied to a multisource test data set. The processing of this multisource data set leads to a multiple interpretation result for one scene. An integration of these results will provide a better scene description by the vision system. This is achieved by an implementation of Steven's correlation algorithm.

  12. Hypogeal geological survey in the "Grotta del Re Tiberio" natural cave (Apennines, Italy): a valid tool for reconstructing the structural setting

    NASA Astrophysics Data System (ADS)

    Ghiselli, Alice; Merazzi, Marzio; Strini, Andrea; Margutti, Roberto; Mercuriali, Michele

    2011-06-01

    As karst systems are natural windows to the underground, speleology, combined with geological surveys, can be useful tools for helping understand the geological evolution of karst areas. In order to enhance the reconstruction of the structural setting in a gypsum karst area (Vena del Gesso, Romagna Apennines), a detailed analysis has been carried out on hypogeal data. Structural features (faults, fractures, tectonic foliations, bedding) have been mapped in the "Grotta del Re Tiberio" cave, in the nearby gypsum quarry tunnels and open pit benches. Five fracture systems and six fault systems have been identified. The fault systems have been further analyzed through stereographic projections and geometric-kinematic evaluations in order to reconstruct the relative chronology of these structures. This analysis led to the detection of two deformation phases. The results permitted linking of the hypogeal data with the surface data both at a local and regional scale. At the local scale, fracture data collected in the underground have been compared with previous authors' surface data coming from the quarry area. The two data sets show a very good correspondence, as every underground fracture system matches with one of the surface fracture system. Moreover, in the cave, a larger number of fractures belonging to each system could be mapped. At the regional scale, the two deformation phases detected can be integrated in the structural setting of the study area, thereby enhancing the tectonic interpretation of the area ( e.g., structures belonging to a new deformation phase, not reported before, have been identified underground). The structural detailed hypogeal survey has, thus, provided very useful data, both by integrating the existing information and revealing new data not detected at the surface. In particular, some small structures ( e.g., displacement markers and short fractures) are better preserved in the hypogeal environment than on the surface where the outcropping gypsum is more exposed to dissolution and recrystallization. The hypogeal geological survey, therefore, can be considered a powerful tool for integrating the surface and log data in order to enhance the reconstruction of the deformational history and to get a three-dimensional model of the bedrock in karst areas.

  13. Managing and Communicating Operational Workflow

    PubMed Central

    Weinberg, Stuart T.; Danciu, Ioana; Unertl, Kim M.

    2016-01-01

    Background: Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. Objective: To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). Methods: The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. Results: The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. Conclusions: The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings. PMID:27081407

  14. Promoting scientific collaboration and research through integrated social networking capabilities within the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.

    2009-04-01

    LiDAR (Light Detection And Ranging) topography data offer earth scientists the opportunity to study the earth's surface at very high resolutions. As a result, the popularity of these data is growing dramatically. However, the management, distribution, and analysis of community LiDAR data sets are a challenge due to their massive size (multi-billion point, multi-terabyte). We have also found that many earth science users of these data sets lack the computing resources and expertise required to process these data. We have developed the OpenTopography Portal to democratize access to these large and computationally challenging data sets. The OpenTopography Portal uses cyberinfrastructure technology developed by the GEON project to provide access to LiDAR data in a variety of formats. LiDAR data products available range from simple Google Earth visualizations of LiDAR-derived hillshades to 1 km2 tiles of standard digital elevation model (DEM) products as well as LiDAR point cloud data and user-generated custom DEMs. We have found that the wide spectrum of LiDAR users has variable scientific applications, computing resources and technical experience and thus requires a data system with multiple distribution mechanisms and platforms to serve a broader range of user communities. Because the volume of LiDAR topography data available is rapidly expanding, and data analysis techniques are evolving, there is a need for the user community to be able to communicate and interact to share knowledge and experiences. To address this need, the OpenTopography Portal enables social networking capabilities through a variety of collaboration tools, web 2.0 technologies and customized usage pattern tracking. Fundamentally, these tools offer users the ability to communicate, to access and share documents, participate in discussions, and to keep up to date on upcoming events and emerging technologies. The OpenTopography portal achieves the social networking capabilities by integrating various software technologies and platforms. These include the Expression Engine Content Management System (CMS) that comes with pre-packaged collaboration tools like blogs and wikis, the Gridsphere portal framework that contains the primary GEON LiDAR System portlet with user job monitoring capabilities, and a Java web-based discussion forum (Jforums) application, all seamlessly integrated under one portal. The OpenTopography Portal also provides an integrated authentication mechanism between the various CMS collaboration tools and the core Gridsphere-based portlets. The integration of these various technologies allows for enhanced user interaction capabilities within the portal. By integrating popular collaboration tools like discussion forums and blogs we can promote conversation and openness among users. The ability to ask questions and share expertise in forum discussions allows users to easily find information and interact with users facing similar challenges. The OpenTopography Blog enables our domain experts to post ideas, news items, commentary, and other resources in order to foster discussion and information sharing. The content management capabilities of the portal allow for easy updates to information in the form of publications, documents, and news articles. Access to the most current information fosters better decision-making. 
As has become the standard for web 2.0 technologies, the OpenTopography Portal is fully RSS enabled to allow users of the portal to keep track of news items, forum discussions, blog updates, and system outages. We are currently exploring how the information captured by user and job monitoring components of the Gridsphere based GEON LiDAR System can be harnessed to provide a recommender system that will help users to identify appropriate processing parameters and to locate related documents and data. By seamlessly integrating the various platforms and technologies under one single portal, we can take advantage of popular online collaboration tools that are either stand alone or software platform restricted. The availability of these collaboration tools along with the data will foster more community interaction and increase the strength and vibrancy of the LiDAR topography user community.

  15. Complex ambulatory settings demand scheduling systems.

    PubMed

    Ross, K M

    1998-01-01

    Practice management systems are becoming more and more complex, as they are asked to integrate all aspects of patient and resource management. Although patient scheduling is a standard expectation in any ambulatory environment, facilities and equipment resource scheduling are additional functionalities of scheduling systems. Because these functions were not typically managed in manual patient scheduling, often the result was resource mismanagement, along with a potential negative impact on utilization, patient flow and provider productivity. As ambulatory organizations have become more seasoned users of practice management software, the value of resource scheduling has become apparent. Appointment scheduling within a fully integrated practice management system is recognized as an enhancement of scheduling itself and provides additional tools to manage other information needs. Scheduling, as one component of patient information management, provides additional tools in these areas.

  16. Comparative map and trait viewer (CMTV): an integrated bioinformatic tool to construct consensus maps and compare QTL and functional genomics data across genomes and experiments.

    PubMed

    Sawkins, M C; Farmer, A D; Hoisington, D; Sullivan, J; Tolopko, A; Jiang, Z; Ribaut, J-M

    2004-10-01

    In the past few decades, a wealth of genomic data has been produced in a wide variety of species using a diverse array of functional and molecular marker approaches. In order to unlock the full potential of the information contained in these independent experiments, researchers need efficient and intuitive means to identify common genomic regions and genes involved in the expression of target phenotypic traits across diverse conditions. To address this need, we have developed a Comparative Map and Trait Viewer (CMTV) tool that can be used to construct dynamic aggregations of a variety of types of genomic datasets. By algorithmically determining correspondences between sets of objects on multiple genomic maps, the CMTV can display syntenic regions across taxa, combine maps from separate experiments into a consensus map, or project data from different maps into a common coordinate framework using dynamic coordinate translations between source and target maps. We present a case study that illustrates the utility of the tool for managing large and varied datasets by integrating data collected by CIMMYT in maize drought tolerance research with data from public sources. This example will focus on one of the visualization features for Quantitative Trait Locus (QTL) data, using likelihood ratio (LR) files produced by generic QTL analysis software and displaying the data in a unique visual manner across different combinations of traits, environments and crosses. Once a genomic region of interest has been identified, the CMTV can search and display additional QTLs meeting a particular threshold for that region, or other functional data such as sets of differentially expressed genes located in the region; it thus provides an easily used means for organizing and manipulating data sets that have been dynamically integrated under the focus of the researcher's specific hypothesis.
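
    The projection of data from a source map into a target coordinate framework can be pictured as interpolation between shared anchor markers; the sketch below illustrates that idea only and is not CMTV's implementation (marker names and positions are hypothetical).

    ```python
    # Conceptual sketch of projecting a position from a source map onto a target map
    # by interpolating between shared anchor markers (not CMTV's implementation).
    # Marker names and positions (in cM) are hypothetical.
    source_map = {"m1": 10.0, "m2": 35.0, "m3": 60.0}
    target_map = {"m1": 12.0, "m2": 40.0, "m3": 75.0}

    def project(pos, src, tgt):
        shared = sorted(set(src) & set(tgt), key=lambda m: src[m])
        # find the flanking anchors on the source map and interpolate linearly
        for lo, hi in zip(shared, shared[1:]):
            if src[lo] <= pos <= src[hi]:
                frac = (pos - src[lo]) / (src[hi] - src[lo])
                return tgt[lo] + frac * (tgt[hi] - tgt[lo])
        raise ValueError("position outside the anchored interval")

    print(project(47.5, source_map, target_map))  # e.g. a QTL peak at 47.5 cM on the source map
    ```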

  17. Hymenoptera Genome Database: integrating genome annotations in HymenopteraMine.

    PubMed

    Elsik, Christine G; Tayal, Aditi; Diesh, Colin M; Unni, Deepak R; Emery, Marianne L; Nguyen, Hung N; Hagen, Darren E

    2016-01-04

    We report an update of the Hymenoptera Genome Database (HGD) (http://HymenopteraGenome.org), a model organism database for insect species of the order Hymenoptera (ants, bees and wasps). HGD maintains genomic data for 9 bee species, 10 ant species and 1 wasp, including the versions of genome and annotation data sets published by the genome sequencing consortiums and those provided by NCBI. A new data-mining warehouse, HymenopteraMine, based on the InterMine data warehousing system, integrates the genome data with data from external sources and facilitates cross-species analyses based on orthology. New genome browsers and annotation tools based on JBrowse/WebApollo provide easy genome navigation, and viewing of high throughput sequence data sets and can be used for collaborative genome annotation. All of the genomes and annotation data sets are combined into a single BLAST server that allows users to select and combine sequence data sets to search. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  18. Integrating evo-devo with ecology for a better understanding of phenotypic evolution

    PubMed Central

    Emília Santos, M.; Berger, Chloé S.; Refki, Peter N.

    2015-01-01

    Evolutionary developmental biology (evo-devo) has provided invaluable contributions to our understanding of the mechanistic relationship between genotypic and phenotypic change. Similarly, evolutionary ecology has greatly advanced our understanding of the relationship between the phenotype and the environment. To fully understand the evolution of organismal diversity, a thorough integration of these two fields is required. This integration remains highly challenging because model systems offering a rich ecological and evolutionary background, together with the availability of developmental genetic tools and genomic resources, are scarce. In this review, we introduce the semi-aquatic bugs (Gerromorpha, Heteroptera) as original models well suited to study why and how organisms diversify. The Gerromorpha invaded water surfaces over 200 mya and diversified into a range of remarkable new forms within this new ecological habitat. We summarize the biology and evolutionary history of this group of insects and highlight a set of characters associated with the habitat change and the diversification that followed. We further discuss the morphological, behavioral, molecular and genomic tools available that together make semi-aquatic bugs a prime model for integration across disciplines. We present case studies showing how the implementation and combination of these approaches can advance our understanding of how the interaction between genotypes, phenotypes and the environment drives the evolution of distinct morphologies. Finally, we explain how the same set of experimental designs can be applied in other systems to address similar biological questions. PMID:25750411

  19. Integrating evo-devo with ecology for a better understanding of phenotypic evolution.

    PubMed

    Santos, M Emília; Berger, Chloé S; Refki, Peter N; Khila, Abderrahman

    2015-11-01

    Evolutionary developmental biology (evo-devo) has provided invaluable contributions to our understanding of the mechanistic relationship between genotypic and phenotypic change. Similarly, evolutionary ecology has greatly advanced our understanding of the relationship between the phenotype and the environment. To fully understand the evolution of organismal diversity, a thorough integration of these two fields is required. This integration remains highly challenging because model systems offering a rich ecological and evolutionary background, together with the availability of developmental genetic tools and genomic resources, are scarce. In this review, we introduce the semi-aquatic bugs (Gerromorpha, Heteroptera) as original models well suited to study why and how organisms diversify. The Gerromorpha invaded water surfaces over 200 mya and diversified into a range of remarkable new forms within this new ecological habitat. We summarize the biology and evolutionary history of this group of insects and highlight a set of characters associated with the habitat change and the diversification that followed. We further discuss the morphological, behavioral, molecular and genomic tools available that together make semi-aquatic bugs a prime model for integration across disciplines. We present case studies showing how the implementation and combination of these approaches can advance our understanding of how the interaction between genotypes, phenotypes and the environment drives the evolution of distinct morphologies. Finally, we explain how the same set of experimental designs can be applied in other systems to address similar biological questions. © The Author 2015. Published by Oxford University Press.

  20. NeAT: a toolbox for the analysis of biological networks, clusters, classes and pathways.

    PubMed

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Sand, Olivier; Janky, Rekin's; Vanderstocken, Gilles; Deville, Yves; van Helden, Jacques

    2008-07-01

    The network analysis tools (NeAT) (http://rsat.ulb.ac.be/neat/) provide user-friendly web access to a collection of modular tools for the analysis of networks (graphs) and clusters (e.g. microarray clusters, functional classes, etc.). A first set of tools supports basic operations on graphs (comparison between two graphs, neighborhood of a set of input nodes, path finding and graph randomization). Another set of programs makes the connection between networks and clusters (graph-based clustering, clique discovery and mapping of clusters onto a network). The toolbox also includes programs for detecting significant intersections between clusters/classes (e.g. clusters of co-expression versus functional classes of genes). NeAT are designed to cope with large datasets and provide a flexible toolbox for analyzing biological networks stored in various databases (protein interactions, regulation and metabolism) or obtained from high-throughput experiments (two-hybrid, mass-spectrometry and microarrays). The web interface interconnects the programs in predefined analysis flows, enabling users to address a series of questions about networks of interest. Each tool can also be used separately by entering custom data for a specific analysis. NeAT can also be used as web services (SOAP/WSDL interface), in order to design programmatic workflows and integrate them with other available resources.
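
    Two of the basic graph operations listed above (neighborhood of a set of input nodes and path finding) can be reproduced locally with networkx, as in the sketch below; the toy interaction network is hypothetical and the code does not call the NeAT services.

    ```python
    # Local illustration of two basic graph operations (neighborhood of input
    # nodes, path finding) using networkx. The toy network is hypothetical.
    import networkx as nx

    g = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("B", "E"), ("E", "F")])

    seeds = {"A", "E"}
    neighborhood = set(seeds)
    for node in seeds:
        neighborhood.update(g.neighbors(node))
    print("1-step neighborhood of", seeds, "->", sorted(neighborhood))

    print("shortest path A -> D:", nx.shortest_path(g, "A", "D"))
    ```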

  1. A life scientist's gateway to distributed data management and computing: the PathPort/ToolBus framework.

    PubMed

    Eckart, J Dana; Sobral, Bruno W S

    2003-01-01

    The emergent needs of the bioinformatics community challenge current information systems. The pace of biological data generation far outstrips Moore's Law. Therefore, a gap continues to widen between the capability to produce biological (molecular and cell) data sets and the capability to manage and analyze these data sets. As a result, Federal investments in large data set generation produce diminishing returns in terms of the community's capability to understand biology and leverage that understanding to make scientific and technological advances that improve society. We are building an open framework to address various data management issues including data and tool interoperability, nomenclature and data communication standardization, and database integration. PathPort, short for Pathogen Portal, employs a generic, web-services based framework to deal with some of the problems identified by the bioinformatics community. The motivating research goal of a scalable system to provide data management and analysis for key pathosystems, especially relating to molecular data, has resulted in a generic framework using two major components. On the server side, we employ web services. On the client side, a Java application called ToolBus acts as a client-side "bus" for contacting data and tools and viewing results through a single, consistent user interface.

  2. iProphet: Multi-level Integrative Analysis of Shotgun Proteomic Data Improves Peptide and Protein Identification Rates and Error Estimates*

    PubMed Central

    Shteynberg, David; Deutsch, Eric W.; Lam, Henry; Eng, Jimmy K.; Sun, Zhi; Tasman, Natalie; Mendoza, Luis; Moritz, Robert L.; Aebersold, Ruedi; Nesvizhskii, Alexey I.

    2011-01-01

    The combination of tandem mass spectrometry and sequence database searching is the method of choice for the identification of peptides and the mapping of proteomes. Over the last several years, the volume of data generated in proteomic studies has increased dramatically, which challenges the computational approaches previously developed for these data. Furthermore, a multitude of search engines have been developed that identify different, overlapping subsets of the sample peptides from a particular set of tandem mass spectrometry spectra. We present iProphet, the new addition to the widely used open-source suite of proteomic data analysis tools Trans-Proteomics Pipeline. Applied in tandem with PeptideProphet, it provides more accurate representation of the multilevel nature of shotgun proteomic data. iProphet combines the evidence from multiple identifications of the same peptide sequences across different spectra, experiments, precursor ion charge states, and modified states. It also allows accurate and effective integration of the results from multiple database search engines applied to the same data. The use of iProphet in the Trans-Proteomics Pipeline increases the number of correctly identified peptides at a constant false discovery rate as compared with both PeptideProphet and another state-of-the-art tool Percolator. As the main outcome, iProphet permits the calculation of accurate posterior probabilities and false discovery rate estimates at the level of sequence identical peptide identifications, which in turn leads to more accurate probability estimates at the protein level. Fully integrated with the Trans-Proteomics Pipeline, it supports all commonly used MS instruments, search engines, and computer platforms. The performance of iProphet is demonstrated on two publicly available data sets: data from a human whole cell lysate proteome profiling experiment representative of typical proteomic data sets, and from a set of Streptococcus pyogenes experiments more representative of organism-specific composite data sets. PMID:21876204
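
    A highly simplified sketch of the underlying idea follows (it is not iProphet's actual model): posterior probabilities for the same peptide from different search engines are combined assuming independent evidence, and a false discovery rate for an accepted list is estimated from the combined posteriors.

    ```python
    # Simplified sketch (not iProphet's actual model): combine per-engine posterior
    # probabilities for the same peptide assuming independent evidence, then
    # estimate the false discovery rate of an accepted list from the posteriors.
    def combine(posteriors):
        prob_all_wrong = 1.0
        for p in posteriors:
            prob_all_wrong *= (1.0 - p)
        return 1.0 - prob_all_wrong

    peptides = {                      # hypothetical per-search-engine posteriors
        "PEPTIDEK": [0.80, 0.70],
        "SAMPLER":  [0.55, 0.60, 0.50],
        "DECOYISH": [0.10],
    }
    combined = {pep: combine(ps) for pep, ps in peptides.items()}

    threshold = 0.9
    accepted = {p: c for p, c in combined.items() if c >= threshold}
    fdr = sum(1.0 - c for c in accepted.values()) / max(len(accepted), 1)
    print(combined)
    print(f"accepted at p>={threshold}: {sorted(accepted)}  estimated FDR: {fdr:.3f}")
    ```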

  3. Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.

    2007-01-01

    Space science models are an essential component of an integrated data environment. They are indispensable tools for facilitating effective use of a wide variety of distributed scientific sources and for placing multi-point local measurements into a global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them to the required format. CCMC provides a tailored web-based visualization interface for the model output, as well as the capability to download simulation output in a portable standard format with comprehensive metadata, and a user-friendly model output analysis library of routines that can be called from any C-supporting language. CCMC is developing data interpolation tools that present model output in the same format as observations. CCMC invites community comments and suggestions to better address science needs for the integrated data environment.

  4. Blending technology in teaching advanced health assessment in a family nurse practitioner program: using personal digital assistants in a simulation laboratory.

    PubMed

    Elliott, Lydia; DeCristofaro, Claire; Carpenter, Alesia

    2012-09-01

    This article describes the development and implementation of integrated use of personal handheld devices (personal digital assistants, PDAs) and high-fidelity simulation in an advanced health assessment course in a graduate family nurse practitioner (NP) program. A teaching tool was developed that can be utilized as a template for clinical case scenarios blending these separate technologies. Review of the evidence-based literature, including peer-reviewed articles and reviews. Blending the technologies of high-fidelity simulation and handheld devices (PDAs) provided a positive learning experience for graduate NP students in a teaching laboratory setting. Combining both technologies in clinical case scenarios offered a more real-world learning experience, with a focus on point-of-care service and integration of interview and physical assessment skills with existing standards of care and external clinical resources. Faculty modeling and advance training with PDA technology was crucial to success. Faculty developed a general template tool and systems-based clinical scenarios integrating PDA and high-fidelity simulation. Faculty observations, the general template tool, and one scenario example are included in this article. ©2012 The Author(s) Journal compilation ©2012 American Academy of Nurse Practitioners.

  5. Questioning context: a set of interdisciplinary questions for investigating contextual factors affecting health decision making

    PubMed Central

    Charise, Andrea; Witteman, Holly; Whyte, Sarah; Sutton, Erica J.; Bender, Jacqueline L.; Massimi, Michael; Stephens, Lindsay; Evans, Joshua; Logie, Carmen; Mirza, Raza M.; Elf, Marie

    2011-01-01

    Objective: To combine insights from multiple disciplines into a set of questions that can be used to investigate contextual factors affecting health decision making. Background: Decision-making processes and outcomes may be shaped by a range of non-medical or 'contextual' factors particular to an individual, including social, economic, political, geographical and institutional conditions. Research concerning contextual factors occurs across many disciplines and theoretical domains, but few conceptual tools have attempted to integrate and translate this wide-ranging research for health decision-making purposes. Methods: To formulate this tool we employed an iterative, collaborative process of scenario development and question generation. Five hypothetical health decision-making scenarios (preventative, screening, curative, supportive and palliative) were developed and used to generate a set of exploratory questions that aim to highlight potential contextual factors across a range of health decisions. Findings: We present an exploratory tool consisting of questions organized into four thematic domains, Bodies, Technologies, Place and Work (BTPW), articulating wide-ranging contextual factors relevant to health decision making. The BTPW tool encompasses health-related scholarship and research from a range of disciplines pertinent to health decision making, and identifies concrete points of intersection between its four thematic domains. Examples of the practical application of the questions are also provided. Conclusions: These exploratory questions provide an interdisciplinary toolkit for identifying the complex contextual factors affecting decision making. The set of questions comprised by the BTPW tool may be applied wholly or partially in the context of clinical practice, policy development and health-related research. PMID:21029277

  6. Recent Advances in Nanobiotechnology and High-Throughput Molecular Techniques for Systems Biomedicine

    PubMed Central

    Kim, Eung-Sam; Ahn, Eun Hyun; Chung, Euiheon; Kim, Deok-Ho

    2013-01-01

    Nanotechnology-based tools are beginning to emerge as promising platforms for quantitative high-throughput analysis of live cells and tissues. Despite unprecedented progress made over the last decade, a challenge still lies in integrating emerging nanotechnology-based tools into macroscopic biomedical apparatuses for practical purposes in biomedical sciences. In this review, we discuss the recent advances and limitations in the analysis and control of mechanical, biochemical, fluidic, and optical interactions in the interface areas of nanotechnology-based materials and living cells in both in vitro and in vivo settings. PMID:24258011

  7. Recent advances in nanobiotechnology and high-throughput molecular techniques for systems biomedicine.

    PubMed

    Kim, Eung-Sam; Ahn, Eun Hyun; Chung, Euiheon; Kim, Deok-Ho

    2013-12-01

    Nanotechnology-based tools are beginning to emerge as promising platforms for quantitative high-throughput analysis of live cells and tissues. Despite unprecedented progress made over the last decade, a challenge still lies in integrating emerging nanotechnology-based tools into macroscopic biomedical apparatuses for practical purposes in biomedical sciences. In this review, we discuss the recent advances and limitations in the analysis and control of mechanical, biochemical, fluidic, and optical interactions in the interface areas of nanotechnology-based materials and living cells in both in vitro and in vivo settings.

  8. FREEWAT: an HORIZON 2020 project to build open source tools for water management.

    NASA Astrophysics Data System (ADS)

    Foglia, L.; Rossetto, R.; Borsi, I.; Mehl, S.; Velasco Mansilla, V.

    2015-12-01

    FREEWAT is a HORIZON 2020 EU project. FREEWAT's main result will be an open-source and public-domain, GIS-integrated modelling environment for the simulation of water quantity and quality in surface water and groundwater, with an integrated water management and planning module. FREEWAT aims at promoting water resource management by simplifying the application of the Water Framework Directive and related Directives. Specific objectives of the project are: to coordinate previous EU- and nationally funded research by integrating existing software modules for water management into the single GIS-based FREEWAT environment, and to support the FREEWAT application in an innovative participatory approach gathering technical staff and relevant stakeholders (policy and decision makers) in designing scenarios for the application of water policies. The open-source character of the platform makes this an initiative "ad includendum", as further institutions or developers may contribute to the development. The core of the platform is the SID&GRID framework (a GIS-integrated, physically based, distributed numerical hydrological model based on a modified version of MODFLOW 2005; Rossetto et al. 2013) in its version ported to QGIS desktop. Activities are carried out along two lines: (i) integration of modules to fulfill end-user requirements, including tools for producing feasibility and management plans; (ii) a set of activities to fix bugs and to provide a well-integrated interface for the different tools implemented. Further capabilities to be integrated are: a module for water management and planning; calibration, uncertainty and sensitivity analysis; a module for solute transport in the unsaturated zone; a module for crop growth and water requirements in agriculture; and tools for groundwater quality issues and for the analysis, interpretation and visualization of hydrogeological data. By creating a common environment among water researchers/professionals, policy makers and implementers, FREEWAT's main impact will be to enhance a science-based and participatory approach and evidence-based decision making in water resource management, hence producing relevant and appropriate outcomes for policy implementation. Broad stakeholder involvement is expected to guarantee dissemination and exploitation of the results.

  9. ExEP yield modeling tool and validation test results

    NASA Astrophysics Data System (ADS)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Programs Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
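
    The physics-based unit testing described above can be illustrated generically as below; the functions, values and tolerances are for this sketch only and do not reproduce the EXOSIMS API or its test suite (the phase function used is the standard Lambert expression).

    ```python
    # Generic illustration of physics-based unit testing (not the EXOSIMS test
    # suite). The flux-ratio formula uses the standard Lambert phase function;
    # the planet parameters are placeholders for this sketch only.
    import math
    import unittest

    def lambert_phase(beta):
        """Lambert phase function for phase angle beta (radians)."""
        return (math.sin(beta) + (math.pi - beta) * math.cos(beta)) / math.pi

    def flux_ratio(albedo, radius_m, sma_m, beta):
        return albedo * lambert_phase(beta) * (radius_m / sma_m) ** 2

    class TestPhotometry(unittest.TestCase):
        def test_quadrature_phase(self):
            # at quadrature (beta = 90 deg) the Lambert phase function equals 1/pi
            self.assertAlmostEqual(lambert_phase(math.pi / 2), 1.0 / math.pi, places=12)

        def test_flux_ratio_scaling(self):
            # doubling the semi-major axis should reduce the flux ratio by a factor of 4
            f1 = flux_ratio(0.3, 7.1e7, 7.8e11, math.pi / 2)
            f2 = flux_ratio(0.3, 7.1e7, 1.56e12, math.pi / 2)
            self.assertAlmostEqual(f1 / f2, 4.0, places=9)

    if __name__ == "__main__":
        unittest.main()
    ```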

  10. [Technical improvement of cohort constitution in administrative health databases: Providing a tool for integration and standardization of data applicable in the French National Health Insurance Database (SNIIRAM)].

    PubMed

    Ferdynus, C; Huiart, L

    2016-09-01

    Administrative health databases such as the French National Health Insurance Database - SNIIRAM - are a major tool to answer numerous public health research questions. However the use of such data requires complex and time-consuming data management. Our objective was to develop and make available a tool to optimize cohort constitution within administrative health databases. We developed a process to extract, transform and load (ETL) data from various heterogeneous sources into a standardized data warehouse. This data warehouse is architected as a star schema corresponding to an i2b2 star schema model. We then evaluated the performance of this ETL using data from a pharmacoepidemiology research project conducted in the SNIIRAM database. The ETL we developed comprises a set of functionalities for creating SAS scripts. Data can be integrated into a standardized data warehouse. As part of the performance assessment of this ETL, we achieved integration of a dataset from the SNIIRAM comprising more than 900 million lines in less than three hours using a desktop computer. This enables patient selection from the standardized data warehouse within seconds of the request. The ETL described in this paper provides a tool which is effective and compatible with all administrative health databases, without requiring complex database servers. This tool should simplify cohort constitution in health databases; the standardization of warehouse data facilitates collaborative work between research teams. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
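
    The ETL step into an i2b2-style star schema can be sketched generically as below; the actual tool generates SAS scripts for the SNIIRAM, whereas this illustration uses sqlite3 with simplified, hypothetical table and column names.

    ```python
    # Generic sketch of an extract-transform-load step into an i2b2-style star
    # schema (the actual tool generates SAS scripts; names here are hypothetical).
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE patient_dimension (patient_num INTEGER PRIMARY KEY, birth_year INTEGER, sex TEXT);
    CREATE TABLE observation_fact  (patient_num INTEGER, concept_cd TEXT, start_date TEXT, value REAL);
    """)

    # "Extract": rows as they might arrive from a heterogeneous claims source
    source_rows = [
        {"id": 1, "birth_year": 1954, "sex": "F", "drug_code": "ATC:C07AB02", "date": "2015-03-02", "qty": 30},
        {"id": 1, "birth_year": 1954, "sex": "F", "drug_code": "ATC:C07AB02", "date": "2015-04-01", "qty": 30},
    ]

    # "Transform + load": dimensions are deduplicated, events go to the fact table
    for r in source_rows:
        con.execute("INSERT OR IGNORE INTO patient_dimension VALUES (?, ?, ?)",
                    (r["id"], r["birth_year"], r["sex"]))
        con.execute("INSERT INTO observation_fact VALUES (?, ?, ?, ?)",
                    (r["id"], r["drug_code"], r["date"], r["qty"]))

    # Cohort selection then becomes a simple query against the warehouse
    print(con.execute("""SELECT COUNT(DISTINCT patient_num) FROM observation_fact
                         WHERE concept_cd = 'ATC:C07AB02'""").fetchone()[0])
    ```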

  11. Inspecting Engineering Samples

    NASA Image and Video Library

    2017-12-08

    Goddard's Ritsko Wins 2011 SAVE Award The winner of the 2011 SAVE Award is Matthew Ritsko, a Goddard financial manager. His tool lending library would track and enable sharing of expensive space-flight tools and hardware after projects no longer need them. This set of images represents the types of tools used at NASA. To read more go to: www.nasa.gov/topics/people/features/ritsko-save.html Dr. Doug Rabin (Code 671) and PI La Vida Cooper (Code 564) inspect engineering samples of the HAS-2 imager, which will be tested and read out using a custom ASIC with a 16-bit ADC (analog to digital converter) and CDS (correlated double sampling) circuit designed by the Code 564 ASIC group as part of an FY10 IRAD. The purpose of the IRAD was to develop a high-resolution digitizer for Heliophysics applications such as imaging. Future goals for the collaboration include characterization testing and eventually a sounding rocket flight of the integrated system. *ASIC= Application Specific Integrated Circuit NASA/GSFC/Chris Gunn

  12. Fluorescent tagged episomals for stoichiometric induced pluripotent stem cell reprogramming.

    PubMed

    Schmitt, Christopher E; Morales, Blanca M; Schmitz, Ellen M H; Hawkins, John S; Lizama, Carlos O; Zape, Joan P; Hsiao, Edward C; Zovein, Ann C

    2017-06-05

    Non-integrating episomal vectors have become an important tool for induced pluripotent stem cell reprogramming. The episomal vectors carrying the "Yamanaka reprogramming factors" (Oct4, Klf, Sox2, and L-Myc + Lin28) are critical tools for non-integrating reprogramming of cells to a pluripotent state. However, the reprogramming process remains highly stochastic, and is hampered by an inability to easily identify clones that carry the episomal vectors. We modified the original set of vectors to express spectrally separable fluorescent proteins to allow for enrichment of transfected cells. The vectors were then tested against the standard original vectors for reprogramming efficiency and for the ability to enrich for stoichiometric ratios of factors. The reengineered vectors allow for cell sorting based on reprogramming factor expression. We show that these vectors can assist in tracking episomal expression in individual cells and can select the reprogramming factor dosage. Together, these modified vectors are a useful tool for understanding the reprogramming process and improving induced pluripotent stem cell isolation efficiency.

  13. PRANAS: A New Platform for Retinal Analysis and Simulation.

    PubMed

    Cessac, Bruno; Kornprobst, Pierre; Kraria, Selim; Nasser, Hassan; Pamplona, Daniela; Portelli, Geoffrey; Viéville, Thierry

    2017-01-01

    The retina encodes visual scenes by trains of action potentials that are sent to the brain via the optic nerve. In this paper, we describe a new free-access, user-end software package that allows users to better understand this coding. It is called PRANAS (https://pranas.inria.fr), standing for Platform for Retinal ANalysis And Simulation. PRANAS targets neuroscientists and modelers by providing a unique set of retina-related tools. PRANAS integrates a retina simulator allowing large-scale simulations while keeping strong biological plausibility, and a toolbox for the analysis of spike train population statistics. The statistical method (entropy maximization under constraints) takes both spatial and temporal correlations into account as constraints, making it possible to analyze the effects of memory on statistics. PRANAS also integrates a tool for computing and representing receptive fields in 3D (time-space). All these tools are accessible through a friendly graphical user interface. The most CPU-costly of them have been implemented to run in parallel.
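    As a generic sketch of the statistical method named above (the particular spatial and temporal observables are whatever the user imposes as constraints; this is not taken from the PRANAS documentation), maximizing entropy subject to matching the empirical averages of observables \phi_l over spike blocks \omega yields a Gibbs distribution

        P(\omega) = \frac{1}{Z} \exp\!\Big( \sum_{l} \lambda_l \, \phi_l(\omega) \Big),
        \qquad
        Z = \sum_{\omega} \exp\!\Big( \sum_{l} \lambda_l \, \phi_l(\omega) \Big),

    where the multipliers \lambda_l are fitted so that model and empirical expectations of each \phi_l agree; observables that span several time bins are what allow the fitted statistics to capture memory (temporal correlations) as well as spatial ones.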

  14. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  15. Boutiques: a flexible framework to integrate command-line applications in computing platforms

    PubMed Central

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-01-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science. PMID:29718199

  16. An integrative view of phylogenetic comparative methods: connections to population genetics, community ecology, and paleobiology.

    PubMed

    Pennell, Matthew W; Harmon, Luke J

    2013-06-01

    Recent innovations in phylogenetic comparative methods (PCMs) have spurred a renaissance of research into the causes and consequences of large-scale patterns of biodiversity. In this paper, we review these advances. We also highlight the potential of comparative methods to integrate across fields and focus on three examples where such integration might be particularly valuable: quantitative genetics, community ecology, and paleobiology. We argue that PCMs will continue to be a key set of tools in evolutionary biology, shedding new light on how evolutionary processes have shaped patterns of biodiversity through deep time. © 2013 New York Academy of Sciences.

  17. Integrity and security in an Ada runtime environment

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    A review is provided of the Formal Methods group discussions. It was stated that integrity is not a pure mathematical dual of security. The input data is part of the integrity domain. The group provided a roadmap for research. One item of the roadmap and the final position statement are closely related to the space shuttle and space station. The group's position is to use a safe subset of Ada. Examples of safe sets include the Army Secure Operating System and the Penelope Ada verification tool. A conservative attitude is recommended when writing Ada code for life- and property-critical systems.

  18. Building An Integrated Neurodegenerative Disease Database At An Academic Health Center

    PubMed Central

    Xie, Sharon X.; Baek, Young; Grossman, Murray; Arnold, Steven E.; Karlawish, Jason; Siderowf, Andrew; Hurtig, Howard; Elman, Lauren; McCluskey, Leo; Van Deerlin, Vivianna; Lee, Virginia M.-Y.; Trojanowski, John Q.

    2010-01-01

    Background It is becoming increasingly important to study common and distinct etiologies, clinical and pathological features, and mechanisms related to neurodegenerative diseases such as Alzheimer’s disease (AD), Parkinson’s disease (PD), amyotrophic lateral sclerosis (ALS), and frontotemporal lobar degeneration (FTLD). These comparative studies rely on powerful database tools to quickly generate data sets which match diverse and complementary criteria set by the studies. Methods In this paper, we present a novel Integrated NeuroDegenerative Disease (INDD) database developed at the University of Pennsylvania (Penn) through a consortium of Penn investigators. Since these investigators work on AD, PD, ALS and FTLD, this allowed us to achieve the goal of developing an INDD database for these major neurodegenerative disorders. We used Microsoft SQL Server as the platform, with built-in “backwards” functionality to provide Access as a front-end client to interface with the database. We used PHP (Hypertext Preprocessor) to create the front-end web interface and then integrated the individual neurodegenerative disease databases using a master lookup table. We also present methods of data entry, database security, database backups, and database audit trails for this INDD database. Results We compare the results of a biomarker study using the INDD database to those using an alternative approach of querying individual databases separately. Conclusions We have demonstrated that the Penn INDD database has the ability to query multiple database tables from a single console with high accuracy and reliability. The INDD database provides a powerful tool for generating data sets in comparative studies across several neurodegenerative diseases. PMID:21784346
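    The master-lookup-table approach can be sketched generically (SQLite for illustration; the Penn INDD system uses SQL Server and its actual schema is not reproduced here): each disease-specific table keeps its own records, and a master table maps a global subject identifier to the disease and local record, so one query serves the whole consortium.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE ad_subjects (local_id INTEGER PRIMARY KEY, mmse INTEGER);
        CREATE TABLE pd_subjects (local_id INTEGER PRIMARY KEY, updrs INTEGER);
        CREATE TABLE master_lookup (global_id INTEGER, disease TEXT, local_id INTEGER);
        """)
        conn.executemany("INSERT INTO ad_subjects VALUES (?, ?)", [(10, 24), (11, 28)])
        conn.executemany("INSERT INTO pd_subjects VALUES (?, ?)", [(20, 31)])
        conn.executemany("INSERT INTO master_lookup VALUES (?, ?, ?)",
                         [(1, "AD", 10), (2, "AD", 11), (3, "PD", 20)])

        # A single query over the lookup table reaches both disease tables.
        rows = conn.execute("""
            SELECT m.global_id, m.disease, COALESCE(a.mmse, p.updrs) AS score
            FROM master_lookup m
            LEFT JOIN ad_subjects a ON m.disease = 'AD' AND m.local_id = a.local_id
            LEFT JOIN pd_subjects p ON m.disease = 'PD' AND m.local_id = p.local_id
        """).fetchall()
        print(rows)  # [(1, 'AD', 24), (2, 'AD', 28), (3, 'PD', 31)]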

  19. Barriers to Integrating Mental Health Services in Community-Based Primary Care Settings in Mexico City: A Qualitative Analysis.

    PubMed

    Martinez, William; Galván, Jorge; Saavedra, Nayelhi; Berenzon, Shoshana

    2017-05-01

    Despite the high prevalence of mental disorders in Mexico, minimal mental health services are available and there are large gaps in mental health treatment. Community-based primary care settings are often the first contact between patients and the health system and thus could serve as important settings for assessing and treating mental disorders. However, no formal assessment has been undertaken regarding the feasibility of implementing these services in Mexico. Before tools are developed to undertake such an assessment, a more nuanced understanding of the microprocesses affecting mental health service delivery must be acquired. A qualitative study used semistructured interviews to gather information from 25 staff in 19 community-based primary care clinics in Mexico City. Semistructured interviews were analyzed by using the meaning categorization method. In a second phase of coding, emerging themes were compared with an established typology of barriers to health care access. Primary care staff reported a number of significant barriers to implementing mental health services in primary care clinics, an already fragile and underfunded system. Barriers included the following broad thematic categories: service issues, language and cultural issues, care recipient characteristics, and issues with lack of knowledge. Results indicate that the implementation of mental health services in primary care clinics in Mexico will be difficult. However, the information in this study can help inform the integration of mental health into community-based primary care in Mexico through the development of adequate evaluative tools to assess the feasibility and progress of integrating these services.

  20. An interactive distance solution for stroke rehabilitation in the home setting - A feasibility study.

    PubMed

    Palmcrantz, Susanne; Borg, Jörgen; Sommerfeld, Disa; Plantin, Jeanette; Wall, Anneli; Ehn, Maria; Sjölinder, Marie; Boman, Inga-Lill

    2017-09-01

    In this study an interactive distance solution (called the DISKO tool) was developed to enable home-based motor training after stroke. The overall aim was to explore the feasibility and safety of using the DISKO tool, customized for interactive stroke rehabilitation in the home setting, in different rehabilitation phases after stroke. Fifteen patients in three different stages in the continuum of rehabilitation after stroke participated in a home-based training program using the DISKO tool. The program included 15 training sessions with recurrent follow-ups conducted through the integrated application for video communication with a physiotherapist. Safety and feasibility were assessed from patients, physiotherapists, and a technician using logbooks, interviews, and a questionnaire. Qualitative content analysis and descriptive statistics were used in the analysis. Fourteen out of 15 patients finalized the training period, with a mean of 19.5 minutes spent on training at each session. The DISKO tool was found to be useful and safe by patients and physiotherapists. This study demonstrates the feasibility and safety of the DISKO tool and provides guidance for further development and testing of interactive distance technology for home rehabilitation, to be used by health care professionals and patients in different phases of rehabilitation after stroke.

  1. Self-learning computers for surgical planning and prediction of postoperative alignment.

    PubMed

    Lafage, Renaud; Pesenti, Sébastien; Lafage, Virginie; Schwab, Frank J

    2018-02-01

    In past decades, the role of sagittal alignment has been widely demonstrated in the setting of spinal conditions. As several parameters can be affected, identifying the driver of the deformity is the cornerstone of a successful treatment approach. Despite the importance of restoring sagittal alignment for optimizing outcome, this task remains challenging. Self-learning computers and optimized algorithms are of great interest in spine surgery in that they facilitate better planning and prediction of postoperative alignment. Nowadays, computer-assisted tools are part of surgeons’ daily practice; however, the use of such tools remains time-consuming. NARRATIVE REVIEW AND RESULTS: Computer-assisted methods for the prediction of postoperative alignment consist of a three-step analysis: identification of anatomical landmarks, definition of alignment objectives, and simulation of surgery. Recently, complex rules for the prediction of alignment have been proposed. Even though this kind of work leads to more personalized objectives, the number of parameters involved renders it difficult for clinical use, stressing the importance of developing computer-assisted tools. The evolution of our current technology, including machine learning and other types of advanced algorithms, will provide powerful tools that could be useful in improving surgical outcomes and alignment prediction. These tools can combine different types of advanced technologies, such as image recognition and shape modeling, and using these techniques, computer-assisted methods are able to predict spinal shape. The development of powerful computer-assisted methods involves the integration of several sources of information such as radiographic parameters (X-rays, MRI, CT scan, etc.), demographic information, and unusual non-osseous parameters (muscle quality, proprioception, gait analysis data). By using a larger set of data, these methods will aim to mimic what is actually done by spine surgeons, leading to real tailor-made solutions. Integrating newer technology can change the current way of planning/simulating surgery. The use of powerful computer-assisted tools that are able to integrate several parameters and learn from experience can change the traditional way of selecting treatment pathways and counseling patients. However, there is still much work to be done to reach the level already achieved in other orthopedic fields, such as hip surgery. Many of these tools already exist in non-medical fields, and their adaptation to spine surgery is of considerable interest.

  2. Design Tools for Reconfigurable Hardware in Orbit (RHinO)

    NASA Technical Reports Server (NTRS)

    French, Mathew; Graham, Paul; Wirthlin, Michael; Larchev, Gregory; Bellows, Peter; Schott, Brian

    2004-01-01

    The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. These tools leverage an established FPGA design environment and focus primarily on space effects mitigation and power optimization. The project is creating software to automatically test and evaluate the single-event-upset (SEU) sensitivities of an FPGA design and insert mitigation techniques. Extensions into the tool suite will also allow evolvable algorithm techniques to reconfigure around single-event-latchup (SEL) events. In the power domain, tools are being created for dynamic power visualization and optimization. Thus, this technology seeks to enable the use of Reconfigurable Hardware in Orbit, via an integrated design tool-suite aiming to reduce risk, cost, and design time of multimission reconfigurable space processors using SRAM-based FPGAs.

  3. New tools in cybertherapy: the VEPSY web site.

    PubMed

    Castelnuovo, Gianluca; Buselli, Claudio; De Ferrari, Roberta; Gaggioli, Andrea; Mantovani, Fabrizia; Molinari, Enrico; Villamira, Marco; Riva, Giuseppe

    2004-01-01

    In recent years, the rapid development of the Internet and new communication technologies has had a great impact on psychology and psychotherapy. Psychotherapists rely with growing interest on new technological tools such as the videophone, audio and video chat, e-mail, SMS and the new Instant Messaging tools (IMs). All these technologies outline a stimulating as well as complex scenario: in order to effectively exploit their potential, it is important to study the possible role played by Internet-based tools within a psychotherapeutic process. Could technology substitute for health care practitioners, or are these tools only a resource in addition to the traditional ones in the therapist's hand? The major aim of this chapter is to provide a framework for the integration of old and new tools in mental health care. Different theoretical positions about the possible role played by e-therapy are reported, showing the possible changes that psychotherapy will necessarily face in a cyber setting. The VEPSY website, an integration of different Internet-based tools developed within the VEPSY UPDATED Project, is described as an example of clinical application matching old (and functional) practices with new (and promising) media for the treatment of different mental disorders. A rationale about the possible scenarios for the use of the VEPSY website in the clinical process is provided.

  4. OPATs: Omnibus P-value association tests.

    PubMed

    Chen, Chia-Wei; Yang, Hsin-Chou

    2017-07-10

    Combining statistical significances (P-values) from a set of single-locus association tests in genome-wide association studies is a proof-of-principle method for identifying disease-associated genomic segments, functional genes and biological pathways. We review P-value combinations for genome-wide association studies and introduce an integrated analysis tool, Omnibus P-value Association Tests (OPATs), which provides popular analysis methods of P-value combinations. OPATs is programmed in R and features a user-friendly R-based graphical user interface. In addition to analysis modules for data quality control and single-locus association tests, OPATs provides three types of set-based association test: window-, gene- and biopathway-based association tests. P-value combinations with or without threshold and rank truncation are provided. The significance of a set-based association test is evaluated by using resampling procedures. Performance of the set-based association tests in OPATs has been evaluated by simulation studies and real data analyses. These set-based association tests help boost the statistical power, alleviate the multiple-testing problem, reduce the impact of genetic heterogeneity, increase the replication efficiency of association tests and facilitate the interpretation of association signals by streamlining the testing procedures and integrating the genetic effects of multiple variants in genomic regions of biological relevance. In summary, P-value combinations facilitate the identification of marker sets associated with disease susceptibility and uncover missing heritability in association studies, thereby establishing a foundation for the genetic dissection of complex diseases and traits. OPATs provides an easy-to-use and statistically powerful analysis tool for P-value combinations. OPATs, examples, and a user guide can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/genetics/association/OPATs.htm. © The Author 2017. Published by Oxford University Press.
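    The core idea of a P-value combination can be shown with a minimal sketch of Fisher's method, one of the classical combinations such tools build on (generic Python/SciPy; this is not the OPATs code):

        import math
        from scipy.stats import chi2, combine_pvalues

        # Single-locus P-values for the variants falling in one gene or window.
        pvals = [0.04, 0.20, 0.007, 0.55]

        # Fisher's method: T = -2 * sum(log p_i) ~ chi-square with 2k degrees of
        # freedom under the global null of no association anywhere in the set.
        stat, p_set = combine_pvalues(pvals, method="fisher")
        print(stat, p_set)

        # Equivalent explicit computation.
        t = -2.0 * sum(math.log(p) for p in pvals)
        print(t, chi2.sf(t, df=2 * len(pvals)))

    In practice, as the abstract notes, set-level significance is assessed by resampling, because nearby variants are correlated and the chi-square reference distribution assumes independent tests.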

  5. A 1 GHz integrated circuit with carbon nanotube interconnects and silicon transistors.

    PubMed

    Close, Gael F; Yasuda, Shinichi; Paul, Bipul; Fujita, Shinobu; Wong, H-S Philip

    2008-02-01

    Due to their excellent electrical properties, metallic carbon nanotubes are promising materials for interconnect wires in future integrated circuits. Simulations have shown that the use of metallic carbon nanotube interconnects could yield more energy efficient and faster integrated circuits. The next step is to build an experimental prototype integrated circuit using carbon nanotube interconnects operating at high speed. Here, we report the fabrication of the first stand-alone integrated circuit combining silicon transistors and individual carbon nanotube interconnect wires on the same chip operating above 1 GHz. In addition to setting a milestone by operating above 1 GHz, this prototype is also a tool to investigate carbon nanotubes on a silicon-based platform at high frequencies, paving the way for future multi-GHz nanoelectronics.

  6. A Best Practice Guide to Assessment and Intervention for Autism Spectrum Disorder in Schools. Second Edition

    ERIC Educational Resources Information Center

    Wilkinson, Lee A.

    2016-01-01

    Fully updated to reflect DSM-5 and current assessment tools, procedures and research, this award-winning book provides a practical and scientifically-based approach to identifying, assessing, and treating children and adolescents with an Autism Spectrum Disorder (ASD) in school settings. Integrating current research evidence with theory and…

  7. Automated Patent Searching in the EPO: From Online Searching to Document Delivery.

    ERIC Educational Resources Information Center

    Nuyts, Annemie; Jonckheere, Charles

    The European Patent Office (EPO) has recently implemented the last part of its ambitious automation project aimed at creating an automated search environment for approximately 1200 EPO patent search examiners. The examiners now have at their disposal an integrated set of tools offering a full range of functionalities from online searching, via…

  8. A Realistic Data Warehouse Project: An Integration of Microsoft Access[R] and Microsoft Excel[R] Advanced Features and Skills

    ERIC Educational Resources Information Center

    King, Michael A.

    2009-01-01

    Business intelligence derived from data warehousing and data mining has become one of the most strategic management tools today, providing organizations with long-term competitive advantages. Business school curriculums and popular database textbooks cover data warehousing, but the examples and problem sets typically are small and unrealistic. The…

  9. Remote Learning for the Manipulation and Control of Robotic Cells

    ERIC Educational Resources Information Center

    Goldstain, Ofir; Ben-Gal, Irad; Bukchin, Yossi

    2007-01-01

    This work proposes an approach to remote learning of robotic cells based on internet and simulation tools. The proposed approach, which integrates remote-learning and tele-operation into a generic scheme, is designed to enable students and developers to set-up and manipulate a robotic cell remotely. Its implementation is based on a dedicated…

  10. User Identification and Tracking in an Educational Web Environment.

    ERIC Educational Resources Information Center

    Marzo-Lazaro, J. L.; Verdu-Carbo, T.; Fabregat-Gesa, R.

    This paper describes a solution to the user identification and tracking problem within an educational World Wide Web environment. The paper begins with an overview of the Teaching Support System project at the University of Girona (Spain); the main objective of the project is to create an integrated set of tools for teachers to use to create and…

  11. Dynamic Reaction Figures: An Integrative Vehicle for Understanding Chemical Reactions

    ERIC Educational Resources Information Center

    Schultz, Emeric

    2008-01-01

    A highly flexible learning tool, referred to as a dynamic reaction figure, is described. Application of these figures can (i) yield the correct chemical equation by simply following a set of menu driven directions; (ii) present the underlying "mechanism" in chemical reactions; and (iii) help to solve quantitative problems in a number of different…

  12. Demonstration of the Principles of Restriction Endonuclease Cleavage Reactions Using Thermostable Bfl I from "Anoxybacillus Flavithermus"

    ERIC Educational Resources Information Center

    Sharma, Prince; D'Souza, David R.; Bhandari, Deepali; Parashar, Vijay; Capalash, Neena

    2003-01-01

    Restriction enzymes are basic tools in recombinant DNA technology. To shape the molecular biology experiments, the students must know how to work with these molecular scissors. Here, we describe an integrated set of experiments, introduced in the "Advances in Molecular Biology and Biotechnology" postgraduate course, which covers the important…

  13. NED-1: integrated analyses for forest stewardship decisions

    Treesearch

    Mark J. Twery; H. Michael Rauscher; Deborah J. Bennett; Scott A. Thomasma; Susan L. Stout; James F. Palmer; Robin E. Hoffman; David S. DeCalesta; Eric Gustafson; J. Morgan Grove; Donald Nute; Geneho Kim; R. Peter Kollasch

    2000-01-01

    NED is a collective term for a set of software intended to help resource managers develop goals, assess current and potential conditions, and produce sustainable management plans for forest properties. The software tools are being developed by the USDA Forest Service, Northeastern and Southern Research Stations, in cooperation with many other collaborators. NED-1 is a...

  14. The Development and Implementation of U-Msg for College Students' English Learning

    ERIC Educational Resources Information Center

    Cheng, Yuh-Ming; Kuo, Sheng-Huang; Lou, Shi-Jer; Shih, Ru-Chu

    2016-01-01

    With the advance of mobile technology, mobile devices have become more portable and powerful with numerous useful tools in daily life. Thus, mobile learning has been widely involved in e-learning studies. Many studies point out that it is important to integrate both pedagogical and technical strengths of mobile technology into learning settings.…

  15. Learner Self-Regulation and Web 2.0 Tools Management in Personal Learning Environment

    ERIC Educational Resources Information Center

    Yen, Cherng-Jyh; Tu, Chih-Hsiung; Sujo-Montes, Laura E.; Armfield, Shadow W. J.; Chan, Junn-Yih

    2013-01-01

    Web 2.0 technology integration requires a higher level of self-regulated learning skills to create a Personal Learning Environment (PLE). This study examined each of the four aspects of learner self-regulation in online learning (i.e., environment structuring, goal setting, time management, & task strategies) as the predictor for level of…

  16. Integrated Land-Water-Energy assessment using the Foreseer Tool

    NASA Astrophysics Data System (ADS)

    Allwood, Julian; Konadu, Dennis; Mourao, Zenaida; Lupton, Rick; Richards, Keith; Fenner, Richard; Skelton, Sandy; McMahon, Richard

    2016-04-01

    This study presents an integrated energy and resource modelling and visualisation approach, Foreseer™, which characterises the interdependencies and evaluates the land and water requirements for energy system pathways. The Foreseer Tool maps linked energy, water and land resource futures by outputting a set of Sankey diagrams for energy, water and land, showing the flow from basic resource (e.g. coal, surface water, and forested land) through transformations (e.g. fuel refining and desalination) to final services (e.g. sustenance, hygiene and transportation). By 'mapping' resources in this way, policy-makers can more easily understand the competing uses of a resource through the identification of the services it delivers (e.g. food production, landscaping, energy), the potential opportunities for improving the management of the resource, and the connections with other resources which are often overlooked in a traditional sector-based management strategy. This paper presents a case study of the UK Carbon Plan and highlights the need for integrated resource planning and policy development.
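    The Sankey output described above can be illustrated with a generic sketch (plotly with synthetic flow values; this is not the Foreseer tool itself):

        import plotly.graph_objects as go

        # Illustrative resource-to-service flows in arbitrary units.
        labels = ["Coal", "Surface water", "Fuel refining", "Desalination",
                  "Transportation", "Hygiene"]
        fig = go.Figure(go.Sankey(
            node=dict(label=labels, pad=20, thickness=15),
            link=dict(
                source=[0, 1, 2, 3],   # indices into `labels`
                target=[2, 3, 4, 5],
                value=[8.0, 5.0, 6.5, 4.0],
            ),
        ))
        fig.update_layout(title_text="From basic resource to final service")
        fig.write_html("resource_sankey.html")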

  17. The ReproGenomics Viewer: an integrative cross-species toolbox for the reproductive science community

    PubMed Central

    Darde, Thomas A.; Sallou, Olivier; Becker, Emmanuelle; Evrard, Bertrand; Monjeaud, Cyril; Le Bras, Yvan; Jégou, Bernard; Collin, Olivier; Rolland, Antoine D.; Chalmel, Frédéric

    2015-01-01

    We report the development of the ReproGenomics Viewer (RGV), a multi- and cross-species working environment for the visualization, mining and comparison of published omics data sets for the reproductive science community. The system currently embeds 15 published data sets related to gametogenesis from nine model organisms. Data sets have been curated and conveniently organized into broad categories including biological topics, technologies, species and publications. RGV's modular design for both organisms and genomic tools enables users to upload and compare their data with that from the data sets embedded in the system in a cross-species manner. The RGV is freely available at http://rgv.genouest.org. PMID:25883147

  18. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1993-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X DataSlice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.

  19. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1992-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X Data Slice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.

  20. Improving e-book access via a library-developed full-text search tool*

    PubMed Central

    Foust, Jill E.; Bergen, Phillip; Maxeiner, Gretchen L.; Pawlowski, Peter N.

    2007-01-01

    Purpose: This paper reports on the development of a tool for searching the contents of licensed full-text electronic book (e-book) collections. Setting: The Health Sciences Library System (HSLS) provides services to the University of Pittsburgh's medical programs and large academic health system. Brief Description: The HSLS has developed an innovative tool for federated searching of its e-book collections. Built using the XML-based Vivísimo development environment, the tool enables a user to perform a full-text search of over 2,500 titles from the library's seven most highly used e-book collections. From a single “Google-style” query, results are returned as an integrated set of links pointing directly to relevant sections of the full text. Results are also grouped into categories that enable more precise retrieval without reformulation of the search. Results/Evaluation: A heuristic evaluation demonstrated the usability of the tool and a web server log analysis indicated an acceptable level of usage. Based on its success, there are plans to increase the number of online book collections searched. Conclusion: This library's first foray into federated searching has produced an effective tool for searching across large collections of full-text e-books and has provided a good foundation for the development of other library-based federated searching products. PMID:17252065

  1. Pathway enrichment analysis approach based on topological structure and updated annotation of pathway.

    PubMed

    Yang, Qian; Wang, Shuyuan; Dai, Enyu; Zhou, Shunheng; Liu, Dianming; Liu, Haizhou; Meng, Qianqian; Jiang, Bin; Jiang, Wei

    2017-08-16

    Pathway enrichment analysis has been widely used to identify cancer risk pathways, and contributes to elucidating the mechanism of tumorigenesis. However, most existing approaches use outdated pathway information and neglect the complex gene interactions within pathways. Here, we first briefly reviewed the existing, widely used pathway enrichment analysis approaches, and then proposed a novel topology-based pathway enrichment analysis (TPEA) method, which integrates topological properties and global upstream/downstream positions of genes in pathways. We compared TPEA with four widely used pathway enrichment analysis tools, including the database for annotation, visualization and integrated discovery (DAVID), gene set enrichment analysis (GSEA), centrality-based pathway enrichment (CePa) and signaling pathway impact analysis (SPIA), by analyzing six gene expression profiles of three tumor types (colorectal cancer, thyroid cancer and endometrial cancer). As a result, we identified several well-known cancer risk pathways that could not be obtained by the existing tools, and the results of TPEA were more stable than those of the other tools when analyzing different data sets of the same cancer. Ultimately, we developed an R package to implement TPEA, which can update KEGG pathway information online and is available at the Comprehensive R Archive Network (CRAN): https://cran.r-project.org/web/packages/TPEA/. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
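    For context, the count-based over-representation test that many of these tools build on is a hypergeometric (one-sided Fisher) test; a minimal generic sketch (illustrative numbers; not the TPEA R package) is:

        from scipy.stats import hypergeom

        N = 20000   # background genes
        K = 150     # genes annotated to the pathway
        n = 400     # differentially expressed genes
        k = 12      # overlap between the two gene sets

        # P(X >= k) under the null of drawing n genes at random without replacement.
        p_enrich = hypergeom.sf(k - 1, N, K, n)
        print(p_enrich)

    Topology-based methods such as TPEA extend this by weighting genes according to their position and connectivity in the pathway graph, which a pure count-based test ignores.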

  2. Simultaneous Scheduling of Jobs, AGVs and Tools Considering Tool Transfer Times in Multi Machine FMS By SOS Algorithm

    NASA Astrophysics Data System (ADS)

    Sivarami Reddy, N.; Ramamurthy, D. V., Dr.; Prahlada Rao, K., Dr.

    2017-08-01

    This article addresses simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share the tools, considering transfer times of jobs and tools between machines, to generate optimal sequences that minimize makespan in a multi-machine Flexible Manufacturing System (FMS). Performance of an FMS is expected to improve by effective utilization of its resources, through proper integration and synchronization of their scheduling. The Symbiotic Organisms Search (SOS) algorithm is a potent and proven alternative for solving optimization problems such as scheduling. The proposed SOS algorithm is first tested on 22 job sets with makespan as the objective for scheduling of machines and tools, where machines are allowed to share tools without considering transfer times of jobs and tools, and the results are compared with those of existing methods. The results show that SOS outperformed the existing methods. The same SOS algorithm is then used for simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share tools, considering transfer times of jobs and tools, to determine the optimal sequences that minimize makespan.
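    The flavor of the SOS algorithm can be conveyed with a compact sketch of its mutualism phase on a generic continuous minimization problem (the paper applies SOS to discrete scheduling with makespan as the objective, and the full algorithm also includes commensalism and parasitism phases, both omitted here):

        import numpy as np

        rng = np.random.default_rng(0)

        def objective(x):                 # stand-in for the makespan evaluation
            return float(np.sum(x ** 2))

        dim, pop_size, iterations = 5, 20, 200
        eco = rng.uniform(-5, 5, size=(pop_size, dim))   # ecosystem of organisms
        fit = np.array([objective(x) for x in eco])

        for _ in range(iterations):
            for i in range(pop_size):
                j = rng.choice([k for k in range(pop_size) if k != i])
                best = eco[np.argmin(fit)]
                mutual = (eco[i] + eco[j]) / 2.0
                bf1, bf2 = rng.integers(1, 3, size=2)    # benefit factors in {1, 2}
                new_i = eco[i] + rng.random(dim) * (best - mutual * bf1)
                new_j = eco[j] + rng.random(dim) * (best - mutual * bf2)
                for idx, cand in ((i, new_i), (j, new_j)):  # greedy replacement
                    f = objective(cand)
                    if f < fit[idx]:
                        eco[idx], fit[idx] = cand, f

        print(fit.min())   # best objective value found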

  3. Institutional framework for integrated Pharmaceutical Benefits Management: results from a systematic review

    PubMed Central

    Hermanowski, Tomasz Roman; Drozdowska, Aleksandra Krystyna; Kowalczyk, Marta

    2015-01-01

    Objectives In this paper, we emphasised that effective management of health plan beneficiaries' access to reimbursed medicines requires a proper institutional set-up. The main objective was to identify and recommend an institutional framework of integrated pharmaceutical care providing effective, safe and equitable access to medicines. Method The institutional framework of drug policy was derived on the basis of publications obtained by systematic reviews. A comparative analysis concerning adaptation of coordinated pharmaceutical care services in the USA, the UK, Poland, Italy, Denmark and Germany was performed. Results While most European Union Member States promote the implementation of selected e-Health tools, like e-Prescribing, these efforts do not necessarily implement an integrated package. There is no single agent who manages insured patients’ access to medicines and health care in a coordinated manner, which would increase the efficiency and safety of drug policy. More attention should be paid by European Union Member States to how to integrate various e-Health tools to enhance benefits to both individuals and societies. One solution could be to implement an integrated “pharmacy benefit management” model, which is well established in the USA and Canada and provides an integrated package of cost-containment methods, implemented within a transparent institutional framework and powered by strong motivation of the agent. PMID:26528099

  4. Research on the performance evaluation of agricultural products supply chain integrated operation

    NASA Astrophysics Data System (ADS)

    Jiang, Jiake; Wang, Xifu; Liu, Yang

    2017-04-01

    The integrated operation of an agricultural products supply chain can ensure the quality and efficiency of agricultural products and achieve the goal of low cost and high service. This paper establishes a performance evaluation index system for integrated agricultural products supply chain operation based on the development status of agricultural products and the SCOR, BSC and KPI models. We then construct a comprehensive evaluation model combining rough set theory and a BP neural network, with the aid of the Rosetta and MATLAB tools, and apply it to a case study of the integrated agricultural products supply chain in the Jing-Jin-Ji region. Finally, we obtain the corresponding performance results and give some improvement measures and management recommendations to managers.
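    The BP-network half of such an evaluation model can be sketched with scikit-learn on synthetic indicator data (the rough set attribute reduction step and the Rosetta/MATLAB workflow of the paper are not reproduced):

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(42)

        # Synthetic supply-chain indicator scores (rows: cases, columns: indicators)
        # and an overall performance score assigned, e.g., by expert judgment.
        X = rng.uniform(0, 1, size=(200, 6))
        y = X @ np.array([0.3, 0.2, 0.15, 0.15, 0.1, 0.1]) + rng.normal(0, 0.02, 200)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        # A small back-propagation network mapping indicator scores to performance.
        model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
        model.fit(X_tr, y_tr)
        print("R^2 on held-out cases:", round(model.score(X_te, y_te), 3))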

  5. An Interactive, Integrated, Instructional Pathway to the LEAD Science Gateway

    NASA Astrophysics Data System (ADS)

    Yalda, S.; Clark, R.; Davis, L.; Wiziecki, E. N.

    2008-12-01

    Linked Environments for Atmospheric Discovery (LEAD) is a bold and revolutionary paradigm that through a Web-based Service Oriented Architecture (SOA) exposes the user to a rich environment of data, models, data mining and visualization and analysis tools, enabling the user to ask science questions of applications while the complexity of the software and middleware managing these applications is hidden from the user. From its inception in 2003, LEAD has championed goals that have context for the future of weather and related research and education. LEAD espouses to lowering the barrier for using complex end-to-end weather technologies by a) democratizing the availability of advanced weather technologies, b) empowering the user of these technologies to tackle a variety of problems, and c) facilitating learning and understanding. LEAD, as it exists today, is poised to enable a diverse community of scientists, educators, students, and operational practitioners. The project has been informed by atmospheric and computer scientists, educators, and educational consultants who, in search of new knowledge, understanding, ideas, and learning methodologies, seek easy access to new capabilities that allow for user-directed and interactive query and acquisition, simulation, assimilation, data mining, computational modeling, and visualization. As one component of the total LEAD effort, the LEAD education team has designed interactive, integrated, instructional pathways within a set of learning modules (LEAD-to-Learn) to facilitate, enhance, and enable the use of the LEAD gateway in the classroom. The LEAD education initiative focuses on the means to integrate data, tools, and services used by researchers into undergraduate meteorology education in order to provide an authentic and contextualized environment for teaching and learning. Educators, educational specialists, and students from meteorology and computer science backgrounds have collaborated on the design and development of learning materials, as well as new tools and features, to enhance the appearance and use of the LEAD portal gateway and its underlying cyberinfrastructure in an educational setting. The development of educational materials has centered on promoting the accessibility and use of meteorological data and analysis tools through the LEAD portal by providing instructional materials, additional custom designed tools that build off of Unidata's Integrated Data Viewer (IDV) (e.g. IDV Basic and NCDestroyer), and an interactive component that takes the user through specific tasks utilizing multiple tools. In fact, select improvements to parameter lists and domain subsetting have inspired IDV developers to incorporate changes in IDV revisions that are now available to the entire community. This collection of materials, demonstrations, interactive guides, student exercises, and customized tools, which are now available to the educator and student through the LEAD portal gateway, can serve as an instructional pathway for a set of guided, phenomenon-based exercises (e.g. fronts, lake-effect snows, etc.). This paper will provide an overview of the LEAD education and outreach efforts with a focus on the design of Web-based educational materials and instructional approaches for user interaction with the LEAD portal gateway and the underlying cyberinfrastructure, and will encourage educators, especially those involved in undergraduate meteorology education, to begin incorporating these capabilities into their course materials.

  6. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand alone analysis codes that result in the streamlining of the exchange of data between programs, reducing errors and improving efficiency.

  7. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    PubMed

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  8. Goal setting: an integral component of effective diabetes care.

    PubMed

    Miller, Carla K; Bauman, Jennifer

    2014-08-01

    Goal setting is a widely used behavior change tool in diabetes education and training. Prior research found specific relatively difficult but attainable goals set within a specific timeframe improved performance in sports and at the workplace. However, the impact of goal setting in diabetes self-care has not received extensive attention. This review examined the mechanisms underlying behavioral change according to goal setting theory and evaluated the impact of goal setting in diabetes intervention studies. Eight studies were identified, which incorporated goal setting as the primary strategy to promote behavioral change in individual, group-based, and primary care settings among patients with type 2 diabetes. Improvements in diabetes-related self-efficacy, dietary intake, physical activity, and A1c were observed in some but not all studies. More systematic research is needed to determine the conditions and behaviors for which goal setting is most effective. Initial recommendations for using goal setting in diabetes patient encounters are offered.

  9. Embracing the Importance of FAIR Research Products - Findable, Accessible, Interoperable, and Reusable

    NASA Astrophysics Data System (ADS)

    Stall, S.

    2017-12-01

    Integrity and transparency within research are solidified by a complete set of research products that are findable, accessible, interoperable, and reusable. In other words, they follow the FAIR Guidelines developed by FORCE11.org. Your datasets, images, video, software, scripts, models, physical samples, and other tools and technology are an integral part of the narrative you tell about your research. These research products increasingly are being captured through workflow tools and preserved and connected through persistent identifiers across multiple repositories that keep them safe. They help secure, with your publications, the supporting evidence and integrity of the scientific record. This is the direction in which Earth and space science, as well as other disciplines, are moving. Within our community, some science domains are further along, and others are taking more measured steps. AGU as a publisher is working to support the full scientific record with peer-reviewed publications. Working with our community and all the Earth and space science journals, AGU is developing new policies to encourage researchers to plan for proper data preservation and provide data citations along with their research submissions, and to encourage adoption of best practices throughout the research workflow and data life cycle. Providing incentives, community standards, and easy-to-use tools are some important factors for helping researchers embrace the FAIR Guidelines and support transparency and integrity.

  10. Influenza Virus Database (IVDB): an integrated information resource and analysis platform for influenza virus research.

    PubMed

    Chang, Suhua; Zhang, Jiajie; Liao, Xiaoyun; Zhu, Xinxing; Wang, Dahai; Zhu, Jiang; Feng, Tao; Zhu, Baoli; Gao, George F; Wang, Jian; Yang, Huanming; Yu, Jun; Wang, Jing

    2007-01-01

    Frequent outbreaks of highly pathogenic avian influenza and the increasing data available for comparative analysis require a central database specialized in influenza viruses (IVs). We have established the Influenza Virus Database (IVDB) to integrate information and create an analysis platform for genetic, genomic, and phylogenetic studies of the virus. IVDB hosts complete genome sequences of influenza A virus generated by Beijing Institute of Genomics (BIG) and curates all other published IV sequences after expert annotation. Our Q-Filter system classifies and ranks all nucleotide sequences into seven categories according to sequence content and integrity. IVDB provides a series of tools and viewers for comparative analysis of the viral genomes, genes, genetic polymorphisms and phylogenetic relationships. A search system has been developed for users to retrieve a combination of different data types by setting search options. To facilitate analysis of global viral transmission and evolution, the IV Sequence Distribution Tool (IVDT) has been developed to display the worldwide geographic distribution of chosen viral genotypes and to couple genomic data with epidemiological data. The BLAST, multiple sequence alignment and phylogenetic analysis tools were integrated for online data analysis. Furthermore, IVDB offers instant access to pre-computed alignments and polymorphisms of IV genes and proteins, and presents the results as SNP distribution plots and minor allele distributions. IVDB is publicly available at http://influenza.genomics.org.cn.

  11. Modeling and Simulation of Phased Array Antennas to Support Next-Generation Satellite Design

    NASA Technical Reports Server (NTRS)

    Tchorowski, Nicole; Murawski, Robert; Manning, Robert; Fuentes, Michael

    2016-01-01

    Developing enhanced simulation capabilities has become a significant priority for the Space Communications and Navigation (SCaN) project at NASA as new space communications technologies are proposed to replace aging NASA communications assets, such as the Tracking and Data Relay Satellite System (TDRSS). When developing the architecture for these new space communications assets, it is important to develop updated modeling and simulation methodologies, such that competing architectures can be weighed against one another and the optimal path forward can be determined. There have been many simulation tools developed here at NASA for the simulation of single RF link budgets, or for the modeling and simulation of an entire network of spacecraft and their supporting SCaN network elements. However, the modeling capabilities are never fully complete and as new technologies are proposed, gaps are identified. One such gap is the ability to rapidly develop high fidelity simulation models of electronically steerable phased array systems. As future relay satellite architectures are proposed that include optical communications links, electronically steerable antennas will become more desirable due to the reduction in platform vibration introduced by mechanically steerable devices. In this research, we investigate how modeling of these antennas can be introduced into our overall simulation and modeling structure. The ultimate goal of this research is two-fold. First, to enable NASA engineers to model various proposed simulation architectures and determine which proposed architecture meets the given architectural requirements. Second, given a set of communications link requirements for a proposed satellite architecture, determine the optimal configuration for a phased array antenna. There is a variety of tools available that can be used to model phased array antennas. To meet our stated goals, the first objective of this research is to compare the subset of tools available to us, trading off modeling fidelity of the tool with simulation performance. When comparing several proposed architectures, higher-fidelity modeling may be desirable; however, when iterating a proposed set of communication link requirements across ranges of phased array configuration parameters, the practicality of performance becomes a significant requirement. In either case, a minimum simulation fidelity must be met, regardless of performance considerations, which will be discussed in this research. Given a suitable set of phased array modeling tools, this research then focuses on integration with current SCaN modeling and simulation tools. While properly modeling the antenna elements of a system is vital, this is only a small part of the end-to-end communication path between a satellite and the supporting ground station and/or relay satellite assets. To properly model a proposed simulation architecture, this toolset must be integrated with other commercial and government development tools, such that the overall architecture can be examined in terms of communications, reliability, and cost. In this research, integration with previously developed communication tools is investigated.
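    As a point of reference for the fidelity/performance trade-off discussed above, the lowest-fidelity model of an electronically steered array, the ideal array factor of a uniform linear array with isotropic elements and no mutual coupling, already captures beam steering with a few lines of numerical code (a generic sketch, not one of the NASA toolsets):

        import numpy as np

        N = 16            # number of elements
        d = 0.5           # element spacing in wavelengths
        steer_deg = 20.0  # commanded electronic steering angle

        theta = np.radians(np.linspace(-90, 90, 721))
        psi = 2 * np.pi * d * (np.sin(theta) - np.sin(np.radians(steer_deg)))
        af = np.abs(np.exp(1j * np.outer(np.arange(N), psi)).sum(axis=0)) / N

        print(f"main beam at {np.degrees(theta[np.argmax(af)]):.1f} deg")  # ~20 deg

    Higher-fidelity models add measured element patterns, mutual coupling, and quantized phase shifters, which is where the computational cost grows.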

  12. Software tool for data mining and its applications

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough set, support vector machine), and computational intelligence (neural network, genetic algorithm, fuzzy systems). It consists of nine function modules: pattern recognition, decision trees, association rule, fuzzy rule, neural network, genetic algorithm, Hyper Envelop, support vector machine, and visualization. The principles and knowledge representation of some function modules of data mining are described. The software tool for data mining is implemented in Visual C++ under Windows 2000. Nonmonotony in data mining is dealt with by concept hierarchy and layered mining. The software tool has been satisfactorily applied to the prediction of regularities in the formation of ternary intermetallic compounds in alloy systems and to the diagnosis of brain glioma.
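    A minimal generic instance of chaining two of the function modules named above (PCA for pattern recognition followed by a support vector machine) can be written with scikit-learn on a toy data set standing in for the alloy data (not the Visual C++ tool itself):

        from sklearn.datasets import load_wine
        from sklearn.decomposition import PCA
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Toy stand-in for a materials data set: predict a class from numeric descriptors.
        X, y = load_wine(return_X_y=True)

        # Scale -> project onto principal components -> classify with an SVM.
        clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf", C=1.0))
        print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))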

  13. ORBIT: an integrated environment for user-customized bioinformatics tools.

    PubMed

    Bellgard, M I; Hiew, H L; Hunter, A; Wiebrands, M

    1999-10-01

    There are a large number of computational programs freely available to bioinformaticians via a client/server, web-based environment. However, the client interface to these tools (typically an html form page) cannot be customized from the client side as it is created by the service provider. The form page is usually generic enough to cater for a wide range of users. However, this implies that a user cannot set as 'default' advanced program parameters on the form or even customize the interface to his/her specific requirements or preferences. Currently, there is a lack of end-user interface environments that can be modified by the user when accessing computer programs available on a remote server running on an intranet or over the Internet. We have implemented a client/server system called ORBIT (Online Researcher's Bioinformatics Interface Tools) where individual clients can have interfaces created and customized to command-line-driven, server-side programs. Thus, Internet-based interfaces can be tailored to a user's specific bioinformatic needs. As interfaces are created on the client machine independent of the server, there can be different interfaces to the same server-side program to cater for different parameter settings. The interface customization is relatively quick (between 10 and 60 min) and all client interfaces are integrated into a single modular environment which will run on any computer platform supporting Java. The system has been developed to allow for a number of future enhancements and features. ORBIT represents an important advance in the way researchers gain access to bioinformatics tools on the Internet.

  14. Towards structured sharing of raw and derived neuroimaging data across existing resources

    PubMed Central

    Keator, D.B.; Helmer, K.; Steffener, J.; Turner, J.A.; Van Erp, T.G.M.; Gadde, S.; Ashish, N.; Burns, G.A.; Nichols, B.N.

    2013-01-01

    Data sharing efforts increasingly contribute to the acceleration of scientific discovery. Neuroimaging data is accumulating in distributed domain-specific databases and there is currently no integrated access mechanism nor an accepted format for the critically important meta-data that is necessary for making use of the combined, available neuroimaging data. In this manuscript, we present work from the Derived Data Working Group, an open-access group sponsored by the Biomedical Informatics Research Network (BIRN) and the International Neuroimaging Coordinating Facility (INCF) focused on practical tools for distributed access to neuroimaging data. The working group develops models and tools facilitating the structured interchange of neuroimaging meta-data and is making progress towards a unified set of tools for such data and meta-data exchange. We report on the key components required for integrated access to raw and derived neuroimaging data as well as associated meta-data and provenance across neuroimaging resources. The components include (1) a structured terminology that provides semantic context to data, (2) a formal data model for neuroimaging with robust tracking of data provenance, (3) a web service-based application programming interface (API) that provides a consistent mechanism to access and query the data model, and (4) a provenance library that can be used for the extraction of provenance data by image analysts and imaging software developers. We believe that the framework and set of tools outlined in this manuscript have great potential for solving many of the issues the neuroimaging community faces when sharing raw and derived neuroimaging data across the various existing database systems for the purpose of accelerating scientific discovery. PMID:23727024
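
    For illustration only, the sketch below shows a minimal provenance record of the sort such an exchange framework might track for a derived image; the field names are hypothetical and do not reproduce the actual BIRN/INCF data model.

```python
# Illustrative-only sketch of a derived-data provenance record; field names
# are hypothetical, not the BIRN/INCF model described in the abstract.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProvenanceRecord:
    output_file: str
    software: str
    version: str
    inputs: List[str] = field(default_factory=list)
    parameters: dict = field(default_factory=dict)

record = ProvenanceRecord(
    output_file="sub-01_task-rest_bold_smoothed.nii.gz",
    software="example-smoothing-tool",
    version="1.0",
    inputs=["sub-01_task-rest_bold.nii.gz"],
    parameters={"fwhm_mm": 6.0},
)
print(record)
```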

  15. A Pragmatic Guide to the Setting up of Integrated Hypnotherapy Services in Primary Care and Clinical Settings.

    PubMed

    Entwistle, Paul Andrew

    2017-01-01

    Despite the continued debate and lack of a clear consensus about the true nature of the hypnotic phenomenon, hypnosis is increasingly being used successfully in many medical, health, and psychological spheres as a research method, motivational tool, and therapeutic modality. Significantly, however, although hypnotherapy is widely advertised, advocated, and employed in the private medical arena for the management and treatment of many physical and emotional disorders, too little is being done to integrate hypnosis into primary care and national health medical services. This article discusses some of the reasons for the apparent reluctance of medical and scientific health professionals to consider incorporating hypnosis into their medical practice, including the practical problems inherent in using hypnosis in a medical context and some possible solutions.

  16. Financing dengue vaccine introduction in the Americas: challenges and opportunities.

    PubMed

    Constenla, Dagna; Clark, Samantha

    2016-01-01

    Dengue has escalated in the region of the Americas unabated despite major investments in integrated vector control and prevention strategies. An effective and affordable dengue vaccine can play a critical role in reducing the human and economic costs of the disease by preventing millions around the world from getting sick. However, there are considerable challenges on the path towards vaccine introduction. These include lack of sufficient financing tools, absence of capacity within national level decision-making bodies, and demands that new vaccines place on stressed health systems. Various financing models can be used to overcome these challenges including setting up procurement mechanisms, integrating regional and domestic taxes, and setting up low interest multilateral loans. In this paper we review these challenges and opportunities of financing dengue vaccine introduction in the Americas.

  17. Integrated Computational Solution for Predicting Skin Sensitization Potential of Molecules

    PubMed Central

    Desai, Aarti; Singh, Vivek K.; Jere, Abhay

    2016-01-01

    Introduction Skin sensitization is a major toxicological endpoint for dermatology and cosmetic products. The recent ban on animal testing for cosmetics demands alternative methods. We developed an integrated computational solution (SkinSense) that offers a robust solution and addresses the limitations of existing computational tools, i.e., high false-positive rates and/or limited coverage. Results The key components of our solution include: QSAR models selected from a combinatorial set, similarity information, and literature-derived sub-structure patterns of known skin protein reactive groups. Its prediction performance on a challenge set of molecules showed accuracy = 75.32%, CCR = 74.36%, sensitivity = 70.00% and specificity = 78.72%, which is better than several existing tools, including VEGA (accuracy = 45.00% and CCR = 54.17% with 'High' reliability scoring), DEREK (accuracy = 72.73% and CCR = 71.44%) and TOPKAT (accuracy = 60.00% and CCR = 61.67%). Although TIMES-SS showed higher predictive power (accuracy = 90.00% and CCR = 92.86%), its coverage was very low (only 10 out of 77 molecules were predicted reliably). Conclusions Owing to its improved prediction performance and coverage, our solution can serve as a useful expert system towards Integrated Approaches to Testing and Assessment for skin sensitization. It would be invaluable to the cosmetic/dermatology industry for pre-screening their molecules and reducing time, cost and animal testing. PMID:27271321
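
    The reported figures all derive from a confusion matrix. The sketch below shows how accuracy, CCR (balanced accuracy), sensitivity and specificity relate; the counts are chosen only so that the formulas reproduce the reported percentages, and the actual per-class breakdown of the challenge set is an assumption.

```python
# How the reported metrics relate to a confusion matrix; the counts below are
# back-calculated for illustration, not published SkinSense results.
def classification_metrics(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    ccr = (sensitivity + specificity) / 2  # correct classification rate (balanced accuracy)
    return accuracy, ccr, sensitivity, specificity

acc, ccr, sens, spec = classification_metrics(tp=21, fn=9, tn=37, fp=10)
print(f"accuracy={acc:.2%} CCR={ccr:.2%} sensitivity={sens:.2%} specificity={spec:.2%}")
```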

  18. Integrative structural annotation of de novo RNA-Seq provides an accurate reference gene set of the enormous genome of the onion (Allium cepa L.).

    PubMed

    Kim, Seungill; Kim, Myung-Shin; Kim, Yong-Min; Yeom, Seon-In; Cheong, Kyeongchae; Kim, Ki-Tae; Jeon, Jongbum; Kim, Sunggil; Kim, Do-Sun; Sohn, Seong-Han; Lee, Yong-Hwan; Choi, Doil

    2015-02-01

    The onion (Allium cepa L.) is one of the most widely cultivated and consumed vegetable crops in the world. Although a considerable amount of onion transcriptome data has been deposited into public databases, the sequences of the protein-coding genes are not accurate enough to be used, owing to non-coding sequences intermixed with the coding sequences. We generated a high-quality, annotated onion transcriptome from de novo sequence assembly and intensive structural annotation using the integrated structural gene annotation pipeline (ISGAP), which identified 54,165 protein-coding genes among 165,179 assembled transcripts totalling 203.0 Mb by eliminating the intron sequences. ISGAP performed reliable annotation, recognizing accurate gene structures based on reference proteins, and ab initio gene models of the assembled transcripts. Integrative functional annotation and gene-based SNP analysis revealed a whole biological repertoire of genes and transcriptomic variation in the onion. The method developed in this study provides a powerful tool for the construction of reference gene sets for organisms based solely on de novo transcriptome data. Furthermore, the reference genes and their variation described here for the onion represent essential tools for molecular breeding and gene cloning in Allium spp. © The Author 2014. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.

  19. Visualising associations between paired ‘omics’ data sets

    PubMed Central

    2012-01-01

    Background Each omics platform is now able to generate a large amount of data. Genomics, proteomics, metabolomics and interactomics data are compiled at an ever-increasing pace and now form a core part of the fundamental systems biology framework. Recently, several integrative approaches have been proposed to extract meaningful information. However, these approaches lack visualisation outputs to fully unravel the complex associations between different biological entities. Results The multivariate statistical approaches ‘regularized Canonical Correlation Analysis’ and ‘sparse Partial Least Squares regression’ were recently developed to integrate two types of high-dimensional ‘omics’ data and to select relevant information. Using the results of these methods, we propose to revisit a few graphical outputs to better understand the relationships between two ‘omics’ data sets and to better visualise the correlation structure between the different biological entities. These graphical outputs include Correlation Circle plots, Relevance Networks and Clustered Image Maps. We demonstrate the usefulness of such graphical outputs on several biological data sets and further assess their biological relevance using gene ontology analysis. Conclusions Such graphical outputs are undoubtedly useful to aid the interpretation of these promising integrative analysis tools and will certainly help in addressing fundamental biological questions and understanding systems as a whole. Availability The graphical tools described in this paper are implemented in the freely available R package mixOmics and in its associated web application. PMID:23148523
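
    As a rough Python analogue (not the mixOmics R package itself), the sketch below computes the quantity underlying a Correlation Circle plot: the correlation of each variable with the first two canonical variates obtained from two synthetic ‘omics’ blocks.

```python
# Python analogue of the quantity behind a correlation-circle plot: correlations
# of each variable with the first two canonical variates of two synthetic blocks.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))                     # e.g. an expression block
Y = 0.5 * X[:, :4] + rng.normal(size=(50, 4))     # e.g. a metabolite block partly driven by X

cca = CCA(n_components=2).fit(X, Y)
Xc, Yc = cca.transform(X, Y)                      # sample scores on the canonical variates

# Coordinates for a correlation circle: correlation of each X variable with each variate.
coords = np.array([[np.corrcoef(X[:, j], Xc[:, k])[0, 1] for k in range(2)]
                   for j in range(X.shape[1])])
print(coords.round(2))
```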

  20. Simulating Humans as Integral Parts of Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Bruins, Anthony C.; Rice, Robert; Nguyen, Lac; Nguyen, Heidi; Saito, Tim; Russell, Elaine

    2006-01-01

    The Collaborative-Virtual Environment Simulation Tool (C-VEST) software was developed for use in a NASA project entitled "3-D Interactive Digital Virtual Human." The project is oriented toward the use of a comprehensive suite of advanced software tools in computational simulations for the purposes of human-centered design of spacecraft missions and of the spacecraft, space suits, and other equipment to be used on the missions. The C-VEST software affords an unprecedented suite of capabilities for three-dimensional virtual-environment simulations with plug-in interfaces for physiological data, haptic interfaces, plug-and-play software, realtime control, and/or playback control. Mathematical models of the mechanics of the human body and of the aforementioned equipment are implemented in software and integrated to simulate forces exerted on and by astronauts as they work. The computational results can then support the iterative processes of design, building, and testing in applied systems engineering and integration. The results of the simulations provide guidance for devising measures to counteract effects of microgravity on the human body and for the rapid development of virtual (that is, simulated) prototypes of advanced space suits, cockpits, and robots to enhance the productivity, comfort, and safety of astronauts. The unique ability to implement human-in-the-loop immersion also makes the C-VEST software potentially valuable for use in commercial and academic settings beyond the original space-mission setting.

  1. Design of Mobile Health Tools to Promote Goal Achievement in Self-Management Tasks

    PubMed Central

    Henderson, Geoffrey; Parmanto, Bambang

    2017-01-01

    Background Goal-setting within rehabilitation is a common practice ultimately geared toward helping patients make functional progress. Objective The purposes of this study were to (1) qualitatively analyze data from a wellness program for patients with spina bifida (SB) and spinal cord injury (SCI) in order to generate software requirements for a goal-setting module to support their complex goal-setting routines, (2) design a prototype of a goal-setting module within an existing mobile health (mHealth) system, and (3) identify what educational content might be necessary to integrate into the system. Methods A total of 750 goals were analyzed from patients with SB and SCI enrolled in a wellness program. These goals were qualitatively analyzed in order to operationalize a set of software requirements for an mHealth goal-setting module and identify important educational content. Results Those of male sex (P=.02) and with SCI diagnosis (P<.001) were more likely to achieve goals than females or those with SB. Temporality (P<.001) and type (P<.001) of goal were associated with likelihood that the goal would be achieved. Nearly all (210/213; 98.6%) of the fact-finding goals were achieved. There was no significant difference in achievement based on goal theme. Checklists, data tracking, and fact-finding tools were identified as three functionalities that could support goal-setting and achievement in an mHealth system. Based on the qualitative analysis, a list of software requirements for a goal-setting module was generated, and a prototype was developed. Targets for educational content were also generated. Conclusions Innovative mHealth tools can be developed to support commonly set goals by individuals with disabilities. PMID:28739558

  2. Variant Review with the Integrative Genomics Viewer.

    PubMed

    Robinson, James T; Thorvaldsdóttir, Helga; Wenger, Aaron M; Zehir, Ahmet; Mesirov, Jill P

    2017-11-01

    Manual review of aligned reads for confirmation and interpretation of variant calls is an important step in many variant calling pipelines for next-generation sequencing (NGS) data. Visual inspection can greatly increase the confidence in calls, reduce the risk of false positives, and help characterize complex events. The Integrative Genomics Viewer (IGV) was one of the first tools to provide NGS data visualization, and it currently provides a rich set of tools for inspection, validation, and interpretation of NGS datasets, as well as other types of genomic data. Here, we present a short overview of IGV's variant review features for both single-nucleotide variants and structural variants, with examples from both cancer and germline datasets. IGV is freely available at https://www.igv.org. Cancer Res; 77(21); e31-34. ©2017 American Association for Cancer Research.

  3. A test of the validity of the motivational interviewing treatment integrity code.

    PubMed

    Forsberg, Lars; Berman, Anne H; Kallmén, Håkan; Hermansson, Ulric; Helgason, Asgeir R

    2008-01-01

    To evaluate the Swedish version of the Motivational Interviewing Treatment Code (MITI), MITI coding was applied to tape-recorded counseling sessions. Construct validity was assessed using factor analysis on 120 MITI-coded sessions. Discriminant validity was assessed by comparing MITI coding of motivational interviewing (MI) sessions with information- and advice-giving sessions as well as by comparing MI-trained practitioners with untrained practitioners. A principal-axis factoring analysis yielded some evidence for MITI construct validity. MITI differentiated between practitioners with different levels of MI training as well as between MI practitioners and advice-giving counselors, thus supporting discriminant validity. MITI may be used as a training tool together with supervision to confirm and enhance MI practice in clinical settings. MITI can also serve as a tool for evaluating MI integrity in clinical research.

  4. How can knowledge discovery methods uncover spatio-temporal patterns in environmental data?

    NASA Astrophysics Data System (ADS)

    Wachowicz, Monica

    2000-04-01

    This paper proposes the integration of KDD, GVis and STDB as a long-term strategy, which will allow users to apply knowledge discovery methods for uncovering spatio-temporal patterns in environmental data. The main goal is to combine innovative techniques and associated tools for exploring very large environmental data sets in order to arrive at valid, novel, potentially useful, and ultimately understandable spatio-temporal patterns. The GeoInsight approach is described using the principles and key developments in the research domains of KDD, GVis, and STDB. The GeoInsight approach aims at the integration of these research domains in order to provide tools for performing information retrieval, exploration, analysis, and visualization. The result is a knowledge-based design, which involves visual thinking (perceptual-cognitive process) and automated information processing (computer-analytical process).

  5. Overview of integrative tools and methods in assessing ecological integrity in estuarine and coastal systems worldwide.

    PubMed

    Borja, Angel; Bricker, Suzanne B; Dauer, Daniel M; Demetriades, Nicolette T; Ferreira, João G; Forbes, Anthony T; Hutchings, Pat; Jia, Xiaoping; Kenchington, Richard; Carlos Marques, João; Zhu, Changbo

    2008-09-01

    In recent years, several sets of legislation worldwide (Oceans Act in USA, Australia or Canada; Water Framework Directive or Marine Strategy in Europe, National Water Act in South Africa, etc.) have been developed in order to address ecological quality or integrity, within estuarine and coastal systems. Most such legislation seeks to define quality in an integrative way, by using several biological elements, together with physico-chemical and pollution elements. Such an approach allows assessment of ecological status at the ecosystem level ('ecosystem approach' or 'holistic approach' methodologies), rather than at species level (e.g. mussel biomonitoring or Mussel Watch) or just at chemical level (i.e. quality objectives) alone. Increasing attention has been paid to the development of tools for different physico-chemical or biological (phytoplankton, zooplankton, benthos, algae, phanerogams, fishes) elements of the ecosystems. However, few methodologies integrate all the elements into a single evaluation of a water body. The need for such integrative tools to assess ecosystem quality is very important, both from a scientific and stakeholder point of view. Politicians and managers need information from simple and pragmatic, but scientifically sound methodologies, in order to show to society the evolution of a zone (estuary, coastal area, etc.), taking into account human pressures or recovery processes. These approaches include: (i) multidisciplinarity, inherent in the teams involved in their implementation; (ii) integration of biotic and abiotic factors; (iii) accurate and validated methods in determining ecological integrity; and (iv) adequate indicators to follow the evolution of the monitored ecosystems. While some countries increasingly use the establishment of marine parks to conserve marine biodiversity and ecological integrity, there is awareness (e.g. in Australia) that conservation and management of marine ecosystems cannot be restricted to Marine Protected Areas but must include areas outside such reserves. This contribution reviews the current situation of integrative ecological assessment worldwide, by presenting several examples from each of the continents: Africa, Asia, Australia, Europe and North America.

  6. Thermographic measurements of high-speed metal cutting

    NASA Astrophysics Data System (ADS)

    Mueller, Bernhard; Renz, Ulrich

    2002-03-01

    Thermographic measurements of a high-speed cutting process have been performed with an infrared camera. To obtain images without motion blur, the integration times were reduced to a few microseconds. Since high tool wear influences the measured temperatures, a set-up has been realized which enables small cutting lengths. Only single images have been recorded, because the process is too fast to acquire a sequence of images even at the frame rate of the very fast infrared camera used. To expose the camera when the rotating tool is in the middle of the image, an experimental set-up with a light barrier and a digital delay generator with a time resolution of 1 ns has been realized. This enables very exact triggering of the camera at the desired position of the tool in the image. Since the cutting depth is between 0.1 and 0.2 mm, a high spatial resolution was also necessary, which was obtained with a special close-up lens allowing a resolution of approximately 45 microns. The experimental set-up will be described, and infrared images and evaluated temperatures of a titanium alloy and a carbon steel will be presented for cutting speeds up to 42 m/s.
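
    The integration-time requirement follows directly from the cutting speed and the optical resolution: the surface must not travel more than roughly one resolution element during the exposure. A quick check under the stated values:

```python
# Quick check of the motion-blur constraint: at the highest quoted cutting speed
# the surface should not travel more than about one resolution element (~45 µm)
# during the exposure, which indeed calls for microsecond integration times.
cutting_speed = 42.0          # m/s, maximum value quoted in the abstract
resolution = 45e-6            # m, optical resolution of the close-up lens
max_integration_time = resolution / cutting_speed
print(f"max integration time ≈ {max_integration_time * 1e6:.1f} µs")  # ≈ 1.1 µs
```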

  7. The scope of cell phones in diabetes management in developing country health care settings.

    PubMed

    Ajay, Vamadevan S; Prabhakaran, Dorairaj

    2011-05-01

    Diabetes has emerged as a major public health concern in developing nations. Health systems in most developing countries are yet to integrate effective prevention and control programs for diabetes into routine health care services. Given the inadequate human resources and underfunctioning health systems, we need novel and innovative approaches to combat diabetes in developing-country settings. In this regard, the tremendous advances in telecommunication technology, particularly cell phones, can be harnessed to improve diabetes care. Cell phones could serve as a tool for collecting information on surveillance, service delivery, evidence-based care, management, and supply systems pertaining to diabetes from primary care settings in addition to providing health messages as part of diabetes education. As a screening/diagnostic tool for diabetes, cell phones can aid the health workers in undertaking screening and diagnostic and follow-up care for diabetes in the community. Cell phones are also capable of acting as a vehicle for continuing medical education; a decision support system for evidence-based management; and a tool for patient education, self-management, and compliance. However, for widespread use, we need robust evaluations of cell phone applications in existing practices and appropriate interventions in diabetes. © 2011 Diabetes Technology Society.

  8. The Scope of Cell Phones in Diabetes Management in Developing Country Health Care Settings

    PubMed Central

    Ajay, Vamadevan S; Prabhakaran, Dorairaj

    2011-01-01

    Diabetes has emerged as a major public health concern in developing nations. Health systems in most developing countries are yet to integrate effective prevention and control programs for diabetes into routine health care services. Given the inadequate human resources and underfunctioning health systems, we need novel and innovative approaches to combat diabetes in developing-country settings. In this regard, the tremendous advances in telecommunication technology, particularly cell phones, can be harnessed to improve diabetes care. Cell phones could serve as a tool for collecting information on surveillance, service delivery, evidence-based care, management, and supply systems pertaining to diabetes from primary care settings in addition to providing health messages as part of diabetes education. As a screening/diagnostic tool for diabetes, cell phones can aid the health workers in undertaking screening and diagnostic and follow-up care for diabetes in the community. Cell phones are also capable of acting as a vehicle for continuing medical education; a decision support system for evidence-based management; and a tool for patient education, self-management, and compliance. However, for widespread use, we need robust evaluations of cell phone applications in existing practices and appropriate interventions in diabetes. PMID:21722593

  9. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study

    PubMed Central

    Brawanski, Alexander

    2017-01-01

    Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data. PMID:28255331

  10. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study.

    PubMed

    Proescholdt, Martin A; Faltermeier, Rupert; Bele, Sylvia; Brawanski, Alexander

    2017-01-01

    Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.
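
    For context, the reference standard used in both records, the pressure reactivity index (PRx), is essentially a moving Pearson correlation between time-averaged arterial blood pressure and intracranial pressure. The sketch below illustrates that computation on synthetic data; the window length is a typical choice, not necessarily the one used in this study.

```python
# Sketch of the pressure reactivity index (PRx): a moving Pearson correlation
# between time-averaged arterial blood pressure (ABP) and intracranial pressure
# (ICP). Window length and data are illustrative, not from the study.
import numpy as np

def prx(abp, icp, window=30):
    """PRx over consecutive windows of `window` averaged samples."""
    out = []
    for start in range(len(abp) - window + 1):
        out.append(np.corrcoef(abp[start:start + window], icp[start:start + window])[0, 1])
    return np.array(out)

# Synthetic example: ICP partly follows ABP, as expected with impaired autoregulation.
rng = np.random.default_rng(1)
abp = 80 + rng.normal(0, 5, 300)
icp = 15 + 0.3 * (abp - 80) + rng.normal(0, 2, 300)
print("mean PRx:", prx(abp, icp).mean().round(2))   # positive values suggest impairment
```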

  11. The Integration of Remote-Sensing Detection Techniques into the Operational Decision-Making of Marine Oil Spills

    NASA Astrophysics Data System (ADS)

    Garron, J.; Trainor, S.

    2017-12-01

    Remotely sensed data collected from satellites, airplanes and unmanned aerial systems can be used in marine oil spills to identify the overall footprint, estimate fate and transport, and identify resources at risk. Mandates for the use of best available technology exist for addressing marine oil spills under the jurisdiction of the USCG (33 CFR 155.1050), though clear pathways to familiarization with these technologies during a marine oil spill, or, more importantly, between marine oil spills, do not. Similarly, remote-sensing scientists continue to experiment with highly tuned oil detection, fate and transport techniques that can benefit decision-making during a marine oil spill response, but the process of translating these prototypical tools into operational information remains undefined, leading most researchers to describe the "potential" of these new tools in an operational setting rather than their actual use, and leading decision-makers to rely on traditional field observational methods. Arctic marine oil spills are no different in their mandates and the remote-sensing research undertaken, but they are unique in the dark, cold, remote, infrastructure-free environment in which they can occur. These conditions increase the reliance of decision-makers in an Arctic oil spill on remotely sensed data and on tools for their manipulation. In the absence of another large-scale oil spill in the US, and given the limited literature on the subject, this study was undertaken to understand how remotely sensed data and tools are currently being used in the Incident Command System of a marine oil spill, with an emphasis on Arctic implementation. Interviews, oil spill scenario/drill observations and marine oil spill after-action reports were collected and analyzed to determine the current state of remote-sensing data use for decision-making during a marine oil spill, and to define a set of recommendations for the process of integrating new remote-sensing tools and information into future oil spill responses. Using automated synthetic aperture radar analyses of oil spills in a common operational picture as a scientific case study, this presentation demonstrates how landscape-level scientific data can be integrated into Arctic planning and operational decision-making.

  12. Climate Science Centers: An "Existence Theorem" for a Federal-University Partnership to Develop Actionable and Needs-Driven Science Agendas

    NASA Astrophysics Data System (ADS)

    Moore, B., III

    2014-12-01

    Climate Science Centers: An "Existence Theorem" for a Federal-University Partnership to Develop Actionable and Needs-Driven Science Agendas. Berrien Moore III (University of Oklahoma). The South Central Climate Science Center (CSC) is one of eight regional centers established by the Department of the Interior (DoI) under Secretarial Order 3289 to address the impacts of climate change on America's water, land, and other natural and cultural resources. Under DoI leadership and funding, these CSCs will (1) provide scientific information, tools, and techniques to study the impacts of climate change; (2) synthesize and integrate climate change impact data; and (3) develop tools that DoI managers and partners can use when managing the DoI's land, water, fish and wildlife, and cultural heritage resources (emphasis added). The network of Climate Science Centers will provide decision makers with the science, tools, and information they need to address the impacts of climate variability and change on their areas of responsibility. Note from Webster: a tool is a device for doing work; it makes outcomes more realizable and more cost effective, and, in a word, better. Prior to the existence of the CSCs, the university and federal scientific world certainly contained a large "set" of scientists with considerable strength in the physical, biological, natural, and social sciences to address the complexities and interdisciplinary nature of the challenges in the areas of climate variability, change, impacts, and adaptation. However, this set of scientists was hardly an integrated community, let alone a focused team, but rather a collection of distinguished researchers, educators, and practitioners who were working with disparate though at times linked objectives, and who rarely aligned themselves formally to an overarching strategic pathway. In addition, data, models, research results, tools, and products were generally somewhat "disconnected" from the broad range of stakeholders. I should note also that NOAA's Regional Integrated Sciences and Assessments (RISA) program is an earlier "Existence Theorem" for a Federal-University Partnership to Develop Actionable and Needs-Driven Science Agendas. This contribution will discuss the important cultural shift that has flowed from Secretarial Order 3289.

  13. Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis.

    PubMed

    Suter, Esther; Oelke, Nelly D; da Silva Lima, Maria Alice Dias; Stiphout, Michelle; Janke, Robert; Witt, Regina Rigatto; Van Vliet-Brown, Cheryl; Schill, Kaela; Rostami, Mahnoush; Hepp, Shelanne; Birney, Arden; Al-Roubaiai, Fatima; Marques, Giselda Quintana

    2017-11-13

    Despite far reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, "overall integration" tools may be useful for a broad assessment of the overall state of a system. Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and context. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework.

  14. Web tools for predictive toxicology model building.

    PubMed

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry have accumulated more than 15 years of history already. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable, model for information access. The expected future convergence between cheminformatics and bioinformatics databases poses new challenges for the management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  15. Simulation Facilities and Test Beds for Galileo

    NASA Astrophysics Data System (ADS)

    Schlarmann, Bernhard Kl.; Leonard, Arian

    2002-01-01

    Galileo is the European satellite navigation system, financed by the European Space Agency (ESA) and the European Commission (EC). The Galileo System, currently in its definition phase, will offer seamless global coverage, providing state-of-the-art positioning and timing services. Galileo services will include a standard service targeted at mass-market users, an augmented integrity service providing integrity warnings when faults occur, and Public Regulated Services (ensuring continuity of service for public users). Other services are under consideration (SAR and integrated communications). Galileo will be interoperable with GPS, and will be complemented by local elements that will enhance the services for specific local users. In the frame of the Galileo definition phase, several system design and simulation facilities and test beds have been defined for the coming phases of the project and are currently under development. These are mainly the following tools: the Galileo Mission Analysis Simulator, to design the Space Segment, especially to support constellation design, deployment and replacement; the Galileo Service Volume Simulator, to analyse the global performance requirements based on a coverage analysis for different service levels and degraded modes; the Galileo System Simulation Facility, a sophisticated end-to-end simulation tool to assess the navigation performances for a complete variety of users under different operating conditions and different modes; the Galileo Signal Validation Facility, to evaluate signal and message structures for Galileo; and the Galileo System Test Bed (Version 1), to assess and refine the Orbit Determination & Time Synchronisation and Integrity algorithms through experiments relying on GPS space infrastructure. This paper presents an overview of the so-called "G-Facilities" and describes the use of the different system design tools during the project life cycle in order to design the system with respect to availability, continuity and integrity requirements. It gives more details on two of these system design tools: the Galileo Signal Validation Facility (GSVF) and the Galileo System Simulation Facility (GSSF). It describes the operational use of these facilities within the complete set of design tools, and especially the combined use of GSVF and GSSF. Finally, the paper presents examples and results obtained with these tools.

  16. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it simultaneously generates three philosophically different families of global sensitivity metrics: (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - the VARS approach), (2) variance-based total-order effects (the Sobol approach), and (3) derivative-based elementary effects (the Morris approach). VARS-TOOL also offers two novel features: the first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties; the second is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features, in conjunction with bootstrapping, enable the user to monitor the stability, robustness, and convergence of GSA with increasing sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results with sample sizes (numbers of model runs) 1-2 orders of magnitude smaller than those required by alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development, and new capabilities and features are forthcoming.
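
    VARS-TOOL itself is not sketched here, but the derivative-based family it reports (Morris-style elementary effects) can be illustrated in a few lines on the Ishigami test function; the sampling scheme and step size below are simplifications, not the toolbox's PLHS algorithm.

```python
# Not VARS-TOOL: a minimal Morris-style elementary-effects screening on the
# Ishigami test function, illustrating one of the three metric families listed.
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    return np.sin(x[0]) + a * np.sin(x[1]) ** 2 + b * x[2] ** 4 * np.sin(x[0])

def elementary_effects(model, dim, n_trajectories=50, delta=0.1, seed=0):
    rng = np.random.default_rng(seed)
    effects = [[] for _ in range(dim)]
    for _ in range(n_trajectories):
        x = rng.uniform(-np.pi, np.pi - delta * 2 * np.pi, size=dim)  # leave room for the step
        base = model(x)
        for j in range(dim):
            xp = x.copy()
            xp[j] += delta * 2 * np.pi            # step = delta times the input range
            effects[j].append((model(xp) - base) / delta)
    return [np.mean(np.abs(e)) for e in effects]   # mu* screening measure per parameter

print([round(m, 2) for m in elementary_effects(ishigami, dim=3)])
```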

  17. RGmatch: matching genomic regions to proximal genes in omics data integration.

    PubMed

    Furió-Tarí, Pedro; Conesa, Ana; Tarazona, Sonia

    2016-11-22

    The integrative analysis of multiple genomics data often requires that genome coordinates-based signals have to be associated with proximal genes. The relative location of a genomic region with respect to the gene (gene area) is important for functional data interpretation; hence algorithms that match regions to genes should be able to deliver insight into this information. In this work we review the tools that are publicly available for making region-to-gene associations. We also present a novel method, RGmatch, a flexible and easy-to-use Python tool that computes associations either at the gene, transcript, or exon level, applying a set of rules to annotate each region-gene association with the region location within the gene. RGmatch can be applied to any organism as long as genome annotation is available. Furthermore, we qualitatively and quantitatively compare RGmatch to other tools. RGmatch simplifies the association of a genomic region with its closest gene. At the same time, it is a powerful tool because the rules used to annotate these associations are very easy to modify according to the researcher's specific interests. Some important differences between RGmatch and other similar tools already in existence are RGmatch's flexibility, its wide range of user options, compatibility with any annotatable organism, and its comprehensive and user-friendly output.
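
    The basic idea of associating a region with its closest gene can be sketched as follows; RGmatch itself applies a much richer rule set (gene area, transcript and exon level), so this is only a minimal illustration with made-up coordinates.

```python
# Minimal sketch of region-to-gene association by distance to the transcription
# start site (TSS); coordinates and gene names are invented for illustration.
def closest_gene(region_start, region_end, genes):
    """genes: list of (name, tss). Returns the gene whose TSS is nearest the region midpoint."""
    mid = (region_start + region_end) / 2
    return min(genes, key=lambda g: abs(g[1] - mid))

genes = [("geneA", 1_000), ("geneB", 25_000), ("geneC", 60_000)]
print(closest_gene(23_500, 24_500, genes))   # -> ('geneB', 25000)
```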

  18. SPARTA: Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis.

    PubMed

    Johnson, Benjamin K; Scholz, Matthew B; Teal, Tracy K; Abramovitch, Robert B

    2016-02-04

    Many tools exist in the analysis of bacterial RNA sequencing (RNA-seq) transcriptional profiling experiments to identify differentially expressed genes between experimental conditions. Generally, the workflow includes quality control of reads, mapping to a reference, counting transcript abundance, and statistical tests for differentially expressed genes. In spite of the numerous tools developed for each component of an RNA-seq analysis workflow, easy-to-use bacterially oriented workflow applications to combine multiple tools and automate the process are lacking. With many tools to choose from for each step, the task of identifying a specific tool, adapting the input/output options to the specific use-case, and integrating the tools into a coherent analysis pipeline is not a trivial endeavor, particularly for microbiologists with limited bioinformatics experience. To make bacterial RNA-seq data analysis more accessible, we developed a Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis (SPARTA). SPARTA is a reference-based bacterial RNA-seq analysis workflow application for single-end Illumina reads. SPARTA is turnkey software that simplifies the process of analyzing RNA-seq data sets, making bacterial RNA-seq analysis a routine process that can be undertaken on a personal computer or in the classroom. The easy-to-install, complete workflow processes whole transcriptome shotgun sequencing data files by trimming reads and removing adapters, mapping reads to a reference, counting gene features, calculating differential gene expression, and, importantly, checking for potential batch effects within the data set. SPARTA outputs quality analysis reports, gene feature counts and differential gene expression tables and scatterplots. SPARTA provides an easy-to-use bacterial RNA-seq transcriptional profiling workflow to identify differentially expressed genes between experimental conditions. This software will enable microbiologists with limited bioinformatics experience to analyze their data and integrate next generation sequencing (NGS) technologies into the classroom. The SPARTA software and tutorial are available at sparta.readthedocs.org.

  19. An evidence-based approach to the prevention and initial management of skin tears within the aged community setting: a best practice implementation project.

    PubMed

    Beechey, Rebekah; Priest, Laura; Peters, Micah; Moloney, Clint

    2015-06-12

    Maintaining skin integrity in a community setting is an ongoing issue, as research suggests that the prevalence of skin tears within the community is greater than that in an institutional setting. While skin tear prevention and management principles in these settings are similar to those in an acute care setting, consideration of the environmental and psychological factors of the client is pivotal to prevention in a community setting. Evidence suggests that home environment assessment, education for clients and caregivers, and being proactive in improving activities of daily living in a community setting can significantly reduce the risk of sustaining skin tears. The aim of this implementation project was to assess and review current skin tear prevention and management practices within the community setting and, from this, to implement an evidence-based approach to the education of clients and staff on the prevention of skin tears. As well, the project aims to implement evidence-based principles to guide clinical practice in relation to the initial management of skin tears, and to determine strategies to overcome barriers and non-compliance. The project utilized the Joanna Briggs Institute Practical Application of Clinical Evidence System audit tool for promoting changes in the community health setting. The implementation of this particular project is based in a region within Anglicare Southern Queensland. A small team was established and a baseline audit carried out. From this, multiple strategies were implemented to address non-compliance, including education resources for clients and caregivers, staff education sessions, and skin integrity kits to enable staff members to tend to skin tears; a follow-up audit was then undertaken. Baseline audit results varied from good to low compliance. From this, the need for staff and client education was highlighted. There were many improvements in the audit criteria following client and staff education sessions and staff self-directed learning packages. Future strategies required to sustain improvements in practice and make further progress are to introduce a readily available Anglicare Skin Integrity Assessment Tool to the nursing staff for undertaking new client admissions over 65 years, and to provide ongoing education to staff members, clients and caregivers in order to reduce the prevalence of skin tears in the community setting. This implementation project demonstrated the importance of education of personal care workers, clients and their caregivers for the prevention of skin tears in the community setting. This in turn created autonomy and empowered clients to take control of their health. The Joanna Briggs Institute.

  20. Acute care patient portals: a qualitative study of stakeholder perspectives on current practices.

    PubMed

    Collins, Sarah A; Rozenblum, Ronen; Leung, Wai Yin; Morrison, Constance Rc; Stade, Diana L; McNally, Kelly; Bourie, Patricia Q; Massaro, Anthony; Bokser, Seth; Dwyer, Cindy; Greysen, Ryan S; Agarwal, Priyanka; Thornton, Kevin; Dalal, Anuj K

    2017-04-01

    To describe current practices and stakeholder perspectives of patient portals in the acute care setting. We aimed to: (1) identify key features, (2) recognize challenges, (3) understand current practices for design, configuration, and use, and (4) propose new directions for investigation and innovation. Mixed methods including surveys, interviews, focus groups, and site visits with stakeholders at leading academic medical centers. Thematic analyses to inform development of an explanatory model and recommendations. Site surveys were administered to 5 institutions. Thirty interviews/focus groups were conducted at 4 site visits that included a total of 84 participants. Ten themes regarding content and functionality, engagement and culture, and access and security were identified, from which an explanatory model of current practices was developed. Key features included clinical data, messaging, glossary, patient education, patient personalization and family engagement tools, and tiered displays. Four actionable recommendations were identified by group consensus. Design, development, and implementation of acute care patient portals should consider: (1) providing a single integrated experience across care settings, (2) humanizing the patient-clinician relationship via personalization tools, (3) providing equitable access, and (4) creating a clear organizational mission and strategy to achieve outcomes of interest. Portals should provide a single integrated experience across the inpatient and ambulatory settings. Core functionality includes tools that facilitate communication, personalize the patient, and deliver education to advance safe, coordinated, and dignified patient-centered care. Our findings can be used to inform a "road map" for future work related to acute care patient portals. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  1. GeneTools--application for functional annotation and statistical hypothesis testing.

    PubMed

    Beisvag, Vidar; Jünge, Frode K R; Bergum, Hallgeir; Jølsum, Lars; Lydersen, Stian; Günther, Clara-Cecilie; Ramampiaro, Heri; Langaas, Mette; Sandvik, Arne K; Laegreid, Astrid

    2006-10-24

    Modern biology has shifted from "one gene" approaches to methods for genomic-scale analysis like microarray technology, which allow simultaneous measurement of thousands of genes. This has created a need for tools facilitating interpretation of biological data in "batch" mode. However, such tools often leave the investigator with large volumes of apparently unorganized information. To meet this interpretation challenge, gene-set, or cluster, testing has become a popular analytical tool. Many gene-set testing methods and software packages are now available, most of which use a variety of statistical tests to assess the genes in a set for biological information. However, the field is still evolving, and there is a great need for "integrated" solutions. GeneTools is a web service providing access to a database that brings together information from a broad range of resources. The annotation data are updated weekly, guaranteeing that users get the most recently available data. Data submitted by the user are stored in the database, where they can easily be updated, shared between users and exported in various formats. GeneTools provides three different tools: i) the NMC Annotation Tool, which offers annotations from several databases like UniGene, Entrez Gene, SwissProt and GeneOntology, in both single- and batch-search mode; ii) the GO Annotator Tool, where users can add new gene ontology (GO) annotations to genes of interest; these user-defined GO annotations can be used in further analysis or exported for public distribution; and iii) eGOn, a tool for visualization and statistical hypothesis testing of GO category representation. As the first GO tool to do so, eGOn supports hypothesis testing for three different situations (master-target situation, mutually exclusive target-target situation and intersecting target-target situation). An important additional function is an evidence-code filter that allows users to select the GO annotations for the analysis. GeneTools is the first "all in one" annotation tool, providing users with rapid extraction of highly relevant gene annotation data for, e.g., thousands of genes or clones at once. It allows a user to define and archive new GO annotations, and it supports hypothesis testing related to GO category representations. GeneTools is freely available through www.genetools.no
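
    The statistical test typically behind GO-category hypothesis testing in the master-target situation is a 2x2 enrichment test; the sketch below shows the standard Fisher's exact formulation with invented counts, and is not necessarily the exact procedure implemented in eGOn.

```python
# Standard 2x2 enrichment test behind GO-category hypothesis testing: is a GO
# term over-represented in a target list relative to the master list?
# The counts are invented for illustration.
from scipy.stats import fisher_exact

target_in_term, target_not_in_term = 15, 85        # 100 target genes
rest_in_term, rest_not_in_term = 200, 9700         # remaining master-list genes

odds_ratio, p_value = fisher_exact(
    [[target_in_term, target_not_in_term],
     [rest_in_term, rest_not_in_term]],
    alternative="greater",
)
print(f"odds ratio={odds_ratio:.2f}, p={p_value:.2e}")
```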

  2. Multidisciplinary Optimization for Aerospace Using Genetic Optimization

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Hahn, Edward E.; Herrera, Claudia Y.

    2007-01-01

    In support of the ARMD guidelines, NASA's Dryden Flight Research Center is developing a multidisciplinary design and optimization tool. This tool will leverage existing tools and practices, and allow the easy integration and adoption of new state-of-the-art software. Optimization has made its way into many mainstream applications. For example, NASTRAN(TradeMark) has its solution sequence 200 for design optimization, and MATLAB(TradeMark) has an Optimization Toolbox. Other packages, such as the ZAERO(TradeMark) aeroelastic panel code and the CFL3D(TradeMark) Navier-Stokes solver, have no built-in optimizer. The goal of the tool development is to generate a central executive capable of using disparate software packages in a cross-platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. A provided figure (Figure 1) shows a typical set of tools and their relation to the central executive. Optimization can take place within each individual tool, in a loop between the executive and the tool, or both.
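
    A minimal sketch of the genetic-optimization loop such a central executive might drive is shown below; the "analysis tool" is a stand-in quadratic objective rather than a real call to an external package such as NASTRAN or ZAERO, and all settings are illustrative.

```python
# Toy sketch of a genetic-optimization loop a central executive could drive;
# external_analysis stands in for a call out to an external analysis tool.
import numpy as np

rng = np.random.default_rng(0)

def external_analysis(design):             # placeholder objective: lower is better
    return np.sum((design - 3.0) ** 2)

pop = rng.uniform(-10, 10, size=(40, 5))   # 40 candidate designs, 5 design variables
for generation in range(100):
    fitness = np.array([external_analysis(d) for d in pop])
    parents = pop[np.argsort(fitness)[:20]]                      # keep the best half
    children = parents[rng.integers(0, 20, 40)] + rng.normal(0, 0.5, (40, 5))  # mutate
    children[:20] = parents                                      # preserve elites
    pop = children

best = pop[np.argmin([external_analysis(d) for d in pop])]
print("best design:", best.round(2))        # converges toward the optimum at 3.0
```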

  3. Practical Tips and Tools--Using Stories from the Field for Professional Development: "How-To" Guidelines from Reading to Reflection and Practice Integration

    ERIC Educational Resources Information Center

    Talmi, Ayelet

    2013-01-01

    Case studies provide numerous opportunities for professional development and can be particularly helpful in transdiciplinary training. This article offers suggestions for how to use the "Zero to Three" Journal's "Stories From the Field" series of articles across a variety of settings and roles such as clinical practice, program…

  4. Integrated workflows for spiking neuronal network simulations

    PubMed Central

    Antolík, Ján; Davison, Andrew P.

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID:24368902

  5. Integrated workflows for spiking neuronal network simulations.

    PubMed

    Antolík, Ján; Davison, Andrew P

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages.

  6. A Simple Tool for the Design and Analysis of Multiple-Reflector Antennas in a Multi-Disciplinary Environment

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.; Borgioli, Andrea

    2000-01-01

    The process of designing and analyzing a multiple-reflector system has traditionally been time-intensive, requiring large amounts of both computational and human time. At many frequencies, a discrete approximation of the radiation integral may be used to model the system. The code which implements this physical optics (PO) algorithm was developed at the Jet Propulsion Laboratory. It analyzes systems of antennas in pairs, and for each pair, the analysis can be computationally time-consuming. Additionally, the antennas must be described using a local coordinate system for each antenna, which makes it difficult to integrate the design into a multi-disciplinary framework in which there is traditionally one global coordinate system, even before considering deforming the antenna as prescribed by external structural and/or thermal factors. Finally, setting up the code to correctly analyze all the antenna pairs in the system can take a fair amount of time, and introduces possible human error. The use of parallel computing to reduce the computational time required for the analysis of a given pair of antennas has been previously discussed. This paper focuses on the other problems mentioned above. It will present a methodology and examples of use of an automated tool that performs the analysis of a complete multiple-reflector system in an integrated multi-disciplinary environment (including CAD modeling, and structural and thermal analysis) at the click of a button. This tool, named MOD Tool (Millimeter-wave Optics Design Tool), has been designed and implemented as a distributed tool, with a client that runs almost identically on Unix, Mac, and Windows platforms, and a server that runs primarily on a Unix workstation and can interact with parallel supercomputers with simple instruction from the user interacting with the client.

  7. An integrative approach to ortholog prediction for disease-focused and other functional studies.

    PubMed

    Hu, Yanhui; Flockhart, Ian; Vinayagam, Arunachalam; Bergwitz, Clemens; Berger, Bonnie; Perrimon, Norbert; Mohr, Stephanie E

    2011-08-31

    Mapping of orthologous genes among species serves an important role in functional genomics by allowing researchers to develop hypotheses about gene function in one species based on what is known about the functions of orthologs in other species. Several tools for predicting orthologous gene relationships are available. However, these tools can give different results and identification of predicted orthologs is not always straightforward. We report a simple but effective tool, the Drosophila RNAi Screening Center Integrative Ortholog Prediction Tool (DIOPT; http://www.flyrnai.org/diopt), for rapid identification of orthologs. DIOPT integrates existing approaches, facilitating rapid identification of orthologs among human, mouse, zebrafish, C. elegans, Drosophila, and S. cerevisiae. As compared to individual tools, DIOPT shows increased sensitivity with only a modest decrease in specificity. Moreover, the flexibility built into the DIOPT graphical user interface allows researchers with different goals to appropriately 'cast a wide net' or limit results to highest confidence predictions. DIOPT also displays protein and domain alignments, including percent amino acid identity, for predicted ortholog pairs. This helps users identify the most appropriate matches among multiple possible orthologs. To facilitate using model organisms for functional analysis of human disease-associated genes, we used DIOPT to predict high-confidence orthologs of disease genes in Online Mendelian Inheritance in Man (OMIM) and genes in genome-wide association study (GWAS) data sets. The results are accessible through the DIOPT diseases and traits query tool (DIOPT-DIST; http://www.flyrnai.org/diopt-dist). DIOPT and DIOPT-DIST are useful resources for researchers working with model organisms, especially those who are interested in exploiting model organisms such as Drosophila to study the functions of human disease genes.
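
    The integrative idea behind DIOPT can be illustrated with a toy sketch that simply counts how many independent prediction tools support each candidate ortholog pair. The tool names and identifier pairs below are placeholders, and DIOPT's real scoring is more elaborate than a simple vote count.

        # Toy consensus scoring across several ortholog-prediction tools (illustrative only).
        from collections import Counter

        predictions = {
            "tool_A": [("FBgn0003731", "EGFR"), ("FBgn0000490", "BMP2")],
            "tool_B": [("FBgn0003731", "EGFR")],
            "tool_C": [("FBgn0003731", "EGFR"), ("FBgn0000490", "BMP4")],
        }

        support = Counter(pair for pairs in predictions.values() for pair in pairs)
        for pair, votes in support.most_common():
            print(pair, "supported by %d of %d tools" % (votes, len(predictions)))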

  8. Genetic engineering in Actinoplanes sp. SE50/110 - development of an intergeneric conjugation system for the introduction of actinophage-based integrative vectors.

    PubMed

    Gren, Tetiana; Ortseifen, Vera; Wibberg, Daniel; Schneiker-Bekel, Susanne; Bednarz, Hanna; Niehaus, Karsten; Zemke, Till; Persicke, Marcus; Pühler, Alfred; Kalinowski, Jörn

    2016-08-20

    The α-glucosidase inhibitor acarbose is used for treatment of diabetes mellitus type II, and is manufactured industrially with overproducing derivatives of Actinoplanes sp. SE50/110, reportedly obtained by conventional mutagenesis. Despite its high industrial significance, only limited information exists regarding acarbose metabolism and the function and regulation of these processes, due to the absence of proper genetic engineering methods and tools developed for this strain. Here, a basic toolkit for genetic engineering of Actinoplanes sp. SE50/110 was developed, comprising a standardized protocol for DNA transfer through Escherichia coli-Actinoplanes intergeneric conjugation, and applied for the transfer of ϕC31, ϕBT1 and VWB actinophage-based integrative vectors. Integration sites, occurring once per genome for all vectors, were sequenced and characterized for the first time in Actinoplanes sp. SE50/110. Notably, in the case of the ϕC31-based vector pSET152, the integration site is highly conserved, while for the ϕBT1- and VWB-based vectors pRT801 and pSOK804, respectively, no sequence similarities to those in other bacteria were detected. The studied plasmids were proven to be stable and neutral with respect to strain morphology and acarbose production, enabling future use for genetic manipulations of Actinoplanes sp. SE50/110. To further broaden the spectrum of available tools, a GUS reporter system, based on a pSET152-derived vector, was also established in Actinoplanes sp. SE50/110. Copyright © 2016 Elsevier B.V. All rights reserved.


  9. Manipulator interactive design with interconnected flexible elements

    NASA Technical Reports Server (NTRS)

    Singh, R. P.; Likins, P. W.

    1983-01-01

    This paper describes the development of an analysis tool for the interactive design of control systems for manipulators and similar electro-mechanical systems amenable to representation as structures in a topological chain. The chain consists of a series of elastic bodies subject to small deformations and arbitrary displacements. The bodies are connected by hinges which permit kinematic constraints, control, or relative motion with six degrees of freedom. The equations of motion for the chain configuration are derived via Kane's method, extended for application to interconnected flexible bodies with time-varying boundary conditions. A corresponding set of modal coordinates has been selected. The motion equations are embedded within a simulation that transforms the vector-dyadic equations into scalar form for numerical integration. The simulation also includes a linear, time-invariant controller specified in transfer function format and a set of sensors and actuators that interface between the structure and controller. The simulation is driven by an interactive set-up program, resulting in an easy-to-use analysis tool.

  10. TranscriptomeBrowser 3.0: introducing a new compendium of molecular interactions and a new visualization tool for the study of gene regulatory networks.

    PubMed

    Lepoivre, Cyrille; Bergon, Aurélie; Lopez, Fabrice; Perumal, Narayanan B; Nguyen, Catherine; Imbert, Jean; Puthier, Denis

    2012-01-31

    Deciphering gene regulatory networks by in silico approaches is a crucial step in the study of the molecular perturbations that occur in diseases. The development of regulatory maps is a tedious process requiring the comprehensive integration of various evidence scattered over biological databases. Thus, the research community would greatly benefit from having a unified database storing known and predicted molecular interactions. Furthermore, given the intrinsic complexity of the data, the development of new tools offering integrated and meaningful visualizations of molecular interactions is necessary to help users draw new hypotheses without being overwhelmed by the density of the subsequent graph. We extend the previously developed TranscriptomeBrowser database with a set of tables containing 1,594,978 human and mouse molecular interactions. The database includes: (i) predicted regulatory interactions (computed by scanning vertebrate alignments with a set of 1,213 position weight matrices), (ii) potential regulatory interactions inferred from systematic analysis of ChIP-seq experiments, (iii) regulatory interactions curated from the literature, (iv) predicted post-transcriptional regulation by micro-RNA, (v) protein kinase-substrate interactions and (vi) physical protein-protein interactions. In order to easily retrieve and efficiently analyze these interactions, we developed InteractomeBrowser, a graph-based knowledge browser that comes as a plug-in for TranscriptomeBrowser. The first objective of InteractomeBrowser is to provide a user-friendly tool to get new insight into any gene list by providing a context-specific display of putative regulatory and physical interactions. To achieve this, InteractomeBrowser relies on a "cell compartments-based layout" that makes use of a subset of the Gene Ontology to map gene products onto relevant cell compartments. This layout is particularly powerful for visual integration of heterogeneous biological information and is a productive avenue in generating new hypotheses. The second objective of InteractomeBrowser is to fill the gap between interaction databases and dynamic modeling. It is thus compatible with the network analysis software Cytoscape and with the Gene Interaction Network simulation software (GINsim). We provide examples illustrating the benefits of this visualization tool for large gene set analysis related to thymocyte differentiation. The InteractomeBrowser plugin is a powerful tool to get quick access to a knowledge database that includes both predicted and validated molecular interactions. InteractomeBrowser is available through the TranscriptomeBrowser framework and can be found at: http://tagc.univ-mrs.fr/tbrowser/. Our database is updated on a regular basis.

  11. GEMINI: Integrative Exploration of Genetic Variation and Genome Annotations

    PubMed Central

    Paila, Umadevi; Chapman, Brad A.; Kirchner, Rory; Quinlan, Aaron R.

    2013-01-01

    Modern DNA sequencing technologies enable geneticists to rapidly identify genetic variation among many human genomes. However, isolating the minority of variants underlying disease remains an important, yet formidable challenge for medical genetics. We have developed GEMINI (GEnome MINIng), a flexible software package for exploring all forms of human genetic variation. Unlike existing tools, GEMINI integrates genetic variation with a diverse and adaptable set of genome annotations (e.g., dbSNP, ENCODE, UCSC, ClinVar, KEGG) into a unified database to facilitate interpretation and data exploration. Whereas other methods provide an inflexible set of variant filters or prioritization methods, GEMINI allows researchers to compose complex queries based on sample genotypes, inheritance patterns, and both pre-installed and custom genome annotations. GEMINI also provides methods for ad hoc queries and data exploration, a simple programming interface for custom analyses that leverage the underlying database, and both command line and graphical tools for common analyses. We demonstrate GEMINI's utility for exploring variation in personal genomes and family based genetic studies, and illustrate its ability to scale to studies involving thousands of human samples. GEMINI is designed for reproducibility and flexibility and our goal is to provide researchers with a standard framework for medical genomics. PMID:23874191
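
    GEMINI stores variants and annotations in a relational database and lets researchers compose SQL-like queries over them. The sketch below illustrates that style of query with Python's built-in sqlite3 module on a made-up table; the table layout and column names are assumptions for illustration, not GEMINI's documented schema or command-line interface.

        # Illustrative only: assumed table and column names, not GEMINI's real schema.
        import sqlite3

        conn = sqlite3.connect(":memory:")        # stands in for a GEMINI-style database
        conn.execute("""CREATE TABLE variants
                        (chrom TEXT, start INTEGER, gene TEXT, impact TEXT,
                         aaf REAL, in_dbsnp INTEGER)""")
        conn.execute("INSERT INTO variants VALUES "
                     "('chr17', 41245466, 'BRCA1', 'stop_gained', 0.0004, 0)")

        # Compose a query mixing annotation filters, in the spirit of the abstract.
        rows = conn.execute("""SELECT chrom, start, gene, impact FROM variants
                               WHERE impact IN ('stop_gained', 'frameshift_variant')
                                 AND aaf < 0.01 AND in_dbsnp = 0""").fetchall()
        print(rows)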

  12. IRIS Earthquake Browser with Integration to the GEON IDV for 3-D Visualization of Hypocenters.

    NASA Astrophysics Data System (ADS)

    Weertman, B. R.

    2007-12-01

    We present a new-generation, web-based earthquake query tool: the IRIS Earthquake Browser (IEB). The IEB combines the DMC's large set of earthquake catalogs (provided by USGS/NEIC, ISC and the ANF) with the popular Google Maps web interface. With the IEB, you can quickly and easily find earthquakes in any region of the globe. Using Google's detailed satellite images, earthquakes can be easily co-located with natural geographic features such as volcanoes as well as man-made features such as commercial mines. A set of controls allows earthquakes to be filtered by time, magnitude, and depth range as well as catalog name, contributor name and magnitude type. Displayed events can be easily exported in NetCDF format into the GEON Integrated Data Viewer (IDV), where hypocenters may be visualized in three dimensions. Looking "under the hood", the IEB is based on AJAX technology and utilizes REST-style web services hosted at the IRIS DMC. The IEB is part of a broader effort at the DMC aimed at making our data holdings available via web services. The IEB is useful both educationally and as a research tool.
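
    The REST-style services mentioned above can also be exercised directly from a script. The sketch below assumes an FDSN-style event web service of the kind hosted at the IRIS DMC; the endpoint and parameter names should be checked against current IRIS documentation before use, and network access is required to run it.

        # Assumed FDSN-style endpoint and parameters; verify against IRIS DMC docs.
        import urllib.parse
        import urllib.request

        params = {"starttime": "2007-01-01", "endtime": "2007-12-31",
                  "minmagnitude": 6.0, "maxdepth": 100, "format": "text"}
        url = ("http://service.iris.edu/fdsnws/event/1/query?"
               + urllib.parse.urlencode(params))
        with urllib.request.urlopen(url) as response:
            for line in response.read().decode().splitlines()[:5]:
                print(line)                       # first few catalog rows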

  13. DaGO-Fun: tool for Gene Ontology-based functional analysis using term information content measures.

    PubMed

    Mazandu, Gaston K; Mulder, Nicola J

    2013-09-25

    The use of Gene Ontology (GO) data in protein analyses has largely contributed to the improved outcomes of these analyses. Several GO semantic similarity measures have been proposed in recent years and provide tools that allow the integration of biological knowledge embedded in the GO structure into different biological analyses. There is a need for a unified tool that provides the scientific community with the opportunity to explore these different GO similarity measure approaches and their biological applications. We have developed DaGO-Fun, an online tool available at http://web.cbio.uct.ac.za/ITGOM, which incorporates many different GO similarity measures for exploring, analyzing and comparing GO terms and proteins within the context of GO. It uses GO data and UniProt proteins with their GO annotations as provided by the Gene Ontology Annotation (GOA) project to precompute GO term information content (IC), enabling rapid response to user queries. The DaGO-Fun online tool presents the advantage of integrating all the relevant IC-based GO similarity measures, including topology- and annotation-based approaches, to facilitate effective exploration of these measures, thus enabling users to choose the most relevant approach for their application. Furthermore, this tool includes several biological applications related to GO semantic similarity scores, including the retrieval of genes based on their GO annotations, the clustering of functionally related genes within a set, and term enrichment analysis.
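
    As a reminder of what an IC-based measure computes, here is a minimal sketch of Resnik similarity: IC(t) = -log p(t), where p(t) is the annotation frequency of term t, and the similarity of two terms is the IC of their most informative common ancestor. The toy ontology below is invented; DaGO-Fun implements this and many other measures over the real GO graph.

        # Toy ontology: annotation counts and ancestor sets are invented.
        import math

        annotation_counts = {"GO:root": 100, "GO:metabolism": 40,
                             "GO:glycolysis": 5, "GO:gluconeogenesis": 4}
        total = annotation_counts["GO:root"]
        ancestors = {
            "GO:glycolysis": {"GO:glycolysis", "GO:metabolism", "GO:root"},
            "GO:gluconeogenesis": {"GO:gluconeogenesis", "GO:metabolism", "GO:root"},
        }

        def ic(term):
            # Information content: the rarer the annotation, the higher the IC.
            return -math.log(annotation_counts[term] / total)

        def resnik(t1, t2):
            # IC of the most informative common ancestor of the two terms.
            return max(ic(t) for t in ancestors[t1] & ancestors[t2])

        print(round(resnik("GO:glycolysis", "GO:gluconeogenesis"), 3))  # IC of GO:metabolism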

  14. Network Computing Infrastructure to Share Tools and Data in Global Nuclear Energy Partnership

    NASA Astrophysics Data System (ADS)

    Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya

    CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype system of a network computing infrastructure for sharing tools and data to support the U.S. and Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues to apply our information process infrastructure, which are accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For the accessibility issue, we adopted SSL-VPN (Security Socket Layer-Virtual Private Network) technology for the access beyond firewalls. For the security issue, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism to strengthen the security. Also, we set fine access control policy to shared tools and data and used shared key based encryption method to protect tools and data against leakage to third parties. For the usability issue, we chose Web browsers as user interface and developed Web application to provide functions to support sharing tools and data. By using WebDAV (Web-based Distributed Authoring and Versioning) function, users can manipulate shared tools and data through the Windows-like folder environment. We implemented the prototype system in Grid infrastructure for atomic energy research: AEGIS (Atomic Energy Grid Infrastructure) developed by CCSE/JAEA. The prototype system was applied for the trial use in the first period of GNEP.

  15. A review of Integrated Vehicle Health Management tools for legacy platforms: Challenges and opportunities

    NASA Astrophysics Data System (ADS)

    Esperon-Miguez, Manuel; John, Philip; Jennions, Ian K.

    2013-01-01

    Integrated Vehicle Health Management (IVHM) comprises a set of tools, technologies and techniques for automated detection, diagnosis and prognosis of faults in order to support platforms more efficiently. Specific challenges are faced when IVHM tools are to be retrofitted into legacy vehicles, since major modifications are much more challenging than with platforms whose design can still be modified. The topics covered in this Review Paper include the state of the art of IVHM tools and how their characteristics match the requirements of legacy aircraft, a summary of problems faced in the past trying to retrofit IVHM tools, both from a technical and an organisational perspective, and the current level of implementation of IVHM in industry. Although the technology has not reached the level necessary to implement IVHM to its full potential on every kind of component, significant progress has been achieved on rotating equipment, structures and electronics. Attempts to retrofit some of these tools in the past faced both technical difficulties and opposition by some stakeholders, the latter being responsible for the failure of technically sound projects on more than one occasion. Nevertheless, despite these difficulties, products and services based on IVHM technology have started to be offered by the manufacturers and, more importantly, demanded by the operators, providing guidance on what the industry would demand from IVHM on legacy aircraft.

  16. Modeling biochemical transformation processes and information processing with Narrator.

    PubMed

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation and biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible architecture which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java software program and is available as open source from http://www.narrator-tool.org.
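
    One of the simulation targets named above, Gillespie's direct method, is compact enough to sketch. The block below is a generic textbook implementation for a single first-order degradation reaction, not Narrator code, and the rate constant and initial count are arbitrary.

        # Gillespie's direct method for A -> 0 with rate constant k (generic sketch).
        import random

        def gillespie_decay(a0=100, k=0.1, t_end=50.0, seed=1):
            random.seed(seed)
            t, a, trajectory = 0.0, a0, [(0.0, a0)]
            while t < t_end and a > 0:
                propensity = k * a                      # total reaction propensity
                t += random.expovariate(propensity)     # exponentially distributed waiting time
                a -= 1                                  # the single reaction fires
                trajectory.append((t, a))
            return trajectory

        trajectory = gillespie_decay()
        print(trajectory[:3], "...", trajectory[-1])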

  17. Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.

    PubMed

    Scott, Bradley; Wilcock, Anne

    2006-01-01

    Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.

  18. Point Analysis in Java applied to histological images of the perforant pathway: a user's account.

    PubMed

    Scorcioni, Ruggero; Wright, Susan N; Patrick Card, J; Ascoli, Giorgio A; Barrionuevo, Germán

    2008-01-01

    The freeware Java tool Point Analysis in Java (PAJ), created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (x2 objective) comprised the entire perforant pathway, while the high magnification set (x100 objective) allowed the identification of individual fibers. A preliminary stereological study revealed a striking linear relationship between the fiber count at high magnification and the optical density at low magnification. PAJ enabled fast analysis for down-sampled data sets and a friendly interface with automated plot drawings. Noted strengths included the multi-platform support as well as the free availability of the source code, conducive to a broad user base and maximum flexibility for ad hoc requirements. PAJ has great potential to extend its usability by (a) improving its graphical user interface, (b) increasing its input size limit, (c) improving response time for large data sets, and (d) potentially being integrated with other Java graphical tools such as ImageJ.

  19. Providing an integrated clinical data view in a hospital information system that manages multimedia data.

    PubMed

    Dayhoff, R E; Maloney, D L; Kenney, T J; Fletcher, R D

    1991-01-01

    The VA's hospital information system, the Decentralized Hospital Computer Program (DHCP), is an integrated system based on a powerful set of software tools with shared data accessible from any of its application modules. It includes many functionally specific application subsystems such as laboratory, pharmacy, radiology, and dietetics. Physicians need applications that cross these application boundaries to provide useful and convenient patient data. One of these multi-specialty applications, the DHCP Imaging System, integrates multimedia data to provide clinicians with comprehensive patient-oriented information. User requirements for cross-disciplinary image access can be studied to define needs for similar text data access. Integration approaches must be evaluated both for their ability to deliver patient-oriented text data rapidly and their ability to integrate multimedia data objects. Several potential integration approaches are described as they relate to the DHCP Imaging System.

  20. Providing an integrated clinical data view in a hospital information system that manages multimedia data.

    PubMed Central

    Dayhoff, R. E.; Maloney, D. L.; Kenney, T. J.; Fletcher, R. D.

    1991-01-01

    The VA's hospital information system, the Decentralized Hospital Computer Program (DHCP), is an integrated system based on a powerful set of software tools with shared data accessible from any of its application modules. It includes many functionally specific application subsystems such as laboratory, pharmacy, radiology, and dietetics. Physicians need applications that cross these application boundaries to provide useful and convenient patient data. One of these multi-specialty applications, the DHCP Imaging System, integrates multimedia data to provide clinicians with comprehensive patient-oriented information. User requirements for cross-disciplinary image access can be studied to define needs for similar text data access. Integration approaches must be evaluated both for their ability to deliver patient-oriented text data rapidly and their ability to integrate multimedia data objects. Several potential integration approaches are described as they relate to the DHCP Imaging System. PMID:1807651

  1. Integrated energy balance analysis for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Tandler, John

    1991-01-01

    An integrated simulation model is described which characterizes the dynamic interaction of the energy transport subsystems of Space Station Freedom for given orbital conditions and for a given set of power and thermal loads. Subsystems included in the model are the Electric Power System (EPS), the Internal Thermal Control System (ITCS), the External Thermal Control System (ETCS), and the cabin Temperature and Humidity Control System (THC) (which includes the avionics air cooling, cabin air cooling, and intermodule ventilation systems). Models of the subsystems were developed in a number of system-specific modeling tools and validated. The subsystem models are then combined into integrated models to address a number of integrated performance issues involving the ability of the integrated energy transport system of Space Station Freedom to provide power, controlled cabin temperature and humidity, and equipment thermal control to support operations.

  2. BIG: a large-scale data integration tool for renal physiology.

    PubMed

    Zhao, Yue; Yang, Chin-Rang; Raghuram, Viswanathan; Parulekar, Jaya; Knepper, Mark A

    2016-10-01

    Due to recent advances in high-throughput techniques, we and others have generated multiple proteomic and transcriptomic databases to describe and quantify gene expression, protein abundance, or cellular signaling on the scale of the whole genome/proteome in kidney cells. The existence of so much data from diverse sources raises the following question: "How can researchers find information efficiently for a given gene product over all of these data sets without searching each data set individually?" This is the type of problem that has motivated the "Big-Data" revolution in Data Science, which has driven progress in fields such as marketing. Here we present an online Big-Data tool called BIG (Biological Information Gatherer) that allows users to submit a single online query to obtain all relevant information from all indexed databases. BIG is accessible at http://big.nhlbi.nih.gov/.

  3. From data to the decision: A software architecture to integrate predictive modelling in clinical settings.

    PubMed

    Martinez-Millana, A; Fernandez-Llatas, C; Sacchi, L; Segagni, D; Guillen, S; Bellazzi, R; Traver, V

    2015-08-01

    The application of statistics and mathematics over large amounts of data is providing healthcare systems with new tools for screening and managing multiple diseases. Nonetheless, these tools have many technical and clinical limitations as they are based on datasets with concrete characteristics. This proposition paper describes a novel architecture focused on providing a validation framework for discrimination and prediction models in the screening of Type 2 diabetes. For that, the architecture has been designed to gather different data sources under a common data structure and, furthermore, to be controlled by a centralized component (Orchestrator) in charge of directing the interaction flows among data sources, models and graphical user interfaces. This innovative approach aims to overcome the data-dependency of the models by providing a validation framework for the models as they are used within clinical settings.

  4. Atlas - a data warehouse for integrative bioinformatics.

    PubMed

    Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire M S; Ling, John; Ouellette, B F Francis

    2005-02-21

    We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: http://bioinformatics.ubc.ca/atlas/

  5. Atlas – a data warehouse for integrative bioinformatics

    PubMed Central

    Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire MS; Ling, John; Ouellette, BF Francis

    2005-01-01

    Background We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. Description The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. Conclusion The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: PMID:15723693

  6. Bayesian network interface for assisting radiology interpretation and education

    NASA Astrophysics Data System (ADS)

    Duda, Jeffrey; Botzolakis, Emmanuel; Chen, Po-Hao; Mohan, Suyash; Nasrallah, Ilya; Rauschecker, Andreas; Rudie, Jeffrey; Bryan, R. Nick; Gee, James; Cook, Tessa

    2018-03-01

    In this work, we present the use of Bayesian networks for radiologist decision support during clinical interpretation. This computational approach has the advantage of avoiding incorrect diagnoses that result from known human cognitive biases such as anchoring bias, framing effect, availability bias, and premature closure. To integrate Bayesian networks into clinical practice, we developed an open-source web application that provides diagnostic support for a variety of radiology disease entities (e.g., basal ganglia diseases, bone lesions). The Clinical tool presents the user with a set of buttons representing clinical and imaging features of interest. These buttons are used to set the value for each observed feature. As features are identified, the conditional probabilities for each possible diagnosis are updated in real time. Additionally, using sensitivity analysis, the interface may be set to inform the user which remaining imaging features provide maximum discriminatory information to choose the most likely diagnosis. The Case Submission tools allow the user to submit a validated case and the associated imaging features to a database, which can then be used for future tuning/testing of the Bayesian networks. These submitted cases are then reviewed by an assigned expert using the provided QC tool. The Research tool presents users with cases with previously labeled features and a chosen diagnosis, for the purpose of performance evaluation. Similarly, the Education page presents cases with known features, but provides real time feedback on feature selection.
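
    A minimal sketch of the kind of belief update such an interface performs as features are marked present is given below, assuming (for illustration only) that features are conditionally independent given the diagnosis; the diagnoses, features and probabilities are invented and do not come from the described networks.

        # Invented diagnoses, features and probabilities; naive-Bayes style update.
        priors = {"toxic/metabolic": 0.2, "neurodegenerative": 0.5, "vascular": 0.3}
        likelihood = {   # P(feature present | diagnosis)
            "T2_hyperintensity": {"toxic/metabolic": 0.9,
                                  "neurodegenerative": 0.3, "vascular": 0.6},
            "restricted_diffusion": {"toxic/metabolic": 0.7,
                                     "neurodegenerative": 0.1, "vascular": 0.8},
        }

        def posterior(observed_features):
            scores = dict(priors)
            for feature in observed_features:
                for dx in scores:
                    scores[dx] *= likelihood[feature][dx]
            total = sum(scores.values())
            return {dx: p / total for dx, p in scores.items()}

        print(posterior(["T2_hyperintensity", "restricted_diffusion"]))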

  7. A Microsoft Excel® 2010 Based Tool for Calculating Interobserver Agreement

    PubMed Central

    Azulay, Richard L

    2011-01-01

    This technical report provides detailed information on the rationale for using a common computer spreadsheet program (Microsoft Excel®) to calculate various forms of interobserver agreement for both continuous and discontinuous data sets. In addition, we provide a brief tutorial on how to use an Excel spreadsheet to automatically compute traditional total count, partial agreement-within-intervals, exact agreement, trial-by-trial, interval-by-interval, scored-interval, unscored-interval, total duration, and mean duration-per-interval interobserver agreement algorithms. We conclude with a discussion of how practitioners may integrate this tool into their clinical work. PMID:22649578

  8. A microsoft excel(®) 2010 based tool for calculating interobserver agreement.

    PubMed

    Reed, Derek D; Azulay, Richard L

    2011-01-01

    This technical report provides detailed information on the rationale for using a common computer spreadsheet program (Microsoft Excel(®)) to calculate various forms of interobserver agreement for both continuous and discontinuous data sets. In addition, we provide a brief tutorial on how to use an Excel spreadsheet to automatically compute traditional total count, partial agreement-within-intervals, exact agreement, trial-by-trial, interval-by-interval, scored-interval, unscored-interval, total duration, and mean duration-per-interval interobserver agreement algorithms. We conclude with a discussion of how practitioners may integrate this tool into their clinical work.
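
    Two of the agreement algorithms listed above are simple enough to sketch outside a spreadsheet. The example counts and interval records below are invented; the report itself should be consulted for the full set of algorithms and their Excel formulas.

        # Total count IOA: smaller observer count divided by the larger, times 100.
        def total_count_ioa(count_a, count_b):
            if count_a == count_b == 0:
                return 100.0
            return min(count_a, count_b) / max(count_a, count_b) * 100

        # Interval-by-interval IOA: intervals on which the observers agree,
        # divided by the total number of intervals, times 100.
        def interval_by_interval_ioa(obs_a, obs_b):
            agreements = sum(1 for a, b in zip(obs_a, obs_b) if a == b)
            return agreements / len(obs_a) * 100

        print(total_count_ioa(18, 20))                                     # 90.0
        print(interval_by_interval_ioa([1, 0, 1, 1, 0], [1, 0, 0, 1, 0]))  # 80.0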

  9. Parallel evolution of image processing tools for multispectral imagery

    NASA Astrophysics Data System (ADS)

    Harvey, Neal R.; Brumby, Steven P.; Perkins, Simon J.; Porter, Reid B.; Theiler, James P.; Young, Aaron C.; Szymanski, John J.; Bloch, Jeffrey J.

    2000-11-01

    We describe the implementation and performance of a parallel, hybrid evolutionary-algorithm-based system, which optimizes image processing tools for feature-finding tasks in multi-spectral imagery (MSI) data sets. Our system uses an integrated spatio-spectral approach and is capable of combining suitably-registered data from different sensors. We investigate the speed-up obtained by parallelization of the evolutionary process via multiple processors (a workstation cluster) and develop a model for prediction of run-times for different numbers of processors. We demonstrate our system on Landsat Thematic Mapper MSI, covering the recent Cerro Grande fire at Los Alamos, NM, USA.

  10. Picante: R tools for integrating phylogenies and ecology.

    PubMed

    Kembel, Steven W; Cowan, Peter D; Helmus, Matthew R; Cornwell, William K; Morlon, Helene; Ackerly, David D; Blomberg, Simon P; Webb, Campbell O

    2010-06-01

    Picante is a software package that provides a comprehensive set of tools for analyzing the phylogenetic and trait diversity of ecological communities. The package calculates phylogenetic diversity metrics, performs trait comparative analyses, manipulates phenotypic and phylogenetic data, and performs tests for phylogenetic signal in trait distributions, community structure and species interactions. Picante is a package for the R statistical language and environment written in R and C, released under a GPL v2 open-source license, and freely available on the web (http://picante.r-forge.r-project.org) and from CRAN (http://cran.r-project.org).
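
    Picante itself is an R package; as a language-neutral illustration of one metric in its family, the sketch below computes Faith's phylogenetic diversity (the summed branch length spanning the species present in a community) on an invented four-taxon tree. Conventions about whether the root path is included vary between implementations, so treat this only as a conceptual sketch.

        # Invented tree stored as child -> (parent, branch length).
        tree = {
            "sp1": ("n1", 1.0), "sp2": ("n1", 1.0),
            "sp3": ("n2", 2.0), "sp4": ("n2", 2.0),
            "n1": ("root", 3.0), "n2": ("root", 3.0),
        }

        def faith_pd(community):
            used_edges = set()
            for taxon in community:
                node = taxon
                while node in tree:          # walk to the root, collecting edges
                    used_edges.add(node)
                    node = tree[node][0]
            return sum(tree[node][1] for node in used_edges)

        print(faith_pd({"sp1", "sp2"}))      # 1 + 1 + 3 = 5.0
        print(faith_pd({"sp1", "sp3"}))      # 1 + 3 + 2 + 3 = 9.0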

  11. Bridging the Water Policy and Management Silos: An Opportunity for Leveraged Capacity Building

    NASA Astrophysics Data System (ADS)

    Wegner, D. L.

    2017-12-01

    The global community is challenged by increasing demand and decreasing water supplies. Historically, nations have focused on local or regional water development projects that meet specific needs, often without consideration of the impact on downstream transboundary water users or the watershed itself. Often these decisions have been based on small sets of project-specific data with little assessment of river-basin impacts. In the United States this disjointed approach to water has resulted in 26 federal agencies having roles in water management or regulation, 50 states addressing water rights and compliance, and a multitude of tribal and local entities intersecting the water process. This often results in a convoluted, disjointed and time-consuming process. The last systematic and comprehensive review of nationwide water policy was the 1973 National Water Commission Report. A need exists for capacity-building, collaborative and integrative leadership and dialogue. NASA's Western Water Applications Office (WWAO) provides a unique opportunity to leverage water and terrain data with water agencies and policy makers. A supported WWAO can provide bridges between federal and state water agencies; provide consistent, integrated hydrologic and terrain-based data sets acquired from multiple Earth-orbiting satellites and airborne platforms; provide data sets leveraged with academic and research-based entities to develop specific integrative predictive tools; and evaluate hydrology information across multiple boundaries. It is the author's conclusion that the Western Water Applications Office can provide a value-added approach that will help translate transboundary water and Earth terrain information into national policy decisions through education, increased efficiency, increased connectivity, improved coordination, and increased communication. To be effective the WWAO should embrace five objectives: (1) be technically and scientifically valid; (2) administratively supported; (3) financially sustainable; (4) politically achievable; and (5) focus on integration of innovative remote sensing and data analysis tools.

  12. Building an integrated neurodegenerative disease database at an academic health center.

    PubMed

    Xie, Sharon X; Baek, Young; Grossman, Murray; Arnold, Steven E; Karlawish, Jason; Siderowf, Andrew; Hurtig, Howard; Elman, Lauren; McCluskey, Leo; Van Deerlin, Vivianna; Lee, Virginia M-Y; Trojanowski, John Q

    2011-07-01

    It is becoming increasingly important to study common and distinct etiologies, clinical and pathological features, and mechanisms related to neurodegenerative diseases such as Alzheimer's disease, Parkinson's disease, amyotrophic lateral sclerosis, and frontotemporal lobar degeneration. These comparative studies rely on powerful database tools to quickly generate data sets that match the diverse and complementary criteria set by the investigators. In this article, we present a novel integrated neurodegenerative disease (INDD) database, which was developed at the University of Pennsylvania (Penn) with the help of a consortium of Penn investigators. Because the work of these investigators is based on Alzheimer's disease, Parkinson's disease, amyotrophic lateral sclerosis, and frontotemporal lobar degeneration, it allowed us to achieve the goal of developing an INDD database for these major neurodegenerative disorders. We used Microsoft SQL Server as a platform, with built-in "backwards" functionality to provide Access as a frontend client to interface with the database. We used PHP Hypertext Preprocessor to create the "frontend" web interface and then used a master lookup table to integrate the individual neurodegenerative disease databases. We also present methods of data entry, database security, database backups, and database audit trails for this INDD database. Using the INDD database, we compared the results of a biomarker study with those obtained using an alternative approach of querying individual databases separately. We have demonstrated that the Penn INDD database has the ability to query multiple database tables from a single console with high accuracy and reliability. The INDD database provides a powerful tool for generating data sets in comparative studies on several neurodegenerative diseases. Copyright © 2011 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
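
    A conceptual sketch of the master-lookup-table integration described above is given below, with SQLite standing in for the Microsoft SQL Server platform actually used; the table and column names are invented for illustration and are not the Penn INDD schema.

        # SQLite stands in for SQL Server; invented tables and columns.
        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
        CREATE TABLE master_lookup (indd_id TEXT PRIMARY KEY, source_db TEXT);
        CREATE TABLE alzheimers (indd_id TEXT, mmse INTEGER);
        CREATE TABLE parkinsons (indd_id TEXT, updrs INTEGER);
        INSERT INTO master_lookup VALUES ('P001', 'alzheimers'), ('P002', 'parkinsons');
        INSERT INTO alzheimers VALUES ('P001', 24);
        INSERT INTO parkinsons VALUES ('P002', 31);
        """)

        # One query spans the disease-specific tables via the master lookup table.
        rows = db.execute("""SELECT m.indd_id, m.source_db, a.mmse, p.updrs
                             FROM master_lookup m
                             LEFT JOIN alzheimers a ON a.indd_id = m.indd_id
                             LEFT JOIN parkinsons p ON p.indd_id = m.indd_id""").fetchall()
        print(rows)   # one row per patient, filled from whichever database has data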

  13. GDR (Genome Database for Rosaceae): integrated web-database for Rosaceae genomics and genetics data

    PubMed Central

    Jung, Sook; Staton, Margaret; Lee, Taein; Blenda, Anna; Svancara, Randall; Abbott, Albert; Main, Dorrie

    2008-01-01

    The Genome Database for Rosaceae (GDR) is a central repository of curated and integrated genetics and genomics data of Rosaceae, an economically important family which includes apple, cherry, peach, pear, raspberry, rose and strawberry. GDR contains annotated databases of all publicly available Rosaceae ESTs, the genetically anchored peach physical map, Rosaceae genetic maps and comprehensively annotated markers and traits. The ESTs are assembled to produce unigene sets of each genus and the entire Rosaceae. Other annotations include putative function, microsatellites, open reading frames, single nucleotide polymorphisms, gene ontology terms and anchored map position where applicable. Most of the published Rosaceae genetic maps can be viewed and compared through CMap, the comparative map viewer. The peach physical map can be viewed using WebFPC/WebChrom, and also through our integrated GDR map viewer, which serves as a portal to the combined genetic, transcriptome and physical mapping information. ESTs, BACs, markers and traits can be queried by various categories and the search result sites are linked to the mapping visualization tools. GDR also provides online analysis tools such as a batch BLAST/FASTA server for the GDR datasets, a sequence assembly server and microsatellite and primer detection tools. GDR is available at http://www.rosaceae.org. PMID:17932055

  14. Developing an Integrated Model Framework for the Assessment of Sustainable Agricultural Residue Removal Limits for Bioenergy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Muth, Jr.; Jared Abodeely; Richard Nelson

    Agricultural residues have significant potential as a feedstock for bioenergy production, but removing these residues can have negative impacts on soil health. Models and datasets that can support decisions about sustainable agricultural residue removal are available; however, no tools currently exist capable of simultaneously addressing all environmental factors that can limit availability of residue. The VE-Suite model integration framework has been used to couple a set of environmental process models to support agricultural residue removal decisions. The RUSLE2, WEPS, and Soil Conditioning Index models have been integrated. A disparate set of databases providing the soils, climate, and management practice data required to run these models has also been integrated. The integrated system has been demonstrated for two example cases. First, an assessment using high spatial fidelity crop yield data has been run for a single farm. This analysis shows the significant variance in sustainably accessible residue across a single farm and crop year. A second example is an aggregate assessment of agricultural residues available in the state of Iowa. This implementation of the integrated systems model demonstrates the capability to run a vast range of scenarios required to represent a large geographic region.
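
    The coupling logic can be summarized as a most-limiting-factor rule: sustainable removal is capped by whichever constraint requires the most residue to stay on the field. The sketch below uses invented numbers; the real retention requirements are computed by RUSLE2, WEPS and the Soil Conditioning Index.

        # Invented retention requirements (Mg/ha); illustrative logic only.
        def sustainable_removal(total_residue_mg_ha, retention_limits_mg_ha):
            required_retention = max(retention_limits_mg_ha.values())
            return max(0.0, total_residue_mg_ha - required_retention)

        field_limits = {"water_erosion": 2.2, "wind_erosion": 1.8, "soil_carbon": 3.0}
        print(sustainable_removal(5.5, field_limits))   # 2.5 Mg/ha may be removed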

  15. Web-site evaluation tools: a case study in reproductive health information.

    PubMed

    Aslani, Azam; Pournik, Omid; Abu-Hanna, Ameen; Eslami, Saeid

    2014-01-01

    The Internet offers an opportunity to inform, teach, and connect professionals and patients. However, much information on the Internet is incomplete, inaccurate, or misleading, and not only in the medical domain. Because of the potential for damage from misleading and inaccurate health information, many organizations and individuals have published or implemented scoring tools for evaluating the appropriateness or quality of these resources. The objective of this study is to identify and summarize scoring tools that have been used to evaluate websites providing reproductive health information, in order to compare them and recommend an overarching evaluation tool. We searched Ovid MEDLINE(R) (1946 to July 2013) and OVID Embase (1980 to July 2013), and included English-language studies that evaluated the quality of websites providing reproductive health information. Studies only assessing the content of websites were excluded. We identified 5 scoring tools: 1-The HON (Health On the Net) Code of Conduct for medical and health Web sites, 2-Silberg scores, 3-Hogne Sandvik scale, 4-Jim Kapoun's Criteria for Evaluating Web Pages, and 5-The Health Information Technology Institute (HITI) criteria. We have compared these scales and identified 14 criteria: authorship, ownership, currency, objectivity/content, transparency/source, interactivity, privacy/ethics, financial disclosure, navigability/links, complementarity, advertising policy, design, quantity, and accessibility. We integrated these criteria and introduced a new tool with 10 criteria. Website evaluation tools differ in their evaluation criteria and there is a lack of consensus about which to use; therefore, an integrated, easy-to-use set of criteria is needed.

  16. Solfatara volcano subsurface imaging: two different approaches to process and interpret multi-variate data sets

    NASA Astrophysics Data System (ADS)

    Bernardinetti, Stefano; Bruno, Pier Paolo; Lavoué, François; Gresse, Marceau; Vandemeulebrouck, Jean; Revil, André

    2017-04-01

    Reducing model uncertainty and producing more reliable geophysical imaging and interpretation is a fundamental requirement for geophysical techniques applied in complex environments such as Solfatara Volcano. The use of independent geophysical methods yields rich information on the subsurface because the data have different sensitivities to parameters such as compressional and shear wave velocities, bulk electrical conductivity, or density. The joint processing of these multiple physical properties can lead to a very detailed characterization of the subsurface and therefore enhance both imaging and interpretation. In this work, we develop two different processing approaches based on reflection seismology and seismic P-wave tomography on the one hand, and electrical data acquired over the same line on the other. From these data, we obtain an image-guided electrical resistivity tomography and a post-processing integration of the tomographic results. The image-guided electrical resistivity tomography is obtained by regularizing the inversion of the electrical data with structural constraints extracted from a migrated seismic section using image processing tools. This approach focuses the reconstruction of electrical resistivity anomalies along the features visible in the seismic section, and acts as a guide for interpretation in terms of subsurface structures and processes. To integrate co-registered P-wave velocity and electrical resistivity values, we apply a data mining tool, the k-means algorithm, to identify relationships between the two sets of variables. This algorithm partitions the multivariate data set into clusters so as to minimize the sum of squared Euclidean distances within each cluster and maximize it between clusters. We obtain a partitioning of the multivariate data set into a finite number of well-correlated clusters, representative of the optimum clustering of our geophysical variables (P-wave velocities and electrical resistivities). The result is an integrated tomography that shows a finite number of homogeneous geophysical facies and thus highlights the main geological features of the subsurface.
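
    The clustering step described above can be sketched with scikit-learn on synthetic, co-registered velocity-resistivity pairs; the data, scaling choice and cluster count below are invented stand-ins for the Solfatara measurements and processing.

        # Synthetic (P-wave velocity, log10 resistivity) samples; sklearn/NumPy assumed installed.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        facies_a = rng.normal([1500.0, 0.5], [100.0, 0.2], size=(200, 2))
        facies_b = rng.normal([2500.0, 1.8], [150.0, 0.3], size=(200, 2))
        samples = np.vstack([facies_a, facies_b])

        # Standardize so both variables contribute comparably to Euclidean distance.
        X = StandardScaler().fit_transform(samples)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        print(np.bincount(labels))           # roughly 200 samples per "geophysical facies"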

  17. Development of a personalized decision aid for breast cancer risk reduction and management.

    PubMed

    Ozanne, Elissa M; Howe, Rebecca; Omer, Zehra; Esserman, Laura J

    2014-01-14

    Breast cancer risk reduction has the potential to decrease the incidence of the disease, yet remains underused. We report on the development of a web-based tool that provides automated risk assessment and personalized decision support designed for collaborative use between patients and clinicians. Under Institutional Review Board approval, we evaluated the decision tool through a patient focus group, usability testing, and provider interviews (including breast specialists, primary care physicians, and genetic counselors). This included demonstrations and data collection at two scientific conferences (2009 International Shared Decision Making Conference, 2009 San Antonio Breast Cancer Symposium). Overall, the evaluations were favorable. The patient focus group evaluations and usability testing (N = 34) provided qualitative feedback about format and design; 88% of these participants found the tool useful and 94% found it easy to use. 91% of the providers (N = 23) indicated that they would use the tool in their clinical setting. BreastHealthDecisions.org represents a new approach to breast cancer prevention care and a framework for high-quality preventive healthcare. The ability to integrate risk assessment and decision support in real time will allow for informed, value-driven, and patient-centered breast cancer prevention decisions. The tool is being further evaluated in the clinical setting.

  18. oPOSSUM: integrated tools for analysis of regulatory motif over-representation

    PubMed Central

    Ho Sui, Shannan J.; Fulton, Debra L.; Arenillas, David J.; Kwon, Andrew T.; Wasserman, Wyeth W.

    2007-01-01

    The identification of over-represented transcription factor binding sites from sets of co-expressed genes provides insights into the mechanisms of regulation for diverse biological contexts. oPOSSUM, an internet-based system for such studies of regulation, has been improved and expanded in this new release. New features include a worm-specific version for investigating binding sites conserved between Caenorhabditis elegans and C. briggsae, as well as a yeast-specific version for the analysis of co-expressed sets of Saccharomyces cerevisiae genes. The human and mouse applications feature improvements in ortholog mapping, sequence alignments and the delineation of multiple alternative promoters. oPOSSUM2, introduced for the analysis of over-represented combinations of motifs in human and mouse genes, has been integrated with the original oPOSSUM system. Analysis using user-defined background gene sets is now supported. The transcription factor binding site models have been updated to include new profiles from the JASPAR database. oPOSSUM is available at http://www.cisreg.ca/oPOSSUM/ PMID:17576675
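
    The statistical core of an over-representation analysis can be illustrated with a generic one-tailed Fisher exact test comparing motif counts in a co-expressed gene set against a background set. The counts below are invented, and oPOSSUM's own documentation should be consulted for the statistics it actually reports.

        # Invented counts; SciPy assumed to be installed.
        from scipy.stats import fisher_exact

        foreground_hits, foreground_total = 18, 50      # co-expressed genes with the motif
        background_hits, background_total = 120, 1000   # background genes with the motif

        table = [[foreground_hits, foreground_total - foreground_hits],
                 [background_hits, background_total - background_hits]]
        odds_ratio, p_value = fisher_exact(table, alternative="greater")
        print("odds ratio = %.2f, one-tailed p = %.2e" % (odds_ratio, p_value))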

  19. Using the Saccharomyces Genome Database (SGD) for analysis of genomic information

    PubMed Central

    Skrzypek, Marek S.; Hirschman, Jodi

    2011-01-01

    Analysis of genomic data requires access to software tools that place the sequence-derived information in the context of biology. The Saccharomyces Genome Database (SGD) integrates functional information about budding yeast genes and their products with a set of analysis tools that facilitate exploring their biological details. This unit describes how the various types of functional data available at SGD can be searched, retrieved, and analyzed. Starting with the guided tour of the SGD Home page and Locus Summary page, this unit highlights how to retrieve data using YeastMine, how to visualize genomic information with GBrowse, how to explore gene expression patterns with SPELL, and how to use Gene Ontology tools to characterize large-scale datasets. PMID:21901739

  20. QuakeSim 2.0

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay W.; Lyzenga, Gregory A.; Granat, Robert A.; Norton, Charles D.; Rundle, John B.; Pierce, Marlon E.; Fox, Geoffrey C.; McLeod, Dennis; Ludwig, Lisa Grant

    2012-01-01

    QuakeSim 2.0 improves understanding of earthquake processes by providing modeling tools and integrating model applications and various heterogeneous data sources within a Web services environment. QuakeSim is a multisource, synergistic, data-intensive environment for modeling the behavior of earthquake faults individually, and as part of complex interacting systems. Remotely sensed geodetic data products may be explored, compared with faults and landscape features, mined by pattern analysis applications, and integrated with models and pattern analysis applications in a rich Web-based and visualization environment. Integration of heterogeneous data products with pattern informatics tools enables efficient development of models. Federated database components and visualization tools allow rapid exploration of large datasets, while pattern informatics enables identification of subtle, but important, features in large data sets. QuakeSim is valuable for earthquake investigations and modeling in its current state, and also serves as a prototype and nucleus for broader systems under development. The framework provides access to physics-based simulation tools that model the earthquake cycle and related crustal deformation. Spaceborne GPS and Interferometric Synthetic Aperture Radar (InSAR) data provide information on near-term crustal deformation, while paleoseismic geologic data provide longer-term information on earthquake fault processes. These data sources are integrated into QuakeSim's QuakeTables database system, and are accessible by users or various model applications. UAVSAR repeat pass interferometry data products are added to the QuakeTables database, and are available through a browseable map interface or Representational State Transfer (REST) interfaces. Model applications can retrieve data from QuakeTables, or from third-party GPS velocity data services; alternatively, users can manually input parameters into the models. Pattern analysis of GPS and seismicity data has proved useful for mid-term forecasting of earthquakes, and for detecting subtle changes in crustal deformation. The GPS time series analysis has also proved useful as a data-quality tool, enabling the discovery of station anomalies and data processing and distribution errors. Improved visualization tools enable more efficient data exploration and understanding. Tools provide flexibility to science users for exploring data in new ways through download links, but also facilitate standard, intuitive, and routine uses for science users and end users such as emergency responders.

  1. htsint: a Python library for sequencing pipelines that combines data through gene set generation.

    PubMed

    Richards, Adam J; Herrel, Anthony; Bonneaud, Camille

    2015-09-24

    Sequencing technologies provide a wealth of details in terms of genes, expression, splice variants, polymorphisms, and other features. A standard for sequencing analysis pipelines is to put genomic or transcriptomic features into a context of known functional information, but the relationships between ontology terms are often ignored. For RNA-Seq, considering genes and their genetic variants at the group level enables a convenient way to both integrate annotation data and detect small coordinated changes between experimental conditions, a known caveat of gene level analyses. We introduce the high throughput data integration tool, htsint, as an extension to the commonly used gene set enrichment frameworks. The central aim of htsint is to compile annotation information from one or more taxa in order to calculate functional distances among all genes in a specified gene space. Spectral clustering is then used to partition the genes, thereby generating functional modules. The gene space can range from a targeted list of genes, like a specific pathway, all the way to an ensemble of genomes. Given a collection of gene sets and a count matrix of transcriptomic features (e.g. expression, polymorphisms), the gene sets produced by htsint can be tested for 'enrichment' or conditional differences using one of a number of commonly available packages. The database and bundled tools to generate functional modules were designed with sequencing pipelines in mind, but the toolkit nature of htsint allows it to also be used in other areas of genomics. The software is freely available as a Python library through GitHub at https://github.com/ajrichards/htsint.
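
    The core idea described above, turning pairwise functional distances among genes into modules via spectral clustering, can be sketched generically as follows. This is not htsint's own API; it uses scikit-learn on a random placeholder distance matrix purely to illustrate the partitioning step.

      # Generic sketch: distance matrix -> affinity matrix -> spectral
      # clustering -> candidate gene sets. Placeholder data, not htsint code.
      import numpy as np
      from sklearn.cluster import SpectralClustering

      rng = np.random.default_rng(0)
      n_genes = 50
      dist = rng.random((n_genes, n_genes))
      dist = (dist + dist.T) / 2.0          # symmetrize
      np.fill_diagonal(dist, 0.0)           # zero self-distance

      sigma = dist.std()
      affinity = np.exp(-dist**2 / (2.0 * sigma**2))   # RBF-style affinity

      model = SpectralClustering(n_clusters=5, affinity="precomputed", random_state=0)
      labels = model.fit_predict(affinity)

      # Group gene indices into functional modules (candidate gene sets).
      modules = {k: np.where(labels == k)[0].tolist() for k in range(5)}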

  2. [How to evaluate the application of Clinical Governance tools in the management of hospitalized hyperglycemic patients: results of a multicentric study].

    PubMed

    De Belvis, Antonio Giulio; Specchia, Maria Lucia; Ferriero, Anna Maria; Capizzi, Silvio

    2017-01-01

    Risk management is a key tool in Clinical Governance. Our project aimed to define, share, apply and measure the impact of tools and methodologies for the continuous improvement of quality of care, especially in relation to the multi-disciplinary and integrated management of the hyperglycemic patient in hospital settings. A training project, coordinated by a scientific board of experts in diabetes and health management and an Expert Meeting with representatives of all the participating centers was launched in 2014. The project involved eight hospitals through the organization of meetings with five managers and 25 speakers, including diabetologists, internists, pharmacists and nurses. The analysis showed a wide variability in the adoption of tools and processes towards a comprehensive and coordinated management of hyperglycemic patients.

  3. High Temperature Logging and Monitoring Instruments to Explore and Drill Deep into Hot Oceanic Crust.

    NASA Astrophysics Data System (ADS)

    Denchik, N.; Pezard, P. A.; Ragnar, A.; Jean-Luc, D.; Jan, H.

    2014-12-01

    Drilling an entire section of the oceanic crust and through the Moho has been a goal of the scientific community for more than half a century. On the basis of ODP and IODP experience and data, this will require instruments and strategies working at temperatures far above 200°C (reached, for example, at the bottom of DSDP/ODP Hole 504B), and possibly beyond 300°C. Concerning logging and monitoring instruments, progress was made over the past ten years in the context of the HiTI ("High Temperature Instruments") project, funded by the European Community for deep drilling in hot Icelandic geothermal holes where supercritical conditions and a highly corrosive environment are expected at depth (with temperatures above 374°C and pressures exceeding 22 MPa). For example, a slickline tool (memory tool) tolerating up to 400°C and wireline tools up to 300°C were developed and tested in Icelandic high-temperature geothermal fields. The temperature limitation of logging tools was defined to comply with the present limitation in wireline cables (320°C). As part of this new set of downhole tools, temperature, pressure, fluid flow and casing collar location might be measured up to 400°C from a single multisensor tool. Natural gamma radiation spectrum, borehole wall ultrasonic image signals, and fiber optic cables (using distributed temperature sensing methods) were also developed for wireline deployment up to 300°C and tested in the field. A wireline, dual laterolog electrical resistivity tool was also developed but could not be field tested as part of HiTI. This new set of tools constitutes a basis for the deep exploration of the oceanic crust in the future. In addition, new strategies including the real-time integration of drilling parameters with modeling of the thermo-mechanical status of the borehole could be developed, using time-lapse logging of temperature (for heat flow determination) and borehole wall images (for hole stability and in-situ stress determination) as boundary conditions for the models. In all, and with limited integration of existing tools, the deployment of high-temperature downhole tools could contribute largely to the success of the long-awaited Mohole project.

  4. Conceptualisation and development of the Conversational Health Literacy Assessment Tool (CHAT).

    PubMed

    O'Hara, Jonathan; Hawkins, Melanie; Batterham, Roy; Dodson, Sarity; Osborne, Richard H; Beauchamp, Alison

    2018-03-22

    The aim of this study was to develop a tool to support health workers' ability to identify patients' multidimensional health literacy strengths and challenges. The tool was intended to be suitable for administration in healthcare settings where health workers must identify health literacy priorities as the basis for person-centred care. Development was based on a qualitative co-design process that used the Health Literacy Questionnaire (HLQ) as a framework to generate questions. Health workers were recruited to participate in an online consultation, a workshop, and two rounds of pilot testing. Participating health workers identified and refined ten questions that target five areas of assessment: supportive professional relationships, supportive personal relationships, health information access and comprehension, current health behaviours, and health promotion barriers and support. Preliminary evidence suggests that application of the Conversational Health Literacy Assessment Tool (CHAT) can support health workers to better understand the health literacy challenges and supportive resources of their patients. As an integrated clinical process, the CHAT can supplement existing intake and assessment procedures across healthcare settings to give insight into patients' circumstances so that decisions about care can be tailored to be more appropriate and effective.

  5. Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis

    PubMed Central

    Oelke, Nelly D.; da Silva Lima, Maria Alice Dias; Stiphout, Michelle; Janke, Robert; Witt, Regina Rigatto; Van Vliet-Brown, Cheryl; Schill, Kaela; Rostami, Mahnoush; Hepp, Shelanne; Birney, Arden; Al-Roubaiai, Fatima; Marques, Giselda Quintana

    2017-01-01

    Background: Despite far reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. Methods: We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. Findings: From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Discussion: Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, “overall integration” tools may be useful for a broad assessment of the overall state of a system. Conclusions: Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and context. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework. PMID:29588637

  6. The STARTEC Decision Support Tool for Better Tradeoffs between Food Safety, Quality, Nutrition, and Costs in Production of Advanced Ready-to-Eat Foods.

    PubMed

    Skjerdal, Taran; Gefferth, Andras; Spajic, Miroslav; Estanga, Edurne Gaston; de Cecare, Alessandra; Vitali, Silvia; Pasquali, Frederique; Bovo, Federica; Manfreda, Gerardo; Mancusi, Rocco; Trevisiani, Marcello; Tessema, Girum Tadesse; Fagereng, Tone; Moen, Lena Haugland; Lyshaug, Lars; Koidis, Anastasios; Delgado-Pando, Gonzalo; Stratakos, Alexandros Ch; Boeri, Marco; From, Cecilie; Syed, Hyat; Muccioli, Mirko; Mulazzani, Roberto; Halbert, Catherine

    2017-01-01

    A prototype decision support IT-tool for the food industry was developed in the STARTEC project. Typical processes and decision steps were mapped using real life production scenarios of participating food companies manufacturing complex ready-to-eat foods. Companies looked for a more integrated approach when making food safety decisions that would align with existing HACCP systems. The tool was designed with shelf life assessments and data on safety, quality, and costs, using a pasta salad meal as a case product. The process flow chart was used as starting point, with simulation options at each process step. Key parameters like pH, water activity, costs of ingredients and salaries, and default models for calculations of Listeria monocytogenes , quality scores, and vitamin C, were placed in an interactive database. Customization of the models and settings was possible on the user-interface. The simulation module outputs were provided as detailed curves or categorized as "good"; "sufficient"; or "corrective action needed" based on threshold limit values set by the user. Possible corrective actions were suggested by the system. The tool was tested and approved by end-users based on selected ready-to-eat food products. Compared to other decision support tools, the STARTEC-tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users.
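
    The traffic-light style output described above can be pictured with a small sketch: a simulated safety or quality value is compared against user-set threshold limits. The limit values below are invented for illustration and are not taken from the STARTEC tool.

      # Toy categorization of a simulated output (e.g. predicted log10 CFU/g of
      # L. monocytogenes at end of shelf life) against hypothetical limits.
      def categorize(value, good_limit, action_limit):
          if value <= good_limit:
              return "good"
          if value <= action_limit:
              return "sufficient"
          return "corrective action needed"

      print(categorize(1.5, good_limit=2.0, action_limit=2.7))  # -> good
      print(categorize(2.9, good_limit=2.0, action_limit=2.7))  # -> corrective action needed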

  7. The GenABEL Project for statistical genomics.

    PubMed

    Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.

  8. The STARTEC Decision Support Tool for Better Tradeoffs between Food Safety, Quality, Nutrition, and Costs in Production of Advanced Ready-to-Eat Foods

    PubMed Central

    Gefferth, Andras; Spajic, Miroslav; Estanga, Edurne Gaston; Vitali, Silvia; Pasquali, Frederique; Bovo, Federica; Manfreda, Gerardo; Mancusi, Rocco; Tessema, Girum Tadesse; Fagereng, Tone; Moen, Lena Haugland; Lyshaug, Lars; Koidis, Anastasios; Delgado-Pando, Gonzalo; Stratakos, Alexandros Ch.; Boeri, Marco; From, Cecilie; Syed, Hyat; Muccioli, Mirko; Mulazzani, Roberto; Halbert, Catherine

    2017-01-01

    A prototype decision support IT-tool for the food industry was developed in the STARTEC project. Typical processes and decision steps were mapped using real life production scenarios of participating food companies manufacturing complex ready-to-eat foods. Companies looked for a more integrated approach when making food safety decisions that would align with existing HACCP systems. The tool was designed with shelf life assessments and data on safety, quality, and costs, using a pasta salad meal as a case product. The process flow chart was used as starting point, with simulation options at each process step. Key parameters like pH, water activity, costs of ingredients and salaries, and default models for calculations of Listeria monocytogenes, quality scores, and vitamin C, were placed in an interactive database. Customization of the models and settings was possible on the user-interface. The simulation module outputs were provided as detailed curves or categorized as “good”; “sufficient”; or “corrective action needed” based on threshold limit values set by the user. Possible corrective actions were suggested by the system. The tool was tested and approved by end-users based on selected ready-to-eat food products. Compared to other decision support tools, the STARTEC-tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users. PMID:29457031

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flory, John Andrew; Padilla, Denise D.; Gauthier, John H.

    Upcoming weapon programs require an aggressive increase in Application Specific Integrated Circuit (ASIC) production at Sandia National Laboratories (SNL). SNL has developed unique modeling and optimization tools that have been instrumental in improving ASIC production productivity and efficiency, identifying optimal operational and tactical execution plans under resource constraints, and providing confidence in successful mission execution. With ten products and unprecedented levels of demand, a single set of shared resources, highly variable processes, and the need for external supplier task synchronization, scheduling is an integral part of successful manufacturing. The scheduler uses an iterative multi-objective genetic algorithm and a multi-dimensional performance evaluator. Schedule feasibility is assessed using a discrete event simulation (DES) that incorporates operational uncertainty, variability, and resource availability. The tools provide rapid scenario assessments and responses to variances in the operational environment, and have been used to inform major equipment investments and workforce planning decisions in multiple SNL facilities.
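
    To make the scheduling approach concrete, the toy sketch below shows a genetic algorithm searching over job sequences with a simple evaluator. The real tool is multi-objective and uses a discrete event simulation as its evaluator; here a single weighted-completion-time objective and invented job data stand in for both.

      # Toy genetic algorithm over job sequences; single objective only.
      import random

      jobs = [(3, 2.0), (5, 1.0), (2, 3.0), (7, 1.5), (4, 2.5)]  # (duration, weight)

      def cost(seq):
          t, total = 0, 0.0
          for j in seq:
              dur, w = jobs[j]
              t += dur
              total += w * t          # weighted completion time
          return total

      def crossover(a, b):
          cut = random.randrange(1, len(a))
          head = a[:cut]
          return head + [j for j in b if j not in head]

      def mutate(seq):
          i, j = random.sample(range(len(seq)), 2)
          seq[i], seq[j] = seq[j], seq[i]

      random.seed(0)
      pop = [random.sample(range(len(jobs)), len(jobs)) for _ in range(20)]
      for _ in range(100):
          pop.sort(key=cost)                      # rank by evaluator
          survivors = pop[:10]
          children = [crossover(random.choice(survivors), random.choice(survivors))
                      for _ in range(10)]
          for child in children:
              if random.random() < 0.3:
                  mutate(child)
          pop = survivors + children

      best = min(pop, key=cost)
      print(best, cost(best))                     # best sequence found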

  10. Quantifying Ubiquitin Signaling

    PubMed Central

    Ordureau, Alban; Münch, Christian; Harper, J. Wade

    2015-01-01

    Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), most notably phosphorylation. Flux through such pathways is typically dictated by the fractional stoichiometry of distinct regulatory modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events. A key regulatory feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems. PMID:26000850

  11. Systems Engineering and Integration (SE and I)

    NASA Technical Reports Server (NTRS)

    Chevers, ED; Haley, Sam

    1990-01-01

    The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are interface standards for commercial off the shelf products to aid in the development of integrated facilities; an enhanced automated code generation system tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; and advanced avionics laboratories and rapid prototyping. This presentation is represented by viewgraphs only.

  12. ORAC: 21st Century Observing at UKIRT

    NASA Astrophysics Data System (ADS)

    Bridger, A.; Wright, G. S.; Tan, M.; Pickup, D. A.; Economou, F.; Currie, M. J.; Adamson, A. J.; Rees, N. P.; Purves, M. H.

    The Observatory Reduction and Acquisition Control system replaces all of the existing software which interacts with the observers at UKIRT. The aim is to improve observing efficiency with a set of integrated tools that take the user from pre-observing preparation, through the acquisition of observations to the reduction using a data-driven pipeline. ORAC is designed to be flexible and extensible, and is intended for use with all future UKIRT instruments, as well as existing telescope hardware and ``legacy'' instruments. It is also designed to allow integration with phase-1 and queue-scheduled observing tools in anticipation of possible future requirements. A brief overview of the project and its relationship to other systems is given. ORAC also re-uses much code from other systems and we discuss issues relating to the trade-off between reuse and the generation of new software specific to our requirements.

  13. Interpreter of maladies: redescription mining applied to biomedical data analysis.

    PubMed

    Waltman, Peter; Pearlman, Alex; Mishra, Bud

    2006-04-01

    Comprehensive, systematic and integrated data-centric statistical approaches to disease modeling can provide powerful frameworks for understanding disease etiology. Here, one such computational framework based on redescription mining in both its incarnations, static and dynamic, is discussed. The static framework provides bioinformatic tools applicable to multifaceted datasets, containing genetic, transcriptomic, proteomic, and clinical data for diseased patients and normal subjects. The dynamic redescription framework provides systems biology tools to model complex sets of regulatory, metabolic and signaling pathways in the initiation and progression of a disease. As an example, the case of chronic fatigue syndrome (CFS) is considered, which has so far remained intractable and unpredictable in its etiology and nosology. The redescription mining approaches can be applied to the Centers for Disease Control and Prevention's Wichita (KS, USA) dataset, integrating transcriptomic, epidemiological and clinical data, and can also be used to study how pathways in the hypothalamic-pituitary-adrenal axis affect CFS patients.

  14. A web-based neurological pain classifier tool utilizing Bayesian decision theory for pain classification in spinal cord injury patients

    NASA Astrophysics Data System (ADS)

    Verma, Sneha K.; Chun, Sophia; Liu, Brent J.

    2014-03-01

    Pain is a common complication after spinal cord injury, with prevalence estimates ranging from 77% to 81%, and it strongly affects a patient's lifestyle and well-being. In the current clinical setting, paper-based forms are used to classify pain correctly; however, the accuracy of diagnoses and optimal management of pain largely depend on the expert reviewer, which in many cases is not possible because there are very few experts in this field. The need for a clinical decision support system that can be used by expert and non-expert clinicians has been cited in the literature, but such a system has not been developed. We have designed and developed a stand-alone tool for correctly classifying pain type in spinal cord injury (SCI) patients using Bayesian decision theory. Various machine learning simulation methods are used to verify the algorithm using a pilot study data set of 48 patients. The data set consists of the paper-based forms collected at the Long Beach VA clinic, with pain classification done by an expert in the field. Using WEKA as the machine learning tool, we tested on the 48-patient dataset the hypothesis that the attributes collected on the forms and the pain location marked by patients have a very significant impact on pain type classification. This tool will be integrated with an imaging informatics system to support a clinical study that will test the effectiveness of using Proton Beam radiotherapy for treating spinal cord injury (SCI) related neuropathic pain as an alternative to invasive surgical lesioning.
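
    A hedged sketch of the classification step is shown below, using scikit-learn's Gaussian naive Bayes classifier in place of the WEKA workflow described in the abstract. The feature matrix (coded form attributes plus pain location) and the labels are random placeholders, not the study data.

      # Illustrative Bayesian classification with cross-validation; placeholder
      # data standing in for the 48-patient pilot data set.
      import numpy as np
      from sklearn.naive_bayes import GaussianNB
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.random((48, 6))           # 48 patients, 6 coded form attributes
      y = rng.integers(0, 3, size=48)   # 3 hypothetical pain-type classes

      clf = GaussianNB()
      scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validation
      print("mean accuracy:", scores.mean())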

  15. Development of a multilevel health and safety climate survey tool within a mining setting.

    PubMed

    Parker, Anthony W; Tones, Megan J; Ritchie, Gabrielle E

    2017-09-01

    This study aimed to design, implement and evaluate the reliability and validity of a multifactorial and multilevel health and safety climate survey (HSCS) tool with utility in the Australian mining setting. An 84-item questionnaire was developed and pilot tested on a sample of 302 Australian miners across two open cut sites. A 67-item, 10 factor solution was obtained via exploratory factor analysis (EFA) representing prioritization and attitudes to health and safety across multiple domains and organizational levels. Each factor demonstrated a high level of internal reliability, and a series of ANOVAs determined a high level of consistency in responses across the workforce, and generally irrespective of age, experience or job category. Participants tended to hold favorable views of occupational health and safety (OH&S) climate at the management, supervisor, workgroup and individual level. The survey tool demonstrated reliability and validity for use within an open cut Australian mining setting and supports a multilevel, industry specific approach to OH&S climate. Findings suggested a need for mining companies to maintain high OH&S standards to minimize risks to employee health and safety. Future research is required to determine the ability of this measure to predict OH&S outcomes and its utility within other mine settings. As this tool integrates health and safety, it may have benefits for assessment, monitoring and evaluation in the industry, and improving the understanding of how health and safety climate interact at multiple levels to influence OH&S outcomes. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.

  16. Integration of Molecular Dynamics Based Predictions into the Optimization of De Novo Protein Designs: Limitations and Benefits.

    PubMed

    Carvalho, Henrique F; Barbosa, Arménio J M; Roque, Ana C A; Iranzo, Olga; Branco, Ricardo J F

    2017-01-01

    Recent advances in de novo protein design have gained considerable insight from the intrinsic dynamics of proteins, through the integration of molecular dynamics simulation protocols into the state-of-the-art de novo protein design protocols used today. With this protocol we illustrate how to set up and run a molecular dynamics simulation followed by a functional protein dynamics analysis. New users will be introduced to some useful open-source computational tools, including the GROMACS molecular dynamics simulation software package and ProDy for protein structural dynamics analysis.
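
    As one concrete example of the protein dynamics analysis step, the sketch below builds an anisotropic network model with ProDy from a PDB structure. The PDB identifier is a placeholder, and in the full protocol the analysis would be applied to the GROMACS trajectory rather than a single static structure.

      # Minimal ProDy sketch: build an ANM from C-alpha coordinates and report
      # per-residue square fluctuations. Placeholder structure ID.
      from prody import parsePDB, ANM, calcSqFlucts

      structure = parsePDB("1ubi")                       # placeholder PDB ID
      calphas = structure.select("protein and name CA")

      anm = ANM("example ANM")
      anm.buildHessian(calphas)          # Hessian from C-alpha coordinates
      anm.calcModes(n_modes=10)          # lowest-frequency normal modes

      flucts = calcSqFlucts(anm)         # per-residue square fluctuations
      print(flucts[:10])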

  17. SEURAT: Visual analytics for the integrated analysis of microarray data

    PubMed Central

    2010-01-01

    Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data. PMID:20525257

  18. VSDMIP: virtual screening data management on an integrated platform

    NASA Astrophysics Data System (ADS)

    Gil-Redondo, Rubén; Estrada, Jorge; Morreale, Antonio; Herranz, Fernando; Sancho, Javier; Ortiz, Ángel R.

    2009-03-01

    A novel software package (VSDMIP) for the virtual screening (VS) of chemical libraries integrated within a MySQL relational database is presented. Two main features make VSDMIP clearly distinguishable from other existing computational tools: (i) its database, which stores not only ligand information but also the results from every step in the VS process, and (ii) its modular and pluggable architecture, which allows customization of the VS stages (such as the programs used for conformer generation or docking) through the definition of a detailed workflow employing user-configurable XML files. VSDMIP, therefore, facilitates the storage and retrieval of VS results, easily adapts to the specific requirements of each method and tool used in the experiments, and allows the comparison of different VS methodologies. To validate the usefulness of VSDMIP as an automated tool for carrying out VS, several experiments were run on six protein targets (acetylcholinesterase, cyclin-dependent kinase 2, coagulation factor Xa, estrogen receptor alpha, p38 MAP kinase, and neuraminidase) using nine binary (active/inactive) test sets. The performance of several VS configurations was evaluated by means of enrichment factors and receiver operating characteristic plots.

  19. Science, technology, and pedagogy: Exploring secondary science teachers' effective uses of technology

    NASA Astrophysics Data System (ADS)

    Guzey, Siddika Selcen

    Technology has become a vital part of our professional and personal lives. Today we cannot imagine living without many technological tools such as computers. For the last two decades technology has become inseparable from several areas, such as science. However, it has not been fully integrated into the field of education. The integration of technology in teaching and learning is still challenging even though there has been a historical growth of Internet access and available technology tools in schools (U.S. Department of Education, National Center for Education Statistics, 2006). Most teachers have not incorporated technology into their teaching for various reasons such as lack of knowledge of educational technology tools and having unfavorable beliefs about the effectiveness of technology on student learning. In this study, three beginning science teachers who have achieved successful technology integration were followed to investigate how their beliefs, knowledge, and identity contribute to their uses of technology in their classroom instruction. Extensive classroom observations and interviews were conducted. The findings demonstrate that the participating teachers are all intrinsically motivated to use technology in their teaching and this motivation allows them to enjoy using technology in their instruction and keeps them engaged in technology use. These teachers use a variety of technology tools in their instruction while also allowing students to use them, and they posit a belief set in favor of technology. The major findings of the study are displayed in a model which indicates that teachers' use of technology in classroom instruction was constructed jointly by their technology, pedagogy, and content knowledge; identity; beliefs; and the resources that are available to them and that the internalization of the technology use comes from reflection. The study has implications for teachers, teacher educators, and school administrators for successful technology integration into science classrooms.

  20. Development and Integration of Professional Core Values Among Practicing Clinicians.

    PubMed

    McGinnis, Patricia Quinn; Guenther, Lee Ann; Wainwright, Susan F

    2016-09-01

    The physical therapy profession has adopted professional core values, which define expected values for its members, and developed a self-assessment tool with sample behaviors for each of the 7 core values. However, evidence related to the integration of these core values into practice is limited. The aims of this study were: (1) to gain insight into physical therapists' development of professional core values and (2) to gain insight into participants' integration of professional core values into clinical practice. A qualitative design permitted in-depth exploration of the development and integration of the American Physical Therapy Association's professional core values into physical therapist practice. Twenty practicing physical therapists were purposefully selected to explore the role of varied professional, postprofessional, and continuing education experiences related to exposure to professional values. The Core Values Self-Assessment and résumé sort served as prompts for reflection via semistructured interviews. Three themes were identified: (1) personal values were the foundation for developing professional values, which were further shaped by academic and clinical experiences, (2) core values were integrated into practice independent of practice setting and varied career paths, and (3) participants described the following professional core values as well integrated into their practice: integrity, compassion/caring, and accountability. Social responsibility was an area consistently identified as not being integrated into their practice. The Core Values Self-Assessment tool is a consensus-based document developed through a Delphi process. Future studies to establish reliability and construct validity of the tool may be warranted. Gaining an in-depth understanding of how practicing clinicians incorporate professional core values into clinical practice may shed light on the relationship between core values mastery and its impact on patient care. Findings may help shape educators' decisions for professional (entry-level), postprofessional, and continuing education. © 2016 American Physical Therapy Association.

  1. From Physical Process to Economic Cost - Integrated Approaches of Landslide Risk Assessment

    NASA Astrophysics Data System (ADS)

    Klose, M.; Damm, B.

    2014-12-01

    The nature of landslides is complex in many respects, with landslide hazard and impact being dependent on a variety of factors. This obviously requires an integrated assessment for fundamental understanding of landslide risk. Integrated risk assessment, according to the approach presented in this contribution, implies combining prediction of future landslide occurrence with analysis of landslide impact in the past. A critical step for assessing landslide risk in integrated perspective is to analyze what types of landslide damage affected people and property in which way and how people contributed and responded to these damage types. In integrated risk assessment, the focus is on systematic identification and monetization of landslide damage, and analytical tools that allow deriving economic costs from physical landslide processes are at the heart of this approach. The broad spectrum of landslide types and process mechanisms as well as nonlinearity between landslide magnitude, damage intensity, and direct costs are some main factors explaining recent challenges in risk assessment. The two prevailing approaches for assessing the impact of landslides in economic terms are cost survey (ex-post) and risk analysis (ex-ante). Both approaches are able to complement each other, but yet a combination of them has not been realized so far. It is common practice today to derive landslide risk without considering landslide process-based cause-effect relationships, since integrated concepts or new modeling tools expanding conventional methods are still widely missing. The approach introduced in this contribution is based on a systematic framework that combines cost survey and GIS-based tools for hazard or cost modeling with methods to assess interactions between land use practices and landslides in historical perspective. Fundamental understanding of landslide risk also requires knowledge about the economic and fiscal relevance of landslide losses, wherefore analysis of their impact on public budgets is a further component of this approach. In integrated risk assessment, combination of methods plays an important role, with the objective of collecting and integrating complex data sets on landslide risk.

  2. Sentinel-2 ArcGIS Tool for Environmental Monitoring

    NASA Astrophysics Data System (ADS)

    Plesoianu, Alin; Cosmin Sandric, Ionut; Anca, Paula; Vasile, Alexandru; Calugaru, Andreea; Vasile, Cristian; Zavate, Lucian

    2017-04-01

    This paper addresses one of the biggest challenges regarding Sentinel-2 data, related to the need for an efficient tool to access and process the large collection of images that are available. Consequently, developing a tool for the automation of Sentinel-2 data analysis is the most immediate need. We developed a series of tools for the automation of Sentinel-2 data download and processing for vegetation health monitoring. The tools automatically perform the following operations: downloading image tiles from ESA's Scientific Hub or other vendors (Amazon), pre-processing of the images to extract the 10-m bands, creating image composites, applying a series of vegetation indices (NDVI, OSAVI, etc.), and performing change detection analyses on different temporal data sets. All of these tools run in a dynamic way in the ArcGIS Platform, without the need of creating intermediate datasets (rasters, layers), as the images are processed on-the-fly in order to avoid data duplication. Finally, they allow complete integration with the ArcGIS environment and workflows.
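
    As a hedged sketch of the vegetation-index step, the code below computes NDVI from the two 10-m Sentinel-2 bands (band 4, red; band 8, near-infrared) using rasterio and numpy rather than the ArcGIS tools themselves. The file names are placeholders.

      # NDVI = (NIR - Red) / (NIR + Red) from Sentinel-2 10-m bands.
      import numpy as np
      import rasterio

      with rasterio.open("B04_10m.jp2") as red_src, \
           rasterio.open("B08_10m.jp2") as nir_src:      # placeholder paths
          red = red_src.read(1).astype("float32")
          nir = nir_src.read(1).astype("float32")
          profile = red_src.profile

      ndvi = np.where((nir + red) > 0, (nir - red) / (nir + red), 0.0)

      profile.update(driver="GTiff", dtype="float32", count=1)
      with rasterio.open("ndvi.tif", "w", **profile) as dst:
          dst.write(ndvi.astype("float32"), 1)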

  3. What Should We Grow in Our School Garden to Sell at the Farmers' Market? Initiating Statistical Literacy through Science and Mathematics Integration

    ERIC Educational Resources Information Center

    Selmer, Sarah J.; Rye, James A.; Malone, Elizabeth; Fernandez, Danielle; Trebino, Kathryn

    2014-01-01

    Statistical literacy is essential to scientific literacy, and the quest for such is best initiated in the elementary grades. The "Next Generation Science Standards and the Common Core State Standards for Mathematics" set forth practices (e.g., asking questions, using tools strategically to analyze and interpret data) and content (e.g.,…

  4. Dancing with Mobile Devices: The LAIT Application System in Performance and Educational Settings

    ERIC Educational Resources Information Center

    Toenjes, John; Beck, Ken; Reimer, M. Anthony; Mott, Erica

    2016-01-01

    At most any performance today, you will be notified to turn off your cell phone. The smartphone has become such an integral tool in our daily lives that turning it off is tantamount to severing our connection to our community and challenging the way we view and negotiate the world. Many audience members, particularly young ones, will be looking at…

  5. geneLAB: Expanding the Impact of NASA's Biological Research in Space

    NASA Technical Reports Server (NTRS)

    Rayl, Nicole; Smith, Jeffrey D.

    2014-01-01

    The geneLAB project is designed to leverage the value of large 'omics' datasets from molecular biology projects conducted on the ISS by making these datasets available, citable, discoverable, interpretable, reusable, and reproducible. geneLAB will create a collaboration space with an integrated set of tools for depositing, accessing, analyzing, and modeling these diverse datasets from spaceflight and related terrestrial studies.

  6. Human Behavioral Representations with Realistic Personality and Cultural Characteristics

    DTIC Science & Technology

    2005-06-01

    personality factors as customizations to an underlying formally rational symbolic architecture, PAC uses dimensions of personality, emotion, and culture as...foundations for the cognitive process. The structure of PAC allows it to function as a personality/emotional layer that can be used stand-alone or...integrated with existing constrained-rationality cognitive architectures. In addition, a set of tools was developed to support the authoring

  7. Networking and Information Technology Research and Development. Supplement to the President’s Budget for FY 2002

    DTIC Science & Technology

    2001-07-01

    Web-based applications to improve health data systems and quality of care; innovative strategies for data collection in clinical settings; approaches...research to increase interoperability and integration of software in distributed systems; protocols and tools for data annotation and management; and...Generation National Defense and National Security Systems; Improved Health Care Systems for All Citizens

  8. Preparing WIND for the STEREO Mission

    NASA Astrophysics Data System (ADS)

    Schroeder, P.; Ogilve, K.; Szabo, A.; Lin, R.; Luhmann, J.

    2006-05-01

    The upcoming STEREO mission's IMPACT and PLASTIC investigations will provide the first opportunity for long duration, detailed observations of 1 AU magnetic field structures, plasma ions and electrons, suprathermal electrons, and energetic particles at points bracketing Earth's heliospheric location. Stereoscopic/3D information from the STEREO SECCHI imagers and SWAVES radio experiment will make it possible to use both multipoint and quadrature studies to connect interplanetary Coronal Mass Ejections (ICME) and solar wind structures to CMEs and coronal holes observed at the Sun. To fully exploit these unique data sets, tight integration with similarly equipped missions at L1 will be essential, particularly WIND and ACE. The STEREO mission is building novel data analysis tools to take advantage of the mission's scientific potential. These tools will require reliable access and a well-documented interface to the L1 data sets. Such an interface already exists for ACE through the ACE Science Center. We plan to provide a similar service for the WIND mission that will supplement existing CDAWeb services. Building on tools also being developed for STEREO, we will create a SOAP application program interface (API) which will allow both our STEREO/WIND/ACE interactive browser and third-party software to access WIND data as a seamless and integral part of the STEREO mission. The API will also allow for more advanced forms of data mining than currently available through other data web services. Access will be provided to WIND-specific data analysis software as well. The development of cross-spacecraft data analysis tools will allow a larger scientific community to combine STEREO's unique in-situ data with those of other missions, particularly the L1 missions, and, therefore, to maximize STEREO's scientific potential in gaining a greater understanding of the heliosphere.

  9. International Space Station (ISS) Anomalies Trending Study. Volume II; Appendices

    NASA Technical Reports Server (NTRS)

    Beil, Robert J.; Brady, Timothy K.; Foster, Delmar C.; Graber, Robert R.; Malin, Jane T.; Thornesbery, Carroll G.; Throop, David R.

    2015-01-01

    The NASA Engineering and Safety Center (NESC) set out to utilize data mining and trending techniques to review the anomaly history of the International Space Station (ISS) and provide tools for discipline experts not involved with the ISS Program to search anomaly data to aid in identification of areas that may warrant further investigation. Additionally, the assessment team aimed to develop an approach and skillset for integrating data sets, with the intent of providing an enriched data set for discipline experts to investigate that is easier to navigate, particularly in light of ISS aging and the plan to extend its life into the late 2020s. This document contains the Appendices to the Volume I report.

  10. An introduction to the new Productivity Information Management System (PIMS)

    NASA Technical Reports Server (NTRS)

    Hull, R.

    1982-01-01

    The Productivity Information Management System (PIMS) is described. The main objective of this computerized system is to enable management scientists to interactively explore data concerning DSN operations, maintenance and repairs, to develop and verify models for management planning. The PIMS will provide a powerful set of tools for iteratively manipulating data sets in a wide variety of ways. The initial version of PIMS will be a small scale pilot system. The following topics are discussed: (1) the motivation for developing PIMS; (2) various data sets which will be integrated by PIMS; (3) overall design of PIMS; and (4) how PIMS will be used. A survey of relevant databases concerning DSN operations at Goldstone is also included.

  11. International Space Station (ISS) Anomalies Trending Study

    NASA Technical Reports Server (NTRS)

    Beil, Robert J.; Brady, Timothy K.; Foster, Delmar C.; Graber, Robert R.; Malin, Jane T.; Thornesbery, Carroll G.; Throop, David R.

    2015-01-01

    The NASA Engineering and Safety Center (NESC) set out to utilize data mining and trending techniques to review the anomaly history of the International Space Station (ISS) and provide tools for discipline experts not involved with the ISS Program to search anomaly data to aid in identification of areas that may warrant further investigation. Additionally, the assessment team aimed to develop an approach and skillset for integrating data sets, with the intent of providing an enriched data set for discipline experts to investigate that is easier to navigate, particularly in light of ISS aging and the plan to extend its life into the late 2020s. This report contains the outcome of the NESC Assessment.

  12. Case Report: Activity Diagrams for Integrating Electronic Prescribing Tools into Clinical Workflow

    PubMed Central

    Johnson, Kevin B.; FitzHenry, Fern

    2006-01-01

    To facilitate the future implementation of an electronic prescribing system, this case study modeled prescription management processes in various primary care settings. The Vanderbilt e-prescribing design team conducted initial interviews with clinic managers, physicians and nurses, and then represented the sequences of steps carried out to complete prescriptions in activity diagrams. The diagrams covered outpatient prescribing for patients during a clinic visit and between clinic visits. Practice size, practice setting, and practice specialty type influenced the prescribing processes used. The model developed may be useful to others engaged in building or tailoring an e-prescribing system to meet the specific workflows of various clinic settings. PMID:16622168

  13. deepTools2: a next generation web server for deep-sequencing data analysis.

    PubMed

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-08

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continue to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
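
    For users who prefer the command line over the Galaxy front end, one deepTools tool can be driven from a short Python wrapper, as sketched below. The file names are placeholders, and the exact option names should be checked against the installed deepTools version.

      # Run bamCoverage (one deepTools command-line tool) from Python.
      import subprocess

      cmd = [
          "bamCoverage",
          "-b", "aligned_reads.bam",   # input BAM (placeholder path)
          "-o", "coverage.bw",         # output bigWig (placeholder path)
          "--binSize", "50",           # bin width in bases
      ]
      subprocess.run(cmd, check=True)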

  14. OrthoSelect: a protocol for selecting orthologous groups in phylogenomics.

    PubMed

    Schreiber, Fabian; Pick, Kerstin; Erpenbeck, Dirk; Wörheide, Gert; Morgenstern, Burkhard

    2009-07-16

    Phylogenetic studies using expressed sequence tags (EST) are becoming a standard approach to answer evolutionary questions. Such studies are usually based on large sets of newly generated, unannotated, and error-prone EST sequences from different species. A first crucial step in EST-based phylogeny reconstruction is to identify groups of orthologous sequences. From these data sets, appropriate target genes are selected, and redundant sequences are eliminated to obtain suitable sequence sets as input data for tree-reconstruction software. Generating such data sets manually can be very time consuming. Thus, software tools are needed that carry out these steps automatically. We developed a flexible and user-friendly software pipeline, running on desktop machines or computer clusters, that constructs data sets for phylogenomic analyses. It automatically searches assembled EST sequences against databases of orthologous groups (OG), assigns ESTs to these predefined OGs, translates the sequences into proteins, eliminates redundant sequences assigned to the same OG, creates multiple sequence alignments of identified orthologous sequences and offers the possibility to further process this alignment in a last step by excluding potentially homoplastic sites and selecting sufficiently conserved parts. Our software pipeline can be used as it is, but it can also be adapted by integrating additional external programs. This makes the pipeline useful for non-bioinformaticians as well as for bioinformatic experts. The software pipeline is especially designed for ESTs, but it can also handle protein sequences. OrthoSelect is a tool that produces orthologous gene alignments from assembled ESTs. Our tests show that OrthoSelect detects orthologs in EST libraries with high accuracy. In the absence of a gold standard for orthology prediction, we compared predictions by OrthoSelect to a manually created and published phylogenomic data set. Our tool was not only able to rebuild the data set with a specificity of 98%, but it detected four percent more orthologous sequences. Furthermore, the results OrthoSelect produces are in absolute agreement with the results of other programs, but our tool offers a significant speedup and additional functionality, e.g. handling of ESTs, computing sequence alignments, and refining them. To our knowledge, there is currently no fully automated and freely available tool for this purpose. Thus, OrthoSelect is a valuable tool for researchers in the field of phylogenomics who deal with large quantities of EST sequences. OrthoSelect is written in Perl and runs on Linux/Mac OS X. The tool can be downloaded at (http://gobics.de/fabian/orthoselect.php).

  15. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as Java software program and available as open-source from . PMID:17389034

  16. Indicators and measurement tools for health system integration: a knowledge synthesis protocol.

    PubMed

    Oelke, Nelly D; Suter, Esther; da Silva Lima, Maria Alice Dias; Van Vliet-Brown, Cheryl

    2015-07-29

    Health system integration is a key component of health system reform with the goal of improving outcomes for patients, providers, and the health system. Although health systems continue to strive for better integration, current delivery of health services continues to be fragmented. A key gap in the literature is the lack of information on what successful integration looks like and how to measure achievement towards an integrated system. This multi-site study protocol builds on a prior knowledge synthesis completed by two of the primary investigators which identified 10 key principles that collectively support health system integration. The aim is to answer two research questions: What are appropriate indicators for each of the 10 key integration principles developed in our previous knowledge synthesis and what measurement tools are used to measure these indicators? To enhance generalizability of the findings, a partnership between Canada and Brazil was created as health system integration is a priority in both countries and they share similar contexts. This knowledge synthesis will follow an iterative scoping review process with emerging information from knowledge-user engagement leading to the refinement of research questions and study selection. This paper describes the methods for each phase of the study. Research questions were developed with stakeholder input. Indicator identification and prioritization will utilize a modified Delphi method and patient/user focus groups. Based on priority indicators, a search of the literature will be completed and studies screened for inclusion. Quality appraisal of relevant studies will be completed prior to data extraction. Results will be used to develop recommendations and key messages to be presented through integrated and end-of-grant knowledge translation strategies with researchers and knowledge-users from the three jurisdictions. This project will directly benefit policy and decision-makers by providing an easy accessible set of indicators and tools to measure health system integration across different contexts and cultures. Being able to evaluate the success of integration strategies and initiatives will lead to better health system design and improved health outcomes for patients.

  17. Demonstrating an Effective Marine Biodiversity Observation Network in the Santa Barbara Channel

    NASA Astrophysics Data System (ADS)

    Miller, R. J.

    2016-02-01

    The Santa Barbara Channel (SBC) is a transition zone characterized by high species and habitat diversity and strong environmental gradients within a relatively small area where cold- and warm-water species found from Baja to the Bering Sea coexist. These characteristics make SBC an ideal setting for our demonstration Marine Biodiversity Observation Network (BON) project that integrates biological levels from genes to habitats and links biodiversity observations to environmental forcing and biogeography. SBC BON is building a comprehensive demonstration system that includes representation of all levels of biotic diversity, key new tools to expand the scales of present observation, and a data management network to integrate new and existing data sources. Our system will be scalable to expand into a full regional Marine BON, and the methods and decision support tools we develop will be transferable to other regions. Incorporating a broad set of habitats including nearshore coast, continental shelf, and pelagic, and taxonomic breadth from microbes to whales will facilitate this transferability. The Santa Barbara Channel marine BON has three broad objectives: 1. Integrate biodiversity data to enable inferences about regional biodiversity 2. Develop advanced methods in optical and acoustic imaging and genomics for monitoring biodiversity in partnership with ongoing monitoring and research programs to begin filling the gaping gaps in our knowledge. 3. Implement a tradeoff framework that optimizes allocation of sampling effort. Here we discuss our progress towards these goals and challenges in developing an effective MBON.

  18. Reengineering Workflow for Curation of DICOM Datasets.

    PubMed

    Bennett, William; Smith, Kirk; Jarosz, Quasar; Nolan, Tracy; Bosch, Walter

    2018-06-15

    Reusable, publicly available data is a pillar of open science and rapid advancement of cancer imaging research. Sharing data from completed research studies not only saves research dollars required to collect data, but also helps ensure that studies are both replicable and reproducible. The Cancer Imaging Archive (TCIA) is a global shared repository for imaging data related to cancer. Ensuring the consistency, scientific utility, and anonymity of data stored in TCIA is of utmost importance. As the rate of submission to TCIA has been increasing, both in volume and complexity of DICOM objects stored, the process of curation of collections has become a bottleneck in acquisition of data. In order to increase the rate of curation of image sets, improve the quality of the curation, and better track the provenance of changes made to submitted DICOM image sets, a custom set of tools was developed, using novel methods for the analysis of DICOM data sets. These tools are written in the programming language Perl, use the open-source database PostgreSQL, make use of the Perl DICOM routines in the open-source package Posda, and incorporate DICOM diagnostic tools from other open-source packages, such as dicom3tools. These tools are referred to as the "Posda Tools." The Posda Tools are open source and available via git at https://github.com/UAMS-DBMI/PosdaTools . In this paper, we briefly describe the Posda Tools and discuss the novel methods employed by these tools to facilitate rapid analysis of DICOM data, including the following: (1) use of a database schema which is more permissive, and differently normalized, than traditional DICOM databases; (2) automatic integrity checks performed on a bulk basis; (3) bulk application of revisions to DICOM datasets, either through a web-based interface or via command-line Perl scripts; (4) tracking of all such edits in a revision tracker, with the ability to roll them back; (5) a UI for inspecting the results of such edits, to verify that they are what was intended; (6) identification of DICOM Studies, Series, and SOP instances using "nicknames" which are persistent and have well-defined scope, to make expression of reported DICOM errors easier to manage; and (7) rapid identification of potential duplicate DICOM datasets by their pixel data; this can be used, e.g., to identify submission subjects which may relate to the same individual, without identifying the individual.
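
    The Posda Tools themselves are written in Perl and are not reproduced here; the snippet below is only a minimal Python sketch of the last idea, flagging potential duplicate DICOM instances by hashing their pixel data. The directory layout, file suffix, and names are assumptions, and real curation would also normalize transfer syntax before comparing.

      # Hypothetical illustration; not part of the Posda Tools.
      import hashlib
      from collections import defaultdict
      from pathlib import Path

      import pydicom  # third-party package: pip install pydicom

      def pixel_digest(path):
          """Return a SHA-256 digest of the raw PixelData element, or None if absent."""
          ds = pydicom.dcmread(path)
          if "PixelData" not in ds:
              return None
          return hashlib.sha256(ds.PixelData).hexdigest()

      def find_potential_duplicates(root):
          """Group DICOM files under `root` by pixel-data digest; groups with more
          than one file are candidate duplicates for curator review."""
          groups = defaultdict(list)
          for path in Path(root).rglob("*.dcm"):
              digest = pixel_digest(path)
              if digest is not None:
                  groups[digest].append(path)
          return {d: files for d, files in groups.items() if len(files) > 1}

      if __name__ == "__main__":
          for digest, files in find_potential_duplicates("incoming_submission").items():
              print(digest[:12], [f.name for f in files])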

  19. Real Time Metrics and Analysis of Integrated Arrival, Departure, and Surface Operations

    NASA Technical Reports Server (NTRS)

    Sharma, Shivanjli; Fergus, John

    2017-01-01

    To address the Integrated Arrival, Departure, and Surface (IADS) challenge, NASA is developing and demonstrating trajectory-based departure automation under a collaborative effort with the FAA and industry known as Airspace Technology Demonstration 2 (ATD-2). ATD-2 builds upon and integrates previous NASA research capabilities that include the Spot and Runway Departure Advisor (SARDA), the Precision Departure Release Capability (PDRC), and the Terminal Sequencing and Spacing (TSAS) capability. As trajectory-based departure scheduling and collaborative decision making tools are introduced in order to reduce delays and uncertainties in taxi and climb operations across the National Airspace System, users of the tools across a number of roles benefit from a real time system that enables common situational awareness. A real time dashboard was developed to inform users and present notifications and integrated information regarding airport surface operations. The dashboard is a supplement to capabilities and tools that incorporate arrival, departure, and surface air-traffic operations concepts in a NextGen environment. In addition to shared situational awareness, the dashboard offers the ability to compute real time metrics and analysis to inform users about the capacity, predictability, and efficiency of the system as a whole. This paper describes the architecture of the real time dashboard as well as an initial proposed set of metrics. The potential impact of the real time dashboard is studied at the site identified for initial deployment and demonstration in 2017: Charlotte-Douglas International Airport (CLT). The architecture of implementing such a tool as well as potential uses are presented for operations at CLT. Metrics computed in real time illustrate the opportunity to provide common situational awareness and inform users of system delay, throughput, taxi time, and airport capacity. In addition, common awareness of delays and the impact of takeoff and departure restrictions stemming from traffic flow management initiatives are explored. The potential of the real time tool to inform users of the predictability and efficiency of using a trajectory-based departure scheduling system is also discussed.
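
    The internal data feeds and metric definitions of the ATD-2 dashboard are not spelled out in this summary, so the following Python sketch is only an illustration of the kind of real-time surface metrics described (taxi-out time and departure delay); the record fields are assumptions.

      # Illustrative only; field names are assumed, not taken from ATD-2.
      from dataclasses import dataclass
      from datetime import datetime
      from typing import Iterable

      @dataclass
      class DepartureRecord:
          flight_id: str
          pushback_time: datetime         # assumed: gate/spot pushback
          wheels_off_time: datetime       # assumed: actual takeoff
          scheduled_wheels_off: datetime  # assumed: scheduled takeoff

      def _mean_minutes(deltas):
          minutes = [d.total_seconds() / 60.0 for d in deltas]
          return sum(minutes) / len(minutes) if minutes else 0.0

      def average_taxi_out(records: Iterable[DepartureRecord]) -> float:
          """Mean taxi-out duration (pushback to wheels-off), in minutes."""
          return _mean_minutes(r.wheels_off_time - r.pushback_time for r in records)

      def average_departure_delay(records: Iterable[DepartureRecord]) -> float:
          """Mean difference between actual and scheduled wheels-off, in minutes."""
          return _mean_minutes(r.wheels_off_time - r.scheduled_wheels_off for r in records)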

  20. Integration among databases and data sets to support productive nanotechnology: Challenges and recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karcher, Sandra; Willighagen, Egon L.; Rumble, John

    Many groups within the broad field of nanoinformatics are already developing data repositories and analytical tools driven by their individual organizational goals. Integrating these data resources across disciplines and with non-nanotechnology resources can support multiple objectives by enabling the reuse of the same information. Integration can also serve as the impetus for novel scientific discoveries by providing the framework to support deeper data analyses. This article discusses current data integration practices in nanoinformatics and in comparable mature fields, and nanotechnology-specific challenges impacting data integration. Based on results from a nanoinformatics-community-wide survey, recommendations for achieving integration of existing operational nanotechnology resources are presented. Nanotechnology-specific data integration challenges, if effectively resolved, can foster the application and validation of nanotechnology within and across disciplines. This paper is one of a series of articles by the Nanomaterial Data Curation Initiative that address data issues such as data curation workflows, data completeness and quality, curator responsibilities, and metadata.

  1. EAGLE: 'EAGLE Is an Algorithmic Graph Library for Exploration'

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-01-16

    The Resource Description Framework (RDF) and SPARQL Protocol and RDF Query Language (SPARQL) were introduced about a decade ago to enable flexible schema-free data interchange on the Semantic Web. Today data scientists use the framework as a scalable graph representation for integrating, querying, exploring and analyzing data sets hosted at different sources. With increasing adoption, the need for graph mining capabilities for the Semantic Web has emerged. Today there are no tools to conduct "graph mining" on RDF standard data sets. We address that need through the implementation of popular iterative graph mining algorithms (triangle count, connected component analysis, degree distribution, diversity degree, PageRank, etc.). We implement these algorithms as SPARQL queries wrapped within Python scripts, and we call our software tool EAGLE. In RDF style, EAGLE stands for "EAGLE 'Is an' Algorithmic Graph Library for Exploration." EAGLE is like 'MATLAB' for 'Linked Data.'
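
    EAGLE's own scripts are not shown in this record; as a hedged illustration of the pattern the abstract describes (a graph metric expressed as a SPARQL query and wrapped in Python), the sketch below computes an out-degree distribution against a placeholder SPARQL endpoint using the SPARQLWrapper package.

      # Illustrative sketch; the endpoint URL is a placeholder, and this is not EAGLE's code.
      from SPARQLWrapper import SPARQLWrapper, JSON  # pip install sparqlwrapper

      ENDPOINT = "http://localhost:3030/dataset/sparql"

      DEGREE_DISTRIBUTION = """
      SELECT ?degree (COUNT(?s) AS ?numNodes)
      WHERE {
        { SELECT ?s (COUNT(?o) AS ?degree)
          WHERE { ?s ?p ?o }
          GROUP BY ?s }
      }
      GROUP BY ?degree
      ORDER BY ?degree
      """

      def degree_distribution(endpoint=ENDPOINT):
          """Map each out-degree to the number of subject nodes having it."""
          sparql = SPARQLWrapper(endpoint)
          sparql.setQuery(DEGREE_DISTRIBUTION)
          sparql.setReturnFormat(JSON)
          results = sparql.query().convert()
          return {int(b["degree"]["value"]): int(b["numNodes"]["value"])
                  for b in results["results"]["bindings"]}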

  2. BIG: a large-scale data integration tool for renal physiology

    PubMed Central

    Zhao, Yue; Yang, Chin-Rang; Raghuram, Viswanathan; Parulekar, Jaya

    2016-01-01

    Due to recent advances in high-throughput techniques, we and others have generated multiple proteomic and transcriptomic databases to describe and quantify gene expression, protein abundance, or cellular signaling on the scale of the whole genome/proteome in kidney cells. The existence of so much data from diverse sources raises the following question: “How can researchers find information efficiently for a given gene product over all of these data sets without searching each data set individually?” This is the type of problem that has motivated the “Big-Data” revolution in Data Science, which has driven progress in fields such as marketing. Here we present an online Big-Data tool called BIG (Biological Information Gatherer) that allows users to submit a single online query to obtain all relevant information from all indexed databases. BIG is accessible at http://big.nhlbi.nih.gov/. PMID:27279488
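
    BIG's server-side implementation is not described in this record; purely to illustrate the "one query across many indexed data sets" idea, the toy Python sketch below merges per-dataset lookups keyed by an official gene symbol. The data sets and values are invented.

      # Toy illustration; the data sets and fields are invented, not BIG's.
      PROTEOMIC_DB = {"AQP2": {"protein_abundance": 12.4}}
      TRANSCRIPTOMIC_DB = {"AQP2": {"tpm_collecting_duct": 3500.0}}
      PHOSPHO_DB = {"AQP2": {"phosphosites": ["S256", "S261", "S264", "S269"]}}

      INDEXED_SOURCES = {
          "proteomics": PROTEOMIC_DB,
          "transcriptomics": TRANSCRIPTOMIC_DB,
          "phosphoproteomics": PHOSPHO_DB,
      }

      def gather(gene_symbol):
          """Return whatever each indexed source holds for one gene symbol."""
          return {name: db.get(gene_symbol, {}) for name, db in INDEXED_SOURCES.items()}

      print(gather("AQP2"))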

  3. Update: Advancement of Contact Dynamics Modeling for Human Spaceflight Simulation Applications

    NASA Technical Reports Server (NTRS)

    Brain, Thomas A.; Kovel, Erik B.; MacLean, John R.; Quiocho, Leslie J.

    2017-01-01

    Pong is a new software tool developed at the NASA Johnson Space Center that advances interference-based geometric contact dynamics based on 3D graphics models. The Pong software consists of three parts: a set of scripts to extract geometric data from 3D graphics models, a contact dynamics engine that provides collision detection and force calculations based on the extracted geometric data, and a set of scripts for visualizing the dynamics response with the 3D graphics models. The contact dynamics engine can be linked with an external multibody dynamics engine to provide an integrated multibody contact dynamics simulation. This paper provides a detailed overview of Pong, including the overall approach and modeling capabilities, which range from force generation based on contact primitives and friction to computational performance. Two specific Pong-based examples of International Space Station applications are discussed, and the related verification and validation using this new tool are also addressed.

  4. The development of a patient-specific method for physiotherapy goal setting: a user-centered design.

    PubMed

    Stevens, Anita; Köke, Albère; van der Weijden, Trudy; Beurskens, Anna

    2018-08-01

    To deliver client-centered care, physiotherapists need to identify the patients' individual treatment goals. However, practical tools for involving patients in goal setting are lacking. The purpose of this study was to improve the frequently used Patient-Specific Complaints instrument in Dutch physiotherapy, and to develop it into a feasible method to improve physiotherapy goal setting. An iterative user-centered design was conducted in co-creation with physiotherapists and patients, in three phases. Their needs and preferences were identified by means of group meetings and questionnaires. The new method was tested in several field tests in physiotherapy practices. Four main objectives for improvement were formulated: clear instructions for the administration procedure, targeted use across the physiotherapy process, client-activating communication skills, and a client-centered attitude of the physiotherapist. A theoretical goal-setting framework and elements of shared decision making were integrated into the new Patient-Specific Goal-setting method, together with a practical training course. The user-centered approach resulted in a goal-setting method that is fully integrated in the physiotherapy process. The new goal-setting method contributes to a more structured approach to goal setting and enables patient participation and goal-oriented physiotherapy. Before large-scale implementation, its feasibility in physiotherapy practice needs to be investigated. Implications for rehabilitation: Involving patients and physiotherapists in the development and testing of a goal-setting method increases the likelihood of its feasibility in practice. The integration of a goal-setting method into the physiotherapy process offers the opportunity to focus more fully on the patient's goals. Patients should be informed about the aim of every step of the goal-setting process in order to increase their awareness and involvement. Training physiotherapists to use a patient-specific method for goal setting is crucial for a correct application.

  5. Tilling the soil while sowing the seeds: combining resident education with medical home transformation.

    PubMed

    Muench, John; Jarvis, Kelly; Boverman, Josh; Hardman, Joseph; Hayes, Meg; Winkle, Jim

    2012-01-01

    In order to successfully integrate screening, brief intervention, and referral to treatment (SBIRT) into primary care, education of clinicians must be paired with sustainable transformation of the clinical settings in which they practice. The SBIRT Oregon project adopted this strategy in an effort to fully integrate SBIRT into 7 primary care residency clinics. Residents were trained to assess and intervene in their patients' unhealthy substance use, whereas clinic staff were trained to carry out a multistep screening process. Electronic medical record tools were created to further integrate and track SBIRT processes. This article describes how a resident training curriculum complemented and was informed by the transformation of workflow processes within the residents' home clinics.

  6. Information integration for a sky survey by data warehousing

    NASA Astrophysics Data System (ADS)

    Luo, A.; Zhang, Y.; Zhao, Y.

    The virtualization service of the data system for the sky survey LAMOST is very important for astronomers. The service needs to integrate information from data collections, catalogs, and references, and to support simple federation of a set of distributed files and associated metadata. Data warehousing has been in existence for several years and has demonstrated superiority over traditional relational database management systems by providing novel indexing schemes that support efficient on-line analytical processing (OLAP) of large databases. Relational database systems such as Oracle now support the warehouse capability, which includes extensions to the SQL language to support OLAP operations, and a number of metadata management tools have been created. The information integration of LAMOST by applying data warehousing aims to effectively provide data and knowledge on-line.

  7. GeneSCF: a real-time based functional enrichment tool with support for multiple organisms.

    PubMed

    Subhash, Santhilal; Kanduri, Chandrasekhar

    2016-09-13

    High-throughput technologies such as ChIP-sequencing, RNA-sequencing, DNA sequencing and quantitative metabolomics generate a huge volume of data. Researchers often rely on functional enrichment tools to interpret the biological significance of the affected genes from these high-throughput studies. However, currently available functional enrichment tools need to be updated frequently to adapt to new entries from the functional database repositories. Hence there is a need for a simplified tool that can perform functional enrichment analysis using updated information directly from source databases such as KEGG, Reactome or Gene Ontology. In this study, we focused on designing a command-line tool called GeneSCF (Gene Set Clustering based on Functional annotations) that can predict the functionally relevant biological information for a set of genes in a real-time updated manner. It is designed to handle information from more than 4000 organisms from freely available prominent functional databases like KEGG, Reactome and Gene Ontology. We successfully employed our tool on two published datasets to predict the biologically relevant functional information. The core features of this tool were tested on Linux machines without the need to install additional dependencies. GeneSCF is more reliable compared to other enrichment tools because of its ability to use reference functional databases in real time to perform enrichment analysis. It is an easy-to-integrate tool with other pipelines available for downstream analysis of high-throughput data. More importantly, GeneSCF can run multiple gene lists simultaneously on different organisms, thereby saving time for the users. Since the tool is designed to be ready-to-use, there is no need for any complex compilation and installation procedures.

  8. Challenges in atmospheric monitoring of areal emission sources - an Open-path Fourier transform infrared (OP-FTIR) spectroscopic experience report

    NASA Astrophysics Data System (ADS)

    Schuetze, C.; Sauer, U.; Dietrich, P.

    2015-12-01

    Reliable detection and assessment of near-surface CO2 emissions from natural or anthropogenic sources require the application of various monitoring tools at different spatial scales. In particular, optical remote sensing tools for atmospheric monitoring have the potential to measure CO2 emissions integrally over larger scales (> 10,000 m2). Within the framework of the MONACO project ("Monitoring approach for geological CO2 storage sites using a hierarchical observation concept"), an integrative hierarchical monitoring concept was developed and validated at different field sites with the aim of establishing a modular observation strategy, including investigations in the shallow subsurface, at ground surface level, and in the lower atmospheric boundary layer. The main aims of the atmospheric monitoring using optical remote sensing were to observe gas dispersion into the near-surface atmosphere, to determine maximum concentration values, and to identify the main challenges associated with monitoring extended emission sources with the proposed methodological setup under typical environmental conditions. The presentation will give an overview of several case studies using the integrative approach of Open-Path Fourier Transform Infrared spectroscopy (OP-FTIR) in combination with in situ measurements. As a main result, the method was validated as a possible approach for continuous monitoring of the atmospheric composition, in terms of integral determination of GHG concentrations, and for identifying target areas that need to be investigated in more detail. In particular, data interpretation should closely consider the micrometeorological conditions. Technical aspects concerning robust equipment, the experimental setup, and fast data-processing algorithms have to be taken into account for the enhanced automation of atmospheric monitoring.

  9. DaGO-Fun: tool for Gene Ontology-based functional analysis using term information content measures

    PubMed Central

    2013-01-01

    Background: The use of Gene Ontology (GO) data in protein analyses has largely contributed to the improved outcomes of these analyses. Several GO semantic similarity measures have been proposed in recent years and provide tools that allow the integration of biological knowledge embedded in the GO structure into different biological analyses. There is a need for a unified tool that provides the scientific community with the opportunity to explore these different GO similarity measure approaches and their biological applications. Results: We have developed DaGO-Fun, an online tool available at http://web.cbio.uct.ac.za/ITGOM, which incorporates many different GO similarity measures for exploring, analyzing and comparing GO terms and proteins within the context of GO. It uses GO data and UniProt proteins with their GO annotations as provided by the Gene Ontology Annotation (GOA) project to precompute GO term information content (IC), enabling rapid response to user queries. Conclusions: The DaGO-Fun online tool presents the advantage of integrating all the relevant IC-based GO similarity measures, including topology- and annotation-based approaches, to facilitate effective exploration of these measures, thus enabling users to choose the most relevant approach for their application. Furthermore, this tool includes several biological applications related to GO semantic similarity scores, including the retrieval of genes based on their GO annotations, the clustering of functionally related genes within a set, and term enrichment analysis. PMID:24067102
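
    As background for the IC-based measures the tool integrates, the sketch below shows one classic example (Resnik similarity: the information content of the most informative common ancestor). It is not DaGO-Fun's code; the annotation counts and ancestor sets are assumed to be supplied by the caller.

      # Minimal illustration of an IC-based GO similarity measure.
      import math

      def information_content(term, annotation_count, total_annotations):
          """IC(t) = -log p(t), where p(t) is the fraction of annotations at or below t."""
          return -math.log(annotation_count[term] / total_annotations)

      def resnik_similarity(t1, t2, ancestors, annotation_count, total_annotations):
          """Resnik similarity: IC of the most informative common ancestor of t1 and t2."""
          common = (ancestors[t1] | {t1}) & (ancestors[t2] | {t2})
          if not common:
              return 0.0
          return max(information_content(t, annotation_count, total_annotations)
                     for t in common)

      # Toy DAG: GO:a subsumes GO:b and GO:c; GO:root annotates everything.
      counts = {"GO:root": 100, "GO:a": 20, "GO:b": 10, "GO:c": 5}
      ancestors = {"GO:root": set(), "GO:a": {"GO:root"},
                   "GO:b": {"GO:a", "GO:root"}, "GO:c": {"GO:a", "GO:root"}}
      print(resnik_similarity("GO:b", "GO:c", ancestors, counts, 100))  # IC of GO:a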

  10. Expert Recommender: Designing for a Network Organization

    NASA Astrophysics Data System (ADS)

    Reichling, Tim; Veith, Michael; Wulf, Volker

    Recent knowledge management initiatives focus on expertise sharing within formal organizational units and informal communities of practice. Expert recommender systems seem to be a promising tool in support of these initiatives. This paper presents experiences in designing an expert recommender system for a knowledge- intensive organization, namely the National Industry Association (NIA). Field study results provide a set of specific design requirements. Based on these requirements, we have designed an expert recommender system which is integrated into the specific software infrastructure of the organizational setting. The organizational setting is, as we will show, specific for historical, political, and economic reasons. These particularities influence the employees’ organizational and (inter-)personal needs within this setting. The paper connects empirical findings of a long-term case study with design experiences of an expertise recommender system.

  11. Integrated Measurements and Characterization | Photovoltaic Research | NREL

    Science.gov Websites

    NREL's Integrated Measurements and Characterization cluster tool offers powerful capabilities through its set of integrated tools. Basic cluster tool capabilities include sample handling over ultra-high-vacuum connections, so that a sample can be interchanged between tools, such as the Copper Indium Gallium Diselenide cluster tool.

  12. Electronic processing of informed consents in a global pharmaceutical company environment.

    PubMed

    Vishnyakova, Dina; Gobeill, Julien; Oezdemir-Zaech, Fatma; Kreim, Olivier; Vachon, Therese; Clade, Thierry; Haenning, Xavier; Mikhailov, Dmitri; Ruch, Patrick

    2014-01-01

    We present an electronic capture tool to process informed consents, which must be recorded when running a clinical trial. This tool aims at the extraction of information expressing the duration of the consent given by the patient to authorize the exploitation of biomarker-related information collected during clinical trials. The system integrates a language detection module (LDM) to route a document into the appropriate information extraction module (IEM). The IEM is based on language-specific sets of linguistic rules for the identification of relevant textual facts. The achieved accuracy of both the LDM and IEM is 99%. The architecture of the system is described in detail.
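
    The paper does not disclose how the LDM and IEMs are implemented; as a rough sketch of the routing pattern (detect the language, then hand the document to a language-specific rule set), the snippet below uses the langdetect package as a stand-in detector. The extraction functions are placeholders.

      # Illustrative routing sketch; rule functions are placeholders, not the paper's system.
      from langdetect import detect  # pip install langdetect

      def extract_consent_duration_en(text):
          return {"language": "en", "duration": None}  # English-specific rules would go here

      def extract_consent_duration_fr(text):
          return {"language": "fr", "duration": None}  # French-specific rules would go here

      IEM_BY_LANGUAGE = {
          "en": extract_consent_duration_en,
          "fr": extract_consent_duration_fr,
      }

      def process_consent(text):
          """Route a consent document to the extraction module for its detected language."""
          lang = detect(text)  # e.g. 'en', 'fr', 'de'
          module = IEM_BY_LANGUAGE.get(lang)
          if module is None:
              raise ValueError(f"No extraction rules for language {lang!r}")
          return module(text)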

  13. Tool calibration system for micromachining system

    DOEpatents

    Miller, Donald M.

    1979-03-06

    A tool calibration system including a tool calibration fixture and a tool height and offset calibration insert for calibrating the position of a tool bit in a micromachining tool system. The tool calibration fixture comprises a yokelike structure having a triangular head, a cavity in the triangular head, and a port which communicates a side of the triangular head with the cavity. Yoke arms integral with the triangular head extend along each side of a tool bar and a tool head of the micromachining tool system. The yoke arms are secured to the tool bar to place the cavity around a tool bit which may be mounted to the end of the tool head. Three linear variable differential transformer's (LVDT) are adjustably mounted in the triangular head along an X axis, a Y axis, and a Z axis. The calibration insert comprises a main base which can be mounted in the tool head of the micromachining tool system in place of a tool holder and a reference projection extending from a front surface of the main base. Reference surfaces of the calibration insert and a reference surface on a tool bar standard length are used to set the three LVDT's of the calibration fixture to the tool reference position. These positions are transferred permanently to a mastering station. The tool calibration fixture is then used to transfer the tool reference position of the mastering station to the tool bit.

  14. Integrated software system for low level waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worku, G.

    1995-12-31

    In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications.

  15. The role of collaborative ontology development in the knowledge negotiation process

    NASA Astrophysics Data System (ADS)

    Rivera, Norma

    Interdisciplinary research (IDR) collaboration can be defined as the process of integrating experts' knowledge, perspectives, and resources to advance scientific discovery. The flourishing of more complex research problems, together with the growth of scientific and technical knowledge has resulted in the need for researchers from diverse fields to provide different expertise and points of view to tackle these problems. These collaborations, however, introduce a new set of "culture" barriers as participating experts are trained to communicate in discipline-specific languages, theories, and research practices. We propose that building a common knowledge base for research using ontology development techniques can provide a starting point for interdisciplinary knowledge exchange, negotiation, and integration. The goal of this work is to extend ontology development techniques to support the knowledge negotiation process in IDR groups. Towards this goal, this work presents a methodology that extends previous work in collaborative ontology development and integrates learning strategies and tools to enhance interdisciplinary research practices. We evaluate the effectiveness of applying such methodology in three different scenarios that cover educational and research settings. The results of this evaluation confirm that integrating learning strategies can, in fact, be advantageous to overall collaborative practices in IDR groups.

  16. Multi-objective spatial tools to inform maritime spatial planning in the Adriatic Sea.

    PubMed

    Depellegrin, Daniel; Menegon, Stefano; Farella, Giulio; Ghezzo, Michol; Gissi, Elena; Sarretta, Alessandro; Venier, Chiara; Barbanti, Andrea

    2017-12-31

    This research presents a set of multi-objective spatial tools for sea planning and environmental management in the Adriatic Sea Basin. The tools address four objectives: 1) assessment of cumulative impacts from anthropogenic sea uses on environmental components of marine areas; 2) analysis of sea use conflicts; 3) 3-D hydrodynamic modelling of nutrient dispersion (nitrogen and phosphorus) from riverine sources in the Adriatic Sea Basin; and 4) marine ecosystem services capacity assessment from seabed habitats based on an ES matrix approach. Geospatial modelling results were illustrated, analysed and compared at the country level and for three biogeographic subdivisions (Northern, Central, and Southern Adriatic Sea). The paper discusses the model results in terms of their spatial implications, relevance for sea planning, and limitations, and concludes with an outlook on the need for more integrated, multi-functional tool development for sea planning. Copyright © 2017. Published by Elsevier B.V.
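
    The exact model behind the cumulative impact tool is not given in this summary; the sketch below shows a commonly used additive formulation (use intensity times component presence times sensitivity weight, summed for a grid cell), with invented layer names, purely for illustration.

      # Illustrative additive cumulative-impact score for a single grid cell.
      def cumulative_impact(use_intensity, component_presence, sensitivity):
          """Sum over uses i and components j of U_i * E_j * w_ij."""
          score = 0.0
          for use, u in use_intensity.items():
              for comp, e in component_presence.items():
                  score += u * e * sensitivity.get((use, comp), 0.0)
          return score

      uses = {"shipping": 0.8, "trawling": 0.3}           # invented use intensities
      components = {"seagrass": 1.0, "soft_bottom": 1.0}  # presence/absence of components
      weights = {("shipping", "seagrass"): 0.2, ("trawling", "seagrass"): 0.9,
                 ("shipping", "soft_bottom"): 0.1, ("trawling", "soft_bottom"): 0.6}
      print(cumulative_impact(uses, components, weights))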

  17. Democratization of Nanoscale Imaging and Sensing Tools Using Photonics

    PubMed Central

    2015-01-01

    Providing means for researchers and citizen scientists in the developing world to perform advanced measurements with nanoscale precision can help to accelerate the rate of discovery and invention as well as improve higher education and the training of the next generation of scientists and engineers worldwide. Here, we review some of the recent progress toward making optical nanoscale measurement tools more cost-effective, field-portable, and accessible to a significantly larger group of researchers and educators. We divide our review into two main sections: label-based nanoscale imaging and sensing tools, which primarily involve fluorescent approaches, and label-free nanoscale measurement tools, which include light scattering sensors, interferometric methods, photonic crystal sensors, and plasmonic sensors. For each of these areas, we have primarily focused on approaches that have either demonstrated operation outside of a traditional laboratory setting, including for example integration with mobile phones, or exhibited the potential for such operation in the near future. PMID:26068279

  18. Integrating Diabetes Guidelines into a Telehealth Screening Tool.

    PubMed

    Gervera, Kelly; Graves, Barbara Ann

    2015-01-01

    Diabetes is the seventh leading cause of death in the United States and contributes to long-term complications that are costly to healthcare systems. Twenty-five percent of all veterans in the Veterans Health Administration (VHA) have diabetes. The purpose of this article is to describe the development and implementation of a quality improvement project to embed an evidence-based diabetes screening tool, based on Veterans Affairs/Department of Defense diabetes clinical practice guidelines, into the VHA electronic medical record. The objectives of the screening tool were threefold: to promote evidence-based care, to standardize care coordination, and to promote self-management and proper utilization of resources. Record reviews were conducted to evaluate the effectiveness of the screening tool. Results showed an 88 percent increase in the assessment of annual exams and/or labs, a 16.5 percent increase in disease management assessment and offering of services, and a 50 percent increase in goal-setting activity.

  19. Democratization of Nanoscale Imaging and Sensing Tools Using Photonics.

    PubMed

    McLeod, Euan; Wei, Qingshan; Ozcan, Aydogan

    2015-07-07

    Providing means for researchers and citizen scientists in the developing world to perform advanced measurements with nanoscale precision can help to accelerate the rate of discovery and invention as well as improve higher education and the training of the next generation of scientists and engineers worldwide. Here, we review some of the recent progress toward making optical nanoscale measurement tools more cost-effective, field-portable, and accessible to a significantly larger group of researchers and educators. We divide our review into two main sections: label-based nanoscale imaging and sensing tools, which primarily involve fluorescent approaches, and label-free nanoscale measurement tools, which include light scattering sensors, interferometric methods, photonic crystal sensors, and plasmonic sensors. For each of these areas, we have primarily focused on approaches that have either demonstrated operation outside of a traditional laboratory setting, including for example integration with mobile phones, or exhibited the potential for such operation in the near future.

  20. ePlant and the 3D data display initiative: integrative systems biology on the world wide web.

    PubMed

    Fucile, Geoffrey; Di Biase, David; Nahal, Hardeep; La, Garon; Khodabandeh, Shokoufeh; Chen, Yani; Easley, Kante; Christendat, Dinesh; Kelley, Lawrence; Provart, Nicholas J

    2011-01-10

    Visualization tools for biological data are often limited in their ability to interactively integrate data at multiple scales. These computational tools are also typically limited by two-dimensional displays and programmatic implementations that require separate configurations for each of the user's computing devices and recompilation for functional expansion. Towards overcoming these limitations we have developed "ePlant" (http://bar.utoronto.ca/eplant) - a suite of open-source world wide web-based tools for the visualization of large-scale data sets from the model organism Arabidopsis thaliana. These tools display data spanning multiple biological scales on interactive three-dimensional models. Currently, ePlant consists of the following modules: a sequence conservation explorer that includes homology relationships and single nucleotide polymorphism data, a protein structure model explorer, a molecular interaction network explorer, a gene product subcellular localization explorer, and a gene expression pattern explorer. The ePlant's protein structure explorer module represents experimentally determined and theoretical structures covering >70% of the Arabidopsis proteome. The ePlant framework is accessed entirely through a web browser, and is therefore platform-independent. It can be applied to any model organism. To facilitate the development of three-dimensional displays of biological data on the world wide web we have established the "3D Data Display Initiative" (http://3ddi.org).

  1. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  2. Enhancing knowledge discovery from cancer genomics data with Galaxy

    PubMed Central

    Albuquerque, Marco A.; Grande, Bruno M.; Ritch, Elie J.; Pararajalingam, Prasath; Jessa, Selin; Krzywinski, Martin; Grewal, Jasleen K.; Shah, Sohrab P.; Boutros, Paul C.

    2017-01-01

    The field of cancer genomics has demonstrated the power of massively parallel sequencing techniques to inform on the genes and specific alterations that drive tumor onset and progression. Although large comprehensive sequence data sets continue to be made increasingly available, data analysis remains an ongoing challenge, particularly for laboratories lacking dedicated resources and bioinformatics expertise. To address this, we have produced a collection of Galaxy tools that represent many popular algorithms for detecting somatic genetic alterations from cancer genome and exome data. We developed new methods for parallelization of these tools within Galaxy to accelerate runtime and have demonstrated their usability and summarized their runtimes on multiple cloud service providers. Some tools represent extensions or refinement of existing toolkits to yield visualizations suited to cohort-wide cancer genomic analysis. For example, we present Oncocircos and Oncoprintplus, which generate data-rich summaries of exome-derived somatic mutation. Workflows that integrate these to achieve data integration and visualizations are demonstrated on a cohort of 96 diffuse large B-cell lymphomas and enabled the discovery of multiple candidate lymphoma-related genes. Our toolkit is available from our GitHub repository as Galaxy tool and dependency definitions and has been deployed using virtualization on multiple platforms including Docker. PMID:28327945

  3. The Five S’s: A Communication Tool for Child Psychiatric Access Projects

    PubMed Central

    Harrison, Joyce; Wasserman, Kate; Steinberg, Janna; Platt, Rheanna; Coble, Kelly; Bower, Kelly

    2017-01-01

    Given the gap in child psychiatric services available to meet existing pediatric behavioral health needs, children and families are increasingly seeking behavioral health services from their primary care clinicians (PCCs). However, many pediatricians report not feeling adequately trained to meet these needs. As a result, child psychiatric access projects (CPAPs) are being developed around the country to support the integration of care for children. Despite the promise and success of these programs, there are barriers, including the challenge of effective communication between PCCs and child psychiatrists. Consultants from the Maryland CPAP, the Behavioral Health Integration in Pediatric Primary Care (BHIPP) project, have developed a framework called the Five S’s. The Five S’s are Safety, Specific Behaviors, Setting, Scary Things, and Screening/Services. It is a tool that can be used to help PCCs and child psychiatrists communicate and collaborate to formulate pediatric behavioral health cases for consultation or referral requests. Each of these components and its importance to the case consultation are described. Two case studies are presented that illustrate how the Five S’s tool can be used in clinical consultation between PCC and child psychiatrist. We also describe the utility of the tool beyond its use in behavioral health consultation. PMID:27919566

  4. Enhancing knowledge discovery from cancer genomics data with Galaxy.

    PubMed

    Albuquerque, Marco A; Grande, Bruno M; Ritch, Elie J; Pararajalingam, Prasath; Jessa, Selin; Krzywinski, Martin; Grewal, Jasleen K; Shah, Sohrab P; Boutros, Paul C; Morin, Ryan D

    2017-05-01

    The field of cancer genomics has demonstrated the power of massively parallel sequencing techniques to inform on the genes and specific alterations that drive tumor onset and progression. Although large comprehensive sequence data sets continue to be made increasingly available, data analysis remains an ongoing challenge, particularly for laboratories lacking dedicated resources and bioinformatics expertise. To address this, we have produced a collection of Galaxy tools that represent many popular algorithms for detecting somatic genetic alterations from cancer genome and exome data. We developed new methods for parallelization of these tools within Galaxy to accelerate runtime and have demonstrated their usability and summarized their runtimes on multiple cloud service providers. Some tools represent extensions or refinement of existing toolkits to yield visualizations suited to cohort-wide cancer genomic analysis. For example, we present Oncocircos and Oncoprintplus, which generate data-rich summaries of exome-derived somatic mutation. Workflows that integrate these to achieve data integration and visualizations are demonstrated on a cohort of 96 diffuse large B-cell lymphomas and enabled the discovery of multiple candidate lymphoma-related genes. Our toolkit is available from our GitHub repository as Galaxy tool and dependency definitions and has been deployed using virtualization on multiple platforms including Docker. © The Author 2017. Published by Oxford University Press.

  5. The BioExtract Server: a web-based bioinformatic workflow platform

    PubMed Central

    Lushbough, Carol M.; Jennewein, Douglas M.; Brendel, Volker P.

    2011-01-01

    The BioExtract Server (bioextract.org) is an open, web-based system designed to aid researchers in the analysis of genomic data by providing a platform for the creation of bioinformatic workflows. Scientific workflows are created within the system by recording tasks performed by the user. These tasks may include querying multiple, distributed data sources, saving query results as searchable data extracts, and executing local and web-accessible analytic tools. The series of recorded tasks can then be saved as a reproducible, sharable workflow available for subsequent execution with the original or modified inputs and parameter settings. Integrated data resources include interfaces to the National Center for Biotechnology Information (NCBI) nucleotide and protein databases, the European Molecular Biology Laboratory (EMBL-Bank) non-redundant nucleotide database, the Universal Protein Resource (UniProt), and the UniProt Reference Clusters (UniRef) database. The system offers access to numerous preinstalled, curated analytic tools and also provides researchers with the option of selecting computational tools from a large list of web services including the European Molecular Biology Open Software Suite (EMBOSS), BioMoby, and the Kyoto Encyclopedia of Genes and Genomes (KEGG). The system further allows users to integrate local command line tools residing on their own computers through a client-side Java applet. PMID:21546552
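
    The BioExtract Server's internals are not shown here; the following Python sketch only illustrates the record-then-replay idea the abstract describes, in which each user task is logged with its parameters and the saved series can be re-executed with modified inputs. The "tools" are placeholders.

      # Illustrative workflow recorder/replayer; not BioExtract code.
      import json

      class WorkflowRecorder:
          """Record (tool, parameters) steps as they run, then replay them later."""

          def __init__(self, tools):
              self.tools = tools   # mapping: tool name -> callable
              self.steps = []

          def run(self, tool_name, **params):
              result = self.tools[tool_name](**params)
              self.steps.append({"tool": tool_name, "params": params})
              return result

          def save(self, path):
              with open(path, "w") as fh:
                  json.dump(self.steps, fh, indent=2)

          def replay(self, path, overrides=None):
              """Re-execute a saved workflow, optionally overriding parameters per step."""
              overrides = overrides or {}
              with open(path) as fh:
                  steps = json.load(fh)
              results = []
              for i, step in enumerate(steps):
                  params = {**step["params"], **overrides.get(i, {})}
                  results.append(self.tools[step["tool"]](**params))
              return results

      tools = {"fetch_sequences": lambda query: f"records for {query}",
               "run_alignment": lambda data: f"alignment of {data}"}
      wf = WorkflowRecorder(tools)
      sequences = wf.run("fetch_sequences", query="kinase")
      wf.run("run_alignment", data=sequences)
      wf.save("workflow.json")
      wf.replay("workflow.json", overrides={0: {"query": "phosphatase"}})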

  6. iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations

    NASA Astrophysics Data System (ADS)

    Vanfretti, L.; Rabuzin, T.; Baudette, M.; Murad, M.

    The iTesla Power Systems Library (iPSL) is a Modelica package providing a set of power system components for phasor time-domain modeling and simulation. The Modelica language provides a systematic approach to develop models using a formal mathematical description that uniquely specifies the physical behavior of a component or the entire system. Furthermore, the standardized specification of the Modelica language (Modelica Association [1]) enables unambiguous model exchange by allowing any Modelica-compliant tool to utilize the models for simulation and analysis without the need for a specific model transformation tool. As the Modelica language is being developed with open specifications, any tool that implements these requirements can be utilized. This gives users the freedom to choose an Integrated Development Environment (IDE). Furthermore, any integration solver can be implemented within a Modelica tool to simulate Modelica models. Additionally, Modelica is an object-oriented language, enabling code factorization and model re-use to improve the readability of a library by structuring it with an object-oriented hierarchy. The developed library is released under an open source license to enable wider distribution and let users customize it to their specific needs. This paper describes the iPSL and provides illustrative application examples.

  7. ePlant: Visualizing and Exploring Multiple Levels of Data for Hypothesis Generation in Plant Biology[OPEN

    PubMed Central

    Waese, Jamie; Fan, Jim; Yu, Hans; Fucile, Geoffrey; Shi, Ruian; Cumming, Matthew; Town, Chris; Stuerzlinger, Wolfgang

    2017-01-01

    A big challenge in current systems biology research arises when different types of data must be accessed from separate sources and visualized using separate tools. The high cognitive load required to navigate such a workflow is detrimental to hypothesis generation. Accordingly, there is a need for a robust research platform that incorporates all data and provides integrated search, analysis, and visualization features through a single portal. Here, we present ePlant (http://bar.utoronto.ca/eplant), a visual analytic tool for exploring multiple levels of Arabidopsis thaliana data through a zoomable user interface. ePlant connects to several publicly available web services to download genome, proteome, interactome, transcriptome, and 3D molecular structure data for one or more genes or gene products of interest. Data are displayed with a set of visualization tools that are presented using a conceptual hierarchy from big to small, and many of the tools combine information from more than one data type. We describe the development of ePlant in this article and present several examples illustrating its integrative features for hypothesis generation. We also describe the process of deploying ePlant as an “app” on Araport. Building on readily available web services, the code for ePlant is freely available for any other biological species research. PMID:28808136

  8. Developments in the CCP4 molecular-graphics project.

    PubMed

    Potterton, Liz; McNicholas, Stuart; Krissinel, Eugene; Gruber, Jan; Cowtan, Kevin; Emsley, Paul; Murshudov, Garib N; Cohen, Serge; Perrakis, Anastassis; Noble, Martin

    2004-12-01

    Progress towards structure determination that is both high-throughput and high-value is dependent on the development of integrated and automatic tools for electron-density map interpretation and for the analysis of the resulting atomic models. Advances in map-interpretation algorithms are extending the resolution regime in which fully automatic tools can work reliably, but at present human intervention is required to interpret poor regions of macromolecular electron density, particularly where crystallographic data is only available to modest resolution [for example, I/sigma(I) < 2.0 for minimum resolution 2.5 A]. In such cases, a set of manual and semi-manual model-building molecular-graphics tools is needed. At the same time, converting the knowledge encapsulated in a molecular structure into understanding is dependent upon visualization tools, which must be able to communicate that understanding to others by means of both static and dynamic representations. CCP4mg is a program designed to meet these needs in a way that is closely integrated with the ongoing development of CCP4 as a program suite suitable for both low- and high-intervention computational structural biology. As well as providing a carefully designed user interface to advanced algorithms of model building and analysis, CCP4mg is intended to present a graphical toolkit to developers of novel algorithms in these fields.

  9. Integrated Decision Strategies for Skin Sensitization Hazard

    PubMed Central

    Strickland, Judy; Zang, Qingda; Kleinstreuer, Nicole; Paris, Michael; Lehmann, David M.; Choksi, Neepa; Matheson, Joanna; Jacobs, Abigail; Lowit, Anna; Allen, David; Casey, Warren

    2016-01-01

    One of the top priorities of ICCVAM is the identification and evaluation of non-animal alternatives for skin sensitization testing. Although skin sensitization is a complex process, the key biological events of the process have been well characterized in an adverse outcome pathway (AOP) proposed by OECD. Accordingly, ICCVAM is working to develop integrated decision strategies based on the AOP using in vitro, in chemico, and in silico information. Data were compiled for 120 substances tested in the murine local lymph node assay (LLNA), direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT), and KeratinoSens assay. Data for six physicochemical properties that may affect skin penetration were also collected, and skin sensitization read-across predictions were performed using OECD QSAR Toolbox. All data were combined into a variety of potential integrated decision strategies to predict LLNA outcomes using a training set of 94 substances and an external test set of 26 substances. Fifty-four models were built using multiple combinations of machine learning approaches and predictor variables. The seven models with the highest accuracy (89–96% for the test set and 96–99% for the training set) for predicting LLNA outcomes used a support vector machine (SVM) approach with different combinations of predictor variables. The performance statistics of the SVM models were higher than any of the non-animal tests alone and higher than simple test battery approaches using these methods. These data suggest that computational approaches are promising tools to effectively integrate data sources to identify potential skin sensitizers without animal testing. PMID:26851134
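
    The curated LLNA, DPRA, h-CLAT, and KeratinoSens data behind the paper's models are not reproduced here; the snippet below is only a schematic scikit-learn sketch of the general approach (a support vector machine over combined non-animal and physicochemical predictors), using randomly generated placeholder data and invented feature meanings.

      # Schematic only; data are random placeholders, not the paper's data set.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)

      # Columns could stand for, e.g., DPRA peptide depletion, h-CLAT CD86/CD54
      # induction, KeratinoSens response, log P, and a read-across score.
      X = rng.normal(size=(120, 6))
      y = rng.integers(0, 2, size=120)   # 1 = LLNA sensitizer, 0 = non-sensitizer

      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=26, random_state=0)

      model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
      model.fit(X_train, y_train)
      print("test accuracy:", model.score(X_test, y_test))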

  10. Open|SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    DOE PAGES

    Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...

    2008-01-01

    Over the last decades a large number of performance tools have been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open|SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open|SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open|SpeedShop are accessible through multiple fully equivalent interfaces, including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.

  11. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  12. Optimal visual-haptic integration with articulated tools.

    PubMed

    Takahashi, Chie; Watt, Simon J

    2017-05-01

    When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools. Such tools (tongs, pliers, etc) are a defining characteristic of human hand function, but complicate the classical sensory 'correspondence problem' underlying multisensory integration. Optimal integration requires establishing the relationship between signals acquired by different sensors (hand and eye) and, therefore, in fundamentally unrelated units. The system must also determine when signals refer to the same property of the world-seeing and feeling the same thing-and only integrate those that do. This could be achieved by comparing the pattern of current visual and haptic input to known statistics of their normal relationship. Articulated tools disrupt this relationship, however, by altering the geometrical relationship between object properties and hand posture (the haptic signal). We examined whether different tool configurations are taken into account in visual-haptic integration. We indexed integration by measuring the precision of size estimates, and compared our results to optimal predictions from a maximum-likelihood integrator. Integration was near optimal, independent of tool configuration/hand posture, provided that visual and haptic signals referred to the same object in the world. Thus, sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account. This reveals highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools' properties.
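
    For reference, the maximum-likelihood (minimum-variance) cue-combination rule behind such optimal predictions can be written as follows (standard formulation; the notation is ours, not taken from the paper):

      \hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad
      w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad w_H = 1 - w_V,

      \sigma_{VH}^2 = \frac{\sigma_V^2 \, \sigma_H^2}{\sigma_V^2 + \sigma_H^2} \le \min(\sigma_V^2, \sigma_H^2),

    where \hat{S}_V and \hat{S}_H are the visual and haptic size estimates and \sigma_V^2, \sigma_H^2 their variances; the predicted reduction in variance relative to the better single cue is the signature of near-optimal integration reported in the paper.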

  13. Conceptual dissonance: evaluating the efficacy of natural language processing techniques for validating translational knowledge constructs.

    PubMed

    Payne, Philip R O; Kwok, Alan; Dhaval, Rakesh; Borlawsky, Tara B

    2009-03-01

    The conduct of large-scale translational studies presents significant challenges related to the storage, management and analysis of integrative data sets. Ideally, the application of methodologies such as conceptual knowledge discovery in databases (CKDD) provides a means for moving beyond intuitive hypothesis discovery and testing in such data sets, and towards the high-throughput generation and evaluation of knowledge-anchored relationships between complex bio-molecular and phenotypic variables. However, the induction of such high-throughput hypotheses is non-trivial, and requires correspondingly high-throughput validation methodologies. In this manuscript, we describe an evaluation of the efficacy of a natural language processing-based approach to validating such hypotheses. As part of this evaluation, we will examine a phenomenon that we have labeled as "Conceptual Dissonance" in which conceptual knowledge derived from two or more sources of comparable scope and granularity cannot be readily integrated or compared using conventional methods and automated tools.

  14. Figures of Merit for Control Verification

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2008-01-01

    This paper proposes a methodology for evaluating a controller's ability to satisfy a set of closed-loop specifications when the plant has an arbitrary functional dependency on uncertain parameters. Control verification metrics applicable to deterministic and probabilistic uncertainty models are proposed. These metrics, which result from sizing the largest uncertainty set of a given class for which the specifications are satisfied, enable systematic assessment of competing control alternatives regardless of the methods used to derive them. A particularly attractive feature of the tools derived is that their efficiency and accuracy do not depend on the robustness of the controller. This is in sharp contrast to Monte Carlo based methods where the number of simulations required to accurately approximate the failure probability grows exponentially with its closeness to zero. This framework allows for the integration of complex, high-fidelity simulations of the integrated system and only requires standard optimization algorithms for its implementation.

  15. Investigating Uncertainty and Sensitivity in Integrated, Multimedia Environmental Models: Tools for FRAMES-3MRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babendreier, Justin E.; Castleton, Karl J.

    2005-08-01

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMU).
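    A minimal sketch of one screening-level technique of the kind such comparative uncertainty and sensitivity studies rely on: sample the uncertain inputs, run a module chain, and rank inputs by rank correlation with the output. The "module" and distributions below are invented placeholders, not 3MRA science modules:

```python
import numpy as np

# Hedged sketch of Monte Carlo sensitivity screening with a toy module chain.

rng = np.random.default_rng(0)
n = 1000
inputs = {
    "infiltration_rate": rng.lognormal(mean=0.0, sigma=0.5, size=n),
    "partition_coeff":   rng.uniform(0.1, 10.0, size=n),
    "decay_rate":        rng.uniform(0.001, 0.1, size=n),
}

def toy_module(x):
    # Placeholder fate-and-transport response, for illustration only.
    return x["infiltration_rate"] * x["partition_coeff"] / (1.0 + 50.0 * x["decay_rate"])

output = toy_module(inputs)

def rank(a):
    return np.argsort(np.argsort(a))

for name, values in inputs.items():
    rho = np.corrcoef(rank(values), rank(output))[0, 1]   # Spearman-type index
    print(f"{name:18s} rank correlation with output: {rho:+.2f}")
```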

  16. "Mobile Nurse" platform for ubiquitous medicine.

    PubMed

    Struzik, Z R; Yoshiuchi, K; Sone, M; Ishikawa, T; Kikuchi, H; Kumano, H; Watsuji, T; Natelson, B H; Yamamoto, Y

    2007-01-01

    We introduce "Mobile Nurse" (MN) - an emerging platform for the practice of ubiquitous medicine. By implementing in a dynamic setting of daily life the patient care traditionally provided by the clinical nurses on duty, MN aims at integral data collection and shortening the response time to the patient. MN is also capable of intelligent interaction with the patient and is able to learn from the patient's behavior and disease sign evaluation for improved personalized treatment. In this paper, we outline the most essential concepts around the hardware, software and methodological designs of MN. We provide an example of the implementation, and elaborate on the possible future impact on medical practice and biomedical science research. The main innovation of MN, setting it apart from current tele-medicine systems, is the ability to integrate the patient's signs and symptoms on site, providing medical professionals with powerful tools to elucidate disease mechanisms, to make proper diagnoses and to prescribe treatment.

  17. Use of the adult attachment projective picture system in psychodynamic psychotherapy with a severely traumatized patient

    PubMed Central

    George, Carol; Buchheim, Anna

    2014-01-01

    The following case study is presented to illustrate how the attachment information obtained from an Adult Attachment Projective Picture System (AAP) assessment can be integrated into a psychodynamic perspective when making therapeutic recommendations. The AAP is a valid representational measure of internal representations of attachment, based on the analysis of a set of free-response picture stimuli designed to systematically activate the attachment system (George and West, 2012). The AAP provides a fruitful diagnostic tool for psychodynamically oriented clinicians to identify attachment-based deficits and resources for an individual patient in therapy. This paper considers the use of the AAP with a traumatized patient in an inpatient setting and uses a case study to illustrate the components of the AAP that are particularly relevant to a psychodynamic conceptualization. The paper also discusses attachment-based recommendations for intervention. PMID:25140164

  18. [Life project of residents and institutional approach in nursing homes].

    PubMed

    Chanut, Corinne

    The life project in a nursing home involves all the players concerned: first of all, the resident, then the caregivers, the families and the institution. This unifying tool, organised around the elderly, helps to develop collective competencies, favours the integration of new residents and reassures families. This article presents a nursing home's experience of setting up a life project. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  19. Simulation Tools for Digital LSI (Large Scale Integration) Design.

    DTIC Science & Technology

    1983-09-01

    (The abstract in the scanned source is largely illegible; only a fragment is recoverable.) If the flag is set during an execution of the code, another iteration is performed; otherwise, the subroutine is finished.

  20. Integrating Ideas for International Data Collaborations Through The Committee on Earth Observation Satellites (CEOS) International Directory Network (IDN)

    NASA Technical Reports Server (NTRS)

    Olsen, Lola M.

    2006-01-01

    The capabilities of the International Directory Network's (IDN) version MD9.5, along with a new version of the metadata authoring tool, "docBUILDER", will be presented during the Technology and Services Subgroup session of the Working Group on Information Systems and Services (WGISS). Feedback provided through the international community has proven instrumental in positively influencing the direction of the IDN's development. The international community was instrumental in encouraging support for using the ISO international character set that is now available through the directory. Supporting metadata descriptions in additional languages encourages extended use of the IDN. Temporal and spatial attributes often prove pivotal in the search for data. Prior to the new software release, the IDN's geospatial and temporal searches suffered from browser incompatibilities and often resulted in unreliable performance for users attempting to initiate a spatial search using a map based on aging Java applet technology. The IDN now offers an integrated Google map and date search that replaces that technology. In addition, one of the most defining characteristics in the search for data relates to the temporal and spatial resolution of the data. It is now possible to refine the search for data sets meeting defined resolution requirements. Data set authors are encouraged to indicate the precise resolution values for their data sets and subsequently bin these into one of the pre-selected resolution ranges. New metadata authoring tools have been well received. In response to requests for a standalone metadata authoring tool, a new shareable software package called "docBUILDER solo" will soon be released to the public. This tool permits researchers to document their data during experiments and observational periods in the field. Interoperability has been enhanced through the use of the Open Archives Initiative's (OAI) Protocol for Metadata Harvesting (PMH). Harvesting of XML content through OAI-PMH has been successfully tested with several organizations. The protocol appears to be a prime candidate for sharing metadata throughout the international community. Data services for visualizing and analyzing data have become valuable assets in facilitating the use of data. Data providers are offering many of their data-related services through the directory. The IDN plans to develop a service-based architecture to further promote the use of web services. During the IDN Task Team session, ideas for further enhancements will be discussed.
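    OAI-PMH harvesting of the kind tested here follows a simple request/resumption-token loop. A minimal sketch is shown below; the endpoint URL and metadata prefix are placeholders, not the IDN's actual service:

```python
import urllib.request, urllib.parse
import xml.etree.ElementTree as ET

# Hedged sketch of harvesting metadata records over OAI-PMH.
BASE = "https://example.org/oai"          # hypothetical OAI-PMH endpoint

def harvest(base_url, metadata_prefix="oai_dc"):
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    while True:
        url = base_url + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            root = ET.fromstring(resp.read())
        # Yield every <record> element regardless of namespace prefix.
        for elem in root.iter():
            if elem.tag.endswith("record"):
                yield elem
        # Follow the resumption token until the repository reports no more pages.
        token = next((e.text for e in root.iter() if e.tag.endswith("resumptionToken")), None)
        if not token:
            break
        params = {"verb": "ListRecords", "resumptionToken": token}

# Usage: for record in harvest(BASE): process(record)
```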

  1. JNSViewer—A JavaScript-based Nucleotide Sequence Viewer for DNA/RNA secondary structures

    PubMed Central

    Dong, Min; Graham, Mitchell; Yadav, Nehul

    2017-01-01

    Many tools are available for visualizing RNA or DNA secondary structures, but there is scarce implementation in JavaScript that provides seamless integration with the increasingly popular web computational platforms. We have developed JNSViewer, a highly interactive web service, which is bundled with several popular tools for DNA/RNA secondary structure prediction and can provide precise and interactive correspondence among nucleotides, dot-bracket data, secondary structure graphs, and genic annotations. In JNSViewer, users can perform RNA secondary structure predictions with different programs and settings, add customized genic annotations in GFF format to structure graphs, search for specific linear motifs, and extract relevant structure graphs of sub-sequences. JNSViewer also allows users to choose a transcript or specific segment of Arabidopsis thaliana genome sequences and predict the corresponding secondary structure. Popular genome browsers (i.e., JBrowse and BrowserGenome) were integrated into JNSViewer to provide powerful visualizations of chromosomal locations, genic annotations, and secondary structures. In addition, we used StructureFold with default settings to predict some RNA structures for Arabidopsis by incorporating in vivo high-throughput RNA structure profiling data and stored the results in our web server, which might be a useful resource for RNA secondary structure studies in plants. JNSViewer is available at http://bioinfolab.miamioh.edu/jnsviewer/index.html. PMID:28582416
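    The precise correspondence among nucleotides, dot-bracket data, and structure graphs rests on simple bookkeeping: each '(' must be matched to its ')'. A minimal sketch of that step, with an illustrative toy hairpin:

```python
# Hedged sketch of the dot-bracket parsing a secondary-structure viewer performs.

def dotbracket_to_pairs(structure):
    """Return a list of (i, j) index pairs for a dot-bracket string."""
    stack, pairs = [], []
    for i, ch in enumerate(structure):
        if ch == "(":
            stack.append(i)
        elif ch == ")":
            if not stack:
                raise ValueError(f"unbalanced ')' at position {i}")
            pairs.append((stack.pop(), i))
    if stack:
        raise ValueError("unbalanced '(' remaining")
    return pairs

seq       = "GGGAAAUCC"          # toy sequence
structure = "(((...)))"          # toy hairpin in dot-bracket notation
for i, j in dotbracket_to_pairs(structure):
    print(f"{seq[i]}{i+1} pairs with {seq[j]}{j+1}")
```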

  2. Integrated Baseline System (IBS), Version 1.03 [Chemical Stockpile Emergency Preparedness Program]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, B.M.; Burford, M.J.; Downing, T.R.

    The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the user guide for the IBS and explains how to operate the IBS system. The fundamental function of the IBS is to provide tools that civilian emergency management personnel can use in developing emergency plans and in supporting emergency management activities to cope with a chemical-releasing event at a military chemical stockpile. Emergency management planners can evaluate concepts and ideas using the IBS system. The results of that experience can then be factored into refining requirements and plans. This document provides information for the general system user, and is the primary reference for the system features of the IBS. It is designed for persons who are familiar with general emergency management concepts, operations, and vocabulary. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary. IBS is a dynamic system. Its capabilities are in a state of continuing expansion and enhancement.

  3. Modelling the exposure to chemicals for risk assessment: a comprehensive library of multimedia and PBPK models for integration, prediction, uncertainty and sensitivity analysis - the MERLIN-Expo tool.

    PubMed

    Ciffroy, P; Alfonso, B; Altenpohl, A; Banjac, Z; Bierkens, J; Brochot, C; Critto, A; De Wilde, T; Fait, G; Fierens, T; Garratt, J; Giubilato, E; Grange, E; Johansson, E; Radomyski, A; Reschwann, K; Suciu, N; Tanaka, T; Tediosi, A; Van Holderbeke, M; Verdonck, F

    2016-10-15

    MERLIN-Expo is a library of models that was developed in the frame of the FP7 EU project 4FUN in order to provide an integrated assessment tool for state-of-the-art exposure assessment for environment, biota and humans, allowing the detection of scientific uncertainties at each step of the exposure process. This paper describes the main features of the MERLIN-Expo tool. The main challenges in exposure modelling that MERLIN-Expo has tackled are: (i) the integration of multimedia (MM) models simulating the fate of chemicals in environmental media, and of physiologically based pharmacokinetic (PBPK) models simulating the fate of chemicals in human body. MERLIN-Expo thus allows the determination of internal effective chemical concentrations; (ii) the incorporation of a set of functionalities for uncertainty/sensitivity analysis, from screening to variance-based approaches. The availability of such tools for uncertainty and sensitivity analysis aimed to facilitate the incorporation of such issues in future decision making; (iii) the integration of human and wildlife biota targets with common fate modelling in the environment. MERLIN-Expo is composed of a library of fate models dedicated to non biological receptor media (surface waters, soils, outdoor air), biological media of concern for humans (several cultivated crops, mammals, milk, fish), as well as wildlife biota (primary producers in rivers, invertebrates, fish) and humans. These models can be linked together to create flexible scenarios relevant for both human and wildlife biota exposure. Standardized documentation for each model and training material were prepared to support an accurate use of the tool by end-users. One of the objectives of the 4FUN project was also to increase the confidence in the applicability of the MERLIN-Expo tool through targeted realistic case studies. In particular, we aimed at demonstrating the feasibility of building complex realistic exposure scenarios and the accuracy of the modelling predictions through a comparison with actual measurements. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. The Stanford Data Miner: a novel approach for integrating and exploring heterogeneous immunological data.

    PubMed

    Siebert, Janet C; Munsil, Wes; Rosenberg-Hasson, Yael; Davis, Mark M; Maecker, Holden T

    2012-03-28

    Systems-level approaches are increasingly common in both murine and human translational studies. These approaches employ multiple high information content assays. As a result, there is a need for tools to integrate heterogeneous types of laboratory and clinical/demographic data, and to allow the exploration of that data by aggregating and/or segregating results based on particular variables (e.g., mean cytokine levels by age and gender). Here we describe the application of standard data warehousing tools to create a novel environment for user-driven upload, integration, and exploration of heterogeneous data. The system presented here currently supports flow cytometry and immunoassays performed in the Stanford Human Immune Monitoring Center, but could be applied more generally. Users upload assay results contained in platform-specific spreadsheets of a defined format, and clinical and demographic data in spreadsheets of flexible format. Users then map sample IDs to connect the assay results with the metadata. An OLAP (on-line analytical processing) data exploration interface allows filtering and display of various dimensions (e.g., Luminex analytes in rows, treatment group in columns, filtered on a particular study). Statistics such as mean, median, and N can be displayed. The views can be expanded or contracted to aggregate or segregate data at various levels. Individual-level data is accessible with a single click. The result is a user-driven system that permits data integration and exploration in a variety of settings. We show how the system can be used to find gender-specific differences in serum cytokine levels, and compare them across experiments and assay types. We have used the tools and techniques of data warehousing, including open-source business intelligence software, to support investigator-driven data integration and mining of diverse immunological data.
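    The aggregation and segregation described (for example, mean cytokine levels by age and gender, with analytes in rows and treatment group in columns) maps naturally onto a pivot-table operation. A minimal sketch with invented column names and values, not the Stanford system's actual schema or OLAP backend:

```python
import pandas as pd

# Hedged sketch of OLAP-style slicing of immunoassay results.
results = pd.DataFrame({
    "analyte":     ["IL-6", "IL-6", "TNFa", "TNFa", "IL-6", "TNFa"],
    "gender":      ["F",    "M",    "F",    "M",    "F",    "M"],
    "age_group":   ["<40",  "<40",  ">=40", ">=40", ">=40", "<40"],
    "value_pg_ml": [3.1,    2.4,    5.0,    6.2,    4.4,    5.1],
})

# Analytes in rows, gender in columns, mean and N per cell; expand or contract
# the index/columns lists to aggregate or segregate at different levels.
summary = pd.pivot_table(
    results,
    values="value_pg_ml",
    index=["analyte", "age_group"],
    columns="gender",
    aggfunc=["mean", "count"],
)
print(summary)
```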

  5. The Stanford Data Miner: a novel approach for integrating and exploring heterogeneous immunological data

    PubMed Central

    2012-01-01

    Background Systems-level approaches are increasingly common in both murine and human translational studies. These approaches employ multiple high information content assays. As a result, there is a need for tools to integrate heterogeneous types of laboratory and clinical/demographic data, and to allow the exploration of that data by aggregating and/or segregating results based on particular variables (e.g., mean cytokine levels by age and gender). Methods Here we describe the application of standard data warehousing tools to create a novel environment for user-driven upload, integration, and exploration of heterogeneous data. The system presented here currently supports flow cytometry and immunoassays performed in the Stanford Human Immune Monitoring Center, but could be applied more generally. Results Users upload assay results contained in platform-specific spreadsheets of a defined format, and clinical and demographic data in spreadsheets of flexible format. Users then map sample IDs to connect the assay results with the metadata. An OLAP (on-line analytical processing) data exploration interface allows filtering and display of various dimensions (e.g., Luminex analytes in rows, treatment group in columns, filtered on a particular study). Statistics such as mean, median, and N can be displayed. The views can be expanded or contracted to aggregate or segregate data at various levels. Individual-level data is accessible with a single click. The result is a user-driven system that permits data integration and exploration in a variety of settings. We show how the system can be used to find gender-specific differences in serum cytokine levels, and compare them across experiments and assay types. Conclusions We have used the tools and techniques of data warehousing, including open-source business intelligence software, to support investigator-driven data integration and mining of diverse immunological data. PMID:22452993

  6. A stochastic approach for quantifying immigrant integration: the Spanish test case

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Contucci, Pierluigi; Sandell, Richard; Vernia, Cecilia

    2014-10-01

    We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999-2010. As opposed to the classic time-series approach, by letting immigrant density play the role of 'time' and the quantifier the role of 'space', it becomes possible to analyse the behavior of the quantifiers by means of continuous-time random walks. Two classes of results are then obtained. First, we show that social integration quantifiers evolve following a diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best- and worst-case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. For instance, estimating the standard first-passage time and maximum-span walk reveals local differences in integration performance for different immigration scenarios. Thus, by recognizing the importance of local fluctuations around national means, this research constitutes an important tool to assess the impact of immigration phenomena on municipal budgets and to set up solid multi-ethnic plans at the municipal level as immigration pressures build.
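    Diffusive and ballistic behavior can be distinguished by the scaling exponent of a quantifier's spread against immigrant density (roughly 0.5 for diffusion, 1 for ballistic motion). A minimal sketch of that exponent fit on synthetic data, not the Spanish data set:

```python
import numpy as np

# Hedged sketch: regress log(spread) on log(density) and read off the exponent.
rng = np.random.default_rng(1)
density = np.linspace(0.01, 0.20, 50)                 # plays the role of 'time'
spread_social   = 0.8 * density**0.5 * (1 + 0.05 * rng.standard_normal(50))
spread_economic = 2.0 * density**1.0 * (1 + 0.05 * rng.standard_normal(50))

for label, spread in [("social", spread_social), ("economic", spread_economic)]:
    slope, _ = np.polyfit(np.log(density), np.log(spread), 1)
    print(f"{label:8s} scaling exponent ~ {slope:.2f}")
```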

  7. Designing highly flexible and usable cyberinfrastructures for convergence.

    PubMed

    Herr, Bruce W; Huang, Weixia; Penumarthy, Shashikant; Börner, Katy

    2006-12-01

    This article presents the results of a 7-year-long quest into the development of a "dream tool" for our research in information science and scientometrics and more recently, network science. The results are two cyberinfrastructures (CI): The Cyberinfrastructure for Information Visualization and the Network Workbench that enjoy a growing national and interdisciplinary user community. Both CIs use the cyberinfrastructure shell (CIShell) software specification, which defines interfaces between data sets and algorithms/services and provides a means to bundle them into powerful tools and (Web) services. In fact, CIShell might be our major contribution to progress in convergence. Just as Wikipedia is an "empty shell" that empowers lay persons to share text, a CIShell implementation is an "empty shell" that empowers user communities to plug-and-play, share, compare and combine data sets, algorithms, and compute resources across national and disciplinary boundaries. It is argued here that CIs will not only transform the way science is conducted but also will play a major role in the diffusion of expertise, data sets, algorithms, and technologies across multiple disciplines and business sectors leading to a more integrative science.

  8. Emerging tools for continuous nutrient monitoring networks: Sensors advancing science and water resources protection

    USGS Publications Warehouse

    Pellerin, Brian; Stauffer, Beth A; Young, Dwane A; Sullivan, Daniel J.; Bricker, Suzanne B.; Walbridge, Mark R; Clyde, Gerard A; Shaw, Denice M

    2016-01-01

    Sensors and enabling technologies are becoming increasingly important tools for water quality monitoring and associated water resource management decisions. In particular, nutrient sensors are of interest because of the well-known adverse effects of nutrient enrichment on coastal hypoxia, harmful algal blooms, and impacts to human health. Accurate and timely information on nutrient concentrations and loads is integral to strategies designed to minimize risk to humans and manage the underlying drivers of water quality impairment. Using nitrate sensors as an example, we highlight the types of applications in freshwater and coastal environments that are likely to benefit from continuous, real-time nutrient data. The concurrent emergence of new tools to integrate, manage and share large data sets is critical to the successful use of nutrient sensors and has made it possible for the field of continuous nutrient monitoring to rapidly move forward. We highlight several near-term opportunities for Federal agencies, as well as the broader scientific and management community, that will help accelerate sensor development, build and leverage sites within a national network, and develop open data standards and data management protocols that are key to realizing the benefits of a large-scale, integrated monitoring network. Investing in these opportunities will provide new information to guide management and policies designed to protect and restore our nation’s water resources.

  9. Integrated System Health Management: Foundational Concepts, Approach, and Implementation

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando

    2009-01-01

    A sound basis to guide the community in the conception and implementation of ISHM (Integrated System Health Management) capability in operational systems was provided. The concept of "ISHM Model of a System" and a related architecture defined as a unique Data, Information, and Knowledge (DIaK) architecture were described. The ISHM architecture is independent of the typical system architecture, which is based on grouping physical elements that are assembled to make up a subsystem, and subsystems combine to form systems, etc. It was emphasized that ISHM capability needs to be implemented first at a low functional capability level (FCL), or limited ability to detect anomalies, diagnose, determine consequences, etc. As algorithms and tools to augment or improve the FCL are identified, they should be incorporated into the system. This means that the architecture, DIaK management, and software must be modular and standards-based, in order to enable systematic augmentation of FCL (no ad-hoc modifications). A set of technologies (and tools) needed to implement ISHM were described. One essential tool is a software environment to create the ISHM Model. The software environment encapsulates DIaK, and an infrastructure to focus DIaK on determining health (detect anomalies, determine causes, determine effects, and provide integrated awareness of the system to the operator). The environment includes gateways to communicate in accordance with standards, especially the IEEE 1451.1 Standard for Smart Sensors and Actuators.

  10. SNPranker 2.0: a gene-centric data mining tool for diseases associated SNP prioritization in GWAS.

    PubMed

    Merelli, Ivan; Calabria, Andrea; Cozzi, Paolo; Viti, Federica; Mosca, Ettore; Milanesi, Luciano

    2013-01-01

    The capability of correlating specific genotypes with human diseases is a complex issue in spite of all the advantages arising from high-throughput technologies, such as Genome Wide Association Studies (GWAS). New tools for genetic variant interpretation and for Single Nucleotide Polymorphism (SNP) prioritization are still needed. Given a list of the most relevant SNPs statistically associated with a specific pathology as a result of a genotype study, a critical issue is the identification of genes that are effectively related to the disease by re-scoring the importance of the identified genetic variations. Vice versa, given a list of genes, it can be of great importance to predict which SNPs can be involved in the onset of a particular disease, in order to focus the research on their effects. We propose a new bioinformatics approach to support biological data mining in the analysis and interpretation of SNPs associated with pathologies. This system can be employed to design custom genotyping chips for disease-oriented studies and to re-score GWAS results. The proposed method relies (1) on the data integration of public resources using a gene-centric database design, (2) on the evaluation of a set of static biomolecular annotations, defined as features, and (3) on the SNP scoring function, which computes SNP scores using parameters and weights set by users. We employed a machine learning classifier to set default feature weights and an ontological annotation layer to enable the enrichment of the input gene set. We implemented our method as a web tool called SNPranker 2.0 (http://www.itb.cnr.it/snpranker), improving our first published release of this system. A user-friendly interface allows the input of a list of genes, SNPs or a biological process, and the customization of the feature set with relative weights. As a result, SNPranker 2.0 returns a list of SNPs, localized within input and ontologically enriched genes, combined with their prioritization scores. Different databases and resources are already available for SNP annotation, but they do not prioritize or re-score SNPs relying on a priori biomolecular knowledge. SNPranker 2.0 attempts to fill this gap through a user-friendly integrated web resource. End users, such as researchers in medical genetics and epidemiology, may find in SNPranker 2.0 a new tool for data mining and interpretation able to support SNP analysis. Possible scenarios are GWAS data re-scoring, SNP selection for custom genotyping arrays and SNP/disease association studies.
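    The scoring function described, a user-weighted combination of annotation-derived features, can be sketched in a few lines. Feature names, weights, and SNP identifiers below are invented examples, not SNPranker's actual feature set or classifier-derived defaults:

```python
# Hedged sketch of weighted SNP scoring with user-adjustable feature weights.

DEFAULT_WEIGHTS = {          # illustrative defaults; the real tool learns these
    "in_coding_region":   0.35,
    "conservation":       0.25,
    "regulatory_overlap": 0.20,
    "ld_with_gwas_hit":   0.20,
}

def score_snp(features, weights=DEFAULT_WEIGHTS):
    """Weighted sum of per-feature values, each assumed normalized to [0, 1]."""
    return sum(weights[name] * features.get(name, 0.0) for name in weights)

snps = {
    "rs0000001": {"in_coding_region": 1.0, "conservation": 0.7, "ld_with_gwas_hit": 0.9},
    "rs0000002": {"regulatory_overlap": 0.8, "conservation": 0.4},
}

ranked = sorted(snps, key=lambda s: score_snp(snps[s]), reverse=True)
for rsid in ranked:
    print(rsid, round(score_snp(snps[rsid]), 3))
```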

  11. Recommendation Systems for Geoscience Data Portals Built by Analyzing Usage Patterns

    NASA Astrophysics Data System (ADS)

    Crosby, C.; Nandigam, V.; Baru, C.

    2009-04-01

    Since its launch five years ago, the National Science Foundation-funded GEON Project (www.geongrid.org) has been providing access to a variety of geoscience data sets such as geologic maps and other geographic information system (GIS)-oriented data, paleontologic databases, gravity and magnetics data and LiDAR topography via its online portal interface. In addition to data, the GEON Portal also provides web-based tools and other resources that enable users to process and interact with data. Examples of these tools include functions to dynamically map and integrate GIS data, compute synthetic seismograms, and to produce custom digital elevation models (DEMs) with user-defined parameters such as resolution. The GEON portal, built on the Gridsphere portal framework, allows us to capture user interaction with the system. In addition to the site access statistics captured by tools like Google Analytics, which capture hits per unit time, search key words, operating systems, browsers, and referring sites, we also record additional statistics such as which data sets are being downloaded and in what formats, processing parameters, and navigation pathways through the portal. With over four years of data now available from the GEON Portal, this record of usage is a rich resource for exploring how earth scientists discover and utilize online data sets. Furthermore, we propose that this data could ultimately be harnessed to optimize the way users interact with the data portal, design intelligent processing and data management systems, and to make recommendations on algorithm settings and other available relevant data. The paradigm of integrating popular and commonly used patterns to make recommendations to a user is well established in the world of e-commerce, where users receive suggestions on books, music and other products that they may find interesting based on their website browsing and purchasing history, as well as the patterns of fellow users who have made similar selections. However, this paradigm has not yet been explored for geoscience data portals. In this presentation we will present an initial analysis of user interaction and access statistics for the GEON OpenTopography LiDAR data distribution and processing system to illustrate what they reveal about users' spatial and temporal data access patterns, data processing parameter selections, and pathways through the data portal. We also demonstrate what these usage statistics can illustrate about aspects of the data sets that are of greatest interest. Finally, we explore how these usage statistics could be used to improve the user's experience in the data portal and to optimize how data access interfaces and tools are designed and implemented.
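    The e-commerce-style recommendation idea floated here reduces, in its simplest form, to counting which data sets are accessed together in the same portal session and suggesting frequent companions. A minimal sketch with invented session contents:

```python
from collections import Counter, defaultdict
from itertools import combinations

# Hedged sketch of co-access counting for "users who downloaded X also accessed Y".
sessions = [
    {"lidar_dem", "geologic_map", "gravity_grid"},
    {"lidar_dem", "geologic_map"},
    {"gravity_grid", "magnetics_grid"},
    {"lidar_dem", "magnetics_grid", "geologic_map"},
]

co_counts = defaultdict(Counter)
for session in sessions:
    for a, b in combinations(sorted(session), 2):
        co_counts[a][b] += 1
        co_counts[b][a] += 1

def recommend(item, k=2):
    """Top-k data sets most often accessed alongside `item`."""
    return [name for name, _ in co_counts[item].most_common(k)]

print("users who accessed lidar_dem also accessed:", recommend("lidar_dem"))
```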

  12. Merging and Visualization of Archived Oceanographic Acoustic, Optical, and Sensor Data to Support Improved Access and Interpretation

    NASA Astrophysics Data System (ADS)

    Malik, M. A.; Cantwell, K. L.; Reser, B.; Gray, L. M.

    2016-02-01

    Marine researchers and managers routinely rely on interdisciplinary data sets collected using hull-mounted sonars, towed sensors, or submersible vehicles. These data sets can be broadly categorized into acoustic remote sensing, imagery-based observations, water property measurements, and physical samples. The resulting raw data sets are overwhelmingly large and complex, and often require specialized software and training to process. To address these challenges, NOAA's Office of Ocean Exploration and Research (OER) is developing tools to improve the discoverability of raw data sets and integration of quality-controlled processed data in order to facilitate re-use of archived oceanographic data. The majority of recently collected OER raw oceanographic data can be retrieved from national data archives (e.g., NCEI and the NOAA Central Library). Merging of dispersed data sets by scientists with diverse expertise, however, remains problematic. Initial efforts at OER have focused on merging geospatial acoustic remote sensing data with imagery and water property measurements that typically lack direct geo-referencing. OER has developed 'smart' ship and submersible tracks that can provide a synopsis of geospatial coverage of various data sets. Tools under development enable scientists to quickly assess the relevance of archived OER data to their respective research or management interests, and enable quick access to the desired raw and processed data sets. Pre-processing of the data and visualization to combine various data sets also offers benefits to streamline data quality assurance and quality control efforts.

  13. Cost-effective management alternatives for Snake River Chinook salmon: a biological-economic synthesis.

    PubMed

    Halsing, David L; Moore, Michael R

    2008-04-01

    The mandate to increase endangered salmon populations in the Columbia River Basin of North America has created a complex, controversial resource-management issue. We constructed an integrated assessment model as a tool for analyzing biological-economic trade-offs in recovery of Snake River spring- and summer-run chinook salmon (Oncorhynchus tshawytscha). We merged 3 frameworks: a salmon-passage model to predict migration and survival of smolts; an age-structured matrix model to predict long-term population growth rates of salmon stocks; and a cost-effectiveness analysis to determine a set of least-cost management alternatives for achieving particular population growth rates. We assessed 6 individual salmon-management measures and 76 management alternatives composed of one or more measures. To reflect uncertainty, results were derived for different assumptions of effectiveness of smolt transport around dams. Removal of an estuarine predator, the Caspian Tern (Sterna caspia), was cost-effective and generally increased long-term population growth rates regardless of transport effectiveness. Elimination of adult salmon harvest had a similar effect over a range of its cost estimates. The specific management alternatives in the cost-effective set depended on assumptions about transport effectiveness. On the basis of recent estimates of smolt transport effectiveness, alternatives that discontinued transportation or breached dams were prevalent in the cost-effective set, whereas alternatives that maximized transportation dominated if transport effectiveness was relatively high. More generally, the analysis eliminated 80-90% of management alternatives from the cost-effective set. Application of our results to salmon management is limited by data availability and model assumptions, but these limitations can help guide research that addresses critical uncertainties and information. Our results thus demonstrate that linking biology and economics through integrated models can provide valuable tools for science-based policy and management.
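    The cost-effectiveness screen described amounts to discarding any alternative that is dominated, that is, any alternative for which some other option achieves at least the same population growth rate at no greater cost. A minimal sketch with invented placeholder alternatives, not the study's 76 management alternatives or its estimates:

```python
# Hedged sketch of a cost-effectiveness (non-dominated) filter.

alternatives = [
    {"name": "A (tern removal)",        "cost_millions": 5.0,  "growth_rate": 1.01},
    {"name": "B (harvest elimination)", "cost_millions": 12.0, "growth_rate": 1.02},
    {"name": "C (max transportation)",  "cost_millions": 20.0, "growth_rate": 1.01},
    {"name": "D (dam breaching)",       "cost_millions": 90.0, "growth_rate": 1.05},
]

def cost_effective_set(alts):
    keep = []
    for a in alts:
        dominated = any(
            b is not a
            and b["cost_millions"] <= a["cost_millions"]
            and b["growth_rate"] >= a["growth_rate"]
            for b in alts
        )
        if not dominated:
            keep.append(a["name"])
    return keep

print(cost_effective_set(alternatives))   # C is dominated by A in this toy example
```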

  14. Cost-effective management alternatives for Snake river chinook salmon: A biological-economic synthesis

    USGS Publications Warehouse

    Halsing, D.L.; Moore, M.R.

    2008-01-01

    The mandate to increase endangered salmon populations in the Columbia River Basin of North America has created a complex, controversial resource-management issue. We constructed an integrated assessment model as a tool for analyzing biological-economic trade-offs in recovery of Snake River spring- and summer-run chinook salmon (Oncorhynchus tshawytscha). We merged 3 frameworks: a salmon-passage model to predict migration and survival of smolts; an age-structured matrix model to predict long-term population growth rates of salmon stocks; and a cost-effectiveness analysis to determine a set of least-cost management alternatives for achieving particular population growth rates. We assessed 6 individual salmon-management measures and 76 management alternatives composed of one or more measures. To reflect uncertainty, results were derived for different assumptions of effectiveness of smolt transport around dams. Removal of an estuarine predator, the Caspian Tern (Sterna caspia), was cost-effective and generally increased long-term population growth rates regardless of transport effectiveness. Elimination of adult salmon harvest had a similar effect over a range of its cost estimates. The specific management alternatives in the cost-effective set depended on assumptions about transport effectiveness. On the basis of recent estimates of smolt transport effectiveness, alternatives that discontinued transportation or breached dams were prevalent in the cost-effective set, whereas alternatives that maximized transportation dominated if transport effectiveness was relatively high. More generally, the analysis eliminated 80-90% of management alternatives from the cost-effective set. Application of our results to salmon management is limited by data availability and model assumptions, but these limitations can help guide research that addresses critical uncertainties and information. Our results thus demonstrate that linking biology and economics through integrated models can provide valuable tools for science-based policy and management.

  15. Gaussian process regression for tool wear prediction

    NASA Astrophysics Data System (ADS)

    Kong, Dongdong; Chen, Yongjie; Li, Ning

    2018-05-01

    To accelerate the pace of intelligent manufacturing, this paper presents a novel tool wear assessment technique based on integrated radial basis function kernel principal component analysis (KPCA_IRBF) and Gaussian process regression (GPR) for accurately monitoring in-process tool wear parameters (flank wear width) in real time. KPCA_IRBF is a new nonlinear dimension-increment technique, proposed here for the first time for feature fusion. Both the tool wear prediction and the corresponding confidence interval are provided by the GPR model. Moreover, GPR outperforms artificial neural networks (ANN) and support vector machines (SVM) in prediction accuracy, since Gaussian noise can be modeled quantitatively in the GPR model. However, the presence of noise seriously affects the stability of the confidence interval. In this work, the proposed KPCA_IRBF technique helps to remove the noise and weaken its negative effects, compressing and smoothing the confidence interval, which is conducive to accurate tool wear monitoring. Moreover, the kernel parameter in KPCA_IRBF can be selected from a much larger region than with the conventional KPCA_RBF technique, which improves the efficiency of model construction. Ten sets of cutting tests were conducted to validate the effectiveness of the presented tool wear assessment technique. The experimental results show that the in-process flank wear width of tool inserts can be monitored accurately using the presented technique, which is robust under a variety of cutting conditions. This study lays the foundation for tool wear monitoring in real industrial settings.
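    GPR's appeal here is that it returns both a prediction and a confidence band. A minimal numpy sketch of GPR with an RBF kernel on synthetic wear-like data is shown below; it stands in for the general method only and does not reproduce the paper's KPCA_IRBF feature-fusion pipeline:

```python
import numpy as np

# Hedged sketch of Gaussian process regression with an RBF kernel.
def rbf(a, b, length=1.0, variance=1.0):
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length**2)

rng = np.random.default_rng(2)
x_train = np.linspace(0.0, 10.0, 25)                       # e.g., cutting time
y_train = 0.03 * x_train + 0.02 * np.sin(x_train) + 0.01 * rng.standard_normal(25)

noise = 1e-4
K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))

x_test = np.linspace(0.0, 10.0, 5)
K_s = rbf(x_test, x_train)
mean = K_s @ alpha                                          # posterior mean
v = np.linalg.solve(L, K_s.T)
var = np.diag(rbf(x_test, x_test)) - np.sum(v**2, axis=0)   # posterior variance
for x, m, s in zip(x_test, mean, np.sqrt(np.maximum(var, 0.0))):
    print(f"x={x:5.2f}  predicted wear={m:.3f}  95% CI=±{1.96 * s:.3f}")
```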

  16. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited for Space Physics post processing. Building on the work from the Center For Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The tool-kit plugin currently provides tools for reading LFM and Enlil data sets, and provides automated tools for data comparison with NASA's CDAweb database. As work progresses, many additional tools will be added and through open-source collaboration, we hope to add readers for additional model types, as well as any additional tools deemed necessary by the scientific public. The ultimate end goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  17. Use of concurrent mixed methods combining concept mapping and focus groups to adapt a health equity tool in Canada.

    PubMed

    Guichard, Anne; Tardieu, Émilie; Dagenais, Christian; Nour, Kareen; Lafontaine, Ginette; Ridde, Valéry

    2017-04-01

    The aim of this project was to identify and prioritize a set of conditions to be considered for incorporating a health equity tool into public health practice. Concept mapping and focus groups were implemented as complementary methods to investigate the conditions of use of a health equity tool by public health organizations in Quebec. Using a hybrid integrated research design is a richer way to address the complexity of questions emerging from intervention and planning settings. This approach provides a deeper, operational, and contextualized understanding of research results involving different professional and organizational cultures, and thereby supports the decision-making process. Concept mapping served to identify and prioritize, within a limited timeframe, the conditions to be considered for incorporating a health equity tool into public health practices. Focus groups then provided a more refined understanding of the barriers, issues, and facilitating factors surrounding the tool's adoption, helped distinguish among participants' perspectives based on functional roles and organizational contexts, and clarified some apparently contradictory results from the concept map. The combined use of these two techniques brought the strengths of each approach to bear, thereby overcoming some of the respective limitations of concept mapping and focus groups. This design is appropriate for investigating targets with multiple levels of complexity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Integrating cultures: a tool for mission leaders and others in collaborating organizations.

    PubMed

    Bradel, W T; Gillis, V; Harkness, J; McGuire, T P; Nehring, T

    1999-01-01

    This resource, Integrating Cultures, is a direct response to numerous requests received last fall from mission leaders in CHA-member organizations struggling with the cultural realities of strategic alliances. This tool presents the learnings of five authors who shared their significant experience of collaborative activities in ministry organizations, ranging from joint operating agreements to full mergers of assets and expenses. This resource specifically addresses the challenges facing organizations in the first 18 to 24 months following the finalization of a collaboration. Strategies are presented here for bringing together previously distinct communities of people into positive, healthy new cultures that reflect the visions and purposes of the collaborative activities. Future articles will recommend culture integration strategies appropriate at other points along the collaboration timeline: the period of initial investigation, the stage of due diligence, and the ongoing life of collaborating entities two years and more after signing the final papers. Integrating Cultures and a CHA resource on collaboration with other-than-Catholic organizations (set for publication later this spring) were developed in response to members' requests for the accurate information they need as they proceed with integration strategies in today's healthcare environment. These resources are examples of the powerful knowledge transfer and wisdom sharing that is possible when ministry leaders work with and for one another to make Christ's healing presence more evident in our world.

  19. ITEP: an integrated toolkit for exploration of microbial pan-genomes.

    PubMed

    Benedict, Matthew N; Henriksen, James R; Metcalf, William W; Whitaker, Rachel J; Price, Nathan D

    2014-01-03

    Comparative genomics is a powerful approach for studying variation in physiological traits as well as the evolution and ecology of microorganisms. Recent technological advances have enabled sequencing large numbers of related genomes in a single project, requiring computational tools for their integrated analysis. In particular, accurate annotations and identification of gene presence and absence are critical for understanding and modeling the cellular physiology of newly sequenced genomes. Although many tools are available to compare the gene contents of related genomes, new tools are necessary to enable close examination and curation of protein families from large numbers of closely related organisms, to integrate curation with the analysis of gain and loss, and to generate metabolic networks linking the annotations to observed phenotypes. We have developed ITEP, an Integrated Toolkit for Exploration of microbial Pan-genomes, to curate protein families, compute similarities to externally defined domains, analyze gene gain and loss, and generate draft metabolic networks from one or more curated reference network reconstructions in groups of related microbial species among which the combination of core and variable genes constitutes their "pan-genomes". The ITEP toolkit consists of: (1) a series of modular command-line scripts for identification, comparison, curation, and analysis of protein families and their distribution across many genomes; (2) a set of Python libraries for programmatic access to the same data; and (3) pre-packaged scripts to perform common analysis workflows on a collection of genomes. ITEP's capabilities include de novo protein family prediction, ortholog detection, analysis of functional domains, identification of core and variable genes and gene regions, sequence alignments and tree generation, annotation curation, and the integration of cross-genome analysis and metabolic networks for study of metabolic network evolution. ITEP is a powerful, flexible toolkit for generation and curation of protein families. ITEP's modular design allows for straightforward extension as analysis methods and tools evolve. By integrating comparative genomics with the development of draft metabolic networks, ITEP harnesses the power of comparative genomics to build confidence in links between genotype and phenotype and helps disambiguate gene annotations when they are evaluated in both evolutionary and metabolic network contexts.
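    The core/variable split at the heart of pan-genome analysis is a simple set operation once genes have been grouped into protein families. A minimal sketch with invented family IDs and genome names, not ITEP's actual command-line workflow:

```python
# Hedged sketch of core vs. variable (accessory) gene family identification.

genomes = {
    "strain_A": {"fam001", "fam002", "fam003", "fam010"},
    "strain_B": {"fam001", "fam002", "fam004"},
    "strain_C": {"fam001", "fam002", "fam003", "fam005"},
}

pan_genome  = set().union(*genomes.values())          # every family seen anywhere
core_genome = set.intersection(*genomes.values())     # families present in all strains
variable    = pan_genome - core_genome                # accessory families

print(f"pan-genome families: {len(pan_genome)}")
print(f"core families:       {sorted(core_genome)}")
print(f"variable families:   {sorted(variable)}")

# Presence/absence matrix of the kind fed to gain/loss analysis.
for fam in sorted(pan_genome):
    row = "".join("1" if fam in members else "0" for members in genomes.values())
    print(fam, row)
```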

  20. QuickMap: a public tool for large-scale gene therapy vector insertion site mapping and analysis.

    PubMed

    Appelt, J-U; Giordano, F A; Ecker, M; Roeder, I; Grund, N; Hotz-Wagenblatt, A; Opelz, G; Zeller, W J; Allgayer, H; Fruehauf, S; Laufs, S

    2009-07-01

    Several events of insertional mutagenesis in pre-clinical and clinical gene therapy studies have created intense interest in assessing the genomic insertion profiles of gene therapy vectors. For the construction of such profiles, vector-flanking sequences detected by inverse PCR, linear amplification-mediated-PCR or ligation-mediated-PCR need to be mapped to the host cell's genome and compared to a reference set. Although remarkable progress has been achieved in mapping gene therapy vector insertion sites, public reference sets are lacking, as are the possibilities to quickly detect non-random patterns in experimental data. We developed a tool termed QuickMap, which uniformly maps and analyzes human and murine vector-flanking sequences within seconds (available at www.gtsg.org). Besides information about hits in chromosomes and fragile sites, QuickMap automatically determines insertion frequencies in +/- 250 kb adjacency to genes, cancer genes, pseudogenes, transcription factor and (post-transcriptional) miRNA binding sites, CpG islands and repetitive elements (short interspersed nuclear elements (SINE), long interspersed nuclear elements (LINE), Type II elements and LTR elements). Additionally, all experimental frequencies are compared with the data obtained from a reference set, containing 1 000 000 random integrations ('random set'). Thus, for the first time a tool allowing high-throughput profiling of gene therapy vector insertion sites is available. It provides a basis for large-scale insertion site analyses, which is now urgently needed to discover novel gene therapy vectors with 'safe' insertion profiles.
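    The frequency comparison described, experimental insertion sites versus a random reference set, reduces to asking what fraction of sites falls within a fixed window (here ±250 kb) of annotated features. A minimal sketch on simulated coordinates, not a real genome annotation or QuickMap's database:

```python
import bisect
import random

# Hedged sketch of insertion-site proximity scoring against a 'random set'.
random.seed(0)
CHROM_LEN = 50_000_000
WINDOW = 250_000
gene_starts = sorted(random.sample(range(CHROM_LEN), 500))

def near_gene(pos, starts, window=WINDOW):
    """True if any gene start lies within `window` bp of `pos` (binary search)."""
    i = bisect.bisect_left(starts, pos - window)
    return i < len(starts) and starts[i] <= pos + window

experimental = [random.choice(gene_starts) + random.randint(-100_000, 100_000)
                for _ in range(2000)]                           # biased toward genes
reference = [random.randrange(CHROM_LEN) for _ in range(2000)]  # 'random set'

for label, sites in [("experimental", experimental), ("random reference", reference)]:
    frac = sum(near_gene(p, gene_starts) for p in sites) / len(sites)
    print(f"{label:17s}: {frac:.1%} of insertions within ±250 kb of a gene start")
```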

  1. The GenABEL Project for statistical genomics

    PubMed Central

    Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381

  2. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedently large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  3. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedently large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  4. Layout-aware simulation of soft errors in sub-100 nm integrated circuits

    NASA Astrophysics Data System (ADS)

    Balbekov, A.; Gorbunov, M.; Bobkov, S.

    2016-12-01

    A Single Event Transient (SET) caused by a charged particle traveling through the sensitive volume of an integrated circuit (IC) may lead to errors in digital circuits. In technologies below 180 nm, a single particle can affect multiple devices, causing multiple SETs. This adds complexity to fault-tolerant design, because schematic-level design techniques become useless unless the layout is taken into account. The most common layout mitigation technique is spatial separation of the sensitive nodes of hardened circuits. Spatial separation decreases circuit performance and increases power consumption. Spacing should thus be reasonable, and its scaling follows the device dimensions' scaling trend. This paper presents the development of a SET simulation approach comprising SPICE simulation with a "double exponential" current source as the SET model. The technique uses layout in GDSII format to locate nearby devices that can be affected by a single particle and that can share the generated charge. The developed software tool automates multiple simulations and gathers the produced data to present it as a sensitivity map. Examples of simulations of fault-tolerant cells and their sensitivity maps are presented in this paper.
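    The double-exponential current source is a widely used SET stimulus: a pulse with a fast rise and a slower fall whose time integral equals the collected charge. A minimal sketch of that waveform with illustrative parameter values, not the paper's calibrated model:

```python
import numpy as np

# Hedged sketch of a double-exponential SET current pulse.
def set_current(t, q_coll=50e-15, tau_fall=200e-12, tau_rise=50e-12):
    """Injected current (A) at time t (s) for collected charge q_coll (C)."""
    return (q_coll / (tau_fall - tau_rise)) * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise))

t = np.linspace(0.0, 2e-9, 2001)
i = set_current(t)
collected = np.trapz(i, t)                    # integral should recover ~q_coll
print(f"peak current : {i.max() * 1e6:.1f} uA")
print(f"charge check : {collected * 1e15:.1f} fC (target 50 fC)")
```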

  5. Pan-European management of coastal lagoons: A science-policy-stakeholder interface perspective

    NASA Astrophysics Data System (ADS)

    Lillebø, Ana I.; Stålnacke, Per; Gooch, Geoffrey D.; Krysanova, Valentina; Bielecka, Małgorzata

    2017-11-01

    The main objective of the work carried out in the scope of a three-year collaborative research project was to develop science-based strategies and a decision support framework for the integrated management of coastal lagoons and their catchments and, in this context, to enhance connectivity between research and policymaking. In this paper our main objective is to share the lessons learned from the innovative methodology used throughout the project. To achieve the proposed objectives, the multidisciplinary scientific knowledge in the project team was combined and integrated with the knowledge and views of local stakeholders of four selected European coastal lagoons, using a three-step participatory approach. With this innovative approach, which included the use of eco-hydrological and water-quality modelling tools, the team developed and analyzed integrated scenarios of possible economic development and environmental impacts in four European lagoons and their catchments. These scenarios were presented and discussed with stakeholders, giving rise to management recommendations for each case study lagoon. Results show that some management options might be transferable to other European lagoons having similar climatic, geophysical and socio-economic settings. In management terms, the project output provides a set of policy guidelines derived from the different analyses conducted and proposes initiatives concerning management implementation in a local-regional-national-European setting.

  6. Integrating multisource land use and land cover data

    USGS Publications Warehouse

    Wright, Bruce E.; Tait, Mike; Lins, K.F.; Crawford, J.S.; Benjamin, S.P.; Brown, Jesslyn F.

    1995-01-01

    As part of the U.S. Geological Survey's (USGS) land use and land cover (LULC) program, the USGS in cooperation with the Environmental Systems Research Institute (ESRI) is collecting and integrating LULC data for a standard USGS 1:100,000-scale product. The LULC data collection techniques include interpreting spectrally clustered Landsat Thematic Mapper (TM) images; interpreting 1-meter resolution digital panchromatic orthophoto images; and, for comparison, aggregating locally available large-scale digital data of urban areas. The area selected is the Vancouver, WA-OR quadrangle, which has a mix of urban, rural agriculture, and forest land. Anticipated products include an integrated LULC prototype data set in a standard classification scheme referenced to the USGS digital line graph (DLG) data of the area and prototype software to develop digital LULC data sets. This project will evaluate a draft standard LULC classification system developed by the USGS for use with various source material and collection techniques. Federal, State, and local governments, and private sector groups will have an opportunity to evaluate the resulting prototype software and data sets and to provide recommendations. It is anticipated that this joint research endeavor will increase future collaboration among interested organizations, public and private, for LULC data collection using common standards and tools.

  7. Visual analytics in cheminformatics: user-supervised descriptor selection for QSAR methods.

    PubMed

    Martínez, María Jimena; Ponzoni, Ignacio; Díaz, Mónica F; Vazquez, Gustavo E; Soto, Axel J

    2015-01-01

    The design of QSAR/QSPR models is a challenging problem, where the selection of the most relevant descriptors constitutes a key step of the process. Several feature selection methods that address this step concentrate on statistical associations among descriptors and target properties, whereas chemical knowledge is left out of the analysis. For this reason, the interpretability and generality of the QSAR/QSPR models obtained by these feature selection methods are drastically affected. Therefore, an approach for integrating domain experts' knowledge into the selection process is needed to increase confidence in the final set of descriptors. In this paper we propose a software tool, named Visual and Interactive DEscriptor ANalysis (VIDEAN), that combines statistical methods with interactive visualizations for choosing a set of descriptors for predicting a target property. Domain expertise can be added to the feature selection process by means of an interactive visual exploration of the data, aided by statistical tools and metrics based on information theory. Coordinated visual representations are presented for capturing different relationships and interactions among descriptors, target properties and candidate subsets of descriptors. The capabilities of the proposed software were assessed through different scenarios. These scenarios reveal how an expert can use the tool to choose one subset of descriptors from a group of candidate subsets, or to modify existing descriptor subsets and even incorporate new descriptors according to his or her own knowledge of the target property. The reported experiences showed the suitability of our software for selecting sets of descriptors with low cardinality, high interpretability, low redundancy and high statistical performance in a visual exploratory way. It is therefore possible to conclude that the resulting tool allows the integration of a chemist's expertise into the descriptor selection process with low cognitive effort, in contrast with the alternative of an ad-hoc manual analysis of the selected descriptors. Graphical abstract: VIDEAN allows the visual analysis of candidate subsets of descriptors for QSAR/QSPR. In the two panels on the top, users can interactively explore numerical correlations as well as co-occurrences in the candidate subsets through two interactive graphs.
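
    VIDEAN's own information-theoretic metrics are not reproduced here; as a hedged illustration of the kind of statistical aid described above, candidate descriptors could be ranked by their mutual information with the target property (the descriptor data below are synthetic):

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)

# Hypothetical data: 200 molecules x 5 numerical descriptors and one target property
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.1, size=200)

# Mutual information between each descriptor and the target property
mi = mutual_info_regression(X, y, random_state=0)

# Rank descriptors; an expert could then inspect, keep, or swap members of the subset
ranking = sorted(enumerate(mi), key=lambda kv: kv[1], reverse=True)
for idx, score in ranking:
    print(f"descriptor_{idx}: MI = {score:.3f}")
```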

  8. Integrating uncertainty into public energy research and development decisions

    NASA Astrophysics Data System (ADS)

    Anadón, Laura Díaz; Baker, Erin; Bosetti, Valentina

    2017-05-01

    Public energy research and development (R&D) is recognized as a key policy tool for transforming the world's energy system in a cost-effective way. However, managing the uncertainty surrounding technological change is a critical challenge for designing robust and cost-effective energy policies. The design of such policies is particularly important if countries are going to both meet the ambitious greenhouse-gas emissions reductions goals set by the Paris Agreement and achieve the required harmonization with the broader set of objectives dictated by the Sustainable Development Goals. The complexity of informing energy technology policy requires, and is producing, a growing collaboration between different academic disciplines and practitioners. Three analytical components have emerged to support the integration of technological uncertainty into energy policy: expert elicitations, integrated assessment models, and decision frameworks. Here we review efforts to incorporate all three approaches to facilitate public energy R&D decision-making under uncertainty. We highlight emerging insights that are robust across elicitations, models, and frameworks, relating to the allocation of public R&D investments, and identify gaps and challenges that remain.

  9. Warfighter Integrated Physical Ergonomics Tool Development: Needs Analysis and State of the Art Review

    DTIC Science & Technology

    2011-03-01

    Search-result excerpt (repeated report headers and page numbers removed): "e) Forces: Griffon seat design assessments include questions of vibration... the suitability of alternative designs"; "e) Performance Measures... configurations to assess... design and acquisition decisions, and more".

  10. OR.NET: multi-perspective qualitative evaluation of an integrated operating room based on IEEE 11073 SDC.

    PubMed

    Rockstroh, M; Franke, S; Hofer, M; Will, A; Kasparick, M; Andersen, B; Neumuth, T

    2017-08-01

    Clinical working environments have become very complex, imposing many different tasks in diagnosis, medical treatment, and care procedures. During the German flagship project OR.NET, more than 50 partners developed technologies for an open integration of medical devices and IT systems in the operating room. The aim of the present work was to evaluate a large set of the proposed concepts from the perspectives of various stakeholders. The demonstration OR focuses on interventions in head and neck surgery and was developed in close cooperation with surgeons and numerous colleagues from the project partners. The demonstration OR was qualitatively evaluated, covering technical as well as clinical aspects. In the evaluation, a questionnaire was used to obtain feedback from hospital operators. The clinical implications were covered by structured interviews with surgeons, anesthesiologists and OR staff. In the present work, we qualitatively evaluate a subset of the proposed concepts from the perspectives of various stakeholders. The feedback of the clinicians indicates that there is a need for flexible data and control integration. The hospital operators stress the need for tools to simplify risk management in openly integrated operating rooms. The implementation of openly integrated operating rooms will positively affect the surgeons, the anesthesiologists, the surgical nursing staff, as well as the technical personnel and the hospital operators. The evaluation demonstrated the need for OR integration technologies and identified the missing tools to support risk management and approval as the main barriers to future installations.

  11. Simulink/PARS Integration Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vacaliuc, B.; Nakhaee, N.

    2013-12-18

    The state of the art for signal processor hardware has far outpaced the development tools for placing applications on that hardware. In addition, signal processors are available in a variety of architectures, each uniquely capable of handling specific types of signal processing efficiently. With these processors becoming smaller and demanding less power, it has become possible to group multiple processors, a heterogeneous set of processors, into single systems. Different portions of the desired problem set can be assigned to different processor types as appropriate. As software development tools do not keep pace with these processors, especially when multiple processors of different types are used, a method is needed to enable software code portability among multiple processors and multiple types of processors along with their respective software environments. Sundance DSP, Inc. has developed a software toolkit called “PARS”, whose objective is to provide a framework that uses suites of tools provided by different vendors, along with modeling tools and a real time operating system, to build an application that spans different processor types. The software language used to express the behavior of the system is a very high level modeling language, “Simulink”, a MathWorks product. ORNL has used this toolkit to effectively implement several deliverables. This CRADA describes this collaboration between ORNL and Sundance DSP, Inc.

  12. Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases

    PubMed Central

    Amos, Christopher I.; Bafna, Vineet; Hauser, Elizabeth R.; Hernandez, Ryan D.; Li, Chun; Liberles, David A.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Papanicolaou, George J.; Peng, Bo; Ritchie, Marylyn D.; Rosenfeld, Gabriel; Witte, John S.

    2014-01-01

    Genetic simulation programs are used to model data under specified assumptions to facilitate the understanding and study of complex genetic systems. Standardized data sets generated using genetic simulation are essential for the development and application of novel analytical tools in genetic epidemiology studies. With continuing advances in high-throughput genomic technologies and generation and analysis of larger, more complex data sets, there is a need for updating current approaches in genetic simulation modeling. To provide a forum to address current and emerging challenges in this area, the National Cancer Institute (NCI) sponsored a workshop, entitled “Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases” at the National Institutes of Health (NIH) in Bethesda, Maryland on March 11-12, 2014. The goals of the workshop were to: (i) identify opportunities, challenges and resource needs for the development and application of genetic simulation models; (ii) improve the integration of tools for modeling and analysis of simulated data; and (iii) foster collaborations to facilitate development and applications of genetic simulation. During the course of the meeting the group identified challenges and opportunities for the science of simulation, software and methods development, and collaboration. This paper summarizes key discussions at the meeting, and highlights important challenges and opportunities to advance the field of genetic simulation. PMID:25371374

  13. Re-Writing the Construction History of Boughton House (northamptonshire, Uk) with the Help of DOCU-TOOLS®

    NASA Astrophysics Data System (ADS)

    Schuster, J. C.

    2017-08-01

    The tablet-based software docu-tools digitizes the documentation of buildings and simplifies construction and facility management as well as data analysis in building- and construction-history research. As a plan-based software, `pins' can be set to record data (images, audio, text etc.), each data point containing a time and date stamp. Once a pin is set and information recorded, it can never be deleted from the system, creating clear, contention-free documentation. Reports on any or all recorded data can be generated immediately through various templates in order to share, document, analyze and archive the information gathered. The software both digitizes building condition assessment and simplifies the fully documented management, problem solving and monitoring of a building. Used both in the construction industry and for documenting and analyzing historic buildings, docu-tools is a versatile and flexible tool that has become integral to my work as a building historian working on the conservation and curating of the historic built environment in Europe. I used the software at Boughton House, Northamptonshire, UK, during a one-year research project into the construction history of the building. The details of how docu-tools was used during this project will be discussed in this paper.

  14. Developing Electronic Health Record (EHR) Strategies Related to Health Center Patients' Social Determinants of Health.

    PubMed

    Gold, Rachel; Cottrell, Erika; Bunce, Arwen; Middendorf, Mary; Hollombe, Celine; Cowburn, Stuart; Mahr, Peter; Melgar, Gerardo

    2017-01-01

    "Social determinants of heath" (SDHs) are nonclinical factors that profoundly affect health. Helping community health centers (CHCs) document patients' SDH data in electronic health records (EHRs) could yield substantial health benefits, but little has been reported about CHCs' development of EHR-based tools for SDH data collection and presentation. We worked with 27 diverse CHC stakeholders to develop strategies for optimizing SDH data collection and presentation in their EHR, and approaches for integrating SDH data collection and the use of those data (eg, through referrals to community resources) into CHC workflows. We iteratively developed a set of EHR-based SDH data collection, summary, and referral tools for CHCs. We describe considerations that arose while developing the tools and present some preliminary lessons learned. Standardizing SDH data collection and presentation in EHRs could lead to improved patient and population health outcomes in CHCs and other care settings. We know of no previous reports of processes used to develop similar tools. This article provides an example of 1 such process. Lessons from our process may be useful to health care organizations interested in using EHRs to collect and act on SDH data. Research is needed to empirically test the generalizability of these lessons. © Copyright 2017 by the American Board of Family Medicine.

  15. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.
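
    Kepler itself is a Java-based, actor-oriented workflow system; the sketch below only illustrates, in Python and with invented script names (normalize_arrays.R, AltAnalyze.py, compare_groups.R are placeholders, not the actual actors of the published workflow), the pattern described above of chaining heterogeneous external tools so that data flows from one step to the next:

```python
import subprocess
from pathlib import Path

def run_step(cmd: list[str], dry_run: bool = True) -> None:
    """Run one external workflow step; in dry-run mode just print the command."""
    if dry_run:
        print("would run:", " ".join(cmd))
    else:
        subprocess.run(cmd, check=True)

def microarray_pipeline(raw_csv: Path, workdir: Path, dry_run: bool = True) -> Path:
    """Chain heterogeneous tools (R/Bioconductor, AltAnalyze, R comparison) in sequence."""
    normalized = workdir / "normalized.csv"
    enriched = workdir / "altanalyze_out"
    report = workdir / "comparison.csv"
    run_step(["Rscript", "normalize_arrays.R", str(raw_csv), str(normalized)], dry_run)
    run_step(["python", "AltAnalyze.py", "--input", str(normalized), "--output", str(enriched)], dry_run)
    run_step(["Rscript", "compare_groups.R", str(enriched), str(report)], dry_run)
    return report

if __name__ == "__main__":
    # Dry run by default, so the sketch stays runnable without the external tools installed
    microarray_pipeline(Path("GSE_local_arrays.csv"), Path("."))
```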

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Runhaar, Hens

    An abundance of approaches, strategies, and instruments – in short: tools – have been developed that intend to stimulate or facilitate the integration of a variety of environmental objectives into development planning, national or regional sectoral policies, international agreements, business strategies, etc. These tools include legally mandatory procedures, such as Environmental Impact Assessment and Strategic Environmental Assessment; more voluntary tools such as environmental indicators developed by scientists and planning tools; green budgeting, etc. A relatively underexplored question is what integration tool fits what particular purposes and contexts, in short: “what works where?”. This paper intends to contribute to answering this question, by first providing conceptual clarity about what integration entails, by suggesting and illustrating a classification of integration tools, and finally by summarising some of the lessons learned about how and why integration tools are (not) used and with what outcomes, particularly in terms of promoting the integration of environmental objectives.

  17. Freva - Freie Univ Evaluation System Framework for Scientific Infrastructures in Earth System Modeling

    NASA Astrophysics Data System (ADS)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Schartner, Thomas; Kirchner, Ingo; Rust, Henning W.; Cubasch, Ulrich; Ulbrich, Uwe

    2016-04-01

    The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the metadata of the self-describing model, reanalysis and observational data sets in a database. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. The integrated web shell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gate to the research project's HPC system. Plugins can integrate their post-processed results into the user's database. This allows, for example, post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via the shell or web system; plugged-in tools therefore benefit from transparency and reproducibility. Furthermore, if configurations match when an evaluation plugin is started, the system suggests reusing results already produced by other users, saving CPU hours, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
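
    Freva's actual plugin API is not reproduced here; as a loosely analogous and purely hypothetical sketch (class and method names are invented), a framework of this kind might wrap an arbitrary external analysis tool and record each run for a history sub-system roughly like this:

```python
import subprocess
from dataclasses import dataclass, field

@dataclass
class AnalysisPlugin:
    """Hypothetical wrapper for an external analysis tool, any language."""
    name: str
    command: list[str]
    history: list[dict] = field(default_factory=list)

    def run(self, **config) -> dict:
        """Run the wrapped tool and record the configuration for reproducibility."""
        args = [f"--{k}={v}" for k, v in sorted(config.items())]
        subprocess.run(self.command + args, check=True)
        record = {"plugin": self.name, "config": config}
        self.history.append(record)   # stand-in for a history/configuration database
        return record

if __name__ == "__main__":
    plugin = AnalysisPlugin("bias_check", ["echo"])   # 'echo' keeps the sketch runnable
    print(plugin.run(model="MPI-ESM", experiment="historical", variable="tas"))
```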

  18. A web platform for integrated surface water - groundwater modeling and data management

    NASA Astrophysics Data System (ADS)

    Fatkhutdinov, Aybulat; Stefan, Catalin; Junghanns, Ralf

    2016-04-01

    Model-based decision support systems are considered to be reliable and time-efficient tools for resources management in various hydrology-related fields. However, searching for and acquiring the required data, preparing the data sets for simulations, as well as post-processing, visualizing and publishing the simulation results often require significantly more work and time than performing the modeling itself. The purpose of the developed software is to combine data storage facilities, data processing instruments and modeling tools in a single platform, which can potentially reduce the time required for performing simulations and hence for decision making. The system is developed within the INOWAS (Innovative Web Based Decision Support System for Water Sustainability under a Changing Climate) project. The platform integrates spatially distributed catchment-scale rainfall-runoff, infiltration and groundwater flow models with data storage, processing and visualization tools. The concept is implemented in the form of a web-GIS application and is built on free and open source components, including the PostgreSQL database management system, the Python programming language for modeling purposes, MapServer for visualizing and publishing the data, OpenLayers for building the user interface, and others. The configuration of the system allows data input, storage, pre- and post-processing and visualization to be performed in a single uninterrupted workflow. In addition, realization of the decision support system as a web service provides an opportunity to easily retrieve and share data sets as well as simulation results over the internet, which gives significant advantages for collaborative project work and can significantly increase the usability of the decision support system.
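
    The platform's actual model implementations are not shown in the abstract; as a minimal, purely illustrative sketch of the kind of catchment-scale rainfall-runoff component such a platform might integrate, a lumped linear-reservoir model (with hypothetical parameter values) can be written as:

```python
def linear_reservoir_runoff(rainfall_mm, k=0.2, storage0_mm=5.0):
    """Minimal lumped linear-reservoir model: daily runoff = k * storage.

    rainfall_mm : daily rainfall series [mm]
    k           : storage outflow coefficient per day (hypothetical value)
    storage0_mm : initial storage [mm]
    """
    storage = storage0_mm
    runoff = []
    for p in rainfall_mm:
        storage += p          # add the day's rainfall to storage
        q = k * storage       # release a fixed fraction as runoff
        storage -= q
        runoff.append(q)
    return runoff

if __name__ == "__main__":
    rain = [0, 12, 3, 0, 0, 25, 5, 0]   # hypothetical daily rainfall [mm]
    print([round(q, 2) for q in linear_reservoir_runoff(rain)])
```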

  19. Using Arden Syntax for the Generation of Intelligent Intensive Care Discharge Letters.

    PubMed

    Kraus, Stefan; Castellanos, Ixchel; Albermann, Matthias; Schuettler, Christina; Prokosch, Hans-Ulrich; Staudigel, Martin; Toddenroth, Dennis

    2016-01-01

    Discharge letters are an important means of communication between physicians and nurses from intensive care units and their colleagues from normal wards. The patient data management system (PDMS) used at our local intensive care units provides an export tool to create discharge letters by inserting data items from electronic medical records into predefined templates. Local intensivists criticized the limitations of this tool regarding the identification and the further processing of clinically relevant data items for a flexible creation of discharge letters. As our PDMS supports Arden Syntax, and the demanded functionalities are well within the scope of this standard, we set out to investigate the suitability of Arden Syntax for the generation of discharge letters. To provide an easy-to-understand facility for integrating data items into document templates, we created an Arden Syntax interface function which replaces the names of previously defined variables with their content in a way that permits arbitrary custom formatting by clinical users. Our approach facilitates the creation of flexible text sections by conditional statements, as well as the integration of arbitrary HTML code and dynamically generated graphs. The resulting prototype enables clinical users to apply the full set of Arden Syntax language constructs to identify and process relevant data items in a way that far exceeds the capabilities of the PDMS export tool. The generation of discharge letters is an uncommon area of application for Arden Syntax, considerably differing from its original purpose. However, we found our prototype well suited for this task and plan to evaluate it in clinical production after the next major release change of our PDMS.
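
    Arden Syntax code is not reproduced here; the following Python sketch only illustrates the substitution mechanism described above (replacing the names of previously defined variables with their formatted content inside an HTML template), with an invented placeholder syntax and hypothetical variable names rather than the actual PDMS interface:

```python
import re

def render_letter(template: str, variables: dict) -> str:
    """Replace {{name}} placeholders with formatted values.

    The {{...}} placeholder style and formatting rules here are illustrative only;
    the Arden Syntax interface function described in the abstract operates on
    MLM variables inside the PDMS rather than on markers like these.
    """
    def substitute(match: re.Match) -> str:
        value = variables.get(match.group(1), "")
        if isinstance(value, list):                  # e.g. a list of diagnoses
            return "".join(f"<li>{item}</li>" for item in value)
        return str(value)
    return re.sub(r"\{\{(\w+)\}\}", substitute, template)

template = "<h1>Discharge letter</h1><p>Patient: {{patient_name}}</p><ul>{{diagnoses}}</ul>"
print(render_letter(template, {"patient_name": "Jane Doe",
                               "diagnoses": ["Sepsis", "Acute kidney injury"]}))
```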

  20. Investigation and Evaluation of the open source ETL tools GeoKettle and Talend Open Studio in terms of their ability to process spatial data

    NASA Astrophysics Data System (ADS)

    Kuhnert, Kristin; Quedenau, Jörn

    2016-04-01

    Integration and harmonization of large spatial data sets has been a major issue, and not only since the introduction of the INSPIRE spatial data infrastructure. Extracting and combining spatial data from heterogeneous source formats, transforming the data to obtain the quality required for particular purposes, and loading it into a data store are common tasks. This procedure of Extraction, Transformation and Loading of data is called an ETL process. Geographic Information Systems (GIS) can take over many of these tasks, but often they are not suitable for processing large datasets. ETL tools can make the implementation and execution of ETL processes convenient and efficient. One reason for choosing ETL tools for data integration is that they ease maintenance because of a clear (graphical) presentation of the transformation steps. Developers and administrators are provided with tools for identifying errors, analyzing processing performance and managing the execution of ETL processes. Another benefit of ETL tools is that most tasks require little or no scripting, so that researchers without a programming background can also work with them easily. Investigations of ETL tools for business applications have been available for a long time. However, little work has been published on the capabilities of those tools to handle spatial data. In this work, we review and compare the open source ETL tools GeoKettle and Talend Open Studio in terms of processing spatial data sets of different formats. For the evaluation, ETL processes are performed with both software packages based on air quality data measured during the BÄRLIN2014 Campaign initiated by the Institute for Advanced Sustainability Studies (IASS). The aim of the BÄRLIN2014 Campaign is to better understand the sources and distribution of particulate matter in Berlin. The air quality data are available in heterogeneous formats because they were measured with different instruments. For further data analysis, the instrument data have been complemented by other georeferenced data provided by the local environmental authorities, including both vector and raster data on, e.g., land use categories or building heights, extracted from flat files and OGC-compliant web services. The requirements on the ETL tools include, for instance, extracting different input datasets such as Web Feature Services or vector datasets and loading them into databases. The tools also have to manage transformations on spatial datasets, such as applying spatial functions (e.g. intersection, union) or changing spatial reference systems. Preliminary results suggest that many complex transformation tasks can be accomplished with the existing set of components of both software tools, while there are still many gaps in the range of available features. The two ETL tools differ in functionality and in the way various steps are implemented. For some tasks no predefined components are available at all, which can partly be compensated for by using the respective API (freely configurable components in Java or JavaScript).
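
    As a hedged, stand-alone illustration of one such spatial transformation step (changing the spatial reference system of point measurements), the Python/pyproj snippet below reproduces the kind of operation a single GeoKettle or Talend component would be configured to perform graphically; the station coordinates and target CRS are assumptions for the example, not data from the campaign:

```python
from pyproj import Transformer

def reproject_points(points_wgs84, target_crs="EPSG:25833"):
    """Transform (lon, lat) pairs from WGS84 to a projected CRS (ETRS89 / UTM 33N here)."""
    transformer = Transformer.from_crs("EPSG:4326", target_crs, always_xy=True)
    return [transformer.transform(lon, lat) for lon, lat in points_wgs84]

# Hypothetical air-quality station coordinates in Berlin (lon, lat)
stations = [(13.4050, 52.5200), (13.3500, 52.5145)]
for easting, northing in reproject_points(stations):
    print(f"E={easting:.1f} m, N={northing:.1f} m")
```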

  1. A Distributed Control System Prototyping Environment to Support Control Room Modernization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lew, Roger Thomas; Boring, Ronald Laurids; Ulrich, Thomas Anthony

    Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs, however the set of tools for developing and designing HMIs is still in its infancy. Here we propose a rapid prototyping approach for integrating proposed HMIs into their native environments before a design is finalized. This approach allows researchers and developers to test design ideas and eliminate design flaws prior to fully developing the new system. We illustrate this approach with four prototype designs developed using Microsoft’s Windows Presentation Foundation (WPF). One example is integrated into a microworld environment to test the functionality of the design and identify the optimal level of automation for a new system in a nuclear power plant. The other three examples are integrated into a full-scale, glasstop digital simulator of a nuclear power plant. One example demonstrates the capabilities of next generation control concepts; another aims to expand the current state of the art; lastly, an HMI prototype was developed as a test platform for a new control system currently in development at U.S. nuclear power plants. WPF possesses several characteristics that make it well suited to HMI design. It provides a tremendous amount of flexibility, agility, robustness, and extensibility. Distributed control system (DCS) specific environments tend to focus on the safety and reliability requirements for real-world interfaces and consequently have less emphasis on providing functionality to support novel interaction paradigms. Because of WPF’s large user-base, Microsoft can provide an extremely mature tool. Within process control applications, WPF is platform independent and can communicate with popular full-scope process control simulator vendor plant models and DCS platforms.

  2. Development of the Hydroecological Integrity Assessment Process for Determining Environmental Flows for New Jersey Streams

    USGS Publications Warehouse

    Kennen, Jonathan G.; Henriksen, James A.; Nieswand, Steven P.

    2007-01-01

    The natural flow regime paradigm and parallel stream ecological concepts and theories have established the benefits of maintaining or restoring the full range of natural hydrologic variation for physiochemical processes, biodiversity, and the evolutionary potential of aquatic and riparian communities. A synthesis of recent advances in hydroecological research coupled with stream classification has resulted in a new process to determine environmental flows and assess hydrologic alteration. This process has national and international applicability. It allows classification of streams into hydrologic stream classes and identification of a set of non-redundant and ecologically relevant hydrologic indices for 10 critical sub-components of flow. Three computer programs have been developed for implementing the Hydroecological Integrity Assessment Process (HIP): (1) the Hydrologic Indices Tool (HIT), which calculates 171 ecologically relevant hydrologic indices on the basis of daily-flow and peak-flow stream-gage data; (2) the New Jersey Hydrologic Assessment Tool (NJHAT), which can be used to establish a hydrologic baseline period, provide options for setting baseline environmental-flow standards, and compare past and proposed streamflow alterations; and (3) the New Jersey Stream Classification Tool (NJSCT), designed for placing unclassified streams into pre-defined stream classes. Biological and multivariate response models including principal-component, cluster, and discriminant-function analyses aided in the development of software and implementation of the HIP for New Jersey. A pilot effort is currently underway by the New Jersey Department of Environmental Protection in which the HIP is being used to evaluate the effects of past and proposed surface-water use, ground-water extraction, and land-use changes on stream ecosystems while determining the most effective way to integrate the process into ongoing regulatory programs. Ultimately, this scientifically defensible process will help to quantify the effects of anthropogenic changes and development on hydrologic variability and help planners and resource managers balance current and future water requirements with ecological needs.
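
    The HIT software computes a specifically defined set of 171 indices; the sketch below computes only a few generic, textbook flow indices from a synthetic daily streamflow record, as an illustration of the kind of calculation involved (it is not the HIT implementation):

```python
import numpy as np

def basic_flow_indices(daily_flow):
    """Compute a few generic hydrologic indices from a daily streamflow series.

    These are common illustrative indices only; HIT's 171 ecologically relevant
    indices are specifically defined and not reproduced here.
    """
    q = np.asarray(daily_flow, dtype=float)
    rolling_7day = np.convolve(q, np.ones(7) / 7.0, mode="valid")
    return {
        "mean_daily_flow": q.mean(),
        "cv_daily_flow": q.std() / q.mean(),     # flow variability
        "min_7day_flow": rolling_7day.min(),     # low-flow magnitude
        "high_flow_q10": np.percentile(q, 90),   # flow exceeded 10% of the time
    }

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    synthetic_flow = rng.lognormal(mean=2.0, sigma=0.6, size=365)   # hypothetical record
    for name, value in basic_flow_indices(synthetic_flow).items():
        print(f"{name}: {value:.2f}")
```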

  3. LENS: web-based lens for enrichment and network studies of human proteins

    PubMed Central

    2015-01-01

    Background Network analysis is a common approach for studying the genetics of diseases and biological pathways. When a set of genes is identified to be of interest in relation to a disease, say through a genome-wide association study (GWAS) or a different gene expression study, these genes are typically analyzed in the context of their protein-protein interaction (PPI) networks. Further analysis is carried out to compute the enrichment of known pathways and disease associations in the network. Having tools for such analysis at the fingertips of biologists, without the requirement for computer programming or curation of data, would accelerate the characterization of genes of interest. Currently available tools do not integrate network and enrichment analysis and their visualizations, and most of them present results in formats not conducive to human cognition. Results We developed the tool Lens for Enrichment and Network Studies of human proteins (LENS), which performs network, pathway and disease enrichment analyses on genes of interest to users. The tool creates a visualization of the network, provides easy-to-read statistics on network connectivity, and displays Venn diagrams with statistical significance values of the network's association with drugs, diseases, pathways, and GWASs. We used the tool to analyze gene sets related to craniofacial development, autism, and schizophrenia. Conclusion LENS is a web-based tool that does not require any download or plugins to use. The tool is free, does not require login for use, and is available at http://severus.dbmi.pitt.edu/LENS. PMID:26680011

  4. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing.

    PubMed

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D'Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-04-05

    International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants' comprehension of the study information was measured by using a validated digitised audio questionnaire. The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants' 'recall' and 'understanding' between the first and second visits were statistically significant (F(1,41) = 25.38, p < 0.00001 and F(1,41) = 31.61, p < 0.00001, respectively). Our locally developed multimedia tool was acceptable and easy to administer among low literacy participants in The Gambia. It also proved to be effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings.

  5. Drug2Gene: an exhaustive resource to explore effectively the drug-target relation network.

    PubMed

    Roider, Helge G; Pavlova, Nadia; Kirov, Ivaylo; Slavov, Stoyan; Slavov, Todor; Uzunov, Zlatyo; Weiss, Bertram

    2014-03-11

    Information about drug-target relations is at the heart of drug discovery. There are now dozens of databases providing drug-target interaction data with varying scope and focus. Therefore, and due to the large chemical space, the overlap of the different data sets is surprisingly small. As searching through these sources manually is cumbersome, time-consuming and error-prone, integrating all the data is highly desirable. Despite a few attempts, integration has been hampered by the diversity of descriptions of compounds, and by the fact that the reported activity values, coming from different data sets, are not always directly comparable due to usage of different metrics or data formats. We have built Drug2Gene, a knowledge base, which combines the compound/drug-gene/protein information from 19 publicly available databases. A key feature is our rigorous unification and standardization process which makes the data truly comparable on a large scale, allowing for the first time effective data mining in such a large knowledge corpus. As of version 3.2, Drug2Gene contains 4,372,290 unified relations between compounds and their targets, most of which include reported bioactivity data. We extend this set with putative (i.e. homology-inferred) relations where sufficient sequence homology between proteins suggests they may bind to similar compounds. Drug2Gene provides powerful search functionalities, very flexible export procedures, and a user-friendly web interface. Drug2Gene v3.2 has become a mature and comprehensive knowledge base providing unified, standardized drug-target related information gathered from publicly available data sources. It can be used to integrate proprietary data sets with publicly available data sets. Its main goal is to be a 'one-stop shop' to identify tool compounds targeting a given gene product or for finding all known targets of a drug. Drug2Gene with its integrated data set of public compound-target relations is freely accessible without restrictions at http://www.drug2gene.com.

  6. A governance model for integrated primary/secondary care for the health-reforming first world – results of a systematic review

    PubMed Central

    2013-01-01

    Background Internationally, key health care reform elements rely on improved integration of care between the primary and secondary sectors. The objective of this systematic review is to synthesise the existing published literature on elements of current integrated primary/secondary health care. These elements and how they have supported integrated healthcare governance are presented. Methods A systematic review of peer-reviewed literature from PubMed, MEDLINE, CINAHL, the Cochrane Library, Informit Health Collection, the Primary Health Care Research and Information Service, the Canadian Health Services Research Foundation, European Foundation for Primary Care, European Forum for Primary Care, and Europa Sinapse was undertaken for the years 2006–2012. Relevant websites were also searched for grey literature. Papers were assessed by two assessors according to agreed inclusion criteria which were published in English, between 2006–2012, studies describing an integrated primary/secondary care model, and had reported outcomes in care quality, efficiency and/or satisfaction. Results Twenty-one studies met the inclusion criteria. All studies evaluated the process of integrated governance and service delivery structures, rather than the effectiveness of services. They included case reports and qualitative data analyses addressing policy change, business issues and issues of clinical integration. A thematic synthesis approach organising data according to themes identified ten elements needed for integrated primary/secondary health care governance across a regional setting including: joint planning; integrated information communication technology; change management; shared clinical priorities; incentives; population focus; measurement – using data as a quality improvement tool; continuing professional development supporting joint working; patient/community engagement; and, innovation. Conclusions All examples of successful primary/secondary care integration reported in the literature have focused on a combination of some, if not all, of the ten elements described in this paper, and there appears to be agreement that multiple elements are required to ensure successful and sustained integration efforts. Whilst no one model fits all systems these elements provide a focus for setting up integration initiatives which need to be flexible for adapting to local conditions and settings. PMID:24359610

  7. A governance model for integrated primary/secondary care for the health-reforming first world - results of a systematic review.

    PubMed

    Nicholson, Caroline; Jackson, Claire; Marley, John

    2013-12-20

    Internationally, key health care reform elements rely on improved integration of care between the primary and secondary sectors. The objective of this systematic review is to synthesise the existing published literature on elements of current integrated primary/secondary health care. These elements and how they have supported integrated healthcare governance are presented. A systematic review of peer-reviewed literature from PubMed, MEDLINE, CINAHL, the Cochrane Library, Informit Health Collection, the Primary Health Care Research and Information Service, the Canadian Health Services Research Foundation, European Foundation for Primary Care, European Forum for Primary Care, and Europa Sinapse was undertaken for the years 2006-2012. Relevant websites were also searched for grey literature. Papers were assessed by two assessors according to agreed inclusion criteria which were published in English, between 2006-2012, studies describing an integrated primary/secondary care model, and had reported outcomes in care quality, efficiency and/or satisfaction. Twenty-one studies met the inclusion criteria. All studies evaluated the process of integrated governance and service delivery structures, rather than the effectiveness of services. They included case reports and qualitative data analyses addressing policy change, business issues and issues of clinical integration. A thematic synthesis approach organising data according to themes identified ten elements needed for integrated primary/secondary health care governance across a regional setting including: joint planning; integrated information communication technology; change management; shared clinical priorities; incentives; population focus; measurement - using data as a quality improvement tool; continuing professional development supporting joint working; patient/community engagement; and, innovation. All examples of successful primary/secondary care integration reported in the literature have focused on a combination of some, if not all, of the ten elements described in this paper, and there appears to be agreement that multiple elements are required to ensure successful and sustained integration efforts. Whilst no one model fits all systems these elements provide a focus for setting up integration initiatives which need to be flexible for adapting to local conditions and settings.

  8. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance-oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production-quality, simulation-based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  9. Integrating habitat status, human population pressure, and protection status into biodiversity conservation priority setting

    USGS Publications Warehouse

    Shi, Hua; Singh, Ashbindu; Kant, S.; Zhu, Zhiliang; Waller, E.

    2005-01-01

    Priority setting is an essential component of biodiversity conservation. Existing methods to identify priority areas for conservation have focused almost entirely on biological factors. We suggest a new relative ranking method for identifying priority conservation areas that integrates both biological and social aspects. It is based on the following criteria: the habitat's status, human population pressure, human efforts to protect habitat, and number of endemic plant and vertebrate species. We used this method to rank 25 hotspots, 17 megadiverse countries, and the hotspots within each megadiverse country. We used consistent, comprehensive, georeferenced, and multiband data sets and analytical remote sensing and geographic information system tools to quantify habitat status, human population pressure, and protection status. The ranking suggests that the Philippines, Atlantic Forest, Mediterranean Basin, Caribbean Islands, Caucasus, and Indo-Burma are the hottest hotspots and that China, the Philippines, and India are the hottest megadiverse countries. The great variation in terms of habitat, protected areas, and population pressure among the hotspots, the megadiverse countries, and the hotspots within the same country suggests the need for hotspot- and country-specific conservation policies.
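
    The authors' exact scoring procedure is not given in the abstract; the sketch below shows one simple way a relative ranking could combine normalized habitat, population-pressure, protection and endemism criteria. The equal weighting, min-max normalization, and example values are purely hypothetical illustrations, not the published method or data:

```python
def relative_rank(regions):
    """Rank regions by a simple composite of normalized criteria.

    Higher scores mean higher priority: less remaining habitat, more population
    pressure, less protection, and more endemic species.
    """
    def norm(values, invert=False):
        lo, hi = min(values), max(values)
        scaled = [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]
        return [1.0 - s for s in scaled] if invert else scaled

    names = list(regions)
    habitat = norm([regions[n]["habitat_pct"] for n in names], invert=True)
    pressure = norm([regions[n]["pop_density"] for n in names])
    protect = norm([regions[n]["protected_pct"] for n in names], invert=True)
    endemics = norm([regions[n]["endemic_species"] for n in names])
    scores = {n: (h + p + pr + e) / 4.0
              for n, h, p, pr, e in zip(names, habitat, pressure, protect, endemics)}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical example values, not data from the study
hotspots = {
    "Hotspot A": {"habitat_pct": 8, "pop_density": 320, "protected_pct": 4, "endemic_species": 6000},
    "Hotspot B": {"habitat_pct": 25, "pop_density": 90, "protected_pct": 12, "endemic_species": 2500},
}
print(relative_rank(hotspots))
```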

  10. Allen Brain Atlas: an integrated spatio-temporal portal for exploring the central nervous system

    PubMed Central

    Sunkin, Susan M.; Ng, Lydia; Lau, Chris; Dolbeare, Tim; Gilbert, Terri L.; Thompson, Carol L.; Hawrylycz, Michael; Dang, Chinh

    2013-01-01

    The Allen Brain Atlas (http://www.brain-map.org) provides a unique online public resource integrating extensive gene expression data, connectivity data and neuroanatomical information with powerful search and viewing tools for the adult and developing brain in mouse, human and non-human primate. Here, we review the resources available at the Allen Brain Atlas, describing each product and data type [such as in situ hybridization (ISH) and supporting histology, microarray, RNA sequencing, reference atlases, projection mapping and magnetic resonance imaging]. In addition, standardized and unique features in the web applications are described that enable users to search and mine the various data sets. Features include both simple and sophisticated methods for gene searches, colorimetric and fluorescent ISH image viewers, graphical displays of ISH, microarray and RNA sequencing data, Brain Explorer software for 3D navigation of anatomy and gene expression, and an interactive reference atlas viewer. In addition, cross data set searches enable users to query multiple Allen Brain Atlas data sets simultaneously. All of the Allen Brain Atlas resources can be accessed through the Allen Brain Atlas data portal. PMID:23193282

  11. Data management integration for biomedical core facilities

    NASA Astrophysics Data System (ADS)

    Zhang, Guo-Qiang; Szymanski, Jacek; Wilson, David

    2007-03-01

    We present the design, development, and pilot-deployment experiences of MIMI, a web-based, Multi-modality Multi-Resource Information Integration environment for biomedical core facilities. This is an easily customizable, web-based software tool that integrates scientific and administrative support for a biomedical core facility involving a common set of entities: researchers; projects; equipment and devices; support staff; services; samples and materials; experimental workflow; large and complex data. With this software, one can: register users; manage projects; schedule resources; bill services; perform site-wide search; archive, back-up, and share data. With its customizable, expandable, and scalable characteristics, MIMI not only provides a cost-effective solution to the overarching data management problem of biomedical core facilities unavailable in the marketplace, but also lays a foundation for data federation to facilitate and support discovery-driven research.

  12. The challenge of causal inference in gene-environment interaction research: leveraging research designs from the social sciences.

    PubMed

    Fletcher, Jason M; Conley, Dalton

    2013-10-01

    The integration of genetics and the social sciences will lead to a more complex understanding of the articulation between social and biological processes, although the empirical difficulties inherent in this integration are large. One key challenge is the implications of moving "outside the lab" and away from the experimental tools available for research with model organisms. Social science research methods used to examine human behavior in nonexperimental, real-world settings to date have not been fully taken advantage of during this disciplinary integration, especially in the form of gene-environment interaction research. This article outlines and provides examples of several prominent research designs that should be used in gene-environment research and highlights a key benefit to geneticists of working with social scientists.

  13. The Role of Dog Population Management in Rabies Elimination—A Review of Current Approaches and Future Opportunities

    PubMed Central

    Taylor, Louise H.; Wallace, Ryan M.; Balaram, Deepashree; Lindenmayer, Joann M.; Eckery, Douglas C.; Mutonono-Watkiss, Beryl; Parravani, Ellie; Nel, Louis H.

    2017-01-01

    Free-roaming dogs and rabies transmission are integrally linked across many low-income countries, and large unmanaged dog populations can be daunting to rabies control program planners. Dog population management (DPM) is a multifaceted concept that aims to improve the health and well-being of free-roaming dogs, reduce problems they may cause, and may also aim to reduce dog population size. In theory, DPM can facilitate more effective rabies control. Community engagement focused on promoting responsible dog ownership and better veterinary care could improve the health of individual animals and dog vaccination coverage, thus reducing rabies transmission. Humane DPM tools, such as sterilization, could theoretically reduce dog population turnover and size, allowing rabies vaccination coverage to be maintained more easily. However, it is important to understand local dog populations and community attitudes toward them in order to determine whether and how DPM might contribute to rabies control and which DPM tools would be most successful. In practice, there is very limited evidence of DPM tools achieving reductions in the size or turnover of dog populations in canine rabies-endemic areas. Different DPM tools are frequently used together and combined with rabies vaccinations, but full impact assessments of DPM programs are not usually available, and therefore, evaluation of tools is difficult. Surgical sterilization is the most frequently documented tool and has successfully reduced dog population size and turnover in a few low-income settings. However, DPM programs are mostly conducted in urban settings and are usually not government funded, raising concerns about their applicability in rural settings and sustainability over time. Technical demands, costs, and the time necessary to achieve population-level impacts are major barriers. Given their potential value, we urgently need more evidence of the effectiveness of DPM tools in the context of canine rabies control. Cheaper, less labor-intensive tools for dog sterilization will be extremely valuable in realizing the potential benefits of reduced population turnover and size. No one DPM tool will fit all situations, but if DPM objectives are achieved dog populations may be stabilized or even reduced, facilitating higher dog vaccination coverages that will benefit rabies elimination efforts. PMID:28740850

  14. Empirical study using network of semantically related associations in bridging the knowledge gap.

    PubMed

    Abedi, Vida; Yeasin, Mohammed; Zand, Ramin

    2014-11-27

    The data overload has created a new set of challenges in finding meaningful and relevant information with minimal cognitive effort. However, designing robust and scalable knowledge discovery systems remains a challenge. Recent innovations in (biological) literature mining tools have opened new avenues to understand the confluence of various diseases, genes, risk factors as well as biological processes in bridging the gaps between the massive amounts of scientific data and harvesting useful knowledge. In this paper, we highlight some of the findings using a text analytics tool, called ARIANA--Adaptive Robust and Integrative Analysis for finding Novel Associations. Empirical study using ARIANA reveals knowledge discovery instances that illustrate the efficacy of such a tool. For example, ARIANA can capture the connection between the drug hexamethonium and pulmonary inflammation and fibrosis that caused the tragic death of a healthy volunteer in a 2001 Johns Hopkins asthma study, even though the abstract of the study was not part of the semantic model. An integrated system, such as ARIANA, could assist the human expert in exploratory literature search by bringing forward hidden associations, promoting data reuse and knowledge discovery as well as stimulating interdisciplinary projects by connecting information across the disciplines.

  15. CATCh, an Ensemble Classifier for Chimera Detection in 16S rRNA Sequencing Studies

    PubMed Central

    Mysara, Mohamed; Saeys, Yvan; Leys, Natalie; Raes, Jeroen

    2014-01-01

    In ecological studies, microbial diversity is nowadays mostly assessed via the detection of phylogenetic marker genes, such as 16S rRNA. However, PCR amplification of these marker genes produces a significant amount of artificial sequences, often referred to as chimeras. Different algorithms have been developed to remove these chimeras, but efforts to combine different methodologies are limited. Therefore, two machine learning classifiers (reference-based and de novo CATCh) were developed by integrating the output of existing chimera detection tools into a new, more powerful method. When comparing our classifiers with existing tools in either the reference-based or de novo mode, a higher performance of our ensemble method was observed on a wide range of sequencing data, including simulated, 454 pyrosequencing, and Illumina MiSeq data sets. Since our algorithm combines the advantages of different individual chimera detection tools, our approach produces more robust results when challenged with chimeric sequences having a low parent divergence, short length of the chimeric range, and various numbers of parents. Additionally, it could be shown that integrating CATCh in the preprocessing pipeline has a beneficial effect on the quality of the clustering in operational taxonomic units. PMID:25527546
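
    CATCh trains machine-learning classifiers on the outputs of existing chimera-detection tools; the sketch below is only a conceptual stand-in that combines hypothetical per-read tool calls with a fixed majority vote, to illustrate the general idea of ensembling tool outputs (the actual method learns a classifier over the tools' scores):

```python
def ensemble_chimera_call(tool_flags, min_votes=2):
    """Flag a read as chimeric when at least `min_votes` tools agree.

    tool_flags maps tool name -> boolean chimera call. CATCh itself trains a
    classifier on the tools' scores rather than using a fixed vote, so this
    majority rule is only a conceptual stand-in.
    """
    votes = sum(bool(flag) for flag in tool_flags.values())
    return votes >= min_votes

# Hypothetical per-read calls from individual chimera-detection tools
reads = {
    "read_001": {"UCHIME": True,  "ChimeraSlayer": True,  "Perseus": False},
    "read_002": {"UCHIME": False, "ChimeraSlayer": False, "Perseus": True},
}
for read_id, flags in reads.items():
    print(read_id, "chimeric" if ensemble_chimera_call(flags) else "ok")
```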

  16. [Compatibility of different quality control systems].

    PubMed

    Invernizzi, Enrico

    2002-01-01

    Management of the good laboratory practice (GLP) quality system presupposes its linking to a basic recognized and approved quality system, from which it can draw on management procedures common to all quality systems, such as the ISO 9000 set of norms. A quality system organized in this way can also be integrated with other dedicated quality systems, or parts of them, to obtain principles or management procedures for specific topics. The aim of this organization is to set up a reliable, recognized quality system compatible with the principles of GLP and other quality management systems, which provides users with a simplified set of easily accessible management tools and answers. The organization of this quality system is set out in the quality assurance programme, which is actually the document in which the test facility incorporates the GLP principles into its own quality organization.

  17. CONNJUR Workflow Builder: A software integration environment for spectral reconstruction

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J.C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides integration of primary data and metadata using a relational database and includes a library of pre-built workflows for processing time-domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time-domain data. As shown in the paper, the unique features of WB enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features that promote collaboration, education, and parameterization, supports non-uniform data sets, and integrates processing with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses. PMID:26066803
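
    The abstract's central design point, recording processing metadata in a relational database so that a workflow can be replayed and audited, can be illustrated in a few lines. The sketch below (Python with sqlite3) is hypothetical: the schema, table name, and processing steps are made up for illustration and are not WB's actual schema.

```python
import sqlite3

# Minimal sketch (not WB's actual schema): record each processing step of a
# time-domain -> frequency-domain workflow together with its parameters, so
# the reconstruction is reproducible and the metadata stays queryable.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE workflow_step (
                    id INTEGER PRIMARY KEY,
                    workflow TEXT,
                    position INTEGER,
                    tool TEXT,          -- e.g. 'NMRPipe' or 'RNMRTK'
                    operation TEXT,     -- e.g. 'zero-fill', 'FT', 'max-entropy'
                    parameters TEXT)""")

steps = [
    ("hsqc_demo", 1, "NMRPipe", "solvent-filter", "mode=POLY"),
    ("hsqc_demo", 2, "NMRPipe", "zero-fill", "factor=2"),
    ("hsqc_demo", 3, "RNMRTK", "max-entropy", "lambda=auto"),
    ("hsqc_demo", 4, "NMRPipe", "FT", "axis=direct"),
]
conn.executemany(
    "INSERT INTO workflow_step (workflow, position, tool, operation, parameters) "
    "VALUES (?, ?, ?, ?, ?)", steps)

# Replay the workflow in order -- the metadata doubles as documentation.
for row in conn.execute(
        "SELECT position, tool, operation, parameters FROM workflow_step "
        "WHERE workflow = ? ORDER BY position", ("hsqc_demo",)):
    print(row)
```

    Storing the parameters alongside each step is what makes the reconstruction reproducible: the same query that drives execution also documents exactly how a spectrum was produced.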

  18. Integrated simulations for fusion research in the 2030's time frame (white paper outline)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, Alex; LoDestro, Lynda L.; Parker, Jeffrey B.

    This white paper presents the rationale for developing a community-wide capability for whole-device modeling, and advocates for an effort with the expectation of persistence: a long-term programmatic commitment, and support for community efforts. Statement of 2030 goal (two suggestions): (a) Robust integrated simulation tools to aid real-time experimental discharges and reactor designs by employing a hierarchy in fidelity of physics models. (b) To produce by the early 2030s a capability for validated, predictive simulation via integration of a suite of physics models from moderate through high fidelity, to understand and plan full plasma discharges, aid in data interpretation, carry out discovery science, and optimize future machine designs. We can achieve this goal via a focused effort to extend current scientific capabilities and rigorously integrate simulations of disparate physics into a comprehensive set of workflows.
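
    The "hierarchy in fidelity of physics models" can be pictured as a dispatch layer that routes the same physics question to cheaper or more expensive solvers depending on the accuracy budget. The sketch below (Python) is entirely hypothetical: the model functions are toy stand-ins with arbitrary constants, not real fusion codes, and the white paper does not prescribe this design.

```python
from typing import Callable, Dict

def reduced_model(density: float, temperature: float) -> float:
    """Fast scaling-law-style estimate (toy formula, arbitrary constants)."""
    return 0.05 * density ** 0.4 * temperature ** -0.7

def intermediate_model(density: float, temperature: float) -> float:
    """Stand-in for a 1-D transport solve: costlier, slightly different answer."""
    return 0.05 * density ** 0.41 * temperature ** -0.69

def high_fidelity_model(density: float, temperature: float) -> float:
    """Stand-in for a first-principles-quality calculation."""
    return 0.05 * density ** 0.42 * temperature ** -0.68

# Hypothetical fidelity hierarchy: each level maps to a different solver.
MODEL_HIERARCHY: Dict[str, Callable[[float, float], float]] = {
    "reduced": reduced_model,
    "intermediate": intermediate_model,
    "high": high_fidelity_model,
}

def confinement_estimate(density: float, temperature: float, fidelity: str = "reduced") -> float:
    """Dispatch to the requested fidelity level -- the essence of a
    whole-device-modeling workflow that trades compute cost for accuracy."""
    return MODEL_HIERARCHY[fidelity](density, temperature)

# Between-shot analysis might use 'reduced'; a design study might use 'high'.
print(confinement_estimate(5.0e19, 10.0, fidelity="reduced"))
```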

  19. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    PubMed

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides integration of primary data and metadata using a relational database and includes a library of pre-built workflows for processing time-domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time-domain data. As shown in the paper, the unique features of WB enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features that promote collaboration, education, and parameterization, supports non-uniform data sets, and integrates processing with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  20. Terminology tools: state of the art and practical lessons.

    PubMed

    Cimino, J J

    2001-01-01

    As controlled medical terminologies evolve from simple code-name-hierarchy arrangements into rich, knowledge-based ontologies of medical concepts, increased demands are placed on both the developers and the users of the terminologies. In response, researchers have begun developing tools to address these needs. The aims of this article are to review previous work on such tools and to describe work done at Columbia University and New York Presbyterian Hospital (NYPH). Researchers working with the Systematized Nomenclature of Medicine (SNOMED), the Unified Medical Language System (UMLS), and NYPH's Medical Entities Dictionary (MED) have created a wide variety of terminology browsers, editors, and servers to facilitate the creation, maintenance, and use of these terminologies. Although much work has been done, no generally available tools have yet emerged. Consensus on requirements for tool functions, especially terminology servers, is emerging. Tools at NYPH have been used successfully to support the integration of clinical applications and the merger of health care institutions. Significant advancement has occurred over the past fifteen years in the development of sophisticated controlled terminologies and the tools to support them. The tool set at NYPH provides a case study demonstrating one feasible architecture.
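
    The "code-name-hierarchy" core that these browsers, editors, and servers manipulate reduces to a small data structure: a coded concept with a preferred name and links to broader concepts. The sketch below (Python) is a generic illustration; the field names and codes are made up and do not reflect SNOMED, the UMLS, or the MED.

```python
from dataclasses import dataclass, field

# Minimal sketch of the code-name-hierarchy core behind terminology servers;
# the schema here is illustrative, not any real terminology's structure.
@dataclass
class Concept:
    code: str
    name: str
    parents: list = field(default_factory=list)   # codes of broader concepts

TERMINOLOGY = {
    "C001": Concept("C001", "Disease"),
    "C002": Concept("C002", "Respiratory disease", ["C001"]),
    "C003": Concept("C003", "Asthma", ["C002"]),
}

def ancestors(code, terminology):
    """Walk the hierarchy upward -- the basic query a terminology server answers."""
    seen = []
    stack = list(terminology[code].parents)
    while stack:
        parent = stack.pop()
        if parent not in seen:
            seen.append(parent)
            stack.extend(terminology[parent].parents)
    return seen

print(ancestors("C003", TERMINOLOGY))   # ['C002', 'C001']
```

    An ancestor walk like this is the kind of query a terminology server answers constantly, for example deciding whether "Asthma" falls under "Respiratory disease" when integrating clinical applications.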
