Sample records for tool set developed

  1. Multi-criteria decision analysis of breast cancer control in low- and middle-income countries: development of a rating tool for policy makers.

    PubMed

    Venhorst, Kristie; Zelle, Sten G; Tromp, Noor; Lauer, Jeremy A

    2014-01-01

    The objective of this study was to develop a rating tool for policy makers to prioritize breast cancer interventions in low- and middle-income countries (LMICs), based on a simple multi-criteria decision analysis (MCDA) approach. The definition and identification of criteria play a key role in MCDA, and our rating tool could be used as part of a broader priority setting exercise in a local setting. This tool may contribute to a more transparent priority-setting process and fairer decision-making in future breast cancer policy development. First, an expert panel (n = 5) discussed key considerations for tool development. A literature review followed to inventory all relevant criteria and construct an initial set of criteria. A Delphi study was then performed, and questionnaires were used to discuss a final list of criteria with clear definitions and potential scoring scales. For this Delphi study, multiple breast cancer policy and priority-setting experts from different LMICs were selected and invited by the World Health Organization. Fifteen international experts participated in all three Delphi rounds to assess and evaluate each criterion. This study resulted in a preliminary rating tool for assessing breast cancer interventions in LMICs. The tool consists of 10 carefully crafted criteria (effectiveness, quality of the evidence, magnitude of individual health impact, acceptability, cost-effectiveness, technical complexity, affordability, safety, geographical coverage, and accessibility), with clear definitions and potential scoring scales. This study describes the development of a rating tool to assess breast cancer interventions in LMICs. Our tool can offer supporting knowledge for the use or development of rating tools as part of a broader (MCDA-based) priority setting exercise in local settings. Further steps for improving the tool are proposed and should lead to its useful adoption in LMICs.

  2. Image Processing Using a Parallel Architecture.

    DTIC Science & Technology

    1987-12-01

    ENG/87D-25 Abstract This study developed a set of low-level image processing tools on a parallel computer that allows concurrent processing of images...environment, the set of tools offers a significant reduction in the time required to perform some commonly used image processing operations. ...step toward developing these systems, a structured set of image processing tools was implemented using a parallel computer. More important than

  3. Measuring social exclusion in healthcare settings: a scoping review.

    PubMed

    O'Donnell, Patrick; O'Donovan, Diarmuid; Elmusharaf, Khalifa

    2018-02-02

    Social exclusion is a concept that has been widely debated in recent years; a particular focus of the discussion has been its significance in relation to health. The meanings of the phrase "social exclusion", and the closely associated term "social inclusion", are contested in the literature. Both of these concepts are important in relation to health and the area of primary healthcare in particular. Thus, several tools for the measurement of social exclusion or social inclusion status in health care settings have been developed. A scoping review of the peer-reviewed and grey literature was conducted to examine tools developed since 2000 that measure social exclusion or social inclusion. We focused on those measurement tools developed for use with individual patients in healthcare settings. Efforts were made to obtain a copy of each of the original tools, and all relevant background literature. All tools retrieved were compared in tables, and the specific domains that were included in each measure were tabulated. Twenty-two measurement tools were included in the final scoping review. The majority of these had been specifically developed for the measurement of social inclusion or social exclusion, but a small number were created for the measurement of other closely aligned concepts. The majority of the tools included were constructed for engaging with patients in mental health settings. The tools varied greatly in their design, the scoring systems and the ways they were administered. The domains covered by these tools varied widely and some of the tools were quite narrow in the areas of focus. A review of the definitions of both social inclusion and social exclusion also revealed the variations among the explanations of these complex concepts. There are several definitions of both social inclusion and social exclusion in use and they differ greatly in scope. While there are many tools that have been developed for measuring these concepts in healthcare settings, these do not have a primary healthcare focus. There is a need for the development of a tool for measuring social inclusion or social exclusion in primary healthcare settings.

  4. Coproducing Aboriginal patient journey mapping tools for improved quality and coordination of care.

    PubMed

    Kelly, Janet; Dwyer, Judith; Mackean, Tamara; O'Donnell, Kim; Willis, Eileen

    2016-12-08

    This paper describes the rationale and process for developing a set of Aboriginal patient journey mapping tools with Aboriginal patients, health professionals, support workers, educators and researchers in the Managing Two Worlds Together project between 2008 and 2015. Aboriginal patients and their families from rural and remote areas, and healthcare providers in urban, rural and remote settings, shared their perceptions of the barriers and enablers to quality care in interviews and focus groups, and individual patient journey case studies were documented. Data were thematically analysed. In the absence of suitable existing tools, a new analytical framework and mapping approach was developed. The utility of the tools in other settings was then tested with health professionals, and the tools were further modified for use in quality improvement in health and education settings in South Australia and the Northern Territory. A central set of patient journey mapping tools with flexible adaptations, a workbook, and five sets of case studies describing how staff adapted and used the tools at different sites are available for wider use.

  5. A strategy to improve priority setting in developing countries.

    PubMed

    Kapiriri, Lydia; Martin, Douglas K

    2007-09-01

    Because the demand for health services outstrips the available resources, priority setting is one of the most difficult issues faced by health policy makers, particularly those in developing countries. Priority setting in developing countries is fraught with uncertainty due to lack of credible information, weak priority setting institutions, and unclear priority setting processes. Efforts to improve priority setting in these contexts have focused on providing information and tools. In this paper we argue that priority setting is a value laden and political process, and although important, the available information and tools are not sufficient to address the priority setting challenges in developing countries. Additional complementary efforts are required. Hence, a strategy to improve priority setting in developing countries should also include: (i) capturing current priority setting practices, (ii) improving the legitimacy and capacity of institutions that set priorities, and (iii) developing fair priority setting processes.

  6. Facility Composer (Trademark) and PACES (Trademark) Integration: Development of an XML Interface Based on Industry Foundation Classes

    DTIC Science & Technology

    2007-11-01

    Engineering Research Laboratory is currently developing a set of facility ‘architectural’ programming tools, called Facility Composer™ (FC). FC...requirements in the early phases of project development. As the facility program, criteria, and requirements are chosen, these tools populate the IFC...developing a set of facility “architectural” programming tools, called Facility Composer (FC), to support the capture and tracking of facility criteria

  7. Cloud-based MOTIFSIM: Detecting Similarity in Large DNA Motif Data Sets.

    PubMed

    Tran, Ngoc Tam L; Huang, Chun-Hsi

    2017-05-01

    We developed the cloud-based MOTIFSIM on the Amazon Web Services (AWS) cloud. The tool extends our web-based tool, version 2.0, which was developed based on a novel algorithm for detecting similarity in multiple DNA motif data sets. This cloud-based version further allows researchers to exploit the computing resources available from AWS to detect similarity in multiple large-scale DNA motif data sets resulting from next-generation sequencing technology. The tool is highly scalable, as the underlying AWS resources can be expanded on demand.

  8. What makes a sustainability tool valuable, practical and useful in real-world healthcare practice? A mixed-methods study on the development of the Long Term Success Tool in Northwest London

    PubMed Central

    Lennox, Laura; Doyle, Cathal; Reed, Julie E

    2017-01-01

    Objectives Although improvement initiatives show benefits to patient care, they often fail to sustain. Models and frameworks exist to address this challenge, but issues with design, clarity and usability have been barriers to use in healthcare settings. This work aimed to collaborate with stakeholders to develop a sustainability tool relevant to people in healthcare settings and practical for use in improvement initiatives. Design Tool development was conducted in six stages. A scoping literature review, group discussions and a stakeholder engagement event explored literature findings and their resonance with stakeholders in healthcare settings. Interviews, small-scale trialling and piloting explored the design and tested the practicality of the tool in improvement initiatives. Setting National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care for Northwest London (CLAHRC NWL). Participants CLAHRC NWL improvement initiative teams and staff. Results The iterative design process and engagement of stakeholders informed the articulation of the sustainability factors identified from the literature and guided tool design for practical application. Key iterations of factors and tool design are discussed. From the development process, the Long Term Success Tool (LTST) has been designed. The Tool supports those implementing improvements to reflect on 12 sustainability factors to identify risks to increase chances of achieving sustainability over time. The Tool is designed to provide a platform for improvement teams to share their own views on sustainability as well as learn about the different views held within their team to prompt discussion and actions. Conclusion The development of the LTST has reinforced the importance of working with stakeholders to design strategies which respond to their needs and preferences and can practically be implemented in real-world settings. Further research is required to study the use and effectiveness of the tool in practice and assess engagement with the method over time. PMID:28947436

  9. KNOW ESSENTIALS: a tool for informed decisions in the absence of formal HTA systems.

    PubMed

    Mathew, Joseph L

    2011-04-01

    Most developing countries and resource-limited settings lack robust health technology assessment (HTA) systems. Because the development of locally relevant HTA is not immediately viable, and the extrapolation of external HTA is inappropriate, a new model for evaluating health technologies is required. The aim of this study was to describe the development and application of KNOW ESSENTIALS, a tool facilitating evidence-based decisions on health technologies by stakeholders in settings lacking formal HTA systems. Current HTA methodology was examined through a literature search. Additional issues relevant to resource-limited settings, but not adequately addressed in current methodology, were identified through further literature search, appraisal of contextually relevant issues, discussion with healthcare professionals familiar with the local context, and personal experience. A set of thirteen elements important for evidence-based decisions was identified, selected and combined into a tool with the mnemonic KNOW ESSENTIALS. Detailed definitions for each element, coding for the elements, and a system to evaluate a given health technology using the tool were developed. Developing countries and resource-limited settings face several challenges to informed decision making. Models that are relevant and applicable in high-income countries are unlikely to suit such settings. KNOW ESSENTIALS is an alternative that facilitates evidence-based decision making by stakeholders without formal expertise in HTA. The tool could be particularly useful, as an interim measure, in healthcare systems that are developing HTA capacity. It could also be useful anywhere when rapid evidence-based decisions on health technologies are required.

  10. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  11. Two New Tools for Glycopeptide Analysis Researchers: A Glycopeptide Decoy Generator and a Large Data Set of Assigned CID Spectra of Glycopeptides.

    PubMed

    Lakbub, Jude C; Su, Xiaomeng; Zhu, Zhikai; Patabandige, Milani W; Hua, David; Go, Eden P; Desaire, Heather

    2017-08-04

    The glycopeptide analysis field is tightly constrained by a lack of effective tools that translate mass spectrometry data into meaningful chemical information, and perhaps the most challenging aspect of building effective glycopeptide analysis software is designing an accurate scoring algorithm for MS/MS data. We provide the glycoproteomics community with two tools to address this challenge. The first tool, a curated set of 100 expert-assigned CID spectra of glycopeptides, contains a diverse set of spectra from a variety of glycan types; the second tool, Glycopeptide Decoy Generator, is a new software application that generates glycopeptide decoys de novo. We developed these tools so that emerging methods of assigning glycopeptides' CID spectra could be rigorously tested. Software developers or those interested in developing skills in expert (manual) analysis can use these tools to facilitate their work. We demonstrate the tools' utility in assessing the quality of one particular glycopeptide software package, GlycoPep Grader, which assigns glycopeptides to CID spectra. We first acquired the set of 100 expert assigned CID spectra; then, we used the Decoy Generator (described herein) to generate 20 decoys per target glycopeptide. The assigned spectra and decoys were used to test the accuracy of GlycoPep Grader's scoring algorithm; new strengths and weaknesses were identified in the algorithm using this approach. Both newly developed tools are freely available. The software can be downloaded at http://glycopro.chem.ku.edu/GPJ.jar.
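
    The target-decoy evaluation described above generalizes to any spectrum-scoring algorithm. The following Python sketch is our own illustration of the idea, not the authors' code; score() and the toy scorer are hypothetical stand-ins for the scoring algorithm under test:

    ```python
    # Generic target-decoy check of a spectrum scorer; `score`, the data
    # layout, and the toy scorer below are illustrative assumptions.
    def evaluate_scorer(assignments, decoys, score):
        """assignments: (spectrum, target) pairs; decoys: target -> decoy list.
        Returns the fraction of spectra whose target outscores every decoy."""
        correct = 0
        for spectrum, target in assignments:
            best_decoy = max(score(spectrum, d) for d in decoys[target])
            if score(spectrum, target) > best_decoy:
                correct += 1
        return correct / len(assignments)

    toy_score = lambda spectrum, peptide: len(set(spectrum) & set(peptide))
    print(evaluate_scorer([("PEPTIDE", "PEPTIDE")],
                          {"PEPTIDE": ["QQQ", "KKK"]}, toy_score))  # 1.0
    ```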

  12. Toolpack mathematical software development environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osterweil, L.

    1982-07-21

    The purpose of this research project was to produce a well integrated set of tools for the support of numerical computation. The project entailed the specification, design and implementation of both a diversity of tools and an innovative tool integration mechanism. This large configuration of tightly integrated tools comprises an environment for numerical software development, and has been named Toolpack/IST (Integrated System of Tools). Following the creation of this environment in prototype form, the environment software was readied for widespread distribution by transitioning it to a development organization for systematization, documentation and distribution. It is expected that public release of Toolpack/IST will begin imminently and will provide a basis for evaluation of the innovative software approaches taken as well as a uniform set of development tools for the numerical software community.

  13. Ready, Set, Change! Development and usability testing of an online readiness for change decision support tool for healthcare organizations.

    PubMed

    Timmings, Caitlyn; Khan, Sobia; Moore, Julia E; Marquez, Christine; Pyka, Kasha; Straus, Sharon E

    2016-02-24

    To address challenges related to selecting a valid, reliable, and appropriate readiness assessment measure in practice, we developed an online decision support tool to aid frontline implementers in healthcare settings in this process. The focus of this paper is to describe a multi-step, end-user driven approach to developing this tool for use during the planning stages of implementation. A multi-phase, end-user driven approach was used to develop and test the usability of a readiness decision support tool. First, readiness assessment measures that are valid, reliable, and appropriate for healthcare settings were identified from a systematic review. Second, a mapping exercise was performed to categorize individual items of included measures according to key readiness constructs from an existing framework. Third, a modified Delphi process was used to collect stakeholder ratings of the included measures on domains of feasibility, relevance, and likelihood to recommend. Fourth, two versions of a decision support tool prototype were developed and evaluated for usability. Nine valid and reliable readiness assessment measures were included in the decision support tool. The mapping exercise revealed that of the nine measures, most (78%) focused on assessing readiness for change at the organizational rather than the individual level, and that four measures (44%) represented all constructs of organizational readiness. During the modified Delphi process, stakeholders rated most measures as feasible and relevant for use in practice, and reported that they would be likely to recommend use of most measures. Using data from the mapping exercise and stakeholder panel, an algorithm was developed to link users to a measure based on characteristics of their organizational setting and their readiness for change assessment priorities. Usability testing yielded recommendations that were used to refine the Ready, Set, Change! decision support tool. The Ready, Set, Change! decision support tool is an implementation support designed to facilitate the routine incorporation of a readiness assessment as an early step in implementation. Use of this tool in practice may offer time- and resource-saving implications for implementation.

  14. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    DOT National Transportation Integrated Search

    2011-12-01

    In this FHWA-sponsored pooled-fund study, a set of decision-making tools based on the Analytic Hierarchy Process (AHP) was developed. This tool set is intended for transportation specialists and decision-makers to determine if ABC is more effective ...

  15. A validated set of tool pictures with matched objects and non-objects for laterality research.

    PubMed

    Verma, Ark; Brysbaert, Marc

    2015-01-01

    Neuropsychological and neuroimaging research has established that knowledge related to tool use and tool recognition is lateralized to the left cerebral hemisphere. Recently, behavioural studies with the visual half-field technique have confirmed the lateralization. A limitation of this research was that different sets of stimuli had to be used for the comparison of tools to other objects and objects to non-objects. Therefore, we developed a new set of stimuli containing matched triplets of tools, other objects and non-objects. With the new stimulus set, we successfully replicated the findings of no visual field advantage for objects in an object recognition task combined with a significant right visual field advantage for tools in a tool recognition task. The set of stimuli is available as supplemental data to this article.

  16. Development of an Ada package library

    NASA Technical Reports Server (NTRS)

    Burton, Bruce; Broido, Michael

    1986-01-01

    A usable prototype Ada package library was developed and is currently being evaluated for use in large software development efforts. The library system is comprised of an Ada-oriented design language used to facilitate the collection of reuse information, a relational data base to store reuse information, a set of reusable Ada components and tools, and a set of guidelines governing the system's use. The prototyping exercise is discussed and the lessons learned from it have led to the definition of a comprehensive tool set to facilitate software reuse.

  17. The Gene Set Builder: collation, curation, and distribution of sets of genes

    PubMed Central

    Yusuf, Dimas; Lim, Jonathan S; Wasserman, Wyeth W

    2005-01-01

    Background In bioinformatics and genomics, there are many applications designed to investigate the common properties of a set of genes. Often, these multi-gene analysis tools attempt to reveal sequential, functional, and expressional ties. However, while tremendous effort has been invested in developing tools that can analyze a set of genes, minimal effort has been invested in developing tools that can help researchers compile, store, and annotate gene sets in the first place. As a result, the process of making or accessing a set often involves tedious and time-consuming steps such as finding identifiers for each individual gene. These steps are often repeated extensively to shift from one identifier type to another, or to recreate a published set. In this paper, we present a simple online tool which – with the help of the gene catalogs Ensembl and GeneLynx – can help researchers build and annotate sets of genes quickly and easily. Description The Gene Set Builder is a database-driven, web-based tool designed to help researchers compile, store, export, and share sets of genes. This application supports the 17 eukaryotic genomes found in version 32 of the Ensembl database, which includes species from yeast to human. User-created information such as sets and customized annotations are stored to facilitate easy access. Gene sets stored in the system can be "exported" in a variety of output formats – as lists of identifiers, in tables, or as sequences. In addition, gene sets can be "shared" with specific users to facilitate collaborations or fully released to provide access to published results. The application also features a Perl API (Application Programming Interface) for direct connectivity to custom analysis tools. A downloadable Quick Reference guide and an online tutorial are available to help new users learn its functionalities. Conclusion The Gene Set Builder is an Ensembl-facilitated online tool designed to help researchers compile and manage sets of genes in a user-friendly environment. The application can be accessed online. PMID:16371163

  18. Data and Tools | NREL

    Science.gov Websites

    NREL develops data sets, maps, models, and tools for energy analysis. Data, models, and tools appear in an alphabetical listing on the site. Popular resources include the PVWatts Calculator and geospatial data.

  19. metAlignID: a high-throughput software tool set for automated detection of trace level contaminants in comprehensive LECO two-dimensional gas chromatography time-of-flight mass spectrometry data.

    PubMed

    Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J

    2012-11-09

    A new alternative data processing tool set, metAlignID, has been developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software is developed to run multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems and is as such capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries) (NIST=National Institute of Standards and Technology) and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batch-wise. The overall time needed for conversion together with processing and searching of 30 raw data sets for 560 compounds is routinely within an hour. The screening performance is evaluated for detection of pesticides and contaminants in raw data obtained after analysis of soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than that of the standard processing, and the tool also adds a quantitative estimate. The software tool set in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry shows great potential as a highly-automated and fast multi-residue instrumental screening method. Copyright © 2012 Elsevier B.V. All rights reserved.
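
    The stages described above (conversion to netCDF, metAlign-style reduction, library search, reporting) parallelize naturally over files, one worker per processor core. A rough Python sketch of that batch pattern follows; the three helper functions are placeholders, since metAlignID implements these stages as its own executables:

    ```python
    # Batch-parallel sketch of the described pipeline; the three helpers are
    # placeholders for stages metAlignID implements as compiled tools.
    from multiprocessing import Pool
    from pathlib import Path

    def convert_to_netcdf(raw_file):
        return raw_file.with_suffix(".cdf")      # placeholder: vendor -> netCDF

    def preprocess(nc_file):
        return nc_file                           # placeholder: data reduction

    def library_search(reduced, library):
        return {"file": str(reduced), "library": library, "hits": []}

    def process_one(raw_file):
        return library_search(preprocess(convert_to_netcdf(raw_file)),
                              library="targets.lib")

    if __name__ == "__main__":
        raw_files = sorted(Path("runs").glob("*.raw"))
        with Pool() as pool:                     # one process per core by default
            reports = pool.map(process_one, raw_files)
    ```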

  20. Rotorcraft Conceptual Design Environment

    DTIC Science & Technology

    2009-10-01

    systems engineering design tool sets. The DaVinci Project vision is to develop software architecture and tools specifically for acquisition system...enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.

  1. GARNET--gene set analysis with exploration of annotation relations.

    PubMed

    Rho, Kyoohyoung; Kim, Bumjin; Jang, Youngjun; Lee, Sanghyun; Bae, Taejeong; Seo, Jihae; Seo, Chaehwa; Lee, Jihyun; Kang, Hyunjung; Yu, Ungsik; Kim, Sunghoon; Lee, Sanghyuk; Kim, Wan Kyu

    2011-02-15

    Gene set analysis is a powerful method of deducing biological meaning for an a priori defined set of genes. Numerous tools have been developed to test statistical enrichment or depletion in specific pathways or gene ontology (GO) terms. Major difficulties in biological interpretation are integrating diverse types of annotation categories and exploring the relationships between annotation terms of similar information. GARNET (Gene Annotation Relationship NEtwork Tools) is an integrative platform for gene set analysis with many novel features. It includes tools for retrieval of genes from annotation databases, statistical analysis & visualization of annotation relationships, and managing gene sets. In an effort to allow access to a full spectrum of amassed biological knowledge, we have integrated a variety of annotation data that include the GO, domain, disease, drug, chromosomal location, and custom-defined annotations. Diverse types of molecular networks (pathways, transcription and microRNA regulations, protein-protein interaction) are also included. The pair-wise relationship between annotation gene sets was calculated using kappa statistics. GARNET consists of three modules: gene set manager, gene set analysis and gene set retrieval, which are tightly integrated to provide virtually automatic analysis for gene sets. A dedicated viewer for annotation networks has been developed to facilitate exploration of the related annotations. GARNET (gene annotation relationship network tools) is an integrative platform for diverse types of gene set analysis, where complex relationships among gene annotations can be easily explored with an intuitive network visualization tool (http://garnet.isysbio.org/ or http://ercsb.ewha.ac.kr/garnet/).
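
    The kappa statistic used for the pair-wise relationships has a compact closed form when two annotation gene sets are compared over a common gene universe. A minimal sketch of Cohen's kappa computed from set membership (our illustration, not GARNET code):

    ```python
    # Cohen's kappa between two gene sets over a gene universe; each gene
    # is labeled in/out of each set.
    def kappa(set_a, set_b, universe):
        n = len(universe)
        a = len(set_a & set_b)                  # genes in both sets
        b = len(set_a - set_b)                  # only in A
        c = len(set_b - set_a)                  # only in B
        d = n - a - b - c                       # in neither
        p_obs = (a + d) / n
        p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
        return (p_obs - p_exp) / (1 - p_exp)

    genes = set(range(100))
    print(kappa(set(range(30)), set(range(10, 45)), genes))  # ~0.43
    ```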

  2. Vega-Constellation Tools to Analyze Hyperspectral Images

    NASA Astrophysics Data System (ADS)

    Savorskiy, V.; Loupian, E.; Balashov, I.; Kashnitskii, A.; Konstantinova, A.; Tolpin, V.; Uvarov, I.; Kuznetsov, O.; Maklakov, S.; Panova, O.; Savchenko, E.

    2016-06-01

    Creating high-performance means of managing massive hyperspectral data (HSD) arrays is a pressing challenge, particularly when disparate information resources must be handled together. To address this problem, the present work develops tools for working with HSD in a distributed information infrastructure, i.e. primarily for use in remote access mode. The main feature of the presented approach is the development of remotely accessed services that allow users both to search and retrieve HSD sets and to analyze and process HSD in remote mode. These services were implemented within the VEGA-Constellation family of information systems, which were extended with tools oriented to support the study of certain classes of natural objects through their HSD. The developed tools provide capabilities for analyzing objects such as vegetation canopies (forest and agriculture), open soils, forest fires, and areas of thermal anomalies. The developed software tools were successfully tested on Hyperion data sets.

  3. Evaluation of Micronutrient Sensors for Food Matrices in Resource-Limited Settings: A Systematic Narrative Review.

    PubMed

    Waller, Anna W; Lotton, Jennifer L; Gaur, Shashank; Andrade, Jeanette M; Andrade, Juan E

    2018-06-21

    In resource-limited settings, mass food fortification is a common strategy to ensure the population consumes appropriate quantities of essential micronutrients. Food and government organizations in these settings, however, lack tools to monitor the quality and compliance of fortified products and their efficacy in enhancing nutrient status. The World Health Organization has developed general guidelines known as ASSURED (Affordable, Sensitive, Specific, User-friendly, Rapid and Robust, Equipment-free, and Deliverable to end-users) to aid the development of useful diagnostic tools for these settings. These guidelines assume performance aspects such as sufficient accuracy, reliability, and validity. The purpose of this systematic narrative review is to examine the micronutrient sensor literature on its adherence to the ASSURED criteria along with accuracy, reliability, and validation when developing micronutrient sensors for resource-limited settings. Keyword searches were conducted in three databases: Web of Science, PubMed, and Scopus, and were based on 6-point inclusion criteria. A 16-question quality assessment tool was developed to determine adherence to quality and performance criteria. Of the 2,365 retrieved studies, 42 sensors were included based on inclusion/exclusion criteria. Results showed that improvements to current sensor designs are necessary, especially in affordability, user-friendliness, robustness, equipment-free operation, and deliverability within the ASSURED criteria, as well as in accuracy and validity among the additional criteria, for the sensors to be useful in resource-limited settings. Although it requires further validation, the 16-question quality assessment tool can be used as a guide in the development of sensors for resource-limited settings. © 2018 Institute of Food Technologists®.

  4. Cognitive screening tools for identification of dementia in illiterate and low-educated older adults, a systematic review and meta-analysis.

    PubMed

    Paddick, Stella-Maria; Gray, William K; McGuire, Jackie; Richardson, Jenny; Dotchin, Catherine; Walker, Richard W

    2017-06-01

    The majority of older adults with dementia live in low- and middle-income countries (LMICs). Illiteracy and low educational background are common in older LMIC populations, particularly in rural areas, and cognitive screening tools developed for this setting must reflect this. This study aimed to review published validation studies of cognitive screening tools for dementia in low-literacy settings in order to determine the most appropriate tools for use. A systematic search of major databases was conducted according to PRISMA guidelines. Validation studies of brief cognitive screening tests including illiterate participants or those with elementary education were eligible. Studies were quality assessed using the QUADAS-2 tool. Good or fair quality studies were included in a bivariate random-effects meta-analysis and a hierarchical summary receiver operating characteristic (HSROC) curve constructed. Forty-five eligible studies were quality assessed. A significant proportion utilized a case-control design, resulting in spectrum bias. The area under the ROC (AUROC) curve was 0.937 for community/low prevalence studies, 0.881 for clinic based/higher prevalence studies, and 0.869 for illiterate populations. For the Mini-Mental State Examination (MMSE) (and adaptations), the AUROC curve was 0.853. Numerous tools for assessment of cognitive impairment in low-literacy settings have been developed, and tools developed for use in high-income countries have also been validated in low-literacy settings. Most tools have been inadequately validated, with only MMSE, cognitive abilities screening instrument (CASI), Eurotest, and Fototest having more than one published good or fair quality study in an illiterate or low-literate setting. At present no screening test can be recommended.

  5. Machine Tool Technology. Automatic Screw Machine Troubleshooting & Set-Up Training Outlines [and] Basic Operator's Skills Set List.

    ERIC Educational Resources Information Center

    Anoka-Hennepin Technical Coll., Minneapolis, MN.

    This set of two training outlines and one basic skills set list is designed for a machine tool technology program developed during a project to retrain defense industry workers at risk of job loss or dislocation because of conversion of the defense industry. The first troubleshooting training outline lists the categories of problems that develop…

  6. Implementation and Assessment of a Virtual Laboratory of Parallel Robots Developed for Engineering Students

    ERIC Educational Resources Information Center

    Gil, Arturo; Peidró, Adrián; Reinoso, Óscar; Marín, José María

    2017-01-01

    This paper presents a tool, LABEL, oriented to the teaching of parallel robotics. The application, organized as a set of tools developed using Easy Java Simulations, enables the study of the kinematics of parallel robotics. A set of classical parallel structures was implemented such that LABEL can solve the inverse and direct kinematic problem of…

  7. Integrated performance and reliability specification for digital avionics systems

    NASA Technical Reports Server (NTRS)

    Brehm, Eric W.; Goettge, Robert T.

    1995-01-01

    This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.

  8. The development of environmental assessment tools to support the creation of dementia friendly care environments: Innovative practice.

    PubMed

    Waller, Sarah; Masterson, Abigail; Evans, Simon C

    2017-02-01

    The need for more dementia friendly design in hospitals and other care settings is now widely acknowledged. Working with 26 NHS Trusts in England as part of a Department of Health commissioned programme, The King's Fund developed a set of overarching design principles and an environmental assessment tool for hospital wards in 2012. Following requests from other sectors, additional tools were developed for hospitals, care homes, health centres and housing with care. The tools have proven to be effective in both disseminating the principles of dementia friendly design and in enabling the case to be made for improvements that have a positive effect on patient outcomes and staff morale. This paper reports on the development, use and review of the environmental assessment tools, including further work that is now being taken forward by The Association for Dementia Studies, University of Worcester.

  9. Generative Text Sets: Tools for Negotiating Critically Inclusive Early Childhood Teacher Education Pedagogical Practices

    ERIC Educational Resources Information Center

    Souto-Manning, Mariana

    2017-01-01

    Through a case study, this article sheds light onto generative text sets as tools for developing and enacting critically inclusive early childhood teacher education pedagogies. In doing so, it positions teaching and learning processes as sociocultural, historical, and political acts as it inquires into the use of generative text sets in one early…

  10. What makes a sustainability tool valuable, practical and useful in real-world healthcare practice? A mixed-methods study on the development of the Long Term Success Tool in Northwest London.

    PubMed

    Lennox, Laura; Doyle, Cathal; Reed, Julie E; Bell, Derek

    2017-09-24

    Although improvement initiatives show benefits to patient care, they often fail to sustain. Models and frameworks exist to address this challenge, but issues with design, clarity and usability have been barriers to use in healthcare settings. This work aimed to collaborate with stakeholders to develop a sustainability tool relevant to people in healthcare settings and practical for use in improvement initiatives. Tool development was conducted in six stages. A scoping literature review, group discussions and a stakeholder engagement event explored literature findings and their resonance with stakeholders in healthcare settings. Interviews, small-scale trialling and piloting explored the design and tested the practicality of the tool in improvement initiatives. National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care for Northwest London (CLAHRC NWL). CLAHRC NWL improvement initiative teams and staff. The iterative design process and engagement of stakeholders informed the articulation of the sustainability factors identified from the literature and guided tool design for practical application. Key iterations of factors and tool design are discussed. From the development process, the Long Term Success Tool (LTST) has been designed. The Tool supports those implementing improvements to reflect on 12 sustainability factors to identify risks to increase chances of achieving sustainability over time. The Tool is designed to provide a platform for improvement teams to share their own views on sustainability as well as learn about the different views held within their team to prompt discussion and actions. The development of the LTST has reinforced the importance of working with stakeholders to design strategies which respond to their needs and preferences and can practically be implemented in real-world settings. Further research is required to study the use and effectiveness of the tool in practice and assess engagement with the method over time. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  11. Psychometric Analysis of the Servicemember Evaluation Tool

    DTIC Science & Technology

    to assess psychological resilience. The Naval Center for Combat and Operational Stress Control developed the Servicemember Evaluation Tool (SET) to...vessels on deployment. The goals of this thesis are to evaluate the psychometric properties of the SET on this sample population. Furthermore, this

  12. MOTIFSIM 2.1: An Enhanced Software Platform for Detecting Similarity in Multiple DNA Motif Data Sets

    PubMed Central

    Huang, Chun-Hsi

    2017-01-01

    Finding binding site motifs plays an important role in bioinformatics as it reveals the transcription factors that control gene expression. The development of motif finders has flourished in recent years, with many tools introduced to the research community. Although these tools possess exceptional features for detecting motifs, they report different results for an identical data set. Hence, using multiple tools is recommended because motifs reported by several tools are likely biologically significant. However, the results from multiple tools need to be compared for obtaining common significant motifs. The MOTIFSIM web tool and command-line tool were developed for this purpose. In this work, we present several technical improvements as well as additional features to further support motif analysis in our new release MOTIFSIM 2.1. PMID:28632401

  13. Open-source tools for data mining.

    PubMed

    Zupan, Blaz; Demsar, Janez

    2008-03-01

    With a growing volume of biomedical databases and repositories, the need to develop a set of tools to address their analysis and support knowledge discovery is becoming acute. The data mining community has developed a substantial set of techniques for computational treatment of these data. In this article, we discuss the evolution of open-source toolboxes that data mining researchers and enthusiasts have developed over the span of a few decades and review several currently available open-source data mining suites. The approaches we review are diverse in data mining methods and user interfaces and also demonstrate that the field and its tools are ready to be fully exploited in biomedical research.

  14. Translations from Kommunist, Number 13, September 1978

    DTIC Science & Technology

    1978-10-30

    programmed machine tool here is merely a component of a more complex reprogrammable technological system. This includes the robot machine tools with...sufficient possibilities for changing technological operations and processes and automated technological lines. The reprogrammable automated sets will...simulate the possibilities of such sets. A new technological level will be developed in industry related to reprogrammable automated sets, their design

  15. Phylogenetic Reconstruction as a Broadly Applicable Teaching Tool in the Biology Classroom: The Value of Data in Estimating Likely Answers

    ERIC Educational Resources Information Center

    Julius, Matthew L.; Schoenfuss, Heiko L.

    2006-01-01

    This laboratory exercise introduces students to a fundamental tool in evolutionary biology--phylogenetic inference. Students are required to create a data set via observation and through mining preexisting data sets. These student data sets are then used to develop and compare competing hypotheses of vertebrate phylogeny. The exercise uses readily…

  16. Distributed visualization of gridded geophysical data: the Carbon Data Explorer, version 0.2.3

    NASA Astrophysics Data System (ADS)

    Endsley, K. A.; Billmire, M. G.

    2016-01-01

    Due to the proliferation of geophysical models, particularly climate models, the increasing resolution of their spatiotemporal estimates of Earth system processes, and the desire to easily share results with collaborators, there is a genuine need for tools to manage, aggregate, visualize, and share data sets. We present a new, web-based software tool - the Carbon Data Explorer - that provides these capabilities for gridded geophysical data sets. While originally developed for visualizing carbon flux, this tool can accommodate any time-varying, spatially explicit scientific data set, particularly NASA Earth system science level III products. In addition, the tool's open-source licensing and web presence facilitate distributed scientific visualization, comparison with other data sets and uncertainty estimates, and data publishing and distribution.
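
    A time-varying, spatially explicit data set of this kind is naturally a labeled three-dimensional array. A brief sketch using the xarray library on a synthetic grid; the variable name and grid sizes are illustrative assumptions, not the paper's data products:

    ```python
    # Sketch of handling a time-varying gridded data set of the kind the
    # Carbon Data Explorer serves; the synthetic "flux" grid is illustrative.
    import numpy as np
    import xarray as xr

    flux = xr.DataArray(
        np.random.rand(12, 90, 180),                       # time x lat x lon
        dims=("time", "lat", "lon"),
        coords={"time": np.arange(12),
                "lat": np.linspace(-89, 89, 90),
                "lon": np.linspace(-179, 179, 180)},
        name="flux")

    snapshot = flux.isel(time=0)                 # one time step as a 2-D map
    timeseries = flux.mean(dim=("lat", "lon"))   # grid-averaged time series
    print(float(timeseries[0]))
    ```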

  17. Physical activity and healthy eating environmental audit tools in youth care settings: A systematic review.

    PubMed

    Ajja, Rahma; Beets, Michael W; Chandler, Jessica; Kaczynski, Andrew T; Ward, Dianne S

    2015-08-01

    There is a growing interest in evaluating the physical activity (PA) and healthy eating (HE) policy and practice environment characteristics in settings frequented by youth (≤18 years). This review evaluates the measurement properties of audit tools designed to assess PA and HE policy and practice environment characteristics in settings that care for youth (e.g., childcare, school, afterschool, summer camp). Three electronic databases, reference lists, educational department and national health organizations' web pages were searched between January 1980 and February 2014 to identify tools assessing PA and/or HE policy and practice environments in settings that care for youth (≤18 years). Sixty-five audit tools were identified, of which 53 individual tools met the inclusion criteria. Thirty-three tools assessed both the PA and HE domains, 6 assessed the PA domain and 14 assessed the HE domain solely. The majority of the tools were self-assessment tools (n=40), and were developed to assess the PA and/or HE environment in school settings (n=33), childcare (n=12), and after school programs (n=4). Four tools assessed the community at-large and had sections for assessing preschool, school and/or afterschool settings within the tool. The majority of audit tools lacked validity and/or reliability data (n=42). Inter-rater reliability and construct validity were the most frequently reported reliability (n=7) and validity types (n=5). Limited attention has been given to establishing the reliability and validity of audit tools for settings that care for youth. Future efforts should be directed towards establishing a strong measurement foundation for these important environmental audit tools. Published by Elsevier Inc.

  18. Evaluating the Development of Science Research Skills in Work-Integrated Learning through the Use of Workplace Science Tools

    ERIC Educational Resources Information Center

    McCurdy, Susan M.; Zegwaard, Karsten E.; Dalgety, Jacinta

    2013-01-01

    Concept understanding, the development of analytical skills and a research mind set are explored through the use of academic tools common in a tertiary science education and relevant work-integrated learning (WIL) experiences. The use and development of the tools; laboratory book, technical report, and literature review are examined by way of…

  19. Determination of real machine-tool settings and minimization of real surface deviation by computerized inspection

    NASA Technical Reports Server (NTRS)

    Litvin, Faydor L.; Kuan, Chihping; Zhang, YI

    1991-01-01

    A numerical method is developed for the minimization of deviations of real tooth surfaces from the theoretical ones. The deviations are caused by errors of manufacturing, errors in installation of machine-tool settings and distortion of surfaces by heat-treatment. The deviations are determined by coordinate measurements of gear tooth surfaces. The minimization of deviations is based on the proper correction of initially applied machine-tool settings. The contents of the accomplished research project cover the following topics: (1) Descriptions of the principle of coordinate measurements of gear tooth surfaces; (2) Deviation of theoretical tooth surfaces (with examples of surfaces of hypoid gears and references for spiral bevel gears); (3) Determination of the reference point and the grid; (4) Determination of the deviations of real tooth surfaces at the points of the grid; and (5) Determination of required corrections of machine-tool settings for minimization of deviations. The procedure for minimization of deviations is based on numerical solution of an overdetermined system of n linear equations in m unknowns (m ≪ n), where n is the number of points of measurements and m is the number of parameters of applied machine-tool settings to be corrected. The developed approach is illustrated with numerical examples.
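
    The minimization step is ordinary least squares on an overdetermined system: n measured deviations, m setting corrections, m ≪ n. A numerical sketch with a synthetic sensitivity matrix standing in for the Jacobian that the gear-surface model would supply:

    ```python
    # Least-squares sketch of the correction step. The sensitivity matrix J
    # (n x m) is synthetic here; in the method above it comes from the
    # theoretical tooth-surface model.
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 45, 6                        # 45 grid points, 6 settings (assumed)
    J = rng.normal(size=(n, m))         # d(deviation)/d(setting) sensitivities
    deviations = rng.normal(size=n)     # measured surface deviations

    delta, *_ = np.linalg.lstsq(J, deviations, rcond=None)
    # Correcting the settings by -delta minimizes the sum of squared
    # deviations over the measurement grid.
    print(delta)
    ```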

  20. Leveraging Python Interoperability Tools to Improve Sapphire's Usability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gezahegne, A; Love, N S

    2007-12-10

    The Sapphire project at the Center for Applied Scientific Computing (CASC) develops and applies an extensive set of data mining algorithms for the analysis of large data sets. Sapphire's algorithms are currently available as a set of C++ libraries. However, many users prefer higher level scripting languages such as Python for their ease of use and flexibility. In this report, we evaluate four interoperability tools for the purpose of wrapping Sapphire's core functionality with Python. Exposing Sapphire's functionality through a Python interface would increase its usability and connect its algorithms to existing Python tools.
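
    As background, the wrapping pattern being evaluated looks roughly like the ctypes sketch below; ctypes is one standard interoperability route, and the report does not name its four tools here, so treat this as a generic illustration. The demo binds libm's sqrt so it actually runs (on POSIX systems); wrapping Sapphire's C++ libraries would additionally require a C shim or a dedicated wrapper generator:

    ```python
    # Sketch of the wrapping pattern: call a compiled routine from Python
    # via ctypes. Binding libm's sqrt keeps the demo runnable (POSIX).
    import ctypes
    import ctypes.util

    libm = ctypes.CDLL(ctypes.util.find_library("m"))
    libm.sqrt.restype = ctypes.c_double
    libm.sqrt.argtypes = [ctypes.c_double]

    print(libm.sqrt(2.0))   # 1.4142135623730951
    ```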

  1. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
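
    One plausible core of such a tool, sketched here as an assumption rather than TRAM's actual algorithm, is to rank the dispersed input variables by how strongly their distributions separate failing runs from passing ones:

    ```python
    # Illustrative (not TRAM's actual algorithm): rank dispersed inputs by
    # how strongly their distributions differ between failing and passing runs.
    import numpy as np

    def driving_variables(X, failed, names):
        """X: (runs x variables) array of dispersed inputs; failed: boolean
        mask of runs violating a requirement. Returns names ranked by a
        simple standardized mean-difference score."""
        scores = {}
        for j, name in enumerate(names):
            spread = X[:, j].std() or 1.0
            scores[name] = abs(X[failed, j].mean() - X[~failed, j].mean()) / spread
        return sorted(scores.items(), key=lambda kv: -kv[1])

    rng = np.random.default_rng(1)
    X = rng.normal(size=(1000, 3))
    failed = X[:, 1] > 1.0                 # failures driven by variable 1
    print(driving_variables(X, failed, ["mass", "thrust_err", "wind"]))
    ```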

  2. Follow up: Compound data sets and software tools for chemoinformatics and medicinal chemistry applications: update and data transfer.

    PubMed

    Hu, Ye; Bajorath, Jürgen

    2014-01-01

    In 2012, we reported 30 compound data sets and/or programs developed in our laboratory in a data article and made them freely available to the scientific community to support chemoinformatics and computational medicinal chemistry applications. These data sets and computational tools were provided for download from our website. Since publication of this data article, we have generated 13 new data sets with which we further extend our collection of publicly available data and tools. Due to changes in web servers and website architectures, data accessibility has recently been limited at times. Therefore, we have also transferred our data sets and tools to a public repository to ensure full and stable accessibility. To aid in data selection, we have classified the data sets according to scientific subject areas. Herein, we describe new data sets, introduce the data organization scheme, summarize the database content and provide detailed access information in ZENODO (doi:10.5281/zenodo.8451 and doi:10.5281/zenodo.8455).

  3. The development of an online decision support tool for organizational readiness for change.

    PubMed

    Khan, Sobia; Timmings, Caitlyn; Moore, Julia E; Marquez, Christine; Pyka, Kasha; Gheihman, Galina; Straus, Sharon E

    2014-05-10

    Much importance has been placed on assessing readiness for change as one of the earliest steps of implementation, but measuring it can be a complex and daunting task. Organizations and individuals struggle with how to reliably and accurately measure readiness for change. Several measures have been developed to help organizations assess readiness, but these are often underused due to the difficulty of selecting the right measure. In response to this challenge, we will develop and test a prototype of a decision support tool that is designed to guide individuals interested in implementation in the selection of an appropriate readiness assessment measure for their setting. A multi-phase approach will be used to develop the decision support tool. First, we will identify key measures for assessing organizational readiness for change from a recently completed systematic review. Included measures will be those developed for healthcare settings (e.g., acute care, public health, mental health) and that have been deemed valid and reliable. Second, study investigators and field experts will engage in a mapping exercise to categorize individual items of included measures according to key readiness constructs from an existing framework. Third, a stakeholder panel will be recruited and consulted to determine the feasibility and relevance of the selected measures using a modified Delphi process. Fourth, findings from the mapping exercise and stakeholder consultation will inform the development of a decision support tool that will guide users in appropriately selecting change readiness measures. Fifth, the tool will undergo usability testing. Our proposed decision support tool will address current challenges in the field of organizational change readiness by aiding individuals in selecting a valid and reliable assessment measure that is relevant to user needs and practice settings. We anticipate that implementers and researchers who use our tool will be more likely to conduct readiness for change assessments in their settings when planning for implementation. This, in turn, may contribute to more successful implementation outcomes. We will test this tool in a future study to determine its efficacy and impact on implementation processes.

  4. Evaluating the Utility of Web-Based Consumer Support Tools Using Rough Sets

    NASA Astrophysics Data System (ADS)

    Maciag, Timothy; Hepting, Daryl H.; Slezak, Dominik; Hilderman, Robert J.

    On the Web, many popular e-commerce sites provide consumers with decision support tools to assist them in their commerce-related decision-making. Many consumers rank the utility of these tools quite highly. Data obtained from web usage mining analyses, which may provide knowledge about a user's online experiences, could help indicate the utility of these tools. This type of analysis could provide insight into whether the provided tools adequately assist consumers in conducting their online shopping activities or whether new or additional enhancements need consideration. Although some research in this regard has been described in previous literature, there is still much that can be done. The authors of this paper hypothesize that a measurement of consumer decision accuracy, i.e. a measurement of how well tool-assisted selections match consumers' stated preferences, could help indicate the utility of these tools. This paper describes a procedure developed towards this goal using elements of rough set theory. The authors evaluated the procedure using two support tools, one based on a tool developed by the US-EPA and the other, called cogito, developed by one of the authors. Results from the evaluation provided interesting insights into the utility of both support tools. Although the cogito tool obtained slightly higher decision accuracy, both tools could benefit from additional enhancements. Details of the procedure and the results obtained from the evaluation are provided. Opportunities for future work are also discussed.
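
    The abstract does not reproduce the rough-set machinery it relies on, so the sketch below shows only the standard notions involved, under stated assumptions: objects are grouped into indiscernibility classes by their condition attributes, and a decision class is scored by the ratio of its lower to its upper approximation. The toy data and function names are ours, not the paper's.

        from collections import defaultdict

        def approximation_accuracy(objects, condition, decision, target):
            """Rough-set accuracy of approximating one decision class.

            objects: iterable of object ids; condition(o) returns the tuple of
            condition-attribute values; decision(o) returns the decision label.
            """
            blocks = defaultdict(set)          # indiscernibility classes
            for o in objects:
                blocks[condition(o)].add(o)
            target_set = {o for o in objects if decision(o) == target}
            lower, upper = set(), set()
            for block in blocks.values():
                if block & target_set:
                    upper |= block             # block touches the class
                    if block <= target_set:
                        lower |= block         # block lies inside the class
            return len(lower) / len(upper) if upper else 1.0

        # Toy usage: users described by (price_band, brand_loyalty) and the
        # product they finally chose; accuracy near 1 means the attributes
        # sharply determine the choice.
        users = ["u1", "u2", "u3", "u4"]
        attrs = {"u1": ("low", "yes"), "u2": ("low", "yes"),
                 "u3": ("high", "no"), "u4": ("high", "no")}
        choice = {"u1": "A", "u2": "A", "u3": "B", "u4": "A"}
        print(approximation_accuracy(users, attrs.get, choice.get, "A"))  # 0.5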

  5. The current state of cancer family history collection tools in primary care: a systematic review.

    PubMed

    Qureshi, Nadeem; Carroll, June C; Wilson, Brenda; Santaguida, Pasqualina; Allanson, Judith; Brouwers, Melissa; Raina, Parminder

    2009-07-01

    Systematic collection of family history is a prerequisite for identifying genetic risk. This study reviewed tools applicable to the primary care assessment of family history of breast, colorectal, ovarian, and prostate cancer. MEDLINE, EMBASE, CINAHL, and Cochrane Central were searched for publications. All primary study designs were included. Characteristics of the studies, the family history collection tools, and the setting were evaluated. Of 40 eligible studies, 18 relevant family history tools were identified, with 11 developed for use in primary care. Most collected information on more than one cancer and on affected relatives, used self-administered questionnaires, and were paper based. Eleven tools had been evaluated relative to current practice, demonstrating a 46-78% improvement in data recording over family history recording in patient charts and 75-100% agreement with structured genetic interviews. Few tools have been developed specifically for primary care settings. The few that have been evaluated performed well. The very limited evidence, which depends in part on extrapolation from studies in settings other than primary care, suggests that systematic tools may add significant family health information compared with current primary care practice. The effect of their use on health outcomes has not been evaluated.

  6. Content and functional specifications for a standards-based multidisciplinary rounding tool to maintain continuity across acute and critical care.

    PubMed

    Collins, Sarah; Hurley, Ann C; Chang, Frank Y; Illa, Anisha R; Benoit, Angela; Laperle, Sarah; Dykes, Patricia C

    2014-01-01

    Maintaining continuity of care (CoC) in the inpatient setting is dependent on aligning goals and tasks with the plan of care (POC) during multidisciplinary rounds (MDRs). A number of locally developed rounding tools exist, yet there is a lack of standard content and functional specifications for electronic tools to support MDRs within and across settings. To identify content and functional requirements for an MDR tool to support CoC. We collected discrete clinical data elements (CDEs) discussed during rounds for 128 acute and critical care patients. To capture CDEs, we developed and validated an iPad-based observational tool based on informatics CoC standards. We observed 19 days of rounds and conducted eight group and individual interviews. Descriptive and bivariate statistics and network visualization were conducted to understand associations between CDEs discussed during rounds with a particular focus on the POC. Qualitative data were thematically analyzed. All analyses were triangulated. We identified the need for universal and configurable MDR tool views across settings and users and the provision of messaging capability. Eleven empirically derived universal CDEs were identified, including four POC CDEs: problems, plan, goals, and short-term concerns. Configurable POC CDEs were: rationale, tasks/'to dos', pending results and procedures, discharge planning, patient preferences, need for urgent review, prognosis, and advice/guidance. Some requirements differed between settings; yet, there was overlap between POC CDEs. We recommend an initial list of 11 universal CDEs for continuity in MDRs across settings and 27 CDEs that can be configured to meet setting-specific needs.
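
    One way to read the recommendation is as a thin configuration layer over a fixed universal core. The sketch below is illustrative only: the class and field names are hypothetical, and only the CDE labels named in the abstract are used (the abstract lists just the four plan-of-care members of the 11 universal CDEs).

        from dataclasses import dataclass, field

        UNIVERSAL_POC_CDES = ["problems", "plan", "goals", "short-term concerns"]
        CONFIGURABLE_POC_CDES = [
            "rationale", "tasks/to dos", "pending results and procedures",
            "discharge planning", "patient preferences", "need for urgent review",
            "prognosis", "advice/guidance",
        ]

        @dataclass
        class RoundingView:
            setting: str                                # e.g. "ICU" or "acute care"
            extras: list = field(default_factory=list)  # chosen configurable CDEs

            def visible_cdes(self):
                unknown = set(self.extras) - set(CONFIGURABLE_POC_CDES)
                if unknown:
                    raise ValueError(f"not configurable: {sorted(unknown)}")
                return UNIVERSAL_POC_CDES + list(self.extras)

        icu = RoundingView("ICU", extras=["need for urgent review", "prognosis"])
        print(icu.visible_cdes())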

  7. Automated support for experience-based software management

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions, the tool utilizes a vast corporate memory that includes a database of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
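
    As a rough illustration of the kind of comparison the SME automates, a live project's metric can be tested against the envelope of completed projects in the corporate memory. Everything in this sketch (names, data, threshold) is hypothetical, not drawn from the SME itself.

        import statistics

        def assess_metric(current: float, history: list, k: float = 1.0) -> str:
            """Compare a current metric value with past projects' values."""
            mean = statistics.mean(history)
            sd = statistics.stdev(history)
            if current > mean + k * sd:
                return "above typical range"   # e.g., error rate worse than usual
            if current < mean - k * sd:
                return "below typical range"
            return "within typical range"

        # Week-10 error counts from five past projects vs. the ongoing one:
        print(assess_metric(42.0, [30.0, 35.0, 28.0, 33.0, 31.0]))  # above typical range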

  8. DMT-TAFM: a data mining tool for technical analysis of futures market

    NASA Astrophysics Data System (ADS)

    Stepanov, Vladimir; Sathaye, Archana

    2002-03-01

    Technical analysis of financial markets describes many patterns of market behavior. For practical use, all these descriptions need to be adjusted for each particular trading session. In this paper, we develop a data mining tool for technical analysis of the futures markets (DMT-TAFM), which dynamically generates rules based on the notion of price pattern similarity. The tool consists of three main components. The first component provides visualization of data series on a chart with different ranges, scales, and chart sizes and types. The second component constructs pattern descriptions using sets of polynomials. The third component specifies the training set for mining, defines the similarity notion, and searches for a set of similar patterns. DMT-TAFM is useful for preparing the data and then revealing and systematizing statistical information about similar patterns found in any type of historical price series. We performed experiments with our tool on three decades of trading data from one hundred types of futures. Our results for this data set show that we can confirm or refute many well-known patterns based on real data, as well as reveal new ones, and use the set of relatively consistent patterns found during data mining to develop better futures trading strategies.
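
    A minimal sketch of the polynomial pattern-description idea: fit a low-order polynomial to each normalized price window and compare windows by the distance between coefficient vectors. The degree, tolerance, and data below are illustrative choices, not the tool's.

        import numpy as np

        def poly_signature(window: np.ndarray, degree: int = 3) -> np.ndarray:
            """Describe a window's shape by polynomial coefficients."""
            w = (window - window.mean()) / (window.std() + 1e-12)  # scale-free shape
            x = np.linspace(0.0, 1.0, len(w))
            return np.polyfit(x, w, degree)

        def similar(a: np.ndarray, b: np.ndarray, tol: float = 1.0) -> bool:
            return np.linalg.norm(poly_signature(a) - poly_signature(b)) < tol

        # Random-walk prices stand in for a historical series.
        prices = np.cumsum(np.random.default_rng(0).normal(size=500)) + 100.0
        query = prices[100:130]
        matches = [i for i in range(len(prices) - 30)
                   if similar(prices[i:i + 30], query)]
        print(matches[:10])   # window starts whose shape resembles the query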

  9. Developing a tool to support diagnostic delivery of dementia.

    PubMed

    Bennett, Claire E; De Boos, Danielle; Moghaddam, Nima G

    2018-01-01

    It is increasingly recognised that there are challenges affecting the current delivery of dementia diagnoses, and steps are required to address them. Current good practice guidelines provide insufficient direction, and interventions from other healthcare settings do not appear to translate fully to dementia care settings. This project took a sequential two-phase approach to developing a tool specific to dementia diagnostic delivery. Interviews with 14 participants explored good diagnostic delivery. Thematic analysis produced key themes (overcoming barriers, navigation of multiple journeys and completing overt and covert tasks) that were used to inform the design of a tool for use by clinicians, patients and companions. The tool was evaluated for acceptability in focus group discussions with 13 participants, which indicated a desire to use the tool and that it could encourage good practice. Adaptations were highlighted and incorporated to improve acceptability. Future research is now required to evaluate the tool further.

  10. Interchangeable end effector tools utilized on the protoflight manipulator arm

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A subset of teleoperator end effector tools was designed, fabricated, delivered and successfully demonstrated on the Marshall Space Flight Center (MSFC) protoflight manipulator arm (PFMA). The tools delivered included a rotary power tool with interchangeable collets and two fluid coupling mate/demate tools, one for a Fairchild coupling and the other for a Purolator coupling. An electrical interface connector was also provided for the rotary power tool. A tool set for performing on-orbit satellite maintenance, from which the subset was selected, was identified and conceptually designed. Maintenance requirements were synthesized, evaluated and prioritized to develop design requirements for a set of end effector tools representative of those needed to provide on-orbit maintenance of satellites to be flown in the 1986 to 2000 timeframe.

  11. Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet

    NASA Technical Reports Server (NTRS)

    Muss, J. A.; Johnson, C. W.; Gotchy, M. B.

    2000-01-01

    The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.

  12. The GenABEL Project for statistical genomics.

    PubMed

    Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roger Lew; Ronald L. Boring; Thomas A. Ulrich

    Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs, but the set of tools for developing and designing HMIs is still in its infancy. Here we propose that Microsoft Windows Presentation Foundation (WPF) is well suited for many roles in the research and development of HMIs for process control.

  14. Planetary data in education: tool development for access to the Planetary Data System

    NASA Technical Reports Server (NTRS)

    Atkinson, C. H.; Andres, P. M.; Liggett, P. K.; Lowes, L. L.; Sword, B. J.

    2003-01-01

    In this session we will describe and demonstrate the interface to the PDS access tools and functions developed for the scientific community, and discuss the potential for its utilization in K-14 formal and informal settings.

  15. Preparing for the future: a review of tools and strategies to support autonomous goal setting for children and youth with autism spectrum disorders.

    PubMed

    Hodgetts, Sandra; Park, Elly

    2017-03-01

    Despite recognized benefits, current clinical practice rarely includes direct input from children and youth with autism spectrum disorder (ASD) in setting rehabilitation goals. This study reviews tools and evidence-based strategies to assist with autonomous goal setting for children and youth with ASD. This study included two components: (1) a scoping review of existing tools and strategies to assist with autonomous goal setting in individuals with ASD and (2) a chart review of inter-disciplinary service plan goals for children and youth with ASD. Eleven data sources, evaluating five different tools to assist with autonomous goal setting for children and youth with ASD, were found. Three themes emerged from the integration of the scoping review and chart review, which are discussed in the paper: (1) generalizability of findings, (2) adaptations to support participation and (3) practice implications. Children and youth with ASD can participate in setting rehabilitation goals, but few tools to support their participation have been evaluated, and those tools that do exist do not align well with current service foci. Visual aids appear to be one effective support, but further research on effective strategies for meaningful engagement in autonomous goal setting for children and youth with ASD is warranted. Implications for rehabilitation: Persons with ASD are less self-determined than their peers. Input into one's own rehabilitation goals and priorities is an important component of self-determination. Few tools exist to help engage children and youth with ASD in setting their own rehabilitation goals. An increased focus on identifying, developing and evaluating effective tools and strategies to facilitate engagement of children and youth with ASD in setting their own rehabilitation goals is warranted.

  16. Development of a comprehensive software engineering environment

    NASA Technical Reports Server (NTRS)

    Hartrum, Thomas C.; Lamont, Gary B.

    1987-01-01

    The generation of a set of tools for the software lifecycle is a recurring theme in the software engineering literature. The development of such tools and their integration into a software development environment is a difficult task because of the magnitude (number of variables) and the complexity (combinatorics) of the software lifecycle process. Development of a global approach began in 1982 as the Software Development Workbench (SDW). Continuing efforts focus on tool development, tool integration, human interfacing, data dictionaries, and testing algorithms. Current efforts are emphasizing natural language interfaces, expert system software development associates and distributed environments with Ada as the target language. The current implementation of the SDW is on a VAX-11/780. Other software development tools are being networked through engineering workstations.

  17. Nutrition screening tools: does one size fit all? A systematic review of screening tools for the hospital setting.

    PubMed

    van Bokhorst-de van der Schueren, Marian A E; Guaitoli, Patrícia Realino; Jansma, Elise P; de Vet, Henrica C W

    2014-02-01

    Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. A systematic review of English, French, German, Spanish, Portuguese and Dutch articles identified via MEDLINE, Cinahl and EMBASE (from inception to the 2nd of February 2012). Additional studies were identified by checking reference lists of identified manuscripts. Search terms included key words for malnutrition, screening or assessment instruments, and terms for hospital setting and adults. Data were extracted independently by 2 authors. Only studies expressing the (construct, criterion or predictive) validity of a tool were included. 83 studies (32 screening tools) were identified: 42 studies on construct or criterion validity versus a reference method and 51 studies on predictive validity on outcome (i.e. length of stay, mortality or complications). None of the tools performed consistently well to establish the patients' nutritional status. For the elderly, MNA performed fair to good, for the adults MUST performed fair to good. SGA, NRS-2002 and MUST performed well in predicting outcome in approximately half of the studies reviewed in adults, but not in older patients. Not one single screening or assessment tool is capable of adequate nutrition screening as well as predicting poor nutrition related outcome. Development of new tools seems redundant and will most probably not lead to new insights. New studies comparing different tools within one patient population are required. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  18. Maternal morbidity measurement tool pilot: study protocol.

    PubMed

    Say, Lale; Barreix, Maria; Chou, Doris; Tunçalp, Özge; Cottler, Sara; McCaw-Binns, Affette; Gichuhi, Gathari Ndirangu; Taulo, Frank; Hindin, Michelle

    2016-06-09

    While it is estimated that for every maternal death, 20-30 women suffer morbidity, these estimates are not based on standardized methods and measures. Lack of an agreed-upon definition, identification criteria, standardized assessment tools, and indicators has limited valid, routine, and comparable measurements of maternal morbidity. The World Health Organization (WHO) convened the Maternal Morbidity Working Group (MMWG) to develop standardized methods to improve estimates of maternal morbidity. To date, the MMWG has developed a definition and provided input into the development of a set of measurement tools. This protocol outlines the pilot test for measuring maternal morbidity in antenatal and postnatal clinical populations using these new tools. In each setting, the tools will be piloted on approximately 250 women receiving antenatal care (ANC) (at least 28 weeks pregnant) and 250 women receiving postpartum care (PPC) (at least 6 weeks postpartum). The tools will be administered by trained health care workers. Each tool has three modules: (1) personal history: socio-economic information and risk factors (such as violence and substance abuse); (2) patient symptoms: the 12-item WHO Disability Assessment Schedule (WHODAS) and the mental health questionnaires Generalized Anxiety Disorder 7-item (GAD-7) and Patient Health Questionnaire 9-item (PHQ-9); and (3) physical examination: signs, laboratory tests and results. This pilot (planned for Jamaica, Kenya and Malawi) will allow for comparing the types of morbidities women experience between and across settings, and for determining the feasibility, acceptability and utility of using a modified, streamlined tool for routine measurement and summary estimates of morbidity to inform resource allocation and service provision. As part of the post-2015 Sustainable Development Goals (SDGs), estimating and measuring maternal morbidity will be essential to ensure appropriate resources are allocated to address its impact and improve well-being.
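
    Of the instruments bundled into the symptoms module, the PHQ-9 has a simple published scoring rule: nine items scored 0-3 are summed and banded by standard cut-points (5, 10, 15, 20). A sketch follows; the function name is ours, not the protocol's, and the example responses are invented.

        def phq9_severity(items: list) -> tuple:
            """Sum nine 0-3 item scores and band by the standard cut-points."""
            assert len(items) == 9 and all(0 <= i <= 3 for i in items)
            total = sum(items)
            for cutoff, label in [(20, "severe"), (15, "moderately severe"),
                                  (10, "moderate"), (5, "mild")]:
                if total >= cutoff:
                    return total, label
            return total, "minimal"

        print(phq9_severity([1, 2, 1, 1, 0, 2, 1, 1, 2]))  # (11, 'moderate')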

  19. Comprehensive development and testing of the ASIST-GBV, a screening tool for responding to gender-based violence among women in humanitarian settings.

    PubMed

    Wirtz, A L; Glass, N; Pham, K; Perrin, N; Rubenstein, L S; Singh, S; Vu, A

    2016-01-01

    Conflict affected refugees and internally displaced persons (IDPs) are at increased vulnerability to gender-based violence (GBV). Health, psychosocial, and protection services have been implemented in humanitarian settings, but GBV remains under-reported and available services under-utilized. To improve access to existing GBV services and facilitate reporting, the ASIST-GBV screening tool was developed and tested for use in humanitarian settings. This process was completed in four phases: 1) systematic literature review, 2) qualitative research that included individual interviews and focus groups with GBV survivors and service providers, respectively, 3) pilot testing of the developed screening tool, and 4) 3-month implementation testing of the screening tool. Research was conducted among female refugees, aged ≥15 years in Ethiopia, and female IDPs, aged ≥18 years in Colombia. The systematic review and meta-analysis identified a range of GBV experiences and estimated a 21.4% prevalence of sexual violence (95% CI: 14.9-28.7) among conflict-affected populations. No existing screening tools for GBV in humanitarian settings were identified. Qualitative research with GBV survivors in Ethiopia and Colombia found multiple forms of GBV experienced by refugees and IDPs that occurred during conflict, in transit, and in displaced settings. Identified forms of violence were combined into seven key items on the screening tool: threats of violence, physical violence, forced sex, sexual exploitation, forced pregnancy, forced abortion, and early or forced marriage. Cognitive testing further refined the tool. Pilot testing in both sites demonstrated preliminary feasibility, with 64.8% of participants in Ethiopia and 44.9% of participants in Colombia identified as having recent (last 12 months) experiences of GBV. Implementation testing of the screening tool, conducted as a routine service in camp/district hospitals, allowed for identification of GBV cases and referrals to services. In this phase, 50.6% of participants in Ethiopia and 63.4% in Colombia screened positive for recent experiences of GBV. Psychometric testing demonstrated appropriate internal consistency of the tool (Cronbach's α = 0.77) and item response theory demonstrated appropriate discrimination and difficulty of the tool. The ASIST-GBV screening tool has demonstrated utility and validity for use in confidential identification and referral of refugees and IDPs who experience GBV.
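
    The internal-consistency figure quoted above is Cronbach's alpha, computed as alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch over simulated seven-item responses follows; the data are invented, only the formula is standard.

        import numpy as np

        def cronbach_alpha(scores: np.ndarray) -> float:
            """scores: respondents x items matrix of item scores."""
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1)
            total_var = scores.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

        rng = np.random.default_rng(1)
        latent = rng.normal(size=(200, 1))   # shared trait drives all items
        items = (latent + rng.normal(scale=1.0, size=(200, 7)) > 0).astype(float)
        print(round(cronbach_alpha(items), 2))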

  20. Reducing Information Overload in Large Seismic Data Sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.

    2000-08-02

    Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g., single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important for bridging gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program web site. As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these data sets. These custom-built content analysis tools help identify data set characteristics that can potentially provide a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics are identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research data sets. The tools also provide researchers with a quick way to find interesting and useful events within the research data sets, and could be used to review reference event data sets as part of a data set delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize, and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. All of these software tools would provide the researcher with unprecedented power without having to learn the intricacies and complexities of relational database systems.
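
    A sketch of the dendrogram tool's core clustering step, under stated assumptions: waveforms have already been windowed, filtered, and enveloped, correlation is converted to a distance, and hierarchical clustering runs with a selectable linkage. The parameter values are illustrative.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        def cluster_waveforms(waves: np.ndarray, method: str = "single",
                              max_dist: float = 0.3) -> np.ndarray:
            """Group waveforms (rows) by correlation distance."""
            corr = np.corrcoef(waves)               # pairwise waveform correlation
            dist = np.clip(1.0 - corr, 0.0, 2.0)    # correlation -> distance
            np.fill_diagonal(dist, 0.0)
            z = linkage(squareform(dist, checks=False), method=method)
            return fcluster(z, t=max_dist, criterion="distance")

        # Random traces stand in for real windowed seismograms.
        waves = np.random.default_rng(2).normal(size=(12, 1024))
        print(cluster_waveforms(waves, method="complete"))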

  1. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    PubMed Central

    Abbasi, Arash; Berry, Jeffrey C.; Callen, Steven T.; Chavez, Leonardo; Doust, Andrew N.; Feldman, Max J.; Gilbert, Kerrigan B.; Hodge, John G.; Hoyer, J. Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony

    2017-01-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning. PMID:29209576
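
    A hedged sketch of a minimal PlantCV pipeline (read an image, convert to a LAB channel, threshold, clean the mask). The function names follow later PlantCV releases; the v2 API described here additionally threaded device/debug arguments through each call, so treat the exact signatures as assumptions, and note that "plant.png" is a placeholder path.

        from plantcv import plantcv as pcv

        # Read an RGB image of a plant against a contrasting background.
        img, path, fname = pcv.readimage(filename="plant.png")
        # Green-magenta ('a') channel of LAB often separates plant from soil.
        gray = pcv.rgb2gray_lab(rgb_img=img, channel="a")
        # Threshold the grayscale image into a binary plant mask.
        mask = pcv.threshold.binary(gray_img=gray, threshold=120,
                                    max_value=255, object_type="dark")
        # Remove small speckles from the mask.
        clean = pcv.fill(bin_img=mask, size=50)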

  2. PlantCV v2: Image analysis software for high-throughput plant phenotyping.

    PubMed

    Gehan, Malia A; Fahlgren, Noah; Abbasi, Arash; Berry, Jeffrey C; Callen, Steven T; Chavez, Leonardo; Doust, Andrew N; Feldman, Max J; Gilbert, Kerrigan B; Hodge, John G; Hoyer, J Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony

    2017-01-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  3. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  4. Coordinating complex decision support activities across distributed applications

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.

  5. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    DOE PAGES

    Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash; ...

    2017-12-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  6. Virtual Beach Manager Toolset

    EPA Science Inventory

    The Virtual Beach Manager Toolset (VB) is a set of decision support software tools developed to help local beach managers decide when beaches should be closed due to predicted high levels of waterborne pathogens. The tools are being developed under the umbrella of...
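
    Illustrative of the kind of decision support described (not the VB implementation): regress log-transformed indicator bacteria on environmental covariates and flag a closure when the prediction exceeds an action threshold. The covariates, coefficients, and data below are invented; the 235 CFU/100 mL value is the familiar single-sample E. coli criterion, used here only as an example threshold.

        import numpy as np

        rng = np.random.default_rng(3)
        turbidity = rng.uniform(1, 50, 200)
        rainfall = rng.uniform(0, 30, 200)
        # Synthetic training data: log10 E. coli driven by both covariates.
        log_ecoli = 1.0 + 0.02 * turbidity + 0.03 * rainfall + rng.normal(0, 0.3, 200)

        X = np.column_stack([np.ones_like(turbidity), turbidity, rainfall])
        beta, *_ = np.linalg.lstsq(X, log_ecoli, rcond=None)  # fitted model

        def advise(turb: float, rain: float, threshold: float = 235.0) -> str:
            pred = 10 ** (beta @ np.array([1.0, turb, rain]))  # CFU/100 mL
            return "close beach" if pred > threshold else "open"

        print(advise(40.0, 25.0))  # high turbidity + heavy rain -> close beach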

  7. Development of RAD-Score: A Tool to Assess the Procedural Competence of Diagnostic Radiology Residents.

    PubMed

    Isupov, Inga; McInnes, Matthew D F; Hamstra, Stan J; Doherty, Geoffrey; Gupta, Ashish; Peddle, Susan; Jibri, Zaid; Rakhra, Kawan; Hibbert, Rebecca M

    2017-04-01

    The purpose of this study is to develop a tool to assess the procedural competence of radiology trainees, with sources of evidence gathered from five categories to support the construct validity of the tool: content, response process, internal structure, relations to other variables, and consequences. A pilot form for assessing procedural competence among radiology residents, known as the RAD-Score tool, was developed by evaluating published literature and using a modified Delphi procedure involving a group of local content experts. The pilot version of the tool was tested by seven radiology department faculty members who evaluated procedures performed by 25 residents at one institution between October 2014 and June 2015. Residents were evaluated while performing multiple procedures in both clinical and simulation settings. The main outcome measure was the percentage of residents who were considered ready to perform procedures independently, with testing conducted to determine differences between levels of training. A total of 105 forms (for 52 procedures performed in a clinical setting and 53 procedures performed in a simulation setting) were collected for a variety of procedures (eight vascular or interventional, 42 body, 12 musculoskeletal, 23 chest, and 20 breast procedures). A statistically significant difference was noted in the percentage of trainees who were rated as being ready to perform a procedure independently (in postgraduate year [PGY] 2, 12% of residents; in PGY3, 61%; in PGY4, 85%; and in PGY5, 88%; p < 0.05); this difference persisted in the clinical and simulation settings. User feedback and psychometric analysis were used to create a final version of the form. This prospective study describes the successful development of a tool for assessing the procedural competence of radiology trainees with high levels of construct validity in multiple domains. Implementation of the tool in the radiology residency curriculum is planned and can play an instrumental role in the transition to competency-based radiology training.

  8. Comparison of Performance Predictions for New Low-Thrust Trajectory Tools

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Kos, Larry; Hopkins, Randall; Crane, Tracie

    2006-01-01

    Several low-thrust trajectory optimization tools have been developed over the last 3.5 years by the Low Thrust Trajectory Tools development team. This toolset includes both low- to medium-fidelity and high-fidelity tools, which allow the analyst to quickly explore a wide mission trade space and perform advanced mission design. These tools were tested using a set of reference trajectories that exercised each tool's unique capabilities. This paper compares the performance predictions of the various tools against several of the reference trajectories. The intent is to verify agreement between the high-fidelity tools and to quantify the performance prediction differences between tools of different fidelity levels.
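
    The comparison step itself reduces to simple relative differences against a reference value; a sketch follows, with invented tool names and numbers (the paper does not publish these figures here).

        # Percent deviation of each tool's predicted delivered mass from a
        # reference trajectory value. All names and values are illustrative.
        reference_kg = 1250.0
        predictions_kg = {
            "low-fidelity tool A": 1295.0,
            "medium-fidelity tool B": 1268.0,
            "high-fidelity tool C": 1252.0,
        }

        for tool, mass in predictions_kg.items():
            deviation = 100.0 * (mass - reference_kg) / reference_kg
            print(f"{tool}: {deviation:+.1f}%")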

  9. Development and Validation of a Monte Carlo Simulation Tool for Multi-Pinhole SPECT

    PubMed Central

    Mok, Greta S. P.; Du, Yong; Wang, Yuchuan; Frey, Eric C.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose In this work, we developed and validated a Monte Carlo simulation (MCS) tool for investigation and evaluation of multi-pinhole (MPH) SPECT imaging. Procedures This tool was based on a combination of the SimSET and MCNP codes. Photon attenuation and scatter in the object, as well as penetration and scatter through the collimator detector, are modeled in this tool. It allows accurate and efficient simulation of MPH SPECT with focused pinhole apertures and user-specified photon energy, aperture material, and imaging geometry. The MCS method was validated by comparing the point response function (PRF), detection efficiency (DE), and image profiles obtained from point sources and phantom experiments. A prototype single-pinhole collimator and focused four- and five-pinhole collimators fitted on a small animal imager were used for the experimental validations. We have also compared computational speed among various simulation tools for MPH SPECT, including SimSET-MCNP, MCNP, SimSET-GATE, and GATE for simulating projections of a hot sphere phantom. Results We found good agreement between the MCS and experimental results for PRF, DE, and image profiles, indicating the validity of the simulation method. The relative computational speeds for SimSET-MCNP, MCNP, SimSET-GATE, and GATE are 1: 2.73: 3.54: 7.34, respectively, for 120-view simulations. We also demonstrated the application of this MCS tool in small animal imaging by generating a set of low-noise MPH projection data of a 3D digital mouse whole body phantom. Conclusions The new method is useful for studying MPH collimator designs, data acquisition protocols, image reconstructions, and compensation techniques. It also has great potential to be applied for modeling the collimator-detector response with penetration and scatter effects for MPH in the quantitative reconstruction method. PMID:19779896
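
    One Monte Carlo ingredient named above can be shown compactly: photon free path lengths sampled from the exponential attenuation law, path = -ln(U)/mu, where U is uniform on (0,1] and mu is the linear attenuation coefficient. The value of mu below is an illustrative, approximate figure for water at the Tc-99m photon energy.

        import math
        import random

        def sample_free_path(mu_per_cm: float) -> float:
            """Sample a photon free path from the exponential attenuation law."""
            # 1 - random() lies in (0, 1], avoiding log(0).
            return -math.log(1.0 - random.random()) / mu_per_cm

        random.seed(4)
        mu = 0.15  # ~water at 140 keV, roughly; illustrative value
        paths = [sample_free_path(mu) for _ in range(100_000)]
        print(sum(paths) / len(paths))  # approaches 1/mu = 6.67 cm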

  10. Content and functional specifications for a standards-based multidisciplinary rounding tool to maintain continuity across acute and critical care

    PubMed Central

    Collins, Sarah; Hurley, Ann C; Chang, Frank Y; Illa, Anisha R; Benoit, Angela; Laperle, Sarah; Dykes, Patricia C

    2014-01-01

    Background Maintaining continuity of care (CoC) in the inpatient setting is dependent on aligning goals and tasks with the plan of care (POC) during multidisciplinary rounds (MDRs). A number of locally developed rounding tools exist, yet there is a lack of standard content and functional specifications for electronic tools to support MDRs within and across settings. Objective To identify content and functional requirements for an MDR tool to support CoC. Materials and methods We collected discrete clinical data elements (CDEs) discussed during rounds for 128 acute and critical care patients. To capture CDEs, we developed and validated an iPad-based observational tool based on informatics CoC standards. We observed 19 days of rounds and conducted eight group and individual interviews. Descriptive and bivariate statistics and network visualization were conducted to understand associations between CDEs discussed during rounds with a particular focus on the POC. Qualitative data were thematically analyzed. All analyses were triangulated. Results We identified the need for universal and configurable MDR tool views across settings and users and the provision of messaging capability. Eleven empirically derived universal CDEs were identified, including four POC CDEs: problems, plan, goals, and short-term concerns. Configurable POC CDEs were: rationale, tasks/‘to dos’, pending results and procedures, discharge planning, patient preferences, need for urgent review, prognosis, and advice/guidance. Discussion Some requirements differed between settings; yet, there was overlap between POC CDEs. Conclusions We recommend an initial list of 11 universal CDEs for continuity in MDRs across settings and 27 CDEs that can be configured to meet setting-specific needs. PMID:24081019

  11. Using the Nine Common Themes of Good Practice checklist as a tool for evaluating the research priority setting process of a provincial research and program evaluation program.

    PubMed

    Mador, Rebecca L; Kornas, Kathy; Simard, Anne; Haroun, Vinita

    2016-03-23

    Given the context-specific nature of health research prioritization and the obligation to effectively allocate resources to initiatives that will achieve the greatest impact, evaluation of priority setting processes can refine and strengthen such exercises and their outcomes. However, guidance is needed on evaluation tools that can be applied to research priority setting. This paper describes the adaption and application of a conceptual framework to evaluate a research priority setting exercise operating within the public health sector in Ontario, Canada. The Nine Common Themes of Good Practice checklist, described by Viergever et al. (Health Res Policy Syst 8:36, 2010) was used as the conceptual framework to evaluate the research priority setting process developed for the Locally Driven Collaborative Projects (LDCP) program in Ontario, Canada. Multiple data sources were used to inform the evaluation, including a review of selected priority setting approaches, surveys with priority setting participants, document review, and consultation with the program advisory committee. The evaluation assisted in identifying improvements to six elements of the LDCP priority setting process. The modifications were aimed at improving inclusiveness, information gathering practices, planning for project implementation, and evaluation. In addition, the findings identified that the timing of priority setting activities and level of control over the process were key factors that influenced the ability to effectively implement changes. The findings demonstrate the novel adaptation and application of the 'Nine Common Themes of Good Practice checklist' as a tool for evaluating a research priority setting exercise. The tool can guide the development of evaluation questions and enables the assessment of key constructs related to the design and delivery of a research priority setting process.

  12. Improving e-book access via a library-developed full-text search tool*

    PubMed Central

    Foust, Jill E.; Bergen, Phillip; Maxeiner, Gretchen L.; Pawlowski, Peter N.

    2007-01-01

    Purpose: This paper reports on the development of a tool for searching the contents of licensed full-text electronic book (e-book) collections. Setting: The Health Sciences Library System (HSLS) provides services to the University of Pittsburgh's medical programs and large academic health system. Brief Description: The HSLS has developed an innovative tool for federated searching of its e-book collections. Built using the XML-based Vivísimo development environment, the tool enables a user to perform a full-text search of over 2,500 titles from the library's seven most highly used e-book collections. From a single “Google-style” query, results are returned as an integrated set of links pointing directly to relevant sections of the full text. Results are also grouped into categories that enable more precise retrieval without reformulation of the search. Results/Evaluation: A heuristic evaluation demonstrated the usability of the tool and a web server log analysis indicated an acceptable level of usage. Based on its success, there are plans to increase the number of online book collections searched. Conclusion: This library's first foray into federated searching has produced an effective tool for searching across large collections of full-text e-books and has provided a good foundation for the development of other library-based federated searching products. PMID:17252065

  13. Development of a music therapy assessment tool for patients in low awareness states.

    PubMed

    Magee, Wendy L

    2007-01-01

    People in low awareness states following profound brain injury typically demonstrate subtle changes in functional behaviors which challenge the sensitivity of measurement tools. Failure to identify and measure changes in functioning can lead to misdiagnosis and withdrawal of treatment with this population. Thus, the development of tools which are sensitive to responsiveness is of central concern. As the auditory modality has been found to be particularly sensitive in identifying responses indicating awareness, a convincing case can be made for music therapy as a treatment medium. However, little has been recommended about protocols for intervention or tools for measuring patient responses within the music therapy setting. This paper presents the rationale for an assessment tool specifically designed to measure responses in the music therapy setting with patients who are diagnosed as minimally conscious or in a vegetative state. Developed over fourteen years as part of interdisciplinary assessment and treatment, the music therapy assessment tool for low awareness states (MATLAS) contains fourteen items which rate behavioral responses across a number of domains. The tool can provide important information for interdisciplinary assessment and treatment particularly in the auditory and communication domains. Recommendations are made for testing its reliability and validity through research.

  14. Sustainability Tools Inventory - Initial Gaps Analysis

    EPA Pesticide Factsheets

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  15. New Tools For Understanding Microbial Diversity Using High-throughput Sequence Data

    NASA Astrophysics Data System (ADS)

    Knight, R.; Hamady, M.; Liu, Z.; Lozupone, C.

    2007-12-01

    High-throughput sequencing techniques such as 454 are straining the limits of tools traditionally used to build trees, choose OTUs, and perform other essential sequencing tasks. We have developed a workflow for phylogenetic analysis of large-scale sequence data sets that combines existing tools, such as the Arb phylogeny package and the NAST multiple sequence alignment tool, with new methods for choosing and clustering OTUs and for performing phylogenetic community analysis with UniFrac. This talk discusses the cyberinfrastructure we are developing to support the human microbiome project, and the application of these workflows to analyze very large data sets that contrast the gut microbiota with a range of physical environments. These tools will ultimately help to define core and peripheral microbiomes in a range of environments, and will allow us to understand the physical and biotic factors that contribute most to differences in microbial diversity.
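
    As an illustration of OTU choice at this scale, here is a greedy (uclust-style) picker under stated assumptions: sequences are equal-length and aligned, identity is naive positional identity, and each sequence joins the first seed within the threshold or starts a new OTU. Real pipelines use optimized alignment-based identity.

        def identity(a: str, b: str) -> float:
            """Naive positional identity between two aligned sequences."""
            matches = sum(x == y for x, y in zip(a, b))
            return matches / max(len(a), len(b))

        def pick_otus(seqs: list, threshold: float = 0.97) -> dict:
            """Greedy first-fit OTU assignment; returns {otu_id: [seq indices]}."""
            seeds = []   # index of each OTU's representative sequence
            otus = {}
            for i, s in enumerate(seqs):
                for k, seed in enumerate(seeds):
                    if identity(s, seqs[seed]) >= threshold:
                        otus[k].append(i)
                        break
                else:
                    seeds.append(i)
                    otus[len(seeds) - 1] = [i]
            return otus

        print(pick_otus(["ACGTACGT", "ACGTACGA", "TTTTACGT"], threshold=0.8))
        # -> {0: [0, 1], 1: [2]}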

  16. compomics-utilities: an open-source Java library for computational proteomics.

    PubMed

    Barsnes, Harald; Vaudel, Marc; Colaert, Niklaas; Helsens, Kenny; Sickmann, Albert; Berven, Frode S; Martens, Lennart

    2011-03-08

    The growing interest in the field of proteomics has increased the demand for software tools and applications that process and analyze the resulting data. Even though the purpose of these tools can vary significantly, they usually share a basic set of features, including the handling of protein and peptide sequences, the visualization of (and interaction with) spectra and chromatograms, and the parsing of results from various proteomics search engines. Developers typically spend considerable time and effort implementing these support structures, which detracts from working on the novel aspects of their tool. In order to simplify the development of proteomics tools, we have implemented an open-source support library for computational proteomics, called compomics-utilities. The library contains a broad set of features required for reading, parsing, and analyzing proteomics data. compomics-utilities is already used by a long list of existing software, ensuring library stability and continued support and development. As a user-friendly, well-documented and open-source library, compomics-utilities greatly simplifies the implementation of the basic features needed in most proteomics tools. Implemented in 100% Java, compomics-utilities is fully portable across platforms and architectures. Our library thus allows developers to focus on the novel aspects of their tools, rather than on the basic functions, which can contribute substantially to faster development and better tools for proteomics.

  17. Prioritizing guideline topics: development and evaluation of a practical tool.

    PubMed

    Ketola, Eeva; Toropainen, Erja; Kaila, Minna; Luoto, Riitta; Mäkelä, Marjukka

    2007-08-01

    A clear process for selecting and adopting clinical practice guidelines in the new topic areas is needed. The aim of this study is to design and develop a practical tool to assess guideline topics that have been suggested to the organization responsible for producing guidelines. We carried out an iterative development, feasibility and validation study of a guideline topic prioritization tool. The setting included the guideline producer organization and the tax-funded health care system. In the first stage of the tool development, participants were researchers, members of the Current Care Board and experts from health care organizations. In the second stage, the evaluation was done internally within the project by three independent reviewers. The main outcome measures were responses to an evaluation questionnaire, qualitative process feedback and analysis of the performance of the instrument on a random set of guidelines. Evaluations by three independent reviewers revealed good agreement and face validity with respect to its feasibility as a planning tool at the guideline board level. Feedback from board members suggested that the instrument is useful in prioritizing guideline topics. This instrument was accepted for use by the Board. Further developments are needed to ensure feedback and acceptability of the instrument by those proposing topics.

  18. A Systematic Review of Family Meeting Tools in Palliative and Intensive Care Settings

    PubMed Central

    Singer, Adam E.; Ash, Tayla; Ochotorena, Claudia; Lorenz, Karl A.; Chong, Kelly; Shreve, Scott T.; Ahluwalia, Sangeeta C.

    2015-01-01

    Purpose Family meetings can be challenging, requiring a range of skills and participation. We sought to identify tools available to aid the conduct of family meetings in palliative, hospice, and intensive care unit settings. Methods We systematically reviewed PubMed for articles describing family meeting tools and abstracted information on tool type, usage, and content. Results We identified 16 articles containing 23 tools in 7 categories: meeting guide (n = 8), meeting planner (n = 5), documentation template (n = 4), meeting strategies (n = 2), decision aid/screener (n = 2), family checklist (n = 1), and training module (n = 1). We found considerable variation across tools in usage and content and a lack of tools supporting family engagement. Conclusion There is need to standardize family meeting tools and develop tools to help family members effectively engage in the process. PMID:26213225

  19. A Systematic Review of Family Meeting Tools in Palliative and Intensive Care Settings.

    PubMed

    Singer, Adam E; Ash, Tayla; Ochotorena, Claudia; Lorenz, Karl A; Chong, Kelly; Shreve, Scott T; Ahluwalia, Sangeeta C

    2016-09-01

    Family meetings can be challenging, requiring a range of skills and participation. We sought to identify tools available to aid the conduct of family meetings in palliative, hospice, and intensive care unit settings. We systematically reviewed PubMed for articles describing family meeting tools and abstracted information on tool type, usage, and content. We identified 16 articles containing 23 tools in 7 categories: meeting guide (n = 8), meeting planner (n = 5), documentation template (n = 4), meeting strategies (n = 2), decision aid/screener (n = 2), family checklist (n = 1), and training module (n = 1). We found considerable variation across tools in usage and content and a lack of tools supporting family engagement. There is need to standardize family meeting tools and develop tools to help family members effectively engage in the process. © The Author(s) 2015.

  20. The GenABEL Project for statistical genomics

    PubMed Central

    Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381

  1. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations.

    PubMed

    Dwivedi, Bhakti; Kowalski, Jeanne

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, there is no single tool that defines gene sets based on merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to help facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/.

  2. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations

    PubMed Central

    Dwivedi, Bhakti

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, there is no single tool that defines gene sets based on merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to help facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/. PMID:29415010

  3. From Blunt to Pointy Tools: Transcending Task Automation to Effective Instructional Practice with CaseMate

    ERIC Educational Resources Information Center

    Swan, Gerry

    2009-01-01

    While blogs, wikis and many other Web 2.0 applications can be employed in learning settings, instruction is not the primary purpose for these tools. The educational field must actively participate in the definition and development of what repurposed or new Web 2.0 applications mean in educational settings. One way of viewing this needed…

  4. Software Engineering Laboratory (SEL) compendium of tools, revision 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of programs used to aid software product development is listed. Known as software tools, such programs include requirements analyzers, design languages, precompilers, code auditors, code analyzers, and software librarians. Abstracts, resource requirements, documentation, processing summaries, and availability are indicated for most tools.

  5. Integrated decision support tools for Puget Sound salmon recovery planning

    EPA Science Inventory

    We developed a set of tools to provide decision support for community-based salmon recovery planning in Salish Sea watersheds. Here we describe how these tools are being integrated and applied in collaboration with Puget Sound tribes and community stakeholders to address restora...

  6. Whole Watershed Restoration Planning Tools for Estimating Tradeoffs Among Multiple Objectives

    EPA Science Inventory

    We developed a set of decision support tools to assist whole watershed restoration planning in the Pacific Northwest. Here we describe how these tools are being integrated and applied in collaboration with tribes and community stakeholders to address restoration of hydrological ...

  7. A tool kit for evaluating electronic flight bags

    DOT National Transportation Integrated Search

    2006-09-01

    Over the past few years, the Volpe Center has developed a set of five tools that can be used to evaluate Electronic Flight Bags (EFBs) from a human factors perspective. The goal of these tools is to help streamline and standardize EFB human factors a...

  8. EURRECA: development of tools to improve the alignment of micronutrient recommendations.

    PubMed

    Matthys, C; Bucchini, L; Busstra, M C; Cavelaars, A E J M; Eleftheriou, P; Garcia-Alvarez, A; Fairweather-Tait, S; Gurinović, M; van Ommen, B; Contor, L

    2010-11-01

    Approaches through which reference values for micronutrients are derived, as well as the reference values themselves, vary considerably across countries. Harmonisation is needed to improve nutrition policy and public health strategies. The EURRECA (EURopean micronutrient RECommendations Aligned, http://www.eurreca.org) Network of Excellence is developing generic tools for systematically establishing and updating micronutrient reference values or recommendations. Different types of instruments (including best practice guidelines, interlinked web pages, online databases and decision trees) have been identified. The first set of instruments is for training purposes and includes mainly interactive digital learning materials. The second set of instruments comprises collection and interlinkage of diverse information sources that have widely varying contents and purposes. In general, these sources are collections of existing information. The purpose of the majority of these information sources is to provide guidance on best practice for use in a wider scientific community or for users and stakeholders of reference values. The third set of instruments includes decision trees and frameworks. The purpose of these tools is to guide non-scientists in decision making based on scientific evidence. This platform of instruments will contribute to future capacity building in nutrition, particularly in Central and Eastern European countries. The use of these tools by the scientific community, the European Food Safety Authority, bodies responsible for setting national nutrient requirements and others should ultimately help to align nutrient-based recommendations across Europe. Therefore, EURRECA can contribute towards nutrition policy development and public health strategies.

  9. Comparing the performance of biomedical clustering methods.

    PubMed

    Wiwie, Christian; Baumbach, Jan; Röttger, Richard

    2015-11-01

    Identifying groups of similar objects is a popular first step in biomedical data analysis, but it is error-prone and impossible to perform manually. Many computational methods have been developed to tackle this problem. Here we assessed 13 well-known methods using 24 data sets ranging from gene expression to protein domains. Performance was judged on the basis of 13 common cluster validity indices. We developed a clustering analysis platform, ClustEval (http://clusteval.mpi-inf.mpg.de), to promote streamlined evaluation, comparison and reproducibility of clustering results in the future. This allowed us to objectively evaluate the performance of all tools on all data sets with up to 1,000 different parameter sets each, resulting in a total of more than 4 million calculated cluster validity indices. We observed that there was no universal best performer, but on the basis of this wide-ranging comparison we were able to develop a short guideline for biomedical clustering tasks. ClustEval allows biomedical researchers to pick the appropriate tool for their data type and allows method developers to compare their tool to the state of the art.
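    As a concrete illustration of the benchmarking pattern this abstract describes (many methods, many parameter settings, scored by validity indices), here is a minimal Python sketch using scikit-learn; the method choices, parameter grid and data are illustrative stand-ins, not ClustEval's actual configuration:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans, AgglomerativeClustering
    from sklearn.metrics import silhouette_score

    def benchmark(X, k_values=range(2, 11)):
        """Score several methods across a parameter grid with one validity
        index (silhouette); ClustEval reports 13 such indices per run."""
        results = []
        for k in k_values:
            for name, model in [("kmeans", KMeans(n_clusters=k, n_init=10, random_state=0)),
                                ("agglomerative", AgglomerativeClustering(n_clusters=k))]:
                labels = model.fit_predict(X)
                results.append((name, k, silhouette_score(X, labels)))
        # Rank per data set: the study found no universal best performer.
        return sorted(results, key=lambda r: r[2], reverse=True)

    X = np.random.default_rng(0).normal(size=(200, 5))  # stand-in for a real data set
    print(benchmark(X)[:3])
    ```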

  10. Development and testing of a tool for assessing and resolving medication-related problems in older adults in an ambulatory care setting: the individualized medication assessment and planning (iMAP) tool.

    PubMed

    Crisp, Ginny D; Burkhart, Jena Ivey; Esserman, Denise A; Weinberger, Morris; Roth, Mary T

    2011-12-01

    Medication is one of the most important interventions for improving the health of older adults, yet it has great potential for causing harm. Clinical pharmacists are well positioned to engage in medication assessment and planning. The Individualized Medication Assessment and Planning (iMAP) tool was developed to aid clinical pharmacists in documenting medication-related problems (MRPs) and associated recommendations. The purpose of our study was to assess the reliability and usability of the iMAP tool in classifying MRPs and associated recommendations in older adults in the ambulatory care setting. Three cases, representative of older adults seen in an outpatient setting, were developed. Pilot testing was conducted and a "gold standard" key developed. Eight eligible pharmacists consented to participate in the study. They were instructed to read each case, make an assessment of MRPs, formulate a plan, and document the information using the iMAP tool. Inter-rater reliability was assessed for each case, comparing the pharmacists' identified MRPs and recommendations to the gold standard. Consistency of categorization across reviewers was assessed using the κ statistic or percent agreement. The mean κ across the 8 pharmacists in classifying MRPs compared with the gold standard was 0.74 (range, 0.54-1.00) for case 1 and 0.68 (range, 0.36-1.00) for case 2, indicating substantial agreement. For case 3, percent agreement was 63% (range, 40%-100%). The mean κ across the 8 pharmacists when classifying recommendations compared with the gold standard was 0.87 (range, 0.58-1.00) for case 1 and 0.88 (range, 0.75-1.00) for case 2, indicating almost perfect agreement. For case 3, percent agreement was 68% (range, 40%-100%). Clinical pharmacists found the iMAP tool easy to use. The iMAP tool provides a reliable and standardized approach for clinical pharmacists to use in the ambulatory care setting to classify MRPs and associated recommendations. Future studies will explore the predictive validity of the tool on clinical outcomes such as health care utilization. Copyright © 2011 Elsevier HS Journals, Inc. All rights reserved.
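    For readers unfamiliar with the agreement statistic reported above, the following worked example computes Cohen's kappa for one hypothetical rater against a gold-standard key (the MRP labels are invented for illustration):

    ```python
    from collections import Counter

    def cohens_kappa(rater, gold):
        """Cohen's kappa: chance-corrected agreement between two raters."""
        n = len(gold)
        p_o = sum(r == g for r, g in zip(rater, gold)) / n  # observed agreement
        # Expected chance agreement from each rater's marginal label frequencies.
        freq_r, freq_g = Counter(rater), Counter(gold)
        p_e = sum(freq_r[c] * freq_g[c] for c in set(rater) | set(gold)) / n**2
        return (p_o - p_e) / (1 - p_e)

    gold  = ["ADR", "dose_too_high", "untreated", "ADR", "nonadherence"]
    rater = ["ADR", "dose_too_high", "untreated", "dose_too_high", "nonadherence"]
    print(round(cohens_kappa(rater, gold), 2))
    ```

    With 4 of 5 labels matching and chance agreement of 0.24, kappa is (0.8 - 0.24)/(1 - 0.24) ≈ 0.74, which falls in the "substantial agreement" band the abstract cites.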

  11. Updating Risk Prediction Tools: A Case Study in Prostate Cancer

    PubMed Central

    Ankerst, Donna P.; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J.; Feng, Ziding; Sanda, Martin G.; Partin, Alan W.; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M.

    2013-01-01

    Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is that, as cancer research moves forward and new biomarkers and risk factors are discovered, there is a need to update the risk algorithms to include them. Typically the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes rule for updating risk prediction tools to include a set of biomarkers measured in an external study to the original study used to develop the risk prediction tool. The procedure is illustrated in the context of updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [−2]proPSA measured on an external case-control study performed in Texas, U.S. Recent state-of-the-art methods in validation of risk prediction tools and evaluation of the improvement of updated to original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network. PMID:22095849

  12. Updating risk prediction tools: a case study in prostate cancer.

    PubMed

    Ankerst, Donna P; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J; Feng, Ziding; Sanda, Martin G; Partin, Alan W; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M

    2012-01-01

    Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is that, as cancer research moves forward and new biomarkers and risk factors are discovered, there is a need to update the risk algorithms to include them. Typically, the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes rule for updating risk prediction tools to include a set of biomarkers measured in an external study to the original study used to develop the risk prediction tool. The procedure is illustrated in the context of updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [-2]proPSA measured on an external case-control study performed in Texas, U.S. Recent state-of-the-art methods in validation of risk prediction tools and evaluation of the improvement of updated to original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
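    The Bayes-rule update described in both versions of this record can be stated compactly in odds form; the notation below is ours (C: cancer status, x: the original risk factors, m: the new markers such as %freePSA and [-2]proPSA), a hedged sketch rather than the authors' exact formulation:

    ```latex
    % Prior odds from the original calculator times the likelihood ratio of
    % the new markers (estimated from the external study) gives updated odds.
    \[
    \frac{P(C \mid x, m)}{1 - P(C \mid x, m)}
      \;=\; \frac{p}{1 - p}
      \times \frac{f(m \mid C, x)}{f(m \mid \bar{C}, x)},
    \qquad p = P(C \mid x).
    \]
    ```

    Inverting the updated odds back to a probability gives the refreshed risk; validation on an independent third data set then proceeds as described.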

  13. A Novel Tool Improves Existing Estimates of Recent Tuberculosis Transmission in Settings of Sparse Data Collection.

    PubMed

    Kasaie, Parastu; Mathema, Barun; Kelton, W David; Azman, Andrew S; Pennington, Jeff; Dowdy, David W

    2015-01-01

    In any setting, a proportion of incident active tuberculosis (TB) reflects recent transmission ("recent transmission proportion"), whereas the remainder represents reactivation. Appropriately estimating the recent transmission proportion has important implications for local TB control, but existing approaches have known biases, especially where data are incomplete. We constructed a stochastic individual-based model of a TB epidemic and designed a set of simulations (derivation set) to develop two regression-based tools for estimating the recent transmission proportion from five inputs: underlying TB incidence, sampling coverage, study duration, clustered proportion of observed cases, and proportion of observed clusters in the sample. We tested these tools on a set of unrelated simulations (validation set), and compared their performance against that of the traditional 'n-1' approach. In the validation set, the regression tools reduced the absolute estimation bias (difference between estimated and true recent transmission proportion) in the 'n-1' technique by a median [interquartile range] of 60% [9%, 82%] and 69% [30%, 87%]. The bias in the 'n-1' model was highly sensitive to underlying levels of study coverage and duration, and substantially underestimated the recent transmission proportion in settings of incomplete data coverage. By contrast, the regression models' performance was more consistent across different epidemiological settings and study characteristics. We provide one of these regression models as a user-friendly, web-based tool. Novel tools can improve our ability to estimate the recent TB transmission proportion from data that are observable (or estimable) by public health practitioners with limited available molecular data.

  14. A Novel Tool Improves Existing Estimates of Recent Tuberculosis Transmission in Settings of Sparse Data Collection

    PubMed Central

    Kasaie, Parastu; Mathema, Barun; Kelton, W. David; Azman, Andrew S.; Pennington, Jeff; Dowdy, David W.

    2015-01-01

    In any setting, a proportion of incident active tuberculosis (TB) reflects recent transmission (“recent transmission proportion”), whereas the remainder represents reactivation. Appropriately estimating the recent transmission proportion has important implications for local TB control, but existing approaches have known biases, especially where data are incomplete. We constructed a stochastic individual-based model of a TB epidemic and designed a set of simulations (derivation set) to develop two regression-based tools for estimating the recent transmission proportion from five inputs: underlying TB incidence, sampling coverage, study duration, clustered proportion of observed cases, and proportion of observed clusters in the sample. We tested these tools on a set of unrelated simulations (validation set), and compared their performance against that of the traditional ‘n-1’ approach. In the validation set, the regression tools reduced the absolute estimation bias (difference between estimated and true recent transmission proportion) in the ‘n-1’ technique by a median [interquartile range] of 60% [9%, 82%] and 69% [30%, 87%]. The bias in the ‘n-1’ model was highly sensitive to underlying levels of study coverage and duration, and substantially underestimated the recent transmission proportion in settings of incomplete data coverage. By contrast, the regression models’ performance was more consistent across different epidemiological settings and study characteristics. We provide one of these regression models as a user-friendly, web-based tool. Novel tools can improve our ability to estimate the recent TB transmission proportion from data that are observable (or estimable) by public health practitioners with limited available molecular data. PMID:26679499
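    To make the contrast above concrete, here is a toy Python sketch; the 'n-1' rule (each genotype cluster of size k contributes k - 1 recently transmitted cases) is standard, while the regression coefficients are invented placeholders rather than the published model:

    ```python
    # Toy contrast between the traditional 'n-1' estimate and a regression-style
    # correction of the kind described above.
    def n_minus_1(cluster_sizes, n_total):
        """Each genotype cluster of size k contributes k - 1 'recent' cases."""
        return sum(k - 1 for k in cluster_sizes if k >= 2) / n_total

    def regression_adjusted(naive, incidence, coverage, years, beta):
        """Hypothetical linear correction using the study-level inputs."""
        b0, b1, b2, b3, b4 = beta
        return b0 + b1 * naive + b2 * incidence + b3 * coverage + b4 * years

    clusters = [5, 3, 2, 2]                      # observed cluster sizes
    naive = n_minus_1(clusters, n_total=60)      # 8/60 = 0.13; biased low when
    print(f"'n-1' estimate: {naive:.2f}")        # sampling coverage is incomplete
    adj = regression_adjusted(naive, incidence=100, coverage=0.6, years=3,
                              beta=(0.05, 1.1, 0.0002, -0.08, 0.004))
    print(f"regression-adjusted estimate: {adj:.2f}")
    ```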

  15. Case Studies of Software Development Tools for Parallel Architectures

    DTIC Science & Technology

    1993-06-01

    [Only table-of-contents and abstract fragments survive in the source record.] …autonomous entities, each with its own state and set of behaviors, as in simulation, tracking, or Battle Management. Because C2 applications are often… simulation, that is used to help the developer solve the problems. The new tool/problem solution matrix is structured in terms of the software development…

  16. Nutrition screening tools: an analysis of the evidence.

    PubMed

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and 4 tools (the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST)) received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  17. SECIMTools: a suite of metabolomics data analysis tools.

    PubMed

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high throughput technology for untargeted analysis of metabolites. Open access, easy to use, analytic tools that are broadly accessible to the biological community need to be developed. While technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modulated modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator LASSO and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.

  18. Designed tools for analysis of lithography patterns and nanostructures

    NASA Astrophysics Data System (ADS)

    Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann

    2017-03-01

    We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time consuming, requiring manual tuning and lacking robustness and user friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nano processes at the research and development level by accelerating the access to knowledge and hence speeding up the implementation in product lines.

  19. Tools for Supporting Distributed Agile Project Planning

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Maurer, Frank; Morgan, Robert; Oliveira, Josyleuda

    Agile project planning plays an important part in agile software development. In distributed settings, project planning is severely impacted by the lack of face-to-face communication and the inability to share paper index cards amongst all meeting participants. To address these issues, several distributed agile planning tools were developed. The tools vary in features, functions and running platforms. In this chapter, we first summarize the requirements for distributed agile planning. Then we give an overview of existing agile planning tools. We also evaluate existing tools based on tool requirements. Finally, we present some practical advice for both designers and users of distributed agile planning tools.

  20. Modeling of Principal Flank Wear: An Empirical Approach Combining the Effect of Tool, Environment and Workpiece Hardness

    NASA Astrophysics Data System (ADS)

    Mia, Mozammel; Al Bashir, Mahmood; Dhar, Nikhil Ranjan

    2016-10-01

    Hard turning is increasingly employed in machining, lately, to replace time-consuming conventional turning followed by grinding. An excessive amount of tool wear in hard turning is one of the main hurdles to be overcome. Many researchers have developed tool wear models, but most of them were developed for a particular work-tool-environment combination. No aggregate model has been developed that can be used to predict the amount of principal flank wear for a specific machining time. An empirical model of principal flank wear (VB) has been developed for different workpiece hardnesses (HRC40, HRC48 and HRC56) while turning by coated carbide insert with different configurations (SNMM and SNMG) under both dry and high-pressure coolant conditions. Unlike other developed models, this model includes the use of dummy variables along with the base empirical equation to entail the effect of any changes in the input conditions on the response. The base empirical equation for principal flank wear is formulated adopting the Exponential Associate Function using the experimental results. The coefficient of a dummy variable reflects the shifting of the response from one set of machining conditions to another, and is determined by simple linear regression. The independent cutting parameters (speed, feed rate, depth of cut) are kept constant while formulating and analyzing this model. The developed model is validated with different sets of machining responses in turning hardened medium carbon steel by coated carbide inserts. For any particular set, the model can be used to predict the amount of principal flank wear for a specific machining time. Since the predicted results exhibit good resemblance to experimental data and the average percentage error is <10 %, this model can be used to predict the principal flank wear for the stated conditions.
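    The model structure described can be sketched as follows; the symbols, baseline choices and shift terms are our hedged reconstruction from the abstract, with placeholder coefficients rather than the paper's fitted values:

    ```latex
    % Exponential Associate base curve for flank wear VB over machining time t,
    % shifted by dummy variables for hardness, insert geometry and environment.
    \[
    VB(t) \;=\; a\left(1 - e^{-t/b}\right)
      \;+\; \delta_H D_H \;+\; \delta_G D_G \;+\; \delta_E D_E,
    \qquad D_H, D_G, D_E \in \{0, 1\},
    \]
    ```

    where, for example, $D_H$ switches from an HRC40 baseline to a harder workpiece, $D_G$ from SNMM to SNMG insert configuration, and $D_E$ from dry cutting to high-pressure coolant; each shift coefficient $\delta$ is estimated by simple linear regression, as the abstract states.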

  1. A Tale of Two Regions: Landscape Ecological Planning for Shale Gas Energy Futures

    NASA Astrophysics Data System (ADS)

    Murtha, T., Jr.; Schroth, O.; Orland, B.; Goldberg, L.; Mazurczyk, T.

    2015-12-01

    As we increasingly embrace deep shale gas deposits to meet global energy demands, new and dispersed local and regional policy and planning challenges emerge. Even in regions with long histories of energy extraction, such as coal, shale gas and the infrastructure needed to produce the gas and transport it to market offer uniquely complex transformations in land use and land cover not previously experienced. These transformations are fast paced and dispersed and can overwhelm local and regional planning and regulatory processes. Coupled to these transformations is a structural confounding factor: while extraction and testing are carried out locally, regulation and decision-making are multilayered, often influenced by national and international factors. Using a geodesign framework, this paper applies a set of geospatial landscape ecological planning tools in two shale gas settings. First, we describe and detail a series of ongoing studies and tools that we have developed for communities in the Marcellus Shale region of the eastern United States, specifically the northern tier of Pennsylvania. Second, we apply a subset of these tools to potential gas development areas of the Fylde region in Lancashire, United Kingdom. For the past five years we have tested, applied and refined a set of place-based and data-driven geospatial models for forecasting, envisioning, analyzing and evaluating shale gas activities in northern Pennsylvania. These models are continuously compared to important landscape ecological planning challenges and priorities in the region, e.g. visual and cultural resource preservation. Adapting and applying these tools to a different landscape allows us not only to isolate and define important regulatory and policy exigencies in each specific setting, but also to develop and refine these models for broader application. As we continue to explore increasingly complex energy solutions globally, we need an equally complex comparative set of landscape ecological planning tools to inform policy, design and regional planning. Adapting tools and techniques developed in Pennsylvania, where shale gas extraction is ongoing, to Lancashire, where the industry is still in the exploratory phase, offers a key opportunity to test and refine more generalizable models.

  2. Action learning: a tool for the development of strategic skills for Nurse Consultants?

    PubMed

    Young, Sarah; Nixon, Eileen; Hinge, Denise; McFadyen, Jan; Wright, Vanessa; Lambert, Pauline; Pilkington, Carolyn; Newsome, Christine

    2010-01-01

    This paper will discuss the process of action learning and the outcomes of using action learning as a tool to achieve a more strategic function from Nurse Consultant posts. It is documented that one of the most challenging aspects of Nurse Consultant roles, in terms of leadership, is the strategic contribution they make at a senior corporate Trust level, often across organizations and local health economies. A facilitated action learning set was established in Brighton, England, to support the strategic leadership development of eight nurse consultant posts across two NHS Trusts. Benefits to patient care, with regard to patient pathways and cross-organizational working, have been evident outcomes associated with the nurse consultant posts involved in the action learning set. Commitment by organizational nurse leaders is essential to address the challenges facing nurse consultants in implementing change at strategic levels. The use of facilitated action learning has been a successful tool in developing the strategic skills of Nurse Consultant posts within this setting. Action learning sets may be successfully applied to a range of senior nursing posts with a strategic remit and may assist post holders in achieving better outcomes pertinent to their roles.

  3. CFD Multiphysics Tool

    NASA Technical Reports Server (NTRS)

    Perrell, Eric R.

    2005-01-01

    The recent bold initiatives to expand the human presence in space require innovative approaches to the design of propulsion systems whose underlying technology is not yet mature. The space propulsion community has identified a number of candidate concepts. A short list includes solar sails, high-energy-density chemical propellants, electric and electromagnetic accelerators, solar-thermal and nuclear-thermal expanders. For each of these, the underlying physics are relatively well understood. One could easily cite authoritative texts addressing both the governing equations and practical solution methods for, e.g., electromagnetic fields, heat transfer, radiation, thermophysics, structural dynamics, particulate kinematics, nuclear energy, power conversion, and fluid dynamics. One could also easily cite scholarly works in which complete equation sets for any one of these physical processes have been accurately solved relative to complex engineered systems. The Advanced Concepts and Analysis Office (ACAO), Space Transportation Directorate, NASA Marshall Space Flight Center, has recently released the first alpha version of a set of computer utilities for performing the applicable physical analyses relative to candidate deep-space propulsion systems such as those listed above. PARSEC, Preliminary Analysis of Revolutionary in-Space Engineering Concepts, enables rapid iterative calculations using several physics tools developed in-house. A complete cycle of the entire tool set takes about twenty minutes. PARSEC is a level-zero/level-one design tool. For PARSEC's proof-of-concept and preliminary design decision-making, assumptions that significantly simplify the governing equation sets are necessary. To proceed to level two, one wishes to retain modeling of the underlying physics as close as practical to known applicable first principles. This report describes results of collaboration between ACAO and Embry-Riddle Aeronautical University (ERAU) to begin building a set of level-two design tools for PARSEC. The "CFD Multiphysics Tool" will be the propulsive element of the tool set. The name acknowledges that space propulsion performance assessment is primarily a fluid mechanics problem. At the core of the CFD Multiphysics Tool is an open-source CFD code, HYP, under development at ERAU. ERAU is renowned for its undergraduate degree program in Aerospace Engineering, the largest in the nation. The strength of the program is its applications-oriented curriculum, which culminates in one of three two-course Engineering Design sequences: Aerospace Propulsion, Spacecraft, or Aircraft. This same philosophy applies to the HYP Project, albeit with fluid physics modeling commensurate with graduate research. HYP's purpose, like the Multiphysics Tool's, is to enable calculations of real (three-dimensional; geometrically complex; intended for hardware development) applications of high-speed and propulsive fluid flows.

  4. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  5. Front-line ordering clinicians: matching workforce to workload.

    PubMed

    Fieldston, Evan S; Zaoutis, Lisa B; Hicks, Patricia J; Kolb, Susan; Sladek, Erin; Geiger, Debra; Agosto, Paula M; Boswinkel, Jan P; Bell, Louis M

    2014-07-01

    Matching workforce to workload is particularly important in healthcare delivery, where an excess of workload for the available workforce may negatively impact processes and outcomes of patient care and resident learning. Hospitals currently lack a means to measure and match dynamic workload and workforce factors. This article describes our work to develop and obtain consensus for use of an objective tool to dynamically match the front-line ordering clinician (FLOC) workforce to clinical workload in a variety of inpatient settings. We undertook development of a tool to represent hospital workload and workforce based on literature reviews, discussions with clinical leadership, and repeated validation sessions. We met with physicians and nurses from every clinical care area of our large, urban children's hospital at least twice. We successfully created a tool in a matrix format that is objective and flexible and can be applied to a variety of settings. We presented the tool in 14 hospital divisions and received widespread acceptance among physician, nursing, and administrative leadership. The hospital uses the tool to identify gaps in FLOC coverage and guide staffing decisions. Hospitals can better match workload to workforce if they can define and measure these elements. The Care Model Matrix is a flexible, objective tool that quantifies the multidimensional aspects of workload and workforce. The tool, which uses multiple variables that are easily modifiable, can be adapted to a variety of settings. © 2014 Society of Hospital Medicine.

  6. The development of an observational screening tool to assess safe, effective and appropriate walking aid use in people with multiple sclerosis.

    PubMed

    Eitzen, Abby; Finlayson, Marcia; Carolan-Laing, Leanne; Nacionales, Arthur Junn; Walker, Christie; O'Connor, Josephine; Asano, Miho; Coote, Susan

    2017-08-01

    The purpose of this study was to identify potential items for an observational screening tool to assess safe, effective and appropriate walking aid use among people with multiple sclerosis (MS). Such a tool is needed because of the association between fall risk and mobility aid use in this population. Four individuals with MS were videotaped using one or two straight canes, crutches or a rollator in different settings. Seventeen health care professionals from Canada, Ireland and the United States were recruited; they viewed the videos and were then interviewed about the use of the devices by the individuals in the videos. Interview questions addressed safety, effectiveness and appropriateness of the device in the setting. Data were analyzed qualitatively. Coding consistency across raters was evaluated and confirmed. Nineteen codes were identified as possible items for the screening tool. The most frequent issues raised regardless of setting and device were "device used for duration/abandoned", "appropriate device", "balance and stability", "device technique", "environmental modification" and "hands free." With the identification of a number of potential tool items, researchers can now move forward with the development of the tool. This will involve consultation with both healthcare professionals and people with MS. Implications for rehabilitation: Falls among people with multiple sclerosis are associated with mobility device use, and use of multiple devices is associated with greater falls risk. The ability to assess for safe, effective and efficient use of walking aids is therefore important, yet no tools currently exist for this purpose. The codes arising from this study will be used to develop a screening tool for safe, effective and efficient walking aid use with the aim of reducing falls risk.

  7. Questioning context: a set of interdisciplinary questions for investigating contextual factors affecting health decision making

    PubMed Central

    Charise, Andrea; Witteman, Holly; Whyte, Sarah; Sutton, Erica J.; Bender, Jacqueline L.; Massimi, Michael; Stephens, Lindsay; Evans, Joshua; Logie, Carmen; Mirza, Raza M.; Elf, Marie

    2011-01-01

    Objective: To combine insights from multiple disciplines into a set of questions that can be used to investigate contextual factors affecting health decision making. Background: Decision-making processes and outcomes may be shaped by a range of non-medical or 'contextual' factors particular to an individual, including social, economic, political, geographical and institutional conditions. Research concerning contextual factors occurs across many disciplines and theoretical domains, but few conceptual tools have attempted to integrate and translate this wide-ranging research for health decision-making purposes. Methods: To formulate this tool we employed an iterative, collaborative process of scenario development and question generation. Five hypothetical health decision-making scenarios (preventative, screening, curative, supportive and palliative) were developed and used to generate a set of exploratory questions that aim to highlight potential contextual factors across a range of health decisions. Findings: We present an exploratory tool consisting of questions organized into four thematic domains (Bodies, Technologies, Place and Work; BTPW) articulating wide-ranging contextual factors relevant to health decision making. The BTPW tool encompasses health-related scholarship and research from a range of disciplines pertinent to health decision making, and identifies concrete points of intersection between its four thematic domains. Examples of the practical application of the questions are also provided. Conclusions: These exploratory questions provide an interdisciplinary toolkit for identifying the complex contextual factors affecting decision making. The set of questions comprised by the BTPW tool may be applied wholly or partially in the context of clinical practice, policy development and health-related research. PMID:21029277

  8. Improving Escalation of Care: Development and Validation of the Quality of Information Transfer Tool.

    PubMed

    Johnston, Maximilian J; Arora, Sonal; Pucher, Philip H; Reissis, Yannis; Hull, Louise; Huddy, Jeremy R; King, Dominic; Darzi, Ara

    2016-03-01

    To develop and provide validity and feasibility evidence for the QUality of Information Transfer (QUIT) tool. Prompt escalation of care in the setting of patient deterioration can prevent further harm. Escalation and information transfer skills are not currently measured in surgery. This study comprised 3 phases: the development (phase 1), validation (phase 2), and feasibility analysis (phase 3) of the QUIT tool. Phase 1 involved identification of core skills needed for successful escalation of care through literature review and 33 semistructured interviews with stakeholders. Phase 2 involved the generation of validity evidence for the tool using a simulated setting. Thirty surgeons assessed a deteriorating postoperative patient in a simulated ward and escalated their care to a senior colleague. The face and content validity were assessed using a survey. Construct and concurrent validity of the tool were determined by comparing performance scores using the QUIT tool with those measured using the Situation-Background-Assessment-Recommendation (SBAR) tool. Phase 3 was conducted using direct observation of escalation scenarios on surgical wards in 2 hospitals. A 7-category assessment tool was developed from phase 1 consisting of 24 items. Twenty-one of 24 items had excellent content validity (content validity index >0.8). All 7 categories and 18 of 24 (P < 0.05) items demonstrated construct validity. The correlation between the QUIT and SBAR tools used was strong indicating concurrent validity (r = 0.694, P < 0.001). Real-time scoring of escalation referrals was feasible and indicated that doctors currently have better information transfer skills than nurses when faced with a deteriorating patient. A validated tool to assess information transfer for deteriorating surgical patients was developed and tested using simulation and real-time clinical scenarios. It may improve the quality and safety of patient care on the surgical ward.

  9. Conflict Resolution and Children's Behaviour: Observing and Understanding Social and Cooperative Play in Early Years Educational Settings

    ERIC Educational Resources Information Center

    Broadhead, Pat

    2009-01-01

    This paper draws from continuing research into the growth of sociability and cooperation in young children. It began in the mid-1980s and has continued periodically in a range of early years educational settings across the 3-6 age range. The research has underpinned the development of an observational tool. This tool--the Social Play Continuum or…

  10. Usability testing of a Falls Prevention Tool Kit for an inpatient acute care setting.

    PubMed

    Goldsmith, Denise; Zuyev, Lyubov; Benoit, Angie; Chang, Frank Y; Horsky, Jan; Dykes, Patricia

    2009-01-01

    Efforts to prevent falls in the hospital setting involve identifying patients at risk of falling and implementing fall prevention strategies. This poster describes the method and results of Performance Usability Testing on a web-based Fall Prevention Tool Kit (FPTK) developed as part of a research study (Falls TIPS: Tailoring Interventions for Patient Safety) funded by The Robert Wood Johnson Foundation.

  11. The Individual Basic Facts Assessment Tool

    ERIC Educational Resources Information Center

    Tait-McCutcheon, Sandi; Drake, Michael

    2015-01-01

    There is an identified and growing need for a levelled diagnostic basic facts assessment tool that provides teachers with formative information about students' mastery of a broad range of basic fact sets. The Individual Basic Facts Assessment tool has been iteratively and cumulatively developed, trialled, and refined with input from teachers and…

  12. Customer-experienced rapid prototyping

    NASA Astrophysics Data System (ADS)

    Zhang, Lijuan; Zhang, Fu; Li, Anbo

    2008-12-01

    In order to describe GIS requirements accurately and comprehend them quickly, this article integrates the ideas of QFD (Quality Function Deployment) and UML (Unified Modeling Language), analyzes the deficiencies of the prototype development model, and proposes Customer-Experienced Rapid Prototyping (CE-RP), describing its process and framework in detail from the perspective of the characteristics of modern GIS. The CE-RP is mainly composed of Customer Tool-Sets (CTS), Developer Tool-Sets (DTS) and a Barrier-Free Semantic Interpreter (BF-SI), and is performed by two roles: customer and developer. The main purpose of the CE-RP is to produce unified and authorized requirements data models shared between customer and software developer.

  13. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
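    Although LFQbench itself is an R package, its core idea is easy to sketch: in a hybrid proteome sample each species is spiked at a known ratio, so accuracy is the deviation of observed log-ratios from the expected value and precision is their spread. The Python sketch below uses fabricated data and our own variable names:

    ```python
    import numpy as np

    expected_log2 = {"HUMAN": 0.0, "YEAST": 1.0, "ECOLI": -2.0}  # known spike-in ratios

    def lfq_metrics(log2_ratios_by_species):
        """Per species: accuracy = median deviation from the expected log-ratio,
        precision = sample standard deviation of the observed log-ratios."""
        out = {}
        for species, ratios in log2_ratios_by_species.items():
            r = np.asarray(ratios)
            out[species] = {"accuracy_bias": float(np.median(r) - expected_log2[species]),
                            "precision_sd": float(np.std(r, ddof=1))}
        return out

    rng = np.random.default_rng(1)
    observed = {s: rng.normal(mu, 0.25, size=500)   # one tool's quantified proteins
                for s, mu in expected_log2.items()}
    print(lfq_metrics(observed))
    ```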

  14. Knowledge-acquisition tools for medical knowledge-based systems.

    PubMed

    Lanzola, G; Quaglini, S; Stefanelli, M

    1995-03-01

    Knowledge-based systems (KBS) have been proposed to solve a large variety of medical problems. A strategic issue for KBS development and maintenance is the effort required from both knowledge engineers and domain experts. The proposed solution is building efficient knowledge acquisition (KA) tools. This paper presents a set of KA tools we are developing within a European Project called GAMES II. They have been designed after the formulation of an epistemological model of medical reasoning. The main goal is that of developing a computational framework which allows knowledge engineers and domain experts to interact cooperatively in developing a medical KBS. To this aim, a set of reusable software components is highly recommended. Their design was facilitated by the development of a methodology for KBS construction. It views this process as comprising two activities: the tailoring of the epistemological model to the specific medical task to be executed, and the subsequent translation of this model into a computational architecture so that the connections between computational structures and their knowledge-level counterparts are maintained. The KA tools we developed are illustrated taking examples from the behavior of a KBS we are building for the management of children with acute myeloid leukemia.

  15. User’s guide for the Delaware River Basin Streamflow Estimator Tool (DRB-SET)

    USGS Publications Warehouse

    Stuckey, Marla H.; Ulrich, James E.

    2016-06-09

    Introduction: The Delaware River Basin Streamflow Estimator Tool (DRB-SET) is a tool for the simulation of streamflow at a daily time step for an ungaged stream location in the Delaware River Basin. DRB-SET was developed by the U.S. Geological Survey (USGS) and funded through WaterSMART as part of the National Water Census, a USGS research program on national water availability and use that develops new water accounting tools and assesses water availability at the regional and national scales. DRB-SET relates probability exceedances at a gaged location to those at an ungaged stream location. Once the ungaged stream location has been identified by the user, an appropriate streamgage is automatically selected in DRB-SET using streamflow correlation (the map correlation method). Alternatively, the user can manually select a different streamgage or use the closest streamgage. A report file is generated documenting the reference streamgage and ungaged stream location information, basin characteristics, any warnings, baseline (minimally altered) and altered (affected by regulation, diversion, mining, or other anthropogenic activities) daily mean streamflow, and the mean and median streamflow. The estimated daily flows for the ungaged stream location can be easily exported as a text file that can be used as input to a statistical software package to determine additional streamflow statistics, such as flow duration exceedance or streamflow frequency statistics.
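    A rough sketch of the two steps described above, reference-streamgage selection by flow correlation followed by transfer through exceedance matching; this is our paraphrase of the general approach using NumPy, not the USGS implementation, and the gage names and flow values are invented:

    ```python
    import numpy as np

    def pick_reference(candidate_flows: dict, site_flows: np.ndarray) -> str:
        """Return the candidate gage whose daily flows correlate best with
        the (short) concurrent record available at the ungaged site."""
        return max(candidate_flows,
                   key=lambda g: np.corrcoef(candidate_flows[g], site_flows)[0, 1])

    def transfer_by_exceedance(gage_flows, fdc_probs, fdc_flows):
        """Map each gaged day to its exceedance probability, then read the
        corresponding flow off the ungaged site's flow-duration curve.
        fdc_probs must be increasing (e.g. 0.01 ... 0.99)."""
        ranks = gage_flows.argsort().argsort()          # 0 = lowest flow
        exceed = 1.0 - (ranks + 0.5) / len(gage_flows)  # high flow -> low exceedance
        return np.interp(exceed, fdc_probs, fdc_flows)

    site = np.array([3.0, 9.0, 5.0])                 # concurrent record, ungaged site
    gages = {"gage_A": np.array([12.0, 30.0, 18.0]),
             "gage_B": np.array([10.0, 14.0, 40.0])}
    ref = pick_reference(gages, site)                # "gage_A" tracks the site best
    daily = transfer_by_exceedance(gages[ref],
                                   fdc_probs=np.array([0.1, 0.5, 0.9]),
                                   fdc_flows=np.array([8.0, 4.0, 1.5]))
    print(ref, daily)
    ```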

  16. ISAAC - InterSpecies Analysing Application using Containers.

    PubMed

    Baier, Herbert; Schultz, Jörg

    2014-01-15

    Information about genes, transcripts and proteins is spread over a wide variety of databases. Different tools have been developed using these databases to identify biological signals in gene lists from large-scale analyses. Mostly, they search for enrichments of specific features. However, these tools do not allow an explorative walk through different views or modification of the gene lists as new questions arise. To fill this niche, we have developed ISAAC, the InterSpecies Analysing Application using Containers. The central idea of this web-based tool is to enable the analysis of sets of genes, transcripts and proteins under different biological viewpoints and to interactively modify these sets at any point of the analysis. Detailed history and snapshot information allows tracing each action. Furthermore, one can easily switch back to previous states and perform new analyses. Currently, sets can be viewed in the context of genomes, protein functions, protein interactions, pathways, regulation, diseases and drugs. Additionally, users can switch between species with an automatic, orthology-based translation of existing gene sets. As today's research is usually performed in larger teams and consortia, ISAAC provides group-based functionalities. Here, sets as well as results of analyses can be exchanged between members of groups. ISAAC fills the gap between primary databases and tools for the analysis of large gene lists. With its highly modular, JavaEE-based design, the implementation of new modules is straightforward. Furthermore, ISAAC comes with an extensive web-based administration interface including tools for the integration of third-party data. Thus, a local installation is easily feasible. In summary, ISAAC is tailor-made for highly explorative, interactive analyses of gene, transcript and protein sets in a collaborative environment.
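    The container-with-history idea described above can be illustrated with a toy sketch; this is not ISAAC's implementation, just a minimal model of sets whose modifications are recorded and reversible:

    ```python
    # Toy model: a gene set whose every modification is recorded, so any
    # earlier snapshot can be restored and analysed under a different view.
    class SetContainer:
        def __init__(self, genes):
            self.history = [("initial", frozenset(genes))]

        @property
        def current(self):
            return self.history[-1][1]

        def modify(self, action, add=(), remove=()):
            new = (self.current | frozenset(add)) - frozenset(remove)
            self.history.append((action, new))

        def rollback(self, index):
            """Branch from an earlier snapshot without losing the trace."""
            self.history.append((f"rollback->{index}", self.history[index][1]))

    s = SetContainer({"TP53", "BRCA1", "EGFR"})
    s.modify("drop non-expressed", remove={"EGFR"})
    s.rollback(0)                      # back to the initial set
    print([a for a, _ in s.history])   # full audit trail of the analysis
    ```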

  17. Framing quality improvement tools and techniques in healthcare: the case of Improvement Leaders' Guides.

    PubMed

    Millar, Ross

    2013-01-01

    The purpose of this paper is to present a study of how quality improvement tools and techniques are framed within healthcare settings. The paper employs an interpretive approach to understand how quality improvement tools and techniques are mobilised and legitimated. It does so using a case study of the NHS Modernisation Agency Improvement Leaders' Guides in England. Improvement Leaders' Guides were framed within a service improvement approach encouraging the use of quality improvement tools and techniques within healthcare settings. Their use formed part of enacting tools and techniques across different contexts. Whilst this enactment was believed to support the mobilisation of tools and techniques, the experience also illustrated the challenges in distributing such approaches. The paper provides an important contribution in furthering our understanding of framing the "social act" of quality improvement. Given the ongoing emphasis on quality improvement in health systems and the persistent challenges involved, it also provides important information for healthcare leaders globally in seeking to develop, implement or modify similar tools and distribute leadership within health and social care settings.

  18. ProteoWizard: open source software for rapid proteomics tools development.

    PubMed

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers of the mzML data format, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge, at http://proteowizard.sourceforge.net. This website also provides code examples, and documentation. It is our hope the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.

  19. Sharing My Music with You: The Musical Presentation as a Tool for Exploring, Examining and Enhancing Self-Awareness in a Group Setting

    ERIC Educational Resources Information Center

    Bensimon, Moshe; Amir, Dorit

    2010-01-01

    Musical presentation (MP) is a diagnostic and therapeutic music therapy tool which focuses on the participant's emotional exploration and awareness-insight development. Using this tool people present themselves through music of their choice and subsequently receive feedback from their peers. This study investigates MP as a tool for enhancing…

  20. Development of a simulation evaluation tool for assessing nursing students' clinical judgment in caring for children with dehydration.

    PubMed

    Kim, Shin-Jeong; Kim, Sunghee; Kang, Kyung-Ah; Oh, Jina; Lee, Myung-Nam

    2016-02-01

    The lack of reliable and valid tools to evaluate learning outcomes during simulations has limited the adoption and progress of simulation-based nursing education. This study had two aims: (a) to develop a simulation evaluation tool (SET(c-dehydration)) to assess students' clinical judgment in caring for children with dehydration based on the Lasater Clinical Judgment Rubric (LCJR) and (b) to examine its reliability and validity. Undergraduate nursing students from two nursing schools in South Korea participated in this study from March 3 through June 10, 2014. The SET(c-dehydration) was developed, and 120 nursing students' clinical judgment was evaluated. Descriptive statistics, Cronbach's alpha, Cohen's kappa coefficient, and confirmatory factor analysis (CFA) were used to analyze the data. A 41-item version of the SET(c-dehydration) with three subscales was developed. Cohen's kappa (measuring inter-observer reliability) of the sessions ranged from .73 to .95, and Cronbach's alpha was .87. The mean total rating of the SET(c-dehydration) by the instructors was 1.92 (±.25), and the mean scores for the four LCJR dimensions of clinical judgment were as follows: noticing (1.74±.27), interpreting (1.85±.43), responding (2.17±.32), and reflecting (1.79±.35). CFA, which was performed to test construct validity, showed that the four dimensions of the SET(c-dehydration) formed an appropriate framework. The SET(c-dehydration) provides a means to evaluate clinical judgment in simulation education. Its reliability and validity should be examined further. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Two-Photon Excitation, Fluorescence Microscopy, and Quantitative Measurement of Two-Photon Absorption Cross Sections

    NASA Astrophysics Data System (ADS)

    DeArmond, Fredrick Michael

    As optical microscopy techniques continue to improve, most notably the development of super-resolution optical microscopy, which garnered the Nobel Prize in Chemistry in 2014, renewed emphasis has been placed on the development and use of fluorescence microscopy techniques. Of particular note is a renewed interest in multiphoton excitation due to a number of inherent properties of the technique, including simplified optical filtering, increased sample penetration, and inherently confocal operation. With this renewed interest in multiphoton fluorescence microscopy comes an increased demand for robust non-linear fluorescent markers and characterization of the associated tool set. These factors have led to an experimental setup that allows a systematized approach for identifying and characterizing properties of fluorescent probes, in the hope that the tool set will provide researchers with additional information to guide their efforts in developing novel fluorophores suitable for use in advanced optical microscopy techniques, as well as identifying trends for their synthesis. Hardware was set up around a previously developed software control system. Three experimental tool sets were set up, characterized, and applied over the course of this work: a scanning multiphoton fluorescence microscope with single-molecule sensitivity, an interferometric autocorrelator for precise determination of the bandwidth and pulse width of the ultrafast Titanium-Sapphire excitation source, and a simplified fluorescence microscope for the measurement of two-photon absorption cross sections. Resulting values for two-photon absorption cross sections and two-photon absorption action cross sections for two standardized fluorophores, four commercially available fluorophores, and ten novel fluorophores are presented, as well as absorption and emission spectra.

  2. A Clinical Tool for the Prediction of Venous Thromboembolism in Pediatric Trauma Patients.

    PubMed

    Connelly, Christopher R; Laird, Amy; Barton, Jeffrey S; Fischer, Peter E; Krishnaswami, Sanjay; Schreiber, Martin A; Zonies, David H; Watters, Jennifer M

    2016-01-01

    Although rare, the incidence of venous thromboembolism (VTE) in pediatric trauma patients is increasing, and the consequences of VTE in children are significant. Studies have demonstrated increasing VTE risk in older pediatric trauma patients and improved VTE rates with institutional interventions. While national evidence-based guidelines for VTE screening and prevention are in place for adults, none exist for pediatric patients, to our knowledge. To develop a risk prediction calculator for VTE in children admitted to the hospital after traumatic injury to assist efforts in developing screening and prophylaxis guidelines for this population. Retrospective review of 536,423 pediatric patients 0 to 17 years old using the National Trauma Data Bank from January 1, 2007, to December 31, 2012. Five mixed-effects logistic regression models of varying complexity were fit on a training data set. Model validity was determined by comparison of the area under the receiver operating characteristic curve (AUROC) for the training and validation data sets from the original model fit. A clinical tool to predict the risk of VTE based on individual patient clinical characteristics was developed from the optimal model. Diagnosis of VTE during hospital admission. Venous thromboembolism was diagnosed in 1141 of 536,423 children (overall rate, 0.2%). The AUROCs in the training data set were high (range, 0.873-0.946) for each model, with minimal AUROC attenuation in the validation data set. A prediction tool was developed from a model that achieved a balance of high performance (AUROCs, 0.945 and 0.932 in the training and validation data sets, respectively; P = .048) and parsimony. Points are assigned to each variable considered (Glasgow Coma Scale score, age, sex, intensive care unit admission, intubation, transfusion of blood products, central venous catheter placement, presence of pelvic or lower extremity fractures, and major surgery), and the points total is converted to a VTE risk score. The predicted risk of VTE ranged from 0.0% to 14.4%. We developed a simple clinical tool to predict the risk of developing VTE in pediatric trauma patients. It is based on a model created using a large national database and was internally validated. The clinical tool requires external validation but provides an initial step toward the development of the specific VTE protocols for pediatric trauma patients.
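
    The abstract describes a common recipe: fit a logistic model, compare AUROC on training and validation splits, then convert coefficients into integer points. A simplified sketch follows, using synthetic data and plain rather than mixed-effects logistic regression; the predictors stand in for binary variables such as ICU admission or intubation and are not the NTDB data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for registry data: six binary predictors
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(5000, 6)).astype(float)
logit = -6.0 + X @ np.array([1.2, 0.9, 0.8, 1.1, 0.6, 0.7])
y = rng.random(5000) < 1 / (1 + np.exp(-logit))  # rare outcome

X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)

# Internal validation: check for AUROC attenuation on held-out data
print("train AUROC", roc_auc_score(y_tr, model.predict_proba(X_tr)[:, 1]))
print("valid AUROC", roc_auc_score(y_va, model.predict_proba(X_va)[:, 1]))

# One common way to derive a bedside score: scale coefficients to integers
points = np.round(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)
print("points per predictor:", points)
```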

  3. ExAtlas: An interactive online tool for meta-analysis of gene expression data.

    PubMed

    Sharov, Alexei A; Schlessinger, David; Ko, Minoru S H

    2015-12-01

    We have developed ExAtlas, an on-line software tool for meta-analysis and visualization of gene expression data. In contrast to existing software tools, ExAtlas compares multi-component data sets and generates results for all combinations (e.g. all gene expression profiles versus all Gene Ontology annotations). ExAtlas handles both users' own data and data extracted semi-automatically from the public repository (GEO/NCBI database). ExAtlas provides a variety of tools for meta-analyses: (1) standard meta-analysis (fixed effects, random effects, z-score, and Fisher's methods); (2) analyses of global correlations between gene expression data sets; (3) gene set enrichment; (4) gene set overlap; (5) gene association by expression profile; (6) gene specificity; and (7) statistical analysis (ANOVA, pairwise comparison, and PCA). ExAtlas produces graphical outputs, including heatmaps, scatter-plots, bar-charts, and three-dimensional images. Some of the most widely used public data sets (e.g. GNF/BioGPS, Gene Ontology, KEGG, GAD phenotypes, BrainScan, ENCODE ChIP-seq, and protein-protein interaction) are pre-loaded and can be used for functional annotations.
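
    Two of the standard meta-analysis methods listed above, Fisher's method and the z-score (Stouffer) method, combine per-study p-values directly; fixed- and random-effects models additionally require effect sizes and variances. A minimal sketch with hypothetical p-values:

```python
import numpy as np
from scipy.stats import combine_pvalues

# Hypothetical per-study p-values for one gene across four data sets
pvals = np.array([0.04, 0.20, 0.01, 0.08])

stat_f, p_fisher = combine_pvalues(pvals, method="fisher")      # Fisher's method
stat_s, p_stouffer = combine_pvalues(pvals, method="stouffer")  # z-score method
print(p_fisher, p_stouffer)
```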

  4. [The Italian instrument evaluating the nursing students clinical learning quality].

    PubMed

    Palese, Alvisa; Grassetti, Luca; Mansutti, Irene; Destrebecq, Anne; Terzoni, Stefano; Altini, Pietro; Bevilacqua, Anita; Brugnolli, Anna; Benaglio, Carla; Dal Ponte, Adriana; De Biasio, Laura; Dimonte, Valerio; Gambacorti, Benedetta; Fasci, Adriana; Grosso, Silvia; Mantovan, Franco; Marognolli, Oliva; Montalti, Sandra; Nicotera, Raffaela; Randon, Giulia; Stampfl, Brigitte; Tollini, Morena; Canzan, Federica; Saiani, Luisa; Zannini, Lucia

    2017-01-01

    The Clinical Learning Quality Evaluation Index for nursing students. Italian nursing programs need tools to evaluate the quality of clinical learning as perceived by nursing students. Several tools already exist; however, their limitations suggested the need to develop a new one. A national project was therefore undertaken to develop and validate a new instrument capable of measuring the quality of clinical learning as experienced by nursing students. A validation study design was undertaken from 2015 to 2016. All national nursing programs (n=43) were invited to participate by including all nursing students regularly attending their clinical placements. The tool, developed on the basis of (a) the literature, (b) validated tools already established among other healthcare professionals, and (c) consensus among experts and nursing students, was administered to the eligible students. A total of 9606 nursing students in 27 universities (62.8%) participated. The psychometric properties of the new instrument ranged from good to excellent. According to the findings, the tool consists of 22 items and five factors: (a) quality of the tutorial strategies; (b) learning opportunities; (c) safety and nursing care quality; (d) self-directed learning; and (e) quality of the learning environment. The tool is already in use. Its systematic adoption may support comparisons among settings and across different programs; moreover, it may also support the accreditation of new settings and the measurement of the effects of strategies aimed at improving the quality of clinical learning.

  5. What We Know and Don't Know About Measuring Quality in Early Childhood and School-Age Care and Education Settings. OPRE Issue Brief #1. Publication #2009-12

    ERIC Educational Resources Information Center

    Child Trends, 2009

    2009-01-01

    Measures assessing the quality of children's environments and interactions in nonparental care settings were developed originally for use in child care research and as self-assessment tools for practitioners. Within the last decade, these measurement tools have moved into the public policy arena, where they are now used to make programmatic…

  6. Integrating the hospital library with patient care, teaching and research: model and Web 2.0 tools to create a social and collaborative community of clinical research in a hospital setting.

    PubMed

    Montano, Blanca San José; Garcia Carretero, Rafael; Varela Entrecanales, Manuel; Pozuelo, Paz Martin

    2010-09-01

    Research in hospital settings faces several difficulties. Information technologies and certain Web 2.0 tools may provide new models to tackle these problems, allowing for a collaborative approach and bridging the gap between clinical practice, teaching and research. We aim to gather a community of researchers involved in the development of a network of learning and investigation resources in a hospital setting. A multi-disciplinary work group analysed the needs of the research community. We studied the opportunities provided by Web 2.0 tools and finally we defined the spaces that would be developed, describing their elements, members and different access levels. WIKINVESTIGACION is a collaborative web space with the aim of integrating the management of all the hospital's teaching and research resources. It is composed of five spaces, with different access privileges. The spaces are: Research Group Space 'wiki for each individual research group', Learning Resources Centre devoted to the Library, News Space, Forum and Repositories. The Internet, and most notably the Web 2.0 movement, is introducing some overwhelming changes in our society. Research and teaching in the hospital setting will join this current and take advantage of these tools to socialise and improve knowledge management.

  7. MiniWall Tool for Analyzing CFD and Wind Tunnel Large Data Sets

    NASA Technical Reports Server (NTRS)

    Schuh, Michael J.; Melton, John E.; Stremel, Paul M.

    2017-01-01

    It is challenging to review and assimilate large data sets created by Computational Fluid Dynamics (CFD) simulations and wind tunnel tests. Over the past 10 years, NASA Ames Research Center has developed and refined a software tool dubbed the MiniWall to increase productivity in reviewing and understanding large CFD-generated data sets. Under the recent NASA ERA project, the application of the tool expanded to enable rapid comparison of experimental and computational data. The MiniWall software is browser based so that it runs on any computer or device that can display a web page. It can also be used remotely and securely by using web server software such as the Apache HTTP server. The MiniWall software has recently been rewritten and enhanced to make it even easier for analysts to review large data sets and extract knowledge and understanding from these data sets. This paper describes the MiniWall software and demonstrates how the different features are used to review and assimilate large data sets.

  8. MiniWall Tool for Analyzing CFD and Wind Tunnel Large Data Sets

    NASA Technical Reports Server (NTRS)

    Schuh, Michael J.; Melton, John E.; Stremel, Paul M.

    2017-01-01

    It is challenging to review and assimilate large data sets created by Computational Fluid Dynamics (CFD) simulations and wind tunnel tests. Over the past 10 years, NASA Ames Research Center has developed and refined a software tool dubbed the "MiniWall" to increase productivity in reviewing and understanding large CFD-generated data sets. Under the recent NASA ERA project, the application of the tool expanded to enable rapid comparison of experimental and computational data. The MiniWall software is browser based so that it runs on any computer or device that can display a web page. It can also be used remotely and securely by using web server software such as the Apache HTTP Server. The MiniWall software has recently been rewritten and enhanced to make it even easier for analysts to review large data sets and extract knowledge and understanding from these data sets. This paper describes the MiniWall software and demonstrates how the different features are used to review and assimilate large data sets.

  9. The scope of cell phones in diabetes management in developing country health care settings.

    PubMed

    Ajay, Vamadevan S; Prabhakaran, Dorairaj

    2011-05-01

    Diabetes has emerged as a major public health concern in developing nations. Health systems in most developing countries are yet to integrate effective prevention and control programs for diabetes into routine health care services. Given the inadequate human resources and underfunctioning health systems, we need novel and innovative approaches to combat diabetes in developing-country settings. In this regard, the tremendous advances in telecommunication technology, particularly cell phones, can be harnessed to improve diabetes care. Cell phones could serve as a tool for collecting information on surveillance, service delivery, evidence-based care, management, and supply systems pertaining to diabetes from primary care settings in addition to providing health messages as part of diabetes education. As a screening/diagnostic tool for diabetes, cell phones can aid the health workers in undertaking screening and diagnostic and follow-up care for diabetes in the community. Cell phones are also capable of acting as a vehicle for continuing medical education; a decision support system for evidence-based management; and a tool for patient education, self-management, and compliance. However, for widespread use, we need robust evaluations of cell phone applications in existing practices and appropriate interventions in diabetes. © 2011 Diabetes Technology Society.

  10. The Scope of Cell Phones in Diabetes Management in Developing Country Health Care Settings

    PubMed Central

    Ajay, Vamadevan S; Prabhakaran, Dorairaj

    2011-01-01

    Diabetes has emerged as a major public health concern in developing nations. Health systems in most developing countries are yet to integrate effective prevention and control programs for diabetes into routine health care services. Given the inadequate human resources and underfunctioning health systems, we need novel and innovative approaches to combat diabetes in developing-country settings. In this regard, the tremendous advances in telecommunication technology, particularly cell phones, can be harnessed to improve diabetes care. Cell phones could serve as a tool for collecting information on surveillance, service delivery, evidence-based care, management, and supply systems pertaining to diabetes from primary care settings in addition to providing health messages as part of diabetes education. As a screening/diagnostic tool for diabetes, cell phones can aid the health workers in undertaking screening and diagnostic and follow-up care for diabetes in the community. Cell phones are also capable of acting as a vehicle for continuing medical education; a decision support system for evidence-based management; and a tool for patient education, self-management, and compliance. However, for widespread use, we need robust evaluations of cell phone applications in existing practices and appropriate interventions in diabetes. PMID:21722593

  11. A Program Evaluation Tool for Dual Enrollment Transition Programs

    ERIC Educational Resources Information Center

    Grigal, Meg; Dwyre, Amy; Emmett, Joyce; Emmett, Richard

    2012-01-01

    This article describes the development and use of a program evaluation tool designed to support self-assessment of college-based dual enrollment transition programs serving students with intellectual disabilities between the ages of 18-21 in college settings. The authors describe the need for such an evaluation tool, outline the areas addressed by…

  12. Development of spatial data guidelines and standards: spatial data set documentation to support hydrologic analysis in the U.S. Geological Survey

    USGS Publications Warehouse

    Fulton, James L.

    1992-01-01

    Spatial data analysis has become an integral component in many surface and sub-surface hydrologic investigations within the U.S. Geological Survey (USGS). Currently, one of the largest costs in applying spatial data analysis is the cost of developing the needed spatial data. Guidelines and standards are therefore required for the development of spatial data in order to allow for data sharing and reuse, eliminating costly redevelopment. To attain this goal, the USGS is expanding efforts to identify guidelines and standards for the development of spatial data for hydrologic analysis. Because of the variety of project and database needs, the USGS has concentrated on developing standards for documenting spatial data sets to aid in the assessment of data set quality and the compatibility of different data sets. An interim data set documentation standard (1990) has been developed that provides a mechanism for associating a wide variety of information with a data set, including data about source material, data automation and editing procedures used, projection parameters, data statistics, descriptions of features and feature attributes, information on organizational contacts, lists of operations performed on the data, and free-form comments and notes about the data, made at various times in the evolution of the data set. The interim data set documentation standard has been automated using a commercial geographic information system (GIS) and data set documentation software developed by the USGS. Where possible, USGS-developed software is used to enter data into the data set documentation file automatically. The GIS software closely associates a data set with its data set documentation file; the documentation file is retained with the data set whenever it is modified, copied, or transferred to another computer system. The Water Resources Division of the USGS is continuing to develop spatial data and data processing standards, with emphasis on standards needed to support hydrologic analysis, hydrologic data processing, and publication of hydrologic thematic maps. There is a need for the GIS vendor community to develop data set documentation tools similar to those developed by the USGS, or to incorporate USGS-developed tools in their software.

  13. Decision Tools: What To Consider When Partnering for Learnware = Outils de decision: Facteurs a considerer dans la mise en place de partenariats pour les technologies d'apprentissage.

    ERIC Educational Resources Information Center

    Stahmer, Anna; Green, Lyndsay

    This report provides a set of decision tools for learnware developers in private companies, public organizations, and education institutions to use in developing strategic alliances or partnerships for the development, delivery, and marketing of learnware products and services designed to meet Canadians' lifelong learning needs. The report…

  14. Earth-Science Data Co-Locating Tool

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Pan, Lei; Block, Gary L.

    2012-01-01

    This software is used to locate Earth-science satellite data and climate-model analysis outputs in space and time, enabling direct comparison of any set of data with different spatial and temporal resolutions. It is written as three modules that are clearly separated by functionality and interface, which enables rapid development of support for any new data set. In this updated version of the tool, several new front ends have been developed for new products. The software finds co-locatable data pairs for given sets of data products and creates new data products that share the same spatial and temporal coordinates. This facilitates direct comparison between two heterogeneous datasets and the comprehensive and synergistic use of the datasets.
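
    The core matching step of such a co-locating tool can be sketched as a search for observation pairs that fall within time and distance tolerances. The tolerances and the haversine-based distance below are illustrative assumptions, not the tool's actual algorithm.

```python
import numpy as np

def colocate(a_time, a_lat, a_lon, b_time, b_lat, b_lon,
             dt_max=1800.0, dist_max_km=50.0):
    """Return index pairs (i, j) where observation a[i] and b[j] fall
    within the given time (s) and great-circle distance (km) tolerances."""
    pairs = []
    for i in range(len(a_time)):
        dt = np.abs(np.asarray(b_time) - a_time[i])
        # haversine distance from a[i] to every b observation
        lat1, lon1 = np.radians(a_lat[i]), np.radians(a_lon[i])
        lat2, lon2 = np.radians(b_lat), np.radians(b_lon)
        h = (np.sin((lat2 - lat1) / 2) ** 2
             + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
        dist = 2 * 6371.0 * np.arcsin(np.sqrt(h))
        for j in np.where((dt <= dt_max) & (dist <= dist_max_km))[0]:
            pairs.append((i, int(j)))
    return pairs
```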

  15. Allied health clinicians using translational research in action to develop a reliable stroke audit tool.

    PubMed

    Abery, Philip; Kuys, Suzanne; Lynch, Mary; Low Choy, Nancy

    2018-05-23

    To design and establish the reliability of a local stroke audit tool by engaging allied health clinicians within a privately funded hospital. Design: a two-stage study involving a modified Delphi process to inform stroke audit tool development, followed by inter-tester reliability testing. Participants: allied health clinicians. Methods: a modified Delphi process to select stroke guideline recommendations for inclusion in the audit tool. Reliability study: one allied health representative from each discipline audited 10 clinical records with sequential admissions to acute and rehabilitation services. Recommendations were admitted to the audit tool when 70% agreement was reached, with 50% set as the reserve agreement. Inter-tester reliability was determined using intra-class correlation coefficients (ICCs) across 10 clinical records. Twenty-two participants (92% female, 50% physiotherapists, 17% occupational therapists) completed the modified Delphi process. Across 6 voting rounds, 8 recommendations reached 70% agreement and 2 reached 50% agreement. Two recommendations (nutrition/hydration; goal setting) were added to ensure representation for all disciplines. Substantial consistency across raters was established for the audit tool applied in acute stroke (ICC .71; range .48 to .90) and rehabilitation (ICC .78; range .60 to .93) services. Allied health clinicians within a privately funded hospital generally agreed in an audit process to develop a reliable stroke audit tool. Allied health clinicians agreed on stroke guideline recommendations to inform a stroke audit tool. The stroke audit tool demonstrated substantial consistency, supporting future use for service development. This process, which engages local clinicians, could be adopted by other facilities to design reliable audit tools to identify local service gaps and inform changes to clinical practice. © 2018 John Wiley & Sons, Ltd.

  16. Development of Reliable and Validated Tools to Evaluate Technical Resuscitation Skills in a Pediatric Simulation Setting: Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics.

    PubMed

    Faudeux, Camille; Tran, Antoine; Dupont, Audrey; Desmontils, Jonathan; Montaudié, Isabelle; Bréaud, Jean; Braun, Marc; Fournier, Jean-Paul; Bérard, Etienne; Berlengi, Noémie; Schweitzer, Cyril; Haas, Hervé; Caci, Hervé; Gatin, Amélie; Giovannini-Chami, Lisa

    2017-09-01

    To develop a reliable and validated tool to evaluate technical resuscitation skills in a pediatric simulation setting. Four Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics (RESCAPE) evaluation tools were created, following international guidelines: intraosseous needle insertion, bag-mask ventilation, endotracheal intubation, and cardiac massage. A modified Delphi methodology was applied to evaluate the binary rating items. Reliability was assessed by comparing the ratings of 2 observers (1 in real time and 1 after review of a video recording). The tools were assessed for content, construct, and criterion validity, and for sensitivity to change. Inter-rater reliability, evaluated with Cohen's kappa coefficients, was perfect or near-perfect (>0.8) for 92.5% of items, and each Cronbach alpha coefficient was ≥0.91. Principal component analyses showed that all 4 tools were unidimensional. Significant increases in median scores with increasing levels of medical expertise were demonstrated for RESCAPE-intraosseous needle insertion (P = .0002), RESCAPE-bag mask ventilation (P = .0002), RESCAPE-endotracheal intubation (P = .0001), and RESCAPE-cardiac massage (P = .0037). Significantly increased median scores over time were also demonstrated during a simulation-based educational program. RESCAPE tools are reliable and validated tools for the evaluation of technical resuscitation skills in pediatric settings during simulation-based educational programs. They might also be used for medical practice performance evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  18. NIKE: a new clinical tool for establishing levels of indications for cataract surgery.

    PubMed

    Lundström, Mats; Albrecht, Susanne; Håkansson, Ingemar; Lorefors, Ragnhild; Ohlsson, Sven; Polland, Werner; Schmid, Andrea; Svensson, Göran; Wendel, Eva

    2006-08-01

    The purpose of this study was to construct a new clinical tool for establishing levels of indications for cataract surgery, and to validate this tool. Teams from nine eye clinics reached an agreement about the need to develop a clinical tool for setting levels of indications for cataract surgery and about the items that should be included in the tool. The tool was to be called 'NIKE' (Nationell Indikationsmodell för Kataraktextraktion). The Canadian Cataract Priority Criteria Tool served as a model for the NIKE tool, which was modified for Swedish conditions. Items included in the tool were visual acuity of both eyes, patients' perceived difficulties in day-to-day life, cataract symptoms, the ability to live independently, and medical/ophthalmic reasons for surgery. The tool was validated and tested in 343 cataract surgery patients. Validity, stability and reliability were tested and the outcome of surgery was studied in relation to the indication setting. Four indication groups (IGs) were suggested. The group with the greatest indications for surgery was named group 1 and that with the lowest, group 4. Validity was proved to be good. Surgery had the greatest impact on the group with the highest indications for surgery. Test-retest reliability test and interexaminer tests of indication settings showed statistically significant intraclass correlations (intraclass correlation coefficients [ICCs] 0.526 and 0.923, respectively). A new clinical tool for indication setting in cataract surgery is presented. This tool, the NIKE, takes into account both visual acuity and the patient's perceived problems in day-to-day life because of cataract. The tool seems to be stable and reliable and neutral towards different examiners.

  19. Green Infrastructure Models and Tools

    EPA Science Inventory

    The objective of this project is to modify and refine existing models and develop new tools to support decision making for the complete green infrastructure (GI) project lifecycle, including the planning and implementation of stormwater control in urban and agricultural settings,...

  20. An overview of new video coding tools under consideration for VP10: the successor to VP9

    NASA Astrophysics Data System (ADS)

    Mukherjee, Debargha; Su, Hui; Bankoski, James; Converse, Alex; Han, Jingning; Liu, Zoe; Xu, Yaowu

    2015-09-01

    Google started an open-source project, entitled the WebM Project, in 2010 to develop royalty-free video codecs for the web. The present-generation codec developed in the WebM Project, called VP9, was finalized in mid-2013 and is currently being served extensively by YouTube, resulting in billions of views per day. Even though adoption of VP9 outside Google is still in its infancy, the WebM Project has already embarked on an ambitious effort to develop a next-edition codec, VP10, that achieves at least a generational bitrate reduction over the current-generation codec VP9. Although the project is still in its early stages, a set of new experimental coding tools has already been added to baseline VP9 to achieve modest coding gains over a large enough test set. This paper provides a technical overview of these coding tools.

  1. TBell: A mathematical tool for analyzing decision tables

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Chen, Zewei

    1994-01-01

    This paper describes the development of mathematical theory and software to analyze specifications that are developed using decision tables. A decision table is a tabular format for specifying a complex set of rules that chooses one of a number of alternative actions. The report also describes a prototype tool, called TBell, that automates certain types of analysis.
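
    A decision table, and the kind of completeness analysis described above, can be sketched as follows. The rule format is hypothetical and is not TBell's input language; the check simply asks whether every combination of condition values matches some rule.

```python
from itertools import product

# A decision table maps combinations of condition outcomes to one action.
# Hypothetical three-condition table; None means "don't care".
RULES = [
    # (cond1, cond2, cond3) -> action
    ((True,  True,  None), "action_A"),
    ((True,  False, None), "action_B"),
    ((False, None,  True), "action_C"),
]

def decide(c1, c2, c3):
    for pattern, action in RULES:
        if all(p is None or p == v for p, v in zip(pattern, (c1, c2, c3))):
            return action
    return None  # no rule fires

# Completeness analysis: enumerate all condition combinations and report
# any that no rule covers (the sort of check a tool like TBell automates).
missing = [combo for combo in product([True, False], repeat=3)
           if decide(*combo) is None]
print("uncovered combinations:", missing)
```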

  2. Visual business ecosystem intelligence: lessons from the field.

    PubMed

    Basole, Rahul C

    2014-01-01

    Macroscopic insight into business ecosystems is becoming increasingly important. With the emergence of new digital business data, opportunities exist to develop rich, interactive visual-analytics tools. Georgia Institute of Technology researchers have been developing and implementing visual business ecosystem intelligence tools in corporate settings. This article discusses the challenges they faced, the lessons learned, and opportunities for future research.

  3. Cross-Cultural Education in U.S. Medical Schools: Development of an Assessment Tool.

    ERIC Educational Resources Information Center

    Dolhun, Eduardo Pena; Munoz, Claudia; Grumbach, Kevin

    2003-01-01

    Medical schools were invited to provide written and Web-based materials related to implementing cross-cultural competency in their curricula. A tool was developed to measure teaching methods, skill sets, and eight content areas in cross-cultural education. Most programs emphasized teaching general themes, such as the doctor-patient relationship,…

  4. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets

    PubMed Central

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-01-01

    Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code was evaluated using benchmark data sets. Results: The approach provides data needed to evaluate combinations of statistical measurements for ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
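
    A condensed sketch of the analysis pattern described above, on synthetic data: a receiver operator characteristic curve identifies a dose threshold, and standard tests (Welch t-test, Kolmogorov-Smirnov, Fisher exact on a contingency table) screen for dose-response. This is an illustration of the statistical building blocks, not the authors' C#.Net/R application.

```python
import numpy as np
from scipy import stats
from sklearn.metrics import roc_curve

# Synthetic doses: 80 patients without toxicity, 40 with toxicity
rng = np.random.default_rng(2)
dose = np.concatenate([rng.normal(20, 5, 80), rng.normal(30, 5, 40)])
toxicity = np.concatenate([np.zeros(80, int), np.ones(40, int)])

# ROC-derived threshold: the point maximizing TPR - FPR (Youden index)
fpr, tpr, thresholds = roc_curve(toxicity, dose)
threshold = thresholds[np.argmax(tpr - fpr)]

# Filtering tests comparing dose distributions by outcome
lo, hi = dose[toxicity == 0], dose[toxicity == 1]
t_stat, p_welch = stats.ttest_ind(hi, lo, equal_var=False)  # Welch t-test
ks_stat, p_ks = stats.ks_2samp(hi, lo)                      # Kolmogorov-Smirnov
table = [[np.sum((dose >= threshold) & (toxicity == 1)),
          np.sum((dose >= threshold) & (toxicity == 0))],
         [np.sum((dose < threshold) & (toxicity == 1)),
          np.sum((dose < threshold) & (toxicity == 0))]]
odds, p_fisher = stats.fisher_exact(table)                  # contingency table
print(threshold, p_welch, p_ks, p_fisher)
```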

  5. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets.

    PubMed

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-11-01

    With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code was evaluated using benchmark data sets. The approach provides data needed to evaluate combinations of statistical measurements for ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and the software tool for analysis of large data sets.

  6. Toward an International Classification of Functioning, Disability and Health clinical data collection tool: the Italian experience of developing simple, intuitive descriptions of the Rehabilitation Set categories.

    PubMed

    Selb, Melissa; Gimigliano, Francesca; Prodinger, Birgit; Stucki, Gerold; Pestelli, Germano; Iocco, Maurizio; Boldrini, Paolo

    2017-04-01

    As part of international efforts to develop and implement national models including the specification of ICF-based clinical data collection tools, the Italian rehabilitation community initiated a project to develop simple, intuitive descriptions of the ICF Rehabilitation Set, highlighting the core concept of each category in user-friendly language. This paper outlines the Italian experience in developing simple, intuitive descriptions of the ICF Rehabilitation Set as an ICF-based clinical data collection tool for Italy. Design: consensus process. Setting: expert conference. Participants: multidisciplinary group of rehabilitation professionals. The first of the two stages of the consensus process involved developing an initial proposal for simple, intuitive descriptions of each ICF Rehabilitation Set category, based on descriptions generated in a similar process in China. Stage two involved a consensus conference. Divided into three working groups, participants discussed and voted (vote A) on whether the initially proposed description of each ICF Rehabilitation Set category was simple and intuitive enough for use in daily practice. Afterwards, the categories with descriptions considered ambiguous, i.e. not simple and intuitive enough, were divided among the working groups, which were asked to propose a new description for the allocated categories. These proposals were then voted on (vote B) in a plenary session. The last step of the consensus conference required each working group to develop a new proposal for each of the categories whose descriptions were still considered ambiguous. Participants then voted (final vote) for whichever of the three proposed descriptions they preferred. Nineteen clinicians from diverse rehabilitation disciplines and various regions of Italy participated in the consensus process. Three ICF categories achieved consensus in vote A, 20 ICF categories were accepted in vote B, and the remaining 7 categories were decided in the final vote. The findings are discussed in light of current efforts toward developing strategies for ICF implementation, specifically for the application of an ICF-based clinical data collection tool, not only in Italy but also in the rest of Europe. The descriptions are promising as minimal standards for monitoring the impact of interventions and for standardized reporting of functioning as a relevant outcome in rehabilitation.

  7. Lessons learned developing a diagnostic tool for HIV-associated dementia feasible to implement in resource-limited settings: pilot testing in Kenya.

    PubMed

    Kwasa, Judith; Cettomai, Deanna; Lwanya, Edwin; Osiemo, Dennis; Oyaro, Patrick; Birbeck, Gretchen L; Price, Richard W; Bukusi, Elizabeth A; Cohen, Craig R; Meyer, Ana-Claire L

    2012-01-01

    To conduct a preliminary evaluation of the utility and reliability of a diagnostic tool for HIV-associated dementia (HAD) for use by primary health care workers (HCW) which would be feasible to implement in resource-limited settings. In resource-limited settings, HAD is an indication for anti-retroviral therapy regardless of CD4 T-cell count. Anti-retroviral therapy, the treatment for HAD, is now increasingly available in resource-limited settings. Nonetheless, HAD remains under-diagnosed likely because of limited clinical expertise and availability of diagnostic tests. Thus, a simple diagnostic tool which is practical to implement in resource-limited settings is an urgent need. A convenience sample of 30 HIV-infected outpatients was enrolled in Western Kenya. We assessed the sensitivity and specificity of a diagnostic tool for HAD as administered by a primary HCW. This was compared to an expert clinical assessment which included examination by a physician, neuropsychological testing, and in selected cases, brain imaging. Agreement between HCW and an expert examiner on certain tool components was measured using Kappa statistic. The sample was 57% male, mean age was 38.6 years, mean CD4 T-cell count was 323 cells/µL, and 54% had less than a secondary school education. Six (20%) of the subjects were diagnosed with HAD by expert clinical assessment. The diagnostic tool was 63% sensitive and 67% specific for HAD. Agreement between HCW and expert examiners was poor for many individual items of the diagnostic tool (K = .03-.65). This diagnostic tool had moderate sensitivity and specificity for HAD. However, reliability was poor, suggesting that substantial training and formal evaluations of training adequacy will be critical to enable HCW to reliably administer a brief diagnostic tool for HAD.

  8. Multi-criteria development and incorporation into decision tools for health technology adoption.

    PubMed

    Poulin, Paule; Austen, Lea; Scott, Catherine M; Waddell, Cameron D; Dixon, Elijah; Poulin, Michelle; Lafrenière, René

    2013-01-01

    When introducing new health technologies, decision makers must integrate research evidence with local operational management information to guide decisions about whether and under what conditions the technology will be used. Multi-criteria decision analysis can support the adoption or prioritization of health interventions by using criteria to explicitly articulate the health organization's needs, limitations, and values in addition to evaluating evidence for safety and effectiveness. This paper seeks to describe the development of a framework to create agreed-upon criteria and decision tools to enhance a pre-existing local health technology assessment (HTA) decision support program. The authors compiled a list of published criteria from the literature, consulted with experts to refine the criteria list, and used a modified Delphi process with a group of key stakeholders to review, modify, and validate each criterion. In a workshop setting, the criteria were used to create decision tools. A set of user-validated criteria for new health technology evaluation and adoption was developed and integrated into the local HTA decision support program. Technology evaluation and decision guideline tools were created using these criteria to ensure that the decision process is systematic, consistent, and transparent. This framework can be used by others to develop decision-making criteria and tools to enhance similar technology adoption programs. The development of clear, user-validated criteria for evaluating new technologies adds a critical element to improve decision-making on technology adoption, and the decision tools ensure consistency, transparency, and real-world relevance.
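
    As a generic illustration of how criteria feed a decision tool (not the authors' validated criteria or weights), a multi-criteria evaluation ultimately reduces to weighting criteria and scoring each candidate technology; the criteria names, weights, and ratings below are hypothetical.

```python
# Hypothetical criteria weights (summing to 1) and 1-5 ratings
weights = {"effectiveness": 0.30, "safety": 0.25, "cost": 0.20,
           "feasibility": 0.15, "alignment": 0.10}
ratings = {
    "technology_A": {"effectiveness": 4, "safety": 5, "cost": 2,
                     "feasibility": 3, "alignment": 4},
    "technology_B": {"effectiveness": 3, "safety": 4, "cost": 4,
                     "feasibility": 4, "alignment": 3},
}

def weighted_score(tech):
    # weighted-sum aggregation across all criteria
    return sum(weights[c] * ratings[tech][c] for c in weights)

for tech in sorted(ratings, key=weighted_score, reverse=True):
    print(tech, round(weighted_score(tech), 2))
```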

  9. Chemical Sensor Array Response Modeling Using Quantitative Structure-Activity Relationships Technique

    NASA Astrophysics Data System (ADS)

    Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.

    We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using it to predict the sensor activities for test analytes not considered in the training set for the model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict response of an existing sensing film to new target analytes.
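
    The modeling step amounts to regressing sensor response on molecular descriptors and then predicting the response for analytes outside the training set. The sketch below uses synthetic descriptors and ordinary least squares rather than the Genetic Function Approximation used in the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Hypothetical descriptor matrix: rows = analytes, columns = descriptors
# (standing in for film-analyte interaction and basic analyte descriptors)
rng = np.random.default_rng(3)
X = rng.normal(size=(40, 5))
true_w = np.array([0.8, -0.3, 0.5, 0.0, 0.2])
y = X @ true_w + rng.normal(scale=0.1, size=40)  # sensor response (dR/R)

model = LinearRegression().fit(X, y)
print("cross-validated R^2:",
      cross_val_score(LinearRegression(), X, y, cv=5, scoring="r2").mean())

# Predict response to an analyte not in the training set
x_new = rng.normal(size=(1, 5))
print("predicted response:", model.predict(x_new)[0])
```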

  10. Computer-aided system design

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  11. Listeriomics: an Interactive Web Platform for Systems Biology of Listeria

    PubMed Central

    Koutero, Mikael; Tchitchek, Nicolas; Cerutti, Franck; Lechat, Pierre; Maillet, Nicolas; Hoede, Claire; Chiapello, Hélène; Gaspin, Christine

    2017-01-01

    As for many model organisms, the amount of Listeria omics data produced has recently increased exponentially. There are now >80 published complete Listeria genomes, around 350 different transcriptomic data sets, and 25 proteomic data sets available. The analysis of these data sets through a systems biology approach and the generation of tools for biologists to browse these various data are a challenge for bioinformaticians. We have developed a web-based platform, named Listeriomics, that integrates different tools for omics data analyses, i.e., (i) an interactive genome viewer to display gene expression arrays, tiling arrays, and sequencing data sets along with proteomics and genomics data sets; (ii) an expression and protein atlas that connects every gene, small RNA, antisense RNA, or protein with the most relevant omics data; (iii) a specific tool for exploring protein conservation through the Listeria phylogenomic tree; and (iv) a coexpression network tool for the discovery of potential new regulations. Our platform integrates all the complete Listeria species genomes, transcriptomes, and proteomes published to date. This website allows navigation among all these data sets with enriched metadata in a user-friendly format and can be used as a central database for systems biology analysis. IMPORTANCE: In recent decades, Listeria has become a key model organism for the study of host-pathogen interactions, noncoding RNA regulation, and bacterial adaptation to stress. To study these mechanisms, several genomics, transcriptomics, and proteomics data sets have been produced. We have developed Listeriomics, an interactive web platform to browse and correlate these heterogeneous sources of information. Our website will allow listeriologists and microbiologists to decipher key regulation mechanisms by using a systems biology approach. PMID:28317029
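
    As an illustration of the coexpression-network idea mentioned above (not the platform's actual method), pairwise Pearson correlation of gene expression across conditions, thresholded at a hypothetical cutoff, yields candidate network edges:

```python
import numpy as np

# Hypothetical expression matrix: rows = genes, columns = conditions
rng = np.random.default_rng(4)
expr = rng.normal(size=(200, 30))

corr = np.corrcoef(expr)                        # gene-gene Pearson correlations
np.fill_diagonal(corr, 0.0)                     # ignore self-correlation
edges = np.argwhere(np.triu(corr > 0.6, k=1))   # upper triangle: unique pairs
print(f"{len(edges)} coexpressed gene pairs at r > 0.6")
```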

  12. Salmon recovery planning using the VELMA model

    EPA Science Inventory

    We developed a set of tools to provide decision support for community-based salmon recovery planning in Pacific Northwest watersheds. This seminar describes how these tools are being integrated and applied in collaboration with Puget Sound tribes and community stakeholders to add...

  13. Development of the TeamOBS-PPH - targeting clinical performance in postpartum hemorrhage.

    PubMed

    Brogaard, Lise; Hvidman, Lone; Hinshaw, Kim; Kierkegaard, Ole; Manser, Tanja; Musaeus, Peter; Arafeh, Julie; Daniels, Kay I; Judy, Amy E; Uldbjerg, Niels

    2018-06-01

    This study aimed to develop a valid and reliable TeamOBS-PPH tool for assessing clinical performance in the management of postpartum hemorrhage (PPH). The tool was evaluated using video-recordings of teams managing PPH in both real-life and simulated settings. A Delphi panel consisting of 12 obstetricians from the UK, Norway, Sweden, Iceland, and Denmark achieved consensus on (i) the elements to include in the assessment tool, (ii) the weighting of each element, and (iii) the final tool. The validity and reliability were evaluated according to Cook and Beckman. (Level 1) Four raters scored four video-recordings of in situ simulations of PPH. (Level 2) Two raters scored 85 video-recordings of real-life teams managing patients with PPH ≥1000 mL in two Danish hospitals. (Level 3) Two raters scored 15 video-recordings of in situ simulations of PPH from a US hospital. The tool was designed with scores from 0 to 100. (Level 1) Teams of novices had a median score of 54 (95% CI 48-60), whereas experienced teams had a median score of 75 (95% CI 71-79; p < 0.001). (Level 2) The intra-rater [intra-class correlation (ICC) = 0.96] and inter-rater (ICC = 0.83) agreements for real-life PPH were strong. The tool was applicable in all cases: atony, retained placenta, and lacerations. (Level 3) The tool was easily adapted to in situ simulation settings in the USA (ICC = 0.86). The TeamOBS-PPH tool appears to be valid and reliable for assessing clinical performance in real-life and simulated settings. The tool will be shared as the free TeamOBS App. © 2018 Nordic Federation of Societies of Obstetrics and Gynecology.
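
    Intra-class correlations like those reported above can be computed from long-format rating data. A minimal sketch with made-up scores, using the open-source pingouin package:

```python
import pandas as pd
import pingouin as pg

# Hypothetical long-format ratings: each video scored by two raters
df = pd.DataFrame({
    "video": [1, 1, 2, 2, 3, 3, 4, 4],
    "rater": ["A", "B"] * 4,
    "score": [75, 78, 54, 50, 88, 85, 62, 66],
})

# Returns a table of ICC variants (ICC1, ICC2, ICC3, and average-rater forms)
icc = pg.intraclass_corr(data=df, targets="video",
                         raters="rater", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])
```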

  14. Developing Electronic Health Record (EHR) Strategies Related to Health Center Patients' Social Determinants of Health.

    PubMed

    Gold, Rachel; Cottrell, Erika; Bunce, Arwen; Middendorf, Mary; Hollombe, Celine; Cowburn, Stuart; Mahr, Peter; Melgar, Gerardo

    2017-01-01

    "Social determinants of heath" (SDHs) are nonclinical factors that profoundly affect health. Helping community health centers (CHCs) document patients' SDH data in electronic health records (EHRs) could yield substantial health benefits, but little has been reported about CHCs' development of EHR-based tools for SDH data collection and presentation. We worked with 27 diverse CHC stakeholders to develop strategies for optimizing SDH data collection and presentation in their EHR, and approaches for integrating SDH data collection and the use of those data (eg, through referrals to community resources) into CHC workflows. We iteratively developed a set of EHR-based SDH data collection, summary, and referral tools for CHCs. We describe considerations that arose while developing the tools and present some preliminary lessons learned. Standardizing SDH data collection and presentation in EHRs could lead to improved patient and population health outcomes in CHCs and other care settings. We know of no previous reports of processes used to develop similar tools. This article provides an example of 1 such process. Lessons from our process may be useful to health care organizations interested in using EHRs to collect and act on SDH data. Research is needed to empirically test the generalizability of these lessons. © Copyright 2017 by the American Board of Family Medicine.

  15. Using a graphical programming language to write CAMAC/GPIB instrument drivers

    NASA Technical Reports Server (NTRS)

    Zambrana, Horacio; Johanson, William

    1991-01-01

    To reduce the complexities of conventional programming, graphical software was used in the development of instrumentation drivers. The graphical software provides a standard set of tools (graphical subroutines) which are sufficient to program the most sophisticated CAMAC/GPIB drivers. These tools were used and instrumentation drivers were successfully developed for operating CAMAC/GPIB hardware from two different manufacturers: LeCroy and DSP. The use of these tools is presented for programming a LeCroy A/D Waveform Analyzer.

  16. Defining Geodetic Reference Frame using Matlab®: PlatEMotion 2.0

    NASA Astrophysics Data System (ADS)

    Cannavò, Flavio; Palano, Mimmo

    2016-03-01

    We describe the main features of the developed software tool, PlatE-Motion 2.0 (PEM2), which infers the Euler pole parameters by inverting the observed velocities at a set of sites located on a rigid block (the inverse problem). PEM2 can also calculate the expected velocity at any point on Earth given an Euler pole (the direct problem). PEM2 is the updated version of a previous software tool initially developed for easy-to-use file exchange with the GAMIT/GLOBK software package. The tool is developed in the Matlab® framework and, like the previous version, includes a set of MATLAB functions (m-files), GUIs (fig-files), map data files (mat-files), and a user's manual, as well as some example input files. Changes in PEM2 include (1) bug fixes, (2) improvements in the code, (3) improvements in the statistical analysis, and (4) new input/output file formats. In addition, PEM2 can now be run under most operating systems. The tool is open source and freely available for the scientific community.
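
    The direct problem mentioned above has a closed form: the velocity of a site on a rigid plate is v = ω × r, where ω is the Euler rotation vector and r the site position vector. A sketch with hypothetical pole parameters (not values from PEM2):

```python
import numpy as np

def predicted_velocity(site_lat, site_lon, pole_lat, pole_lon, omega_deg_per_myr):
    """Rigid-plate velocity v = w x r at a site, given an Euler pole.
    Returns the ECEF velocity vector in mm/yr (spherical Earth assumed)."""
    R = 6371e3  # mean Earth radius, m

    def unit(lat, lon):
        lat, lon = np.radians([lat, lon])
        return np.array([np.cos(lat) * np.cos(lon),
                         np.cos(lat) * np.sin(lon),
                         np.sin(lat)])

    w = np.radians(omega_deg_per_myr) * unit(pole_lat, pole_lon)  # rad/Myr
    r = R * unit(site_lat, site_lon)                              # m
    v = np.cross(w, r)        # m/Myr
    return v * 1e3 / 1e6      # convert m/Myr to mm/yr

# Hypothetical site and pole, for illustration only
print(predicted_velocity(37.5, 15.0, 48.0, -102.0, 0.25))
```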

  17. Towards sets of hazardous waste indicators. Essential tools for modern industrial management.

    PubMed

    Peterson, Peter J; Granados, Asa

    2002-01-01

    Decision-makers require useful tools, such as indicators, to help them make environmentally sound decisions leading to effective management of hazardous wastes. Four hazardous waste indicators are being tested for such a purpose by several countries within the Sustainable Development Indicator Programme of the United Nations Commission for Sustainable Development. However, these indicators only address the 'down-stream' end-of-pipe industrial situation. More creative thinking is clearly needed to develop a wider range of indicators that not only reflects all aspects of industrial production that generates hazardous waste but considers socio-economic implications of the waste as well. Sets of useful and innovative indicators are proposed that could be applied to the emerging paradigm shift away from conventional end-of-pipe management actions and towards preventive strategies that are being increasingly adopted by industry often in association with local and national governments. A methodological and conceptual framework for the development of a core-set of hazardous waste indicators has been developed. Some of the indicator sets outlined quantify preventive waste management strategies (including indicators for cleaner production, hazardous waste reduction/minimization and life cycle analysis), whilst other sets address proactive strategies (including changes in production and consumption patterns, eco-efficiency, eco-intensity and resource productivity). Indicators for quantifying transport of hazardous wastes are also described. It was concluded that a number of the indicators proposed could now be usefully implemented as management tools using existing industrial and economic data. As cleaner production technologies and waste minimization approaches are more widely deployed, and industry integrates environmental concerns at all levels of decision-making, it is expected that the necessary data for construction of the remaining indicators will soon become available.

  18. Developing a Multidisciplinary Team for Disorders of Sex Development: Planning, Implementation, and Operation Tools for Care Providers

    PubMed Central

    Moran, Mary Elizabeth; Karkazis, Katrina

    2012-01-01

    In the treatment of patients with disorders of sex development (DSD), multidisciplinary teams (MDTs) represent a new standard of care. While DSDs are too complex for care to be delivered effectively without specialized team management, these conditions are often considered to be too rare for their medical management to be a hospital priority. Many specialists involved in DSD care want to create a clinic or team, but there is no available guidance that bridges the gap between a group of like-minded DSD providers who want to improve care and the formation of a functional MDT. This is an important dilemma, and one with serious implications for the future of DSD care. If a network of multidisciplinary DSD teams is to be a reality, those directly involved in DSD care must be given the necessary program planning and team implementation tools. This paper offers a protocol and set of tools to meet this need. We present a 6-step process to team formation, and a sample set of tools that can be used to guide, develop, and evaluate a team throughout the course of its operation. PMID:22792098

  19. USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOL IN POLLUTION PREVENTION

    EPA Science Inventory

    Computer-Aided Process Engineering has become established in industry as a design tool, particularly with the establishment of the CAPE-OPEN software specifications for process simulation environments. CAPE-OPEN provides a set of "middleware" standards that enable software developers to acces...

  20. Nisqually Community Forest VELMA modeling

    EPA Science Inventory

    We developed a set of modeling tools to support community-based forest management and salmon-recovery planning in Pacific Northwest watersheds. Here we describe how these tools are being applied to the Mashel River Watershed in collaboration with the Board of Directors of the Nis...

  1. Scheduling Results for the THEMIS Observation Scheduling Tool

    NASA Technical Reports Server (NTRS)

    Mclaren, David; Rabideau, Gregg; Chien, Steve; Knight, Russell; Anwar, Sadaat; Mehall, Greg; Christensen, Philip

    2011-01-01

    We describe a scheduling system intended to assist in the development of instrument data acquisitions for the THEMIS instrument, onboard the Mars Odyssey spacecraft, and compare results from multiple scheduling algorithms. This tool creates observations of both (a) targeted geographical regions of interest and (b) general mapping observations, while respecting spacecraft constraints such as data volume, observation timing, visibility, lighting, season, and science priorities. This tool therefore must address both geometric and state/timing/resource constraints. We describe a tool that maps geometric polygon overlap constraints to set covering constraints using a grid-based approach. These set covering constraints are then incorporated into a greedy optimization scheduling algorithm incorporating operations constraints to generate feasible schedules. The resultant tool generates schedules of hundreds of observations per week out of potential thousands of observations. This tool is currently under evaluation by the THEMIS observation planning team at Arizona State University.
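
    The mapping from polygon coverage to set covering, followed by greedy selection, can be sketched as below. The grid cells and observation footprints are hypothetical, and the real scheduler additionally enforces data volume, timing, visibility, and priority constraints.

```python
def greedy_cover(universe, candidates):
    """candidates: dict mapping observation id -> set of grid cells covered.
    Greedily pick the observation covering the most still-uncovered cells."""
    uncovered = set(universe)
    schedule = []
    while uncovered:
        best = max(candidates, key=lambda o: len(candidates[o] & uncovered))
        gain = candidates[best] & uncovered
        if not gain:
            break  # remaining cells unreachable by any candidate observation
        schedule.append(best)
        uncovered -= gain
    return schedule, uncovered

# Hypothetical target region discretized into grid cells 0..9
cells = range(10)
obs = {"obs1": {0, 1, 2, 3}, "obs2": {3, 4, 5},
       "obs3": {5, 6, 7, 8}, "obs4": {8, 9}}
print(greedy_cover(cells, obs))
```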

  2. The Delivery of Health Promotion and Environmental Health Services; Public Health or Primary Care Settings?

    PubMed

    Bjørn Jensen, Lene; Lukic, Irena; Gulis, Gabriel

    2018-05-07

    The WHO Regional Office for Europe developed a set of public health functions resulting in the ten Essential Public Health Operations (EPHOs). Public health and primary care settings both seem favorable for embracing the actions included in the EPHOs. This paper aims to guide readers on how to assign individual health promotion and environmental health services to public health or primary care settings. Survey tools were developed based on EPHOs 2, 3 and 4; six of the 18 key informants contacted, all working on health promotion in Denmark, completed the survey by e-mail, and five face-to-face interviews were conducted in Australia (Melbourne and Victoria state) with experts from environmental health and public health and with a physician. Based on the interviews, we developed a set of indicators to support the assignment process. Population versus individual focus, a system versus one-to-one approach, dealing with hazards versus dealing with effects, and being proactive versus reactive were identified as the main elements of the decision tool. Assignment of public health services to one of the two settings proved possible in some cases, whereas in many there is no clear distinction between the two settings. National context might be what guides the delivery of public health services.

  3. Can I solve my structure by SAD phasing? Planning an experiment, scaling data and evaluating the useful anomalous correlation and anomalous signal.

    PubMed

    Terwilliger, Thomas C; Bunkóczi, Gábor; Hung, Li Wei; Zwart, Peter H; Smith, Janet L; Akey, David L; Adams, Paul D

    2016-03-01

    A key challenge in the SAD phasing method is solving a structure when the anomalous signal-to-noise ratio is low. Here, algorithms and tools for evaluating and optimizing the useful anomalous correlation and the anomalous signal in a SAD experiment are described. A simple theoretical framework [Terwilliger et al. (2016), Acta Cryst. D72, 346-358] is used to develop methods for planning a SAD experiment, scaling SAD data sets and estimating the useful anomalous correlation and anomalous signal in a SAD data set. The phenix.plan_sad_experiment tool uses a database of solved and unsolved SAD data sets and the expected characteristics of a SAD data set to estimate the probability that the anomalous substructure will be found in the SAD experiment and the expected map quality that would be obtained if the substructure were found. The phenix.scale_and_merge tool scales unmerged SAD data from one or more crystals using local scaling and optimizes the anomalous signal by identifying the systematic differences among data sets, and the phenix.anomalous_signal tool estimates the useful anomalous correlation and anomalous signal after collecting SAD data and estimates the probability that the data set can be solved and the likely figure of merit of phasing.

  4. A Systematic Approach to Capacity Strengthening of Laboratory Systems for Control of Neglected Tropical Diseases in Ghana, Kenya, Malawi and Sri Lanka

    PubMed Central

    Njelesani, Janet; Dacombe, Russell; Palmer, Tanith; Smith, Helen; Koudou, Benjamin; Bockarie, Moses; Bates, Imelda

    2014-01-01

    Background The lack of capacity in laboratory systems is a major barrier to achieving the aims of the London Declaration (2012) on neglected tropical diseases (NTDs). To counter this, capacity strengthening initiatives have been carried out in NTD laboratories worldwide. Many of these initiatives focus on individuals' skills or institutional processes and structures ignoring the crucial interactions between the laboratory and the wider national and international context. Furthermore, rigorous methods to assess these initiatives once they have been implemented are scarce. To address these gaps we developed a set of assessment and monitoring tools that can be used to determine the capacities required and achieved by laboratory systems at the individual, organizational, and national/international levels to support the control of NTDs. Methodology and principal findings We developed a set of qualitative and quantitative assessment and monitoring tools based on published evidence on optimal laboratory capacity. We implemented the tools with laboratory managers in Ghana, Malawi, Kenya, and Sri Lanka. Using the tools enabled us to identify strengths and gaps in the laboratory systems from the following perspectives: laboratory quality benchmarked against ISO 15189 standards, the potential for the laboratories to provide support to national and regional NTD control programmes, and the laboratory's position within relevant national and international networks and collaborations. Conclusion We have developed a set of mixed methods assessment and monitoring tools based on evidence derived from the components needed to strengthen the capacity of laboratory systems to control NTDs. Our tools help to systematically assess and monitor individual, organizational, and wider system level capacity of laboratory systems for NTD control and can be applied in different country contexts. PMID:24603407

  5. The Visual Geophysical Exploration Environment: A Multi-dimensional Scientific Visualization

    NASA Astrophysics Data System (ADS)

    Pandya, R. E.; Domenico, B.; Murray, D.; Marlino, M. R.

    2003-12-01

    The Visual Geophysical Exploration Environment (VGEE) is an online learning environment designed to help undergraduate students understand fundamental Earth system science concepts. The guiding principle of the VGEE is the importance of hands-on interaction with scientific visualization and data. The VGEE consists of four elements: 1) an online, inquiry-based curriculum for guiding student exploration; 2) a suite of El Nino-related data sets adapted for student use; 3) a learner-centered interface to a scientific visualization tool; and 4) a set of concept models (interactive tools that help students understand fundamental scientific concepts). There are two key innovations featured in this interactive poster session. One is the integration of concept models and the visualization tool. Concept models are simple, interactive, Java-based illustrations of fundamental physical principles. We developed eight concept models and integrated them into the visualization tool to enable students to probe data. The ability to probe data using a concept model addresses the common problem of transfer: the difficulty students have in applying theoretical knowledge to everyday phenomena. The other innovation is a visualization environment and data sets that can be discovered in digital libraries and then installed, configured, and used for investigations over the web. By collaborating with the Integrated Data Viewer developers, we were able to embed a web-launchable visualization tool and access to distributed data sets into the online curricula. The Thematic Real-time Environmental Data Distributed Services (THREDDS) project is working to provide catalogs of datasets that can be used in new VGEE curricula under development. By cataloging these curricula in the Digital Library for Earth System Education (DLESE), learners and educators can discover the data and visualization tool within a framework that guides their use.

  6. The Applicability of Proposed Object-Oriented Metrics to Developer Feedback in Time to Impact Development

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.

    1996-01-01

    This paper looks closely at each of the software metrics generated by the McCabe Object-Oriented Tool(TM) and at each metric's ability to convey timely information to developers. The metrics are examined for meaningfulness in terms of the scale assignable to the metric by the rules of measurement theory and the software dimension being measured. Recommendations are made as to the proper use of each metric and its ability to influence development at an early stage. The metrics of the McCabe Object-Oriented Tool(TM) set were selected because of the tool's use in a couple of NASA IV&V projects.

  7. Chapter 6 - Developing the LANDFIRE Vegetation and Biophysical Settings Map Unit Classifications for the LANDFIRE Prototype Project

    Treesearch

    Jennifer L. Long; Melanie Miller; James P. Menakis; Robert E. Keane

    2006-01-01

    The Landscape Fire and Resource Management Planning Tools Prototype Project, or LANDFIRE Prototype Project, required a system for classifying vegetation composition, biophysical settings, and vegetation structure to facilitate the mapping of vegetation and wildland fuel characteristics and the simulation of vegetation dynamics using landscape modeling. We developed...

  8. Brief Report: Applying an Indicator Set to Survey the Health of People with Intellectual Disabilities in Europe

    ERIC Educational Resources Information Center

    Walsh, Patricia Noonan

    2008-01-01

    This report gives an account of applying a health survey tool by the "Pomona" Group that earlier documented the process of developing a set of health indicators for people with intellectual disabilities in Europe. The "Pomona" health indicator set mirrors the much larger set of health indicators prepared by the European…

  9. Software Management Environment (SME): Components and algorithms

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1994-01-01

    This document presents the components and algorithms of the Software Management Environment (SME), a management tool developed for the Software Engineering Branch (Code 552) of the Flight Dynamics Division (FDD) of the Goddard Space Flight Center (GSFC). The SME provides an integrated set of visually oriented experienced-based tools that can assist software development managers in managing and planning software development projects. This document describes and illustrates the analysis functions that underlie the SME's project monitoring, estimation, and planning tools. 'SME Components and Algorithms' is a companion reference to 'SME Concepts and Architecture' and 'Software Engineering Laboratory (SEL) Relationships, Models, and Management Rules.'

  10. Software Management Environment (SME) concepts and architecture, revision 1

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1992-01-01

    This document presents the concepts and architecture of the Software Management Environment (SME), developed for the Software Engineering Branch of the Flight Dynamics Division (FDD) of GSFC. The SME provides an integrated set of experience-based management tools that can assist software development managers in managing and planning flight dynamics software development projects. This document provides a high-level description of the types of information required to implement such an automated management tool.

  11. Community-based participatory research and user-centered design in a diabetes medication information and decision tool.

    PubMed

    Henderson, Vida A; Barr, Kathryn L; An, Lawrence C; Guajardo, Claudia; Newhouse, William; Mase, Rebecca; Heisler, Michele

    2013-01-01

    Together, community-based participatory research (CBPR), user-centered design (UCD), and health information technology (HIT) offer promising approaches to improve health disparities in low-resource settings. This article describes the application of CBPR and UCD principles to the development of iDecide/Decido, an interactive, tailored, web-based diabetes medication education and decision support tool delivered by community health workers (CHWs) to African American and Latino participants with diabetes in Southwest and Eastside Detroit. The decision aid is offered in English or Spanish and is delivered on an iPad in participants' homes. The overlapping principles of CBPR and UCD used to develop iDecide/Decido include a user-focused or community approach, equitable academic and community partnership in all study phases, an iterative development process that relies on input from all stakeholders, and a program experience that is specified, adapted, and implemented with the target community. Collaboration between community members, researchers, and developers is especially evident in the program's design concept, animations, pictographs, issue cards, goal setting, tailoring, and additional CHW tools. The principles of CBPR and UCD can be successfully applied in developing health information tools that are easy to use and understand, interactive, and target health disparities.

  12. Classifying E-Trainer Standards

    ERIC Educational Resources Information Center

    Julien, Anne

    2005-01-01

    Purpose: To set up a classification of the types of profiles and competencies that are required to set up a good e-learning programme. This approach provides a framework within which a set of standards can be defined for e-trainers. Design/methodology/approach: Open and distance learning (ODL) has been developing in Europe, due to new tools in…

  13. Ready, Set, Respect! GLSEN's Elementary School Toolkit

    ERIC Educational Resources Information Center

    Gay, Lesbian and Straight Education Network (GLSEN), 2012

    2012-01-01

    "Ready, Set, Respect!" provides a set of tools to help elementary school educators ensure that all students feel safe and respected and develop respectful attitudes and behaviors. It is not a program to be followed but instead is designed to help educators prepare themselves for teaching about and modeling respect. The toolkit responds to…

  14. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing.

    PubMed

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D'Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-04-05

    International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants' comprehension of the study information was measured by using a validated digitised audio questionnaire. The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants' 'recall' and 'understanding' between the first and second visits were statistically significant (F(1,41) = 25.38, p < 0.00001 and F(1,41) = 31.61, p < 0.00001, respectively). Our locally developed multimedia tool was acceptable and easy to administer among low literacy participants in The Gambia. It also proved to be effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings.

  15. Evaluation and demonstration of commercialization potential of CCSI tools within gPROMS advanced simulation platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawal, Adekola; Schmal, Pieter; Ramos, Alfredo

    PSE, in the first phase of the CCSI commercialization project, set out to identify market opportunities for the CCSI tools combined with existing gPROMS platform capabilities and develop a clear technical plan for the proposed commercialization activities.

  16. Empowering districts to target priorities for improving child health service in Uganda using change management and rapid assessment methods.

    PubMed

    Odaga, John; Henriksson, Dorcus K; Nkolo, Charles; Tibeihaho, Hector; Musabe, Richard; Katusiime, Margaret; Sinabulya, Zaccheus; Mucunguzi, Stephen; Mbonye, Anthony K; Valadez, Joseph J

    2016-01-01

    Local health system managers in low- and middle-income countries have the responsibility to set health priorities and allocate resources accordingly. Although tools exist to aid this process, they are not widely applied for various reasons, including non-availability, poor knowledge of the tools, and poor adaptability to the local context. In Uganda, delivery of basic services is devolved to the District Local Governments through the District Health Teams (DHTs). The Community and District Empowerment for Scale-up (CODES) project aims to provide a set of management tools that aid contextualised priority setting, fund allocation, and problem-solving in a systematic way to improve effective coverage and quality of child survival interventions. Although the various tools have previously been used at the national level, the project aims to combine them in an integral way for implementation at the district level. These tools include Lot Quality Assurance Sampling (LQAS) surveys to generate local evidence, Bottleneck analysis and Causal analysis as analytical tools, Continuous Quality Improvement, and Community Dialogues based on Citizen Report Cards and U-reports. The tools enable identification of gaps, prioritisation of possible solutions, and allocation of resources accordingly. This paper presents some of the tools used by the project in five districts in Uganda during its proof-of-concept phase. All five districts were trained in and participated in LQAS surveys and readily adopted the tools for priority setting and resource allocation. All districts developed evidence-based health operational work plans, and each district implemented more than three of the priority activities included in those plans. In the five districts, the CODES project demonstrated that DHTs can adopt and integrate these tools in the planning process by systematically identifying gaps and setting priority interventions for child survival.

  17. Empowering districts to target priorities for improving child health service in Uganda using change management and rapid assessment methods

    PubMed Central

    Odaga, John; Henriksson, Dorcus K.; Nkolo, Charles; Tibeihaho, Hector; Musabe, Richard; Katusiime, Margaret; Sinabulya, Zaccheus; Mucunguzi, Stephen; Mbonye, Anthony K.; Valadez, Joseph J.

    2016-01-01

    Background Local health system managers in low- and middle-income countries have the responsibility to set health priorities and allocate resources accordingly. Although tools exist to aid this process, they are not widely applied for various reasons, including non-availability, poor knowledge of the tools, and poor adaptability to the local context. In Uganda, delivery of basic services is devolved to the District Local Governments through the District Health Teams (DHTs). The Community and District Empowerment for Scale-up (CODES) project aims to provide a set of management tools that aid contextualised priority setting, fund allocation, and problem-solving in a systematic way to improve effective coverage and quality of child survival interventions. Design Although the various tools have previously been used at the national level, the project aims to combine them in an integral way for implementation at the district level. These tools include Lot Quality Assurance Sampling (LQAS) surveys to generate local evidence, Bottleneck analysis and Causal analysis as analytical tools, Continuous Quality Improvement, and Community Dialogues based on Citizen Report Cards and U-reports. The tools enable identification of gaps, prioritisation of possible solutions, and allocation of resources accordingly. This paper presents some of the tools used by the project in five districts in Uganda during its proof-of-concept phase. Results All five districts were trained in and participated in LQAS surveys and readily adopted the tools for priority setting and resource allocation. All districts developed evidence-based health operational work plans, and each district implemented more than three of the priority activities included in those plans. Conclusions In the five districts, the CODES project demonstrated that DHTs can adopt and integrate these tools in the planning process by systematically identifying gaps and setting priority interventions for child survival. PMID:27225791

  18. Inventory on the dietary assessment tools available and needed in africa: a prerequisite for setting up a common methodological research infrastructure for nutritional surveillance, research, and prevention of diet-related non-communicable diseases.

    PubMed

    Pisa, Pedro T; Landais, Edwige; Margetts, Barrie; Vorster, Hester H; Friedenreich, Christine M; Huybrechts, Inge; Martin-Prevel, Yves; Branca, Francesco; Lee, Warren T K; Leclercq, Catherine; Jerling, Johann; Zotor, Francis; Amuna, Paul; Al Jawaldeh, Ayoub; Aderibigbe, Olaide Ruth; Amoussa, Waliou Hounkpatin; Anderson, Cheryl A M; Aounallah-Skhiri, Hajer; Atek, Madjid; Benhura, Chakare; Chifamba, Jephat; Covic, Namukolo; Dary, Omar; Delisle, Hélène; El Ati, Jalila; El Hamdouchi, Asmaa; El Rhazi, Karima; Faber, Mieke; Kalimbira, Alexander; Korkalo, Liisa; Kruger, Annamarie; Ledo, James; Machiweni, Tatenda; Mahachi, Carol; Mathe, Nonsikelelo; Mokori, Alex; Mouquet-Rivier, Claire; Mutie, Catherine; Nashandi, Hilde Liisa; Norris, Shane A; Onabanjo, Oluseye Olusegun; Rambeloson, Zo; Saha, Foudjo Brice U; Ubaoji, Kingsley Ikechukwu; Zaghloul, Sahar; Slimani, Nadia

    2018-01-02

    To carry out an inventory of the availability, challenges, and needs of dietary assessment (DA) methods in Africa, as a prerequisite to providing evidence and setting directions (strategies) for implementing common dietary methods and supporting web-research infrastructure across countries. The inventory was performed within the framework of the "Africa's Study on Physical Activity and Dietary Assessment Methods" (AS-PADAM) project, which involves international institutional and African networks. An inventory questionnaire was developed and disseminated through the networks. Eighteen countries responded to the dietary inventory questionnaire. Various DA tools were reported in Africa; the 24-Hour Dietary Recall and the Food Frequency Questionnaire were the most commonly used tools. Few tools were validated and tested for reliability. Face-to-face interview was the most common method of administration. No computerized software or other new (web) technologies were reported. No tools were standardized across countries. The lack of comparable DA methods across the represented countries is a major obstacle to implementing comprehensive and joint nutrition-related programmes for surveillance, programme evaluation, research, and prevention. There is a need to develop new DA methods or adapt existing ones across countries, employing related research infrastructure that has been validated and standardized in other settings, with the view to standardizing methods for wider use.

  19. Piloting a programme tool to evaluate malaria case investigation and reactive case detection activities: results from 3 settings in the Asia Pacific.

    PubMed

    Cotter, Chris; Sudathip, Prayuth; Herdiana, Herdiana; Cao, Yuanyuan; Liu, Yaobao; Luo, Alex; Ranasinghe, Neil; Bennett, Adam; Cao, Jun; Gosling, Roly D

    2017-08-22

    Case investigation and reactive case detection (RACD) activities are widely used in low transmission settings to determine the suspected origin of infection and to identify and treat malaria infections near the index patient household. Case investigation and RACD activities are time- and resource-intensive, follow methodologies that vary across eliminating settings, and have no standardized metrics or tools available to monitor and evaluate them. In response to this gap, a simple programme tool was developed for the monitoring and evaluation (M&E) of RACD activities and piloted by national malaria programmes. During the development phase, four modules of the RACD M&E tool were created to assess and evaluate key case investigation and RACD activities and costs. A pilot phase was then carried out by programme implementers between 2013 and 2015, during which malaria surveillance teams in three different settings (China, Indonesia, Thailand) piloted the tool over a period of 3 months each. This study describes summary results of the pilots and the feasibility and impact of the tool on programmes. All three study areas implemented the RACD M&E tool modules, and pilot users reported that the tool and evaluation process were helpful for identifying gaps in RACD programme activities. In the 45 health facilities evaluated, 71.8% (97/135; min 35.3-max 100.0%) of the proper notification and reporting forms and 20.0% (27/135; min 0.0-max 100.0%) of standard operating procedures (SOPs) were available to support malaria elimination activities. The tool also highlighted gaps in the completeness of reporting for key data indicators: malaria case reporting (98.8%; min 93.3-max 100.0%), case investigations (65.6%; min 61.8-max 78.4%) and RACD activities (70.0%; min 64.7-max 100.0%). Evaluation of the SOPs showed that the knowledge and practices of malaria personnel varied within and between study areas. Average monthly costs for conducting case investigation and RACD activities varied between study areas (min USD $844.80-max USD $2038.00) for the malaria personnel, commodities, services and other costs required to carry out the activities. The RACD M&E tool was implemented in the three pilot areas, identifying key gaps that influenced programme decision-making. Study findings support the need for routine M&E of malaria case reporting, case investigation and RACD activities. Scale-up of the RACD M&E tool in malaria-eliminating settings will contribute to improved programme performance to the high level that is required to reach elimination.

  20. A toolbox and a record for scientific model development

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling them: (1) designing a 'Model Development Toolbox' that includes a basic set of model-constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well-defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models, and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.
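
    As a rough illustration of the toolbox/record interdependence argued for above, the sketch below shows a toolbox whose operations each append an entry to an automatically generated record; the class and operation names are invented for this example and are not SIGMA's actual API.

```python
# Illustrative sketch of the toolbox/record coupling described above: every
# model-construction operation comes from a fixed toolbox, and each application
# is appended to a machine-readable record that can later drive model
# application or revision. All names here are hypothetical.

class ModelDevelopmentRecord:
    def __init__(self):
        self.steps = []

    def log(self, op_name, **params):
        self.steps.append({"op": op_name, "params": params})

class ModelToolbox:
    """A well-defined set of construction operations, each of which logs itself."""

    def __init__(self, record):
        self.record = record

    def add_equation(self, model, equation):
        self.record.log("add_equation", equation=equation)
        model.setdefault("equations", []).append(equation)
        return model

    def set_boundary_condition(self, model, variable, value):
        self.record.log("set_boundary_condition", variable=variable, value=value)
        model.setdefault("bcs", {})[variable] = value
        return model

record = ModelDevelopmentRecord()
toolbox = ModelToolbox(record)
model = toolbox.add_equation({}, "du/dt = -k*u")
model = toolbox.set_boundary_condition(model, "u", 1.0)
print(record.steps)   # the record documents exactly how the model was built
```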

  1. Introducing the CUAHSI Hydrologic Information System Desktop Application (HydroDesktop) and Open Development Community

    NASA Astrophysics Data System (ADS)

    Ames, D.; Kadlec, J.; Horsburgh, J. S.; Maidment, D. R.

    2009-12-01

    The Consortium of Universities for the Advancement of Hydrologic Sciences (CUAHSI) Hydrologic Information System (HIS) project includes extensive development of data storage and delivery tools and standards, including WaterML (a language for sharing hydrologic data sets via web services) and HIS Server (a software tool set for delivering WaterML from a server). These and other CUAHSI HIS tools have been under development and deployment for several years and together present a relatively complete software “stack” to support the consistent storage and delivery of hydrologic and other environmental observation data. This presentation describes the development of a new HIS software tool called “HydroDesktop” and the creation of an online open source software development community to update and maintain the software. HydroDesktop is a local (i.e. not server-based) client-side software tool that will ultimately run on multiple operating systems and provide a highly usable level of access to HIS services. The software provides many key capabilities including data query, map-based visualization, data download, local data maintenance, editing, graphing, data export to selected model-specific data formats, linkage with integrated modeling systems such as OpenMI, and ultimately upload to HIS servers from the local desktop software. As the software is presently in the early stages of development, this presentation will focus on the design approach and paradigm and is viewed as an opportunity to encourage participation in the open development community. Indeed, recognizing the value of community-based code development as a means of ensuring end-user adoption, this project has adopted an “iterative” or “spiral” software development approach, which will be described in this presentation.
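
    For readers unfamiliar with the stack, the following is a minimal sketch of what a WaterML-consuming client such as HydroDesktop must do: fetch a time series over HTTP and extract the observed values. The endpoint URL and query parameters are placeholders, not a real HIS Server address, and the namespace string is the published WaterML 1.1 namespace.

```python
# Minimal sketch of a WaterML 1.x client: request a time series over HTTP and
# pull out the observed values. ENDPOINT and the query parameters are
# hypothetical, not an actual HIS Server deployment.
import requests
import xml.etree.ElementTree as ET

ENDPOINT = "http://example.org/his/GetValues"      # placeholder HIS Server URL
params = {"site": "NWIS:10109000", "variable": "NWIS:00060"}

response = requests.get(ENDPOINT, params=params, timeout=30)
response.raise_for_status()

root = ET.fromstring(response.content)
# WaterML 1.x carries observations in <value> elements with a dateTime attribute.
for value in root.iter("{http://www.cuahsi.org/waterML/1.1/}value"):
    print(value.get("dateTime"), value.text)
```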

  2. AeroADL: applying the integration of the Suomi-NPP science algorithms with the Algorithm Development Library to the calibration and validation task

    NASA Astrophysics Data System (ADS)

    Houchin, J. S.

    2014-09-01

    A common problem in the off-line validation of calibration algorithms and algorithm coefficients is being able to run science data through the exact same software used for on-line calibration of that data. The Joint Polar Satellite System (JPSS) program solved part of this problem by making the Algorithm Development Library (ADL) available, which allows the operational algorithm code to be compiled and run on a desktop Linux workstation using flat file input and output. However, this solved only part of the problem, as the toolkit and methods to initiate the processing of data through the algorithms were geared specifically toward the algorithm developer, not the calibration analyst. In algorithm development mode, a limited number of test data sets are staged for the algorithm once and then run through it over and over as the software is developed and debugged. In calibration analyst mode, new data sets are continually run through the algorithm, which requires significant effort to stage each of those data sets without additional tools. AeroADL solves this second problem by providing a set of scripts that wrap the ADL tools, providing efficient means to stage and process an input data set, to override static calibration coefficient look-up tables (LUTs) with experimental versions of those tables, and to manage a library containing multiple versions of each static LUT file in such a way that the correct set of LUTs required for each algorithm is automatically provided without analyst effort. Using AeroADL, The Aerospace Corporation's analyst team has demonstrated the ability to quickly and efficiently perform analysis tasks for both the VIIRS and OMPS sensors with minimal training on the software tools.
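
    A hedged sketch of the LUT-library idea follows: several versions of each static LUT are kept per algorithm, the newest operational version is resolved automatically, and experimental overrides take precedence. The directory layout, version naming and function names are invented for illustration and are not the actual AeroADL conventions.

```python
# Hedged sketch of LUT-library management: resolve the correct set of LUTs for
# an algorithm run without analyst effort, letting experimental overrides win.
# Directory layout and names are invented, not AeroADL's actual conventions.
from pathlib import Path

def resolve_luts(library, algorithm, required_luts, overrides=None):
    """Return {lut_name: path}, preferring experimental overrides when given."""
    overrides = overrides or {}
    resolved = {}
    for name in required_luts:
        if name in overrides:                      # experimental LUT wins
            resolved[name] = Path(overrides[name])
        else:
            # Assume versioned files named v001, v002, ... under each LUT dir;
            # lexical sort then picks the newest operational version.
            versions = sorted(Path(library, algorithm, name).glob("v*"))
            if not versions:
                raise FileNotFoundError(f"no LUT versions found for {name}")
            resolved[name] = versions[-1]
    return resolved

# Hypothetical usage: stage the VIIRS SDR LUT set with one experimental gain table.
luts = resolve_luts("/data/lut_library", "VIIRS_SDR",
                    ["gain", "offset"], overrides={"gain": "/tmp/gain_exp.h5"})
print(luts)
```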

  3. Defining the Core Archive Data Standards of the International Planetary Data Alliance (IPDA)

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Dan; Beebe, Reta; Guinness, Ed; Heather, David; Zender, Joe

    2007-01-01

    A goal of the International Planetary Data Alliance (IPDA) is to develop a set of archive data standards that enable the sharing of scientific data across international agencies and missions. To help achieve this goal, the IPDA steering committee initiated a six-month project to write requirements for, and draft an information model based on, the Planetary Data System (PDS) archive data standards. The project had a special emphasis on data formats. A set of use case scenarios was first developed, from which a set of requirements was derived for the IPDA archive data standards. The special emphasis on data formats was addressed by identifying data formats that have been used by PDS nodes and other agencies in the creation of successful data sets for the PDS. The dependency of the IPDA information model on the PDS archive standards required the compilation of a formal specification of the archive standards currently in use by the PDS. An ontology modelling tool was chosen to capture the information model from various sources including the Planetary Science Data Dictionary [1] and the PDS Standards Reference [2]. Exports of the modelling information from the tool database were used to produce the information model document using an object-oriented notation for presenting the model. The tool exports can also be used for software development and are directly accessible by semantic web applications.

  4. The Development and Use of a Concept Mapping Assessment Tool with Young Children on Family Visits to a Live Butterfly Exhibit

    ERIC Educational Resources Information Center

    Mesa, Jennifer Cheryl

    2010-01-01

    Although young children are major audiences of science museums, limited evidence exists documenting changes in children's knowledge in these settings due in part to the limited number of valid and reliable assessment tools available for use with this population. The purposes of this study were to develop and validate a concept mapping assessment…

  5. An integrated set of UNIX based system tools at control room level

    NASA Astrophysics Data System (ADS)

    Potepan, F.; Scafuri, C.; Bortolotto, C.; Surace, G.

    1994-12-01

    The design effort of providing a simple point-and-click approach to the equipment access has led to the definition and realization of a modular set of software tools to be used at the ELETTRA control room level. Point-to-point equipment access requires neither programming nor specific knowledge of the control system architecture. The development and integration of communication, graphic, editing and global database modules are described in depth, followed by a report of their use in the first commissioning period.

  6. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    NASA Astrophysics Data System (ADS)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. It is therefore of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: developer-oriented source code and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance are guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: In the absence of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30 January 2014, Bishkek, Kyrgyz Republic. [2] UNISDR, "Living with Risk", Geneva, Switzerland, 2004. [3] P. Bisch, E. Carvalho, H. Degree, P. Fajfar, M. Fardis, P. Franchin, M. Kreslin, A. Pecker, "Eurocode 8: Seismic Design of Buildings", Lisbon, 2011. (SENSUM: www.sensum-project.eu, grant number 312972) (RASOR: www.rasor-project.eu, grant number 606888)
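
    As an example of the kind of higher-level, vulnerability-oriented workflow meant here, the sketch below computes one proxy named in the abstract, the extension of a built-up area, from an already-classified land-cover raster using GDAL and NumPy. The file name and class code are assumptions; the actual SENSUM workflows are considerably more involved.

```python
# Minimal sketch of one vulnerability proxy -- built-up area extension --
# derived from a classified land-cover raster. The file name and the class
# code are illustrative assumptions, not SENSUM's actual conventions.
import numpy as np
from osgeo import gdal

BUILT_UP_CLASS = 1                      # hypothetical class code in the raster

ds = gdal.Open("landcover_classified.tif")
arr = ds.GetRasterBand(1).ReadAsArray()

# Pixel dimensions in map units come from the geotransform.
gt = ds.GetGeoTransform()
pixel_w, pixel_h = gt[1], abs(gt[5])

built_pixels = int(np.count_nonzero(arr == BUILT_UP_CLASS))
built_area_m2 = built_pixels * pixel_w * pixel_h

print(f"built-up extension: {built_area_m2 / 1e6:.2f} km^2 "
      f"({100 * built_pixels / arr.size:.1f}% of scene)")
```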

  7. Developmental screening tools: feasibility of use at primary healthcare level in low- and middle-income settings.

    PubMed

    Fischer, Vinicius Jobim; Morris, Jodi; Martines, José

    2014-06-01

    An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization, to allow action to reduce impairments through its mental health Gap Action Programme. The study assessed the feasibility of using developmental screening and monitoring tools for children aged 0-3 years by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducted to identify the tools, assess their psychometric properties, and evaluate the feasibility of their use in low- and middle-income countries (LMICs). Key indicators to examine feasibility in LMICs were derived from a consultation with 23 international experts. We identified 426 studies, from which 14 tools used in LMICs were extracted for further examination. Three tools reported adequate psychometric properties and met most of the feasibility criteria. These three tools appear promising for use in identifying and monitoring young children with disabilities at primary healthcare level in LMICs. Further research and development are needed to optimize these tools.

  8. Assessing safety climate in acute hospital settings: a systematic review of the adequacy of the psychometric properties of survey measurement tools.

    PubMed

    Alsalem, Gheed; Bowie, Paul; Morrison, Jillian

    2018-05-10

    The perceived importance of safety culture in improving patient safety and its impact on patient outcomes has led to a growing interest in the assessment of safety climate in healthcare organizations; however, the rigour with which safety climate tools were developed and psychometrically tested was shown to be variable. This paper aims to identify and review questionnaire studies designed to measure safety climate in acute hospital settings, in order to assess the adequacy of reported psychometric properties of identified tools. A systematic review of published empirical literature was undertaken to examine sample characteristics and instrument details including safety climate dimensions, origin and theoretical basis, and extent of psychometric evaluation (content validity, criterion validity, construct validity and internal reliability). Five questionnaire tools, designed for general evaluation of safety climate in acute hospital settings, were included. Detailed inspection revealed ambiguity around concepts of safety culture and climate, safety climate dimensions and the methodological rigour associated with the design of these measures. Standard reporting of the psychometric properties of developed questionnaires was variable, although evidence of an improving trend in the quality of the reported psychometric properties of studies was noted. Evidence of the theoretical underpinnings of climate tools was limited, while a lack of clarity in the relationship between safety culture and patient outcome measures still exists. Evidence of the adequacy of the psychometric development of safety climate questionnaire tools is still limited. Research is necessary to resolve the controversies in the definitions and dimensions of safety culture and climate in healthcare and identify related inconsistencies. More importance should be given to the appropriate validation of safety climate questionnaires before extending their usage in healthcare contexts different from those in which they were originally developed. Mixed methods research to understand why psychometric assessment and measurement reporting practices can be inadequate and lacking in a theoretical basis is also necessary.

  9. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in programming languages like Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
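
    A minimal sketch of the general practice, converting an analysis function into an interactive notebook APP, is shown below using ipywidgets; the cohort data and column names are invented, and the paper's actual tooling may differ.

```python
# Hedged sketch: wrap an analytics step in an interactive widget so a
# non-programmer can re-run it in a Jupyter Notebook. The cohort DataFrame
# and column names are invented for illustration.
import pandas as pd
from ipywidgets import interact

cohort = pd.DataFrame({
    "age": [34, 51, 62, 47, 70],
    "systolic_bp": [118, 135, 150, 128, 160],
})

def summarize(min_age=40):
    """Real-time summary of patients at or above the selected age."""
    subset = cohort[cohort["age"] >= min_age]
    return subset.describe()

# Renders a slider in the notebook; the summary table refreshes as it moves.
interact(summarize, min_age=(20, 80, 5))
```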

  10. 78 FR 69839 - Building Technologies Office Prioritization Tool

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-21

    ... innovative and cost-effective energy saving solutions: Supporting research and development of high impact... Description The tool was designed to inform programmatic decision-making and facilitate the setting of... quantitative analysis to assure only the highest impact measures are the focus of further effort. The approach...

  11. Automating Expertise in Collaborative Learning Environments

    ERIC Educational Resources Information Center

    LaVoie, Noelle; Streeter, Lynn; Lochbaum, Karen; Wroblewski, David; Boyce, Lisa; Krupnick, Charles; Psotka, Joseph

    2010-01-01

    We have developed a set of tools for improving online collaborative learning including an automated expert that monitors and moderates discussions, and additional tools to evaluate contributions, semantically search all posted comments, access a library of hundreds of digital books and provide reports to instructors. The technology behind these…

  12. EVA - A Textual Data Processing Tool.

    ERIC Educational Resources Information Center

    Jakopin, Primoz

    EVA, a text processing tool designed to be self-contained and useful for a variety of languages, is described briefly, and its extensive coded character set is illustrated. Features, specifications, and database functions are noted. Its application in development of a Slovenian literary dictionary is also described. (MSE)

  13. Health system context and implementation of evidence-based practices-development and validation of the Context Assessment for Community Health (COACH) tool for low- and middle-income settings.

    PubMed

    Bergström, Anna; Skeen, Sarah; Duc, Duong M; Blandon, Elmer Zelaya; Estabrooks, Carole; Gustavsson, Petter; Hoa, Dinh Thi Phuong; Källestål, Carina; Målqvist, Mats; Nga, Nguyen Thu; Persson, Lars-Åke; Pervin, Jesmin; Peterson, Stefan; Rahman, Anisur; Selling, Katarina; Squires, Janet E; Tomlinson, Mark; Waiswa, Peter; Wallin, Lars

    2015-08-15

    The gap between what is known and what is practiced results in health service users not benefitting from advances in healthcare, and in unnecessary costs. A supportive context is considered a key element for successful implementation of evidence-based practices (EBP). There were no tools available for the systematic mapping of aspects of organizational context influencing the implementation of EBPs in low- and middle-income countries (LMICs). This project therefore aimed to develop and psychometrically validate a tool for this purpose. The development of the Context Assessment for Community Health (COACH) tool was premised on the context dimension in the Promoting Action on Research Implementation in Health Services framework, and it is a derivative product of the Alberta Context Tool. Its development was undertaken in Bangladesh, Vietnam, Uganda, South Africa and Nicaragua in six phases: (1) defining dimensions and draft tool development, (2) content validity amongst in-country expert panels, (3) content validity amongst international experts, (4) response process validity, (5) translation and (6) evaluation of psychometric properties amongst 690 health workers in the five countries. The tool was validated for use amongst physicians, nurse/midwives and community health workers. The six phases of development resulted in a good fit between the theoretical dimensions of the COACH tool and its psychometric properties. The tool has 49 items measuring eight aspects of context: Resources, Community engagement, Commitment to work, Informal payment, Leadership, Work culture, Monitoring services for action and Sources of knowledge. Aspects of organizational context that were identified as influencing the implementation of EBPs in high-income settings were also found to be relevant in LMICs. However, there were additional aspects of context of particular relevance in LMICs, specifically Resources, Community engagement, Commitment to work and Informal payment. Use of the COACH tool will allow for systematic description of the local healthcare context prior to implementing healthcare interventions, whether to tailor implementation strategies or as part of the evaluation of implemented interventions, and will thus allow deeper insights into the process of implementing EBPs in LMICs.

  14. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calderer, Antoni; Yang, Xiaolei; Angelidis, Dionysios

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  15. Contributions of Academic Labs to the Discovery and Development of Chemical Biology Tools

    PubMed Central

    Huryn, Donna M.; Resnick, Lynn O.; Wipf, Peter

    2013-01-01

    The academic setting provides an environment that may foster success in the discovery of certain types of small molecule tools, while proving less suitable in others. For example, small molecule probes for poorly understood systems, those that exploit a specific resident expertise, and those whose commercial return is not apparent are ideally suited to be pursued in a university setting. In this perspective, we highlight five projects that emanated from academic research groups and generated valuable tool compounds that have been used to interrogate biological phenomena: Reactive oxygen species (ROS) sensors, GPR30 agonists and antagonists, selective CB2 agonists, Hsp70 modulators and beta-amyloid PET imaging agents. By continuing to take advantage of the unique expertise resident in university settings, and the ability to pursue novel projects that may have great scientific value, but limited or no immediate commercial value, probes from academic research groups continue to provide useful tools and generate a long-term resource for biomedical researchers. PMID:23672690

  16. Contributions of academic laboratories to the discovery and development of chemical biology tools.

    PubMed

    Huryn, Donna M; Resnick, Lynn O; Wipf, Peter

    2013-09-26

    The academic setting provides an environment that may foster success in the discovery of certain types of small molecule tools while proving less suitable in others. For example, small molecule probes for poorly understood systems, those that exploit a specific resident expertise, and those whose commercial return is not apparent are ideally suited to be pursued in a university setting. In this review, we highlight five projects that emanated from academic research groups and generated valuable tool compounds that have been used to interrogate biological phenomena: reactive oxygen species (ROS) sensors, GPR30 agonists and antagonists, selective CB2 agonists, Hsp70 modulators, and β-amyloid PET imaging agents. By taking advantage of the unique expertise resident in university settings and the ability to pursue novel projects that may have great scientific value but with limited or no immediate commercial value, probes from academic research groups continue to provide useful tools and generate a long-term resource for biomedical researchers.

  17. Visuals, Path Control, and Knowledge Gain: Variables that Affect Students' Approval and Enjoyment of a Multimedia Text as a Learning Tool

    ERIC Educational Resources Information Center

    George-Palilonis, Jennifer; Filak, Vincent

    2010-01-01

    As graphically driven, animated, interactive applications offer educators new opportunities for shaping course content, new avenues for research arise as well. Along with these developments comes a need to study the effectiveness of the individual tools at our disposal as well as various methods for integrating those tools in a classroom setting.…

  18. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    NASA Astrophysics Data System (ADS)

    Kadow, C.; Illing, S.; Kunst, O.; Cubasch, U.

    2014-12-01

    The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard using the meta information of the self-describing model, reanalysis and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and helps identify discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via the shell or the web system. Plugged-in tools therefore automatically gain transparency and reproducibility. Furthermore, when configurations match while starting an evaluation tool, the system suggests re-using results already produced by other users, saving CPU time, I/O and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. Website: www-miklip.dkrz.de (visitor login: guest, password: miklip)
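
    The following sketch illustrates, in generic Python, the two mechanisms highlighted above: a language-agnostic plugin interface for analysis tools and configuration-keyed result re-use. All names are hypothetical stand-ins, not the MiKlip system's API, and an in-memory dictionary stands in for the MySQL history database.

```python
# Illustrative sketch of a plugin API with configuration-keyed result re-use.
# Everything here is a hypothetical stand-in for the system described above.
import hashlib
import json

RESULT_CACHE = {}                      # stands in for the MySQL history system

class EvaluationTool:
    name = "base"
    def run(self, config):             # concrete tools override this
        raise NotImplementedError

def run_tool(tool, config):
    """Re-use a stored result when an identical configuration was run before."""
    key = hashlib.sha256(
        json.dumps({"tool": tool.name, "config": config}, sort_keys=True).encode()
    ).hexdigest()
    if key in RESULT_CACHE:
        return RESULT_CACHE[key]       # saves CPU time, I/O and disk space
    result = tool.run(config)
    RESULT_CACHE[key] = result
    return result

class BiasTool(EvaluationTool):
    name = "bias"
    def run(self, config):
        return f"bias of {config['model']} vs {config['reference']}"

print(run_tool(BiasTool(), {"model": "CMIP5-x", "reference": "ERA-Interim"}))
print(run_tool(BiasTool(), {"model": "CMIP5-x", "reference": "ERA-Interim"}))
# The second call returns the cached result instead of recomputing it.
```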

  19. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    NASA Astrophysics Data System (ADS)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich

    2015-04-01

    The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard using the meta information of the self-describing model, reanalysis and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and helps identify discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via the shell or the web system. Plugged-in tools therefore automatically gain transparency and reproducibility. Furthermore, when configurations match while starting an evaluation tool, the system suggests re-using results already produced by other users, saving CPU time, I/O and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. Website: www-miklip.dkrz.de (visitor login: click on "Guest")

  20. Design of Mobile Health Tools to Promote Goal Achievement in Self-Management Tasks

    PubMed Central

    Henderson, Geoffrey; Parmanto, Bambang

    2017-01-01

    Background Goal-setting within rehabilitation is a common practice ultimately geared toward helping patients make functional progress. Objective The purposes of this study were to (1) qualitatively analyze data from a wellness program for patients with spina bifida (SB) and spinal cord injury (SCI) in order to generate software requirements for a goal-setting module to support their complex goal-setting routines, (2) design a prototype of a goal-setting module within an existing mobile health (mHealth) system, and (3) identify what educational content might be necessary to integrate into the system. Methods A total of 750 goals were analyzed from patients with SB and SCI enrolled in a wellness program. These goals were qualitatively analyzed in order to operationalize a set of software requirements for an mHealth goal-setting module and identify important educational content. Results Those of male sex (P=.02) and with SCI diagnosis (P<.001) were more likely to achieve goals than females or those with SB. Temporality (P<.001) and type (P<.001) of goal were associated with likelihood that the goal would be achieved. Nearly all (210/213; 98.6%) of the fact-finding goals were achieved. There was no significant difference in achievement based on goal theme. Checklists, data tracking, and fact-finding tools were identified as three functionalities that could support goal-setting and achievement in an mHealth system. Based on the qualitative analysis, a list of software requirements for a goal-setting module was generated, and a prototype was developed. Targets for educational content were also generated. Conclusions Innovative mHealth tools can be developed to support commonly set goals by individuals with disabilities. PMID:28739558

  1. Tools for Accurate and Efficient Analysis of Complex Evolutionary Mechanisms in Microbial Genomes. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakhleh, Luay

    I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is that of distinguishing between these two events on the one hand and other events that have similar "effects." I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic as well as biological data. (3) Software development. I proposed the final outcome to be a suite of software tools which implements the mathematical models as well as the algorithms developed.

  2. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
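
    The core operation described here, scanning a large set of runs for cases that violate a failure criterion, is a data-parallel filter. A minimal sketch follows; the field name and limit are hypothetical, not Orion's actual failure criteria.

        # Flagging failing Monte Carlo cases with a vectorized test; the same
        # data-parallel pattern maps naturally onto a GPU. Synthetic data only.
        import numpy as np

        n_runs = 100_000
        touchdown_speed = np.random.default_rng(0).normal(4.0, 1.5, n_runs)  # m/s
        limit = 6.0  # hypothetical design limit

        failing = np.flatnonzero(touchdown_speed > limit)
        print(f"{failing.size} of {n_runs} runs exceed the {limit} m/s limit")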

  3. GSAC - Generic Seismic Application Computing

    NASA Astrophysics Data System (ADS)

    Herrmann, R. B.; Ammon, C. J.; Koper, K. D.

    2004-12-01

    With the success of the IRIS data management center, the use of large data sets in seismological research has become common. Such data sets, and especially the significantly larger data sets expected from EarthScope, present challenges for analysis with existing tools developed over the last 30 years. For much of the community, the primary format for data analysis is the Seismic Analysis Code (SAC) format developed by Lawrence Livermore National Laboratory. Although somewhat restrictive in meta-data storage, the simplicity and stability of the format have established it as an important component of seismological research. Tools for working with SAC files fall into two categories: custom research-quality processing codes, and shared display and processing tools such as SAC2000, MatSeis, etc., which were developed primarily for the needs of individual seismic research groups. While the current graphics display and platform dependence of SAC2000 may be resolved if the source code is released, the code complexity and the lack of large-data-set analysis or even introductory tutorials could preclude code improvements and development of expertise in its use. We believe that there is a place for new, especially open source, tools. The GSAC effort is an approach that focuses on ease of use, computational speed, transportability, rapid addition of new features and openness, so that new and advanced students, researchers and instructors can quickly browse and process large data sets. We highlight several approaches toward data processing under this model. gsac, part of the Computer Programs in Seismology 3.30 distribution, has much of the functionality of SAC2000 and works on UNIX/LINUX/MacOS-X/Windows (CYGWIN). It is completely programmed in C from scratch, is small, fast, and easy to maintain and extend, and is command-line based, so it is easily included within shell processing scripts. PySAC is a set of Python functions that allow easy access to SAC files and enable efficient manipulation of SAC files under a variety of operating systems. PySAC has proven to be valuable in organizing large data sets. An array processing package includes standard beamforming algorithms and a search-based method for inference of slowness vectors. The search results can be visualized using GMT scripts output by the C programs, and the resulting snapshots can be combined into an animation of the time evolution of the 2D slowness field.
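
    PySAC's exact interface is not given in the abstract, so the sketch below uses ObsPy, a widely used open-source Python library, as a stand-in to show the kind of SAC read/process round trip such tools support; the synthetic trace and file name are placeholders.

        # Write a synthetic trace to SAC, read it back, and apply bulk processing.
        import numpy as np
        from obspy import Trace, UTCDateTime, read

        data = np.random.default_rng(0).normal(size=6000).astype(np.float32)
        trace = Trace(data=data, header={"station": "TEST", "sampling_rate": 100.0,
                                         "starttime": UTCDateTime(2004, 12, 1)})
        trace.write("example.sac", format="SAC")

        st = read("example.sac")          # parses the SAC header and waveform
        st[0].detrend("demean")           # simple bulk operations of the kind
        st[0].taper(max_percentage=0.05)  # large data sets require
        print(st[0].stats.station, st[0].stats.npts)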

  4. Graphical Interfaces for Simulation.

    ERIC Educational Resources Information Center

    Hollan, J. D.; And Others

    This document presents a discussion of the development of a set of software tools to assist in the construction of interfaces to simulations and real-time systems. Presuppositions to the approach to interface design that was used are surveyed, the tools are described, and the conclusions drawn from these experiences in graphical interface design…

  5. Automated road segment creation process : a report on research sponsored by SaferSim.

    DOT National Transportation Integrated Search

    2016-08-01

    This report provides a summary of a set of tools that can be used to automate the process : of generating roadway surfaces from alignment and texture information. The tools developed : were created in Python 3.x and rely on the availability of two da...

  6. Contingency diagrams as teaching tools.

    PubMed

    Mattaini, M A

    1995-01-01

    Contingency diagrams are particularly effective teaching tools, because they provide a means for students to view the complexities of contingency networks present in natural and laboratory settings while displaying the elementary processes that constitute those networks. This paper sketches recent developments in this visualization technology and illustrates approaches for using contingency diagrams in teaching.

  7. Economic Valuation Tools and their Applications to Ecosystem Services

    EPA Science Inventory

    Many of the ecosystem services that people value are not normally bought and sold; yet, this does not mean they have no measurable economic value. Economists have developed a set of approaches and tools for valuing the full range of both market and “non-market” ecosystem goods an...

  8. GEANT4 and Secondary Particle Production

    NASA Technical Reports Server (NTRS)

    Patterson, Jeff

    2004-01-01

    GEANT4 is a Monte Carlo tool set developed by the high-energy physics community (CERN, SLAC, etc.) to perform simulations of complex particle detectors. GEANT4 is the ideal tool to study radiation transport and should be applied to space environments and the complex geometries of modern-day spacecraft.

  9. Task scheduling in dataflow computer architectures

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1994-01-01

    Dataflow computers provide a platform for the solution of a large class of computational problems, which includes digital signal processing and image processing. Many typical applications are represented by a set of tasks which can be repetitively executed in parallel as specified by an associated dataflow graph. Research in this area aims to model these architectures, develop scheduling procedures, and predict the transient and steady state performance. Researchers at NASA have created a model and developed associated software tools which are capable of analyzing a dataflow graph and predicting its runtime performance under various resource and timing constraints. These models and tools were extended and used in this work. Experiments using these tools revealed certain properties of such graphs that require further study. Specifically, the transient behavior at the beginning of the execution of a graph can have a significant effect on the steady state performance. Transformation and retiming of the application algorithm and its initial conditions can produce a different transient behavior and consequently different steady state performance. The effect of such transformations on the resource requirements or under resource constraints requires extensive study. Task scheduling to obtain maximum performance (based on user-defined criteria), or to satisfy a set of resource constraints, can also be significantly affected by a transformation of the application algorithm. Since task scheduling is performed by heuristic algorithms, further research is needed to determine if new scheduling heuristics can be developed that can exploit such transformations. This work has provided the initial development for further long-term research efforts. A simulation tool was completed to provide insight into the transient and steady state execution of a dataflow graph. A set of scheduling algorithms was completed which can operate in conjunction with the modeling and performance tools previously developed. Initial studies on the performance of these algorithms were done to examine the effects of application algorithm transformations as measured by such quantities as number of processors, time between outputs, time between input and output, communication time, and memory size.
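
    As an illustration of the resource-constrained task scheduling discussed above, the following sketch list-schedules a small dataflow graph onto a fixed number of processors; the graph, durations, and two-processor limit are invented for the example.

        # List scheduling of a dataflow graph under a processor limit (toy example).
        import heapq

        deps = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
        duration = {"A": 2, "B": 3, "C": 1, "D": 2}
        n_procs = 2

        finish, done = {}, set()
        ready = [t for t, d in deps.items() if not d]
        busy = []  # heap of (finish_time, task)
        clock = 0

        while ready or busy:
            # Dispatch ready tasks onto free processors, then advance to the
            # next completion and release tasks whose predecessors are done.
            while ready and len(busy) < n_procs:
                task = ready.pop()
                heapq.heappush(busy, (clock + duration[task], task))
            clock, task = heapq.heappop(busy)
            done.add(task)
            finish[task] = clock
            running = {t for _, t in busy}
            ready += [t for t, d in deps.items()
                      if t not in done and t not in running and t not in ready
                      and all(p in done for p in d)]

        print(finish)  # e.g. {'A': 2, 'C': 3, 'B': 5, 'D': 7}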

  10. Design and evaluation of a disaster preparedness logistics tool.

    PubMed

    Neches, Robert; Ryutov, Tatyana; Kichkaylo, Tatiana; Burke, Rita V; Claudius, Ilene A; Upperman, Jeffrey S

    2009-01-01

    The purpose of this article is to describe the development and testing of the Pediatric Emergency Decision Support System (PEDSS), a dynamic tool for pediatric victim disaster planning. This is a descriptive article outlining an innovative automated approach to pediatric decision support and disaster planning. The setting was Disaster Resource Centers and umbrella hospitals in Los Angeles County. The authors use a model set of hypothetical patients for their pediatric disaster planning approach. The authors developed the PEDSS software around two components: (a) a core that supports user interaction and data management requirements (e.g., accessing demographic information about a healthcare facility's catchment area) and (b) a set of modules, each addressing a critical disaster preparation issue. The authors believe the PEDSS tool will help hospital disaster response personnel produce and maintain disaster response plans that apply best-practice pediatric recommendations to their particular local conditions and requirements.

  11. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
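
    Task 3 above is essentially a path search over a component graph. The sketch below shows the idea with breadth-first enumeration of paths from hazard sources to vulnerable entities; the graph is a made-up example, not the NASA abort-system model.

        # Enumerate paths from hazard sources to vulnerable entities (toy graph,
        # assumed acyclic for the sketch).
        from collections import deque

        edges = {"thruster_leak": ["fuel_line"], "fuel_line": ["valve_ctrl_sw"],
                 "valve_ctrl_sw": ["abort_sequencer"], "sensor_fault": ["abort_sequencer"]}
        sources = ["thruster_leak", "sensor_fault"]
        vulnerable = {"abort_sequencer"}

        def paths_from(src):
            queue = deque([[src]])
            while queue:
                path = queue.popleft()
                if path[-1] in vulnerable:
                    yield path
                for nxt in edges.get(path[-1], []):
                    queue.append(path + [nxt])

        for s in sources:
            for p in paths_from(s):
                print(" -> ".join(p))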

  12. Family history tools in primary care: does one size fit all?

    PubMed

    Wilson, B J; Carroll, J C; Allanson, J; Little, J; Etchegary, H; Avard, D; Potter, B K; Castle, D; Grimshaw, J M; Chakraborty, P

    2012-01-01

    Family health history (FHH) has potential value in many health care settings. This review discusses the potential uses of FHH information in primary care and the need for tools to be designed accordingly. We developed a framework in which the attributes of FHH tools are mapped against these different purposes. It contains 7 attributes mapped against 5 purposes. In considering different FHH tool purposes, it is apparent that different attributes become more or less important, and that tools for different purposes require different implementation and evaluation strategies. The context in which a tool is used is also relevant to its effectiveness. For FHH tools, it is unlikely that 'one size fits all', although appreciation of different purposes, users and contexts should facilitate the development of different applications from single FHH platforms. Copyright © 2012 S. Karger AG, Basel.

  13. 5As Team obesity intervention in primary care: development and evaluation of shared decision-making weight management tools.

    PubMed

    Osunlana, A M; Asselin, J; Anderson, R; Ogunleye, A A; Cave, A; Sharma, A M; Campbell-Scherer, D L

    2015-08-01

    Despite several clinical practice guidelines, there remains a considerable gap in prevention and management of obesity in primary care. To address the need for changing provider behaviour, a randomized controlled trial with convergent mixed method evaluation, the 5As Team (5AsT) study, was conducted. As part of the 5AsT intervention, the 5AsT tool kit was developed. This paper describes the development process and evaluation of these tools. Tools were co-developed by the multidisciplinary research team and the 5AsT, which included registered nurses/nurse practitioners (n = 15), mental health workers (n = 7) and registered dieticians (n = 7), who were previously randomized to the 5AsT intervention group at a primary care network in Edmonton, Alberta, Canada. The 5AsT tool development occurred through a practice/implementation-oriented, need-based, iterative process during learning collaborative sessions of the 5AsT intervention. Feedback during tool development was received through field notes and final provider evaluation was carried out through anonymous questionnaires. Twelve tools were co-developed with 5AsT. All tools were evaluated as either 'most useful' or 'moderately useful' in primary care practice by the 5AsT. Four key findings during 5AsT tool development were the need for: tools that were adaptive, tools to facilitate interdisciplinary practice, tools to help patients understand realistic expectations for weight loss and shared decision-making tools for goal setting and relapse prevention. The 5AsT tools are primary care tools which extend the utility of the 5As of obesity management framework in clinical practice. © 2015 The Authors. Clinical Obesity published by John Wiley & Sons Ltd on behalf of World Obesity.

  14. 5As Team obesity intervention in primary care: development and evaluation of shared decision‐making weight management tools

    PubMed Central

    Asselin, J.; Anderson, R.; Ogunleye, A. A.; Cave, A.; Sharma, A. M.; Campbell‐Scherer, D. L.

    2015-01-01

    Summary Despite several clinical practice guidelines, there remains a considerable gap in prevention and management of obesity in primary care. To address the need for changing provider behaviour, a randomized controlled trial with convergent mixed method evaluation, the 5As Team (5AsT) study, was conducted. As part of the 5AsT intervention, the 5AsT tool kit was developed. This paper describes the development process and evaluation of these tools. Tools were co‐developed by the multidisciplinary research team and the 5AsT, which included registered nurses/nurse practitioners (n = 15), mental health workers (n = 7) and registered dieticians (n = 7), who were previously randomized to the 5AsT intervention group at a primary care network in Edmonton, Alberta, Canada. The 5AsT tool development occurred through a practice/implementation‐oriented, need‐based, iterative process during learning collaborative sessions of the 5AsT intervention. Feedback during tool development was received through field notes and final provider evaluation was carried out through anonymous questionnaires. Twelve tools were co‐developed with 5AsT. All tools were evaluated as either ‘most useful’ or ‘moderately useful’ in primary care practice by the 5AsT. Four key findings during 5AsT tool development were the need for: tools that were adaptive, tools to facilitate interdisciplinary practice, tools to help patients understand realistic expectations for weight loss and shared decision‐making tools for goal setting and relapse prevention. The 5AsT tools are primary care tools which extend the utility of the 5As of obesity management framework in clinical practice. PMID:26129630

  15. Implementing standardized, inter-unit communication in an international setting: handoff of patients from emergency medicine to internal medicine.

    PubMed

    Balhara, Kamna S; Peterson, Susan M; Elabd, Mohamed Moheb; Regan, Linda; Anton, Xavier; Al-Natour, Basil Ali; Hsieh, Yu-Hsiang; Scheulen, James; Stewart de Ramirez, Sarah A

    2018-04-01

    Standardized handoffs may reduce communication errors, but research on handoff in community and international settings is lacking. Our study at a community hospital in the United Arab Emirates characterizes existing handoff practices for admitted patients from emergency medicine (EM) to internal medicine (IM), develops a standardized handoff tool, and assesses its impact on communication and physician perceptions. EM physicians completed a survey regarding handoff practices and expectations. Trained observers utilized a checklist based on the Systems Engineering Initiative for Patient Safety model to observe 40 handoffs. EM and IM physicians collaboratively developed a written tool encouraging bedside handoff of admitted patients. After the intervention, surveys of EM physicians and 40 observations were subsequently repeated. 77.5% of initial observed handoffs occurred face-to-face, with 42.5% at bedside, and in four different languages. Most survey respondents considered face-to-face handoff ideal. Respondents noted 9-13 patients suffering harm due to handoff in the prior month. After handoff tool implementation, 97.5% of observed handoffs occurred face-to-face (versus 77.5%, p = 0.014), with 82.5% at bedside (versus 42.5%, p < 0.001), and all in English. Handoff was streamlined from 7 possible pathways to 3. Most post-intervention survey respondents reported improved workflow (77.8%) and safety (83.3%); none reported patient harm. Respondents and observers noted reduced inefficiency (p < 0.05). Our standardized tool increased face-to-face and bedside handoff, positively impacted workflow, and increased perceptions of safety by EM physicians in an international, non-academic setting. Our three-step approach can be applied towards developing standardized, context-specific inter-specialty handoff in a variety of settings.

  16. Conceptualisation and development of the Conversational Health Literacy Assessment Tool (CHAT).

    PubMed

    O'Hara, Jonathan; Hawkins, Melanie; Batterham, Roy; Dodson, Sarity; Osborne, Richard H; Beauchamp, Alison

    2018-03-22

    The aim of this study was to develop a tool to support health workers' ability to identify patients' multidimensional health literacy strengths and challenges. The tool was intended to be suitable for administration in healthcare settings where health workers must identify health literacy priorities as the basis for person-centred care. Development was based on a qualitative co-design process that used the Health Literacy Questionnaire (HLQ) as a framework to generate questions. Health workers were recruited to participate in an online consultation, a workshop, and two rounds of pilot testing. Participating health workers identified and refined ten questions that target five areas of assessment: supportive professional relationships, supportive personal relationships, health information access and comprehension, current health behaviours, and health promotion barriers and support. Preliminary evidence suggests that application of the Conversational Health Literacy Assessment Tool (CHAT) can support health workers to better understand the health literacy challenges and supportive resources of their patients. As an integrated clinical process, the CHAT can supplement existing intake and assessment procedures across healthcare settings to give insight into patients' circumstances so that decisions about care can be tailored to be more appropriate and effective.

  17. Design of Phoneme MIDI Codes Using the MIDI Encoding Tool “Auto-F” and Realizing Voice Synthesizing Functions Based on Musical Sounds

    NASA Astrophysics Data System (ADS)

    Modegi, Toshio

    Using our previously developed audio-to-MIDI conversion tool “Auto-F”, we can create MIDI data from given vocal acoustic signals, making it possible to play back voice-like signals on a standard MIDI synthesizer. Applying this tool, we are constructing a MIDI database consisting of simple, harmonically structured MIDI codes converted from recordings of 71 Japanese male and female syllables. We are also developing a novel voice synthesizing system based on harmonic synthesis of musical sounds, which can generate MIDI data from plain Japanese (kana) text and play back voice signals on a MIDI synthesizer by referring to the syllable MIDI code database. In this paper, we propose an improved MIDI converter tool that can produce MIDI codes with higher temporal resolution. We then propose an algorithm for separating a set of 20 consonant and vowel phoneme MIDI codes from the 71 converted syllable MIDI codes in order to construct the voice synthesizing system. Finally, we present evaluation results comparing the voice synthesizing quality of the separated phoneme MIDI codes with that of the original syllable MIDI codes, based on our 4-syllable word listening tests.
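
    Whatever the converter's internals, the step from a detected harmonic peak to a MIDI code rests on the standard frequency-to-note mapping sketched below; the vowel-like spectrum is synthetic, and rounding to the nearest semitone is a simplification of what a real converter such as "Auto-F" does.

        # Standard frequency-to-MIDI-note mapping (A4 = 440 Hz = note 69).
        import math

        def freq_to_midi(freq_hz):
            return round(69 + 12 * math.log2(freq_hz / 440.0))

        f0 = 150.0  # synthetic fundamental of a vowel-like sound
        for k in range(1, 5):
            peak = k * f0  # harmonics sit at integer multiples of the fundamental
            print(f"harmonic {k}: {peak:6.1f} Hz -> MIDI note {freq_to_midi(peak)}")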

  18. Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases

    PubMed Central

    Amos, Christopher I.; Bafna, Vineet; Hauser, Elizabeth R.; Hernandez, Ryan D.; Li, Chun; Liberles, David A.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Papanicolaou, George J.; Peng, Bo; Ritchie, Marylyn D.; Rosenfeld, Gabriel; Witte, John S.

    2014-01-01

    Genetic simulation programs are used to model data under specified assumptions to facilitate the understanding and study of complex genetic systems. Standardized data sets generated using genetic simulation are essential for the development and application of novel analytical tools in genetic epidemiology studies. With continuing advances in high-throughput genomic technologies and generation and analysis of larger, more complex data sets, there is a need for updating current approaches in genetic simulation modeling. To provide a forum to address current and emerging challenges in this area, the National Cancer Institute (NCI) sponsored a workshop, entitled “Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases” at the National Institutes of Health (NIH) in Bethesda, Maryland on March 11-12, 2014. The goals of the workshop were to: (i) identify opportunities, challenges and resource needs for the development and application of genetic simulation models; (ii) improve the integration of tools for modeling and analysis of simulated data; and (iii) foster collaborations to facilitate development and applications of genetic simulation. During the course of the meeting the group identified challenges and opportunities for the science of simulation, software and methods development, and collaboration. This paper summarizes key discussions at the meeting, and highlights important challenges and opportunities to advance the field of genetic simulation. PMID:25371374

  19. On transform coding tools under development for VP10

    NASA Astrophysics Data System (ADS)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second-generation codec released by the WebM project, VP9, is currently served by YouTube and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-generation codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools has already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.
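
    One way to read "increasing the flexibility of available transforms" is that the encoder may try several transform types per residue block and keep whichever compacts energy into the fewest coefficients. The sketch below illustrates that selection idea with a 1-D DCT/DST pair; it is an illustration of the principle, not VP10's actual decision logic.

        # Choose DCT or DST per residue block by energy compaction (illustrative).
        import numpy as np
        from scipy.fft import dct, dst

        block = np.random.default_rng(1).normal(size=8)  # a 1-D residue row;
        # codecs apply such transforms separably in 2-D.

        def compaction(coeffs, k=3):
            # Fraction of energy captured by the k largest-magnitude coefficients.
            e = np.sort(np.abs(coeffs))[::-1]
            return (e[:k] ** 2).sum() / (e ** 2).sum()

        candidates = {"DCT-II": dct(block, type=2, norm="ortho"),
                      "DST-II": dst(block, type=2, norm="ortho")}
        print("chosen transform:", max(candidates, key=lambda n: compaction(candidates[n])))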

  20. Can I solve my structure by SAD phasing? Planning an experiment, scaling data and evaluating the useful anomalous correlation and anomalous signal

    PubMed Central

    Terwilliger, Thomas C.; Bunkóczi, Gábor; Hung, Li-Wei; Zwart, Peter H.; Smith, Janet L.; Akey, David L.; Adams, Paul D.

    2016-01-01

    A key challenge in the SAD phasing method is solving a structure when the anomalous signal-to-noise ratio is low. Here, algorithms and tools for evaluating and optimizing the useful anomalous correlation and the anomalous signal in a SAD experiment are described. A simple theoretical framework [Terwilliger et al. (2016), Acta Cryst. D72, 346–358] is used to develop methods for planning a SAD experiment, scaling SAD data sets and estimating the useful anomalous correlation and anomalous signal in a SAD data set. The phenix.plan_sad_experiment tool uses a database of solved and unsolved SAD data sets and the expected characteristics of a SAD data set to estimate the probability that the anomalous substructure will be found in the SAD experiment and the expected map quality that would be obtained if the substructure were found. The phenix.scale_and_merge tool scales unmerged SAD data from one or more crystals using local scaling and optimizes the anomalous signal by identifying the systematic differences among data sets, and the phenix.anomalous_signal tool estimates the useful anomalous correlation and anomalous signal after collecting SAD data and estimates the probability that the data set can be solved and the likely figure of merit of phasing. PMID:26960123

  1. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was originally developed around VARS (Variogram Analysis of Response Surfaces), a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it simultaneously generates three philosophically different families of global sensitivity metrics: (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - the VARS approach), (2) variance-based total-order effects (the Sobol approach), and (3) derivative-based elementary effects (the Morris approach). VARS-TOOL also provides two novel features. The first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features, in conjunction with bootstrapping, enable the user to monitor the stability, robustness, and convergence of GSA as the sample size increases for any given case study. VARS-TOOL has been shown to achieve robust and stable results with sample sizes (numbers of model runs) 1-2 orders of magnitude smaller than those required by alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development, and new capabilities and features are forthcoming.
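
    PLHS builds on ordinary Latin hypercube sampling, which stratifies each parameter's range so that every interval is sampled exactly once. A minimal sketch of that base operation follows; PLHS additionally preserves these properties while the sample grows, which is not shown here, and the two-parameter setup is arbitrary.

        # Plain Latin hypercube sampling over the unit hypercube (numpy only).
        import numpy as np

        def latin_hypercube(n_samples, n_params, rng):
            sample = np.empty((n_samples, n_params))
            for j in range(n_params):
                # One stratified draw per interval, shuffled to decouple columns.
                jitter = rng.random(n_samples)
                sample[:, j] = rng.permutation((np.arange(n_samples) + jitter) / n_samples)
            return sample

        X = latin_hypercube(10, 2, np.random.default_rng(42))
        print(X)  # each column hits every tenth of [0, 1) exactly once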

  2. Propulsion Diagnostic Method Evaluation Strategy (ProDiMES) User's Guide

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.

    2010-01-01

    This report is a User's Guide for the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES). ProDiMES is a standard benchmarking problem and a set of evaluation metrics to enable the comparison of candidate aircraft engine gas path diagnostic methods. This MATLAB (The MathWorks, Inc.)-based software tool enables users to independently develop and evaluate diagnostic methods. Additionally, a set of blind test case data is also distributed as part of the software, enabling side-by-side comparison of diagnostic approaches developed by multiple users. The User's Guide describes the various components of ProDiMES and provides instructions for the installation and operation of the tool.

  3. A software technology evaluation program

    NASA Technical Reports Server (NTRS)

    Novaes-Card, David N.

    1985-01-01

    A set of quantitative approaches is presented for evaluating software development methods and tools. The basic idea is to generate a set of goals that are refined into quantifiable questions, which in turn specify metrics to be collected on the software development and maintenance process and product. These metrics can be used to characterize, evaluate, predict, and motivate. They can be used in an active as well as a passive way, by learning from analyzing the data and improving the methods and tools based upon what is learned from that analysis. Several examples were given representing each of the different approaches to evaluation. The cost of the approaches varied inversely with the level of confidence in the interpretation of the results.
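
    The goal-question-metric refinement described above maps naturally onto a small data structure. The sketch below is illustrative only; the example goal, questions, and metrics are invented, and the report itself prescribes no particular code.

        # Goals refine into quantifiable questions, which specify metrics to collect.
        from dataclasses import dataclass, field

        @dataclass
        class Metric:
            name: str

        @dataclass
        class Question:
            text: str
            metrics: list[Metric] = field(default_factory=list)

        @dataclass
        class Goal:
            statement: str
            questions: list[Question] = field(default_factory=list)

        goal = Goal("Evaluate whether tool X reduces maintenance cost",
                    [Question("How many defects escape to maintenance?",
                              [Metric("post-release defects per KLOC")]),
                     Question("How long does a typical fix take?",
                              [Metric("mean repair hours")])])
        for q in goal.questions:
            print(q.text, "->", [m.name for m in q.metrics])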

  4. Knowledge-based approach for generating target system specifications from a domain model

    NASA Technical Reports Server (NTRS)

    Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan

    1992-01-01

    Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.

  5. Outdoor environmental assessment of attention promoting settings for preschool children.

    PubMed

    Mårtensson, F; Boldemann, C; Söderström, M; Blennow, M; Englund, J-E; Grahn, P

    2009-12-01

    The restorative potential of green outdoor environments for children in preschool settings was investigated by measuring the attention of children playing in settings with different environmental features. Eleven preschools with outdoor environments typical for the Stockholm area were assessed using the outdoor play environment categories (OPEC) and the fraction of visible sky from play structures (sky view factor), and 198 children, aged 4.5-6.5 years, were rated by the staff for inattentive, hyperactive and impulsive behaviors with the ECADDES tool. Children playing in large and integrated outdoor areas containing large areas of trees, shrubbery and a hilly terrain showed behaviors of inattention less often (p<.05). The choice of tool for assessing attention is discussed in relation to the characteristics of outdoor stay and play in Swedish preschool settings. The results indicate that the restorative potential of green outdoor environments applies to preschool children as well, and that environmental assessment tools such as OPEC can be useful when locating and developing health-promoting land adjacent to preschools.

  6. Method for automation of tool preproduction

    NASA Astrophysics Data System (ADS)

    Rychkov, D. A.; Yanyushkin, A. S.; Lobanov, D. V.; Arkhipov, P. V.

    2018-03-01

    The primary objective of tool production is the creation or selection of a tool design that secures high process efficiency and tool availability, as well as the quality of the machined surfaces, with minimum means and resources spent. Selecting the appropriate tool from the set of variants takes much of the time of the staff engaged in tool preparation. Program software has been developed to solve this problem; it helps to create, systematize and carry out a comparative analysis of tool designs in order to identify the rational variant under given production conditions. The literature indicates that systematization and selection of the rational tool design has been carried out in accordance with the developed modeling technology and comparative design analysis. Applying the software makes it possible to reduce the design period by 80-85% and obtain a significant annual saving.

  7. The Harmony of Physics, Mathematics, and Music: A discovery in mathematical music theory is found to apply in physics

    NASA Astrophysics Data System (ADS)

    Krantz, Richard; Douthett, Jack

    2009-05-01

    Although it is common practice to borrow tools from mathematics to apply to physics or music, it is unusual to use tools developed in music theory to mathematically describe physical phenomena. So called ``Maximally Even Set'' theory fits this unusual case. In this poster, we summarize, by example, the theory of Maximally Even (ME) sets and show how this formalism leads to the distribution of black and white keys on the piano keyboard. We then show how ME sets lead to a generalization of the well-known ``Cycle-of-Fifths'' in music theory. Subsequently, we describe ordering in one-dimensional spin-1/2 anti-ferromagnets using ME sets showing that this description leads to a fractal ``Devil's Staircase'' magnetic phase diagram. Finally, we examine an extension of ME sets, ``Iterated Maximally Even Sets'' that describes chord structure in music.
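
    The canonical maximally even construction distributes k elements among n positions by floor division; for k = 7 in n = 12 it reproduces the diatonic (white-key) pattern mentioned above, up to rotation. A short sketch:

        # Maximally even k-subset of Z_n via the floor-division ("J-function") form.
        def maximally_even(n, k, shift=0):
            return sorted(((i * n + shift) // k) % n for i in range(k))

        print(maximally_even(12, 7))  # [0, 1, 3, 5, 6, 8, 10] -- a diatonic rotation
        print(maximally_even(12, 5))  # [0, 2, 4, 7, 9] -- a pentatonic (black-key) rotation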

  8. The Harmony of Physics, Mathematics, and Music: A discovery in mathematical music theory is found to apply in physics

    NASA Astrophysics Data System (ADS)

    Krantz, Richard; Douthett, Jack

    2009-10-01

    Although it is common practice to borrow tools from mathematics to apply to physics or music, it is unusual to use tools developed in music theory to mathematically describe physical phenomena. So called ``Maximally Even Set'' theory fits this unusual case. In this poster, we summarize, by example, the theory of Maximally Even (ME) sets and show how this formalism leads to the distribution of black and white keys on the piano keyboard. We then show how ME sets lead to a generalization of the well-known ``Cycle-of-Fifths'' in music theory. Subsequently, we describe ordering in one-dimensional spin-1/2 anti-ferromagnets using ME sets showing that this description leads to a fractal ``Devil's Staircase'' magnetic phase diagram. Finally, we examine an extension of ME sets, ``Iterated Maximally Even'' sets that describes chord structure in music.

  9. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing

    PubMed Central

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D’Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-01-01

    Background International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. Objectives This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. Methods We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants’ comprehension of the study information was measured by using a validated digitised audio questionnaire. Results The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants’ ‘recall’ and ‘understanding’ between the first and second visits were statistically significant (F(1,41) = 25.38, p < .00001 and F(1,41) = 31.61, p < .00001, respectively). Conclusions Our locally developed multimedia tool was acceptable and easy to administer among low literacy participants in The Gambia. It also proved to be effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings. PMID:25133065

  10. A Comparison of Optical, Electrochemical, Magnetic, and Colorimetric Point-of-Care Biosensors for Infectious Disease Diagnosis.

    PubMed

    Pashchenko, Oleksandra; Shelby, Tyler; Banerjee, Tuhina; Santra, Santimukul

    2018-06-18

    Each year, infectious diseases are responsible for millions of deaths, most of which occur in the rural areas of developing countries. Many of the infectious disease diagnostic tools used today require a great deal of time, a laboratory setting, and trained personnel. Due to this, the need for effective point-of-care (POC) diagnostic tools is greatly increasing with an emphasis on affordability, portability, sensitivity, specificity, timeliness, and ease of use. In this Review, we discuss the various diagnostic modalities that have been utilized toward this end and are being further developed to create POC diagnostic technologies, and we focus on potential effectiveness in resource-limited settings. The main modalities discussed herein are optical-, electrochemical-, magnetic-, and colorimetric-based modalities utilized in diagnostic technologies for infectious diseases. Each of these modalities feature pros and cons when considering application in POC settings but, overall, reveal a promising outlook for the future of this field of technological development.

  11. Collaborative workbench for cyberinfrastructure to accelerate science algorithm development

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Maskey, M.; Kuo, K.; Lynnes, C.

    2013-12-01

    There are significant untapped resources for information and knowledge creation within the Earth Science community in the form of data, algorithms, services, analysis workflows or scripts, and the related knowledge about these resources. Despite the huge growth in social networking and collaboration platforms, these resources often reside on an investigator's workstation or laboratory and are rarely shared. A major reason for this is that there are very few scientific collaboration platforms, and those that exist typically require the use of a new set of analysis tools and paradigms to leverage the shared infrastructure. As a result, adoption of these collaborative platforms for science research is inhibited by the high cost to an individual scientist of switching from his or her own familiar environment and set of tools to a new environment and tool set. This presentation will describe an ongoing project developing an Earth Science Collaborative Workbench (CWB). The CWB approach will eliminate this barrier by augmenting a scientist's current research environment and tool set to allow him or her to easily share diverse data and algorithms. The CWB will leverage evolving technologies such as commodity computing and social networking to design an architecture for scalable collaboration that will support the emerging vision of an Earth Science Collaboratory. The CWB is being implemented on the robust and open source Eclipse framework and will be compatible with widely used scientific analysis tools such as IDL. The myScience Catalog built into CWB will capture and track metadata and provenance about data and algorithms for the researchers in a non-intrusive manner with minimal overhead. Seamless interfaces to multiple Cloud services will support sharing algorithms, data, and analysis results, as well as access to storage and computer resources. A Community Catalog will track the use of shared science artifacts and manage collaborations among researchers.

  12. Development approach to an enterprise-wide medication reconciliation tool in a free-standing pediatric hospital with commercial best-of-breed systems.

    PubMed

    Yu, Feliciano B; Leising, Scott; Turner, Scott

    2007-10-11

    Medication reconciliation is essential to providing a safer patient environment during transitions of care in the clinical setting. Current solutions include a mixed bag of paper and electronic processes. A best-of-breed health information systems architecture poses a specific challenge to organizations that have limited software development resources. Using readily available service-oriented technology, a prototype for an integrated medication reconciliation tool is developed for use in an academic pediatric hospital with commercial systems.

  13. Common Effects Methodology for Pesticides

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  14. STS 135 Landing

    NASA Image and Video Library

    2017-12-08

    Goddard's Ritsko Wins 2011 SAVE Award. The winner of the 2011 SAVE Award is Matthew Ritsko, a Goddard financial manager. His tool-lending library would track and enable sharing of expensive space-flight tools and hardware after projects no longer need them. This set of images represents the types of tools used at NASA. To read more, go to: www.nasa.gov/topics/people/features/ritsko-save.html. Exploration Systems Project Manager Mike Weiss speaks about a Hubble Servicing Mission hand tool developed at Goddard. Credit: NASA/GSFC/Debbie McCallum

  15. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 1: Executive Summary, of a 15-Volume Set of Skills Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…

  16. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 15: Administrative Information, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This volume developed by the Machine Tool Advanced Skill Technology (MAST) program contains key administrative documents and provides additional sources for machine tool and precision manufacturing information and important points of contact in the industry. The document contains the following sections: a foreword; grant award letter; timeline for…

  17. Reliability and Validity of the Alberta Context Tool (ACT) with Professional Nurses: Findings from a Multi-Study Analysis

    PubMed Central

    Squires, Janet E.; Hayduk, Leslie; Hutchinson, Alison M.; Mallick, Ranjeeta; Norton, Peter G.; Cummings, Greta G.; Estabrooks, Carole A.

    2015-01-01

    Although organizational context is central to evidence-based practice, underdeveloped measurement hinders its assessment. The Alberta Context Tool, comprising 59 items that tap 10 modifiable contextual concepts, was developed to address this gap. The purpose of this study was to examine the reliability and validity of scores obtained when the Alberta Context Tool is completed by professional nurses across different healthcare settings. Five separate studies (N = 2361 nurses across different care settings) comprised the study sample. Reliability and validity were assessed. Cronbach's alpha exceeded 0.70 for 9/10 Alberta Context Tool concepts. Item-total correlations exceeded acceptable standards for 56/59 items. Confirmatory factor analyses aligned acceptably with the Alberta Context Tool's proposed latent structure. The mean values for each Alberta Context Tool concept increased from low to high levels of research utilization (as hypothesized), further supporting its validity. This study provides robust evidence for the reliability and validity of scores obtained with the Alberta Context Tool when administered to professional nurses. PMID:26098857

  18. Development and field testing of a decision support tool to facilitate shared decision making in contraceptive counseling.

    PubMed

    Dehlendorf, Christine; Fitzpatrick, Judith; Steinauer, Jody; Swiader, Lawrence; Grumbach, Kevin; Hall, Cara; Kuppermann, Miriam

    2017-07-01

    We developed and formatively evaluated a tablet-based decision support tool for use by women prior to a contraceptive counseling visit to help them engage in shared decision making regarding method selection. Drawing upon formative work around women's preferences for contraceptive counseling and conceptual understanding of health care decision making, we iteratively developed a storyboard and then digital prototypes, based on best practices for decision support tool development. Pilot testing using both quantitative and qualitative data and cognitive testing was conducted. We obtained feedback from patient and provider advisory groups throughout the development process. Ninety-six percent of women who used the tool in pilot testing reported that it helped them choose a method, and qualitative interviews indicated acceptability of the tool's content and presentation. Compared to the control group, women who used the tool demonstrated trends toward increased likelihood of complete satisfaction with their method. Participant responses to cognitive testing were used in tool refinement. Our decision support tool appears acceptable to women in the family planning setting. Formative evaluation of the tool supports its utility among patients making contraceptive decisions, which can be further evaluated in a randomized controlled trial. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. GIS learning tool for world's largest earthquakes and their causes

    NASA Astrophysics Data System (ADS)

    Chatterjee, Moumita

    The objective of this thesis is to increase awareness about earthquakes among people, especially young students, by showing the locations of the five largest and two most predictable earthquakes in the world and their plate tectonic settings. This geography-based interactive tool can be used to learn about the causes of great earthquakes in the past and the safest places on earth for avoiding their direct effects. The approach provides an effective way of learning for students, as it is user friendly and aligned with the interests of the younger generation. In this tool, the user can click on points located on the world map; each opens a picture and a link to a webpage for that point, showing detailed information on the earthquake history of the place, including the magnitudes and years of past quakes and the plate tectonic setting that made the place earthquake-prone. Besides viewing earthquake-related information, students can customize the tool to suit their needs or interests: they can add and remove layers, measure the distance between any two points on the map, select a place and retrieve more information about it, create a layer for detailed analysis, run queries, change display settings, and so on. At the end, the user is guided through earthquake safety guidelines for staying safe during an earthquake. The tool is written in Java and uses Map Objects Java Edition (MOJO), provided by ESRI. It was developed for educational purposes, so its interface has been kept simple and easy to use so that students can gain maximum knowledge from it rather than struggling to install it; the only requirement to run it is an up-to-date Java edition installed on the machine. There are many more details to explore that illustrate what a GIS-based tool is capable of. This approach makes study more fun and interactive while educating students about a very important natural disaster that has threatened us in recent years, and the tool was developed to increase awareness of the causes and effects of earthquakes and of how to stay safe when such a disaster happens.

  20. An interactive distance solution for stroke rehabilitation in the home setting - A feasibility study.

    PubMed

    Palmcrantz, Susanne; Borg, Jörgen; Sommerfeld, Disa; Plantin, Jeanette; Wall, Anneli; Ehn, Maria; Sjölinder, Marie; Boman, Inga-Lill

    2017-09-01

    In this study, an interactive distance solution (the DISKO tool) was developed to enable home-based motor training after stroke. The overall aim was to explore the feasibility and safety of using the DISKO tool, customized for interactive stroke rehabilitation in the home setting, in different rehabilitation phases after stroke. Fifteen patients in three different stages of the continuum of rehabilitation after stroke participated in a home-based training program using the DISKO tool. The program included 15 training sessions with recurrent follow-ups by a physiotherapist through the integrated video communication application. Safety and feasibility were assessed with input from patients, physiotherapists, and a technician, using logbooks, interviews, and a questionnaire. Qualitative content analysis and descriptive statistics were used in the analysis. Fourteen out of 15 patients completed the training period, with a mean of 19.5 minutes spent on training at each session. The DISKO tool was found to be useful and safe by patients and physiotherapists. This study demonstrates the feasibility and safety of the DISKO tool and provides guidance for further development and testing of interactive distance technology for home rehabilitation, to be used by health care professionals and patients in different phases of rehabilitation after stroke.

  1. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools are passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported re-engineering to be the primary concern of over four hundred of the top MIS executives.

  2. A service concept and tools to improve maternal and newborn health in Nigeria and Uganda.

    PubMed

    Salgado, Mariana; Wendland, Melanie; Rodriguez, Damaris; Bohren, Meghan A; Oladapo, Olufemi T; Ojelade, Olubunmi A; Mugerwa, Kidza; Fawole, Bukola

    2017-12-01

    The "Better Outcomes in Labor Difficulty" (BOLD) project used a service design process to design a set of tools to improve quality of care during childbirth by strengthening linkages between communities and health facilities in Nigeria and Uganda. This paper describes the Passport to Safer Birth concept and the tools developed as a result. Service design methods were used to identify facilitators and barriers to quality care, and to develop human-centered solutions. The service design process had three phases: Research for Design, Concept Design, and Detail Design, undertaken in eight hospitals and catchment communities. The service concept "Better Beginnings" comprises three tools. The "Pregnancy Purse" provides educational information to women throughout pregnancy. The "Birth Board" is a visual communication tool that presents the labor and childbirth process. The "Family Pass" is a set of wearable passes for the woman and her supporter to facilitate communication of care preferences. The Better Beginnings service concept and tools form the basis for the promotion of access to information and knowledge acquisition, and could improve communication between the healthcare provider, the woman, and her family during childbirth. © 2017 International Federation of Gynecology and Obstetrics. The World Health Organization retains copyright and all other rights in the manuscript of this article as submitted for publication.

  3. A comprehensive comparison of tools for differential ChIP-seq analysis

    PubMed Central

    Steinhauser, Sebastian; Kurzawa, Nils; Eils, Roland

    2016-01-01

    ChIP-seq has become a widely adopted genomic assay in recent years to determine binding sites for transcription factors or enrichments for specific histone modifications. Besides detection of enriched or bound regions, an important question is to determine differences between conditions. While this is a common analysis for gene expression, for which a large number of computational approaches have been validated, the same question for ChIP-seq is particularly challenging owing to the complexity of ChIP-seq data in terms of noisiness and variability. Many different tools have been developed and published in recent years. However, a comprehensive comparison and review of these tools is still missing. Here, we have reviewed 14 tools, which have been developed to determine differential enrichment between two conditions. They differ in their algorithmic setups and in their range of applicability. Hence, we have benchmarked these tools on real data sets for transcription factors and histone modifications, as well as on simulated data sets, to quantitatively evaluate their performance. Overall, there is a great variety in the type of signal detected by these tools, with a surprisingly low level of agreement. Depending on the type of analysis performed, the choice of method will crucially impact the outcome. PMID:26764273

  4. De-MetaST-BLAST: A Tool for the Validation of Degenerate Primer Sets and Data Mining of Publicly Available Metagenomes

    PubMed Central

    Gulvik, Christopher A.; Effler, T. Chad; Wilhelm, Steven W.; Buchan, Alison

    2012-01-01

    Development and use of primer sets to amplify nucleic acid sequences of interest are fundamental to studies spanning many life science disciplines, and validation of primer sets is therefore essential. Several computer programs have been created to aid in the initial selection of primer sequences that may or may not require multiple nucleotide combinations (i.e., degeneracies). Conversely, validation of primer specificity has remained largely unchanged for several decades, and there are currently few available programs that allow for evaluation of primers containing degenerate nucleotide bases. To fill this gap, we developed the program De-MetaST, which performs an in silico amplification using user-defined nucleotide sequence dataset(s) and primer sequences that may contain degenerate bases. The program returns an output file that contains the in silico amplicons. When De-MetaST is paired with NCBI's BLAST (De-MetaST-BLAST), the program also returns the top 10 nr NCBI database hits for each recovered in silico amplicon. While the original motivation for development of this search tool was degenerate primer validation using the wealth of nucleotide sequences available in environmental metagenome and metatranscriptome databases, this search tool has potential utility in many data mining applications. PMID:23189198
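
    The De-MetaST source itself is not reproduced in this record; purely as an illustration of the core idea - expanding IUPAC degenerate bases into a regular expression and scanning a template for spans bounded by the forward primer and the reverse complement of the reverse primer - a minimal Python sketch (all sequences and names invented) might look like:

        import re

        # IUPAC degenerate nucleotide codes as regex character classes.
        IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
                 "R": "[AG]", "Y": "[CT]", "S": "[CG]", "W": "[AT]",
                 "K": "[GT]", "M": "[AC]", "B": "[CGT]", "D": "[AGT]",
                 "H": "[ACT]", "V": "[ACG]", "N": "[ACGT]"}
        COMPLEMENT = str.maketrans("ACGTRYSWKMBDHVN", "TGCAYRSWMKVHDBN")

        def primer_to_regex(primer):
            """Translate a possibly degenerate primer into a regex pattern."""
            return "".join(IUPAC[base] for base in primer.upper())

        def reverse_complement(seq):
            return seq.translate(COMPLEMENT)[::-1]

        def in_silico_pcr(template, fwd, rev, max_len=2000):
            """Yield (position, amplicon) pairs bounded by the forward primer
            and the reverse complement of the reverse primer."""
            pattern = re.compile("(%s.{0,%d}?%s)" % (
                primer_to_regex(fwd), max_len,
                primer_to_regex(reverse_complement(rev))))
            for m in pattern.finditer(template.upper()):
                yield m.start(), m.group(1)

        # Toy example: M matches A or C in the forward primer.
        for pos, amp in in_silico_pcr("AAGGAGGCTTTTTTAATACCAA", "GGMGGC", "TGGTAT"):
            print(pos, amp)  # -> 2 GGAGGCTTTTTTAATACCA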

  5. Assessment of COTS IR image simulation tools for ATR development

    NASA Astrophysics Data System (ADS)

    Seidel, Heiko; Stahl, Christoph; Bjerkeli, Frode; Skaaren-Fystro, Paal

    2005-05-01

    Following the trend toward increased use of imaging sensors in military aircraft, future fighter pilots will need onboard artificial intelligence, e.g. ATR, to aid them in image interpretation and target designation. The European Aeronautic Defence and Space Company (EADS) in Germany has developed an advanced method for automatic target recognition (ATR) which is based on adaptive neural networks. This ATR method can assist the crew of military aircraft like the Eurofighter in sensor image monitoring and thereby reduce the workload in the cockpit and increase mission efficiency. The EADS ATR approach can be adapted for imagery of visual, infrared and SAR sensors because of the training-based classifiers of the ATR method. For optimal adaptation, these classifiers have to be trained with appropriate and sufficient image data. The training images must show the target objects from different aspect angles, ranges, environmental conditions, etc. Incomplete training sets lead to a degradation of classifier performance. Additionally, ground-truth information, i.e. scenario conditions such as class type and position of targets, is necessary for the optimal adaptation of the ATR method. In summer 2003, EADS started a cooperation with Kongsberg Defence & Aerospace (KDA) from Norway. The EADS/KDA approach is to provide additional image data sets for training-based ATR through IR image simulation. The joint study aims to investigate the benefits of enhancing incomplete training sets for classifier adaptation with simulated synthetic imagery. EADS/KDA identified the requirements of a commercial-off-the-shelf IR simulation tool capable of delivering appropriate synthetic imagery for ATR development, and performed a market study of available IR simulation tools and suppliers. The most promising tool was then benchmarked against several criteria, e.g. thermal emission model, sensor model, target model and non-radiometric image features, resulting in a recommendation. The synthetic image data used in the investigation were generated with the recommended tool. Within the scope of this study, ATR performance on IR imagery using classifiers trained on real, synthetic and mixed image sets was evaluated. The performance of the adapted classifiers is assessed using recorded IR imagery with known ground truth, and recommendations are given for the use of COTS IR image simulation tools for ATR development.
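
    The EADS classifiers themselves are proprietary; as a generic, hedged sketch of the evaluation protocol described above - training on real-only, synthetic-only and mixed feature sets and scoring each against held-out real imagery with known ground truth - one might write (random stand-in features, invented domain shift):

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(0)

        def fake_features(n, shift):
            """Stand-in for feature vectors extracted from IR image chips;
            'shift' mimics the domain gap between real and simulated imagery."""
            X = rng.normal(shift, 1.0, size=(n, 16))
            y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
            return X, y

        X_real, y_real = fake_features(600, shift=1.0)
        X_syn, y_syn = fake_features(600, shift=1.2)    # synthetic: similar, not identical
        X_test, y_test = fake_features(400, shift=1.0)  # held-out "real" test imagery

        training_sets = {
            "real only": (X_real, y_real),
            "synthetic only": (X_syn, y_syn),
            "mixed": (np.vstack([X_real, X_syn]), np.concatenate([y_real, y_syn])),
        }
        for name, (X, y) in training_sets.items():
            clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                                random_state=0).fit(X, y)
            print(name, accuracy_score(y_test, clf.predict(X_test)))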

  6. Web tools for predictive toxicology model building.

    PubMed

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry now has more than 15 years of history. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges for the management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  7. Merging and Visualization of Archived Oceanographic Acoustic, Optical, and Sensor Data to Support Improved Access and Interpretation

    NASA Astrophysics Data System (ADS)

    Malik, M. A.; Cantwell, K. L.; Reser, B.; Gray, L. M.

    2016-02-01

    Marine researchers and managers routinely rely on interdisciplinary data sets collected using hull-mounted sonars, towed sensors, or submersible vehicles. These data sets can be broadly categorized into acoustic remote sensing, imagery-based observations, water property measurements, and physical samples. The resulting raw data sets are overwhelmingly large and complex, and often require specialized software and training to process. To address these challenges, NOAA's Office of Ocean Exploration and Research (OER) is developing tools to improve the discoverability of raw data sets and the integration of quality-controlled processed data in order to facilitate re-use of archived oceanographic data. The majority of recently collected OER raw oceanographic data can be retrieved from national data archives (e.g., NCEI and the NOAA Central Library). Merging of disparate data sets by scientists with diverse expertise, however, remains problematic. Initial efforts at OER have focused on merging geospatial acoustic remote sensing data with imagery and water property measurements that typically lack direct geo-referencing. OER has developed 'smart' ship and submersible tracks that provide a synopsis of the geospatial coverage of various data sets. Tools under development enable scientists to quickly assess the relevance of archived OER data to their respective research or management interests, and enable quick access to the desired raw and processed data sets. Pre-processing and visualization to combine various data sets also offer benefits in streamlining data quality assurance and quality control efforts.
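
    The OER tools themselves are not published in this record; the underlying step of attaching positions from a navigation track to measurements that lack direct geo-referencing is, at its core, a nearest-timestamp join, sketched here with pandas (column names and values invented):

        import pandas as pd

        # Vehicle navigation track: timestamped positions.
        nav = pd.DataFrame({
            "time": pd.to_datetime(["2016-02-01 10:00:00", "2016-02-01 10:00:30",
                                    "2016-02-01 10:01:00"]),
            "lat": [24.001, 24.002, 24.003],
            "lon": [-81.001, -81.002, -81.003],
        })

        # Water property samples with timestamps but no positions.
        ctd = pd.DataFrame({
            "time": pd.to_datetime(["2016-02-01 10:00:12", "2016-02-01 10:00:55"]),
            "temperature_c": [22.4, 22.1],
        })

        # Nearest-in-time join attaches a position to each measurement,
        # rejecting matches more than 30 seconds away.
        merged = pd.merge_asof(ctd.sort_values("time"), nav.sort_values("time"),
                               on="time", direction="nearest",
                               tolerance=pd.Timedelta("30s"))
        print(merged)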

  8. Closeout of CRADA JSA 2012S004: Chapter 5, Integrated Control System, of the document of the ESS Conceptual Design Report, publicly available at https://europeanspallationsource.se/accelerator-documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Satogata, Todd

    2013-04-22

    The integrated control system (ICS) is responsible for the whole ESS machine and facility: accelerator, target, neutron scattering instruments and conventional facilities. This unified approach keeps the costs of development, maintenance and support relatively low. ESS has selected a standardised, field-proven controls framework, the Experimental Physics and Industrial Control System (EPICS), which was originally developed jointly by Argonne and Los Alamos National Laboratories. Complementing this selection are best practices and experience from similar facilities regarding platform standardisation, control system development and device integration and commissioning. The components of ICS include the control system core, the control boxes, the BLED database management system, and the human machine interface. The control system core is a set of systems and tools that make it possible for the control system to provide required data, information and services to engineers, operators, physicists and the facility itself. The core components are the timing system that makes possible clock synchronisation across the facility, the machine protection system (MPS) and the personnel protection system (PPS) that prevent damage to the machine and personnel, and a set of control system services. Control boxes are servers that control a collection of equipment (for example a radio frequency cavity). The integrated control system will include many control boxes that can be assigned to one supplier, such as an internal team, a collaborating institute or a commercial vendor. This approach facilitates a clear division of responsibilities and makes integration much easier. A control box is composed of a standardised hardware platform, components, development tools and services. On the top level, it interfaces with the core control system components (timing, MPS, PPS) and with the human-machine interface. At the bottom, it interfaces with the equipment and parts of the facility through a set of analog and digital signals, real-time control loops and other communication buses. The ICS central data management system is named BLED (beam line element databases). BLED is a set of databases, tools and services that is used to store, manage and access data. It holds vital control system configuration and physics-related (lattice) information about the accelerator, target and instruments. It facilitates control system configuration by bringing together direct input-output controller (IOC) configuration and real-time data from proton and neutron beam line models. BLED also simplifies development and speeds up the code-test-debug cycle. The set of tools that access BLED will be tailored to the needs of different categories of users, such as ESS staff physicists, engineers, and operators; external partner laboratories; and visiting experimental instrument users. The human-machine interface is vital to providing a high-quality experience to ICS users. It encompasses a wide array of devices and software tools, from control room screens to engineer terminal windows; from beam physics data tools to post-mortem data analysis tools. It serves users with a wide range of skills from widely varied backgrounds. The Controls Group is developing a set of user profiles to accommodate this diverse range of use-cases and users.
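
    EPICS is the one concretely named technology here; as a minimal sketch of what client access to a control box's process variables (PVs) looks like over Channel Access, using the pyepics Python bindings - the PV names below are hypothetical, not actual ESS names:

        from epics import caget, caput, camonitor

        # Write a setpoint to, and read a measurement back from, a control box.
        caput("LINAC:RF:CAV1:SetPoint", 4.2)
        amplitude = caget("LINAC:RF:CAV1:Amplitude")
        print("cavity amplitude:", amplitude)

        def on_change(pvname=None, value=None, **kw):
            """Callback fired on every PV update (e.g. for operator displays)."""
            print(pvname, "->", value)

        camonitor("LINAC:RF:CAV1:Amplitude", callback=on_change)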

  9. Improving hydrologic disaster forecasting and response for transportation by assimilating and fusing NASA and other data sets : final report.

    DOT National Transportation Integrated Search

    2017-04-15

    In this 3-year project, the research team developed the Hydrologic Disaster Forecast and Response (HDFR) system, a set of integrated software tools for end users that streamlines hydrologic prediction workflows involving automated retrieval of hetero...

  10. Developing a Policy for Delegation of Nursing Care in the School Setting

    ERIC Educational Resources Information Center

    Spriggle, Melinda

    2009-01-01

    School nurses are in a unique position to provide care for students with special health care needs in the school setting. The incidence of chronic conditions and improved technology necessitate care of complex health care needs that had formerly been managed in inpatient settings. Delegation is a tool that may be used by registered nurses to allow…

  11. Development of a regional groundwater flow model for the area of the Idaho National Engineering Laboratory, Eastern Snake River Plain Aquifer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCarthy, J.M.; Arnett, R.C.; Neupauer, R.M.

    This report documents a study conducted to develop a regional groundwater flow model for the Eastern Snake River Plain Aquifer in the area of the Idaho National Engineering Laboratory. The model was developed to support Waste Area Group 10, Operable Unit 10-04 groundwater flow and transport studies. The products of this study are this report and a set of computational tools designed to numerically model the regional groundwater flow in the Eastern Snake River Plain aquifer. The objective of developing the current model was to create a tool for defining the regional groundwater flow at the INEL. The model was developed to (a) support future transport modeling for WAG 10-04 by providing the regional groundwater flow information needed for the WAG 10-04 risk assessment, (b) define the regional groundwater flow setting for modeling groundwater contaminant transport at the scale of the individual WAGs, (c) provide a tool for improving the understanding of the groundwater flow system below the INEL, and (d) consolidate the existing regional groundwater modeling information into one usable model. The current model is appropriate for defining the regional flow setting for flow submodels as well as hypothesis testing to better understand the regional groundwater flow in the area of the INEL. The scale of the submodels must be chosen based on the accuracy required for the study.
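
    The report's computational tools are not included in this record; purely for intuition about what a regional flow model computes, steady-state two-dimensional flow in a homogeneous aquifer reduces to Laplace's equation for hydraulic head, which a few lines of Python can relax on a toy grid with fixed-head boundaries (all values invented):

        import numpy as np

        nx, ny = 50, 50
        h = np.zeros((ny, nx))
        h[:, 0] = 100.0                            # fixed head, upgradient boundary (m)
        h[:, -1] = 90.0                            # fixed head, downgradient boundary (m)
        h[0, :] = np.linspace(100.0, 90.0, nx)
        h[-1, :] = np.linspace(100.0, 90.0, nx)

        for _ in range(5000):                      # Jacobi relaxation of Laplace's equation
            h[1:-1, 1:-1] = 0.25 * (h[:-2, 1:-1] + h[2:, 1:-1] +
                                    h[1:-1, :-2] + h[1:-1, 2:])

        K = 1e-4                                   # hydraulic conductivity, m/s (illustrative)
        qy, qx = np.gradient(-K * h)               # Darcy flux components, q = -K grad(h)
        print("head range:", h.min(), h.max())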

  12. Development of an Epilepsy Nursing Communication Tool: Improving the Quality of Interactions Between Nurses and Patients With Seizures

    PubMed Central

    Buelow, Janice; Miller, Wendy; Fishman, Jesse

    2018-01-01

    ABSTRACT Background: Nurses have become increasingly involved in overseeing the management of patients with complex medical conditions, including those with epilepsy. Nurses who are not specialists in epilepsy can play a central role in providing optimal care, education, and support to their patients with epilepsy, given the proper tools. Objective: Our objective was to create a tool that can be used by nurses in the clinic setting to help facilitate discussion of topics relevant to enhancing medical care and management of patients with epilepsy. To address this need, a panel of epilepsy nursing experts used a patient-centered care approach to develop an Epilepsy Nursing Communication Tool (ENCT). Methods: An initial set of topics and questions was created based on findings from a literature review. Eight nurse experts reviewed and revised the ENCT using focus groups and discussion forums. The revised ENCT was provided to nurses who care for patients with epilepsy but had not been involved in ENCT development. Nurses were asked to rate the usability and feasibility on a 5-point scale to assess whether the tool captured important topics and was easy to use. Results: Ten nurses provided usability and feasibility assessments. Results indicated strong tool utility, with median scores of 4.5, 4, and 4 for usefulness, ease of use, and acceptability, respectively. Conclusions: The preliminary ENCT shows promise in providing a tool that nurses can use in their interactions with patients with epilepsy to help address the complexity of disease management, which may help improve overall patient care. PMID:29505437

  13. Incorporating Online Tools in Tertiary Education

    ERIC Educational Resources Information Center

    Steenkamp, Leon P.; Rudman, Riaan J.

    2013-01-01

    Students currently studying at tertiary institutions have developed a set of attitudes and aptitudes as a result of growing up in an IT and media-rich environment. These attitudes and aptitudes influence how they learn and in order to be effective, lecturers must adapt to address their learning preferences and use the online teaching tools that…

  14. Visual Tools for Eliciting Connections and Cohesiveness in Mixed Methods Research

    ERIC Educational Resources Information Center

    Murawska, Jaclyn M.; Walker, David A.

    2017-01-01

    In this commentary, we offer a set of visual tools that can assist education researchers, especially those in the field of mathematics, in developing cohesiveness from a mixed methods perspective, commencing at a study's research questions and literature review, through its data collection and analysis, and finally to its results. This expounds…

  15. Improving STEM Program Quality in Out-of-School-Time: Tool Development and Validation

    ERIC Educational Resources Information Center

    Shah, Ashima Mathur; Wylie, Caroline; Gitomer, Drew; Noam, Gil

    2018-01-01

    In and out-of-school time (OST) experiences are viewed as complementary in contributing to students' interest, engagement, and performance in science, technology, engineering, and mathematics (STEM). While tools exist to measure quality in general afterschool settings and others to measure structured science classroom experiences, there is a need…

  16. Software engineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan

    1993-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools are passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. The latest release of the environment was in Feb. 1992.

  17. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben

    2005-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  18. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.
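
    STAT itself is described only at this level of detail; the underlying pattern - enumerate candidate hardware/software pairings, discard infeasible ones, and rank the rest by a figure of merit so that refinement effort goes to high-payoff regions of the design space - can be sketched in a few lines of Python (catalogs, numbers, and the scoring rule are all invented for illustration):

        from itertools import product

        # Hypothetical catalogs of design alternatives.
        processors = [{"name": "rad-hard CPU", "mips": 200, "watts": 10},
                      {"name": "FPGA board", "mips": 1500, "watts": 25}]
        waveforms = [{"name": "QPSK", "mips_needed": 150, "rate_mbps": 2},
                     {"name": "16QAM", "mips_needed": 900, "rate_mbps": 8}]

        def feasible(proc, wf, power_budget=20):
            """A design is feasible if the processor can run the waveform
            within the spacecraft power budget (illustrative constraints)."""
            return proc["mips"] >= wf["mips_needed"] and proc["watts"] <= power_budget

        def score(proc, wf):
            return wf["rate_mbps"] / proc["watts"]   # throughput per watt

        candidates = [(p["name"], w["name"], score(p, w))
                      for p, w in product(processors, waveforms) if feasible(p, w)]
        for proc_name, wf_name, s in sorted(candidates, key=lambda c: -c[2]):
            print(f"{proc_name} + {wf_name}: {s:.2f} Mbps/W")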

  19. Waste in health information systems: a systematic review.

    PubMed

    Awang Kalong, Nadia; Yusof, Maryati

    2017-05-08

    Purpose: The purpose of this paper is to discuss a systematic review on waste identification related to health information systems (HIS) in Lean transformation. Design/methodology/approach: A systematic review was conducted on 19 studies to evaluate Lean transformation and tools used to remove waste related to HIS in clinical settings. Findings: Ten waste categories were identified, along with their relationships and applications of Lean tool types related to HIS. Different Lean tools were used at the early and final stages of Lean transformation; the tool selection depended on the waste characteristic. Nine studies reported a positive impact from Lean transformation in improving daily work processes. The selection of Lean tools should be made based on the timing, purpose and characteristics of the waste to be removed. Research limitations/implications: An overview of waste and its categories within HIS, analysed from a socio-technical perspective, enabled identification of root causes in a holistic and rigorous manner. Practical implications: Understanding waste types and their root causes, together with a review of Lean tools, could subsequently lead to the identification of mitigation approaches to prevent future error occurrence. Originality/value: Specific waste models for HIS settings are yet to be developed. Hence, the identification of the waste categories could guide future implementation of Lean transformations in HIS settings.

  20. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in the dramatic increase of sequencing data, which, in turn, requires efficient management of computational resources, such as computing time, memory requirements as well as prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
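
    GenomicTools' C++ internals are not reproduced in this record; the flavor of the region set operations it composes into pipelines can be conveyed by a small Python sketch of a single-pass intersection over sorted (chrom, start, end) lists with half-open coordinates (region lists invented):

        def intersect(regions_a, regions_b):
            """Intersect two sorted lists of genomic regions in one sweep."""
            out, i, j = [], 0, 0
            while i < len(regions_a) and j < len(regions_b):
                ca, sa, ea = regions_a[i]
                cb, sb, eb = regions_b[j]
                if (ca, ea) <= (cb, sb):      # region a lies entirely before b
                    i += 1
                elif (cb, eb) <= (ca, sa):    # region b lies entirely before a
                    j += 1
                else:                         # overlap: emit it, advance the earlier end
                    out.append((ca, max(sa, sb), min(ea, eb)))
                    if ea <= eb:
                        i += 1
                    else:
                        j += 1
            return out

        peaks = [("chr1", 100, 200), ("chr1", 500, 650), ("chr2", 10, 60)]
        genes = [("chr1", 150, 600), ("chr2", 40, 90)]
        print(intersect(peaks, genes))
        # -> [('chr1', 150, 200), ('chr1', 500, 600), ('chr2', 40, 60)]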

  1. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    PubMed

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve throughput, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result, there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.
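
    To make one of these tools concrete: FMEA conventionally scores each failure mode for severity (S), occurrence (O), and detectability (D), commonly on 1-10 scales, and ranks by the risk priority number RPN = S x O x D. A minimal sketch, with invented radiotherapy examples rather than figures from the article:

        failure_modes = [
            {"step": "treatment planning", "mode": "wrong CT data set", "S": 9, "O": 2, "D": 4},
            {"step": "patient setup", "mode": "incorrect immobilization", "S": 6, "O": 4, "D": 3},
            {"step": "beam delivery", "mode": "MU transcription error", "S": 8, "O": 3, "D": 2},
        ]
        for fm in failure_modes:
            fm["RPN"] = fm["S"] * fm["O"] * fm["D"]  # risk priority number

        # Highest-RPN modes are addressed first in the quality management program.
        for fm in sorted(failure_modes, key=lambda f: -f["RPN"]):
            print(f'{fm["RPN"]:>4}  {fm["step"]}: {fm["mode"]}')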

  2. Terminology tools: state of the art and practical lessons.

    PubMed

    Cimino, J J

    2001-01-01

    As controlled medical terminologies evolve from simple code-name-hierarchy arrangements into rich, knowledge-based ontologies of medical concepts, increased demands are placed on both the developers and users of the terminologies. In response, researchers have begun developing tools to address their needs. The aims of this article are to review previous work done to develop these tools and then to describe work done at Columbia University and New York Presbyterian Hospital (NYPH). Researchers working with the Systematized Nomenclature of Medicine (SNOMED), the Unified Medical Language System (UMLS), and NYPH's Medical Entities Dictionary (MED) have created a wide variety of terminology browsers, editors and servers to facilitate creation, maintenance and use of these terminologies. Although much work has been done, no generally available tools have yet emerged. Consensus on requirements for tool functions, especially terminology servers, is emerging. Tools at NYPH have been used successfully to support the integration of clinical applications and the merger of health care institutions. Significant advancement has occurred over the past fifteen years in the development of sophisticated controlled terminologies and the tools to support them. The tool set at NYPH provides a case study to demonstrate one feasible architecture.

  3. Development of the Policy Indicator Checklist: A Tool to Identify and Measure Policies for Calorie-Dense Foods and Sugar-Sweetened Beverages Across Multiple Settings

    PubMed Central

    Hallett, Allen M.; Parker, Nathan; Kudia, Ousswa; Kao, Dennis; Modelska, Maria; Rifai, Hanadi; O’Connor, Daniel P.

    2015-01-01

    Objectives. We developed the policy indicator checklist (PIC) to identify and measure policies for calorie-dense foods and sugar-sweetened beverages to determine how policies are clustered across multiple settings. Methods. In 2012 and 2013 we used existing literature, policy documents, government recommendations, and instruments to identify key policies. We then developed the PIC to examine the policy environments across 3 settings (communities, schools, and early care and education centers) in 8 communities participating in the Childhood Obesity Research Demonstration Project. Results. Principal components analysis revealed 5 components related to calorie-dense food policies and 4 components related to sugar-sweetened beverage policies. Communities with higher youth and racial/ethnic minority populations tended to have fewer and weaker policy environments concerning calorie-dense foods and healthy foods and beverages. Conclusions. The PIC was a helpful tool to identify policies that promote healthy food environments across multiple settings and to measure and compare the overall policy environments across communities. There is need for improved coordination across settings, particularly in areas with greater concentration of youths and racial/ethnic minority populations. Policies to support healthy eating are not equally distributed across communities, and disparities continue to exist in nutrition policies. PMID:25790397

  4. Development of the policy indicator checklist: a tool to identify and measure policies for calorie-dense foods and sugar-sweetened beverages across multiple settings.

    PubMed

    Lee, Rebecca E; Hallett, Allen M; Parker, Nathan; Kudia, Ousswa; Kao, Dennis; Modelska, Maria; Rifai, Hanadi; O'Connor, Daniel P

    2015-05-01

    We developed the policy indicator checklist (PIC) to identify and measure policies for calorie-dense foods and sugar-sweetened beverages to determine how policies are clustered across multiple settings. In 2012 and 2013 we used existing literature, policy documents, government recommendations, and instruments to identify key policies. We then developed the PIC to examine the policy environments across 3 settings (communities, schools, and early care and education centers) in 8 communities participating in the Childhood Obesity Research Demonstration Project. Principal components analysis revealed 5 components related to calorie-dense food policies and 4 components related to sugar-sweetened beverage policies. Communities with higher youth and racial/ethnic minority populations tended to have fewer and weaker policy environments concerning calorie-dense foods and healthy foods and beverages. The PIC was a helpful tool to identify policies that promote healthy food environments across multiple settings and to measure and compare the overall policy environments across communities. There is need for improved coordination across settings, particularly in areas with greater concentration of youths and racial/ethnic minority populations. Policies to support healthy eating are not equally distributed across communities, and disparities continue to exist in nutrition policies.
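
    The checklist items themselves are not listed in these records, but the components reported above come from a principal components analysis of item scores. A hedged sketch of that analysis step with scikit-learn, using random stand-in scores in place of the real community data:

        import numpy as np
        from sklearn.decomposition import PCA

        # Stand-in matrix: rows are settings/communities, columns are policy
        # indicator items scored 0 (absent), 1 (weak), or 2 (strong).
        rng = np.random.default_rng(1)
        scores = rng.integers(0, 3, size=(24, 12)).astype(float)

        pca = PCA(n_components=5)
        components = pca.fit_transform(scores - scores.mean(axis=0))
        print("variance explained:", np.round(pca.explained_variance_ratio_, 2))
        # The loadings in pca.components_ show which items cluster together,
        # analogous to the food- and beverage-policy components reported above.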

  5. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1993-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X DataSlice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.

  6. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1992-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X Data Slice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.
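
    The proposed regridding libraries are only described here, not shown; the basic operation - resampling a field from a model grid onto an observation grid so the two can be compared point-for-point - can be sketched with SciPy on toy latitude/longitude grids:

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        lat_c = np.linspace(-90, 90, 19)          # coarse model grid
        lon_c = np.linspace(0, 360, 37)
        field = np.cos(np.radians(lat_c))[:, None] * np.ones((19, 37))

        interp = RegularGridInterpolator((lat_c, lon_c), field, method="linear")

        lat_o = np.linspace(-88, 88, 89)          # finer observation grid
        lon_o = np.linspace(0, 360, 181)
        pts = np.array(np.meshgrid(lat_o, lon_o, indexing="ij")).reshape(2, -1).T
        regridded = interp(pts).reshape(len(lat_o), len(lon_o))
        print(regridded.shape)                     # (89, 181)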

  7. Hospital process orientation from an operations management perspective: development of a measurement tool and practical testing in three ophthalmic practices

    PubMed Central

    2013-01-01

    Background: Although research interest in hospital process orientation (HPO) is growing, the development of a measurement tool to assess process orientation (PO) has not been very successful yet. To view a hospital as a series of processes organized around patients with a similar demand seems to be an attractive proposition, but it is hard to operationalize this idea in a measurement tool that can actually measure the level of PO. This research contributes to HPO from an operations management (OM) perspective by addressing the alignment, integration and coordination of activities within patient care processes. The objective of this study was to develop and practically test a new measurement tool for assessing the degree of PO within hospitals using existing tools. Methods: Through a literature search we identified a number of constructs to measure PO in hospital settings. These constructs were further operationalized using an OM perspective. Based on five dimensions of an existing questionnaire, a new HPO-measurement tool was developed to measure the degree of PO within hospitals on the basis of respondents' perception. The HPO-measurement tool was pre-tested in a non-participating hospital and discussed with experts in a focus group. The multicentre exploratory case study was conducted in the ophthalmic practices of three different types of Dutch hospitals. In total 26 employees from three disciplines participated. After filling in the questionnaire, an interview was held with each participant to check the validity and the reliability of the measurement tool. Results: Application of the HPO-measurement tool, analysis of the scores and interviews with the participants made it possible to identify differences in PO performance and areas for improvement – from a PO point of view – within each hospital. Refinement of the items after practical testing yielded a set of 41 items for assessing the degree of PO from an OM perspective within hospitals. Conclusions: The development and practical testing of a new HPO-measurement tool improve the understanding and application of PO in hospitals and the reliability of the measurement tool. The study shows that PO is a complex concept that remains hard to objectify. PMID:24219362

  8. Hospital process orientation from an operations management perspective: development of a measurement tool and practical testing in three ophthalmic practices.

    PubMed

    Gonçalves, Pedro D; Hagenbeek, Marie Louise; Vissers, Jan M H

    2013-11-13

    Although research interest in hospital process orientation (HPO) is growing, the development of a measurement tool to assess process orientation (PO) has not been very successful yet. To view a hospital as a series of processes organized around patients with a similar demand seems to be an attractive proposition, but it is hard to operationalize this idea in a measurement tool that can actually measure the level of PO. This research contributes to HPO from an operations management (OM) perspective by addressing the alignment, integration and coordination of activities within patient care processes. The objective of this study was to develop and practically test a new measurement tool for assessing the degree of PO within hospitals using existing tools. Through a literature search we identified a number of constructs to measure PO in hospital settings. These constructs were further operationalized using an OM perspective. Based on five dimensions of an existing questionnaire, a new HPO-measurement tool was developed to measure the degree of PO within hospitals on the basis of respondents' perception. The HPO-measurement tool was pre-tested in a non-participating hospital and discussed with experts in a focus group. The multicentre exploratory case study was conducted in the ophthalmic practices of three different types of Dutch hospitals. In total 26 employees from three disciplines participated. After filling in the questionnaire, an interview was held with each participant to check the validity and the reliability of the measurement tool. Application of the HPO-measurement tool, analysis of the scores and interviews with the participants made it possible to identify differences in PO performance and areas for improvement--from a PO point of view--within each hospital. Refinement of the items after practical testing yielded a set of 41 items for assessing the degree of PO from an OM perspective within hospitals. The development and practical testing of a new HPO-measurement tool improve the understanding and application of PO in hospitals and the reliability of the measurement tool. The study shows that PO is a complex concept that remains hard to objectify.

  9. Teaching Tectonics to Undergraduates with Web GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curricula can support spatial learning. We present Web-based visualization and analysis tools developed with Javascript APIs to enhance tectonic curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in exploration of earthquake and volcano data sets, a subduction and elevation profile tool that facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality. The Web GIS is platform-independent and can be used on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation, using two- and three-dimensional visualization and analytical software. Coverages that allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science structural geology and tectonics class and are freely available on the Web.

  10. Measuring Afterschool Program Quality Using Setting-Level Observational Approaches

    ERIC Educational Resources Information Center

    Oh, Yoonkyung; Osgood, D. Wayne; Smith, Emilie P.

    2015-01-01

    The importance of afterschool hours for youth development is widely acknowledged, and afterschool settings have recently received increasing attention as an important venue for youth interventions, bringing a growing need for reliable and valid measures of afterschool quality. This study examined the extent to which the two observational tools,…

  11. Organizational Constraints and Goal Setting

    ERIC Educational Resources Information Center

    Putney, Frederick B.; Wotman, Stephen

    1978-01-01

    Management modeling techniques are applied to setting operational and capital goals using cost analysis techniques in this case study at the Columbia University School of Dental and Oral Surgery. The model was created as a planning tool used in developing a financially feasible operating plan and a 100 percent physical renewal plan. (LBH)

  12. Data for Environmental Modeling (D4EM): Background and Applications of Data Automation

    EPA Science Inventory

    The Data for Environmental Modeling (D4EM) project demonstrates the development of a comprehensive set of open source software tools that overcome obstacles to accessing data needed by automating the process of populating model input data sets with environmental data available fr...

  13. Modeling Zone-3 Protection with Generic Relay Models for Dynamic Contingency Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Qiuhua; Vyakaranam, Bharat GNVSR; Diao, Ruisheng

    This paper presents a cohesive approach for calculating and coordinating the settings of multiple zone-3 protections for dynamic contingency analysis. The zone-3 protections are represented by generic distance relay models. A two-step approach for determining zone-3 relay settings is proposed. The first step is to calculate settings, particularly the reach, of each zone-3 relay individually by iteratively running line open-end fault short circuit analysis; the blinder is also employed and properly set to meet the industry standard under extreme loading conditions. The second step is to systematically coordinate the protection settings of the zone-3 relays. The main objective of this coordination step is to address the over-reaching issues. We have developed a tool to automate the proposed approach and generate the settings of all distance relays in a PSS/E dyr format file. The calculated zone-3 settings have been tested on a modified IEEE 300 system using a dynamic contingency analysis tool (DCAT).
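
    The paper's exact settings logic is not reproduced here; as a toy version of the first step only, a zone-3 reach is often set from the impedance of the protected line plus a security margin over the longest adjacent line, which the coordination step then checks for over-reach (impedances and margin below are invented):

        # Positive-sequence line impedance magnitudes in ohms (illustrative).
        lines = {
            "L1": {"Z": 8.0, "adjacent": ["L2", "L3"]},
            "L2": {"Z": 5.0, "adjacent": ["L1"]},
            "L3": {"Z": 12.0, "adjacent": ["L1"]},
        }
        MARGIN = 1.2  # 20% security margin on the longest adjacent line

        def zone3_reach(name):
            """Classical heuristic: protected line plus margin * longest adjacent line."""
            longest_adjacent = max(lines[a]["Z"] for a in lines[name]["adjacent"])
            return lines[name]["Z"] + MARGIN * longest_adjacent

        for name in lines:
            print(name, "zone-3 reach =", zone3_reach(name), "ohm")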

  14. A survey of tools for variant analysis of next-generation genome sequencing data

    PubMed Central

    Pabinger, Stephan; Dander, Andreas; Fischer, Maria; Snajder, Rene; Sperk, Michael; Efremova, Mirjana; Krabichler, Birgit; Speicher, Michael R.; Zschocke, Johannes

    2014-01-01

    Recent advances in genome sequencing technologies provide unprecedented opportunities to characterize individual genomic landscapes and identify mutations relevant for diagnosis and therapy. Specifically, whole-exome sequencing using next-generation sequencing (NGS) technologies is gaining popularity in the human genetics community due to the moderate costs, manageable data amounts and straightforward interpretation of analysis results. While whole-exome and, in the near future, whole-genome sequencing are becoming commodities, data analysis still poses significant challenges and led to the development of a plethora of tools supporting specific parts of the analysis workflow or providing a complete solution. Here, we surveyed 205 tools for whole-genome/whole-exome sequencing data analysis supporting five distinct analytical steps: quality assessment, alignment, variant identification, variant annotation and visualization. We report an overview of the functionality, features and specific requirements of the individual tools. We then selected 32 programs for variant identification, variant annotation and visualization, which were subjected to hands-on evaluation using four data sets: one set of exome data from two patients with a rare disease for testing identification of germline mutations, two cancer data sets for testing variant callers for somatic mutations, copy number variations and structural variations, and one semi-synthetic data set for testing identification of copy number variations. Our comprehensive survey and evaluation of NGS tools provides a valuable guideline for human geneticists working on Mendelian disorders, complex diseases and cancers. PMID:23341494

  15. A Python tool to set up relative free energy calculations in GROMACS

    PubMed Central

    Klimovich, Pavel V.; Mobley, David L.

    2015-01-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with the Lead Optimization Mapper (LOMAP) [14], recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge [16]. Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations. PMID:26487189

  16. Validation of the SimSET simulation package for modeling the Siemens Biograph mCT PET scanner

    NASA Astrophysics Data System (ADS)

    Poon, Jonathan K.; Dahlbom, Magnus L.; Casey, Michael E.; Qi, Jinyi; Cherry, Simon R.; Badawi, Ramsey D.

    2015-02-01

    Monte Carlo simulation provides a valuable tool in performance assessment and optimization of system design parameters for PET scanners. SimSET is a popular Monte Carlo simulation toolkit that features fast simulation time, as well as variance reduction tools to further enhance computational efficiency. However, SimSET has lacked the ability to simulate block detectors until its most recent release. Our goal is to validate new features of SimSET by developing a simulation model of the Siemens Biograph mCT PET scanner and comparing the results to a simulation model developed in the GATE simulation suite and to experimental results. We used the NEMA NU-2 2007 scatter fraction, count rates, and spatial resolution protocols to validate the SimSET simulation model and its new features. The SimSET model overestimated the experimental results of the count rate tests by 11-23% and the spatial resolution test by 13-28%, which is comparable to previous validation studies of other PET scanners in the literature. The difference between the SimSET and GATE simulation was approximately 4-8% for the count rate test and approximately 3-11% for the spatial resolution test. In terms of computational time, SimSET performed simulations approximately 11 times faster than GATE simulations. The new block detector model in SimSET offers a fast and reasonably accurate simulation toolkit for PET imaging applications.

  17. PAINT: a promoter analysis and interaction network generation tool for gene regulatory network identification.

    PubMed

    Vadigepalli, Rajanikanth; Chakravarthula, Praveen; Zak, Daniel E; Schwaber, James S; Gonye, Gregory E

    2003-01-01

    We have developed a bioinformatics tool named PAINT that automates the promoter analysis of a given set of genes for the presence of transcription factor binding sites. Based on coincidence of regulatory sites, this tool produces an interaction matrix that represents a candidate transcriptional regulatory network. This tool currently consists of (1) a database of promoter sequences of known or predicted genes in the Ensembl annotated mouse genome database, (2) various modules that can retrieve and process the promoter sequences for binding sites of known transcription factors, and (3) modules for visualization and analysis of the resulting set of candidate network connections. This information provides a substantially pruned list of genes and transcription factors that can be examined in detail in further experimental studies on gene regulation. Also, the candidate network can be incorporated into network identification methods in the form of constraints on feasible structures in order to render the algorithms tractable for large-scale systems. The tool can also produce output in various formats suitable for use in external visualization and analysis software. In this manuscript, PAINT is demonstrated in two case studies involving analysis of differentially regulated genes chosen from two microarray data sets. The first set is from a neuroblastoma N1E-115 cell differentiation experiment, and the second set is from neuroblastoma N1E-115 cells at different time intervals following exposure to neuropeptide angiotensin II. PAINT is available for use as an agent in BioSPICE simulation and analysis framework (www.biospice.org), and can also be accessed via a WWW interface at www.dbi.tju.edu/dbi/tools/paint/.
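
    PAINT relies on curated binding-site models; to convey the core step only, a deliberately simplified Python sketch that scans promoter sequences for consensus motifs and builds the gene-by-TF incidence matrix from which candidate network connections are read off (motifs and sequences invented):

        import re
        from collections import defaultdict

        motifs = {"SP1": "GGGCGG", "CREB": "TGACGTCA"}   # illustrative consensus sites
        promoters = {
            "geneA": "TTGGGCGGATTGACGTCAGG",
            "geneB": "CCCATTTGACGTCAGATTTT",
        }

        # Count motif occurrences per promoter: nonzero cells are candidate
        # TF -> gene connections in the regulatory network.
        matrix = defaultdict(dict)
        for gene, seq in promoters.items():
            for tf, consensus in motifs.items():
                matrix[gene][tf] = len(re.findall(consensus, seq))

        for gene, hits in matrix.items():
            print(gene, hits)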

  18. Automation of Ocean Product Metrics

    DTIC Science & Technology

    2008-09-30

    Shriver, J., J. D. Dykes, and J. Fabre: Automation of Operational Ocean Product Metrics. Presented at the Ocean Sciences 2008 Conference, 5 March 2008, and at the 2008 EGU General Assembly, 14 April 2008. ... processing (multiple data cuts per day) and multiple-nested models. Routines for generating automated evaluations of model forecast statistics will be developed and pre-existing tools will be collected to create a generalized tool set, which will include user-interface tools to the metrics data.

  19. Usability Testing of the iPhone App to Improve Pain Assessment for Older Adults with Cognitive Impairment (Prehospital Setting): A Qualitative Study.

    PubMed

    Docking, Rachael E; Lane, Matthew; Schofield, Pat A

    2017-03-15

    Pain assessment in older adults with cognitive impairment is often challenging, and paramedics are not given sufficient tools/training to assess pain. The development of a mobile app may improve pain assessment and management in this vulnerable population. We conducted usability testing of a newly developed iPhone pain assessment application with potential users, in this case as a tool for clinical paramedic practice to improve pain assessment of older adults with cognitive impairment. We conducted usability testing with paramedic students and a Delphi panel of qualified paramedics. Participants studied the app and paper-based algorithm from which the app was developed. The potential use for the app was discussed. Usability testing focus groups were recorded, transcribed verbatim, and analyzed using a thematic approach. Proposed recommendations were disseminated to the Delphi panel that reviewed and confirmed them. Twenty-four paramedic students from two UK ambulance services participated in the focus groups. Usability of the app and its potential were viewed positively. Four major themes were identified: 1) overall opinion of the app for use in paramedic services; 2) incorporating technological applications into the health care setting; 3) improving knowledge and governance; and 4) alternative uses for the app. Subthemes were identified and are presented. Our results indicate that the pain assessment app constitutes a potentially useful tool in the prehospital setting. By providing access to a tool specifically developed to help identify/assess pain in a user-friendly format, paramedics are likely to have increased knowledge and confidence in assessing pain in patients with dementia. © 2017 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  20. Development of a Support Tool for Complex Decision-Making in the Provision of Rural Maternity Care

    PubMed Central

    Hearns, Glen; Klein, Michael C.; Trousdale, William; Ulrich, Catherine; Butcher, David; Miewald, Christiana; Lindstrom, Ronald; Eftekhary, Sahba; Rosinski, Jessica; Gómez-Ramírez, Oralia; Procyk, Andrea

    2010-01-01

    Context: Decisions in the organization of safe and effective rural maternity care are complex, difficult, value laden and fraught with uncertainty, and must often be based on imperfect information. Decision analysis offers tools for addressing these complexities in order to help decision-makers determine the best use of resources and to appreciate the downstream effects of their decisions. Objective: To develop a maternity care decision-making tool for the British Columbia Northern Health Authority (NH) for use in low birth volume settings. Design: Based on interviews with community members, providers, recipients and decision-makers, and employing a formal decision analysis approach, we sought to clarify the influences affecting rural maternity care and develop a process to generate a set of value-focused objectives for use in designing and evaluating rural maternity care alternatives. Setting: Four low-volume communities with variable resources (with and without on-site births, with or without caesarean section capability) were chosen. Participants: Physicians (20), nurses (18), midwives and maternity support service providers (4), local business leaders, economic development officials and elected officials (12), First Nations (women [pregnant and non-pregnant], chiefs and band members) (40), social workers (3), pregnant women (2) and NH decision-makers/administrators (17). Results: We developed a Decision Support Manual to assist with assessing community needs and values, context for decision-making, capacity of the health authority or healthcare providers, identification of key objectives for decision-making, developing alternatives for care, and a process for making trade-offs and balancing multiple objectives. The manual was deemed an effective tool for the purpose by the client, NH. Conclusions: Beyond assisting the decision-making process itself, the methodology provides a transparent communication tool to assist in making difficult decisions. While the manual was specifically intended to deal with rural maternity issues, the NH decision-makers feel the method can be easily adapted to assist decision-making in other contexts in medicine where there are conflicting objectives, values and opinions. Decisions on the location of new facilities or infrastructure, or enhancing or altering services such as surgical or palliative care, would be examples of complex decisions that might benefit from this methodology. PMID:21286270

  1. Nanobioinformatics: Emerging Computational Tools to Understand Nano-Bio Interaction

    DTIC Science & Technology

    2012-11-16

    ... followed for using animals for toxicity studies; the Organization for Economic Co-operation and Development (OECD) has set guidelines for toxicity studies in guideline number 420, which says that only dosages of 50-2000 mg/kg body weight ... (GSH, SOD, GSSH, MDA, ALK, ALT, LDH), cell lines. Preprocessing: After collection of data from the published articles, preprocessing of the data is ...

  2. A New Simulation Framework for Autonomy in Robotic Missions

    NASA Technical Reports Server (NTRS)

    Flueckiger, Lorenzo; Neukom, Christian

    2003-01-01

    Autonomy is a key factor in remote robotic exploration and there is significant activity addressing the application of autonomy to remote robots. It has become increasingly important to have simulation tools available to test the autonomy algorithms. While industrial robotics benefits from a variety of high quality simulation tools, researchers developing autonomous software are still dependent primarily on block-world simulations. The Mission Simulation Facility (MSF) project addresses this shortcoming with a simulation toolkit that will enable developers of autonomous control systems to test their system's performance against a set of integrated, standardized simulations of NASA mission scenarios. MSF provides a distributed architecture that connects the autonomous system to a set of simulated components replacing the robot hardware and its environment.

  3. A novel way of integrating rule-based knowledge into a web ontology language framework.

    PubMed

    Gamberger, Dragan; Krstaçić, Goran; Jović, Alan

    2013-01-01

    Web ontology language (OWL), used in combination with the Protégé visual interface, is a modern standard for the development and maintenance of ontologies and a powerful tool for knowledge presentation. In this work, we describe a novel way to use OWL for the conceptualization of knowledge expressed as a set of rules. In this approach, rules are represented as a hierarchy of actionable classes whose necessary and sufficient conditions are defined with the description logic formalism. The advantages are that the rule set is no longer unordered, that concepts defined in descriptive ontologies can be used directly in the bodies of rules, and that Protégé provides an intuitive tool for editing the rule set. Standard ontology reasoning processes are not applicable in this framework, but experiments conducted on the rule sets have demonstrated that the reasoning problems can be successfully solved.
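
    As an illustration of the approach (a minimal sketch only: the paper's own ontology is not reproduced in this record, and the authors worked in Protégé rather than Python), a rule can be encoded as an actionable class whose necessary and sufficient conditions are description-logic expressions. Here the owlready2 library is used, and all clinical class and property names are hypothetical:

        from owlready2 import Thing, get_ontology

        onto = get_ontology("http://example.org/rules.owl")

        with onto:
            class Patient(Thing): pass
            class Finding(Thing): pass
            class ElevatedLDH(Finding): pass
            class AbnormalECG(Finding): pass

            class hasFinding(Patient >> Finding): pass  # object property

            # An "actionable" rule class: membership is given necessary and
            # sufficient conditions, so a DL reasoner can classify patients
            # into it automatically.
            class ReferralCandidate(Patient):
                equivalent_to = [Patient
                                 & hasFinding.some(ElevatedLDH)
                                 & hasFinding.some(AbnormalECG)]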

  4. The center for causal discovery of biomedical knowledge from big data

    PubMed Central

    Bahar, Ivet; Becich, Michael J; Benos, Panayiotis V; Berg, Jeremy; Espino, Jeremy U; Glymour, Clark; Jacobson, Rebecca Crowley; Kienholz, Michelle; Lee, Adrian V; Lu, Xinghua; Scheines, Richard

    2015-01-01

    The Big Data to Knowledge (BD2K) Center for Causal Discovery is developing and disseminating an integrated set of open source tools that support causal modeling and discovery of biomedical knowledge from large and complex biomedical datasets. The Center integrates teams of biomedical and data scientists focused on the refinement of existing and the development of new constraint-based and Bayesian algorithms based on causal Bayesian networks, the optimization of software for efficient operation in a supercomputing environment, and the testing of algorithms and software developed using real data from 3 representative driving biomedical projects: cancer driver mutations, lung disease, and the functional connectome of the human brain. Associated training activities provide both biomedical and data scientists with the knowledge and skills needed to apply and extend these tools. Collaborative activities with the BD2K Consortium further advance causal discovery tools and integrate tools and resources developed by other centers. PMID:26138794
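
    For flavor only, the core step of the constraint-based methods the Center refines is skeleton discovery: drop the edge between two variables if they test as independent given some conditioning set. The sketch below is a toy PC-style skeleton search using Fisher z-tests of partial correlation; it is not the Center's software, and the data and threshold are synthetic:

        import itertools
        import numpy as np
        from scipy import stats

        def partial_corr(data, i, j, cond):
            """Partial correlation of columns i, j given cond, from the
            inverse of the correlation submatrix (precision matrix)."""
            idx = [i, j] + list(cond)
            p = np.linalg.inv(np.corrcoef(data[:, idx], rowvar=False))
            return -p[0, 1] / np.sqrt(p[0, 0] * p[1, 1])

        def skeleton(data, alpha=0.01, max_cond=2):
            n, d = data.shape
            adj = {(i, j) for i in range(d) for j in range(d) if i < j}
            for i, j in sorted(adj):
                others = [k for k in range(d) if k not in (i, j)]
                for size in range(max_cond + 1):
                    for cond in itertools.combinations(others, size):
                        r = partial_corr(data, i, j, cond)
                        # Fisher z gives an approximate independence test
                        z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - size - 3)
                        if 2 * (1 - stats.norm.cdf(abs(z))) > alpha:
                            adj.discard((i, j))  # independent given cond
                            break
                    if (i, j) not in adj:
                        break
            return adj

        rng = np.random.default_rng(0)
        x = rng.standard_normal((500, 1))
        y = x + 0.5 * rng.standard_normal((500, 1))
        z = y + 0.5 * rng.standard_normal((500, 1))
        data = np.hstack([x, y, z])            # chain x -> y -> z
        print(sorted(skeleton(data)))          # expect [(0, 1), (1, 2)]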

  5. A web-based neurological pain classifier tool utilizing Bayesian decision theory for pain classification in spinal cord injury patients

    NASA Astrophysics Data System (ADS)

    Verma, Sneha K.; Chun, Sophia; Liu, Brent J.

    2014-03-01

    Pain is a common complication after spinal cord injury, with prevalence estimates ranging from 77% to 81%, and it strongly affects a patient's lifestyle and well-being. In the current clinical setting, paper-based forms are used to classify pain correctly; however, the accuracy of diagnoses and optimal management of pain largely depend on the expert reviewer, which in many cases is not possible because there are very few experts in this field. The need for a clinical decision support system that can be used by expert and non-expert clinicians has been cited in the literature, but such a system has not been developed. We have designed and developed a stand-alone tool for correctly classifying pain type in spinal cord injury (SCI) patients, using Bayesian decision theory. Various machine learning simulation methods were used to verify the algorithm on a pilot data set of 48 patients, consisting of the paper-based forms collected at the Long Beach VA clinic with pain classification performed by an expert in the field. Using WEKA as the machine learning tool, we tested on this data set the hypothesis that the attributes collected on the forms and the pain locations marked by patients have a very significant impact on pain type classification. This tool will be integrated with an imaging informatics system to support a clinical study that will test the effectiveness of using proton beam radiotherapy for treating SCI-related neuropathic pain as an alternative to invasive surgical lesioning.
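
    The classifier itself is not published in this record; as a minimal sketch of the Bayesian idea (a plain naive-Bayes simplification with hypothetical form attributes, not the authors' exact model), a pain type can be chosen by maximal posterior computed from smoothed attribute counts:

        from collections import Counter, defaultdict

        def train(records):
            """records: list of (attributes_dict, pain_type)."""
            class_counts = Counter(label for _, label in records)
            cond = defaultdict(Counter)  # (attr, label) -> value counts
            for attrs, label in records:
                for a, v in attrs.items():
                    cond[(a, label)][v] += 1
            return class_counts, cond

        def classify(attrs, class_counts, cond):
            total = sum(class_counts.values())
            scores = {}
            for label, n in class_counts.items():
                p = n / total                            # prior P(class)
                for a, v in attrs.items():
                    c = cond[(a, label)]
                    p *= (c[v] + 1) / (n + len(c) + 1)   # smoothed P(value|class)
                scores[label] = p
            # Bayes decision rule: pick the class with maximal posterior
            return max(scores, key=scores.get)

        data = [({"location": "below_injury", "burning": "yes"}, "neuropathic"),
                ({"location": "at_injury", "burning": "no"}, "musculoskeletal"),
                ({"location": "below_injury", "burning": "yes"}, "neuropathic")]
        cc, cd = train(data)
        print(classify({"location": "below_injury", "burning": "yes"}, cc, cd))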

  6. Development of a multilevel health and safety climate survey tool within a mining setting.

    PubMed

    Parker, Anthony W; Tones, Megan J; Ritchie, Gabrielle E

    2017-09-01

    This study aimed to design, implement and evaluate the reliability and validity of a multifactorial and multilevel health and safety climate survey (HSCS) tool with utility in the Australian mining setting. An 84-item questionnaire was developed and pilot tested on a sample of 302 Australian miners across two open cut sites. A 67-item, 10 factor solution was obtained via exploratory factor analysis (EFA) representing prioritization and attitudes to health and safety across multiple domains and organizational levels. Each factor demonstrated a high level of internal reliability, and a series of ANOVAs determined a high level of consistency in responses across the workforce, and generally irrespective of age, experience or job category. Participants tended to hold favorable views of occupational health and safety (OH&S) climate at the management, supervisor, workgroup and individual level. The survey tool demonstrated reliability and validity for use within an open cut Australian mining setting and supports a multilevel, industry specific approach to OH&S climate. Findings suggested a need for mining companies to maintain high OH&S standards to minimize risks to employee health and safety. Future research is required to determine the ability of this measure to predict OH&S outcomes and its utility within other mine settings. As this tool integrates health and safety, it may have benefits for assessment, monitoring and evaluation in the industry, and improving the understanding of how health and safety climate interact at multiple levels to influence OH&S outcomes. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.
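
    The internal reliability reported for each factor is conventionally summarized in survey studies of this kind by Cronbach's alpha; a short sketch of that computation, run on synthetic Likert responses standing in for the 302 miners' data:

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, k_items) array of Likert scores
            for the items loading on one factor."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)     # variance of scale totals
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(0)
        base = rng.integers(1, 6, size=(302, 1))          # shared "attitude" signal
        factor = np.clip(base + rng.integers(-1, 2, size=(302, 7)), 1, 5)
        print(round(cronbach_alpha(factor), 2))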

  7. User's manual for tooth contact analysis of face-milled spiral bevel gears with given machine-tool settings

    NASA Technical Reports Server (NTRS)

    Litvin, Faydor L.; Zhang, YI; Chen, Jui-Sheng

    1991-01-01

    Research was performed to develop a computer program that will: (1) simulate the meshing and bearing contact for face-milled spiral bevel gears with given machine-tool settings; and (2) provide as output some of the data required for hydrodynamic analysis. It is assumed that the machine-tool settings and the blank data will be taken from the Gleason summaries. The theoretical aspects of the program are based on 'Local Synthesis and Tooth Contact Analysis of Face-Milled Spiral Bevel Gears'. The differences between the computer program developed herein and the previous one are as follows: (1) the mean contact point of tooth surfaces for gears with given machine-tool settings must be determined iteratively while parameters H and V are changed (H represents displacement along the pinion axis; V represents the gear displacement that is perpendicular to the plane drawn through the axes of the pinion and the gear in their initial positions); this means that when V differs from zero, the axes of the pinion and the gear are crossed but not intersected; (2) in addition to the regular output data (transmission errors and bearing contact), the new computer program provides information about the contact force for each contact point and the sliding and the so-called rolling velocity. The following topics are covered: (1) instructions for users on how to insert the input data; (2) explanations regarding the output data; (3) a numerical example; and (4) a listing of the program.

  8. Calculation of Coupled Vibroacoustics Response Estimates from a Library of Available Uncoupled Transfer Function Sets

    NASA Technical Reports Server (NTRS)

    Smith, Andrew; LaVerde, Bruce; Hunt, Ron; Fulcher, Clay; Towner, Robert; McDonald, Emmett

    2012-01-01

    The design and theoretical basis of a new database tool that quickly generates vibroacoustic response estimates using a library of transfer functions (TFs) is discussed. During the early stages of a launch vehicle development program, these response estimates can be used to provide vibration environment specification to hardware vendors. The tool accesses TFs from a database, combines the TFs, and multiplies these by input excitations to estimate vibration responses. The database is populated with two sets of uncoupled TFs; the first set representing vibration response of a bare panel, designated as H^s, and the second set representing the response of the free-free component equipment by itself, designated as H^c. For a particular configuration undergoing analysis, the appropriate H^s and H^c are selected and coupled to generate an integrated TF, designated as H^(s+c). This integrated TF is then used with the appropriate input excitations to estimate vibration responses. This simple yet powerful tool enables a user to estimate vibration responses without directly using finite element models, so long as suitable H^s and H^c sets are defined in the database libraries. The paper discusses the preparation of the database tool and provides the assumptions and methodologies necessary to combine H^s and H^c sets into an integrated H^(s+c). An experimental validation of the approach is also presented.
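
    The paper's exact coupling procedure is not reproduced in this record; one standard way to combine uncoupled FRF sets is the LM-FBS substructuring formula H_coupled = H - H B^T (B H B^T)^-1 B H, where H is the block diagonal of H^s and H^c and B states interface compatibility. A numpy sketch per frequency line, with hypothetical dimensions, assuming this (or an equivalent) coupling rule:

        import numpy as np

        def couple_frfs(Hs, Hc, B):
            """Couple two substructure FRF matrices at one frequency line.
            Hs, Hc : complex FRF matrices of panel (s) and component (c)
            B      : signed Boolean matrix pairing interface DOFs
            Returns the coupled FRF matrix H^(s+c) via the LM-FBS formula."""
            H = np.block([[Hs, np.zeros((Hs.shape[0], Hc.shape[1]))],
                          [np.zeros((Hc.shape[0], Hs.shape[1])), Hc]])
            HBt = H @ B.T
            return H - HBt @ np.linalg.solve(B @ HBt, B @ H)

        # Hypothetical 2-DOF substructures sharing one interface DOF
        # (DOF 1 of s mates with DOF 0 of c in the stacked vector):
        rng = np.random.default_rng(1)
        Hs = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
        Hc = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
        B = np.array([[0.0, 1.0, -1.0, 0.0]])   # u_s1 - u_c0 = 0
        print(couple_frfs(Hs, Hc, B).shape)      # (4, 4)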

  9. WHAM!: a web-based visualization suite for user-defined analysis of metagenomic shotgun sequencing data.

    PubMed

    Devlin, Joseph C; Battaglia, Thomas; Blaser, Martin J; Ruggles, Kelly V

    2018-06-25

    Exploration of large data sets, such as shotgun metagenomic sequence or expression data, by biomedical experts and medical professionals remains a major bottleneck in the scientific discovery process. Although tools for this purpose exist for 16S ribosomal RNA sequencing analysis, there is a growing but still insufficient number of user-friendly interactive visualization workflows for easy data exploration and figure generation. The development of such platforms for this purpose is necessary to accelerate and streamline microbiome laboratory research. We developed the Workflow Hub for Automated Metagenomic Exploration (WHAM!) as a web-based interactive tool capable of user-directed data visualization and statistical analysis of annotated shotgun metagenomic and metatranscriptomic data sets. WHAM! includes exploratory and hypothesis-based gene and taxa search modules for visualizing differences in microbial taxa and gene family expression across experimental groups, and for creating publication quality figures without the need for command line interface or in-house bioinformatics. WHAM! is an interactive and customizable tool for downstream metagenomic and metatranscriptomic analysis, providing a user-friendly interface that allows easy data exploration by microbiome and ecological experts to facilitate discovery in multi-dimensional and large-scale data sets.

  10. Initial development of prototype performance model for highway design

    DOT National Transportation Integrated Search

    1997-12-01

    The Federal Highway Administration (FHWA) has undertaken a multiyear project to develop the Interactive Highway Safety Design Model (IHSDM), which is a CADD-based integrated set of software tools to analyze a highway design to identify safety issues ...

  11. Common Effects Methodology National Stakeholder Meeting December 1, 2010

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  12. Transputer parallel processing at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Ellis, Graham K.

    1989-01-01

    The transputer parallel processing lab at NASA Lewis Research Center (LeRC) consists of 69 processors (transputers) that can be connected into various networks for use in general purpose concurrent processing applications. The main goal of the lab is to develop concurrent scientific and engineering application programs that will take advantage of the computational speed increases available on a parallel processor over the traditional sequential processor. Current research involves the development of basic programming tools. These tools will help standardize program interfaces to specific hardware by providing a set of common libraries for applications programmers. The thrust of the current effort is in developing a set of tools for graphics rendering/animation. The applications programmer currently has two options for on-screen plotting. One option can be used for static graphics displays and the other can be used for animated motion. The option for static display involves the use of 2-D graphics primitives that can be called from within an application program. These routines perform the standard 2-D geometric graphics operations in real-coordinate space as well as allowing multiple windows on a single screen.

  13. Characterizing Aerosols over Southeast Asia using the AERONET Data Synergy Tool

    NASA Technical Reports Server (NTRS)

    Giles, David M.; Holben, Brent N.; Eck, Thomas F.; Slutsker, Ilya; Welton, Ellsworth J.; Chin, Mian; Kucsera, Thomas; Schmaltz, Jeffery E.; Diehl, Thomas

    2007-01-01

    Biomass burning, urban pollution and dust aerosols have significant impacts on the radiative forcing of the atmosphere over Asia. In order to better quantify these aerosol characteristics, the Aerosol Robotic Network (AERONET) has established over 200 sites worldwide with an emphasis in recent years on the Asian continent, specifically Southeast Asia. A total of approximately 15 AERONET sun photometer instruments have been deployed to China, India, Pakistan, Thailand, and Vietnam. Sun photometer spectral aerosol optical depth measurements as well as microphysical and optical aerosol retrievals over Southeast Asia will be analyzed and discussed with supporting ground-based instrument, satellite, and model data sets, which are freely available via the AERONET Data Synergy tool at the AERONET web site (http://aeronet.gsfc.nasa.gov). This web-based data tool provides access to ground-based (AERONET and MPLNET), satellite (MODIS, SeaWiFS, TOMS, and OMI) and model (GOCART and back trajectory analyses) databases via one web portal. Future development of the AERONET Data Synergy Tool will include the expansion of current data sets as well as the implementation of other Earth Science data sets pertinent to advancing aerosol research.

  14. Web-based monitoring tools for Resistive Plate Chambers in the CMS experiment at CERN

    NASA Astrophysics Data System (ADS)

    Kim, M. S.; Ban, Y.; Cai, J.; Li, Q.; Liu, S.; Qian, S.; Wang, D.; Xu, Z.; Zhang, F.; Choi, Y.; Kim, D.; Goh, J.; Choi, S.; Hong, B.; Kang, J. W.; Kang, M.; Kwon, J. H.; Lee, K. S.; Lee, S. K.; Park, S. K.; Pant, L. M.; Mohanty, A. K.; Chudasama, R.; Singh, J. B.; Bhatnagar, V.; Mehta, A.; Kumar, R.; Cauwenbergh, S.; Costantini, S.; Cimmino, A.; Crucy, S.; Fagot, A.; Garcia, G.; Ocampo, A.; Poyraz, D.; Salva, S.; Thyssen, F.; Tytgat, M.; Zaganidis, N.; Doninck, W. V.; Cabrera, A.; Chaparro, L.; Gomez, J. P.; Gomez, B.; Sanabria, J. C.; Avila, C.; Ahmad, A.; Muhammad, S.; Shoaib, M.; Hoorani, H.; Awan, I.; Ali, I.; Ahmed, W.; Asghar, M. I.; Shahzad, H.; Sayed, A.; Ibrahim, A.; Aly, S.; Assran, Y.; Radi, A.; Elkafrawy, T.; Sharma, A.; Colafranceschi, S.; Abbrescia, M.; Calabria, C.; Colaleo, A.; Iaselli, G.; Loddo, F.; Maggi, M.; Nuzzo, S.; Pugliese, G.; Radogna, R.; Venditti, R.; Verwilligen, P.; Benussi, L.; Bianco, S.; Piccolo, D.; Paolucci, P.; Buontempo, S.; Cavallo, N.; Merola, M.; Fabozzi, F.; Iorio, O. M.; Braghieri, A.; Montagna, P.; Riccardi, C.; Salvini, P.; Vitulo, P.; Vai, I.; Magnani, A.; Dimitrov, A.; Litov, L.; Pavlov, B.; Petkov, P.; Aleksandrov, A.; Genchev, V.; Iaydjiev, P.; Rodozov, M.; Sultanov, G.; Vutova, M.; Stoykova, S.; Hadjiiska, R.; Ibargüen, H. S.; Morales, M. I. P.; Bernardino, S. C.; Bagaturia, I.; Tsamalaidze, Z.; Crotty, I.

    2014-10-01

    The Resistive Plate Chambers (RPC) are used in the CMS experiment at the trigger level and also in the standard offline muon reconstruction. In order to guarantee the quality of the data collected and to monitor online the detector performance, a set of tools has been developed in CMS which is heavily used in the RPC system. The Web-based monitoring (WBM) is a set of Java servlets that allows users to check the performance of the hardware during data taking, providing distributions and history plots of all the parameters. The functionalities of the RPC WBM monitoring tools are presented along with studies of the detector performance as a function of growing luminosity and environmental conditions that are tracked over time.

  15. COSTMODL: An automated software development cost estimation tool

    NASA Technical Reports Server (NTRS)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
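
    The five estimation algorithms are not listed in this record; models of this family typically take a COCOMO-like form, Effort = A * KSLOC^B * EAF, and the recalibration described amounts to refitting A and B against an organization's historical projects in log space. A sketch with invented historical data (coefficients and projects are hypothetical, not COSTMODL's):

        import numpy as np

        def recalibrate(ksloc, effort, eaf):
            """Fit A and B of Effort = A * KSLOC**B * EAF to historical projects
            by linear regression on log(Effort/EAF) vs log(KSLOC)."""
            x = np.log(np.asarray(ksloc))
            y = np.log(np.asarray(effort) / np.asarray(eaf))
            B, logA = np.polyfit(x, y, 1)
            return np.exp(logA), B

        def estimate(A, B, ksloc, eaf):
            return A * ksloc ** B * eaf           # person-months

        # Hypothetical history: size (KSLOC), effort (PM), effort multiplier
        hist_size = np.array([12.0, 45.0, 8.0, 60.0])
        hist_effort = np.array([40.0, 190.0, 26.0, 270.0])
        hist_eaf = np.array([1.0, 1.15, 0.9, 1.2])

        A, B = recalibrate(hist_size, hist_effort, hist_eaf)
        print(f"A={A:.2f}, B={B:.2f}, estimate for 25 KSLOC: "
              f"{estimate(A, B, 25.0, 1.1):.0f} PM")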

  16. Creation of a Web-Based GIS Server and Custom Geoprocessing Tools for Enhanced Hydrologic Applications

    NASA Astrophysics Data System (ADS)

    Welton, B.; Chouinard, K.; Sultan, M.; Becker, D.; Milewski, A.; Becker, R.

    2010-12-01

    Rising populations in the arid and semi-arid parts of the world are increasing the demand for fresh water supplies worldwide. Many data sets needed for assessment of hydrologic applications across vast regions of the world are expensive, unpublished, difficult to obtain, or at varying scales, which complicates their use. Fortunately, this situation is changing with the development of global remote sensing datasets and web-based platforms such as GIS Server. GIS provides a cost-effective vehicle for comparing, analyzing, and querying a variety of spatial datasets as geographically referenced layers. We have recently constructed a web-based GIS that incorporates all relevant geological, geochemical, geophysical, and remote sensing data sets that were readily used to identify reservoir types and potential well locations on local and regional scales in various tectonic settings including: (1) an extensional environment (Red Sea rift), (2) a transcurrent fault system (Najd Fault in the Arabian-Nubian Shield), and (3) compressional environments (Himalayas). The web-based GIS could also be used to detect spatial and temporal trends in precipitation, recharge, and runoff in large watersheds on local, regional, and continental scales. These applications were enabled through the construction of a web-based ArcGIS Server with a Google Maps interface and the development of customized geoprocessing tools. ArcGIS Server provides out-of-the-box setups that are generic in nature. This platform includes all of the standard web-based GIS tools (e.g. pan, zoom, identify, search, data querying, and measurement). In addition to the standard suite of tools provided by ArcGIS Server, an additional set of advanced data manipulation and display tools was developed to allow a more complete and customizable view of the area of interest. The most notable addition to the standard GIS Server tools is the custom on-demand geoprocessing tools (e.g., graph, statistical functions, custom raster creation, profile, TRMM). The generation of a wide range of derivative maps (e.g., buffer zone, contour map, graphs, temporal rainfall distribution maps) from various map layers (e.g., geologic maps, geophysics, satellite images) allows for more user flexibility. The use of these tools, along with the Google Maps API that lets website users view high quality GeoEye 2 imagery provided by Google in conjunction with our data, creates a more complete picture of the area being observed and allows custom derivative maps to be created in the field and viewed immediately on the web, processes that were previously restricted to offline databases.

  17. MAGMA: Generalized Gene-Set Analysis of GWAS Data

    PubMed Central

    de Leeuw, Christiaan A.; Mooij, Joris M.; Heskes, Tom; Posthuma, Danielle

    2015-01-01

    By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues. Statistical power for most methods is strongly affected by linkage disequilibrium between markers, multi-marker associations are often hard to detect, and the reliance on permutation to compute p-values tends to make the analysis computationally very expensive. To address these issues we have developed MAGMA, a novel tool for gene and gene-set analysis. The gene analysis is based on a multiple regression model, to provide better statistical performance. The gene-set analysis is built as a separate layer around the gene analysis for additional flexibility. This gene-set analysis also uses a regression structure to allow generalization to analysis of continuous properties of genes and simultaneous analysis of multiple gene sets and other gene properties. Simulations and an analysis of Crohn’s Disease data are used to evaluate the performance of MAGMA and to compare it to a number of other gene and gene-set analysis tools. The results show that MAGMA has significantly more power than other tools for both the gene and the gene-set analysis, identifying more genes and gene sets associated with Crohn’s Disease while maintaining a correct type 1 error rate. Moreover, the MAGMA analysis of the Crohn’s Disease data was found to be considerably faster as well. PMID:25885710

  18. MAGMA: generalized gene-set analysis of GWAS data.

    PubMed

    de Leeuw, Christiaan A; Mooij, Joris M; Heskes, Tom; Posthuma, Danielle

    2015-04-01

    By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues. Statistical power for most methods is strongly affected by linkage disequilibrium between markers, multi-marker associations are often hard to detect, and the reliance on permutation to compute p-values tends to make the analysis computationally very expensive. To address these issues we have developed MAGMA, a novel tool for gene and gene-set analysis. The gene analysis is based on a multiple regression model, to provide better statistical performance. The gene-set analysis is built as a separate layer around the gene analysis for additional flexibility. This gene-set analysis also uses a regression structure to allow generalization to analysis of continuous properties of genes and simultaneous analysis of multiple gene sets and other gene properties. Simulations and an analysis of Crohn's Disease data are used to evaluate the performance of MAGMA and to compare it to a number of other gene and gene-set analysis tools. The results show that MAGMA has significantly more power than other tools for both the gene and the gene-set analysis, identifying more genes and gene sets associated with Crohn's Disease while maintaining a correct type 1 error rate. Moreover, the MAGMA analysis of the Crohn's Disease data was found to be considerably faster as well.
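
    A much-simplified sketch of the competitive gene-set regression described above (synthetic data; MAGMA additionally models the LD-induced correlation between genes, which is omitted here):

        import numpy as np
        from scipy import stats

        def gene_set_test(z, in_set, covars):
            """Competitive gene-set test: regress gene Z-scores on set
            membership plus covariates; return the one-sided p-value for
            the membership coefficient."""
            X = np.column_stack([np.ones_like(z), in_set, covars])
            beta, res, *_ = np.linalg.lstsq(X, z, rcond=None)
            df = len(z) - X.shape[1]
            sigma2 = res[0] / df
            se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
            t = beta[1] / se
            return stats.t.sf(t, df)   # one-sided: is the set more associated?

        rng = np.random.default_rng(2)
        n_genes = 5000
        gene_size = rng.integers(2, 200, n_genes)        # covariate: SNPs per gene
        member = rng.random(n_genes) < 0.02              # ~1% of genes in the set
        z = rng.standard_normal(n_genes) + 0.4 * member  # enriched set
        print(gene_set_test(z, member.astype(float), np.log(gene_size)[:, None]))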

  19. The Electronic Patient Reported Outcome Tool: Testing Usability and Feasibility of a Mobile App and Portal to Support Care for Patients With Complex Chronic Disease and Disability in Primary Care Settings

    PubMed Central

    Gill, Ashlinder; Khan, Anum Irfan; Hans, Parminder Kaur; Kuluski, Kerry; Cott, Cheryl

    2016-01-01

    Background: People experiencing complex chronic disease and disability (CCDD) face some of the greatest challenges of any patient population. Primary care providers find it difficult to manage multiple discordant conditions and symptoms and often complex social challenges experienced by these patients. The electronic Patient Reported Outcome (ePRO) tool is designed to overcome some of these challenges by supporting goal-oriented primary care delivery. Using the tool, patients and providers collaboratively develop health care goals on a portal linked to a mobile device to help patients and providers track progress between visits. Objectives: This study tested the usability and feasibility of adopting the ePRO tool into a single interdisciplinary primary health care practice in Toronto, Canada. The Fit between Individuals, Task, and Technology (FITT) framework was used to guide our assessment and explore whether the ePRO tool is: (1) feasible for adoption in interdisciplinary primary health care practices and (2) usable from both the patient and provider perspectives. This usability pilot is part of a broader user-centered design development strategy. Methods: A 4-week pilot study was conducted in which patients and providers used the ePRO tool to develop health-related goals, which patients then monitored using a mobile device. Patients and providers collaboratively set goals using the system during an initial visit and had at least 1 follow-up visit at the end of the pilot to discuss progress. Focus groups and interviews were conducted with patients and providers to capture usability and feasibility measures. Data from the ePRO system were extracted to provide information regarding tool usage. Results: Six providers and 11 patients participated in the study; 3 patients dropped out mainly owing to health issues. The remaining 8 patients completed 210 monitoring protocols, equal to over 1300 questions, with patients often answering questions daily. Providers and patients accessed the portal on an average of 10 and 1.5 times, respectively. Users found the system easy to use, with some patients reporting that the tool helped in their ability to self-manage, catalyzed a sense of responsibility over their care, and improved patient-centered care delivery. Some providers found that the tool helped focus conversations on goal setting. However, the tool did not fit well with provider workflows, monitoring questions were not adequately tailored to individual patient needs, and daily reporting became tedious and time-consuming for patients. Conclusions: Although our study suggests relatively low usability and feasibility of the ePRO tool, we are encouraged by the early impact on patient outcomes and generally positive responses from both user groups regarding the potential of the tool to improve care for patients with CCDD. As is consistent with our user-centered design development approach, we have modified the tool based on user feedback, and are now testing the redeveloped tool through an exploratory trial. PMID:27256035

  20. Assessing Child Nutrient Intakes Using a Tablet-Based 24-Hour Recall Tool in Rural Zambia.

    PubMed

    Caswell, Bess L; Talegawkar, Sameera A; Dyer, Brian; Siamusantu, Ward; Klemm, Rolf D W; Palmer, Amanda C

    2015-12-01

    Detailed dietary intake data in low-income populations are needed for research and program evaluation. However, collection of such data by paper-based 24-hour recall imposes substantial demands for staff time and expertise, training, materials, and data entry. Here we describe our development and use of a tablet-based 24-hour recall tool for conducting dietary intake surveys in remote settings. We designed a 24-hour recall tool using Open Data Kit software on an Android tablet platform. The tool contains a list of local foods, questions on portion size, cooking method, ingredients, and food source, and prompts to guide interviewers. We used this tool to interview caregivers on dietary intakes of children participating in an efficacy trial of provitamin A-biofortified maize conducted in Mkushi, a rural district in central Zambia. Participants were children aged 4 to 8 years not yet enrolled in school (n = 938). Dietary intake data were converted to nutrient intakes using local food composition and recipe tables. We developed a tablet-based 24-hour recall tool and used it to collect dietary data among 928 children. The majority of foods consumed were maize, leafy vegetable, or small fish dishes. Median daily energy intake was 6146 kJ (1469 kcal). Food and nutrient intakes assessed using the tablet-based tool were consistent with those reported in prior research. The tool was easily used by interviewers without prior nutrition training or computing experience. Challenges remain to improve programming, but the tool is an innovation that enables efficient collection of 24-hour recall data in remote settings. © The Author(s) 2015.

  1. Simulink/PARS Integration Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vacaliuc, B.; Nakhaee, N.

    2013-12-18

    The state of the art for signal processor hardware has far outpaced the development tools for placing applications on that hardware. In addition, signal processors are available in a variety of architectures, each uniquely capable of handling specific types of signal processing efficiently. With these processors becoming smaller and demanding less power, it has become possible to group multiple, heterogeneous processors into single systems. Different portions of the desired problem set can be assigned to different processor types as appropriate. As software development tools do not keep pace with these processors, especially when multiple processors of different types are used, a method is needed to enable software code portability among multiple processors and multiple types of processors along with their respective software environments. Sundance DSP, Inc. has developed a software toolkit called “PARS”, whose objective is to provide a framework that uses suites of tools provided by different vendors, along with modeling tools and a real-time operating system, to build an application that spans different processor types. The software language used to express the behavior of the system is a very high level modeling language, “Simulink”, a MathWorks product. ORNL has used this toolkit to effectively implement several deliverables. This CRADA describes this collaboration between ORNL and Sundance DSP, Inc.

  2. A New Network Modeling Tool for the Ground-based Nuclear Explosion Monitoring Community

    NASA Astrophysics Data System (ADS)

    Merchant, B. J.; Chael, E. P.; Young, C. J.

    2013-12-01

    Network simulations have long been used to assess the ability of monitoring networks to detect events, for purposes such as planning station deployments and evaluating network resilience to outages. The standard tool has been the SAIC-developed NetSim package. With correct parameters, NetSim can produce useful simulations; however, the package has several shortcomings: an older language (FORTRAN), an emphasis on seismic monitoring with limited support for other technologies, limited documentation, and a limited parameter set. Thus, we are developing NetMOD (Network Monitoring for Optimal Detection), a Java-based tool designed to assess the performance of ground-based networks. NetMOD's advantages include: a modern, multi-platform language; use of modern computing performance (e.g. multi-core processors); incorporation of monitoring technologies other than seismic; and a well-validated default parameter set for the IMS stations. NetMOD is designed to be extendable through a plugin infrastructure, so new phenomenological models can be added. Development of the Seismic Detection Plugin is being pursued first. Seismic location and infrasound and hydroacoustic detection plugins will follow. By making NetMOD an open-release package, we hope to provide a common tool that the monitoring community can use to produce assessments of monitoring networks and to verify assessments made by others.
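
    A basic quantity such simulation tools compute is the probability that at least k stations of a network detect an event, given heterogeneous per-station detection probabilities, i.e. a Poisson-binomial tail. A dynamic-programming sketch with hypothetical station probabilities (illustrative only, not NetMOD code):

        def prob_at_least_k(p, k):
            """P(at least k stations detect), stations independent with
            heterogeneous detection probabilities p (Poisson-binomial tail)."""
            dist = [1.0]                       # dist[j] = P(j detections so far)
            for pi in p:
                new = [0.0] * (len(dist) + 1)
                for j, pj in enumerate(dist):
                    new[j] += pj * (1 - pi)    # station misses
                    new[j + 1] += pj * pi      # station detects
                dist = new
            return sum(dist[k:])

        # Hypothetical 6-station network; event is "network detected"
        # when seen by at least 3 stations:
        stations = [0.9, 0.8, 0.75, 0.6, 0.5, 0.3]
        print(round(prob_at_least_k(stations, 3), 3))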

  3. Managing the "Performance" in Performance Management.

    ERIC Educational Resources Information Center

    Repinski, Marilyn; Bartsch, Maryjo

    1996-01-01

    Describes a five-step approach to performance management which includes (1) redefining tasks; (2) identifying skills; (3) determining what development tools are necessary; (4) prioritizing skills development; and (5) developing an action plan. Presents a hiring model that includes job analysis, job description, selection, goal setting, evaluation,…

  4. Creating a culture of professional development: a milestone pathway tool for registered nurses.

    PubMed

    Cooper, Elizabeth

    2009-11-01

    The nursing shortage continues to be a significant threat to health care. Creating a culture of professional development in health care institutions is one way to combat this shortage. Professional development refers to a constant commitment to maintain one's knowledge and skill base. Increasing professional development opportunities in the health care setting has been shown to affect nurse retention and satisfaction. Several approaches have been developed to increase professional development among nurses. However, for the most part, these are "one size fits all" approaches that direct nurses to progress in lock step fashion in skill and knowledge acquisition within a specialty. This article introduces a milestone pathway tool for registered nurses designed to enhance professional development that is unique to the individual nurse and the specific nursing unit. This tool provides a unit-specific concept map, a milestone pathway template, and a personal professional development plan. Copyright 2009, SLACK Incorporated.

  5. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 2: Career Development, General Education and Remediation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  6. Data Mining Tools Make Flights Safer, More Efficient

    NASA Technical Reports Server (NTRS)

    2014-01-01

    A small data mining team at Ames Research Center developed a set of algorithms ideal for combing through flight data to find anomalies. Dallas-based Southwest Airlines Co. signed a Space Act Agreement with Ames in 2011 to access the tools, helping the company refine its safety practices, improve its safety reviews, and increase flight efficiencies.

  7. Specificity and Sensitivity Ratios of the Pediatric Language Acquisition Screening Tool for Early Referral-Revised.

    ERIC Educational Resources Information Center

    Sherman, Tracy; Shulman, Brian B.

    1999-01-01

    This study examined test characteristics of the Pediatric Language Acquisition Screening Tool for Early Referral-Revised (PLASTER-R), a set of developmental questionnaires for children 3 to 60 months of age. The PLASTER-R was moderately to highly successful in identifying children within normal limits for language development. Test-retest…

  8. The Center/TRACON Automation System (CTAS): A video presentation

    NASA Technical Reports Server (NTRS)

    Green, Steven M.; Freeman, Jeannine

    1992-01-01

    NASA Ames, working with the FAA, has developed a highly effective set of automation tools for aiding the air traffic controller in traffic management within the terminal area. To effectively demonstrate these tools, the video AAV-1372, entitled 'Center/TRACON Automation System,' was produced. The script to the video is provided along with instructions for its acquisition.

  9. Initial Validation of the Prekindergarten Classroom Observation Tool and Goal Setting System for Data-Based Coaching

    ERIC Educational Resources Information Center

    Crawford, April D.; Zucker, Tricia A.; Williams, Jeffrey M.; Bhavsar, Vibhuti; Landry, Susan H.

    2013-01-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based…

  10. A Real-Time Plagiarism Detection Tool for Computer-Based Assessments

    ERIC Educational Resources Information Center

    Jeske, Heimo J.; Lall, Manoj; Kogeda, Okuthe P.

    2018-01-01

    Aim/Purpose: The aim of this article is to develop a tool to detect plagiarism in real time amongst students being evaluated for learning in a computer-based assessment setting. Background: Cheating or copying all or part of source code of a program is a serious concern to academic institutions. Many academic institutions apply a combination of…
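
    The record does not state the detection algorithm; a common basis for source-code plagiarism detectors is k-gram fingerprinting of normalized token streams, sketched below without the winnowing step used by tools such as MOSS (keyword list and example snippets are hypothetical):

        import re

        KEYWORDS = {"def", "for", "in", "range", "print", "if", "else",
                    "while", "return"}

        def fingerprints(code, k=5):
            """Hash every k-gram of tokens; identifiers are normalized to 'id'
            so that renaming variables does not hide copying."""
            raw = re.findall(r"[A-Za-z_]\w*|\S", code)
            tokens = [t if t in KEYWORDS or not re.match(r"[A-Za-z_]", t)
                      else "id" for t in raw]
            return {hash(tuple(tokens[i:i + k]))
                    for i in range(len(tokens) - k + 1)}

        def similarity(a, b, k=5):
            fa, fb = fingerprints(a, k), fingerprints(b, k)
            return len(fa & fb) / max(1, len(fa | fb))   # Jaccard overlap

        s1 = "for i in range(10):\n    total = total + i\nprint(total)"
        s2 = "for j in range(10):\n    acc = acc + j\nprint(acc)"
        print(similarity(s1, s2))  # 1.0: same structure despite renamed variables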

  11. The Engineering of Engineering Education: Curriculum Development from a Designer's Point of View

    ERIC Educational Resources Information Center

    Rompelman, Otto; De Graaff, Erik

    2006-01-01

    Engineers have a set of powerful tools at their disposal for designing robust and reliable technical systems. In educational design these tools are seldom applied. This paper explores the application of concepts from the systems approach in an educational context. The paradigms of design methodology and systems engineering appear to be suitable…

  12. Numerical tension adjustment of x-ray membrane to represent goat skin kompang

    NASA Astrophysics Data System (ADS)

    Siswanto, Waluyo Adi; Abdullah, Muhammad Syiddiq Bin

    2017-04-01

    This paper presents a numerical membrane model of the traditional musical instrument kompang, used to find the membrane tension at which an x-ray film membrane represents the classical goat-skin membrane of the kompang. In this study, an experiment on the kompang is first conducted in an acoustical anechoic enclosure and, in parallel, a mathematical model of the kompang membrane is developed to simulate its vibration in polar coordinates using a Fourier-Bessel wave function. The wave equation in the polar direction in mode (0,1) provides the corresponding natural frequencies of the circular membrane. The initial and boundary conditions in the function are determined from experiment to allow the correct development of the numerical equation. The numerical mathematical model is coded in SMath for accurate numerical analysis as well as plotting. Two kompang membrane cases with different membrane materials, i.e. goat-skin and x-ray film membranes with a fixed radius of 0.1 m, are used in the experiment. An alternative kompang membrane made of x-ray film with the appropriate tension setting can be used to represent the sound of the traditional goat-skin kompang; the tension setting of the membrane that resembles the goat skin is 24 N. An effective numerical tool has been developed to help kompang makers set the tension of an x-ray membrane. In future applications, a traditional kompang of any size can be fitted with another membrane material if the tension is set to the correct value. The developed numerical tool is useful and handy for calculating the tension of the alternative membrane.
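
    For an ideal circular membrane the (0,1) natural frequency is f01 = (alpha01 / (2*pi*a)) * sqrt(T/sigma), with alpha01 the first zero of J0, a the radius, T the tension and sigma the surface density; inverting this gives the tension at which an x-ray membrane matches a measured goat-skin pitch. A sketch with hypothetical sigma and frequency values (the paper's measured values are not in this record, and an ideal-membrane tension carries units of N/m, so the number printed is illustrative only):

        import numpy as np
        from scipy.special import jn_zeros

        ALPHA01 = jn_zeros(0, 1)[0]            # first zero of J0, ~2.405

        def fundamental_freq(tension, sigma, radius):
            """f01 of an ideal circular membrane; tension in N/m, sigma in kg/m^2."""
            return ALPHA01 / (2 * np.pi * radius) * np.sqrt(tension / sigma)

        def tension_for_freq(freq, sigma, radius):
            """Invert f01 to get the membrane tension that produces it."""
            return sigma * (2 * np.pi * radius * freq / ALPHA01) ** 2

        radius = 0.1                            # m, as in the paper
        sigma_xray = 0.18                       # kg/m^2 (hypothetical x-ray film)
        f_goat = 180.0                          # Hz (hypothetical goat-skin f01)
        print(f"{tension_for_freq(f_goat, sigma_xray, radius):.0f} N/m")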

  13. Software project management tools in global software development: a systematic mapping study.

    PubMed

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD) which is a growing trend in the software industry is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools, according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  14. Method Of Wire Insertion For Electric Machine Stators

    DOEpatents

    Brown, David L; Stabel, Gerald R; Lawrence, Robert Anthony

    2005-02-08

    A method of inserting coils in slots of a stator is provided. The method includes interleaving a first set of first phase windings and a first set of second phase windings on an insertion tool. The method also includes activating the insertion tool to radially insert the first set of first phase windings and the first set of second phase windings in the slots of the stator. In one embodiment, interleaving the first set of first phase windings and the first set of second phase windings on the insertion tool includes forming the first set of first phase windings in first phase openings defined in the insertion tool, and forming the first set of second phase windings in second phase openings defined in the insertion tool.

  15. An exploration of inter-organisational partnership assessment tools in the context of Australian Aboriginal-mainstream partnerships: a scoping review of the literature.

    PubMed

    Tsou, Christina; Haynes, Emma; Warner, Wayne D; Gray, Gordon; Thompson, Sandra C

    2015-04-23

    The need for better partnerships between Aboriginal organisations and mainstream agencies demands attention on process and relational elements of these partnerships, and improving partnership functioning through transformative or iterative evaluation procedures. This paper presents the findings of a literature review which examines the usefulness of existing partnership tools to the Australian Aboriginal-mainstream partnership (AMP) context. Three sets of best practice principles for successful AMP were selected based on authors' knowledge and experience. Items in each set of principles were separated into process and relational elements and used to guide the analysis of partnership assessment tools. The review and analysis of partnership assessment tools were conducted in three distinct but related parts. Part 1- identify and select reviews of partnership tools; part 2 - identify and select partnership self-assessment tool; part 3 - analysis of selected tools using AMP principles. The focus on relational and process elements in the partnership tools reviewed is consistent with the focus of Australian AMP principles by reconciliation advocates; however, historical context, lived experience, cultural context and approaches of Australian Aboriginal people represent key deficiencies in the tools reviewed. The overall assessment indicated that the New York Partnership Self-Assessment Tool and the VicHealth Partnership Analysis Tools reflect the greatest number of AMP principles followed by the Nuffield Partnership Assessment Tool. The New York PSAT has the strongest alignment with the relational elements while VicHealth and Nuffield tools showed greatest alignment with the process elements in the chosen AMP principles. Partnership tools offer opportunities for providing evidence based support to partnership development. The multiplicity of tools in existence and the reported uniqueness of each partnership, mean the development of a generic partnership analysis for AMP may not be a viable option for future effort.

  16. Nursing Minimum Data Set Based on EHR Archetypes Approach.

    PubMed

    Spigolon, Dandara N; Moro, Cláudia M C

    2012-01-01

    The establishment of a Nursing Minimum Data Set (NMDS) can facilitate the use of health information systems. Adopting such sets and representing them with archetypes is a way of developing and supporting health systems. The objective of this paper is to describe the definition of a minimum data set for nursing in endometriosis, represented with archetypes. The study was divided into two steps: defining the Nursing Minimum Data Set for endometriosis, and developing archetypes related to the NMDS. The nursing data set for endometriosis was represented in archetype form, using the whole perception of the evaluation item, organs and senses. This form of representation is an important tool for semantic interoperability and knowledge representation in health information systems.

  17. Nursing Minimum Data Set Based on EHR Archetypes Approach

    PubMed Central

    Spigolon, Dandara N.; Moro, Cláudia M.C.

    2012-01-01

    The establishment of a Nursing Minimum Data Set (NMDS) can facilitate the use of health information systems. Adopting such sets and representing them with archetypes is a way of developing and supporting health systems. The objective of this paper is to describe the definition of a minimum data set for nursing in endometriosis, represented with archetypes. The study was divided into two steps: defining the Nursing Minimum Data Set for endometriosis, and developing archetypes related to the NMDS. The nursing data set for endometriosis was represented in archetype form, using the whole perception of the evaluation item, organs and senses. This form of representation is an important tool for semantic interoperability and knowledge representation in health information systems. PMID:24199126

  18. BioAssay Research Database (BARD): chemical biology and probe-development enabled by structured metadata and result types

    PubMed Central

    Howe, E.A.; de Souza, A.; Lahr, D.L.; Chatwin, S.; Montgomery, P.; Alexander, B.R.; Nguyen, D.-T.; Cruz, Y.; Stonich, D.A.; Walzer, G.; Rose, J.T.; Picard, S.C.; Liu, Z.; Rose, J.N.; Xiang, X.; Asiedu, J.; Durkin, D.; Levine, J.; Yang, J.J.; Schürer, S.C.; Braisted, J.C.; Southall, N.; Southern, M.R.; Chung, T.D.Y.; Brudz, S.; Tanega, C.; Schreiber, S.L.; Bittker, J.A.; Guha, R.; Clemons, P.A.

    2015-01-01

    BARD, the BioAssay Research Database (https://bard.nih.gov/) is a public database and suite of tools developed to provide access to bioassay data produced by the NIH Molecular Libraries Program (MLP). Data from 631 MLP projects were migrated to a new structured vocabulary designed to capture bioassay data in a formalized manner, with particular emphasis placed on the description of assay protocols. New data can be submitted to BARD with a user-friendly set of tools that assist in the creation of appropriately formatted datasets and assay definitions. Data published through the BARD application program interface (API) can be accessed by researchers using web-based query tools or a desktop client. Third-party developers wishing to create new tools can use the API to produce stand-alone tools or new plug-ins that can be integrated into BARD. The entire BARD suite of tools therefore supports three classes of researcher: those who wish to publish data, those who wish to mine data for testable hypotheses, and those in the developer community who wish to build tools that leverage this carefully curated chemical biology resource. PMID:25477388

  19. Experimental validation of the RATE tool for inferring HLA restrictions of T cell epitopes.

    PubMed

    Paul, Sinu; Arlehamn, Cecilia S Lindestam; Schulten, Veronique; Westernberg, Luise; Sidney, John; Peters, Bjoern; Sette, Alessandro

    2017-06-21

    The RATE tool was recently developed to computationally infer the HLA restriction of given epitopes from immune response data of HLA-typed subjects, without additional cumbersome experimentation. Here, RATE was validated using experimentally defined restriction data from a set of 191 tuberculosis-derived epitopes and 63 healthy individuals with MTB infection from the Western Cape Region of South Africa. Using this experimental dataset, the parameters utilized by the RATE tool to infer restriction were optimized, which included the relative frequency (RF) of subjects responding to a given epitope and expressing a given allele as compared to the general test population, and the associated p-value in a Fisher's exact test. We also examined the potential for further optimization based on the predicted binding affinity of epitopes to potential restricting HLA alleles, and the absolute number of individuals expressing a given allele and responding to the specific epitope. Different statistical measures, including the Matthews correlation coefficient, accuracy, sensitivity and specificity, were used to evaluate the performance of RATE as a function of these criteria. Based on our results we recommend selection of HLA restrictions with cutoffs of p-value < 0.01 and RF ≥ 1.3. The usefulness of the tool was demonstrated by inferring new HLA restrictions for epitope sets where restrictions could not be experimentally determined due to lack of necessary cell lines, and for an additional data set related to recognition of pollen-derived epitopes in allergic patients. Experimental data sets were used to validate the RATE tool, and the parameters used by the tool to infer restriction were optimized. New HLA restrictions were identified using the optimized RATE tool.
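
    Both RATE criteria reduce to simple computations over a 2x2 responder-by-allele table: the relative frequency ratio and a Fisher's exact test. A sketch applying the recommended cutoffs to hypothetical counts:

        from scipy.stats import fisher_exact

        def infer_restriction(resp_with, resp_without,
                              nonresp_with, nonresp_without,
                              rf_cutoff=1.3, p_cutoff=0.01):
            """Counts of responders/non-responders to an epitope, split by
            whether they carry the candidate HLA allele.
            Returns (RF, p-value, restriction inferred?)."""
            responders = resp_with + resp_without
            everyone = responders + nonresp_with + nonresp_without
            allele_freq_all = (resp_with + nonresp_with) / everyone
            rf = (resp_with / responders) / allele_freq_all
            _, p = fisher_exact([[resp_with, resp_without],
                                 [nonresp_with, nonresp_without]])
            return rf, p, (rf >= rf_cutoff and p < p_cutoff)

        # Hypothetical epitope: 12 of 15 responders carry the allele,
        # versus 14 of the 48 remaining subjects
        print(infer_restriction(12, 3, 14, 34))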

  20. DATA FOR ENVIRONMENTAL MODELING: AN OVERVIEW

    EPA Science Inventory

    The objective of the project described here, entitled Data for Environmental Modeling (D4EM), is the development of a comprehensive set of software tools that allow an environmental model developer to automatically populate model input files with environmental data available from...

  1. Development of CMS monitoring procedures : technical summary.

    DOT National Transportation Integrated Search

    1998-04-01

    This research study is concerned with the development of a set of procedures for monitoring congestion using GPS and GIS. These procedures are meant to be used more as a planning tool than for everyday traffic monitoring. Under this assumption, a ser...

  2. Common Effects Methodology National Stakeholder Meeting December 1, 2010 White Papers

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  3. Common Effects Methodology Regional Stakeholder Meeting January 11 -22, 2010

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  4. Safety of Rural Nursing Home-to-Emergency Department Transfers: Improving Communication and Patient Information Sharing Across Settings.

    PubMed

    Tupper, Judith B; Gray, Carolyn E; Pearson, Karen B; Coburn, Andrew F

    2015-01-01

    The "siloed" approach to healthcare delivery contributes to communication challenges and to potential patient harm when patients transfer between settings. This article reports on the evaluation of a demonstration in 10 rural communities to improve the safety of nursing facility (NF) transfers to hospital emergency departments by forming interprofessional teams of hospital, emergency medical service, and NF staff to develop and implement tools and protocols for standardizing critical interfacility communication pathways and information sharing. We worked with each of the 10 teams to document current communication processes and information sharing tools and to design, implement, and evaluate strategies/tools to increase effective communication and sharing of patient information across settings. A mixed methods approach was used to evaluate changes from baseline in documentation of patient information shared across settings during the transfer process. Study findings showed significant improvement in key areas across the three settings, including infection status and baseline mental functioning. Improvement strategies and performance varied across settings; however, accurate and consistent information sharing of advance directives and medication lists remains a challenge. Study results demonstrate that with neutral facilitation and technical support, collaborative interfacility teams can assess and effectively address communication and information sharing problems that threaten patient safety.

  5. Osteoporosis risk prediction for bone mineral density assessment of postmenopausal women using machine learning.

    PubMed

    Yoo, Tae Keun; Kim, Sung Kean; Kim, Deok Won; Choi, Joon Yul; Lee, Wan Hyung; Oh, Ein; Park, Eun-Cheol

    2013-11-01

    A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of more accurately identifying the risk of osteoporosis in postmenopausal women compared to the ability of conventional clinical decision tools. We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Examination Surveys. The training data set was used to construct models based on popular machine learning algorithms such as support vector machines (SVM), random forests, artificial neural networks (ANN), and logistic regression (LR) based on simple surveys. The machine learning models were compared to four conventional clinical decision tools: osteoporosis self-assessment tool (OST), osteoporosis risk assessment instrument (ORAI), simple calculated osteoporosis risk estimation (SCORE), and osteoporosis index of risk (OSIRIS). SVM had significantly better area under the curve (AUC) of the receiver operating characteristic than ANN, LR, OST, ORAI, SCORE, and OSIRIS for the training set. SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0% at total hip, femoral neck, or lumbar spine for the testing set. The significant factors selected by SVM were age, height, weight, body mass index, duration of menopause, duration of breast feeding, estrogen therapy, hyperlipidemia, hypertension, osteoarthritis, and diabetes mellitus. Considering various predictors associated with low bone density, the machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.
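
    A sketch of the modeling step described, with synthetic records standing in for the KNHANES data: fit an SVM on simple survey predictors and compare its test-set AUC against an OST-style index, assuming the published OST formula 0.2 * (weight - age), where lower values mean higher risk:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        n = 1500
        X = np.column_stack([
            rng.normal(63, 9, n),        # age (years)
            rng.normal(156, 6, n),       # height (cm)
            rng.normal(58, 9, n),        # weight (kg)
        ])
        risk = 0.08 * (X[:, 0] - 63) - 0.05 * (X[:, 2] - 58) + rng.normal(0, 1, n)
        y = (risk > 0.8).astype(int)     # synthetic "low BMD" label

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)
        svm = make_pipeline(StandardScaler(), SVC(probability=True,
                                                  random_state=0))
        svm.fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, svm.predict_proba(X_te)[:, 1])

        # OST-style comparator; negated so that higher score = higher risk
        ost = 0.2 * (X_te[:, 2] - X_te[:, 0])
        print(f"SVM AUC={auc:.3f}  OST AUC={roc_auc_score(y_te, -ost):.3f}")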

  6. Multisite Evaluation of a Data Quality Tool for Patient-Level Clinical Data Sets

    PubMed Central

    Huser, Vojtech; DeFalco, Frank J.; Schuemie, Martijn; Ryan, Patrick B.; Shang, Ning; Velez, Mark; Park, Rae Woong; Boyce, Richard D.; Duke, Jon; Khare, Ritu; Utidjian, Levon; Bailey, Charles

    2016-01-01

    Introduction: Data quality and fitness for analysis are crucial if outputs of analyses of electronic health record data or administrative claims data are to be trusted by the public and the research community. Methods: We describe a data quality analysis tool (called Achilles Heel) developed by the Observational Health Data Sciences and Informatics (OHDSI) collaborative and compare outputs from this tool as it was applied to 24 large healthcare datasets across seven different organizations. Results: We highlight 12 data quality rules that identified issues in at least 10 of the 24 datasets and provide a full set of 71 rules identified in at least one dataset. Achilles Heel is freely available software that provides a useful starter set of data quality rules, with the ability to add additional rules. We also present the results of a structured email-based interview of all participating sites that collected qualitative comments about the value of Achilles Heel for data quality evaluation. Discussion: Our analysis represents the first comparison of outputs from a data quality tool that implements a fixed (but extensible) set of data quality rules. Thanks to a common data model, we were able to quickly compare multiple datasets originating from several countries in America, Europe and Asia. PMID:28154833
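
    A toy illustration of the style of rule such a tool applies (the real Achilles Heel runs SQL against the OMOP common data model; the schema here is hypothetical):

        # Flag persons whose implied age is implausible -- one example of a
        # fixed-but-extensible data quality rule. Column names invented.
        import pandas as pd

        persons = pd.DataFrame({
            "person_id": [1, 2, 3],
            "year_of_birth": [1950, 1890, 2010],   # 1890 implies age > 120
        })

        def rule_implausible_age(df, max_age=120, current_year=2016):
            bad = df[current_year - df["year_of_birth"] > max_age]
            return [f"person {p}: implausible age" for p in bad["person_id"]]

        print(rule_implausible_age(persons))   # ['person 2: implausible age']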

  7. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results, while other remote servers are used for archiving larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, allows computation and storage to scale with need, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.
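
    One lightweight way to realize this kind of traceability is to stamp every run with a manifest of input hashes and tool versions; the sketch below is purely illustrative and not the mission toolchain's actual mechanism:

        # Record, per simulation run, the exact inputs and tool versions used,
        # so results remain traceable to their configuration. Names invented.
        import hashlib
        import time

        def sha256_of(path):
            with open(path, "rb") as f:
                return hashlib.sha256(f.read()).hexdigest()

        def run_manifest(model_files, tool_versions):
            return {
                "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
                "model_hashes": {p: sha256_of(p) for p in model_files},
                "tool_versions": tool_versions,
            }

        # e.g. run_manifest(["power_model.json"], {"sim_engine": "2.3.1"})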

  8. Efficient monitoring of CRAB jobs at CMS

    NASA Astrophysics Data System (ADS)

    Silva, J. M. D.; Balcas, J.; Belforte, S.; Ciangottini, D.; Mascheroni, M.; Rupeika, E. A.; Ivanov, T. T.; Hernandez, J. M.; Vaandering, E.

    2017-10-01

    CRAB is a tool used for distributed analysis of CMS data. Users can submit sets of jobs with similar requirements (tasks) with a single request. CRAB uses a client-server architecture, where a lightweight client, a server, and ancillary services work together and are maintained by CMS operators at CERN. As with most complex software, good monitoring tools are crucial for efficient use and long-term maintainability. This work gives an overview of the monitoring tools developed to ensure the CRAB server and infrastructure are functional, help operators debug user problems, and minimize overhead and operating cost. This work also illustrates the design choices and gives a report on our experience with the tools we developed and the external ones we used.

  9. Efficient Monitoring of CRAB Jobs at CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, J. M.D.; Balcas, J.; Belforte, S.

    CRAB is a tool used for distributed analysis of CMS data. Users can submit sets of jobs with similar requirements (tasks) with a single request. CRAB uses a client-server architecture, where a lightweight client, a server, and ancillary services work together and are maintained by CMS operators at CERN. As with most complex software, good monitoring tools are crucial for efficient use and long-term maintainability. This work gives an overview of the monitoring tools developed to ensure the CRAB server and infrastructure are functional, help operators debug user problems, and minimize overhead and operating cost. This work also illustrates the design choices and gives a report on our experience with the tools we developed and the external ones we used.

  10. Methods for transition toward computer assisted cognitive examination.

    PubMed

    Jurica, P; Valenzi, S; Struzik, Z R; Cichocki, A

    2015-01-01

    We present a software framework which enables the extension of current methods for the assessment of cognitive fitness using recent technological advances. Screening for cognitive impairment is becoming more important as the world's population grows older. Current methods could be enhanced by the use of computers. Introduction of new methods to clinics requires basic tools for the collection and communication of collected data. Our aim was to develop tools that, with minimal interference, offer new opportunities for enhancing current interview-based cognitive examinations. We suggest methods, and discuss the process, by which established cognitive tests can be adapted for data collection through digitization on pen-enabled tablets. We discuss a number of methods for evaluation of the collected data, which promise to increase the resolution and objectivity of the common scoring strategy based on visual inspection. By involving computers in the roles of both instructing and scoring, we aim to increase the precision and reproducibility of cognitive examination. The tools provided in the Python framework CogExTools, available at http://bsp.brain.riken.jp/cogextools/, enable the design, application and evaluation of screening tests for the assessment of cognitive impairment. The toolbox is a research platform; it represents a foundation for further collaborative development by the wider research community and enthusiasts. It is free to download and use, and open-source. We introduce a set of open-source tools that facilitate the design and development of new cognitive tests for modern technology. We provide these tools in order to enable the adaptation of technology for cognitive examination in clinical settings. The tools provide the first step in a possible transition toward standardized mental state examination using computers.
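
    A generic illustration (not the CogExTools API) of the extra signal digitization buys: a pen stroke is a series of timestamped samples, so objective measures such as path length and drawing speed fall out directly:

        import math

        # One stroke as (t_seconds, x, y) samples from a hypothetical tablet.
        stroke = [(0.00, 10.0, 10.0), (0.05, 12.0, 11.0), (0.10, 15.0, 13.0)]

        def stroke_metrics(samples):
            length = sum(math.hypot(x2 - x1, y2 - y1)
                         for (_, x1, y1), (_, x2, y2) in zip(samples, samples[1:]))
            duration = samples[-1][0] - samples[0][0]
            return {"path_length": length, "duration": duration,
                    "mean_speed": length / duration if duration else 0.0}

        print(stroke_metrics(stroke))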

  11. Development of a research ethics knowledge and analytical skills assessment tool.

    PubMed

    Taylor, Holly A; Kass, Nancy E; Ali, Joseph; Sisson, Stephen; Bertram, Amanda; Bhan, Anant

    2012-04-01

    The goal of this project was to develop and validate a new tool to evaluate learners' knowledge and skills related to research ethics. A core set of 50 questions from existing computer-based online teaching modules was identified, refined and supplemented to create a set of 74 multiple-choice, true/false and short-answer questions. The questions were pilot-tested and item discrimination was calculated for each question. Poorly performing items were eliminated or refined. Two comparable assessment tools were created. These assessment tools were administered as a pre-test and post-test to a cohort of 58 Indian junior health research investigators before and after exposure to a new course on research ethics. Half of the investigators were exposed to the course online, the other half in person. Item discrimination was calculated for each question and Cronbach's α for each assessment tool. A final version of the assessment tool that incorporated the best questions from the pre-/post-test phase was used to assess retention of research ethics knowledge and skills 3 months after course delivery. The final version of the Research Ethics Knowledge and Analytical Skills Assessment (REKASA) includes 41 items and had a Cronbach's α of 0.837. The results illustrate, in one sample of learners, the successful, systematic development and use of a knowledge and skills assessment tool in research ethics capable not only of measuring basic knowledge in research ethics and oversight but also of assessing learners' ability to apply ethics knowledge to the analytical task of reasoning through research ethics cases, without reliance on essay or discussion-based examination. These promising preliminary findings should be confirmed with additional groups of learners.
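
    For reference, the Cronbach's α reported above can be computed directly from a respondents-by-items score matrix; the data in this sketch are simulated:

        import numpy as np

        def cronbach_alpha(scores):
            """scores: 2-D array, rows = respondents, columns = items."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1)
            total_var = scores.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        rng = np.random.default_rng(0)
        ability = rng.normal(size=(100, 1))                     # latent skill
        items = (ability + rng.normal(scale=0.8, size=(100, 10)) > 0).astype(int)
        print(f"alpha = {cronbach_alpha(items):.3f}")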

  12. Frailty in trauma: A systematic review of the surgical literature for clinical assessment tools.

    PubMed

    McDonald, Victoria S; Thompson, Kimberly A; Lewis, Paul R; Sise, C Beth; Sise, Michael J; Shackford, Steven R

    2016-05-01

    Elderly trauma patients have outcomes worse than those of similarly injured younger patients. Although patient age and comorbidities explain some of the difference, the contribution of frailty to outcomes is largely unknown because of the lack of assessment tools developed specifically to assess frailty in the trauma population. This systematic review of the surgical literature identifies currently available frailty clinical assessment tools and evaluates the potential of each instrument to assess frailty in elderly patients with trauma. This review was registered with PROSPERO (the international prospective register of systematic reviews, registration number CRD42014015350). Publications in English from January 1995 to October 2014 were identified by a comprehensive search strategy in MEDLINE, EMBASE, and CINAHL, supplemented by manual screening of article bibliographies and subjected to three tiers of review. Forty-two studies reporting on frailty assessment tools were selected for analysis. Criteria for objectivity, feasibility in the trauma setting, and utility to predict trauma outcomes were formulated and used to evaluate the tools, including their subscales and individual items. Thirty-two unique frailty assessment tools were identified. Of those, 4 tools as a whole, 2 subscales, and 29 individual items qualified as objective, feasible, and useful in the clinical assessment of trauma patients. The single existing tool developed specifically to assess frailty in trauma did not meet evaluation criteria. Few frailty assessment tools in the surgical literature qualify as objective, feasible, and useful measures of frailty in the trauma population. However, a number of individual tool items and subscales could be combined to assess frailty in the trauma setting. Research to determine the accuracy of these measures and the magnitude of the contribution of frailty to trauma outcomes is needed. Systematic review, level III.

  13. Using Children's Picture Books about Autism as Resources in Inclusive Classrooms

    ERIC Educational Resources Information Center

    Sigmon, Miranda L.; Tackett, Mary E.; Azano, Amy Price

    2016-01-01

    This article focuses on developing teacher understanding of how to carefully select and use children's picture books about autism as a tool for teaching awareness, empathy, and acceptance in an elementary classroom setting. We describe how the increased rate of autism and growing practice of inclusive educational settings affect classroom practice…

  14. Developing self-regulation in early childhood☆

    PubMed Central

    Rothbart, Mary K.; Tang, Yiyuan

    2014-01-01

    Studies using fMRI at rest and during task performance have revealed a set of brain areas and their connections that can be linked to the ability of children to regulate their thoughts, actions and emotions. Higher self-regulation has also been related to favorable outcomes in adulthood. These findings have set the occasion for methods of improving self-regulation via training. A tool kit of such methods is now available. It remains to be seen whether educators will use these new findings and tools to forge practical methods for improving the lives of the world's children. PMID:24563845

  15. Development of a personalized decision aid for breast cancer risk reduction and management.

    PubMed

    Ozanne, Elissa M; Howe, Rebecca; Omer, Zehra; Esserman, Laura J

    2014-01-14

    Breast cancer risk reduction has the potential to decrease the incidence of the disease, yet remains underused. We report on the development of a web-based tool that provides automated risk assessment and personalized decision support designed for collaborative use between patients and clinicians. Under Institutional Review Board approval, we evaluated the decision tool through a patient focus group, usability testing, and provider interviews (including breast specialists, primary care physicians, and genetic counselors). This included demonstrations and data collection at two scientific conferences (the 2009 International Shared Decision Making Conference and the 2009 San Antonio Breast Cancer Symposium). Overall, the evaluations were favorable. The patient focus group evaluations and usability testing (N = 34) provided qualitative feedback about format and design; 88% of these participants found the tool useful and 94% found it easy to use. Of the providers (N = 23), 91% indicated that they would use the tool in their clinical setting. BreastHealthDecisions.org represents a new approach to breast cancer prevention care and a framework for high-quality preventive healthcare. The ability to integrate risk assessment and decision support in real time will allow for informed, value-driven, and patient-centered breast cancer prevention decisions. The tool is being further evaluated in the clinical setting.

  16. Informed consent comprehension in African research settings.

    PubMed

    Afolabi, Muhammed O; Okebe, Joseph U; McGrath, Nuala; Larson, Heidi J; Bojang, Kalifa; Chandramohan, Daniel

    2014-06-01

    Previous reviews on participants' comprehension of informed consent information have focused on developed countries. Experience has shown that ethical standards developed on Western values may not be appropriate for African settings where research concepts are unfamiliar. We undertook this review to describe how informed consent comprehension is defined and measured in African research settings. We conducted a comprehensive search involving five electronic databases: Medline, Embase, Global Health, EthxWeb and the Bioethics Literature Database (BELIT). We also examined African Index Medicus and Google Scholar for relevant publications on informed consent comprehension in clinical studies conducted in sub-Saharan Africa. Twenty-nine studies satisfied the inclusion criteria; meta-analysis was possible in 21 studies. We further conducted a direct comparison of participants' comprehension on domains of informed consent in all eligible studies. Comprehension of key concepts of informed consent varies considerably from country to country and depends on the nature and complexity of the study. Meta-analysis showed that 47% of a total of 1633 participants across four studies demonstrated comprehension about randomisation (95% CI 13.9-80.9%). Similarly, 48% of 3946 participants in six studies demonstrated understanding of placebo (95% CI 19.0-77.5%), while only 30% of 753 participants in five studies understood the concept of therapeutic misconception (95% CI 4.6-66.7%). Measurement tools for informed consent comprehension were developed with little or no validation. Assessment of comprehension was carried out at variable times after disclosure of study information. No uniform definition of informed consent comprehension exists to form the basis for development of an appropriate tool to measure comprehension in African participants. Comprehension of key concepts of informed consent is poor among study participants across Africa. There is a vital need to develop a uniform definition for informed consent comprehension in low-literacy research settings in Africa. This will be an essential step towards developing appropriate tools that can adequately measure informed consent comprehension. This may consequently suggest adequate measures to improve the informed consent procedure. © 2014 John Wiley & Sons Ltd.

  17. Using competences and competence tools in workforce development.

    PubMed

    Green, Tess; Dickerson, Claire; Blass, Eddie

    The NHS Knowledge and Skills Framework (KSF) has been a driving force in the move to competence-based workforce development in the NHS. Skills for Health has developed national workforce competences that aim to improve behavioural performance, and in turn increase productivity. This article describes five projects established to test Skills for Health national workforce competences, electronic tools and products in different settings in the NHS. Competences and competence tools were used to redesign services, develop job roles, identify skills gaps and develop learning programmes. Reported benefits of the projects included increased clarity and a structured, consistent and standardized approach to workforce development. Findings from the evaluation of the tools were positive in terms of their overall usefulness and provision of related training/support. Reported constraints of using the competences and tools included issues relating to their availability, content and organization. It is recognized that a highly skilled and flexible workforce is important to the delivery of high-quality health care. These projects suggest that Skills for Health competences can be used as a 'common currency' in workforce development in the UK health sector. This would support the need to adapt rapidly to changing service needs.

  18. Airborne Turbulence Detection System Certification Tool Set

    NASA Technical Reports Server (NTRS)

    Hamilton, David W.; Proctor, Fred H.

    2006-01-01

    A methodology and a corresponding set of simulation tools for testing and evaluating turbulence detection sensors have been presented. The tool set is available to industry and the FAA for certification of radar-based airborne turbulence detection systems. The tool set consists of simulated data sets representing convectively induced turbulence, an airborne radar simulation system, hazard tables to convert the radar observable to an aircraft load, documentation, a hazard metric "truth" algorithm, and criteria for scoring the predictions. Analysis indicates that flight test data support spatial buffers for scoring detections. Also, flight data and demonstrations with the tool set suggest the need for a magnitude buffer.
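
    A hedged sketch of buffer-based scoring: a prediction counts as a hit if a "truth" event lies within a spatial buffer. The 1-D along-track geometry and buffer size below are illustrative, not the certification criteria themselves:

        def score_detections(predictions, truths, buffer_nm=1.0):
            hits = sum(any(abs(p - t) <= buffer_nm for t in truths)
                       for p in predictions)
            misses = sum(all(abs(t - p) > buffer_nm for p in predictions)
                         for t in truths)
            return {"hits": hits, "misses": misses,
                    "false_alarms": len(predictions) - hits}

        # Along-track positions (nautical miles) of predicted vs. actual events:
        print(score_detections(predictions=[10.2, 25.0], truths=[10.8, 40.1]))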

  19. Construction and completion of flux balance models from pathway databases.

    PubMed

    Latendresse, Mario; Krummenacker, Markus; Trupp, Miles; Karp, Peter D

    2012-02-01

    Flux balance analysis (FBA) is a well-known technique for genome-scale modeling of metabolic flux. Typically, an FBA formulation requires the accurate specification of four sets: biochemical reactions, biomass metabolites, nutrients and secreted metabolites. The development of FBA models can be time consuming and tedious because of the difficulty in assembling completely accurate descriptions of these sets and in identifying errors in their composition. For example, the presence of a single non-producible metabolite in the biomass will make the entire model infeasible. Other difficulties in FBA modeling are that model distributions and predicted fluxes can be cryptic and difficult to understand. We present a multiple gap-filling method to accelerate the development of FBA models using a new tool, called MetaFlux, based on mixed integer linear programming (MILP). The method suggests corrections to the sets of reactions, biomass metabolites, nutrients and secretions. The method generates FBA models directly from Pathway/Genome Databases. Thus, FBA models developed in this framework are easily queried and visualized using the Pathway Tools software. Predicted fluxes are more easily comprehended by visualizing them on diagrams of individual metabolic pathways or of metabolic maps. MetaFlux can also remove redundant high-flux loops, solve FBA models once they are generated and model the effects of gene knockouts. MetaFlux has been validated through construction of FBA models for Escherichia coli and Homo sapiens. Pathway Tools with MetaFlux is freely available to academic users, and for a fee to commercial users. Download from: biocyc.org/download.shtml. Contact: mario.latendresse@sri.com. Supplementary data are available at Bioinformatics online.
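
    The core FBA computation is an ordinary linear program; here is a minimal sketch on a toy three-reaction network (not MetaFlux itself) using SciPy:

        # Maximize "biomass" flux v3 subject to steady state S @ v = 0 and
        # flux bounds. Reactions: R1 takes up A, R2 converts A -> B,
        # R3 drains B into biomass.
        import numpy as np
        from scipy.optimize import linprog

        S = np.array([[1, -1,  0],     # metabolite A balance
                      [0,  1, -1]])    # metabolite B balance
        c = np.array([0, 0, -1])       # linprog minimizes, so negate biomass
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=[(0, 10)] * 3)
        print("optimal fluxes:", res.x)   # -> [10, 10, 10]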

  20. Measuring teamwork and taskwork of community-based “teams” delivering life-saving health interventions in rural Zambia: a qualitative study

    PubMed Central

    2013-01-01

    Background The use of teams is a well-known approach in a variety of settings, including health care, in both developed and developing countries. Team performance comprises teamwork and taskwork, and ascertaining whether a team is performing as expected to achieve the desired outcome has rarely been done in health care settings in resource-limited countries. Measuring teamwork requires identifying the dimensions or processes that comprise the teamwork construct, while measuring taskwork requires identifying specific team functions. Since 2008 a community-based project in rural Zambia has teamed community health workers (CHWs) and traditional birth attendants (TBAs), supported by Neighborhood Health Committees (NHCs), to provide essential newborn and continuous curative care for children 0–59 months. This paper describes the process of developing a measure of teamwork and taskwork for community-based health teams in rural Zambia. Methods Six group discussions and pile-sorting sessions were conducted with three NHCs and three groups of CHW-TBA teams. Each session comprised six individuals. Results We selected 17 factors identified by participants as relevant for measuring teamwork in this rural setting. Participants endorsed seven functions as important to measure taskwork. To explain team performance, we assigned 20 factors into three sub-groups: personal, community-related and service-related. Conclusion Community- and culturally-relevant processes, functions and factors were used to develop a tool for measuring teamwork and taskwork in this rural community, and the tool was quite distinct from tools used in developed countries. PMID:23802766

  1. Adding an Expert to the Team: The Expert Flight Plan Critic

    ERIC Educational Resources Information Center

    Gibbons, Andrew; Waki, Randy; Fairweather, Peter

    2008-01-01

    This paper reports the development of a practical tool that provides expert feedback to students following an extended simulation exercise in cross-country flight planning. In contrast to development for laboratory settings, the development of an expert instructional product for everyday use posed some interesting challenges, including dealing…

  2. The AAPT/ComPADRE Digital Library: Supporting Physics Education at All Levels

    NASA Astrophysics Data System (ADS)

    Mason, Bruce

    For more than a decade, the AAPT/ComPADRE Digital Library has been providing online resources, tools, and services that support broad communities of physics faculty and physics education researchers. This online library provides vetted resources for teachers and students, an environment for authors and developers to share their work, and collaboration tools for a diverse set of users. This talk will focus on the recent collaborations and developments being hosted on or developed with ComPADRE. Examples include PhysPort, which makes the tools and resources developed by physics education researchers more accessible; the Open Source Physics project, which expands the use of numerical modeling at all levels of physics education; and PICUP, a community for those promoting computation in the physics curriculum. NSF-0435336, 0532798, 0840768, 0937836.

  3. ROBIS: A new tool to assess risk of bias in systematic reviews was developed

    PubMed Central

    Whiting, Penny; Savović, Jelena; Higgins, Julian P.T.; Caldwell, Deborah M.; Reeves, Barnaby C.; Shea, Beverley; Davies, Philippa; Kleijnen, Jos; Churchill, Rachel

    2016-01-01

    Objective To develop ROBIS, a new tool for assessing the risk of bias in systematic reviews (rather than in primary studies). Study Design and Setting We used a four-stage approach to develop ROBIS: define the scope, review the evidence base, hold a face-to-face meeting, and refine the tool through piloting. Results ROBIS is currently aimed at four broad categories of reviews mainly within health care settings: interventions, diagnosis, prognosis, and etiology. The target audience of ROBIS is primarily guideline developers, authors of overviews of systematic reviews (“reviews of reviews”), and review authors who might want to assess or avoid risk of bias in their reviews. The tool is completed in three phases: (1) assess relevance (optional), (2) identify concerns with the review process, and (3) judge risk of bias. Phase 2 covers four domains through which bias may be introduced into a systematic review: study eligibility criteria; identification and selection of studies; data collection and study appraisal; and synthesis and findings. Phase 3 assesses the overall risk of bias in the interpretation of review findings and whether this considered limitations identified in any of the phase 2 domains. Signaling questions are included to help judge concerns with the review process (phase 2) and the overall risk of bias in the review (phase 3); these questions flag aspects of review design related to the potential for bias and aim to help assessors judge risk of bias in the review process, results, and conclusions. Conclusions ROBIS is the first rigorously developed tool designed specifically to assess the risk of bias in systematic reviews. PMID:26092286

  4. Development and Validation of the Controller Acceptance Rating Scale (CARS): Results of Empirical Research

    NASA Technical Reports Server (NTRS)

    Lee, Katharine K.; Kerns, Karol; Bone, Randall

    2001-01-01

    The measurement of operational acceptability is important for the development, implementation, and evolution of air traffic management decision support tools. The Controller Acceptance Rating Scale (CARS) was developed at NASA Ames Research Center for the development and evaluation of the Passive Final Approach Spacing Tool. CARS was modeled after a well-known pilot evaluation rating instrument, the Cooper-Harper Scale, and has since been used in the evaluation of the User Request Evaluation Tool, developed by MITRE's Center for Advanced Aviation System Development. In this paper, we provide a discussion of the development of CARS and an analysis of the empirical data collected with CARS to examine construct validity. Results of intraclass correlations indicated statistically significant reliability for the CARS. From the subjective workload data that were collected in conjunction with the CARS, it appears that the expected set of workload attributes was correlated with the CARS ratings. As expected, the analysis also showed that CARS was a sensitive indicator of the impact of decision support tools on controller operations. Suggestions for the future development and improvement of CARS are also provided.

  5. Reaping the benefits of an open systems approach: getting the commercial approach right

    NASA Astrophysics Data System (ADS)

    Pearson, Gavin; Dawe, Tony; Stubbs, Peter; Worthington, Olwen

    2016-05-01

    Critical to reaping the benefits of an Open System Approach within Defence, or any other sector, is the ability to design the appropriate commercial model (or framework). This paper reports on the development and testing of a commercial strategy decision support tool. The tool set comprises a number of elements, including a process model, and provides business intelligence insights into likely supplier behaviour. The tool has been developed by subject matter experts and has been tested with a number of UK Defence procurement teams. The paper will present the commercial model framework, the elements of the toolset and the results of testing.

  6. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  7. Computer assisted blast design and assessment tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cameron, A.R.; Kleine, T.H.; Forsyth, W.W.

    1995-12-31

    In general, the software required by a blast designer includes tools that graphically present blast designs (surface and underground), can analyze a design or predict its result, and can assess blasting results. As computers develop and computer literacy continues to rise, the development and use of such tools will spread. Examples of the tools that are becoming available include: automatic blast pattern generation and underground ring design; blast design evaluation in terms of explosive distribution and detonation simulation; fragmentation prediction; blast vibration prediction and minimization; blast monitoring for assessment of dynamic performance; vibration measurement, display and signal processing; evaluation of blast results in terms of fragmentation; and risk- and reliability-based blast assessment. The authors have identified a set of criteria that are essential in choosing appropriate software blasting tools.

  8. Falls risk assessment begins with hello: lessons learned from the use of one home health agency's fall risk tool.

    PubMed

    Flemming, Patricia J; Ramsay, Katherine

    2012-10-01

    Identifying older adults at risk for falls is a challenge all home healthcare agencies (HHAs) face. The process of assessing for falls risk begins with the initial home visit. One HHA affiliated with an academic medical center describes its experience in development and use of a Falls Risk Assessment (FRA) tool over a 10-year period. The FRA tool has been modified since initial development to clarify elements of the tool based on research and to reflect changes in the Outcome and Assessment Information Set (OASIS) document. The primary purpose of this article is to share a validated falls risk assessment tool to facilitate identification of fall-related risk factors in the homebound population. A secondary purpose is to share lessons learned by the HHA during the 10 years using the FRA.

  9. Tissue enrichment analysis for C. elegans genomics.

    PubMed

    Angeles-Albores, David; N Lee, Raymond Y; Chan, Juancarlos; Sternberg, Paul W

    2016-09-13

    Over the last ten years, there has been explosive development in methods for measuring gene expression. These methods can identify thousands of genes altered between conditions, but understanding these datasets and forming hypotheses based on them remains challenging. One way to analyze these datasets is to associate ontologies (hierarchical, descriptive vocabularies with controlled relations between terms) with genes and to look for enrichment of specific terms. Although the Gene Ontology (GO) is available for Caenorhabditis elegans, it does not include anatomical information. We have developed a tool for identifying enrichment of C. elegans tissues among gene sets and generated a website GUI where users can access this tool. Since a common drawback of ontology enrichment analyses is their verbosity, we developed a very simple filtering algorithm to reduce the ontology size by an order of magnitude. We adjusted these filters and validated our tool using a set of 30 gold standards from Expression Cluster data in WormBase. We show that our tool can discriminate between embryonic and larval tissues and can identify tissues down to the single-cell level. We used our tool to identify multiple neuronal tissues that are down-regulated due to pathogen infection in C. elegans. Our Tissue Enrichment Analysis (TEA) can be found within WormBase and can be downloaded using Python's standard pip installer. It tests a slimmed-down C. elegans tissue ontology for enrichment of specific terms and provides users with text and graphic representations of the results.
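
    The core computation behind such term-enrichment tests is a hypergeometric test for over-representation; a minimal sketch with invented counts:

        # P-value for observing k or more term-annotated genes in a list of
        # N genes, drawn from a universe of M genes of which n carry the term.
        from scipy.stats import hypergeom

        M, n, N, k = 20000, 150, 300, 12
        p = hypergeom.sf(k - 1, M, n, N)   # P(X >= k)
        print(f"enrichment p-value = {p:.2e}")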

  10. Evaluation and Verification of Decadal Predictions using the MiKlip Central Evaluation System - a Case Study using the MiKlip Prototype Model Data

    NASA Astrophysics Data System (ADS)

    Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich

    2016-04-01

    MiKlip is a project for medium-term climate prediction, funded by the German Federal Ministry of Education and Research (BMBF), which aims to create a model system able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools, so-called plugins, for the CES. The main focus of these plugins is on the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope. They target a wide range of scientific questions, ranging from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET), which was developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.
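
    As an example of the deterministic verification such plugins perform, a mean-squared-error skill score of a forecast against a reference (here climatology) can be computed in a few lines; the data are synthetic and this is not the MurCSS implementation:

        import numpy as np

        def msess(forecast, reference, obs):
            mse = lambda a, b: float(np.mean((a - b) ** 2))
            return 1.0 - mse(forecast, obs) / mse(reference, obs)

        rng = np.random.default_rng(1)
        obs = rng.normal(size=50)
        forecast = obs + rng.normal(scale=0.5, size=50)   # skilful forecast
        climatology = np.zeros(50)                        # reference forecast
        print(f"MSESS = {msess(forecast, climatology, obs):.2f}")   # > 0 is skill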

  11. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  12. The Balanced Scorecard of acute settings: development process, definition of 20 strategic objectives and implementation.

    PubMed

    Groene, Oliver; Brandt, Elimer; Schmidt, Werner; Moeller, Johannes

    2009-08-01

    Strategy development and implementation in acute care settings is often restricted by competing challenges, the pace of policy reform and the existence of parallel hierarchies. To describe a generic approach to strategy development, illustrate the use of the Balanced Scorecard as a tool to facilitate strategy implementation and demonstrate how to break down strategic goals into measurable elements. A multi-method approach using three different conceptual models: Health Promoting Hospitals Standards and Strategies, the European Foundation for Quality Management (EFQM) Model and the Balanced Scorecard. A bundle of qualitative and quantitative methods was used, including in-depth interviews and standardized organization-wide surveys on organizational values, staff satisfaction and patient experience. Three acute care hospitals in four different locations belonging to a German holding group. Chief executive officer, senior medical officers, working group leaders and hospital staff. Development and implementation of the Balanced Scorecard. Twenty strategic objectives with corresponding Balanced Scorecard measures. A stepped approach from strategy development to implementation is presented to identify key themes for strategy development, draft a strategy map and develop strategic objectives and measures. The Balanced Scorecard, in combination with the EFQM model, is a useful tool to guide strategy development and implementation in health care organizations. As with other quality improvement and management tools not specifically developed for health care organizations, some adaptations are required to improve acceptability among professionals. The step-wise approach to strategy development and implementation presented here may support similar processes in comparable organizations.

  13. Comparative analysis and visualization of multiple collinear genomes

    PubMed Central

    2012-01-01

    Background Genome browsers are a common tool used by biologists to visualize genomic features including genes, polymorphisms, and many others. However, existing genome browsers and visualization tools are not well-suited to perform meaningful comparative analysis among a large number of genomes. With the increasing quantity and availability of genomic data, there is an increased burden to provide useful visualization and analysis tools for comparison of multiple collinear genomes such as the large panels of model organisms which are the basis for much of the current genetic research. Results We have developed a novel web-based tool for visualizing and analyzing multiple collinear genomes. Our tool illustrates genome-sequence similarity through a mosaic of intervals representing local phylogeny, subspecific origin, and haplotype identity. Comparative analysis is facilitated through reordering and clustering of tracks, which can vary throughout the genome. In addition, we provide local phylogenetic trees as an alternate visualization to assess local variations. Conclusions Unlike previous genome browsers and viewers, ours allows for simultaneous and comparative analysis. Our browser provides intuitive selection and interactive navigation about features of interest. Dynamic visualizations adjust to scale and data content making analysis at variable resolutions and of multiple data sets more informative. We demonstrate our genome browser for an extensive set of genomic data sets composed of almost 200 distinct mouse laboratory strains. PMID:22536897

  14. Managing and Communicating Operational Workflow: Designing and Implementing an Electronic Outpatient Whiteboard.

    PubMed

    Steitz, Bryan D; Weinberg, Stuart T; Danciu, Ioana; Unertl, Kim M

    2016-01-01

    Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings.

  15. Screening and Evaluation Tool (SET) Users Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pincock, Layne

    This document is the user's guide for the Screening and Evaluation Tool (SET). SET is a tool for comparing multiple fuel cycle options against a common set of criteria and metrics. It does this using standard multi-attribute utility decision analysis methods.
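
    A minimal sketch of the standard multi-attribute utility calculation (the criteria, weights, and option scores below are invented for illustration):

        # Each option's utility is a weighted sum of normalized criterion scores.
        weights = {"cost": 0.4, "safety": 0.35, "maturity": 0.25}
        options = {
            "fuel_cycle_A": {"cost": 0.6, "safety": 0.9, "maturity": 0.5},
            "fuel_cycle_B": {"cost": 0.8, "safety": 0.6, "maturity": 0.7},
        }

        def utility(scores, weights):
            return sum(weights[c] * scores[c] for c in weights)

        for name, scores in sorted(options.items()):
            print(name, round(utility(scores, weights), 3))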

  16. Bioenergy Knowledge Discovery Framework Fact Sheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The Bioenergy Knowledge Discovery Framework (KDF) supports the development of a sustainable bioenergy industry by providing access to a variety of data sets, publications, and collaboration and mapping tools that support bioenergy research, analysis, and decision making. In the KDF, users can search for information, contribute data, and use the tools and map interface to synthesize, analyze, and visualize information in a spatially integrated manner.

  17. Partial and Synchronized Captioning: A New Tool for Second Language Listening Development

    ERIC Educational Resources Information Center

    Mirzaei, Maryam Sadat; Akita, Yuya; Kawahara, Tatsuya

    2014-01-01

    This study investigates a novel method of captioning, partial and synchronized, as a listening tool for second language (L2) learners. In this method, the term partial and synchronized caption (PSC) pertains to the presence of a selected set of words in a caption where words are synced to their corresponding speech signal, using a state-of-the-art…

  18. Public biobanks: calculation and recovery of costs.

    PubMed

    Clément, Bruno; Yuille, Martin; Zaltoukal, Kurt; Wichmann, Heinz-Erich; Anton, Gabriele; Parodi, Barbara; Kozera, Lukasz; Bréchot, Christian; Hofman, Paul; Dagher, Georges

    2014-11-05

    A calculation grid developed by an international expert group was tested across biobanks in six countries to evaluate costs for collections of various types of biospecimens. The assessment yielded a tool for setting specimen-access prices that were transparently related to biobank costs, and the tool was applied across three models of collaborative partnership. Copyright © 2014, American Association for the Advancement of Science.

  19. A Cultural-Historical Reading of How Play Is Used in Families as a Tool for Supporting Children's Emotional Development in Everyday Life

    ERIC Educational Resources Information Center

    Chen, Feiyan; Fleer, Marilyn

    2016-01-01

    Many studies have identified the positive "link" between imaginary play and emotion regulation in laboratory settings. However, little is known about "how" play and emotion regulation are related in everyday practice. This article examines how families use play as a tool to support young children's emotion regulation in…

  20. A New Roman World: Using Virtual Reality Technology as a Critical Teaching Tool.

    ERIC Educational Resources Information Center

    Kuo, Elaine W.; Levis, Marc R.

    The purpose of this study is to examine how technology, namely virtual reality (VR), can be developed as a critical pedagogical tool. More specifically, the study explores whether the use of VR can challenge the traditional lecture format and make the classroom a more student-centered environment. In this instance, VR is defined as a set of…

  1. Contingency diagrams as teaching tools

    PubMed Central

    Mattaini, Mark A.

    1995-01-01

    Contingency diagrams are particularly effective teaching tools, because they provide a means for students to view the complexities of contingency networks present in natural and laboratory settings while displaying the elementary processes that constitute those networks. This paper sketches recent developments in this visualization technology and illustrates approaches for using contingency diagrams in teaching. PMID:22478208

  2. Operational Assessment of Tools for Accelerating Leader Development (ALD): Volume 1, Capstone Report

    DTIC Science & Technology

    2009-06-01

    in units and user juries provided feedback on the tools. The pressures of the operational environment seriously limited the time available to work...following functions: account set-up, user authentication, learning management, usage monitoring, problem reporting, assessment data collection, data...especially sources of data) represented—demonstration/assessment manager, operations manager, Web site experts, users (target audience), data collectors

  3. Modeling small cell lung cancer (SCLC) biology through deterministic and stochastic mathematical models.

    PubMed

    Salgia, Ravi; Mambetsariev, Isa; Hewelt, Blake; Achuthan, Srisairam; Li, Haiqing; Poroyko, Valeriy; Wang, Yingyu; Sattler, Martin

    2018-05-25

    Mathematical cancer models are immensely powerful tools that are based in part on the fractal nature of biological structures, such as the geometry of the lung. Cancers of the lung provide an opportune model in which to develop and apply algorithms that capture changes and disease phenotypes. We reviewed mathematical models that have been developed for the biological sciences and applied them in the context of small cell lung cancer (SCLC) growth, mutational heterogeneity, and mechanisms of metastasis. The ultimate goal is to model the stochastic and deterministic nature of this disease, to link this comprehensive set of tools back to its fractal nature, and to provide a platform for accurate biomarker development. These techniques may be particularly useful in the context of drug development research, for example in combination with existing omics approaches. The integration of these tools will be important to further understand the biology of SCLC and ultimately develop novel therapeutics.
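
    The deterministic/stochastic pairing the authors describe can be illustrated by putting a logistic growth ODE beside a Gillespie-style birth-death simulation with matching rates; the parameters here are arbitrary:

        import numpy as np

        b, d, K = 0.4, 0.1, 1000.0    # birth rate, death rate, carrying capacity

        def logistic(n0, t_end, dt=0.05):
            n = n0
            for _ in range(int(t_end / dt)):
                n += dt * n * (b - d) * (1 - n / K)   # deterministic mean growth
            return n

        def gillespie(n, t_end, rng):
            t = 0.0
            while t < t_end and n > 0:
                birth = b * n * max(0.0, 1 - n / K)
                death = d * n
                t += rng.exponential(1 / (birth + death))
                n += 1 if rng.random() < birth / (birth + death) else -1
            return n

        rng = np.random.default_rng(0)
        print("deterministic:", round(logistic(10, 50.0)),
              "stochastic:", gillespie(10, 50.0, rng))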

  4. Supporting metabolomics with adaptable software: design architectures for the end-user.

    PubMed

    Sarpe, Vladimir; Schriemer, David C

    2017-02-01

    Large and disparate sets of LC-MS data are generated by modern metabolomics profiling initiatives, and while useful software tools are available to annotate and quantify compounds, the field requires continued software development in order to sustain methodological innovation. Advances in software development practices allow for a new paradigm in tool development for metabolomics, where increasingly the end-user can develop or redeploy utilities ranging from simple algorithms to complex workflows. Resources that provide an organized framework for development are described and illustrated with LC-MS processing packages that have leveraged their design tools. Full access to these resources depends in part on coding experience, but the emergence of workflow builders and pluggable frameworks strongly reduces the skill level required. Developers in the metabolomics community are encouraged to use these resources and design content for uptake and reuse. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Identification, summary and comparison of tools used to measure organizational attributes associated with chronic disease management within primary care settings

    PubMed Central

    Lukewich, Julia; Corbin, Renée; VanDenKerkhof, Elizabeth G; Edge, Dana S; Williamson, Tyler; Tranmer, Joan E

    2014-01-01

    Rationale, aims and objectives Given the increasing emphasis being placed on managing patients with chronic diseases within primary care, there is a need to better understand which primary care organizational attributes affect the quality of care that patients with chronic diseases receive. This study aimed to identify, summarize and compare data collection tools that describe and measure organizational attributes used within the primary care setting worldwide. Methods A systematic search and review methodology was employed, consisting of a comprehensive and exhaustive search, based on a broad question, to identify the best available evidence. Results A total of 30 organizational attribute data collection tools that have been used within the primary care setting were identified. The tools varied with respect to overall focus and level of organizational detail captured, theoretical foundations, administration and completion methods, types of questions asked, and the extent to which psychometric property testing had been performed. The tools utilized within the Quality and Costs of Primary Care in Europe study and the Canadian Primary Health Care Practice-Based Surveys were the most recently developed tools. Furthermore, of the 30 tools reviewed, the Canadian Primary Health Care Practice-Based Surveys collected the most information on organizational attributes. Conclusions There is a need to collect primary care organizational attribute information at a national level to better understand the factors affecting the quality of chronic disease prevention and management across a given country. The data collection tools identified in this review can be used to establish data collection strategies to collect this important information. PMID:24840066

  6. FDTD simulation tools for UWB antenna analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
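
    For flavor, a one-dimensional Cartesian FDTD update loop with a Gaussian (UWB-like) pulse source; the paper's solver is spherical-coordinate and derived from first principles, so this is only a schematic analogue:

        import numpy as np

        nz, nt = 200, 400
        ez = np.zeros(nz)              # electric field
        hy = np.zeros(nz - 1)          # magnetic field (staggered grid)

        for t in range(nt):
            hy += np.diff(ez)                              # H-field update
            ez[1:-1] += np.diff(hy)                        # E-field update
            ez[nz // 2] += np.exp(-((t - 30) / 10) ** 2)   # Gaussian pulse source

        print("peak |Ez| =", np.abs(ez).max())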

  7. Multidisciplinary Optimization for Aerospace Using Genetic Optimization

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Hahn, Edward E.; Herrera, Claudia Y.

    2007-01-01

    In support of the ARMD guidelines, NASA's Dryden Flight Research Center is developing a multidisciplinary design and optimization tool. This tool will leverage existing tools and practices, and allow the easy integration and adoption of new state-of-the-art software. Optimization has made its way into many mainstream applications. For example, NASTRAN(TradeMark) has its solution sequence 200 for Design Optimization, and MATLAB(TradeMark) has an Optimization Toolbox. Other packages, such as the ZAERO(TradeMark) aeroelastic panel code and the CFL3D(TradeMark) Navier-Stokes solver, have no built-in optimizer. The goal of the tool development is to generate a central executive capable of using disparate software packages in a cross-platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. A provided figure (Figure 1) shows a typical set of tools and their relation to the central executive. Optimization can take place within each individual tool, or in a loop between the executive and the tool, or both.
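
    A toy genetic optimizer of the kind such a central executive could drive (the objective here stands in for an expensive analysis-tool run; none of this is NASA's code):

        import numpy as np

        def objective(x):              # placeholder for a NASTRAN/ZAERO run
            return np.sum((x - 0.7) ** 2)

        rng = np.random.default_rng(0)
        pop = rng.uniform(0, 1, size=(40, 5))
        for gen in range(100):
            fit = np.array([objective(ind) for ind in pop])
            i, j = rng.integers(0, 40, (2, 40))            # tournament selection
            parents = pop[np.where(fit[i] < fit[j], i, j)]
            cut = rng.integers(1, 5)                       # one-point crossover
            children = np.concatenate(
                [parents[:20, :cut], parents[20:, cut:]], axis=1)
            children += rng.normal(scale=0.05, size=children.shape)  # mutation
            pop = np.vstack([parents[:20], children])

        best = pop[np.argmin([objective(ind) for ind in pop])]
        print("best individual:", np.round(best, 2))       # approaches 0.7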

  8. Combinatorial therapy discovery using mixed integer linear programming.

    PubMed

    Pang, Kaifang; Wan, Ying-Wooi; Choi, William T; Donehower, Lawrence A; Sun, Jingchun; Pant, Dhruv; Liu, Zhandong

    2014-05-15

    Combinatorial therapies play increasingly important roles in combating complex diseases. Owing to the huge cost associated with experimental methods for identifying optimal drug combinations, computational approaches can provide a guide to limit the search space and reduce cost. However, few computational approaches have been developed for this purpose, and thus there is a great need for new algorithms for drug combination prediction. Here we propose to formulate the optimal combinatorial therapy problem as two complementary mathematical problems, Balanced Target Set Cover (BTSC) and Minimum Off-Target Set Cover (MOTSC). Given a disease gene set, BTSC seeks a balanced solution that maximizes the coverage of the disease genes and minimizes the off-target hits at the same time. MOTSC seeks full coverage of the disease gene set while minimizing the off-target set. Through simulation, both BTSC and MOTSC demonstrated a much faster running time than exhaustive search with the same accuracy. When applied to real disease gene sets, our algorithms not only identified known drug combinations, but also predicted novel drug combinations that are worth further testing. In addition, we developed a web-based tool to allow users to iteratively search for optimal drug combinations given a user-defined gene set. Our tool is freely available for noncommercial use at http://www.drug.liuzlab.org/. Contact: zhandong.liu@bcm.edu. Supplementary data are available at Bioinformatics online.
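
    The paper solves these problems exactly with MILP; purely to make the set-cover formulation concrete, here is the classic greedy approximation (gene and drug names invented):

        def greedy_cover(disease_genes, drug_targets):
            uncovered, chosen = set(disease_genes), []
            while uncovered:
                drug = max(drug_targets,
                           key=lambda d: len(drug_targets[d] & uncovered))
                if not drug_targets[drug] & uncovered:
                    break                  # remaining genes are not targetable
                chosen.append(drug)
                uncovered -= drug_targets[drug]
            return chosen, uncovered

        drugs = {"d1": {"g1", "g2"}, "d2": {"g2", "g3", "g4"}, "d3": {"g4"}}
        print(greedy_cover({"g1", "g2", "g3", "g4"}, drugs))
        # -> (['d2', 'd1'], set())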

  9. Analysis of the thermo-mechanical deformations in a hot forging tool by numerical simulation

    NASA Astrophysics Data System (ADS)

    L-Cancelos, R.; Varas, F.; Martín, E.; Viéitez, I.

    2016-03-01

    Although programs have been developed for the design of hot forging tools, their design is still largely based on the experience of the tool maker. This obliges the tool maker to build test matrices and correct their errors to minimize distortions in the forged piece. This phase prior to mass production consumes time and material resources, which makes the final product more expensive. Forging tools are usually made up of various parts in different grades of steel, which in turn have different mechanical properties and therefore suffer different degrees of strain. Furthermore, the tools used in hot forging are exposed to a thermal field that also induces strain or stress, depending on the degree of confinement of the piece. Therefore, the mechanical behaviour of the assembly is determined by the contact between the different pieces. Numerical simulation allows different configurations to be analysed and possible defects to be anticipated before tool making, thus reducing the costs of this preliminary phase. In order to improve the dimensional quality of the manufactured parts, the work presented here focuses on the application of a numerical model to a hot forging manufacturing process in order to predict the areas of the forging die subjected to large deformations. The thermo-mechanical model, developed and implemented with free software (Code-Aster), includes strains of thermal origin, strains during forge impact and contact effects. The numerical results are validated with experimental measurements in a tooling set that produces forged crankshafts for the automotive industry, and show good agreement with the experimental tests. The result is a very useful tool for the design of tooling sets for hot forging.

  10. Measuring ability to assess claims about treatment effects: the development of the 'Claim Evaluation Tools'.

    PubMed

    Austvoll-Dahlgren, Astrid; Semakula, Daniel; Nsangi, Allen; Oxman, Andrew David; Chalmers, Iain; Rosenbaum, Sarah; Guttersrud, Øystein

    2017-05-17

    To describe the development of the Claim Evaluation Tools, a set of flexible items to measure people's ability to assess claims about treatment effects. Methodologists and members of the community (including children) in Uganda, Rwanda, Kenya, Norway, the UK and Australia. In the iterative development of the items, we used purposeful sampling of people with training in research methodology, such as teachers of evidence-based medicine, as well as patients and members of the public from low-income and high-income countries. Development consisted of 4 processes: (1) determining the scope of the Claim Evaluation Tools and development of items; (2) expert item review and feedback (n=63); (3) cognitive interviews with children and adult end-users (n=109); and (4) piloting and administrative tests (n=956). The Claim Evaluation Tools database currently includes a battery of multiple-choice items. Each item begins with a scenario which is intended to be relevant across contexts, and which can be used for children (from age 10 and above), adult members of the public and health professionals. People with expertise in research methods judged the items to have face validity, and end-users judged them relevant and acceptable in their settings. In response to feedback from methodologists and end-users, we simplified some text, explained terms where needed, and redesigned formats and instructions. The Claim Evaluation Tools database is a flexible resource from which researchers, teachers and others can design measurement instruments to meet their own requirements. These evaluation tools are being managed and made freely available for non-commercial use (on request) through Testing Treatments interactive (testingtreatments.org). PACTR201606001679337 and PACTR201606001676150; Pre-results. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  11. NASA Instrument Cost/Schedule Model

    NASA Technical Reports Server (NTRS)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

    NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system level cost estimation tool; a subsystem level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves and a demonstration of the NICM tool suite.
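
    NICM's actual estimating equations and data are not reproduced in this abstract; the sketch below only illustrates the general pattern it mentions, a power-law cost estimating relationship fitted by regression with a bootstrap over instruments. All numbers are synthetic.

    ```python
    # Generic power-law CER, cost = a * mass^b, fitted by log-log least
    # squares, with a bootstrap to gauge uncertainty on the exponent.
    import numpy as np

    rng = np.random.default_rng(0)
    mass = rng.uniform(5, 200, size=40)                      # instrument mass [kg]
    cost = 2.0 * mass**0.8 * rng.lognormal(0, 0.2, size=40)  # synthetic cost [$M]

    def fit_cer(m, c):
        b, log_a = np.polyfit(np.log(m), np.log(c), 1)
        return np.exp(log_a), b

    a, b = fit_cer(mass, cost)

    # Bootstrap resampling of instruments.
    boots = []
    for _ in range(1000):
        idx = rng.integers(0, len(mass), len(mass))
        boots.append(fit_cer(mass[idx], cost[idx])[1])
    print(f"cost ~ {a:.2f} * mass^{b:.2f}, exponent 95% CI "
          f"[{np.percentile(boots, 2.5):.2f}, {np.percentile(boots, 97.5):.2f}]")
    ```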

  12. Setting research priorities by applying the combined approach matrix.

    PubMed

    Ghaffar, Abdul

    2009-04-01

    Priority setting in health research is a dynamic process. Different organizations and institutes have been working in the field of research priority setting for many years. In 1999 the Global Forum for Health Research presented a research priority setting tool called the Combined Approach Matrix or CAM. Since its development, the CAM has been successfully applied to set research priorities for diseases, conditions and programmes at global, regional and national levels. This paper briefly explains the CAM methodology and how it could be applied in different settings, giving examples and describing challenges encountered in the process of setting research priorities and providing recommendations for further work in this field. The construct and design of the CAM is explained along with the different steps needed, including planning and organization of a priority-setting exercise and how it could be applied in different settings. The application of the CAM is described using three examples. The first concerns setting research priorities for a global programme, the second describes application at the country level and the third concerns setting research priorities for diseases. Effective application of the CAM in different and diverse environments proves its utility as a tool for setting research priorities. Potential challenges encountered in the process of research priority setting are discussed and some recommendations for further work in this field are provided.

  13. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology that is described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and will present sample results from the application of the strategic analysis methodology to the Constellation Program lunar architecture.
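
    To illustrate what evaluating scenarios against figures of merit can look like in the simplest case, here is a toy weighted-sum sketch; the FOM names, weights, and scores are invented assumptions, not Constellation's actual metrics.

    ```python
    # Compare scenarios on pre-defined figures of merit (FOMs) with a simple
    # weighted sum. Each FOM is assumed normalized to [0, 1], higher is better.
    FOM_WEIGHTS = {"performance": 0.4, "affordability": 0.35, "risk": 0.25}

    scenarios = {
        "scenario_A": {"performance": 0.8, "affordability": 0.5, "risk": 0.7},
        "scenario_B": {"performance": 0.6, "affordability": 0.9, "risk": 0.6},
    }

    def weighted_score(foms):
        return sum(FOM_WEIGHTS[k] * v for k, v in foms.items())

    for name, foms in sorted(scenarios.items(), key=lambda s: -weighted_score(s[1])):
        print(f"{name}: {weighted_score(foms):.3f}")
    ```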

  14. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  15. Development and evaluation of a comprehensive clinical decision support taxonomy: comparison of front-end tools in commercial and internally developed electronic health record systems

    PubMed Central

    Sittig, Dean F; Ash, Joan S; Feblowitz, Joshua; Meltzer, Seth; McMullen, Carmit; Guappone, Ken; Carpenter, Jim; Richardson, Joshua; Simonaitis, Linas; Evans, R Scott; Nichol, W Paul; Middleton, Blackford

    2011-01-01

    Background Clinical decision support (CDS) is a valuable tool for improving healthcare quality and lowering costs. However, there is no comprehensive taxonomy of types of CDS and there has been limited research on the availability of various CDS tools across current electronic health record (EHR) systems. Objective To develop and validate a taxonomy of front-end CDS tools and to assess support for these tools in major commercial and internally developed EHRs. Study design and methods We used a modified Delphi approach with a panel of 11 decision support experts to develop a taxonomy of 53 front-end CDS tools. Based on this taxonomy, a survey on CDS tools was sent to a purposive sample of commercial EHR vendors (n=9) and leading healthcare institutions with internally developed state-of-the-art EHRs (n=4). Results Responses were received from all healthcare institutions and 7 of 9 EHR vendors (response rate: 85%). All 53 types of CDS tools identified in the taxonomy were found in at least one surveyed EHR system, but only 8 functions were present in all EHRs. Medication dosing support and order facilitators were the most commonly available classes of decision support, while expert systems (eg, diagnostic decision support, ventilator management suggestions) were the least common. Conclusion We developed and validated a comprehensive taxonomy of front-end CDS tools. A subsequent survey of commercial EHR vendors and leading healthcare institutions revealed a small core set of common CDS tools, but identified significant variability in the remainder of clinical decision support content. PMID:21415065

  16. Development of a Fiber Laser Welding Capability for the W76, MC4702 Firing Set

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samayoa, Jose

    2010-05-12

    Development work to implement a new welding system for a Firing Set is presented. The new system is significant because it represents the first use of fiber laser welding technology at the KCP. The work used Six-Sigma tools for weld characterization and to define process performance. Determinations of workable weld parameters and comparison to existing equipment were completed. Existing waveforms were replicated using an Arbitrary Pulse Generator (APG) to modulate the fiber laser’s continuous-wave (CW)-only output. Fiber laser weld process capability for a Firing Set is demonstrated.

  17. A method of genotyping by pedigree-based training-set for identification of QTLs associated with cucumber fruit size

    USDA-ARS?s Scientific Manuscript database

    Large sets of genomic data are becoming available for cucumber (Cucumis sativus), yet there is no tool for whole genome genotyping. Creation of saturated genetic maps depends on development of good markers. The present cucumber genetic maps are based on several hundreds of markers. However they are ...

  18. 3D Digital Legos for Teaching Security Protocols

    ERIC Educational Resources Information Center

    Yu, Li; Harrison, L.; Lu, Aidong; Li, Zhiwei; Wang, Weichao

    2011-01-01

    We have designed and developed a 3D digital Lego system as an education tool for teaching security protocols effectively in Information Assurance courses (Lego is a trademark of the LEGO Group. Here, we use it only to represent the pieces of a construction set.). Our approach applies the pedagogical methods learned from toy construction sets by…

  19. Clinical Guide to Music Therapy in Physical Rehabilitation Settings

    ERIC Educational Resources Information Center

    Wong, Elizabeth

    2004-01-01

    Elizabeth Wong, MT-BC presents tools and information designed to arm the entry-level music therapist (or an experienced MT-BC new to rehabilitation settings) with basic knowledge and materials to develop or work in a music therapy program treating people with stroke, brain injury, and those who are ventilator dependent. Ms. Wong offers goals and…

  20. Fast Multiscale Algorithms for Information Representation and Fusion

    DTIC Science & Technology

    2011-07-01

    We are also developing convenient command-line invocation tools in addition to the previously developed APIs. Various real-world data sets...This knowledge is important in geolocation applications where knowing whether a received signal is line-of-sight or not is necessary for the

  1. Developmental Inventories Using Illiterate Parents as Informants: Communicative Development Inventory (CDI) Adaptation for Two Kenyan Languages

    ERIC Educational Resources Information Center

    Alcock, K. J.; Rimba, K.; Holding, P.; Kitsao-Wekulo, P.; Abubakar, A.; Newton, C. R. J. C.

    2015-01-01

    Communicative Development Inventories (CDIs, parent-completed language development checklists) are a helpful tool to assess language in children who are unused to interaction with unfamiliar adults. Generally, CDIs are completed in written form, but in developing country settings parents may have insufficient literacy to complete them alone. We…

  2. Automated Estimation Of Software-Development Costs

    NASA Technical Reports Server (NTRS)

    Roush, George B.; Reini, William

    1993-01-01

    COSTMODL is an automated software-development cost-estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product to be developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as a function of a defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
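
    COSTMODL's own estimating equations are not given here, so as a hedged illustration of what such a tool computes, the sketch below uses the classic basic COCOMO form (effort = a * KLOC^b) with its published "organic-mode" constants; COSTMODL's calibration may differ.

    ```python
    # Basic COCOMO (organic mode): effort = 2.4 * KLOC^1.05 person-months,
    # schedule = 2.5 * effort^0.38 calendar months.
    def basic_cocomo(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
        """Return (person-months, schedule-months) for an 'organic' project."""
        effort = a * kloc**b       # person-months
        schedule = c * effort**d   # calendar months
        return effort, schedule

    effort, months = basic_cocomo(32)  # a hypothetical 32 KLOC product
    print(f"{effort:.1f} person-months over {months:.1f} months, "
          f"avg staff {effort/months:.1f}")
    ```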

  3. Modeling and Simulation of Phased Array Antennas to Support Next-Generation Satellite Design

    NASA Technical Reports Server (NTRS)

    Tchorowski, Nicole; Murawski, Robert; Manning, Robert; Fuentes, Michael

    2016-01-01

    Developing enhanced simulation capabilities has become a significant priority for the Space Communications and Navigation (SCaN) project at NASA as new space communications technologies are proposed to replace aging NASA communications assets, such as the Tracking and Data Relay Satellite System (TDRSS). When developing the architecture for these new space communications assets, it is important to develop updated modeling and simulation methodologies, such that competing architectures can be weighed against one another and the optimal path forward can be determined. There have been many simulation tools developed here at NASA for the simulation of single RF link budgets, or for the modeling and simulation of an entire network of spacecraft and their supporting SCaN network elements. However, the modeling capabilities are never fully complete and as new technologies are proposed, gaps are identified. One such gap is the ability to rapidly develop high fidelity simulation models of electronically steerable phased array systems. As future relay satellite architectures are proposed that include optical communications links, electronically steerable antennas will become more desirable due to the reduction in platform vibration introduced by mechanically steerable devices. In this research, we investigate how modeling of these antennas can be introduced into our overall simulation and modeling structure. The ultimate goal of this research is two-fold: first, to enable NASA engineers to model various proposed simulation architectures and determine which proposed architecture meets the given architectural requirements; second, given a set of communications link requirements for a proposed satellite architecture, to determine the optimal configuration for a phased array antenna. There is a variety of tools available that can be used to model phased array antennas. To meet our stated goals, the first objective of this research is to compare the subset of tools available to us, trading off the modeling fidelity of each tool against its simulation performance. When comparing several proposed architectures, higher-fidelity modeling may be desirable; however, when iterating a proposed set of communication link requirements across ranges of phased array configuration parameters, the practicality of performance becomes a significant requirement. In either case, a minimum simulation fidelity must be met, regardless of performance considerations, which will be discussed in this research. Given a suitable set of phased array modeling tools, this research then focuses on integration with current SCaN modeling and simulation tools. While properly modeling the antenna elements of a system is vital, this is only a small part of the end-to-end communication path between a satellite and the supporting ground station and/or relay satellite assets. To properly model a proposed simulation architecture, this toolset must be integrated with other commercial and government development tools, such that the overall architecture can be examined in terms of communications, reliability, and cost. In this research, integration with previously developed communication tools is investigated.
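
    The core quantity any phased-array model must capture is the array factor; the minimal numpy sketch below computes it for a uniform linear array with electronic beam steering. Element count, spacing, and steering angle are illustrative, not parameters of the SCaN toolset.

    ```python
    # Array factor of an N-element uniform linear array:
    # AF(theta) = | sum_n exp(j * k*d * n * (sin(theta) - sin(theta0))) | / N,
    # where the progressive phase shift steers the main beam to theta0.
    import numpy as np

    def array_factor(theta, n_elems=16, d_over_lambda=0.5, steer_deg=20.0):
        """Normalized array factor over angles theta [rad]."""
        k_d = 2 * np.pi * d_over_lambda
        steer = np.deg2rad(steer_deg)
        n = np.arange(n_elems)
        phase = np.outer(np.sin(theta) - np.sin(steer), n) * k_d
        return np.abs(np.exp(1j * phase).sum(axis=1)) / n_elems

    theta = np.deg2rad(np.linspace(-90, 90, 721))
    af = array_factor(theta)
    print(f"main beam at {np.rad2deg(theta[np.argmax(af)]):.1f} deg")  # ~20.0
    ```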

  4. Application of the GEM Inventory Data Capture Tools for Dynamic Vulnerability Assessment and Recovery Modelling

    NASA Astrophysics Data System (ADS)

    Verrucci, Enrica; Bevington, John; Vicini, Alessandro

    2014-05-01

    A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely-sensed imagery and statistically-sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes": statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project. These are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event), and 2) recovery monitoring and evaluation (post-event) are discussed, along with strategies for using the IDC Tools for these purposes. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, it is clear the IDCT tools have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities, and resilience trends can be inferred. Lastly, this work draws attention to the use of the IDCT suite as an education resource for inspiring and training new students and engineers in the field of disaster risk reduction.

  5. Children and Young People-Mental Health Safety Assessment Tool (CYP-MH SAT) study: Protocol for the development and psychometric evaluation of an assessment tool to identify immediate risk of self-harm and suicide in children and young people (10–19 years) in acute paediatric hospital settings

    PubMed Central

    Walker, Gemma M; Carter, Tim; Aubeeluck, Aimee; Witchell, Miranda; Coad, Jane

    2018-01-01

    Introduction Currently, no standardised, evidence-based assessment tool for assessing immediate self-harm and suicide in acute paediatric inpatient settings exists. Aim The aim of this study is to develop and test the psychometric properties of an assessment tool that identifies immediate risk of self-harm and suicide in children and young people (10–19 years) in acute paediatric hospital settings. Methods and analysis Development phase: This phase involved a scoping review of the literature to identify and extract items from previously published suicide and self-harm risk assessment scales. Using a modified electronic Delphi approach, these items will then be rated according to their relevance for assessment of immediate suicide or self-harm risk by expert professionals. Inclusion of items will be determined by 65%–70% consensus between raters. Subsequently, a panel of expert members will convene to determine the face validity, appropriate phrasing, item order and response format for the finalised items. Psychometric testing phase: The finalised items will be tested for validity and reliability through a multicentre, psychometric evaluation. Psychometric testing will be undertaken to determine the following: internal consistency, inter-rater reliability, convergent, divergent validity and concurrent validity. Ethics and dissemination Ethical approval was provided by the National Health Service East Midlands—Derby Research Ethics Committee (17/EM/0347) and full governance clearance received by the Health Research Authority and local participating sites. Findings from this study will be disseminated to professionals and the public via peer-reviewed journal publications, popular social media and conference presentations. PMID:29654046

  6. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in Visual Analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
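
    The data-parallel pattern described (shard documents across workers, merge partial results) can be sketched generically as below; this is ordinary Python multiprocessing for illustration, not the engine's actual implementation.

    ```python
    # Shard a document set across workers and merge partial term counts.
    from collections import Counter
    from multiprocessing import Pool

    def count_terms(docs):
        """Partial term frequencies for one shard of documents."""
        counts = Counter()
        for doc in docs:
            counts.update(doc.lower().split())
        return counts

    def parallel_term_counts(docs, n_workers=4):
        shards = [docs[i::n_workers] for i in range(n_workers)]
        with Pool(n_workers) as pool:
            partials = pool.map(count_terms, shards)
        total = Counter()
        for p in partials:
            total.update(p)
        return total

    if __name__ == "__main__":
        docs = ["gene expression in tumors", "tumor gene pathways"] * 1000
        print(parallel_term_counts(docs).most_common(3))
    ```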

  7. Detections of Propellers in Saturn's Rings using Machine Learning: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Gordon, Mitchell K.; Showalter, Mark R.; Odess, Jennifer; Del Villar, Ambi; LaMora, Andy; Paik, Jin; Lakhani, Karim; Sergeev, Rinat; Erickson, Kristen; Galica, Carol; Grayzeck, Edwin; Morgan, Thomas; Knopf, William

    2015-11-01

    We report on the initial analysis of the output of a tool designed to identify persistent, non-axisymmetric features in the rings of Saturn. This project introduces a new paradigm for scientific software development. The preliminary results include what appear to be new detections of propellers in the rings of Saturn. The Planetary Data System (PDS), working with the NASA Tournament Lab (NTL), Crowd Innovation Lab at Harvard University, and the Topcoder community at Appirio, Inc., under the umbrella “Cassini Rings Challenge”, sponsored a set of competitions employing crowd sourcing and machine learning to develop a tool which could be made available to the community at large. The Challenge was tackled by running a series of separate contests to solve individual tasks prior to the major machine learning challenge. Each contest comprised a set of requirements, a timeline, one or more prizes, and other incentives, and was posted by Appirio to the Topcoder Community. In the case of the machine learning challenge (a “Marathon Challenge” on the Topcoder platform), members competed against each other by submitting solutions that were scored in real time and posted to a public leader-board by a scoring algorithm developed by Appirio for this contest. The current version of the algorithm was run against ~30,000 of the highest resolution Cassini ISS images. That set included 668 images with a total of 786 features previously identified as propellers in the main rings. The tool identified 81% of those previously identified propellers. In a preliminary, close examination of 130 detections identified by the tool, we determined that of the 130 detections, 11 were previously identified propeller detections, 5 appear to be new detections of known propellers, and 4 appear to be detections of propellers which have not been seen previously. A total of 20 valid detections from 130 candidates implies a relatively high false positive rate, which we hope to reduce by further algorithm development. The machine learning aspect of the algorithm means that as our set of verified detections increases, so does the pool of “ground-truth” data used to train the algorithm for future use.

  8. Development and validation of microsatellite markers for Brachiaria ruziziensis obtained by partial genome assembly of Illumina single-end reads

    PubMed Central

    2013-01-01

    Background Brachiaria ruziziensis is one of the most important forage species planted in the tropics. The application of genomic tools to aid the selection of superior genotypes can provide support to B. ruziziensis breeding programs. However, there is a complete lack of information about the B. ruziziensis genome. Also, the availability of genomic tools, such as molecular markers, to support B. ruziziensis breeding programs is rather limited. Recently, next-generation sequencing technologies have been applied to generate sequence data for the identification of microsatellite regions and primer design. In this study, we present a first validated set of SSR markers for Brachiaria ruziziensis, selected from a de novo partial genome assembly of single-end Illumina reads. Results A total of 85,567 perfect microsatellite loci were detected in contigs with a minimum 10X coverage. We selected a set of 500 microsatellite loci identified in contigs with minimum 100X coverage for primer design and synthesis, and tested a subset of 269 primer pairs, 198 of which were polymorphic on 11 representative B. ruziziensis accessions. Descriptive statistics for these primer pairs are presented, as well as estimates of marker transferability to other relevant brachiaria species. Finally, a set of 11 multiplex panels containing the 30 most informative markers was validated and proposed for B. ruziziensis genetic analysis. Conclusions We show that the detection and development of microsatellite markers from genome assembled Illumina single-end DNA sequences is highly efficient. The developed markers are readily suitable for genetic analysis and marker assisted selection of Brachiaria ruziziensis. The use of this approach for microsatellite marker development is promising for species with limited genomic information, whose breeding programs would benefit from the use of genomic tools. To our knowledge, this is the first set of microsatellite markers developed for this important species. PMID:23324172
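
    The kind of perfect-microsatellite scan the study ran before primer design can be sketched with a simple regular expression; the regex criterion and thresholds below are illustrative assumptions, not the authors' pipeline.

    ```python
    # Detect perfect microsatellites (SSRs): a 1-6 bp motif repeated at least
    # 4 times and spanning >= 12 bp, as a simple illustrative criterion.
    import re

    SSR_RE = re.compile(r"((?:[ACGT]{1,6}?))\1{3,}")

    def find_ssrs(seq, min_len=12):
        for m in SSR_RE.finditer(seq.upper()):
            motif, span = m.group(1), m.end() - m.start()
            if span >= min_len:
                yield m.start(), motif, span

    contig = "TTGCACACACACACACAGGATCGATCGATCGATCGTTA"
    for pos, motif, span in find_ssrs(contig):
        print(f"pos={pos} motif={motif} length={span}")
    # pos=3 motif=CA length=14
    # pos=19 motif=ATCG length=16
    ```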

  9. Development and content validation of the power mobility training tool.

    PubMed

    Kenyon, Lisa K; Farris, John P; Cain, Brett; King, Emily; VandenBerg, Ashley

    2018-01-01

    This paper outlines the development and content validation of the power mobility training tool (PMTT), an observational tool designed to assist therapists in developing power mobility training programs for children who have multiple, severe impairments. Initial items on the PMTT were developed based on a literature review and in consultation with therapists experienced in the use of power mobility. Items were trialled in clinical settings, reviewed, and refined. Items were then operationalized and an administration manual detailing scoring for each item was created. Qualitative and quantitative methods were used to establish content validity via a 15-member, international expert panel. The content validity ratio (CVR) was determined for each possible item. Of the 19 original items, 10 achieved minimum required CVR values and were included in the final version of the PMTT. Items related to manoeuvring a power mobility device were merged, and an item related to the number of switches used concurrently to operate a power mobility device was added to the PMTT. The PMTT may assist therapists in developing training programs that facilitate the acquisition of beginning power mobility skills in children who have multiple, severe impairments. Implications for Rehabilitation The Power Mobility Training Tool (PMTT) was developed to help guide the development of power mobility intervention programs for children who have multiple, severe impairments. The PMTT can be used with children who access a power mobility device using either a joystick or a switch. Therapists who have limited experience with power mobility may find the PMTT to be helpful in setting up and conducting power mobility training interventions as a feasible aspect of a plan of care for children who have multiple, severe impairments.
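
    The content validity ratio the panel used has a standard closed form (Lawshe, 1975): CVR = (n_e - N/2) / (N/2), where n_e is the number of experts rating an item "essential" and N is the panel size. The sketch below applies it to a 15-member panel; the item names and ratings are made up.

    ```python
    # CVR per item; Lawshe's critical value for a 15-member panel is commonly
    # cited as 0.49, so items at or above it are retained.
    def cvr(n_essential, n_panel):
        half = n_panel / 2
        return (n_essential - half) / half

    N = 15
    CRITICAL = 0.49

    ratings = {"steers around obstacle": 13, "activates switch": 12, "uses 2 switches": 7}
    for item, n_e in ratings.items():
        value = cvr(n_e, N)
        print(f"{item}: CVR={value:.2f} {'keep' if value >= CRITICAL else 'drop'}")
    ```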

  10. Assessment of nursing workload in adult psychiatric inpatient units: a scoping review.

    PubMed

    Sousa, C; Seabra, P

    2018-05-16

    No systematic reviews on measurement tools in adult psychiatric inpatient settings exist in the literature, and thus, further research is required on ways to identify approaches to calculate safe nurse staffing levels based on patients' care needs in adult psychiatric inpatient units. To identify instruments that enable an assessment of nursing workload in psychiatric settings. Method A scoping review was conducted. Four studies were identified, with five instruments used to support the calculation of staff needs and workload. All four studies present methodological limitations. Two instruments have already been adapted to this specific context, but validation studies are lacking. The findings indicate that the tools used to evaluate nursing workload in these settings require further development, with the concomitant need for more research to clarify the definition of nursing workload as well as to identify factors with the greatest impact on nursing workload. This review highlights the need to develop tools to assess workload in psychiatric inpatient units that embrace patient-related and non-patient-related activities. The great challenge is to enable a sensitive perception of workload resulting from nurses' psychotherapeutic interventions, an important component of treatment for many patients. This article is protected by copyright. All rights reserved.

  11. Evaluating online diagnostic decision support tools for the clinical setting.

    PubMed

    Pryor, Marie; White, David; Potter, Bronwyn; Traill, Roger

    2012-01-01

    Clinical decision support tools available at the point of care are an effective adjunct to support clinicians to make clinical decisions and improve patient outcomes. We developed a methodology and applied it to evaluate commercially available online clinical diagnostic decision support (DDS) tools for use at the point of care. We identified 11 commercially available DDS tools and assessed these against an evaluation instrument that included 6 categories: general information, content, quality control, search, clinical results and other features. We developed diagnostically challenging clinical case scenarios based on real patient experience that were commonly missed by junior medical staff. The evaluation was divided into 2 phases: an initial evaluation of all identified and accessible DDS tools conducted by the Clinical Information Access Portal (CIAP) team, and a second phase that further assessed the top 3 tools identified in the initial evaluation phase. An evaluation panel consisting of senior and junior medical clinicians from NSW Health conducted the second phase. Of the eleven tools that were assessed against the evaluation instrument, only 4 tools completely met the DDS definition that was adopted for this evaluation and were able to produce a differential diagnosis. From the initial phase of the evaluation, 4 DDS tools scored 70% or more (maximum score 96%) for the content category, 8 tools scored 65% or more (maximum 100%) for the quality control category, 5 tools scored 65% or more (maximum 94%) for the search category, and 4 tools scored 70% or more (maximum 81%) for the clinical results category. The second phase of the evaluation was focused on assessing diagnostic accuracy for the top 3 tools identified in the initial phase. Best Practice ranked highest overall against the 6 clinical case scenarios used. Overall, the differentiating factor between the top 3 DDS tools was determined by diagnostic accuracy ranking, ease of use and the confidence and credibility of the clinical information. The evaluation methodology used here to assess the quality and comprehensiveness of clinical DDS tools was effective in identifying the most appropriate tool for the clinical setting. The use of clinical case scenarios is fundamental in determining the diagnostic accuracy and usability of the tools.

  12. Testing and refining the Science in Risk Assessment and Policy (SciRAP) web-based platform for evaluating the reliability and relevance of in vivo toxicity studies.

    PubMed

    Beronius, Anna; Molander, Linda; Zilliacus, Johanna; Rudén, Christina; Hanberg, Annika

    2018-05-28

    The Science in Risk Assessment and Policy (SciRAP) web-based platform was developed to promote and facilitate structure and transparency in the evaluation of ecotoxicity and toxicity studies for hazard and risk assessment of chemicals. The platform includes sets of criteria and a colour-coding tool for evaluating the reliability and relevance of individual studies. The SciRAP method for evaluating in vivo toxicity studies was first published in 2014 and the aim of the work presented here was to evaluate and develop that method further. Toxicologists and risk assessors from different sectors and geographical areas were invited to test the SciRAP criteria and tool on a specific set of in vivo toxicity studies and to provide feedback concerning the scientific soundness and user-friendliness of the SciRAP approach. The results of this expert assessment were used to refine and improve both the evaluation criteria and the colour-coding tool. It is expected that the SciRAP web-based platform will continue to be developed and enhanced to keep up to date with the needs of end-users. Copyright © 2018 John Wiley & Sons, Ltd.

  13. The ontology life cycle: Integrated tools for editing, publishing, peer review, and evolution of ontologies

    PubMed Central

    Noy, Natalya; Tudorache, Tania; Nyulas, Csongor; Musen, Mark

    2010-01-01

    Ontologies have become a critical component of many applications in biomedical informatics. However, the landscape of the ontology tools today is largely fragmented, with independent tools for ontology editing, publishing, and peer review: users develop an ontology in an ontology editor, such as Protégé; and publish it on a Web server or in an ontology library, such as BioPortal, in order to share it with the community; they use the tools provided by the library or mailing lists and bug trackers to collect feedback from users. In this paper, we present a set of tools that bring the ontology editing and publishing closer together, in an integrated platform for the entire ontology lifecycle. This integration streamlines the workflow for collaborative development and increases integration between the ontologies themselves through the reuse of terms. PMID:21347039

  14. Assessing fracture risk in people with MS: a service development study comparing three fracture risk scoring systems

    PubMed Central

    Dobson, Ruth; Leddy, Sara Geraldine; Gangadharan, Sunay; Giovannoni, Gavin

    2013-01-01

    Objectives Suboptimal bone health is increasingly recognised as an important cause of morbidity. Multiple sclerosis (MS) has been consistently associated with an increased risk of osteoporosis and fracture. Various fracture risk screening tools have been developed, two of which are in routine use and a further one is MS-specific. We set out to compare the results obtained by these in the MS clinic population. Design This was a service development study. The 10-year risk estimates of any fracture and hip fracture generated by each of the algorithms were compared. Setting The MS clinic at the Royal London Hospital. Participants 88 patients with a confirmed diagnosis of MS. Outcome measures Mean 10-year overall fracture risk and hip fracture risk were calculated using each of the three fracture risk calculators. The number of interventions that would be required as a result of using each of these tools was also compared. Results Mean 10-year fracture risk was 4.7%, 2.3% and 7.6% using FRAX, QFracture and the MS-specific calculator, respectively (p<0.0001 for difference). The agreement between risk scoring tools was poor at all levels of fracture risk. Conclusions The agreement between these three fracture risk scoring tools is poor in the MS population. Further work is required to develop and validate an accurate fracture risk scoring system for use in MS. Trial registration This service development study was approved by the Clinical Effectiveness Department at Barts Health NHS Trust (project registration number 156/12). PMID:23482989

  15. Comprehensive, powerful, efficient, intuitive: a new software framework for clinical imaging applications

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Hanson, Dennis P.; Robb, Richard A.

    2006-03-01

    One of the greatest challenges for a software engineer is to create a complex application that is comprehensive enough to be useful to a diverse set of users, yet focused enough for individual tasks to be carried out efficiently with minimal training. This "powerful yet simple" paradox is particularly prevalent in advanced medical imaging applications. Recent research in the Biomedical Imaging Resource (BIR) at Mayo Clinic has been directed toward development of an imaging application framework that provides powerful image visualization/analysis tools in an intuitive, easy-to-use interface. It is based on two concepts very familiar to physicians - Cases and Workflows. Each case is associated with a unique patient and a specific set of routine clinical tasks, or a workflow. Each workflow is comprised of an ordered set of general-purpose modules which can be re-used for each unique workflow. Clinicians help describe and design the workflows, and then are provided with an intuitive interface to both patient data and analysis tools. Since most of the individual steps are common to many different workflows, the use of general-purpose modules reduces development time and results in applications that are consistent, stable, and robust. While the development of individual modules may reflect years of research by imaging scientists, new customized workflows based on the new modules can be developed extremely fast. If a powerful, comprehensive application is difficult to learn and complicated to use, it will be unacceptable to most clinicians. Clinical image analysis tools must be intuitive and effective or they simply will not be used.
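
    To make the Cases-and-Workflows pattern concrete, here is a toy sketch; the class names, modules, and data are invented for illustration and do not reflect the BIR framework's actual implementation.

    ```python
    # A Case ties a patient to a Workflow; a Workflow is an ordered list of
    # reusable modules that each transform shared case state.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    Module = Callable[[Dict], Dict]

    def load_images(state: Dict) -> Dict:
        state["images"] = f"volumes for {state['patient_id']}"
        return state

    def segment_organ(state: Dict) -> Dict:
        state["segmentation"] = "organ mask"
        return state

    def measure_volume(state: Dict) -> Dict:
        state["report"] = "volume = 123 cc"
        return state

    @dataclass
    class Workflow:
        name: str
        modules: List[Module] = field(default_factory=list)

        def run(self, state: Dict) -> Dict:
            for module in self.modules:  # modules execute in fixed order
                state = module(state)
            return state

    @dataclass
    class Case:
        patient_id: str
        workflow: Workflow

        def execute(self) -> Dict:
            return self.workflow.run({"patient_id": self.patient_id})

    wf = Workflow("liver-volumetry", [load_images, segment_organ, measure_volume])
    print(Case("patient-042", wf).execute()["report"])
    ```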

  16. Gemi: PCR Primers Prediction from Multiple Alignments

    PubMed Central

    Sobhy, Haitham; Colson, Philippe

    2012-01-01

    Designing primers and probes for polymerase chain reaction (PCR) is a preliminary and critical step that requires the identification of highly conserved regions in a given set of sequences. This task can be challenging if the targeted sequences display a high level of diversity, as frequently encountered in microbiologic studies. We developed Gemi, an automated, fast, and easy-to-use bioinformatics tool with a user-friendly interface to design primers and probes based on multiple aligned sequences. This tool can be used for the purpose of real-time and conventional PCR and can deal efficiently with large sets of sequences of a large size. PMID:23316117

  17. Fermilab computing at the Intensity Frontier

    DOE PAGES

    Group, Craig; Fuess, S.; Gutsche, O.; ...

    2015-12-23

    The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I will focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. The experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

  18. Improving e-book access via a library-developed full-text search tool.

    PubMed

    Foust, Jill E; Bergen, Phillip; Maxeiner, Gretchen L; Pawlowski, Peter N

    2007-01-01

    This paper reports on the development of a tool for searching the contents of licensed full-text electronic book (e-book) collections. The Health Sciences Library System (HSLS) provides services to the University of Pittsburgh's medical programs and large academic health system. The HSLS has developed an innovative tool for federated searching of its e-book collections. Built using the XML-based Vivísimo development environment, the tool enables a user to perform a full-text search of over 2,500 titles from the library's seven most highly used e-book collections. From a single "Google-style" query, results are returned as an integrated set of links pointing directly to relevant sections of the full text. Results are also grouped into categories that enable more precise retrieval without reformulation of the search. A heuristic evaluation demonstrated the usability of the tool and a web server log analysis indicated an acceptable level of usage. Based on its success, there are plans to increase the number of online book collections searched. This library's first foray into federated searching has produced an effective tool for searching across large collections of full-text e-books and has provided a good foundation for the development of other library-based federated searching products.

  19. Clinical code set engineering for reusing EHR data for research: A review.

    PubMed

    Williams, Richard; Kontopantelis, Evangelos; Buchan, Iain; Peek, Niels

    2017-06-01

    The construction of reliable, reusable clinical code sets is essential when re-using Electronic Health Record (EHR) data for research. Yet code set definitions are rarely transparent and their sharing is almost non-existent. There is a lack of methodological standards for the management (construction, sharing, revision and reuse) of clinical code sets which needs to be addressed to ensure the reliability and credibility of studies which use code sets. To review methodological literature on the management of sets of clinical codes used in research on clinical databases and to provide a list of best practice recommendations for future studies and software tools. We performed an exhaustive search for methodological papers about clinical code set engineering for re-using EHR data in research. This was supplemented with papers identified by snowball sampling. In addition, a list of e-phenotyping systems was constructed by merging references from several systematic reviews on this topic, and the processes adopted by those systems for code set management was reviewed. Thirty methodological papers were reviewed. Common approaches included: creating an initial list of synonyms for the condition of interest (n=20); making use of the hierarchical nature of coding terminologies during searching (n=23); reviewing sets with clinician input (n=20); and reusing and updating an existing code set (n=20). Several open source software tools (n=3) were discovered. There is a need for software tools that enable users to easily and quickly create, revise, extend, review and share code sets and we provide a list of recommendations for their design and implementation. Research re-using EHR data could be improved through the further development, more widespread use and routine reporting of the methods by which clinical codes were selected. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
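
    One common practice the review found (exploiting the hierarchy of a coding terminology during searching) can be sketched as a simple descendant expansion; the tiny parent-to-children map below uses ICD-9-style codes purely for illustration.

    ```python
    # Expand a seed code set with every descendant code in the hierarchy.
    HIERARCHY = {
        "250": ["250.0", "250.1"],       # toy slice of an ICD-9-like hierarchy
        "250.0": ["250.00", "250.01"],
        "250.1": [],
        "250.00": [], "250.01": [],
    }

    def expand_code_set(seed_codes):
        """Return the seed codes plus all of their descendants."""
        result, stack = set(), list(seed_codes)
        while stack:
            code = stack.pop()
            if code not in result:
                result.add(code)
                stack.extend(HIERARCHY.get(code, []))
        return result

    print(sorted(expand_code_set({"250"})))
    # ['250', '250.0', '250.00', '250.01', '250.1']
    ```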

  20. A Toolkit for bulk PCR-based marker design from next-generation sequence data: application for development of a framework linkage map in bulb onion (Allium cepa L.)

    PubMed Central

    2012-01-01

    Background Although modern sequencing technologies permit the ready detection of numerous DNA sequence variants in any organisms, converting such information to PCR-based genetic markers is hampered by a lack of simple, scalable tools. Onion is an example of an under-researched crop with a complex, heterozygous genome where genome-based research has previously been hindered by limited sequence resources and genetic markers. Results We report the development of generic tools for large-scale web-based PCR-based marker design in the Galaxy bioinformatics framework, and their application for development of next-generation genetics resources in a wide cross of bulb onion (Allium cepa L.). Transcriptome sequence resources were developed for the homozygous doubled-haploid bulb onion line ‘CUDH2150’ and the genetically distant Indian landrace ‘Nasik Red’, using 454™ sequencing of normalised cDNA libraries of leaf and shoot. Read mapping of ‘Nasik Red’ reads onto ‘CUDH2150’ assemblies revealed 16836 indel and SNP polymorphisms that were mined for portable PCR-based marker development. Tools for detection of restriction polymorphisms and primer set design were developed in BioPython and adapted for use in the Galaxy workflow environment, enabling large-scale and targeted assay design. Using PCR-based markers designed with these tools, a framework genetic linkage map of over 800cM spanning all chromosomes was developed in a subset of 93 F2 progeny from a very large F2 family developed from the ‘Nasik Red’ x ‘CUDH2150’ inter-cross. The utility of tools and genetic resources developed was tested by designing markers to transcription factor-like polymorphic sequences. Bin mapping these markers using a subset of 10 progeny confirmed the ability to place markers within 10 cM bins, enabling increased efficiency in marker assignment and targeted map refinement. The major genetic loci conditioning red bulb colour (R) and fructan content (Frc) were located on this map by QTL analysis. Conclusions The generic tools developed for the Galaxy environment enable rapid development of sets of PCR assays targeting sequence variants identified from Illumina and 454 sequence data. They enable non-specialist users to validate and exploit large volumes of next-generation sequence data using basic equipment. PMID:23157543

  1. A toolkit for bulk PCR-based marker design from next-generation sequence data: application for development of a framework linkage map in bulb onion (Allium cepa L.).

    PubMed

    Baldwin, Samantha; Revanna, Roopashree; Thomson, Susan; Pither-Joyce, Meeghan; Wright, Kathryn; Crowhurst, Ross; Fiers, Mark; Chen, Leshi; Macknight, Richard; McCallum, John A

    2012-11-19

    Although modern sequencing technologies permit the ready detection of numerous DNA sequence variants in any organisms, converting such information to PCR-based genetic markers is hampered by a lack of simple, scalable tools. Onion is an example of an under-researched crop with a complex, heterozygous genome where genome-based research has previously been hindered by limited sequence resources and genetic markers. We report the development of generic tools for large-scale web-based PCR-based marker design in the Galaxy bioinformatics framework, and their application for development of next-generation genetics resources in a wide cross of bulb onion (Allium cepa L.). Transcriptome sequence resources were developed for the homozygous doubled-haploid bulb onion line 'CUDH2150' and the genetically distant Indian landrace 'Nasik Red', using 454™ sequencing of normalised cDNA libraries of leaf and shoot. Read mapping of 'Nasik Red' reads onto 'CUDH2150' assemblies revealed 16836 indel and SNP polymorphisms that were mined for portable PCR-based marker development. Tools for detection of restriction polymorphisms and primer set design were developed in BioPython and adapted for use in the Galaxy workflow environment, enabling large-scale and targeted assay design. Using PCR-based markers designed with these tools, a framework genetic linkage map of over 800cM spanning all chromosomes was developed in a subset of 93 F(2) progeny from a very large F(2) family developed from the 'Nasik Red' x 'CUDH2150' inter-cross. The utility of tools and genetic resources developed was tested by designing markers to transcription factor-like polymorphic sequences. Bin mapping these markers using a subset of 10 progeny confirmed the ability to place markers within 10 cM bins, enabling increased efficiency in marker assignment and targeted map refinement. The major genetic loci conditioning red bulb colour (R) and fructan content (Frc) were located on this map by QTL analysis. The generic tools developed for the Galaxy environment enable rapid development of sets of PCR assays targeting sequence variants identified from Illumina and 454 sequence data. They enable non-specialist users to validate and exploit large volumes of next-generation sequence data using basic equipment.

  2. Mentoring as a Developmental Tool for Higher Education

    ERIC Educational Resources Information Center

    Knippelmeyer, Sheri A.; Torraco, Richard J.

    2007-01-01

    Higher education, a setting devoted to the enhancement of learning, inquiry, and development, continues to lack effective development for faculty. Mentoring relationships seek to provide enhancement, yet few mentoring programs exist. This literature review examines forms of mentoring, its benefits, barriers to implementation, means for successful…

  3. Analytical Tools for Space Suit Design

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach with emphasis on building prototypes then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  4. PLINK: A Tool Set for Whole-Genome Association and Population-Based Linkage Analyses

    PubMed Central

    Purcell, Shaun ; Neale, Benjamin ; Todd-Brown, Kathe ; Thomas, Lori ; Ferreira, Manuel A. R. ; Bender, David ; Maller, Julian ; Sklar, Pamela ; de Bakker, Paul I. W. ; Daly, Mark J. ; Sham, Pak C. 

    2007-01-01

    Whole-genome association studies (WGAS) bring new computational, as well as analytic, challenges to researchers. Many existing genetic-analysis tools are not designed to handle such large data sets in a convenient manner and do not necessarily exploit the new opportunities that whole-genome data bring. To address these issues, we developed PLINK, an open-source C/C++ WGAS tool set. With PLINK, large data sets comprising hundreds of thousands of markers genotyped for thousands of individuals can be rapidly manipulated and analyzed in their entirety. As well as providing tools to make the basic analytic steps computationally efficient, PLINK also supports some novel approaches to whole-genome data that take advantage of whole-genome coverage. We introduce PLINK and describe the five main domains of function: data management, summary statistics, population stratification, association analysis, and identity-by-descent estimation. In particular, we focus on the estimation and use of identity-by-state and identity-by-descent information in the context of population-based whole-genome studies. This information can be used to detect and correct for population stratification and to identify extended chromosomal segments that are shared identical by descent between very distantly related individuals. Analysis of the patterns of segmental sharing has the potential to map disease loci that contain multiple rare variants in a population-based linkage analysis. PMID:17701901
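
    To give a flavor of the identity-by-state sharing PLINK estimates at scale, here is a tiny hedged sketch on toy genotype vectors; this is not PLINK's code, just the standard per-SNP allele-sharing computation.

    ```python
    # Per SNP, two genotypes coded 0/1/2 (minor-allele counts) share
    # 2 - |g1 - g2| alleles out of 2; IBS is the mean proportion shared.
    import numpy as np

    def ibs_proportion(g1, g2):
        """Mean proportion of alleles shared identical-by-state across SNPs."""
        g1, g2 = np.asarray(g1), np.asarray(g2)
        shared = 2 - np.abs(g1 - g2)  # 2, 1, or 0 alleles shared per SNP
        return shared.mean() / 2

    a = [0, 1, 2, 1, 0, 2, 1, 1]
    b = [0, 1, 1, 1, 0, 2, 2, 0]
    print(f"IBS = {ibs_proportion(a, b):.3f}")  # about 0.81 for these toy data
    ```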

  5. iVirus: facilitating new insights in viral ecology with software and community data sets imbedded in a cyberinfrastructure.

    PubMed

    Bolduc, Benjamin; Youens-Clark, Ken; Roux, Simon; Hurwitz, Bonnie L; Sullivan, Matthew B

    2017-01-01

    Microbes affect nutrient and energy transformations throughout the world's ecosystems, yet they do so under viral constraints. In complex communities, viral metagenome (virome) sequencing is transforming our ability to quantify viral diversity and impacts. Although some bottlenecks, for example, few reference genomes and nonquantitative viromics, have been overcome, the void of centralized data sets and specialized tools now prevents viromics from being broadly applied to answer fundamental ecological questions. Here we present iVirus, a community resource that leverages the CyVerse cyberinfrastructure to provide access to viromic tools and data sets. The iVirus Data Commons contains both raw and processed data from 1866 samples and 73 projects derived from global ocean expeditions, as well as existing and legacy public repositories. Through the CyVerse Discovery Environment, users can interrogate these data sets using existing analytical tools (software applications known as 'Apps') for assembly, open reading frame prediction and annotation, as well as several new Apps specifically developed for analyzing viromes. Because Apps are web based and powered by CyVerse supercomputing resources, they enable scalable analyses for a broad user base. Finally, a use-case scenario documents how to apply these advances toward new data. This growing iVirus resource should help researchers utilize viromics as yet another tool to elucidate viral roles in nature.

  6. iVirus: facilitating new insights in viral ecology with software and community data sets imbedded in a cyberinfrastructure

    PubMed Central

    Bolduc, Benjamin; Youens-Clark, Ken; Roux, Simon; Hurwitz, Bonnie L; Sullivan, Matthew B

    2017-01-01

    Microbes affect nutrient and energy transformations throughout the world's ecosystems, yet they do so under viral constraints. In complex communities, viral metagenome (virome) sequencing is transforming our ability to quantify viral diversity and impacts. Although some bottlenecks, for example, few reference genomes and nonquantitative viromics, have been overcome, the void of centralized data sets and specialized tools now prevents viromics from being broadly applied to answer fundamental ecological questions. Here we present iVirus, a community resource that leverages the CyVerse cyberinfrastructure to provide access to viromic tools and data sets. The iVirus Data Commons contains both raw and processed data from 1866 samples and 73 projects derived from global ocean expeditions, as well as existing and legacy public repositories. Through the CyVerse Discovery Environment, users can interrogate these data sets using existing analytical tools (software applications known as ‘Apps') for assembly, open reading frame prediction and annotation, as well as several new Apps specifically developed for analyzing viromes. Because Apps are web based and powered by CyVerse supercomputing resources, they enable scalable analyses for a broad user base. Finally, a use-case scenario documents how to apply these advances toward new data. This growing iVirus resource should help researchers utilize viromics as yet another tool to elucidate viral roles in nature. PMID:27420028

  7. Computer Aided Detection of Breast Masses in Digital Tomosynthesis

    DTIC Science & Technology

    2008-06-01

    the suspicious CAD location were extracted. For the second set, 256x256 ROIs representing the summed slab of 5 slices (5 mm) were extracted...region hotelling observer, digital tomosynthesis, multi-slice CAD algorithms, biopsy...developing computer-aided detection (CAD) tools for mammography. Although these tools have shown promise in identifying calcifications, detecting

  8. Establishing User Needs--A Large-Scale Study into the Requirements of Those Involved in the Research Process

    ERIC Educational Resources Information Center

    Grimshaw, Shirley; Wilson, Ian

    2009-01-01

    The aim of the project was to develop a set of online tools, systems and processes that would facilitate research at the University of Nottingham. The tools would be delivered via a portal, a one-stop place providing a Virtual Research Environment for all those involved in the research process. A predominantly bottom-up approach was used with…

  9. TARA: Tool Assisted Requirements Analysis

    DTIC Science & Technology

    1988-05-01

    provided during the project and to aid tool integration. Chapter 6 provides a brief discussion of the experience of specifying the ASET case study in CORE...set of Prolog clauses. This includes the context-free grammar rules depicted in Figure 2.1, integrity constraints such as those defining the binding...Jeremaes (1986). This was developed originally for specifying database management semantics (for example, the preservation of integrity constraints

  10. Financial analysis of community-based forest enterprises with the Green Value tool

    Treesearch

    S. Humphries; Tom Holmes

    2016-01-01

    The Green Value tool was developed in response to the need for simplified procedures that could be used in the field to conduct financial analysis for community-based forest enterprises (CFEs). Initially our efforts focused on a set of worksheets that could be used by both researchers and CFEs to monitor and analyze costs and income for one production period. The...

  11. A GIS tool to analyze forest road sediment production and stream impacts

    Treesearch

    Ajay Prasad; David G. Tarboton; Charles H. Luce; Thomas A. Black

    2005-01-01

    A set of GIS tools to analyze the impacts of forest roads on streams, considering sediment production, mass wasting risk, and fish passage barriers, has been developed. Sediment production for each road segment is calculated from slope, length, road surface condition and road-side drain vegetation gathered by a GPS inventory, and by overlaying the road path on a Digital...
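
    As a purely hypothetical illustration of the factor-based style of per-segment calculation described above (the base rate and multipliers below are invented placeholders, not the published model):

        # Toy per-segment sediment estimate: a baseline erosion rate scaled by
        # hypothetical multipliers for surface condition and drain vegetation.
        # All numbers are placeholders for illustration only.
        BASE_RATE_KG_PER_M = 0.5          # hypothetical baseline (kg sediment / m road / yr)
        SURFACE_FACTOR = {"paved": 0.2, "gravel": 1.0, "native": 2.5}
        VEGETATION_FACTOR = {"bare": 1.8, "partial": 1.0, "full": 0.4}

        def segment_sediment(length_m, slope, surface, vegetation):
            """Sediment yield for one road segment (toy model)."""
            return (BASE_RATE_KG_PER_M * length_m * (1 + slope)
                    * SURFACE_FACTOR[surface] * VEGETATION_FACTOR[vegetation])

        print(segment_sediment(120.0, 0.08, "gravel", "partial"))   # -> 64.8 kg/yr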

  12. LENS: web-based lens for enrichment and network studies of human proteins

    PubMed Central

    2015-01-01

    Background Network analysis is a common approach for studying the genetic basis of diseases and biological pathways. When a set of genes is identified as being of interest in relation to a disease, say through a genome-wide association study (GWAS) or a gene expression study, these genes are typically analyzed in the context of their protein-protein interaction (PPI) networks. Further analysis is carried out to compute the enrichment of known pathways and disease associations in the network. Having tools for such analysis at the fingertips of biologists, without the requirement for computer programming or curation of data, would accelerate the characterization of genes of interest. Currently available tools do not integrate network and enrichment analysis and their visualizations, and most present results in formats that are not conducive to human cognition. Results We developed the tool Lens for Enrichment and Network Studies of human proteins (LENS), which performs network, pathway, and disease enrichment analyses on genes of interest to users. The tool creates a visualization of the network, provides easy-to-read statistics on network connectivity, and displays Venn diagrams with statistical significance values of the network's association with drugs, diseases, pathways, and GWASs. We used the tool to analyze gene sets related to craniofacial development, autism, and schizophrenia. Conclusion LENS is a web-based tool that requires no downloads or plugins. The tool is free, requires no login, and is available at http://severus.dbmi.pitt.edu/LENS. PMID:26680011
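
    Enrichment statistics of the kind LENS reports are commonly computed with a hypergeometric test; a minimal sketch using SciPy, shown here as a generic illustration rather than LENS's actual code:

        from scipy.stats import hypergeom

        def enrichment_p(universe, pathway, hits_in_pathway, gene_set):
            """P-value that a gene set overlaps a pathway this much by chance.

            universe: total number of annotated genes
            pathway: number of genes in the pathway
            hits_in_pathway: overlap between the gene set and the pathway
            gene_set: size of the user's gene set
            """
            # survival function at k-1 gives P(X >= hits_in_pathway)
            return hypergeom.sf(hits_in_pathway - 1, universe, pathway, gene_set)

        # toy numbers: 20,000 genes, a 150-gene pathway, a 40-gene input set, 8 overlaps
        print(enrichment_p(20000, 150, 8, 40))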

  13. Structure and software tools of AIDA.

    PubMed

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow for fast development of easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language gives the user an easy interface between user-defined data validation checks, or other user-defined code, and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA is terminal-independent and even, to a great extent, multilingual. Most of these features are table-driven; this allows on-line changes in terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible, code from these table definitions. By separating the AIDA software into a source and a run-time version, one is able to write implementation-specific code which can be selected and loaded by a special source loader that is part of the AIDA software. This feature is also useful for maintaining software on different sites and on different installations.

  14. Comparative Investigation on Tool Wear during End Milling of AISI H13 Steel with Different Tool Path Strategies

    NASA Astrophysics Data System (ADS)

    Adesta, Erry Yulian T.; Riza, Muhammad; Avicena

    2018-03-01

    Tool wear prediction plays a significant role in the machining industry for the proper planning and control of machining parameters and the optimization of cutting conditions. This paper aims to investigate the effect of two tool path strategies, contour-in and zigzag, on tool wear during the pocket milling process. The experiments were carried out on a CNC vertical machining centre using PVD-coated carbide inserts. Cutting speed, feed rate and depth of cut were varied. For this experiment with three factors at three levels, a Response Surface Methodology (RSM) design of experiments with a standard called Central Composite Design (CCD) was employed. The results obtained indicate that tool wear increases significantly at the higher range of feed per tooth, compared to cutting speed and depth of cut. This experimental result was then confirmed statistically by developing an empirical model. The prediction model for the response variable of tool wear for the contour-in strategy developed in this research shows good agreement with the experimental work.
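
    A second-order response surface of the kind a CCD supports can be fitted by ordinary least squares; the sketch below uses invented observations, not the study's measurements:

        import numpy as np

        # invented runs: cutting speed v (m/min), feed per tooth f (mm), depth of cut d (mm)
        runs = np.array([
            [100, 0.05, 0.5], [200, 0.05, 0.5], [100, 0.15, 0.5], [200, 0.15, 0.5],
            [100, 0.05, 1.5], [200, 0.05, 1.5], [100, 0.15, 1.5], [200, 0.15, 1.5],
            [100, 0.10, 1.0], [200, 0.10, 1.0], [150, 0.05, 1.0], [150, 0.15, 1.0],
            [150, 0.10, 0.5], [150, 0.10, 1.5],
        ])
        wear = np.array([0.06, 0.09, 0.18, 0.24, 0.08, 0.12, 0.22, 0.30,
                         0.11, 0.15, 0.09, 0.23, 0.12, 0.16])   # flank wear (mm), invented

        v, f, d = runs.T
        # full second-order model: intercept, linear, two-factor interaction, and squared terms
        design = np.column_stack([np.ones_like(v), v, f, d,
                                  v * f, v * d, f * d, v**2, f**2, d**2])
        coef, *_ = np.linalg.lstsq(design, wear, rcond=None)
        print(coef)   # coefficients of the empirical wear model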

  15. Chronic obstructive lung disease "expert system": validation of a predictive tool for assisting diagnosis.

    PubMed

    Braido, Fulvio; Santus, Pierachille; Corsico, Angelo Guido; Di Marco, Fabiano; Melioli, Giovanni; Scichilone, Nicola; Solidoro, Paolo

    2018-01-01

    The purposes of this study were the development and validation of an expert system (ES) aimed at supporting the diagnosis of chronic obstructive lung disease (COLD). A questionnaire and a WebFlex code were developed and validated in silico. An expert panel pilot validation on 60 cases and a clinical validation on 241 cases were performed. The questionnaire and code developed and validated in silico resulted in a suitable tool to support the medical diagnosis. The clinical validation of the ES was performed in an academic setting that included six different reference centers for respiratory diseases. The results of the ES, expressed as a score associated with the risk of suffering from COLD, were matched and compared with the final clinical diagnoses. A set of 60 patients was evaluated by a pilot expert panel validation with the aim of calculating the sample size for the clinical validation study. The concordance analysis between these preliminary ES scores and the diagnoses made by the experts indicated that the accuracy was 94.7% when both the experts and the system confirmed the COLD diagnosis and 86.3% when COLD was excluded. Based on these results, the sample size of the validation set was established at 240 patients. The clinical validation, performed on 241 patients, resulted in an ES accuracy of 97.5%, with a confirmed COLD diagnosis in 53.6% of the cases and an excluded COLD diagnosis in 32% of the cases. In 11.2% of cases, a diagnosis of COLD was made by the experts although the imaging results showed a potential concomitant disorder. The ES presented here (COLD-ES) is a safe and robust supporting tool for COLD diagnosis in primary care settings.

  16. Communication strategies and volunteer management for the IAU-OAD

    NASA Astrophysics Data System (ADS)

    Sankatsing Nava, Tibisay

    2015-08-01

    The IAU Office of Astronomy for Development will be developing a new communication strategy to promote its projects in a way that is relevant to stakeholders and the general public. Ideas include a magazine featuring best practices within the field of astronomy for development, and setting up a workflow of communication that integrates the different outputs of the office and effectively uses the information collection tools developed by OAD team members. To accomplish these tasks the OAD will also develop a community management strategy with existing tools to effectively harness the skills of OAD volunteers for communication purposes. This talk will discuss the new communication strategy of the OAD as well as the expanded community management plans.

  17. Development of the SAFE Checklist Tool for Assessing Site-Level Threats to Child Protection: Use of Delphi Methods and Application to Two Sites in India

    PubMed Central

    Betancourt, Theresa S.; Zuilkowski, Stephanie S.; Ravichandran, Arathi; Einhorn, Honora; Arora, Nikita; Bhattacharya Chakravarty, Aruna; Brennan, Robert T.

    2015-01-01

    Background The child protection community is increasingly focused on developing tools to assess threats to child protection and the basic security needs and rights of children and families living in adverse circumstances. Although tremendous advances have been made to improve measurement of individual child health status or household functioning for use in low-resource settings, little attention has been paid to a more diverse array of settings in which many children in adversity spend time and how context contributes to threats to child protection. The SAFE model posits that insecurity in any of the following fundamental domains threatens security in the others: Safety/freedom from harm; Access to basic physiological needs and healthcare; Family and connection to others; Education and economic security. Site-level tools are needed in order to monitor the conditions that can dramatically undermine or support healthy child growth, development and emotional and behavioral health. From refugee camps and orphanages to schools and housing complexes, site-level threats exist that are not well captured by commonly used measures of child health and well-being or assessments of single households (e.g., SDQ, HOME). Methods The present study presents a methodology and the development of a scale for assessing site-level child protection threats in various settings of adversity. A modified Delphi panel process was enhanced with two stages of expert review in core content areas as well as review by experts in instrument development, and field pilot testing. Results Field testing in two diverse sites in India—a construction site and a railway station—revealed that the resulting SAFE instrument was sensitive to the differences between the sites from the standpoint of core child protection issues. PMID:26540159

  18. Development of the SAFE Checklist Tool for Assessing Site-Level Threats to Child Protection: Use of Delphi Methods and Application to Two Sites in India.

    PubMed

    Betancourt, Theresa S; Zuilkowski, Stephanie S; Ravichandran, Arathi; Einhorn, Honora; Arora, Nikita; Bhattacharya Chakravarty, Aruna; Brennan, Robert T

    2015-01-01

    The child protection community is increasingly focused on developing tools to assess threats to child protection and the basic security needs and rights of children and families living in adverse circumstances. Although tremendous advances have been made to improve measurement of individual child health status or household functioning for use in low-resource settings, little attention has been paid to a more diverse array of settings in which many children in adversity spend time and how context contributes to threats to child protection. The SAFE model posits that insecurity in any of the following fundamental domains threatens security in the others: Safety/freedom from harm; Access to basic physiological needs and healthcare; Family and connection to others; Education and economic security. Site-level tools are needed in order to monitor the conditions that can dramatically undermine or support healthy child growth, development and emotional and behavioral health. From refugee camps and orphanages to schools and housing complexes, site-level threats exist that are not well captured by commonly used measures of child health and well-being or assessments of single households (e.g., SDQ, HOME). The present study presents a methodology and the development of a scale for assessing site-level child protection threats in various settings of adversity. A modified Delphi panel process was enhanced with two stages of expert review in core content areas as well as review by experts in instrument development, and field pilot testing. Field testing in two diverse sites in India-a construction site and a railway station-revealed that the resulting SAFE instrument was sensitive to the differences between the sites from the standpoint of core child protection issues.

  19. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture with a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible set of tools with a multi-user interface. This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.

  20. SafetyAnalyst

    DOT National Transportation Integrated Search

    2009-01-01

    This booklet provides an overview of SafetyAnalyst. SafetyAnalyst is a set of software tools under development to help State and local highway agencies advance their programming of site-specific safety improvements. SafetyAnalyst will incorporate sta...

  1. Corridor incident management (CIM)

    DOT National Transportation Integrated Search

    2007-09-01

    The objective of the Corridor Incident Management (CIM) research project was to develop and demonstrate a set of multi-purpose methods, tools and databases to improve corridor incident management in Tennessee, relying primarily on resources already a...

  2. Computer-assisted knowledge acquisition for hypermedia systems

    NASA Technical Reports Server (NTRS)

    Steuck, Kurt

    1990-01-01

    The usage of procedural and declarative knowledge to set up the structure or 'web' of a hypermedia environment is described. An automated knowledge acquisition tool was developed that helps a knowledge engineer elicit and represent an expert's knowledge involved in performing procedural tasks. The tool represents both procedural and prerequisite, declarative knowledge that supports each activity performed by the expert. This knowledge is output and subsequently read by a hypertext scripting language to generate the link between blank, but labeled cards. Each step of the expert's activity and each piece of supporting declarative knowledge is set up as an empty node. An instructional developer can then enter detailed instructional material concerning each step and declarative knowledge into these empty nodes. Other research is also described that facilitates the translation of knowledge from one form into a form more readily useable by computerized systems.
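
    A minimal sketch of the generation of empty, labeled nodes from elicited procedural and declarative knowledge (an illustration of the idea only; the original tool emitted hypertext scripts, and the labels below are invented):

        from dataclasses import dataclass, field

        @dataclass
        class Node:
            """An empty, labeled hypermedia card awaiting instructional content."""
            label: str
            content: str = ""                       # filled in later by a developer
            links: list = field(default_factory=list)

        def build_web(steps):
            """Create one node per procedural step plus one per supporting fact,
            linking each step to its prerequisites and to the next step."""
            web = []
            prev = None
            for step, facts in steps:
                node = Node(step)
                for fact in facts:
                    support = Node(fact)            # prerequisite declarative knowledge
                    node.links.append(support)
                    web.append(support)
                if prev:
                    prev.links.append(node)         # sequential procedural link
                web.append(node)
                prev = node
            return web

        web = build_web([("Check oil level", ["Engine must be cold"]),
                         ("Top up oil", ["Use SAE 5W-30"])])
        print([n.label for n in web])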

  3. Development and Evaluation of e-CA, an Electronic Mobile-Based Food Record

    PubMed Central

    Bucher Della Torre, Sophie; Carrard, Isabelle; Farina, Eddy; Danuser, Brigitta; Kruseman, Maaike

    2017-01-01

    Measures that capture diet as validly and reliably as possible are cornerstones of nutritional research, and mobile-based devices offer new opportunities to improve and simplify data collection. The balance between precision and acceptability of these data collection tools remains debated, and rigorous validations are warranted. Our objective was to develop and evaluate an electronic mobile-based food record for a research setting. We developed e-CA, which includes almost 900 foods and beverages classified in 14 categories and 60 subcategories. e-CA was evaluated using three different methods: (1) usability and acceptability through a logbook and qualitative interviews; (2) dietary intake accuracy through comparison with 2 unannounced 24-h phone recalls on overlapping days; and (3) reliability and process comparison with a paper-based food record in a laboratory setting with a randomized design. e-CA proved to be intuitive and practical and was perceived as modern, trendy, and fun. Comparisons of e-CA with 24-h telephone recalls or paper-based food records in a laboratory setting with two small convenience samples showed good agreement but highlighted the well-known difficulty of estimating portion sizes and a necessary learning time to use the app. e-CA is a functional tool that has the potential to facilitate food intake measurement for research by increasing the pleasure of using the food record tool and reducing the perceived burden for the participants. It also decreases the workload, costs and the risk of transcription errors for researchers. PMID:28106767

  4. Development and Evaluation of e-CA, an Electronic Mobile-Based Food Record.

    PubMed

    Bucher Della Torre, Sophie; Carrard, Isabelle; Farina, Eddy; Danuser, Brigitta; Kruseman, Maaike

    2017-01-18

    Measures that capture diet as validly and reliably as possible are cornerstones of nutritional research, and mobile-based devices offer new opportunities to improve and simplify data collection. The balance between precision and acceptability of these data collection tools remains debated, and rigorous validations are warranted. Our objective was to develop and evaluate an electronic mobile-based food record for a research setting. We developed e-CA, which includes almost 900 foods and beverages classified in 14 categories and 60 subcategories. e-CA was evaluated using three different methods: (1) usability and acceptability through a logbook and qualitative interviews; (2) dietary intake accuracy through comparison with 2 unannounced 24-h phone recalls on overlapping days; and (3) reliability and process comparison with a paper-based food record in a laboratory setting with a randomized design. e-CA proved to be intuitive and practical and was perceived as modern, trendy, and fun. Comparisons of e-CA with 24-h telephone recalls or paper-based food records in a laboratory setting with two small convenience samples showed good agreement but highlighted the well-known difficulty of estimating portion sizes and a necessary learning time to use the app. e-CA is a functional tool that has the potential to facilitate food intake measurement for research by increasing the pleasure of using the food record tool and reducing the perceived burden for the participants. It also decreases the workload, costs and the risk of transcription errors for researchers.

  5. The development and validation of the clinicians' awareness towards cognitive errors (CATChES) in clinical decision making questionnaire tool.

    PubMed

    Chew, Keng Sheng; Kueh, Yee Cheng; Abdul Aziz, Adlihafizi

    2017-03-21

    Despite the importance of cognitive errors to diagnostic accuracy, there is a paucity of literature on questionnaire tools to assess clinicians' awareness of them. A validation study was conducted to develop a questionnaire tool to evaluate the Clinicians' Awareness Towards Cognitive Errors (CATChES) in clinical decision making. This questionnaire is divided into two parts. Part A evaluates clinicians' awareness of cognitive errors in clinical decision making, while Part B evaluates their perception of specific cognitive errors. Content validation for both parts was determined first, followed by construct validation for Part A. Construct validation for Part B was not determined, as the responses were set in a dichotomous format. For content validation, all items in both Part A and Part B were rated as "excellent" in terms of their relevance in clinical settings. For construct validation using exploratory factor analysis (EFA) for Part A, a two-factor model with a total variance extraction of 60% was determined, and two items were deleted. The EFA was then repeated, showing that all factor loadings were above the cut-off value of 0.5. The Cronbach's alpha values for both factors were above 0.6. The CATChES questionnaire is a valid tool for evaluating awareness among clinicians of cognitive errors in clinical decision making.
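
    Cronbach's alpha, the internal-consistency statistic reported above, follows directly from the item and total-score variances; a minimal sketch with hypothetical Likert responses (not the study's data):

        import numpy as np

        def cronbach_alpha(items):
            """items: 2-D array, rows = respondents, columns = questionnaire items.
            alpha = k/(k-1) * (1 - sum of item variances / variance of total score)"""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        # hypothetical 5-point Likert responses from six clinicians on four items
        responses = [[4, 5, 4, 4],
                     [3, 4, 3, 3],
                     [5, 5, 4, 5],
                     [2, 3, 2, 2],
                     [4, 4, 5, 4],
                     [3, 3, 3, 4]]
        print(round(cronbach_alpha(responses), 3))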

  6. The Function Biomedical Informatics Research Network Data Repository.

    PubMed

    Keator, David B; van Erp, Theo G M; Turner, Jessica A; Glover, Gary H; Mueller, Bryon A; Liu, Thomas T; Voyvodic, James T; Rasmussen, Jerod; Calhoun, Vince D; Lee, Hyo Jong; Toga, Arthur W; McEwen, Sarah; Ford, Judith M; Mathalon, Daniel H; Diaz, Michele; O'Leary, Daniel S; Jeremy Bockholt, H; Gadde, Syam; Preda, Adrian; Wible, Cynthia G; Stern, Hal S; Belger, Aysenil; McCarthy, Gregory; Ozyurt, Burak; Potkin, Steven G

    2016-01-01

    The Function Biomedical Informatics Research Network (FBIRN) developed methods and tools for conducting multi-scanner functional magnetic resonance imaging (fMRI) studies. Method and tool development were based on two major goals: 1) to assess the major sources of variation in fMRI studies conducted across scanners, including instrumentation, acquisition protocols, challenge tasks, and analysis methods, and 2) to provide a distributed network infrastructure and an associated federated database to host and query large, multi-site, fMRI and clinical data sets. In the process of achieving these goals the FBIRN test bed generated several multi-scanner brain imaging data sets to be shared with the wider scientific community via the BIRN Data Repository (BDR). The FBIRN Phase 1 data set consists of a traveling subject study of 5 healthy subjects, each scanned on 10 different 1.5 to 4 T scanners. The FBIRN Phase 2 and Phase 3 data sets consist of subjects with schizophrenia or schizoaffective disorder along with healthy comparison subjects scanned at multiple sites. In this paper, we provide concise descriptions of FBIRN's multi-scanner brain imaging data sets and details about the BIRN Data Repository instance of the Human Imaging Database (HID) used to publicly share the data.

  7. Using intranet-based order sets to standardize clinical care and prepare for computerized physician order entry.

    PubMed

    Heffner, John E; Brower, Kathleen; Ellis, Rosemary; Brown, Shirley

    2004-07-01

    The high cost of computerized physician order entry (CPOE) and physician resistance to standardized care have delayed implementation. An intranet-based order set system can provide some of CPOE's benefits and offer opportunities to acculturate physicians toward standardized care. INTRANET CLINICIAN ORDER FORMS (COF): The COF system at the Medical University of South Carolina (MUSC) allows caregivers to enter and print orders through the intranet at points of care and to access decision support resources. Work on COF began in March 2000 with transfer of 25 MUSC paper-based order set forms to an intranet site. Physician groups developed additional order sets, which number more than 200. Web traffic increased progressively during a 24-month period, peaking at more than 6,400 hits per month to COF. Decision support tools improved compliance with Centers for Medicare & Medicaid Services core indicators. Clinicians demonstrated a willingness to develop and use order sets and decision support tools posted on the COF site. COF provides a low-cost method for preparing caregivers and institutions to adopt CPOE and standardization of care. The educational resources, relevant links to external resources, and communication alerts will all link to CPOE, thereby providing a head start in CPOE implementation.

  8. Indico central - events organisation, ergonomics and collaboration tools integration

    NASA Astrophysics Data System (ADS)

    Benito González López, José; Ferreira, José Pedro; Baron, Thomas

    2010-04-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, the result of a careful maturation process, includes improvements that will set a new reference in its domain. The presentation will focus on the description of the new features of the tool and on the user feedback process, which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration e-payment, audiovisual recording, webcast, room booking, and videoconference support).

  9. Research-IQ: Development and Evaluation of an Ontology-anchored Integrative Query Tool

    PubMed Central

    Borlawsky, Tara B.; Lele, Omkar; Payne, Philip R. O.

    2011-01-01

    Investigators in the translational research and systems medicine domains require highly usable, efficient and integrative tools and methods that allow for the navigation of and reasoning over emerging large-scale data sets. Such resources must cover a spectrum of granularity from bio-molecules to population phenotypes. Given such information needs, we report upon the initial design and evaluation of an ontology-anchored integrative query tool, Research-IQ, which employs a combination of conceptual knowledge engineering and information retrieval techniques to enable the intuitive and rapid construction of queries, in terms of semi-structured textual propositions, that can subsequently be applied to integrative data sets. Our initial results, based upon both quantitative and qualitative evaluations of the efficacy and usability of Research-IQ, demonstrate its potential to increase clinical and translational research throughput. PMID:21821150

  10. Cost effectiveness of pediatric pneumococcal conjugate vaccines: a comparative assessment of decision-making tools.

    PubMed

    Chaiyakunapruk, Nathorn; Somkrua, Ratchadaporn; Hutubessy, Raymond; Henao, Ana Maria; Hombach, Joachim; Melegaro, Alessia; Edmunds, John W; Beutels, Philippe

    2011-05-12

    Several decision support tools have been developed to aid policymaking regarding the adoption of pneumococcal conjugate vaccine (PCV) into national pediatric immunization programs. The lack of critical appraisal of these tools makes it difficult for decision makers to understand and choose between them. With the aim of guiding policymakers on their optimal use, we compared publicly available decision-making tools in relation to their methods, influential parameters and results. The World Health Organization (WHO) requested access to several publicly available cost-effectiveness (CE) tools for PCV from both public and private provenance. All tools were critically assessed according to the WHO's guide for economic evaluations of immunization programs. Key attributes and characteristics were compared and a series of sensitivity analyses was performed to determine the main drivers of the results. The results were compared based on a standardized set of input parameters and assumptions. Three cost-effectiveness modeling tools were provided, including two cohort-based (the Pan-American Health Organization (PAHO) ProVac Initiative TriVac, and PneumoADIP) and one population-based model (GlaxoSmithKline's SUPREMES). They all compared the introduction of PCV into a national pediatric immunization program with no PCV use. The models were different in terms of model attributes, structure, and data requirements, but captured a similar range of diseases. Herd effects were estimated using different approaches in each model. The main driving parameters were vaccine efficacy against pneumococcal pneumonia, vaccine price, vaccine coverage, serotype coverage and disease burden. With a standardized set of input parameters developed for cohort modeling, TriVac and PneumoADIP produced similar incremental costs and health outcomes, and incremental cost-effectiveness ratios. Vaccine cost (dose price and number of doses), vaccine efficacy and the epidemiology of critical endpoints (for example, incidence of pneumonia, distribution of serotypes causing pneumonia) were influential parameters in the models we compared. Understanding the differences and similarities of such CE tools through regular comparisons could render decision-making processes in different countries more efficient, as well as providing guiding information for further clinical and epidemiological research. A tool comparison exercise using standardized data sets can help model developers to be more transparent about their model structure and assumptions and provide analysts and decision makers with a more in-depth view behind the disease dynamics. Adherence to the WHO guide of economic evaluations of immunization programs may also facilitate this process. Please see related article: http://www.biomedcentral.com/1741-7007/9/55.

  11. Novel tool wear monitoring method in milling difficult-to-machine materials using cutting chip formation

    NASA Astrophysics Data System (ADS)

    Zhang, P. P.; Guo, Y.; Wang, B.

    2017-05-01

    The main problems in milling difficult-to-machine materials are the high cutting temperature and rapid tool wear; however, it is impossible to observe tool wear directly during machining. Tool wear and cutting chip formation are two of the most important indicators of machining efficiency and quality. The purpose of this paper is to develop a model relating tool wear to cutting chip formation (chip width and chip radian) for difficult-to-machine materials, so that tool wear can be monitored through chip formation. A milling experiment with three sets of cutting parameters was performed on a machining centre to obtain chip formation and tool wear data. The experimental results show that tool wear increases gradually as cutting proceeds, while chip width and chip radian decrease. The model is developed by fitting the experimental data and applying formula transformations. Most of the tool wear values monitored via chip formation have errors of less than 10%; the smallest error is 0.2%. Overall, errors based on chip radian are smaller than those based on chip width. This is a new way to monitor and detect tool wear from cutting chip formation when milling difficult-to-machine materials.
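
    The paper's indirect-monitoring idea, estimating wear from an observed chip dimension via a fitted empirical relationship, can be sketched with a simple least-squares fit; the numbers below are hypothetical, not the paper's measurements:

        import numpy as np

        # hypothetical measurements: chip width (mm) shrinks as flank wear (mm) grows
        chip_width = np.array([1.95, 1.88, 1.80, 1.71, 1.60, 1.52])
        flank_wear = np.array([0.05, 0.09, 0.14, 0.20, 0.27, 0.33])

        # fit wear as a linear function of chip width, giving an indirect monitor
        slope, intercept = np.polyfit(chip_width, flank_wear, 1)

        def wear_from_chip(width_mm):
            """Estimate flank wear indirectly from an observed chip width."""
            return slope * width_mm + intercept

        print(wear_from_chip(1.75))   # estimated wear at a 1.75 mm chip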

  12. Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Mittman, David S.; Shams, Khawaja S.; Bachmann, Andrew G.; Ludowise, Melissa

    2013-01-01

    This software simplifies the process of having to set up an Eclipse IDE programming environment for the members of the cross-NASA center project, Ensemble. It achieves this by assembling all the necessary add-ons and custom tools/preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and work on Ensemble software. The software automatically includes the source code repositories and other vital information and settings. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up in the correct manner for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error prone. This software is built once by a single user and tested, allowing other developers to simply download and use the software.

  13. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees

    PubMed Central

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-01-01

    A novel Protocol Ethics Tool Kit (‘Ethics Tool Kit’) has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. PMID:26811365

  14. A Tropical Marine Microbial Natural Products Geobibliography as an Example of Desktop Exploration of Current Research Using Web Visualisation Tools

    PubMed Central

    Mukherjee, Joydeep; Llewellyn, Lyndon E; Evans-Illidge, Elizabeth A

    2008-01-01

    Microbial marine biodiscovery is a recent scientific endeavour developing at a time when information and other technologies are also undergoing great technical strides. Global visualisation of datasets is now becoming available to the world through powerful and readily available software such as Worldwind™, ArcGIS Explorer™ and Google Earth™. Overlaying custom information upon these tools is within the hands of every scientist and more and more scientific organisations are making data available that can also be integrated into these global visualisation tools. The integrated global view that these tools enable provides a powerful desktop exploration tool. Here we demonstrate the value of this approach to marine microbial biodiscovery by developing a geobibliography that incorporates citations on tropical and near-tropical marine microbial natural products research with Google Earth™ and additional ancillary global data sets. The tools and software used are all readily available and the reader is able to use and install the material described in this article. PMID:19172194
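
    Overlaying custom citation data on Google Earth, as described above, is typically done with KML; a minimal standard-library sketch (the site and citation below are invented placeholders, not entries from the geobibliography):

        # Write a minimal KML file that pins one citation to a study site.
        # Coordinates and citation text are made-up placeholders.
        KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2">
          <Document>
            <Placemark>
              <name>{name}</name>
              <description>{citation}</description>
              <Point><coordinates>{lon},{lat},0</coordinates></Point>
            </Placemark>
          </Document>
        </kml>
        """

        with open("geobibliography.kml", "w", encoding="utf-8") as fh:
            fh.write(KML_TEMPLATE.format(
                name="Great Barrier Reef isolate",
                citation="Hypothetical et al. (2007), natural products from a sponge-derived actinomycete",
                lon=147.70, lat=-18.29))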

  15. Sentinel-2 ArcGIS Tool for Environmental Monitoring

    NASA Astrophysics Data System (ADS)

    Plesoianu, Alin; Cosmin Sandric, Ionut; Anca, Paula; Vasile, Alexandru; Calugaru, Andreea; Vasile, Cristian; Zavate, Lucian

    2017-04-01

    This paper addresses one of the biggest challenges regarding Sentinel-2 data: the need for an efficient tool to access and process the large collection of images that are available. Consequently, developing a tool for the automation of Sentinel-2 data analysis is the most immediate need. We developed a series of tools for the automation of Sentinel-2 data download and processing for vegetation health monitoring. The tools automatically perform the following operations: downloading image tiles from ESA's Scientific Hub or other vendors (Amazon), pre-processing the images to extract the 10-m bands, creating image composites, applying a series of vegetation indices (NDVI, OSAVI, etc.) and performing change detection analyses on different temporal data sets. All of these tools run dynamically in the ArcGIS Platform, without the need to create intermediate datasets (rasters, layers), as the images are processed on-the-fly to avoid data duplication. Finally, they allow complete integration with the ArcGIS environment and workflows.
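
    NDVI, one of the indices mentioned, is a simple band ratio over Sentinel-2's 10-m red (band 4) and near-infrared (band 8) reflectances; a minimal NumPy sketch, assuming the band arrays have already been read from the tiles:

        import numpy as np

        def ndvi(red, nir):
            """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
            red = np.asarray(red, dtype=float)
            nir = np.asarray(nir, dtype=float)
            # guard against division by zero over water/no-data pixels
            denom = np.where((nir + red) == 0, np.nan, nir + red)
            return (nir - red) / denom

        # toy 2x2 reflectance values standing in for Sentinel-2 band 4 and band 8
        red = [[0.05, 0.10], [0.20, 0.08]]
        nir = [[0.45, 0.30], [0.25, 0.40]]
        print(ndvi(red, nir))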

  16. School environment assessment tools to address behavioural risk factors of non-communicable diseases: A scoping review.

    PubMed

    Saluja, Kiran; Rawal, Tina; Bassi, Shalini; Bhaumik, Soumyadeep; Singh, Ankur; Park, Min Hae; Kinra, Sanjay; Arora, Monika

    2018-06-01

    We aimed to identify, describe and analyse school environment assessment (SEA) tools that address behavioural risk factors (unhealthy diet, physical inactivity, tobacco and alcohol consumption) for non-communicable diseases (NCD). We searched MEDLINE and Web of Science, hand-searched reference lists and contacted experts. Basic characteristics, measures assessed and measurement properties (validity, reliability, usability) of the identified tools were extracted. We narratively synthesized the data and used content analysis to develop a list of measures used in the SEA tools. Twenty-four SEA tools were identified, mostly from developed countries. Of these, 15 were questionnaire-based, 8 were checklist- or observation-based tools, and one tool used a combined checklist/observation and telephone questionnaire approach. Only 1 SEA tool had components related to all four NCD risk factors, 2 SEA tools assessed three NCD risk factors (diet/nutrition, physical activity, tobacco), 10 SEA tools assessed two NCD risk factors (diet/nutrition and physical activity) and 11 SEA tools assessed only one NCD risk factor. Several measures were used in the tools to assess the four NCD risk factors, but tobacco and alcohol were only sparingly included. Measurement properties were reported for 14 tools. The review provides a comprehensive list of measures used in SEA tools, which could be a valuable resource to guide future development of such tools. A valid and reliable SEA tool that can simultaneously evaluate all four NCD risk factors, and that has been tested in different settings with varying resource availability, is needed.

  17. High Temperature Logging and Monitoring Instruments to Explore and Drill Deep into Hot Oceanic Crust.

    NASA Astrophysics Data System (ADS)

    Denchik, N.; Pezard, P. A.; Ragnar, A.; Jean-Luc, D.; Jan, H.

    2014-12-01

    Drilling an entire section of the oceanic crust and through the Moho has been a goal of the scientific community for more than half a century. On the basis of ODP and IODP experience and data, this will require instruments and strategies working at temperatures far above 200°C (reached, for example, at the bottom of DSDP/ODP Hole 504B), and possibly beyond 300°C. Concerning logging and monitoring instruments, progress was made over the past ten years in the context of the HiTI ("High Temperature Instruments") project, funded by the European Community, for deep drilling in hot Icelandic geothermal holes where supercritical conditions and a highly corrosive environment are expected at depth (with temperatures above 374°C and pressures exceeding 22 MPa). For example, a slickline tool (memory tool) tolerating up to 400°C and wireline tools tolerating up to 300°C were developed and tested in Icelandic high-temperature geothermal fields. The temperature limitation of the logging tools was defined to comply with the present limitation of wireline cables (320°C). As part of this new set of downhole tools, temperature, pressure, fluid flow and casing collar location can be measured up to 400°C with a single multisensor tool. Tools for natural gamma radiation spectra, borehole wall ultrasonic imaging, and fiber optic cables (using distributed temperature sensing methods) were also developed for wireline deployment up to 300°C and tested in the field. A wireline dual laterolog electrical resistivity tool was also developed but could not be field tested as part of HiTI. This new set of tools constitutes a basis for the deep exploration of the oceanic crust in the future. In addition, new strategies including the real-time integration of drilling parameters with modeling of the thermo-mechanical status of the borehole could be developed, using time-lapse logging of temperature (for heat flow determination) and borehole wall images (for hole stability and in-situ stress determination) as boundary conditions for the models. In all, and with limited integration of existing tools, the deployment of high-temperature downhole tools could contribute largely to the success of the long-awaited Mohole project.

  18. Low Back Pain in 17 Countries, a Rasch Analysis of the ICF Core Set for Low Back Pain

    ERIC Educational Resources Information Center

    Roe, Cecilie; Bautz-Holter, Erik; Cieza, Alarcos

    2013-01-01

    Previous studies indicate that a worldwide measurement tool may be developed based on the International Classification of Functioning Disability and Health (ICF) Core Sets for chronic conditions. The aim of the present study was to explore the possibility of constructing a cross-cultural measurement of functioning for patients with low back pain…

  19. Visualizing planetary data by using 3D engines

    NASA Astrophysics Data System (ADS)

    Elgner, S.; Adeli, S.; Gwinner, K.; Preusker, F.; Kersten, E.; Matz, K.-D.; Roatsch, T.; Jaumann, R.; Oberst, J.

    2017-09-01

    We examined 3D gaming engines for their usefulness in visualizing large planetary image data sets. These tools allow us to include recent developments in the field of computer graphics in our scientific visualization systems and to present data products interactively and in higher quality than before. We have started to set up the first applications that will make use of virtual reality (VR) equipment.

  20. Moving from Analogue to High Definition e-Tools to Support Empowering Social Learning Approaches

    ERIC Educational Resources Information Center

    Charbonneau-Gowdy, Paula; Cechova, Ivana

    2009-01-01

    Traditional educational and training settings have dictated that the act of learning is an activity that is motivated by learners, directed by a teacher expert and based on information transfer and data manipulation. In this scenario, it has been assumed that learners more or less acquire knowledge or develop sets of skills as a result of such…

  1. SAGES: A Suite of Freely-Available Software Tools for Electronic Disease Surveillance in Resource-Limited Settings

    PubMed Central

    Lewis, Sheri L.; Feighner, Brian H.; Loschen, Wayne A.; Wojcik, Richard A.; Skora, Joseph F.; Coberly, Jacqueline S.; Blazes, David L.

    2011-01-01

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations. PMID:21572957

  2. KeyWare: an open wireless distributed computing environment

    NASA Astrophysics Data System (ADS)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist in LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple client multiple server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extensions of LAN-based management tools to include wireless nodes. A set of object oriented tools and methodologies enables direct asynchronous invocation of agent-based services supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  3. CUAHSI Data Services: Tools and Cyberinfrastructure for Water Data Discovery, Research and Collaboration

    NASA Astrophysics Data System (ADS)

    Seul, M.; Brazil, L.; Castronova, A. M.

    2017-12-01

    Enabling research surrounding interdisciplinary topics often requires a combination of finding, managing, and analyzing large data sets and models from multiple sources. This challenge has led the National Science Foundation to make strategic investments in developing community data tools and cyberinfrastructure that focus on water data, as water data is a central need for many of these research topics. CUAHSI (The Consortium of Universities for the Advancement of Hydrologic Science, Inc.) is a non-profit organization funded by the National Science Foundation to aid students, researchers, and educators in using and managing data and models to support research and education in the water sciences. This presentation will focus on open-source CUAHSI-supported tools that enable enhanced data discovery online using advanced searching capabilities and computational analysis run in virtual environments pre-designed for educators and scientists, so they can focus their efforts on data analysis rather than IT set-up.

  4. Use of software tools in the development of real time software systems

    NASA Technical Reports Server (NTRS)

    Garvey, R. C.

    1981-01-01

    The transformation of a preexisting software system into a larger and more versatile system with different mission requirements is discussed. The history of this transformation is used to illustrate the use of structured real time programming techniques and tools to produce maintainable and somewhat transportable systems. The predecessor system is a single ground diagnostic system; its purpose is to exercise a computer controlled hardware set prior to its deployment in its functional environment, as well as to test the equipment set by supplying certain well known stimuli. The successor system (FTF) is required to perform certain testing and control functions while this hardware set is in its functional environment. Both systems must deal with heavy user input/output loads, and a new I/O requirement is included in the design of the FTF system. Human factors are enhanced by adding an improved console interface and a special function keyboard handler. The additional features require the inclusion of much new software beyond the original set from which FTF was developed. As a result, it is necessary to split the system into a dual programming configuration with high rates of interground communications. A generalized information routing mechanism is used to support this configuration.

  5. Identification, summary and comparison of tools used to measure organizational attributes associated with chronic disease management within primary care settings.

    PubMed

    Lukewich, Julia; Corbin, Renée; VanDenKerkhof, Elizabeth G; Edge, Dana S; Williamson, Tyler; Tranmer, Joan E

    2014-12-01

    Given the increasing emphasis being placed on managing patients with chronic diseases within primary care, there is a need to better understand which primary care organizational attributes affect the quality of care that patients with chronic diseases receive. This study aimed to identify, summarize and compare data collection tools that describe and measure organizational attributes used within the primary care setting worldwide. Systematic search and review methodology consisting of a comprehensive and exhaustive search that is based on a broad question to identify the best available evidence was employed. A total of 30 organizational attribute data collection tools that have been used within the primary care setting were identified. The tools varied with respect to overall focus and level of organizational detail captured, theoretical foundations, administration and completion methods, types of questions asked, and the extent to which psychometric property testing had been performed. The tools utilized within the Quality and Costs of Primary Care in Europe study and the Canadian Primary Health Care Practice-Based Surveys were the most recently developed tools. Furthermore, of the 30 tools reviewed, the Canadian Primary Health Care Practice-Based Surveys collected the most information on organizational attributes. There is a need to collect primary care organizational attribute information at a national level to better understand factors affecting the quality of chronic disease prevention and management across a given country. The data collection tools identified in this review can be used to establish data collection strategies to collect this important information.

  6. A Python tool to set up relative free energy calculations in GROMACS.

    PubMed

    Klimovich, Pavel V; Mobley, David L

    2015-11-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen a tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of the present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper (LOMAP; Liu et al. in J Comput Aided Mol Des 27(9):755-770, 2013), recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge (Mobley et al. in J Comput Aided Mol Des 28(4):135-150, 2014). Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations.
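
    Relative free energy calculations of this kind rest on a thermodynamic cycle: the relative binding free energy of ligands A and B equals the difference between the alchemical transformation free energies in the bound and unbound legs. A toy bookkeeping example with invented numbers:

        # Thermodynamic cycle bookkeeping for a relative binding free energy.
        # ddG_bind(A->B) = dG_transform(complex) - dG_transform(solvent)
        # All values below are hypothetical, in kcal/mol.
        dG_transform_complex = -3.2   # alchemically mutate A->B inside the protein
        dG_transform_solvent = -1.1   # alchemically mutate A->B in water

        ddG_bind = dG_transform_complex - dG_transform_solvent
        print(f"ddG_bind(A->B) = {ddG_bind:+.1f} kcal/mol")   # -2.1: B binds tighter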

  7. Workplace wellness using online learning tools in a healthcare setting.

    PubMed

    Blake, Holly; Gartshore, Emily

    2016-09-01

    The aim was to develop and evaluate an online learning tool for use with UK healthcare employees, healthcare educators and healthcare students, to increase knowledge of workplace wellness as an important public health issue. A 'Workplace Wellness' e-learning tool was developed and peer-reviewed by 14 topic experts. This focused on six key areas relating to workplace wellness: work-related stress, musculoskeletal disorders, diet and nutrition, physical activity, smoking and alcohol consumption. Each key area provided current evidence-based information on causes and consequences, access to UK government reports and national statistics, and guidance on actions that could be taken to improve health within a workplace setting. 188 users (93.1% female, age 18-60) completed online knowledge questionnaires before (n = 188) and after (n = 88) exposure to the online learning tool. Baseline knowledge of workplace wellness was poor (n = 188; mean accuracy 47.6%, s.d. 11.94). Knowledge significantly improved from baseline to post-intervention (mean accuracy = 77.5%, s.d. 13.71) (t(75) = -14.801, p < 0.0005) with knowledge increases evident for all included topic areas. Usability evaluation showed that participants perceived the tool to be useful (96.4%), engaging (73.8%) and would recommend it to others (86.9%). Healthcare professionals, healthcare educators and pre-registered healthcare students held positive attitudes towards online learning, indicating scope for development of further online packages relating to other important health parameters. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Appreciative Inquiry and Implementation Science in Leadership Development.

    PubMed

    Bleich, Michael R; Hessler, Christine

    2016-05-01

    Appreciative inquiry was developed to initiate and animate change. As implementation science gains a foothold in practice settings to bridge theory, evidence, and practice, appreciative inquiry takes on new meaning as a leadership intervention and training tool. J Contin Educ Nurs. 2016;47(5):207-209. Copyright 2016, SLACK Incorporated.

  9. DEVELOPING A CONSISTENT DECISION-MAKING FRAMEWORK BY USING THE U.S. EPA'S TRACI

    EPA Science Inventory

    The most effective way to achieve long-term environmental results is through the use of a consistent set of metrics and decision-making framework. The U.S. EPA has developed TRACI, the Tool for the Reduction and Assessment of Chemical and other environmental Impacts, which allows...

  10. Moving at the Speed of Government: VIVO Implementation at EPA's Office of Research and Development

    EPA Science Inventory

    VIVO is a research and expertise discovery tool that supports collaboration across disciplines, geographic locations and organizational structures. This poster reviews the steps taken to set up an EPA/ORD VIVO instance including customization of the theme, data ingest and develop...

  11. GOSSIP, a New VO Compliant Tool for SED Fitting

    NASA Astrophysics Data System (ADS)

    Franzetti, P.; Scodeggio, M.; Garilli, B.; Fumana, M.; Paioro, L.

    2008-08-01

    We present GOSSIP (Galaxy Observed-Simulated SED Interactive Program), a new tool developed to perform SED fitting in a simple, user-friendly and efficient way. GOSSIP automatically builds up the observed SED of an object (or a large sample of objects) by combining magnitudes in different bands and, optionally, a spectrum; it then performs a χ^2 minimization fitting procedure against a set of synthetic models. The fitting results are used to estimate a number of physical parameters like the Star Formation History, absolute magnitudes, stellar mass and their Probability Distribution Functions. User-defined models can be used, but GOSSIP is also able to load models produced by the most commonly used population synthesis codes. GOSSIP can be used interactively with other visualization tools using the PLASTIC protocol for communications. Moreover, since it has been developed with large data set applications in mind, it will be extended to operate within the Virtual Observatory framework. GOSSIP is distributed to the astronomical community from the PANDORA group web site (http://cosmos.iasf-milano.inaf.it/pandora/gossip.html).
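
    For readers unfamiliar with the fitting step, the core of a χ^2 SED fit fits in a few lines. The sketch below is a generic illustration with synthetic numbers, not GOSSIP code; the analytic per-model scale factor is a standard device:

```python
# Generic chi^2 SED fit: compare observed fluxes against a grid of model
# SEDs, with one free normalization per model, and keep the best fit.
import numpy as np

def best_fit_model(obs_flux, obs_err, model_grid):
    """obs_flux, obs_err: (n_bands,); model_grid: (n_models, n_bands)."""
    chi2 = np.empty(len(model_grid))
    for i, model in enumerate(model_grid):
        # Scale factor minimizing chi^2 for this model (analytic solution).
        scale = np.sum(obs_flux * model / obs_err**2) / np.sum(model**2 / obs_err**2)
        chi2[i] = np.sum(((obs_flux - scale * model) / obs_err) ** 2)
    best = int(np.argmin(chi2))
    return best, chi2[best]

rng = np.random.default_rng(0)
models = rng.uniform(0.5, 2.0, size=(3, 5))       # 3 synthetic models, 5 bands
obs = 1.3 * models[1] + rng.normal(0, 0.05, 5)    # observed SED near model 1
print(best_fit_model(obs, np.full(5, 0.05), models))
```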

  12. Radiation Mitigation and Power Optimization Design Tools for Reconfigurable Hardware in Orbit

    NASA Technical Reports Server (NTRS)

    French, Matthew; Graham, Paul; Wirthlin, Michael; Wang, Li; Larchev, Gregory

    2005-01-01

    The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. In the second year of the project, design tools that leverage an established FPGA design environment have been created to visualize and analyze an FPGA circuit for radiation weaknesses and power inefficiencies. For radiation, a single-event upset (SEU) emulator, a persistence analysis tool, and a half-latch removal tool for Xilinx/Virtex-II devices have been created. Research is underway on a persistence mitigation tool and multiple-bit upset (MBU) studies. For power, synthesis-level dynamic power visualization and analysis tools have been completed. Power optimization tools are under development and preliminary test results are positive.

  13. Combining multiple tools outperforms individual methods in gene set enrichment analyses.

    PubMed

    Alhamdoosh, Monther; Ng, Milica; Wilson, Nicholas J; Sheridan, Julie M; Huynh, Huy; Wilson, Michael J; Ritchie, Matthew E

    2017-02-01

    Gene set enrichment (GSE) analysis allows researchers to efficiently extract biological insight from long lists of differentially expressed genes by interrogating them at a systems level. In recent years, there has been a proliferation of GSE analysis methods and hence it has become increasingly difficult for researchers to select an optimal GSE tool based on their particular dataset. Moreover, the majority of GSE analysis methods do not allow researchers to simultaneously compare gene set level results between multiple experimental conditions. The ensemble of gene set enrichment analyses (EGSEA) is a method developed for RNA-sequencing data that combines results from twelve algorithms and calculates collective gene set scores to improve the biological relevance of the highest ranked gene sets. EGSEA's gene set database contains around 25 000 gene sets from sixteen collections. It has multiple visualization capabilities that allow researchers to view gene sets at various levels of granularity. EGSEA has been tested on simulated data and on a number of human and mouse datasets and, based on biologists' feedback, consistently outperforms the individual tools that have been combined. Our evaluation demonstrates the superiority of the ensemble approach for GSE analysis, and its utility to effectively and efficiently extrapolate biological functions and potential involvement in disease processes from lists of differentially regulated genes. EGSEA is available as an R package at http://www.bioconductor.org/packages/EGSEA/ . The gene sets collections are available in the R package EGSEAdata from http://www.bioconductor.org/packages/EGSEAdata/ . monther.alhamdoosh@csl.com.au mritchie@wehi.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
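
    EGSEA itself is an R package; as a language-neutral illustration of the ensemble idea (combining the rankings produced by several GSE methods into one collective score), a minimal average-rank aggregation might look like this. The method names and numbers are hypothetical, and this is not EGSEA's API:

```python
# Toy rank aggregation across GSE methods: lower average rank = stronger
# collective evidence of enrichment. Illustration only, not EGSEA's API.
import numpy as np

gene_sets = ["apoptosis", "cell_cycle", "immune_response", "glycolysis"]
ranks_by_method = np.array([
    [1, 3, 2, 4],   # hypothetical ranking from method 1
    [2, 1, 3, 4],   # hypothetical ranking from method 2
    [1, 2, 4, 3],   # hypothetical ranking from method 3
])

avg_rank = ranks_by_method.mean(axis=0)
for i in np.argsort(avg_rank):
    print(f"{gene_sets[i]:16s} average rank = {avg_rank[i]:.2f}")
```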

  14. Development and Validation of a Standardized Tool for Prioritization of Information Sources.

    PubMed

    Akwar, Holy; Kloeze, Harold; Mukhi, Shamir

    2016-01-01

    To validate the utility and effectiveness of a standardized tool for prioritization of information sources for early detection of diseases. The tool was developed with input from diverse public health experts garnered through a survey. Ten raters used the tool to evaluate ten information sources, and reliability among raters was computed. The SAS Proc Mixed procedure with a random-effects statement, together with SAS macros, was used to compute multiple-rater Fleiss kappa agreement and Kendall's coefficient of concordance. The ten disparate information sources evaluated obtained the following composite scores: ProMed 91%; WAHID 90%; Eurosurv 87%; MediSys 85%; SciDaily 84%; EurekAl 83%; CSHB 78%; GermTrax 75%; Google 74%; and CBC 70%. A Fleiss kappa agreement of 50.7% was obtained for the ten information sources and 72.5% for a subset of five rated sources, which represents substantial agreement, validating the utility and effectiveness of the tool. This study validated the utility and effectiveness of a standardized criteria tool developed to prioritize information sources. The new tool was used to identify five information sources suited for use by the KIWI system in the CEZD-IIR project to improve surveillance of infectious diseases. The tool can be generalized to situations when prioritization of numerous information sources is necessary.
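
    The agreement statistic at the heart of this validation is compact enough to state in code. The sketch below implements Fleiss' kappa from first principles on a toy rating matrix (the study itself used SAS; the numbers here are invented):

```python
# Fleiss' kappa for multiple raters and categorical ratings.
import numpy as np

def fleiss_kappa(counts):
    """counts[i, j] = number of raters assigning subject i to category j."""
    counts = np.asarray(counts, dtype=float)
    n_subjects = counts.shape[0]
    n_raters = counts[0].sum()
    # Observed per-subject agreement, averaged over subjects.
    p_i = (np.sum(counts**2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()
    # Expected chance agreement from marginal category proportions.
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)
    p_e = np.sum(p_j**2)
    return (p_bar - p_e) / (1 - p_e)

# Toy data: 5 information sources, 10 raters, 3 rating categories.
ratings = np.array([[8, 1, 1], [2, 6, 2], [0, 3, 7], [9, 1, 0], [1, 8, 1]])
print(f"Fleiss kappa = {fleiss_kappa(ratings):.3f}")
```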

  15. A framework for measurement and harmonization of pediatric multiple sclerosis etiologic research studies: The Pediatric MS Tool-Kit.

    PubMed

    Magalhaes, Sandra; Banwell, Brenda; Bar-Or, Amit; Fortier, Isabel; Hanwell, Heather E; Lim, Ming; Matt, Georg E; Neuteboom, Rinze F; O'Riordan, David L; Schneider, Paul K; Pugliatti, Maura; Shatenstein, Bryna; Tansey, Catherine M; Wassmer, Evangeline; Wolfson, Christina

    2018-06-01

    While studying the etiology of multiple sclerosis (MS) in children has several methodological advantages over studying etiology in adults, studies are limited by small sample sizes. Using a rigorous methodological process, we developed the Pediatric MS Tool-Kit, a measurement framework that includes a minimal set of core variables to assess etiological risk factors. We solicited input from the International Pediatric MS Study Group to select three risk factors: environmental tobacco smoke (ETS) exposure, sun exposure, and vitamin D intake. To develop the Tool-Kit, we used a Delphi study involving a working group of epidemiologists, neurologists, and content experts from North America and Europe. The Tool-Kit includes six core variables to measure ETS, six to measure sun exposure, and six to measure vitamin D intake. The Tool-Kit can be accessed online ( www.maelstrom-research.org/mica/network/tool-kit ). The goals of the Tool-Kit are to enhance exposure measurement in newly designed pediatric MS studies and comparability of results across studies, and in the longer term to facilitate harmonization of studies, a methodological approach that can be used to circumvent issues of small sample sizes. We believe the Tool-Kit will prove to be a valuable resource to guide pediatric MS researchers in developing study-specific questionnaires.

  16. Delirium diagnosis, screening and management

    PubMed Central

    Lawlor, Peter G.; Bush, Shirley H.

    2014-01-01

    Purpose of review Our review focuses on recent developments across many settings regarding the diagnosis, screening and management of delirium, so as to inform these aspects in the context of palliative and supportive care. Recent findings Delirium diagnostic criteria have been updated in the long-awaited Diagnostic and Statistical Manual of Mental Disorders, fifth edition. Studies suggest that poor recognition of delirium relates to its clinical characteristics, inadequate interprofessional communication and lack of systematic screening. Validation studies are published for cognitive and observational tools to screen for delirium. Formal guidelines for delirium screening and management have been rigorously developed for intensive care, and may serve as a model for other settings. Given that palliative sedation is often required for the management of refractory delirium at the end of life, a version of the Richmond Agitation-Sedation Scale, modified for palliative care, has undergone preliminary validation. Summary Although formal systematic delirium screening with brief but sensitive tools is strongly advocated for patients in palliative and supportive care, it requires critical evaluation in terms of clinical outcomes, including patient comfort. Randomized controlled trials are needed to inform the development of guidelines for the management of delirium in this setting. PMID:25004177

  17. Visualization of International Solar-Terrestrial Physics Program (ISTP) data

    NASA Technical Reports Server (NTRS)

    Kessel, Ramona L.; Candey, Robert M.; Hsieh, Syau-Yun W.; Kayser, Susan

    1995-01-01

    The International Solar-Terrestrial Physics Program (ISTP) is a multispacecraft, multinational program whose objective is to promote further understanding of the Earth's complex plasma environment. Extensive data sharing and data analysis will be needed to ensure the success of the overall ISTP program. For this reason, there has been a special emphasis on data standards throughout ISTP. One of the key tools will be the common data format (CDF), developed, maintained, and evolved at the National Space Science Data Center (NSSDC), with the set of ISTP implementation guidelines specially designed for space physics data sets by the Space Physics Data Facility (associated with the NSSDC). The ISTP guidelines were developed to facilitate searching, plotting, merging, and subsetting of data sets. We focus here on the plotting application. A prototype software package was developed to plot key parameter (KP) data from the ISTP program at the Science Planning and Operations Facility (SPOF). The ISTP Key Parameter Visualization Tool is based on the Interactive Data Language (IDL) and is keyed to the ISTP guidelines, reading data stored in CDF. With the combination of CDF, the ISTP guidelines, and the visualization software, we can look forward to easier and more effective data sharing and use among ISTP scientists.

  18. Genetic algorithms for GNC settings and DACS design application to an asteroid Kinetic Impactor

    NASA Astrophysics Data System (ADS)

    Vernis, P.; Oliviero, V.

    2018-06-01

    This paper deals with an application of Genetic Algorithm (GA) tools in order to perform and optimize the settings phase of the Guidance, Navigation, and Control (GNC) data set for the endgame phase of a Kinetic Impactor (KI) targeting a medium-size Near Earth Object (NEO). A coupled optimization of the GNC settings and of the GC-oriented design of the Divert and Attitude Control System (DACS) is also proposed. The illustration of the developed principles is made considering the NEOShield study frame.
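
    The abstract does not give the authors' GA implementation; a bare-bones version of the underlying loop (selection, crossover, mutation over a vector of tunable settings) looks like the following, with a stand-in quadratic cost in place of a closed-loop GNC performance metric:

```python
# Minimal genetic algorithm over a vector of tunable settings.
import numpy as np

rng = np.random.default_rng(1)

def cost(params):                       # stand-in for a GNC endgame metric
    return np.sum((params - 0.7) ** 2)

pop = rng.uniform(0, 1, size=(40, 5))   # 40 candidates, 5 settings each
for _ in range(100):
    fitness = np.array([cost(p) for p in pop])
    elite = pop[np.argsort(fitness)[:10]]             # selection
    parents = elite[rng.integers(0, 10, size=(40, 2))]
    mask = rng.random((40, 5)) < 0.5                  # uniform crossover
    pop = np.where(mask, parents[:, 0], parents[:, 1])
    pop += rng.normal(0, 0.02, pop.shape)             # mutation

print("best settings found:", np.round(min(pop, key=cost), 3))
```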

  19. WE-AB-207B-07: Dose Cloud: Generating “Big Data” for Radiation Therapy Treatment Plan Optimization Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Folkerts, MM; University of California San Diego, La Jolla, California; Long, T

    Purpose: To provide a tool to generate large sets of realistic virtual patient geometries and beamlet doses for treatment optimization research. This tool enables countless studies exploring the fundamental interplay between patient geometry, objective functions, weight selections, and achievable dose distributions for various algorithms and modalities. Methods: Generating realistic virtual patient geometries requires a small set of real patient data. We developed a normalized patient shape model (PSM) which captures organ and target contours in a correspondence-preserving manner. Using PSM-processed data, we perform principal component analysis (PCA) to extract major modes of variation from the population. These PCA modes can be shared without exposing patient information. The modes are re-combined with different weights to produce sets of realistic virtual patient contours. Because virtual patients lack imaging information, we developed a shape-based dose calculation (SBD) relying on the assumption that the region inside the body contour is water. SBD utilizes a 2D fluence-convolved scatter kernel, derived from Monte Carlo simulations, and can compute either the full dose for a given set of fluence maps, or a dose matrix (dose per fluence pixel) for many modalities. Combining the shape model with SBD provides the data needed for treatment plan optimization research. Results: We used PSM to capture organ and target contours for 96 prostate cases, extracted the first 20 PCA modes, and generated 2048 virtual patient shapes by randomly sampling mode scores. Nearly half of the shapes were thrown out for failing anatomical checks; the remaining 1124 were used in computing dose matrices via SBD and a standard 7-beam protocol. As a proof of concept, and to generate data for later study, we performed fluence map optimization emphasizing PTV coverage. Conclusions: We successfully developed and tested a tool for creating customizable sets of virtual patients suitable for large-scale radiation therapy optimization research.
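
    The generation step described here (PCA over correspondence-preserving shape vectors, then sampling new mode scores) is straightforward to sketch. The following is an illustrative reconstruction with random stand-in data, not the authors' code:

```python
# Virtual patient shapes by sampling PCA mode scores.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
real_shapes = rng.normal(size=(96, 300))   # stand-in: 96 flattened contour vectors

pca = PCA(n_components=20)
scores = pca.fit_transform(real_shapes)    # mode scores of the real patients

# Sample new score vectors from the per-mode score distribution and map them
# back to shape space; anatomical sanity checks would follow this step.
new_scores = rng.normal(scores.mean(axis=0), scores.std(axis=0), size=(2048, 20))
virtual_shapes = pca.inverse_transform(new_scores)
print(virtual_shapes.shape)                # (2048, 300)
```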

  20. Managing and Communicating Operational Workflow

    PubMed Central

    Weinberg, Stuart T.; Danciu, Ioana; Unertl, Kim M.

    2016-01-01

    Summary Background Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. Objective To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). Methods The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. Results The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. Conclusions The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings. PMID:27081407

  1. Learning to recognize rat social behavior: Novel dataset and cross-dataset application.

    PubMed

    Lorbach, Malte; Kyriakou, Elisavet I; Poppe, Ronald; van Dam, Elsbeth A; Noldus, Lucas P J J; Veltkamp, Remco C

    2018-04-15

    Social behavior is an important aspect of rodent models. Automated measuring tools that make use of video analysis and machine learning are an increasingly attractive alternative to manual annotation. Because machine learning-based methods need to be trained, it is important that they are validated using data from different experiment settings. To develop and validate automated measuring tools, there is a need for annotated rodent interaction datasets. Currently, the availability of such datasets is limited to two mouse datasets. We introduce the first, publicly available rat social interaction dataset, RatSI. We demonstrate the practical value of the novel dataset by using it as the training set for a rat interaction recognition method. We show that behavior variations induced by the experiment setting can lead to reduced performance, which illustrates the importance of cross-dataset validation. Consequently, we add a simple adaptation step to our method and improve the recognition performance. Most existing methods are trained and evaluated in one experimental setting, which limits the predictive power of the evaluation to that particular setting. We demonstrate that cross-dataset experiments provide more insight in the performance of classifiers. With our novel, public dataset we encourage the development and validation of automated recognition methods. We are convinced that cross-dataset validation enhances our understanding of rodent interactions and facilitates the development of more sophisticated recognition methods. Combining them with adaptation techniques may enable us to apply automated recognition methods to a variety of animals and experiment settings. Copyright © 2017 Elsevier B.V. All rights reserved.
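
    The cross-dataset protocol the authors advocate is easy to state concretely: fit on one dataset, score on another collected under different conditions. A minimal sketch with synthetic stand-in features (not RatSI data) follows; the accuracy gap between the two prints is the effect the paper warns about:

```python
# Train on dataset A, evaluate on dataset B recorded in a different setting.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X_a, y_a = rng.normal(size=(500, 10)), rng.integers(0, 3, 500)            # setting A
X_b, y_b = rng.normal(0.5, 1.0, size=(200, 10)), rng.integers(0, 3, 200)  # setting B

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_a, y_a)
print("within-dataset accuracy:", accuracy_score(y_a, clf.predict(X_a)))
print("cross-dataset accuracy :", accuracy_score(y_b, clf.predict(X_b)))
```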

  2. Functional cohesion of gene sets determined by latent semantic indexing of PubMed abstracts.

    PubMed

    Xu, Lijing; Furlotte, Nicholas; Lin, Yunyue; Heinrich, Kevin; Berry, Michael W; George, Ebenezer O; Homayouni, Ramin

    2011-04-14

    High-throughput genomic technologies enable researchers to identify genes that are co-regulated with respect to specific experimental conditions. Numerous statistical approaches have been developed to identify differentially expressed genes. Because each approach can produce distinct gene sets, it is difficult for biologists to determine which statistical approach yields biologically relevant gene sets and is appropriate for their study. To address this issue, we implemented Latent Semantic Indexing (LSI) to determine the functional coherence of gene sets. An LSI model was built using over 1 million Medline abstracts for over 20,000 mouse and human genes annotated in Entrez Gene. The gene-to-gene LSI-derived similarities were used to calculate a literature cohesion p-value (LPv) for a given gene set using Fisher's exact test. We tested this method against genes in more than 6,000 functional pathways annotated in Gene Ontology (GO) and found that approximately 75% of gene sets in the GO biological process category and 90% of the gene sets in the GO molecular function and cellular component categories were functionally cohesive (LPv<0.05). These results indicate that the LPv methodology is both robust and accurate. Application of this method to previously published microarray datasets demonstrated that LPv can be helpful in selecting the appropriate feature extraction methods. To enable real-time calculation of LPv for mouse or human gene sets, we developed a web tool called Gene-set Cohesion Analysis Tool (GCAT). GCAT can complement other gene set enrichment approaches by determining the overall functional cohesion of data sets, taking into account both explicit and implicit gene interactions reported in the biomedical literature. GCAT is freely available at http://binf1.memphis.edu/gcat.
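
    The cohesion test itself reduces to a 2x2 contingency table: pairs of genes whose literature similarity exceeds a threshold, inside the query set versus the background. A schematic version with random stand-in similarities (not the GCAT implementation):

```python
# Literature cohesion p-value via Fisher's exact test on similar gene pairs.
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(3)
n_genes = 200
sim = rng.random((n_genes, n_genes))    # stand-in for LSI-derived similarities
sim = (sim + sim.T) / 2

def cohesion_pvalue(gene_idx, sim, threshold=0.9):
    inside = sim[np.ix_(gene_idx, gene_idx)][np.triu_indices(len(gene_idx), 1)]
    all_pairs = sim[np.triu_indices(len(sim), 1)]
    a = int(np.sum(inside > threshold))           # cohesive pairs in the set
    b = inside.size - a
    c = int(np.sum(all_pairs > threshold)) - a    # cohesive pairs elsewhere
    d = all_pairs.size - inside.size - c
    return fisher_exact([[a, b], [c, d]], alternative="greater")[1]

print(cohesion_pvalue(np.arange(10), sim))
```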

  3. Robust Design Optimization via Failure Domain Bounding

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2007-01-01

    This paper extends and applies the strategies recently developed by the authors for handling constraints under uncertainty to robust design optimization. For the scope of this paper, robust optimization is a methodology aimed at problems for which some parameters are uncertain and are only known to belong to some uncertainty set. This set can be described by either a deterministic or a probabilistic model. In the methodology developed herein, optimization-based strategies are used to bound the constraint violation region using hyper-spheres and hyper-rectangles. By comparing the resulting bounding sets with any given uncertainty model, it can be determined whether the constraints are satisfied for all members of the uncertainty model (i.e., constraints are feasible) or not (i.e., constraints are infeasible). If constraints are infeasible and a probabilistic uncertainty model is available, upper bounds to the probability of constraint violation can be efficiently calculated. The tools developed enable approximating not only the set of designs that make the constraints feasible but also, when required, the set of designs for which the probability of constraint violation is below a prescribed admissible value. When constraint feasibility is possible, several design criteria can be used to shape the uncertainty model of performance metrics of interest. Worst-case, least-second-moment, and reliability-based design criteria are considered herein. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, these strategies are easily applicable to a broad range of engineering problems.
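
    As a concrete (and heavily simplified) instance of checking constraint feasibility over a hyper-rectangular uncertainty set, one can maximize the constraint over the box and test the sign of the worst case. The constraint below is a toy stand-in, not the authors' formulation:

```python
# Worst-case feasibility check of g(p, x) <= 0 over a box of parameters p.
import numpy as np
from scipy.optimize import minimize

def g(p, x):   # toy constraint: design x, uncertain parameter p
    return x[0] * p[0] ** 2 + x[1] * p[1] - 1.0

def worst_case(x, lower, upper):
    best = -np.inf
    for start in np.linspace(lower, upper, 5):    # a few multistarts
        res = minimize(lambda p: -g(p, x), start, bounds=list(zip(lower, upper)))
        best = max(best, -res.fun)
    return best

x_design = np.array([0.5, 0.3])
lo, hi = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
print("feasible over the box:", worst_case(x_design, lo, hi) <= 0)
```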

  4. Development and pilot testing of a psychosocial intervention program for young breast cancer survivors.

    PubMed

    Ahmed, Kauser; Marchand, Erica; Williams, Victoria; Coscarelli, Anne; Ganz, Patricia A

    2016-03-01

    To describe the development, pilot testing, and dissemination of a psychosocial intervention addressing concerns of young breast cancer survivors (YBCS). Intervention development included needs assessment with community organizations and interviews with YBCS. Based on evidence-based models of treatment, the intervention included tools for managing anxiety and fear of recurrence, tools for decision-making, and tools for coping with sexuality/relationship issues. After pilot testing in a university setting, the program was disseminated to two community clinical settings. The program has two distinct modules (anxiety management and relationships/sexuality) that were delivered in two sessions; however, due to attrition, an all-day workshop format evolved. An author-constructed questionnaire was used for pre- and post-intervention evaluation. Post-treatment scores showed an average increase of 2.7 points on a 10-point scale for the first module, and a 2.3-point increase for the second module. Qualitative feedback surveys were also collected. The two community sites demonstrated similar gains among their participants. The intervention satisfies an unmet need for YBCS and is a possible model of integrating psychosocial intervention with oncology care. This program developed standardized materials which can be disseminated to other organizations, and potentially online, for implementation within community settings. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. Development and Use of Hidrosig

    NASA Technical Reports Server (NTRS)

    Gupta, Vijay K.; Milne, Bruce T.

    2003-01-01

    The NASA portion of this joint NSF-NASA grant consists of objective 2 and a part of objective 3. A major effort was made on objective 2, which consisted of developing a numerical GIS environment called Hidrosig. This major research tool is being developed by the University of Colorado for conducting river-network-based scaling analyses of coupled water-energy-landform-vegetation interactions, including water and energy balances and floods and droughts, at multiple space-time scales. Objective 2: To analyze the relevant remotely sensed products from satellites, radars and ground measurements to compute the transported water mass for each complete Strahler stream using an 'assimilated water balance equation' at daily and other appropriate time scales. This objective requires analysis of concurrent data sets for precipitation (PPT), evapotranspiration (ET) and stream flows (Q) on river networks. To address this major problem, our decision was to develop Hidrosig, a new open-source GIS software package. A research group in Colombia, South America, developed the first version of Hidrosig, and Ricardo Mantilla was part of this effort as an undergraduate student before joining the graduate program at the University of Colorado in 2001. Hidrosig automatically extracts river networks from large DEMs and creates a "link-based" data structure, which is required to conduct a variety of analyses under objective 2. It is programmed in Java, a multi-platform programming language freely distributed by Sun under a GPL license. Some existing commercial tools like Arc-Info, RiverTools and others are not suitable for our purpose for two reasons. First, the source code needed to build on the network data structure is not available. Second, these tools use programming languages that are less versatile for our purposes. For example, RiverTools uses an IDL platform that is not very efficient for organizing diverse data sets on river networks. Hidrosig establishes a clear data organization framework that allows simultaneous analysis of spatial fields along river network structures within the Horton-Strahler framework. Software tools for network extraction from DEMs and network-based analysis of geomorphologic and topologic variables were developed during the first year and part of the second year.

  6. Introduction to SIMRAND: Simulation of research and development project

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1982-01-01

    SIMRAND: SIMulation of Research ANd Development Projects is a methodology developed to aid the engineering and management decision process in the selection of the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration for which the total cost exceeds the allocated budget. Other factors such as personnel and facilities may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses analytical techniques from probability theory, decision analysis from management science, and computer simulation in the selection of this optimal partial set. The SIMRAND methodology is truly a management tool. It initially specifies the information that must be generated by the engineers, thus providing direction for the engineering work, and it ranks the alternatives according to the preferences of the decision makers.
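
    In modern terms, the core selection problem SIMRAND addresses is a budget-constrained subset choice ranked by simulated utility. A compact sketch with invented costs and payoff distributions (not the original methodology's models):

```python
# Enumerate task subsets within budget; rank by Monte Carlo expected utility.
import itertools
import numpy as np

rng = np.random.default_rng(7)
costs = np.array([4.0, 3.0, 2.5, 1.5])   # per-task cost (invented)
budget = 7.0

def simulated_utility(subset, n_draws=2000):
    if not subset:
        return 0.0
    # Each task's payoff is uncertain; estimate expected total by simulation.
    draws = sum(rng.normal(loc=2.0 * (i + 1), scale=1.0, size=n_draws)
                for i in subset)
    return draws.mean()

feasible = (s for r in range(len(costs) + 1)
              for s in itertools.combinations(range(len(costs)), r)
              if costs[list(s)].sum() <= budget)
print("optimal task set under budget:", max(feasible, key=simulated_utility))
```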

  7. Effectively using communication to enhance the provision of pediatric palliative care in an acute care setting.

    PubMed

    Hubble, Rosemary; Trowbridge, Kelly; Hubbard, Claudia; Ahsens, Leslie; Ward-Smith, Peggy

    2008-08-01

    The capability of effectively communicating is crucial when providing palliative care, especially when the patient is a child. Communication among healthcare professionals with the child and family members must be clear, concise, and consistent. Use of a communication tool provides documentation for conversations, treatment plans, and specific desires related to care. This paper describes communication theory, portrays the use of this theory to develop a communication tool, and illustrates the use of this tool by multidisciplinary members of a healthcare team to provide pediatric palliative care.

  8. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, Dennis L.

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility. qsUtility is a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon the measurements. qsUtility was developed as a toolset to reduce machine configuration time and improve reproducibility by way of an accurate accelerator model, and to provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.

  9. Engineering a mobile health tool for resource-poor settings to assess and manage cardiovascular disease risk: SMARThealth study.

    PubMed

    Raghu, Arvind; Praveen, Devarsetty; Peiris, David; Tarassenko, Lionel; Clifford, Gari

    2015-04-29

    The incidence of chronic diseases in low- and middle-income countries is rapidly increasing both in urban and rural regions. A major challenge for health systems globally is to develop innovative solutions for the prevention and control of these diseases. This paper discusses the development and pilot testing of SMARThealth, a mobile-based, point-of-care Clinical Decision Support (CDS) tool to assess and manage cardiovascular disease (CVD) risk in resource-constrained settings. Through pilot testing, the preliminary acceptability, utility, and efficiency of the CDS tool were assessed. The CDS tool was part of an mHealth system comprising a mobile application that consisted of an evidence-based risk prediction and management algorithm, and a server-side electronic medical record system. Through an agile development process and user-centred design approach, key features of the mobile application that fitted the requirements of the end users and environment were obtained. A comprehensive analytics framework facilitated a data-driven approach to investigate four areas, namely, system efficiency, end-user variability, manual data entry errors, and usefulness of point-of-care management recommendations to the healthcare worker. A four-point Likert scale was used at the end of every risk assessment to gauge ease of use of the system. The system was field-tested with eleven village healthcare workers and three Primary Health Centre doctors, who screened a total of 292 adults aged 40 years and above. 34% of participants screened by health workers were identified by the CDS tool to be high CVD risk and referred to a doctor. In-depth analysis of user interactions found the CDS tool feasible for use and easily integrable into the workflow of healthcare workers. Following completion of the pilot, further technical enhancements were implemented to improve uptake of the mHealth platform. It will then be evaluated for effectiveness and cost-effectiveness in a cluster randomized controlled trial involving 54 southern Indian villages and over 16,000 individuals at high CVD risk. An evidence-based CVD risk prediction and management tool was used to develop an mHealth platform in rural India for CVD screening and management with proper engagement of health care providers and local communities. With over a third of screened participants being high risk, there is a need to demonstrate the clinical impact of the mHealth platform so that it could contribute to improved CVD detection in high risk low resource settings.

  10. Common data elements for substance use disorders in electronic health records: the NIDA Clinical Trials Network experience.

    PubMed

    Ghitza, Udi E; Gore-Langton, Robert E; Lindblad, Robert; Shide, David; Subramaniam, Geetha; Tai, Betty

    2013-01-01

    Electronic health records (EHRs) are essential in improving quality and enhancing efficiency of health-care delivery. By 2015, medical care receiving service reimbursement from US Centers for Medicare and Medicaid Services (CMS) must show 'meaningful use' of EHRs. Substance use disorders (SUD) are grossly under-detected and under-treated in current US medical care settings. Hence, an urgent need exists for improved identification of and clinical intervention for SUD in medical settings. The National Institute on Drug Abuse Clinical Trials Network (NIDA CTN) has leveraged its infrastructure and expertise and brought relevant stakeholders together to develop consensus on brief screening and initial assessment tools for SUD in general medical settings, with the objective of incorporation into US EHRs. Stakeholders were identified and queried for input and consensus on validated screening and assessment for SUD in general medical settings to develop common data elements to serve as shared resources for EHRs on screening, brief intervention and referral to treatment (SBIRT), with the intent of supporting interoperability and data exchange in a developing Nationwide Health Information Network. Through consensus of input from stakeholders, a validated screening and brief assessment instrument, supported by Clinical Decision Support tools, was chosen to be used at out-patient general medical settings. The creation and adoption of a core set of validated common data elements and the inclusion of such consensus-based data elements for general medical settings will enable the integration of SUD treatment within mainstream health care, and support the adoption and 'meaningful use' of the US Office of the National Coordinator for Health Information Technology (ONC)-certified EHRs, as well as CMS reimbursement. Published 2012. This article is a U.S. Government work and is in the public domain in the USA.

  11. Pharmacokinetic de-risking tools for selection of monoclonal antibody lead candidates

    PubMed Central

    Dostalek, Miroslav; Prueksaritanont, Thomayant; Kelley, Robert F.

    2017-01-01

    ABSTRACT Pharmacokinetic studies play an important role in all stages of drug discovery and development. Recent advancements in the tools for discovery and optimization of therapeutic proteins have created an abundance of candidates that may fulfill target product profile criteria. Implementing a set of in silico, small scale in vitro and in vivo tools can help to identify a clinical lead molecule with promising properties at the early stages of drug discovery, thus reducing the labor and cost in advancing multiple candidates toward clinical development. In this review, we describe tools that should be considered during drug discovery, and discuss approaches that could be included in the pharmacokinetic screening part of the lead candidate generation process to de-risk unexpected pharmacokinetic behaviors of Fc-based therapeutic proteins, with an emphasis on monoclonal antibodies. PMID:28463063

  12. [Organising an investigation site: a national training reference document].

    PubMed

    Cornu, Catherine; David, Frédérique; Duchossoy, Luc; Hansel-Esteller, Sylvie; Bertoye, Pierre-Henri; Giacomino, Alain; Mouly, Stéphane; Diebolt, Vincent; Blazejewski, Sylvie

    2014-01-01

    Several surveys have shown a declining performance of French investigators in conducting clinical trials. This is partly due to insufficient and heterogeneous investigator training and site organisation. A multidisciplinary group was set up to propose solutions. We describe the tools developed to improve study site organisation. This working group was made up of clinical research experts from academia, industry, drug regulatory authorities, general practice, and consulting. Methods and tools were developed to improve site organisation. The proposed tools mainly focus on increasing investigators' awareness of their responsibilities, their research environment, the importance of a thorough feasibility analysis, and the implementation of active patient recruitment strategies. These tools should be able to improve site organisation and performances in conducting clinical trials. © 2014 Société Française de Pharmacologie et de Thérapeutique.

  13. Development and validation of a Haitian Creole screening instrument for depression

    PubMed Central

    Rasmussen, Andrew; Eustache, Eddy; Raviola, Giuseppe; Kaiser, Bonnie; Grelotti, David; Belkin, Gary

    2014-01-01

    Developing mental health care capacity in post-earthquake Haiti is hampered by the lack of assessments that include culturally bound idioms Haitians use when discussing emotional distress. The current study describes a novel emic-etic approach to developing a depression screening for Partners In Health/Zanmi Lasante. In Study 1 Haitian key informants were asked to classify symptoms and describe categories within a pool of symptoms of common mental disorders. Study 2 tested the symptom set that best approximated depression in a sample of depressed and not depressed Haitians in order to select items for the screening tool. The resulting 13-item instrument produced scores with high internal reliability that were sensitive to culturally-informed diagnoses, and interpretations with construct and concurrent validity (vis-à-vis functional impairment). Discussion focuses on the appropriate use of this tool and integrating emic perspectives into developing psychological assessments globally. The screening tool is provided as an Appendix. PMID:25080426

  14. Defense Facility Condition: Revised Guidance Needed to Improve Oversight of Assessments and Ratings

    DTIC Science & Technology

    2016-06-01

    are to implement the standardized process in part by assessing the condition of buildings, pavement, and rail using the same set of software tools... facility to current standards; costs for labor, equipment, materials, and currency exchange rates overseas; costs for project planning and design... example, the services are to assess the condition of buildings, pavement, and rail using Sustainment Management System software tools developed by the

  15. Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development

    DTIC Science & Technology

    2016-02-01

    proof in mathematics. For example, consider the proof of the Pythagorean Theorem illustrated at: http://www.cut-the-knot.org/pythagoras/ where 112... methods and tools have made significant progress in their ability to model software designs and prove correctness theorems about the systems modeled... “assumption criticality” or “theorem root set size” SITAPS detects potentially brittle verification cases. SITAPS provides tools and techniques that

  16. GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.

    PubMed

    Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M

    2010-04-05

    Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
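
    The "connected formats" point is simple to make concrete: a set of peak records serialized once to both JSON and CSV. The field names below are hypothetical, not GlycoExtractor's actual schema:

```python
# Export glycan peak records to JSON and CSV (hypothetical field names).
import csv
import json

peaks = [
    {"sample": "IgG_01", "peak": 1, "area": 12.4, "glucose_units": 5.92},
    {"sample": "IgG_01", "peak": 2, "area": 43.1, "glucose_units": 6.78},
]

with open("peaks.json", "w") as fh:
    json.dump(peaks, fh, indent=2)

with open("peaks.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=list(peaks[0]))
    writer.writeheader()
    writer.writerows(peaks)
```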

  17. Osteoporosis risk prediction using machine learning and conventional methods.

    PubMed

    Kim, Sung Kean; Yoo, Tae Keun; Oh, Ein; Kim, Deok Won

    2013-01-01

    A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of more accurately identifying the risk of osteoporosis in postmenopausal women, and compared them with a conventional clinical decision tool, the osteoporosis self-assessment tool (OST). We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Surveys (KNHANES V-1). The training data set was used to construct models based on popular machine learning algorithms such as support vector machines (SVM), random forests (RF), artificial neural networks (ANN), and logistic regression (LR) based on various predictors associated with low bone density. The learning models were compared with OST. SVM had a significantly better area under the curve (AUC) of the receiver operating characteristic (ROC) than ANN, LR, and OST. Validation on the test set showed that SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0%. We were the first to compare the performance of machine learning and conventional methods for osteoporosis risk prediction using population-based epidemiological data. The machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.
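
    The comparison protocol used here (train several classifiers on the same predictors, compare held-out ROC AUC) is standard; a minimal sketch with synthetic stand-in data rather than KNHANES records:

```python
# Compare SVM and logistic regression by held-out ROC AUC.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("SVM", SVC(probability=True, random_state=0)),
                    ("LR ", LogisticRegression(max_iter=1000))]:
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```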

  18. Riemannian and Lorentzian flow-cut theorems

    NASA Astrophysics Data System (ADS)

    Headrick, Matthew; Hubeny, Veronika E.

    2018-05-01

    We prove several geometric theorems using tools from the theory of convex optimization. In the Riemannian setting, we prove the max flow-min cut (MFMC) theorem for boundary regions, applied recently to develop a ‘bit-thread’ interpretation of holographic entanglement entropies. We also prove various properties of the max flow and min cut, including respective nesting properties. In the Lorentzian setting, we prove the analogous MFMC theorem, which states that the volume of a maximal slice equals the flux of a minimal flow, where a flow is defined as a divergenceless timelike vector field with norm at least 1. This theorem includes as a special case a continuum version of Dilworth’s theorem from the theory of partially ordered sets. We include a brief review of the necessary tools from the theory of convex optimization, in particular Lagrangian duality and convex relaxation.
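
    Schematically, the Riemannian statement can be written as follows (a paraphrase of the flow-cut duality as used in the bit-thread literature, not a quotation from the paper):

```latex
% Max flow-min cut for a boundary region A: the maximal flux through A of a
% divergenceless flow v with |v| <= 1 equals the minimal area of a surface m
% homologous to A.
\max_{\nabla_{\mu} v^{\mu} = 0,\; |v| \le 1}
  \int_{A} \sqrt{h}\, n_{\mu} v^{\mu}
  \;=\; \min_{m \sim A} \operatorname{area}(m)
```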

  19. Development of a tool to measure person-centered maternity care in developing settings: validation in a rural and urban Kenyan population.

    PubMed

    Afulani, Patience A; Diamond-Smith, Nadia; Golub, Ginger; Sudhinaraset, May

    2017-09-22

    Person-centered reproductive health care is recognized as critical to improving reproductive health outcomes. Yet, little research exists on how to operationalize it. We extend the literature in this area by developing and validating a tool to measure person-centered maternity care. We describe the process of developing the tool and present the results of psychometric analyses to assess its validity and reliability in a rural and urban setting in Kenya. We followed standard procedures for scale development. First, we reviewed the literature to define our construct and identify domains, and developed items to measure each domain. Next, we conducted expert reviews to assess content validity; and cognitive interviews with potential respondents to assess clarity, appropriateness, and relevance of the questions. The questions were then refined and administered in surveys; and survey results used to assess construct and criterion validity and reliability. The exploratory factor analysis yielded one dominant factor in both the rural and urban settings. Three factors with eigenvalues greater than one were identified for the rural sample and four factors identified for the urban sample. Thirty of the 38 items administered in the survey were retained based on the factor loadings and correlations between the items. Twenty-five items load very well onto a single factor in both the rural and urban samples, with five items loading well in either the rural or urban sample, but not in both. These 30 items also load on three sub-scales that we created to measure dignified and respectful care, communication and autonomy, and supportive care. The Cronbach alpha for the main scale is greater than 0.8 in both samples, and those for the sub-scales are between 0.6 and 0.8. The main scale and sub-scales are correlated with global measures of satisfaction with maternity services, suggesting criterion validity. We present a 30-item scale with three sub-scales to measure person-centered maternity care. This scale has high validity and reliability in a rural and urban setting in Kenya. Validation in additional settings is, however, needed. This scale will facilitate measurement to improve person-centered maternity care, and subsequently improve reproductive outcomes.
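
    The reliability figures quoted above come from Cronbach's alpha, which is short enough to compute directly. A sketch with synthetic item responses (not the Kenyan survey data):

```python
# Cronbach's alpha for a multi-item scale.
import numpy as np

def cronbach_alpha(items):
    """items[i, j] = respondent i's score on item j."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(5)
latent = rng.normal(size=(120, 1))                        # shared construct
responses = latent + rng.normal(0, 0.8, size=(120, 10))   # 10 correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```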

  20. deepTools: a flexible platform for exploring deep-sequencing data.

    PubMed

    Ramírez, Fidel; Dündar, Friederike; Diehl, Sarah; Grüning, Björn A; Manke, Thomas

    2014-07-01

    We present a Galaxy-based web server for processing and visualizing deeply sequenced data. The web server's core functionality consists of a suite of newly developed tools, called deepTools, that enable users with little bioinformatic background to explore the results of their sequencing experiments in a standardized setting. Users can upload pre-processed files with continuous data in standard formats and generate heatmaps and summary plots in a straightforward, yet highly customizable manner. In addition, we offer several tools for the analysis of files containing aligned reads and enable efficient and reproducible generation of normalized coverage files. As a modular and open-source platform, deepTools can easily be expanded and customized to future demands and developments. The deepTools webserver is freely available at http://deeptools.ie-freiburg.mpg.de and is accompanied by extensive documentation and tutorials aimed at conveying the principles of deep-sequencing data analysis. The web server can be used without registration. deepTools can be installed locally either stand-alone or as part of Galaxy. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. Open|SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    DOE PAGES

    Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...

    2008-01-01

    Over the last decades a large number of performance tools have been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open|SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open|SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open|SpeedShop are accessible through multiple fully equivalent interfaces including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.

  2. Design and Testing of Flight Control Laws on the RASCAL Research Helicopter

    NASA Technical Reports Server (NTRS)

    Frost, Chad R.; Hindson, William S.; Moralez, Ernesto, III; Tucker, George E.; Dryfoos, James B.

    2001-01-01

    Two unique sets of flight control laws were designed, tested and flown on the Army/NASA Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) JUH-60A Black Hawk helicopter. The first set of control laws used a simple rate feedback scheme, intended to facilitate the first flight and subsequent flight qualification of the RASCAL research flight control system. The second set of control laws comprised a more sophisticated model-following architecture. Both sets of flight control laws were developed and tested extensively using desktop-to-flight modeling, analysis, and simulation tools. Flight test data matched the model predicted responses well, providing both evidence and confidence that future flight control development for RASCAL will be efficient and accurate.

  3. Predicting cancer prognosis using interactive online tools: A systematic review and implications for cancer care providers

    PubMed Central

    Rabin, Borsika A.; Gaglio, Bridget; Sanders, Tristan; Nekhlyudov, Larissa; Dearing, James W.; Bull, Sheana; Glasgow, Russell E.; Marcus, Alfred

    2013-01-01

    Cancer prognosis is of keen interest for cancer patients, their caregivers and providers. Prognostic tools have been developed to guide patient-physician communication and decision-making. Given the proliferation of prognostic tools, it is timely to review existing online cancer prognostic tools and discuss implications for their use in clinical settings. Using a systematic approach, we searched the Internet, Medline, and consulted with experts to identify existing online prognostic tools. Each was reviewed for content and format. Twenty-two prognostic tools addressing 89 different cancers were identified. Tools primarily focused on prostate (n=11), colorectal (n=10), breast (n=8), and melanoma (n=6), though at least one tool was identified for most malignancies. The input variables for the tools included cancer characteristics (n=22), patient characteristics (n=18), and comorbidities (n=9). Effect of therapy on prognosis was included in 15 tools. The most common predicted outcome was cancer specific survival/mortality (n=17). Only a few tools (n=4) suggested patients as potential target users. A comprehensive repository of online prognostic tools was created to understand the state-of-the-art in prognostic tool availability and characteristics. Use of these tools may support communication and understanding about cancer prognosis. Dissemination, testing, refinement of existing, and development of new tools under different conditions are needed. PMID:23956026

  4. Emotional development in adolescence: what can be learned from a high school theater program?

    PubMed

    Larson, Reed W; Brown, Jane R

    2007-01-01

    Grounded-theory analyses were used to formulate propositions regarding the processes of adolescent emotional development. Progress in understanding this difficult topic requires close examination of emotional experience in context, and to do this the authors drew on qualitative data collected over the course of a high school theater production. Participants' (ages 14-17) accounts of experiences in this setting demonstrated their capacity to actively extract emotional knowledge and to develop strategies for managing emotions. These accounts suggested that youth's repeated "hot" experience of unfolding emotional episodes in the setting provided material for this active process of learning. Youth also learned by drawing on and internalizing the emotion culture of the setting, which provided concepts, strategies, and tools for managing emotional episodes.

  5. Updates to the Demographic and Spatial Allocation Models to ...

    EPA Pesticide Factsheets

    EPA announced the availability of the draft report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS), for a 30-day public comment period. The ICLUS version 2 (v2) modeling tool furthered land change modeling by providing nationwide housing development scenarios up to 2100. ICLUS v2 includes updated population and land use data sets and addresses limitations identified in ICLUS v1 in both the migration and spatial allocation models. The companion user guide describes the development of ICLUS v2 and the updates that were made to the original data sets and the demographic and spatial allocation models. The GIS tool enables users to run SERGoM with the population projections developed for the ICLUS project and allows users to modify the spatial allocation of housing density across the landscape.

  6. Predictive Modeling of Estrogen Receptor Binding Agents Using Advanced Cheminformatics Tools and Massive Public Data.

    PubMed

    Ribay, Kathryn; Kim, Marlene T; Wang, Wenyi; Pinolini, Daniel; Zhu, Hao

    2016-03-01

    Estrogen receptors (ERα) are a critical target for drug design as well as a potential source of toxicity when activated unintentionally. Thus, evaluating potential ERα binding agents is critical in both drug discovery and chemical toxicity areas. Computational tools, e.g., Quantitative Structure-Activity Relationship (QSAR) models, can predict potential ERα binding agents before chemical synthesis. The purpose of this project was to develop enhanced predictive models of ERα binding agents by utilizing advanced cheminformatics tools that can integrate publicly available bioassay data. The initial ERα binding agent data set, consisting of 446 binders and 8307 non-binders, was obtained from the Tox21 Challenge project organized by the NIH Chemical Genomics Center (NCGC). After removing the duplicates and inorganic compounds, this data set was used to create a training set (259 binders and 259 non-binders). This training set was used to develop QSAR models using chemical descriptors. The resulting models were then used to predict the binding activity of 264 external compounds, which were available to us after the models were developed. The cross-validation results for the training set [Correct Classification Rate (CCR) = 0.72] were much higher than the external predictivity for the unknown compounds (CCR = 0.59). To improve the conventional QSAR models, all compounds in the training set were used to search PubChem and generate a profile of their biological responses across thousands of bioassays. The most important bioassays were prioritized to generate a similarity index that was used to calculate the biosimilarity score between each pair of compounds. The nearest neighbors of each compound within the set were then identified, and its ERα binding potential was predicted from its nearest neighbors in the training set. The hybrid model performance (CCR = 0.94 for cross-validation; CCR = 0.68 for external prediction) showed significant improvement over the original QSAR models, particularly for the activity cliffs that induce prediction errors. The results of this study indicate that the response profile of chemicals from public data provides useful information for modeling and evaluation purposes. Public big data resources should be considered along with chemical structure information when predicting new compounds, such as unknown ERα binding agents.
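
    The hybrid step described above is essentially read-across: predict a query compound from its most biosimilar neighbors. A schematic version with invented assay profiles (not the Tox21 data or the authors' similarity index):

```python
# Nearest-neighbor prediction from bioassay response profiles.
import numpy as np

rng = np.random.default_rng(11)
profiles = rng.integers(0, 2, size=(50, 30))   # hit/miss across 30 assays
labels = rng.integers(0, 2, 50)                # 1 = ERalpha binder (invented)

def predict(query, k=5):
    # Jaccard-style biosimilarity between query and each training compound.
    inter = np.minimum(profiles, query).sum(axis=1)
    union = np.maximum(profiles, query).sum(axis=1)
    sim = inter / np.where(union == 0, 1, union)
    neighbors = np.argsort(sim)[-k:]           # k most biosimilar compounds
    return int(labels[neighbors].mean() >= 0.5)

print("predicted binder:", predict(rng.integers(0, 2, 30)))
```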

  7. p3d--Python module for structural bioinformatics.

    PubMed

    Fufezan, Christian; Specht, Michael

    2009-08-21

    High-throughput bioinformatic analysis tools are needed to mine the large amount of structural data via knowledge-based approaches. The development of such tools requires a robust interface to access the structural data in an easy way. For this, the Python scripting language is a good choice, since its philosophy encourages understandable source code. p3d is an object-oriented Python module that adds a simple yet powerful interface to the Python interpreter for processing and analysing three-dimensional protein structure files (PDB files). p3d's strength arises from the combination of (a) very fast spatial access to the structural data via a binary space partitioning (BSP) tree, (b) set theory, and (c) functions that combine (a) and (b) and use human-readable language in search queries rather than complex computer syntax. Together, these factors facilitate the rapid development of bioinformatic tools that can perform quick and complex analyses of protein structures. p3d is thus well suited for quickly developing structural bioinformatics tools in the Python scripting language.
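
    To illustrate why spatial partitioning matters for such queries, here is a minimal neighbor search over atom coordinates using a uniform grid hash. p3d itself uses a BSP tree and its own query language; none of the names below belong to the p3d API.

    ```python
    from collections import defaultdict
    import math

    def build_grid(atoms, cell=4.0):
        # Bucket atoms by integer grid cell so radius queries only touch
        # nearby cells instead of scanning every atom in the structure.
        grid = defaultdict(list)
        for name, (x, y, z) in atoms:
            grid[(int(x // cell), int(y // cell), int(z // cell))].append((name, (x, y, z)))
        return grid

    def atoms_within(grid, center, radius, cell=4.0):
        cx, cy, cz = (int(c // cell) for c in center)
        r = math.ceil(radius / cell)
        hits = []
        for dx in range(-r, r + 1):
            for dy in range(-r, r + 1):
                for dz in range(-r, r + 1):
                    for name, p in grid.get((cx + dx, cy + dy, cz + dz), []):
                        if math.dist(p, center) <= radius:
                            hits.append(name)
        return hits

    atoms = [("CA:1", (0.0, 0.0, 0.0)), ("CA:2", (3.0, 0.0, 0.0)), ("CA:3", (9.0, 9.0, 9.0))]
    grid = build_grid(atoms)
    print(atoms_within(grid, (0.0, 0.0, 0.0), 5.0))   # -> ['CA:1', 'CA:2']
    ```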

  8. A National Solar Digital Observatory

    NASA Astrophysics Data System (ADS)

    Hill, F.

    2000-05-01

    The continuing development of the Internet as a research tool, combined with an improving funding climate, has sparked new interest in the development of Internet-linked astronomical databases and analysis tools. Here I outline a concept for a National Solar Digital Observatory (NSDO), a set of data archives and analysis tools physically distributed across sites that already host such systems. A central web site would be implemented from which a user could search all of the component archives, select and download data, and perform analyses. Example components include NSO's Digital Library, containing its synoptic and GONG data, and the forthcoming SOLIS archive. Several other archives, in various stages of development, also exist. Potential analysis tools include content-based searches, visual programming tools, and graphics routines. The existence of an NSDO would greatly facilitate solar physics research, as a user would no longer need detailed knowledge of every solar archive site. It would also improve public outreach efforts. The National Solar Observatory is operated by AURA, Inc. under a cooperative agreement with the National Science Foundation.
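
    A minimal sketch of the central-search idea: fan a query out to each component archive and merge the results. The archive stubs and record fields here are hypothetical placeholders; a real implementation would issue an HTTP request per site.

    ```python
    # Each component archive exposes its own search; the central site
    # merely dispatches and merges. These stubs stand in for remote calls.
    def search_gong(query):
        return [{"archive": "GONG", "id": "gong-001", "match": query}]

    def search_solis(query):
        return [{"archive": "SOLIS", "id": "solis-042", "match": query}]

    ARCHIVES = [search_gong, search_solis]

    def federated_search(query):
        results = []
        for search in ARCHIVES:
            results.extend(search(query))   # in practice: one HTTP call per site
        return results

    print(federated_search("magnetogram 1999-10-01"))
    ```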

  9. Putting tools in the toolbox: Development of a free, open-source toolbox for quantitative image analysis of porous media.

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.

    2014-12-01

    X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil, and rock. Given that most synchrotron facilities have user programs that grant academic researchers free access to facilities and x-ray imaging equipment, a key hindrance for small research groups interested in conducting x-ray imaging experiments is the financial cost of post-experiment data analysis. While the cost of high-performance computing hardware continues to decrease, the expense of licensing commercial software packages for quantitative image analysis continues to increase, with current prices as high as $24,000 USD for a single-user license. As construction of the nation's newest synchrotron accelerator nears completion, a significant effort is being made at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required to perform sophisticated quantitative analysis of multidimensional porous media data sets collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite in development at BNL, including major design decisions, a demonstration of several test cases illustrating currently available quantitative tools for the analysis and characterization of multidimensional porous media image data sets, and plans for future development.
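
    As a small example of the kind of quantitative analysis such a toolbox provides, the sketch below computes porosity from a segmented volume. The synthetic array and global threshold are stand-ins for a reconstructed tomography data set.

    ```python
    import numpy as np

    # Porosity of a binary (segmented) volume: pore voxels / total voxels.
    # A real workflow would load a reconstructed CT volume and apply a
    # principled segmentation (e.g., Otsu) instead of a fixed threshold.
    rng = np.random.default_rng(0)
    volume = rng.random((64, 64, 64))     # synthetic grayscale intensities
    pores = volume < 0.35                 # simple global threshold: True = pore

    porosity = pores.mean()
    print(f"porosity = {porosity:.3f}")   # ~0.35 for this synthetic volume
    ```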

  10. Improving geriatric prescribing in the ED: a qualitative study of facilitators and barriers to clinical decision support tool use.

    PubMed

    Vandenberg, Ann E; Vaughan, Camille P; Stevens, Melissa; Hastings, Susan N; Powers, James; Markland, Alayne; Hwang, Ula; Hung, William; Echt, Katharina V

    2017-02-01

    Clinical decision support (CDS) may improve prescribing for older adults in the Emergency Department (ED) if adopted by providers. Existing prescribing order entry processes were mapped at an initial Veterans Administration Medical Center site, demonstrating cognitive burden, effort, and safety concerns. Geriatric order sets incorporating the 2012 Beers criteria, geriatric prescribing advice, and prepopulated order options were developed. The order sets were implemented at two sites as part of the multicomponent 'Enhancing Quality of Prescribing Practices for Older Veterans Discharged from the Emergency Department' (EQUiPPED) quality improvement initiative. Facilitators and barriers to order set use at the two sites were evaluated. Phone interviews were conducted with two provider groups (n = 20): those 'EQUiPPED' with the interventions (n = 10, 5 at each site) and comparison providers who were exposed to the order sets only through a clickable option on the ED order menu within the patient's medical record (n = 10, 5 at each site). All providers were asked about order set 'use' and 'usefulness'; users (n = 11) were also asked about 'usability'. Order set adopters described 'usefulness' in terms of 'safety' and 'efficiency', whereas order set consultants and non-users described 'usefulness' in terms of 'information' or 'training'. Provider 'autonomy', 'comfort' level with existing tools, and 'learning curve' were cited as barriers to use. Quantifying efficiency advantages and communicating the safety benefit over preexisting practices and tools may improve adoption of CDS in the ED and in other care settings.

  11. Effects-based strategy development through center of gravity and target system analysis

    NASA Astrophysics Data System (ADS)

    White, Christopher M.; Prendergast, Michael; Pioch, Nicholas; Jones, Eric K.; Graham, Stephen

    2003-09-01

    This paper describes an approach to effects-based planning in which a strategic-theater-level mission is refined into operational-level and ultimately tactical-level tasks and desired effects, informed by models of the expected enemy response at each level of abstraction. We describe a strategy development system that implements this approach and supports human-in-the-loop development of an effects-based plan. This system consists of plan authoring tools tightly integrated with a suite of center of gravity (COG) and target system analysis tools. A human planner employs the plan authoring tools to develop a hierarchy of tasks and desired effects. Upon invocation, the target system analysis tools use reduced-order models of enemy centers of gravity to select appropriate target set options for the achievement of desired effects, together with associated indicators for each option. The COG analysis tools also provide explicit models of the causal mechanisms linking tasks and desired effects to one another, and suggest appropriate observable indicators to guide ISR planning, execution monitoring, and campaign assessment. We are currently implementing the system described here as part of the AFRL-sponsored Effects Based Operations program.
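
    A hedged sketch of the kind of plan structure described above: a hierarchy of tasks and desired effects, each carrying observable indicators. The field names and example content are illustrative, not the system's actual schema.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class PlanNode:
        name: str
        level: str                       # "strategic" | "operational" | "tactical"
        desired_effect: str
        indicators: List[str] = field(default_factory=list)
        children: List["PlanNode"] = field(default_factory=list)

    # Invented example of refining a strategic effect into an operational task.
    plan = PlanNode(
        name="Degrade air defenses",
        level="strategic",
        desired_effect="Enemy air defenses unable to coordinate",
        indicators=["reduced radar emissions"],
        children=[PlanNode("Strike C2 nodes", "operational",
                           "Command links severed", ["comms traffic drop"])],
    )

    def walk(node, depth=0):
        print("  " * depth + f"{node.name} -> {node.desired_effect}")
        for child in node.children:
            walk(child, depth + 1)

    walk(plan)
    ```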

  12. Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2012-01-01

    This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are presented for the evaluation of several aerodynamic tools, including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing, as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel were used as a reference data set to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite-element-based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional validation data.
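
    For readers unfamiliar with drag polars, the sketch below evaluates the standard parabolic polar CD = CD0 + CL^2 / (pi * e * AR). The coefficient values are made-up placeholders, not X-48B data.

    ```python
    import numpy as np

    # Parabolic drag polar: zero-lift drag plus lift-induced drag.
    CD0, e, AR = 0.008, 0.85, 7.5   # placeholder zero-lift drag, Oswald factor, aspect ratio

    def drag_polar(cl):
        return CD0 + cl**2 / (np.pi * e * AR)

    for cl in np.linspace(0.0, 0.8, 5):
        print(f"CL = {cl:.2f}  CD = {drag_polar(cl):.5f}")
    ```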

  13. Functional specifications for AI software tools for electric power applications. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faught, W.S.

    1985-08-01

    The principal barrier to the introduction of artificial intelligence (AI) technology in the electric power industry has not been a lack of interest or of appropriate problems, for the industry abounds in both. Like most others, however, the electric power industry lacks the personnel - knowledge engineers - with the special combination of training and skills that AI programming demands. Conversely, very few AI specialists are conversant with electric power industry problems and applications. The recent availability of sophisticated AI programming environments is doing much to alleviate this shortage. These products provide a set of powerful and usable software tools that enable even non-AI scientists to rapidly develop AI applications. The purpose of this project was to develop functional specifications for programming tools that, when integrated with existing general-purpose knowledge engineering tools, would expedite the production of AI applications for the electric power industry. Twelve potential applications, representative of major problem domains within the nuclear power industry, were analyzed in order to identify the tools that would be of greatest value in application development. Eight tools were specified, including facilities for power plant modeling, database inquiry, simulation, and machine-machine interfaces.

  14. Blood Sugar, Your Pancreas, and Unicorns: The Development of Health Education Materials for Youth With Prediabetes.

    PubMed

    Yazel-Smith, Lisa G; Pike, Julie; Lynch, Dustin; Moore, Courtney; Haberlin, Kathryn; Taylor, Jennifer; Hannon, Tamara S

    2018-05-01

    The obesity epidemic has led to an increase in prediabetes in youth, causing a serious public health concern. Education on diabetes risk and initiation of lifestyle change are the primary treatment modalities. There are few existing age-appropriate health education tools to address diabetes prevention for high-risk youth. The aim was to develop age-appropriate health education tools to help youth better understand type 2 diabetes risk factors and the reversibility of risk. Health education tool development took place in five phases: exploration, design, analysis, refinement, and process evaluation. The project resulted in (1) a booklet designed to increase knowledge of risk, (2) a meme generator that mirrors the booklet graphics and allows youth to create their own meme based on their pancreas' current mood, (3) environmental posters for the clinic, and (4) a brief self-assessment that acts as a conversation starter for the health educators. Patients reported high likability and satisfaction with the health education tools, with the majority of patients giving the materials an "A" rating. The process evaluation indicated a high level of fidelity between how the health education tools were intended to be used and how they were actually used in the clinic setting.

  15. Development and Testing of the Church Environment Audit Tool.

    PubMed

    Kaczynski, Andrew T; Jake-Schoffman, Danielle E; Peters, Nathan A; Dunn, Caroline G; Wilcox, Sara; Forthofer, Melinda

    2018-05-01

    In this paper, we describe the development and reliability testing of a novel tool to evaluate the physical environment of faith-based settings with respect to opportunities for physical activity (PA) and healthy eating (HE). Tool development was a multistage process including a review of similar tools, stakeholder review, expert feedback, and pilot testing. Final tool sections included indoor opportunities for PA, outdoor opportunities for PA, food preparation equipment, kitchen type, food for purchase, beverages for purchase, and media. Two independent audits were completed at 54 churches. Interrater reliability (IRR) was determined with Kappa statistics and percent agreement. Of 218 items, 102 were assessed for IRR; 116 could not be assessed because they were not present at enough churches. Percent agreement for all 102 items was over 80%. For 42 items, the sample was too homogeneous to assess Kappa. Forty-six of the remaining items had Kappas greater than 0.60 (25 items 0.80-1.00; 21 items 0.60-0.79), indicating substantial to almost perfect agreement. The tool proved reliable and efficient for assessing church environments and identifying potential intervention points. Future work can focus on applications within faith-based partnerships to understand how church environments influence diverse health outcomes.

  16. A practitioner's guide to service development.

    PubMed

    Lees, Liz

    2010-11-01

    Service development and service improvement are complex concepts, but this should not prevent practitioners from engaging in, or initiating, them. There is no set blueprint for service development, so this article examines the process, describes the skills required, lists some change management tools, and offers a guide to the stages involved. The article aims to demystify service development for those considering embarking on the process for the first time.

  17. OrthoSelect: a protocol for selecting orthologous groups in phylogenomics.

    PubMed

    Schreiber, Fabian; Pick, Kerstin; Erpenbeck, Dirk; Wörheide, Gert; Morgenstern, Burkhard

    2009-07-16

    Phylogenetic studies using expressed sequence tags (ESTs) are becoming a standard approach to answering evolutionary questions. Such studies are usually based on large sets of newly generated, unannotated, and error-prone EST sequences from different species. A first crucial step in EST-based phylogeny reconstruction is to identify groups of orthologous sequences. From these data sets, appropriate target genes are selected, and redundant sequences are eliminated to obtain suitable sequence sets as input for tree-reconstruction software. Generating such data sets manually can be very time consuming; software tools are therefore needed that carry out these steps automatically. We developed a flexible and user-friendly software pipeline, running on desktop machines or computer clusters, that constructs data sets for phylogenomic analyses. It automatically searches assembled EST sequences against databases of orthologous groups (OGs), assigns ESTs to these predefined OGs, translates the sequences into proteins, eliminates redundant sequences assigned to the same OG, creates multiple sequence alignments of the identified orthologous sequences, and can further process each alignment by excluding potentially homoplastic sites and selecting sufficiently conserved regions. The pipeline can be used as is, or adapted by integrating additional external programs, which makes it useful for non-bioinformaticians as well as bioinformatics experts. It is designed primarily for ESTs but can also handle protein sequences. OrthoSelect is a tool that produces orthologous gene alignments from assembled ESTs. Our tests show that OrthoSelect detects orthologs in EST libraries with high accuracy. In the absence of a gold standard for orthology prediction, we compared predictions by OrthoSelect to a manually created and published phylogenomic data set. Our tool not only rebuilt the data set with a specificity of 98%, but detected four percent more orthologous sequences. Furthermore, the results OrthoSelect produces are in absolute agreement with the results of other programs, while offering a significant speedup and additional functionality, e.g., handling of ESTs, computing sequence alignments, and refining them. To our knowledge, there is currently no other fully automated and freely available tool for this purpose. Thus, OrthoSelect is a valuable tool for researchers in the field of phylogenomics who deal with large quantities of EST sequences. OrthoSelect is written in Perl and runs on Linux/Mac OS X. The tool can be downloaded at http://gobics.de/fabian/orthoselect.php.
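
    One pipeline step, eliminating redundant sequences assigned to the same OG, can be sketched as follows. Keeping the longest translation is an assumed criterion for illustration; OrthoSelect's actual selection rule may differ.

    ```python
    # Collapse redundant ESTs per orthologous group, keeping a single
    # representative. Data below are invented toy records.
    assignments = [
        ("OG0001", "est_01", "MKVLAAGI"),
        ("OG0001", "est_07", "MKVLAAGIWWT"),   # longer translation -> kept
        ("OG0002", "est_03", "MSTNP"),
    ]

    representatives = {}
    for og, est_id, protein in assignments:
        best = representatives.get(og)
        if best is None or len(protein) > len(best[1]):
            representatives[og] = (est_id, protein)

    for og, (est_id, protein) in sorted(representatives.items()):
        print(og, est_id, protein)
    ```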

  18. Desiderata for a Computer-Assisted Audit Tool for Clinical Data Source Verification Audits

    PubMed Central

    Duda, Stephany N.; Wehbe, Firas H.; Gadd, Cynthia S.

    2013-01-01

    Clinical data auditing often requires validating the contents of clinical research databases against source documents available in health care settings. Currently available data audit software, however, does not provide features necessary to compare the contents of such databases to source data in paper medical records. This work enumerates the primary weaknesses of using paper forms for clinical data audits and identifies the shortcomings of existing data audit software, as informed by the experiences of an audit team evaluating data quality for an international research consortium. The authors propose a set of attributes to guide the development of a computer-assisted clinical data audit tool to simplify and standardize the audit process. PMID:20841814

  19. Representing the work of medical protocols for organizational simulation.

    PubMed Central

    Fridsma, D. B.

    1998-01-01

    Developing and implementing patient care protocols within a specific organizational setting requires knowledge of the protocol, the organization, and the way in which the organization does its work. Computer-based simulation tools have been used in many industries to provide managers with prospective insight into mismatches between work processes and organization design. Many of these simulation tools are designed for well-understood, routine work processes with few contingent tasks. In this paper, we describe theoretical extensions that make it possible to simulate medical protocols within an information-processing framework. These simulations will allow medical administrators to test different protocol and organizational designs before actually using them within a particular clinical setting. PMID:9929231
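
    A toy sketch of the simulation idea: represent a protocol as tasks with durations and prerequisites, then compute when each task finishes. The task names and durations are invented; real organizational simulations also model contingencies, actors, and rework.

    ```python
    # Protocol as a dependency graph: name -> (duration in minutes, prerequisites).
    tasks = {
        "triage":         (15, []),
        "order_labs":     (10, ["triage"]),
        "review_results": (20, ["order_labs"]),
        "discharge":      (5,  ["review_results"]),
    }

    def finish_times(tasks):
        done = {}
        while len(done) < len(tasks):           # assumes the graph is acyclic
            for name, (dur, deps) in tasks.items():
                if name not in done and all(d in done for d in deps):
                    start = max((done[d] for d in deps), default=0)
                    done[name] = start + dur
        return done

    for name, t in finish_times(tasks).items():
        print(f"{name:15s} finishes at t = {t} min")
    ```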

  20. A knowledge based search tool for performance measures in health care systems.

    PubMed

    Beyan, Oya D; Baykal, Nazife

    2012-02-01

    Performance measurement is vital for improving health care systems. However, we are still far from having widely accepted performance measurement models, and researchers and developers are seeking comparable performance indicators. We developed an intelligent search tool that identifies appropriate measures for specific requirements by matching them to diverse care settings. We reviewed the literature and analyzed 229 performance measurement studies published after 2000. These studies were evaluated with an original theoretical framework and stored in a database. A semantic network was designed to represent domain knowledge and support reasoning, and knowledge-based decision support techniques were applied to cope with uncertainty. The result is a tool that simplifies the performance indicator search process and returns the most relevant indicators.
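
    A minimal sketch of the matching idea: score each candidate indicator by the overlap between its tagged attributes and the user's care setting, then rank. Indicator names and tags are invented placeholders for the semantic network described above.

    ```python
    # Each indicator is tagged with care-setting attributes (toy knowledge base).
    indicators = {
        "30-day readmission rate": {"inpatient", "outcome", "hospital"},
        "door-to-doctor time":     {"emergency", "process", "hospital"},
        "immunization coverage":   {"primary_care", "process", "population"},
    }

    def rank(setting_tags):
        # Simple set-overlap score; a semantic network would also credit
        # related concepts, not just exact tag matches.
        scores = {name: len(tags & setting_tags) for name, tags in indicators.items()}
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    for name, score in rank({"hospital", "emergency", "process"}):
        print(score, name)
    ```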
